

# Configure Node-RED flows for Amazon IoT SiteWise data integration

With Node-RED®, you can implement two flows to manage data between your devices and Amazon IoT SiteWise. These flows work together to create a comprehensive data management solution that addresses both local and cloud data flow.
+ **Data publish flow** – Publishes to the cloud. The data publish flow sends data to Amazon IoT SiteWise. This flow simulates a turbine device by generating sensor data, translating it to Amazon IoT SiteWise format, and publishing to the SiteWise Edge MQTT broker. This enables you to leverage Amazon IoT SiteWise's cloud capabilities for storage, analytics, and integration with other Amazon services.

  For more information, see [Configure the data publish flow](windows-nodered-data-publish-flow.md).
+ **Data retention flow** – Stores data at the edge. The data retention flow subscribes to the SiteWise Edge MQTT broker to receive data, translates it into InfluxDB® format, and stores it locally for monitoring. This local storage provides immediate access to operational data, reduces latency for time-critical applications, and ensures continuity during network disruptions.

  For more information, see [Configure the data retention flow](windows-nodered-data-retention-flow.md).

These two flows work together to ensure data is both sent to Amazon IoT SiteWise and stored locally for immediate access.

To access your Node-RED console, go to [http://127.0.0.1:1880](http://127.0.0.1:1880). For information about enabling TLS, see [Enable TLS encryption](https://docs.influxdata.com/influxdb/v2/admin/security/enable-tls/).

# Configure the data publish flow


The data publish flow uses three nodes to create a pipeline that sends your industrial data to the cloud. This flow is essential for enabling cloud-based analytics, long-term storage, and integration with other Amazon services. First, simulated device data is sent to the SiteWise Edge MQTT broker. The gateway then picks up the data from the broker and transmits it to the Amazon IoT SiteWise cloud, where you can leverage powerful analytics and visualization capabilities.
+ **Data input** - Receives device data from your industrial equipment or simulators
+ **Data translator for Amazon IoT SiteWise** - Translates data to Amazon IoT SiteWise format to ensure compatibility with the SiteWise Edge gateway
+ **MQTT publisher** - Publishes data to SiteWise Edge MQTT broker, making it available to both local and cloud consumers

![\[A diagram showing the Node-RED data publishing flow. It sends simulated device data to the SiteWise Edge MQTT broker for pickup by SiteWise Edge Gateway and then onto the Amazon IoT SiteWise Cloud.\]](http://docs.amazonaws.cn/en_us/iot-sitewise/latest/userguide/images/gateway-open-source-nodered-publish-flow.png)


## Configure the data input node


In this example, the data input node uses a simulated wind turbine device that generates wind speed data. This node serves as the entry point for your industrial data, whether it comes from simulated sources (as in our example) or from actual industrial equipment in production environments.

We use a custom JSON format for the data payload to provide a standardized structure that works efficiently with both local processing tools and the Amazon IoT SiteWise cloud service. This format includes essential metadata like timestamps and quality indicators alongside the actual measurement values, enabling comprehensive data management and quality tracking throughout your pipeline. Import the inject node to receive simulated data in this standardized JSON format with timestamps, quality indicators, and values.

For more information on the Node-RED inject node, see the [Inject](https://nodered.org/docs/user-guide/nodes#inject) section in the *Node-RED Documentation*.

The turbine simulator generates wind speed data every second in this standardized JSON format:

**Example : Turbine data payload**  

```
{
    name: string,         // Property name/identifier
    timestamp: number,    // Epoch time in milliseconds
    quality: "GOOD" | "UNCERTAIN" | "BAD",
    value: number | string | boolean
}
```

This format provides several benefits:
+ The `name` field identifies the specific property or measurement, allowing you to track multiple data points from the same device
+ The `timestamp` in milliseconds since the epoch ensures precise time tracking for accurate historical analysis
+ The `quality` indicator helps you filter and manage data based on its reliability
+ The flexible `value` field supports different data types to accommodate various sensor outputs
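As a quick sanity check, a payload in this format can be built and validated with plain JavaScript. This is an illustrative sketch only — the helper names `makeTurbinePayload` and `isValidPayload` are not part of the flow. The inject node's `payload.timestamp` property (type `date`) produces epoch milliseconds, which the translator function later converts to seconds plus a nanosecond offset.

```javascript
// Illustrative sketch: builds and checks a payload in the standardized
// format described above. Helper names are hypothetical.
function makeTurbinePayload(value) {
    return {
        name: "/Renton/WindFarm/Turbine/WindSpeed", // property name/identifier
        timestamp: Date.now(),                      // epoch time in milliseconds
        quality: "GOOD",                            // "GOOD" | "UNCERTAIN" | "BAD"
        value: value                                // number, string, or boolean
    };
}

// Shape check mirroring the format description above
function isValidPayload(p) {
    return typeof p.name === "string"
        && typeof p.timestamp === "number"
        && ["GOOD", "UNCERTAIN", "BAD"].includes(p.quality)
        && ["number", "string", "boolean"].includes(typeof p.value);
}

// Simulate one wind speed reading
const sample = makeTurbinePayload(Math.random() * 25);
```

A shape check like this can be useful in a function node placed ahead of the translator, so malformed readings are dropped before they reach the broker.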

**Example : Inject node of a turbine simulator**  

```
[
    {
        "id": "string",
        "type": "inject",
        "z": "string",
        "name": "Turbine Simulator",
        "props": [
            {
                "p": "payload.timestamp",
                "v": "",
                "vt": "date"
            },
            {
                "p": "payload.quality",
                "v": "GOOD",
                "vt": "str"
            },
            {
                "p": "payload.value",
                "v": "$random()",
                "vt": "jsonata"
            },
            {
                "p": "payload.name",
                "v": "/Renton/WindFarm/Turbine/WindSpeed",
                "vt": "str"
            }
        ],
        "repeat": "1",
        "crontab": "",
        "once": false,
        "onceDelay": "",
        "topic": "",
        "x": 270,
        "y": 200,
        "wires": [
            [
                "string"
            ]
        ]
    }
]
```

## Configure a node for data translation


The SiteWise Edge gateway requires data in a specific format to ensure compatibility with Amazon IoT SiteWise cloud. The translator node is an important component that converts your input data to the required Amazon IoT SiteWise payload format. This translation step ensures that your industrial data can be properly processed, stored, and later analyzed in the Amazon IoT SiteWise cloud environment.

By standardizing the data format at this stage, you enable integration between your edge devices and the cloud service where you can use asset modeling, analytics, and visualization capabilities. Use this structure:

**Example : Payload structure for SiteWise Edge data parsing**  

```
{
  "propertyAlias": "string",
  "propertyValues": [
    {
      "value": {
          "booleanValue": boolean,
          "doubleValue": number,
          "integerValue": number,
          "stringValue": "string"
      },
      "timestamp": {
          "timeInSeconds": number,
          "offsetInNanos": number
      },
      "quality": "GOOD" | "UNCERTAIN" | "BAD"
    }
  ]
}
```

**Note**  
Match the `propertyAlias` to your MQTT topic hierarchy (for example, `/Renton/WindFarm/Turbine/WindSpeed`). This ensures that your data is properly associated with the correct asset property in Amazon IoT SiteWise. For more information, see the "Data stream alias" concept in [Amazon IoT SiteWise concepts](concept-overview.md). 

1. Import the example function node for Amazon IoT SiteWise payload translation. This function handles the conversion from your standardized input format to the Amazon IoT SiteWise-compatible format, ensuring proper timestamp formatting, quality indicators, and value typing.

   ```
   [
       {
           "id": "string",
           "type": "function",
           "z": "string",
           "name": "Translate to SiteWise payload",
           "func": "let input = msg.payload;\nlet output = {};\n\noutput[\"propertyAlias\"] = input.name;\n\nlet propertyVal = {}\n\nlet timeInSeconds = Math.floor(input.timestamp / 1000);\nlet offsetInNanos = (input.timestamp % 1000) * 1000000;\n\npropertyVal[\"timestamp\"] = {\n    \"timeInSeconds\": timeInSeconds,\n    \"offsetInNanos\": offsetInNanos,\n};\n\npropertyVal[\"quality\"] = input.quality\n\nlet typeNameConverter = {\n    \"number\": (x) => Number.isInteger(x) ? \"integerValue\" : \"doubleValue\",\n    \"boolean\": (x) => \"booleanValue\",\n    \"string\": (x) => \"stringValue\", \n}\nlet typeName = typeNameConverter[typeof input.value](input.value)\npropertyVal[\"value\"] = {}\npropertyVal[\"value\"][typeName] = input.value;\n\noutput[\"propertyValues\"] = [propertyVal]\n\nreturn {\n    payload: JSON.stringify(output)\n};",
           "outputs": 1,
           "timeout": "",
           "noerr": 0,
           "initialize": "",
           "finalize": "",
           "libs": [],
           "x": 530,
           "y": 200,
           "wires": [
               [
                   "string"
               ]
           ]
       }
   ]
   ```

1. Verify that the JavaScript code translates wind speed data correctly. The function performs several important tasks:
   + Extracts the property name from the input and sets it as the `propertyAlias`
   + Converts the timestamp from milliseconds to the required seconds and nanoseconds format
   + Preserves the quality indicator for data reliability tracking
   + Automatically detects the value type and formats it according to Amazon IoT SiteWise requirements

1. Connect the node to your flow, linking it between the data input node and the MQTT publisher.

For guidance on writing a function specific to your business needs, see [Writing Functions](https://nodered.org/docs/user-guide/writing-functions) in the *Node-RED Documentation*.
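The `func` string in the example above is JSON-escaped, which makes it hard to read. Here is the same translation logic written as standalone JavaScript; the only difference is that the function node returns `JSON.stringify(output)` as `msg.payload`, while this sketch returns the object directly.

```javascript
// Translation logic from the "Translate to SiteWise payload" function node.
function toSiteWisePayload(input) {
    // Split the millisecond timestamp into seconds + nanosecond offset
    const timeInSeconds = Math.floor(input.timestamp / 1000);
    const offsetInNanos = (input.timestamp % 1000) * 1000000;

    // Pick the SiteWise variant field based on the JavaScript value type
    const typeNameConverter = {
        number: (x) => Number.isInteger(x) ? "integerValue" : "doubleValue",
        boolean: () => "booleanValue",
        string: () => "stringValue"
    };
    const typeName = typeNameConverter[typeof input.value](input.value);

    return {
        propertyAlias: input.name,
        propertyValues: [{
            timestamp: { timeInSeconds, offsetInNanos },
            quality: input.quality,
            value: { [typeName]: input.value }
        }]
    };
}

// Example: translate one simulated wind speed reading
const out = toSiteWisePayload({
    name: "/Renton/WindFarm/Turbine/WindSpeed",
    timestamp: 1700000000500, // epoch milliseconds
    quality: "GOOD",
    value: 12.5
});
```

For the sample input above, `out.propertyValues[0].timestamp` is `{ timeInSeconds: 1700000000, offsetInNanos: 500000000 }`, and the value `12.5` is emitted as `doubleValue` because it is not an integer.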

## Configure the MQTT publisher


After translation, the data is ready for publication to the SiteWise Edge MQTT broker.

Configure the MQTT publisher with these settings to send data to the SiteWise Edge MQTT broker:

**To import the MQTT out node**

1. Import an MQTT out node using `"type": "mqtt out"`. The node references a separate `mqtt-broker` configuration node, which lets multiple MQTT nodes share the same broker connection settings.

1. Enter key-value pairs for information relevant to MQTT broker connection and message routing.

Import the example `mqtt out` node.

**Example**  

```
[
    {
        "id": "string",
        "type": "mqtt out",
        "z": "string",
        "name": "Publish to MQTT broker",
        "topic": "/Renton/WindFarm/Turbine/WindSpeed",
        "qos": "1",
        "retain": "",
        "respTopic": "",
        "contentType": "",
        "userProps": "",
        "correl": "",
        "expiry": "",
        "broker": "string",
        "x": 830,
        "y": 200,
        "wires": []
    },
    {
        "id": "string",
        "type": "mqtt-broker",
        "name": "emqx",
        "broker": "127.0.0.1",
        "port": "1883",
        "clientid": "",
        "autoConnect": true,
        "usetls": false,
        "protocolVersion": "5",
        "keepalive": 15,
        "cleansession": true,
        "autoUnsubscribe": true,
        "birthTopic": "",
        "birthQos": "0",
        "birthPayload": "",
        "birthMsg": {},
        "closeTopic": "",
        "closePayload": "",
        "closeMsg": {},
        "willTopic": "",
        "willQos": "0",
        "willPayload": "",
        "willMsg": {},
        "userProps": "",
        "sessionExpiry": ""
    }
]
```

The example MQTT out node creates the MQTT connection with the following information:
+ Server: `127.0.0.1`
+ Port: `1883`
+ Protocol: `MQTT V5`

Then, the MQTT out node configures message routing with the following information:
+ Topic: `/Renton/WindFarm/Turbine/WindSpeed`
+ QoS: `1`

## Deploy and verify the nodes


After configuring the three data publish flow nodes, follow these steps to deploy the flow and verify that data is being transmitted correctly to Amazon IoT SiteWise.

**To deploy and verify connections**

1. Connect the three nodes as shown in the data publish flow.  
![\[Data publish flow diagram showing input from turbine simulator to Amazon IoT SiteWise to MQTT broker.\]](http://docs.amazonaws.cn/en_us/iot-sitewise/latest/userguide/images/gateway-open-source-nodered-publish-flow.png)

1. Choose **Deploy** to apply all node connection changes.

1. Navigate to the [Amazon IoT SiteWise console](https://console.amazonaws.cn/iotsitewise/) and choose **Data streams**.

1. Ensure **Alias prefix** is selected in the dropdown menu. Then, search for the `/Renton/WindFarm/Turbine/WindSpeed` alias.

If you see the correct alias in your search, you have deployed the flow and verified data transmission.

# Configure the data retention flow


The data retention flow maintains operational visibility at the edge. This is useful during network disruptions or when you need immediate access to your data. This flow subscribes to the MQTT broker to receive device data, converts it to InfluxDB® format, and stores it locally. By implementing this flow, you create a resilient local data store that operators can access without cloud dependencies, enabling real-time monitoring and decision-making at the edge.

The flow consists of three key components working together to ensure your data is properly captured and stored:
+ **MQTT subscription client** - Receives data from the broker, ensuring you capture all relevant industrial data
+ **InfluxDB translator** - Converts Amazon IoT SiteWise payload to InfluxDB format, preparing the data for efficient time-series storage
+ **InfluxDB writer** - Handles local storage, ensuring data persistence and availability for local applications

![\[Node-RED data retention flow\]](http://docs.amazonaws.cn/en_us/iot-sitewise/latest/userguide/images/gateway-open-source-nodered-data-retention.png)


## Set up the MQTT subscription client

+ Configure the MQTT subscription client in Node-RED to receive data from the MQTT EMQX broker in Amazon IoT SiteWise by importing the example below.  
**Example : MQTT in node**  

  ```
  [
      {
          "id": "string",
          "type": "mqtt in",
          "z": "string",
          "name": "Subscribe to MQTT broker",
          "topic": "/Renton/WindFarm/Turbine/WindSpeed",
          "qos": "1",
          "datatype": "auto-detect",
          "broker": "string",
          "nl": false,
          "rap": true,
          "rh": 0,
          "inputs": 0,
          "x": 290,
          "y": 340,
          "wires": [
              [
                  "string"
              ]
          ]
      },
      {
          "id": "string",
          "type": "mqtt-broker",
          "name": "emqx",
          "broker": "127.0.0.1",
          "port": "1883",
          "clientid": "",
          "autoConnect": true,
          "usetls": false,
          "protocolVersion": "5",
          "keepalive": 15,
          "cleansession": true,
          "autoUnsubscribe": true,
          "birthTopic": "",
          "birthQos": "0",
          "birthPayload": "",
          "birthMsg": {},
          "closeTopic": "",
          "closePayload": "",
          "closeMsg": {},
          "willTopic": "",
          "willQos": "0",
          "willPayload": "",
          "willMsg": {},
          "userProps": "",
          "sessionExpiry": ""
      }
  ]
  ```

This subscription ensures that all relevant data published to the broker is captured for local storage, providing a complete record of your industrial operations. The node uses the same MQTT connection parameters as the [Configure the MQTT publisher](windows-nodered-data-publish-flow.md#windows-nodered-mqtt-publisher-config) section, with the following subscription settings:
+ Topic – `/Renton/WindFarm/Turbine/WindSpeed`
+ QoS – `1`

For more information, see [Connect to an MQTT Broker](https://cookbook.nodered.org/mqtt/connect-to-broker) in the *Node-RED Documentation*.

## Configure the InfluxDB translator


InfluxDB organizes data using [tags](https://docs.influxdata.com/influxdb/v1/concepts/glossary/#tag) for indexing and [fields](https://docs.influxdata.com/influxdb/v1/concepts/glossary/#field) for values. This organization optimizes query performance and storage efficiency for time-series data. Import the example function node that contains JavaScript code to convert Amazon IoT SiteWise payload to InfluxDB format. The translator splits the properties into two groups:
+ Tags – Quality and name properties for efficient indexing
+ Fields – Timestamp (in milliseconds since epoch) and value

**Example : Function node of translating to an InfluxDB payload**  

```
[
    {
        "id": "string",
        "type": "function",
        "z": "string",
        "name": "Translate to InfluxDB payload",
        "func": "let data = msg.payload;\n\nlet timeInSeconds = data.propertyValues[0].timestamp.timeInSeconds;\nlet offsetInNanos = data.propertyValues[0].timestamp.offsetInNanos;\nlet timestampInMilliseconds = (timeInSeconds * 1000) + (offsetInNanos / 1000000);\n\nmsg.payload = [\n    {\n        \"timestamp(milliseconds_since_epoch)\": timestampInMilliseconds,\n        \"value\": data.propertyValues[0].value.doubleValue\n    },\n    {\n        \"name\": data.propertyAlias,\n        \"quality\": data.propertyValues[0].quality\n    }\n]\n\nreturn msg",
        "outputs": 1,
        "timeout": "",
        "noerr": 0,
        "initialize": "",
        "finalize": "",
        "libs": [],
        "x": 560,
        "y": 340,
        "wires": [
            [
                "string"
            ]
        ]
    }
]
```

For additional configuration options, see the [node-red-contrib-influxdb](https://github.com/mblackstock/node-red-contrib-influxdb) repository on GitHub.
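As with the publish flow, the escaped `func` string above is easier to follow as standalone JavaScript. The same logic, unescaped — note that it reads `value.doubleValue`, so it assumes numeric wind speed readings:

```javascript
// Translation logic from the "Translate to InfluxDB payload" function node.
function toInfluxPayload(data) {
    const ts = data.propertyValues[0].timestamp;
    // Recombine seconds + nanosecond offset into epoch milliseconds
    const timestampInMilliseconds = (ts.timeInSeconds * 1000) + (ts.offsetInNanos / 1000000);

    // node-red-contrib-influxdb writes a single point from a
    // two-element array: [fields, tags]
    return [
        {   // fields: the stored values
            "timestamp(milliseconds_since_epoch)": timestampInMilliseconds,
            "value": data.propertyValues[0].value.doubleValue
        },
        {   // tags: indexed metadata
            "name": data.propertyAlias,
            "quality": data.propertyValues[0].quality
        }
    ];
}

// Example: translate a SiteWise payload back into InfluxDB form
const influxPayload = toInfluxPayload({
    propertyAlias: "/Renton/WindFarm/Turbine/WindSpeed",
    propertyValues: [{
        timestamp: { timeInSeconds: 1700000000, offsetInNanos: 500000000 },
        quality: "GOOD",
        value: { doubleValue: 12.5 }
    }]
});
```

This function is effectively the inverse of the publish-flow translator: the seconds and nanosecond offset recombine into the original millisecond timestamp.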

## Set up the InfluxDB writer


The InfluxDB writer node is the final component in your data retention flow, responsible for storing your industrial data in the local InfluxDB database. This local storage is important for maintaining operational visibility during network disruptions and providing immediate access to data for time-critical applications.

1. Install the node-red-contrib-influxdb package through the Manage palette option. This package provides the necessary nodes for connecting Node-RED with InfluxDB.

1. Add an InfluxDB out node to your flow. This node will handle the actual writing of data to your InfluxDB database.

1. Configure the server properties to establish a secure connection to your InfluxDB instance:

   1. Set Version to 2.0 - This specifies that you're connecting to InfluxDB v2.x, which uses a different API than earlier versions

   1. Set URL to `http://127.0.0.1:8086` - This points to your local InfluxDB instance

   1. Enter your InfluxDB authentication token. This secure token authorizes the connection to your database. You generated the token during the [Set up local storage with InfluxDB](windows-influxdb-setup.md) procedure.

1. Specify the storage location parameters to define where and how your data will be stored:

   1. Enter your InfluxDB Organization name – The organization is a workspace for a group of users, where your buckets and dashboards belong. For more information, see [Manage organizations](https://docs.influxdata.com/influxdb/v2/admin/organizations/) in the *InfluxData Documentation*.

   1. Specify the InfluxDB Bucket (for example, `WindFarmData`) – The bucket is equivalent to a database in traditional systems, serving as a container for your time series data

   1. Set the InfluxDB Measurement (for example, `TurbineData`) – The measurement is similar to a table in relational databases, organizing related data points

**Note**  
Find your organization name in the InfluxDB instance's left sidebar. The organization, bucket, and measurement concepts are fundamental to InfluxDB's data organization model. For more information, see the [InfluxDB documentation](https://docs.influxdata.com/influxdb/v2/admin/organizations/).
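Under the hood, the InfluxDB out node converts the `[fields, tags]` payload into InfluxDB line protocol before writing it to your bucket. The sketch below is purely illustrative — the node handles this for you, and `toLineProtocol` is a hypothetical helper (it also omits the optional trailing point timestamp and full string-escaping rules) — but it shows how the example measurement, tags, and fields map onto a line protocol record:

```javascript
// Hypothetical sketch of the [fields, tags] -> line protocol mapping.
// Real writes are handled by node-red-contrib-influxdb.
function toLineProtocol(measurement, [fields, tags]) {
    // Escape spaces, commas, and equals signs, per the line protocol rules
    const esc = (s) => String(s).replace(/([ ,=])/g, "\\$1");
    const tagSet = Object.entries(tags)
        .map(([k, v]) => `${esc(k)}=${esc(v)}`)
        .join(",");
    const fieldSet = Object.entries(fields)
        .map(([k, v]) => typeof v === "string" ? `${esc(k)}="${v}"` : `${esc(k)}=${v}`)
        .join(",");
    // measurement,tagSet fieldSet  (optional timestamp omitted here)
    return `${measurement},${tagSet} ${fieldSet}`;
}

const line = toLineProtocol("TurbineData", [
    { "timestamp(milliseconds_since_epoch)": 1700000000500, value: 12.3 },
    { name: "/Renton/WindFarm/Turbine/WindSpeed", quality: "GOOD" }
]);
console.log(line);
```

Seeing records in this shape in the InfluxDB Data Explorer's raw view is a good sign that the writer node is configured correctly.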

## Deploy and verify the retention flow


After configuring all components of the data retention flow, you need to deploy and verify that the system is working correctly. This verification ensures that your industrial data is being properly stored locally for immediate access and analysis.

1. Connect the three nodes as shown in the data retention flow diagram. This creates a complete pipeline from data subscription to local storage.  
![\[Node-RED data retention flow\]](http://docs.amazonaws.cn/en_us/iot-sitewise/latest/userguide/images/gateway-open-source-nodered-data-retention.png)

1. Choose **Deploy** to apply your changes and activate the flow. This starts the data collection and storage process.

1. Use the InfluxDB Data Explorer to query and visualize your data. This tool allows you to verify that data is being properly stored and to create initial visualizations of your time series data.

   In the Data Explorer, you should be able to see your wind speed measurements being recorded over time, confirming that the entire pipeline from data generation to local storage is functioning correctly. 

   For more information, see [Query in Data Explorer](https://docs.influxdata.com/influxdb/v2/query-data/execute-queries/data-explorer/) in the *InfluxData Documentation*.

With both the data publish flow and data retention flow deployed, you now have a complete system that sends data to the Amazon IoT SiteWise cloud while maintaining a local copy for immediate access and resilience. This dual-path approach ensures that you get the benefits of cloud-based analytics and storage while maintaining operational visibility at the edge.