Message processing in an IoT solution

This overview introduces the key concepts around processing messages sent from your assets and devices in a typical Azure IoT solution. Each section includes links to content that provides further detail and guidance.

The following diagram shows a high-level view of the components in a typical edge-based IoT solution. This article focuses on the message processing components of an edge-based IoT solution.

Diagram that shows the high-level edge-based IoT solution architecture highlighting message processing areas.

In Azure IoT, message processing refers to processes such as routing and enriching messages sent by assets and devices. These processes control the flow of messages through the IoT solution and add information to the messages as they pass through.

Route messages

To route messages from your assets to various endpoints, Azure IoT Operations uses data flows. The destination endpoints might be in the cloud or at the edge. The list of available destination endpoints includes:

  • MQTT: Bi-directional messaging with MQTT brokers, including the broker built into Azure IoT Operations and Azure Event Grid.
  • Kafka: Bi-directional messaging with Kafka brokers, including Azure Event Hubs.
  • Data Lake: Uploading data to Azure Data Lake Gen2 storage accounts.
  • Microsoft Fabric OneLake: Uploading data to Microsoft Fabric OneLake lakehouses.
  • Azure Data Explorer: Uploading data to Azure Data Explorer databases.
  • Local storage: Sending data to a locally available persistent volume, from which you can upload it by using Azure Container Storage enabled by Azure Arc edge volumes.

The operations experience web UI provides a no-code environment for building and running your data flows.

To enhance the security of the data routed to your endpoints, data flow endpoints use secrets that are synchronized between the cloud and the edge for authentication.

While data flows let you configure routing at the edge, you can also define routing in the cloud. If your data flow delivers messages to Azure Event Grid, you can use its routing capabilities to determine where to send the messages.

To learn more, see Process and route data with data flows.
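To make the routing idea concrete, the following is a minimal conceptual sketch in Python. It is not the Azure IoT Operations data flow API; the topics, endpoint names, and routing-table shape are all illustrative assumptions.

```python
# Conceptual sketch of topic-based routing to named destination endpoints.
# Not the actual data flow API; all names here are hypothetical.

# Routing table: MQTT topic prefix -> destination endpoint name.
ROUTES = {
    "factory/sensors/": "eventgrid-mqtt",   # telemetry to an Event Grid MQTT broker
    "factory/alerts/": "kafka-eventhubs",   # alerts to Azure Event Hubs via Kafka
    "factory/archive/": "datalake-raw",     # raw data to Data Lake Gen2
}

DEFAULT_ENDPOINT = "local-storage"  # fall back to a local persistent volume

def route(topic: str) -> str:
    """Return the destination endpoint name for a message topic."""
    for prefix, endpoint in ROUTES.items():
        if topic.startswith(prefix):
            return endpoint
    return DEFAULT_ENDPOINT
```

In a real solution, the equivalent routing rules are defined declaratively in the data flow configuration rather than in application code.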

Enrich, transform, and process messages

During the processing stage, you can apply enrichments and transformations to your data. These operations include:

  • Compute new properties: Based on existing properties in the message
  • Rename properties: To standardize or clarify data
  • Convert units: Convert values to different units of measurement
  • Standardize values: Scale property values to a user-defined range
  • Contextualize data: Add reference data to messages for enrichment and driving insights
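The operations above can be sketched as a single transformation step. This is an illustrative Python sketch, not a real data flow definition: the field names, unit conversion, scaling range, and reference dataset are all hypothetical.

```python
# Illustrative sketch of the transformation operations listed above:
# rename, convert units, compute, standardize, and contextualize.
# All field names and reference data are hypothetical.

ASSET_REFERENCE = {  # contextualization lookup, e.g. from a reference dataset
    "oven-01": {"site": "Redmond", "line": "A"},
}

def transform(message: dict) -> dict:
    out = dict(message)

    # Rename properties: replace a terse legacy field name with a clear one.
    if "temp" in out:
        out["temperature_f"] = out.pop("temp")

    # Convert units: Fahrenheit -> Celsius.
    out["temperature_c"] = round((out["temperature_f"] - 32) * 5 / 9, 2)

    # Compute new properties: derive a flag from existing properties.
    out["overheated"] = out["temperature_c"] > 200

    # Standardize values: scale to a 0..1 range for a 0-400 C sensor.
    out["temperature_scaled"] = min(max(out["temperature_c"] / 400, 0.0), 1.0)

    # Contextualize data: join in reference data about the asset.
    out.update(ASSET_REFERENCE.get(out.get("asset_id"), {}))
    return out
```

In practice, you express these steps in the data flow's transformation configuration instead of writing code.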

The schema registry stores schemas for messages coming from your assets. Data flows use these schemas to decode messages from various formats into a form they can process.
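The registry idea can be sketched as a lookup from schema name to a definition used to decode and validate a payload. This is a minimal Python illustration of the concept only; the schema name, its shape, and the required fields are assumptions, and real schemas are managed by the schema registry service.

```python
import json

# Minimal sketch of the schema-registry concept: look up a schema by name,
# then use it to decode and validate an incoming payload.
# Schema names and contents here are hypothetical.

SCHEMA_REGISTRY = {
    "thermostat-v1": {"required": ["asset_id", "temperature"]},
}

def decode(payload: bytes, schema_name: str) -> dict:
    """Decode a JSON payload and check it against the named schema."""
    schema = SCHEMA_REGISTRY[schema_name]
    message = json.loads(payload)
    missing = [f for f in schema["required"] if f not in message]
    if missing:
        raise ValueError(f"message missing required fields: {missing}")
    return message
```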

The operations experience web UI provides a no-code environment for building and running the transformations in your data flows.

To learn more, see Enrich data by using data flows.

In Azure IoT Operations, you can deploy your own highly available edge applications to the Kubernetes cluster. The edge applications can interact with the built-in MQTT broker to:

  • Use custom message processing logic on the MQTT messages.
  • Build custom application logic to run at the edge.
  • Run Edge AI models for real-time data processing and decision-making at the source of data generation, reducing latency and bandwidth usage.

To learn more, see Develop highly available applications for Azure IoT Operations MQTT broker.
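As a sketch of the custom-processing idea, the handler below enriches a message and republishes it on a new topic. The MQTT client wiring is omitted: in a real edge application this function would be registered as the message callback of an MQTT client connected to the built-in broker, and the topics, field names, and threshold are illustrative assumptions.

```python
import json

# Sketch of a custom edge application's message handler. In a real app,
# an MQTT client connected to the built-in broker would invoke this as
# its on-message callback; `publish` is the client's publish function.

def handle_message(topic: str, payload: bytes, publish) -> None:
    """Apply custom logic to an MQTT message and republish the result."""
    data = json.loads(payload)

    # Custom processing logic: flag anomalous readings. The threshold is
    # illustrative; an Edge AI model could score the message here instead.
    data["anomaly"] = data.get("vibration", 0.0) > 0.8

    # Republish the enriched message on a "processed" topic.
    publish(topic.replace("raw/", "processed/", 1), json.dumps(data))
```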

Other cloud services

You can use other cloud services to process messages from your assets and devices.

Data flow endpoints in Azure IoT Operations let you connect to cloud services to send and receive data from your assets. A data flow endpoint is the connection point for the data flow.

To learn more, see:

In IoT Hub and IoT Central, you can route messages to other services. For example, you can forward messages to Azure Stream Analytics to analyze and process large volumes of streaming data or to Azure Functions to run code in response to events. Stream Analytics is also available on the Azure IoT Edge runtime, enabling it to process data at the edge rather than in the cloud.

To learn more, see: