Create a data collection rule (DCR) for metrics export

This article describes how to create a data collection rule (DCR) for metrics export using the Azure portal, Azure CLI, PowerShell, API, or ARM templates.

Important

To send Platform Telemetry data to Storage Accounts or Event Hubs, the resource, data collection rule, and the destination Storage Account or the Event Hubs must all be in the same region.
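
The numbered steps below use the Azure portal. If you prefer the Azure CLI path mentioned above, the following is a minimal sketch, assuming a virtual machine as the monitored resource, placeholder names and IDs, and a rule definition file named dcr-metrics-export.json (an example of that file appears after the portal steps):

```bash
# Placeholder names and resource IDs for illustration only.
RG="my-resource-group"
DCR_NAME="dcr-metrics-export"
VM_ID="/subscriptions/<subscription-id>/resourceGroups/my-resource-group/providers/Microsoft.Compute/virtualMachines/my-vm"

# Create the DCR from a JSON rule definition file.
az monitor data-collection rule create \
  --resource-group "$RG" \
  --name "$DCR_NAME" \
  --rule-file dcr-metrics-export.json

# Look up the DCR resource ID and associate the rule with the resource
# whose platform metrics you want to export.
DCR_ID=$(az monitor data-collection rule show \
  --resource-group "$RG" --name "$DCR_NAME" --query id -o tsv)

az monitor data-collection rule association create \
  --name "dcr-metrics-export-assoc" \
  --resource "$VM_ID" \
  --rule-id "$DCR_ID"
```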

  1. On the Monitor menu in the Azure portal, select Data Collection Rules, and then select Create.

  2. To create a DCR to collect platform metrics data, select the link at the top of the page.

  3. On the Create Data Collection Rule page, enter a rule name, then select a Subscription, Resource group, and Region for the DCR.

  4. Select Enable Managed Identity if you want to send metrics to a Storage Account or Event Hubs.

  5. Select Next.

  6. On the Resources page, select Add resources to add the resources you want to collect metrics from.

  7. Select Next to move to the Collect and deliver tab.

  8. Select Add new dataflow.

  9. The resource type of the resource that you chose in the previous step is automatically selected. Add more resource types if you want to use this rule to collect metrics from multiple resource types in the future.

  10. Select Next: Destinations to move to the Destinations tab.

  11. To send metrics to a Log Analytics workspace, select Azure Monitor Logs from the Destination type dropdown.

    1. Select the Subscription and the Log Analytics workspace you want to send the metrics to.
  12. To send metrics to Event Hubs, select Event Hub from the Destination type dropdown.

    1. Select the Subscription, the Event Hub namespace, and the Event Hub instance name.
  13. To send metrics to a Storage Account, select Storage Account from the Destination type dropdown.

    1. Select the Subscription, the Storage Account, and the Blob container where you want to store the metrics.

    Note

    To send metrics to a Storage Account or Event Hubs, the resource generating the metrics, the DCR, and the Storage Account or Event Hub must all be in the same region.
    To send metrics to a Log Analytics workspace, the DCR must be in the same region as the Log Analytics workspace. The resource generating the metrics can be in any region.

    To select Storage Account or Event Hubs as the destination, you must enable managed identity for the DCR on the Basics tab.

  14. Select Save, then select Review + create. (A sketch of an equivalent rule definition in JSON appears after these steps.)
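
The portal builds the rule definition for you. For the CLI and ARM template paths, the following is a minimal sketch of a rule file (dcr-metrics-export.json), assuming a virtual machine source and a Log Analytics workspace destination; the PlatformTelemetry kind, the stream name, and all IDs shown are assumptions to adapt to your own resources:

```bash
# Write a sketch of a metrics-export rule definition to dcr-metrics-export.json.
# The identity block is only required for Storage Account or Event Hubs destinations.
cat > dcr-metrics-export.json <<'EOF'
{
  "location": "eastus",
  "kind": "PlatformTelemetry",
  "identity": {
    "type": "SystemAssigned"
  },
  "properties": {
    "dataSources": {
      "platformTelemetry": [
        {
          "name": "platformTelemetrySource",
          "streams": [ "Microsoft.Compute/virtualMachines:Metrics-Group-All" ]
        }
      ]
    },
    "destinations": {
      "logAnalytics": [
        {
          "name": "laDestination",
          "workspaceResourceId": "/subscriptions/<subscription-id>/resourceGroups/my-resource-group/providers/Microsoft.OperationalInsights/workspaces/my-workspace"
        }
      ]
    },
    "dataFlows": [
      {
        "streams": [ "Microsoft.Compute/virtualMachines:Metrics-Group-All" ],
        "destinations": [ "laDestination" ]
      }
    ]
  }
}
EOF
```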

After creating the DCR and the data collection rule association (DCRA), allow up to 30 minutes for the first platform metrics data to appear in the Log Analytics workspace. Once data starts flowing, the latency for a platform metric time series reaching a Log Analytics workspace, Storage Account, or Event Hubs is approximately 3 minutes, depending on the resource type.
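
To confirm that exported metrics are arriving in a Log Analytics workspace, you can query the workspace. The sketch below assumes the AzMetrics table (the table used by metrics export; confirm the name in your workspace) and a placeholder workspace ID:

```bash
# Query the destination workspace for recently exported platform metrics.
# "<workspace-customer-id>" is the workspace's customer ID (GUID), a placeholder here.
az monitor log-analytics query \
  --workspace "<workspace-customer-id>" \
  --analytics-query "AzMetrics | where TimeGenerated > ago(1h) | take 10"
```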

Verify and troubleshoot data collection

Once you install the DCR, it may take several minutes for the changes to take effect and for data to be collected with the updated DCR. If you don't see any data being collected, it can be difficult to determine the root cause of the issue. Use the DCR monitoring features, which include metrics and logs, to help troubleshoot.

DCR metrics are collected automatically for all DCRs, and you can analyze them in metrics explorer just like the platform metrics for other Azure resources. Enable DCR error logs to get detailed error information when data processing isn't successful.
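
As a sketch of both suggestions, assuming a placeholder DCR resource ID and workspace ID: list the metric definitions the DCR exposes (internal metric names can differ from the display names used below), and route the DCR's error logs to a Log Analytics workspace with a diagnostic setting.

```bash
# Placeholder resource IDs for illustration only.
DCR_ID="/subscriptions/<subscription-id>/resourceGroups/my-resource-group/providers/Microsoft.Insights/dataCollectionRules/dcr-metrics-export"
WS_ID="/subscriptions/<subscription-id>/resourceGroups/my-resource-group/providers/Microsoft.OperationalInsights/workspaces/my-workspace"

# Discover the exact metric names the DCR exposes before charting or retrieving them.
az monitor metrics list-definitions --resource "$DCR_ID" --output table

# Enable DCR error logs by sending them to a workspace. The allLogs category
# group is a broad choice; use explicit categories if your setup requires them.
az monitor diagnostic-settings create \
  --resource "$DCR_ID" \
  --name "dcr-error-logs" \
  --workspace "$WS_ID" \
  --logs '[{"categoryGroup":"allLogs","enabled":true}]'
```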

If you don't see data being collected, follow these basic steps to troubleshoot the issue.

  1. Check metrics such as Logs Ingestion Bytes per Min and Logs Rows Received per Min to ensure that the data is reaching Azure Monitor. If not, then check your data source to ensure that it's sending data as expected.
  2. Check Logs Rows Dropped per Min to see if any rows are being dropped. This may not indicate an error, since rows could be dropped intentionally by a transformation. If the number of rows dropped is the same as Logs Rows Received per Min, though, then no data will be ingested into the workspace. Examine Logs Transformation Errors per Min to see if there are any transformation errors.
  3. Check Logs Transformation Errors per Min to determine if there are any errors from transformations applied to the incoming data. This could be due to changes in the data structure or the transformation itself.
  4. Check the DCRLogErrors table for any ingestion errors that may have been logged (see the query sketch after this list). This can provide additional detail to help identify the root cause of the issue.
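
A minimal sketch of the DCRLogErrors check in step 4, assuming DCR error logs are already flowing to the workspace and using a placeholder workspace ID:

```bash
# Look for recent ingestion errors logged for your DCRs.
az monitor log-analytics query \
  --workspace "<workspace-customer-id>" \
  --analytics-query "DCRLogErrors | where TimeGenerated > ago(1d) | take 20"
```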

Next steps