How to collect multiple text logs from a single Arc server with Data Collection Rules? I have deployed multiple DCRs, but only the first configured rule is collecting data.

Martin Mikkelsen 0 Reputation points
2025-01-16T08:45:02.5966667+00:00

I have provided a sample Bicep template that we use to deploy the DCRs. We manually associate each DCR with the Arc server after deployment, roughly as sketched below.
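For reference, the association is created with a small template along these lines (the machine name and API versions here are illustrative, not our exact code):

param arcMachineName string
param dcrResourceId string

// Existing Arc-enabled server that should get the DCR association
resource arcMachine 'Microsoft.HybridCompute/machines@2022-12-27' existing = {
  name: arcMachineName
}

// Associate the DCR with the Arc machine
resource dcrAssociation 'Microsoft.Insights/dataCollectionRuleAssociations@2022-06-01' = {
  name: 'dcra-${arcMachineName}-applogs'
  scope: arcMachine
  properties: {
    dataCollectionRuleId: dcrResourceId
  }
}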

The Arc servers are running RHEL 9.4.

When I deploy the first DCR, logs appear as expected in the IngestionTest_CL table. When I deploy a second DCR similar to the one below, with a different file path, it collects nothing. If I delete the first DCR and redeploy the second, the second file path works, but now the first one is no longer collected. I have tried many combinations of destination tables and stream names, both with and without a transform.

I have also tried using wildcards in the file path. That works for collecting multiple files with a single DCR, but then I cannot route the files to different tables or apply different transforms per file.

I have verified that the Azure Monitor Agent (AMA) extension picks up the configuration for both DCRs, and no errors appear in its logs.

The agent is also sending heartbeats, which are visible in Log Analytics.

param dce_ResourceId string
param log_ResourceId string
param filePath string
param env string
param transform string
param location string = resourceGroup().location
param timestampFormat string = 'YYYY-MM-DD HH:MM:SS'

resource dcr 'Microsoft.Insights/dataCollectionRules@2023-03-11' = {
  name: 'dcr-${env}-applogs-${location}'
  tags: {
    environment: env
  }
  location: location
  kind: 'Linux'
  properties: {
    dataCollectionEndpointId: dce_ResourceId
    streamDeclarations: {
      //streamDeclaration name must be <= 32 characters
      'Custom-Text-integration_CL': {
        columns: [
          {
            name: 'TimeGenerated'
            type: 'datetime'
          }
          {
            name: 'RawData'
            type: 'string'
          }
          {
            name: 'FilePath'
            type: 'string'
          }
        ]
      }
    }
    dataSources: {
      logFiles: [
        {
          streams: [
            'Custom-Text-integration_CL'
          ]
          filePatterns: [
            filePath
          ]
          format: 'text'
          settings: {
            text: {
              recordStartTimestampFormat: timestampFormat
            }
          }
          name: 'Custom-Text-integration_CL'
        }
      ]
    }
    destinations: {
      logAnalytics: [
        {
          name: 'loganalytics'
          workspaceResourceId: log_ResourceId
        }
      ]
    }
    dataFlows: [
      {
        transformKql: transform
        streams: [
          'Custom-Text-integration_CL'
        ]
        destinations: [
          'loganalytics'
        ]
        outputStream: 'Custom-IngestionTest_CL'
      }
    ]
  }
}

output dataCollectionRuleId string = dcr.id
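For completeness, we invoke the template once per log file, roughly like this (the module file name and parameter values are placeholders, not our real ones):

param dceId string
param lawId string

// First DCR: first log file
module dcrFirst './dcr-textlog.bicep' = {
  name: 'deploy-dcr-first'
  params: {
    dce_ResourceId: dceId
    log_ResourceId: lawId
    filePath: '/var/log/myapp/first.log'
    env: 'test1'
    transform: 'source'
  }
}

// Second DCR: different log file, same shape otherwise
module dcrSecond './dcr-textlog.bicep' = {
  name: 'deploy-dcr-second'
  params: {
    dce_ResourceId: dceId
    log_ResourceId: lawId
    filePath: '/var/log/myapp/second.log'
    env: 'test2'
    transform: 'source'
  }
}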


1 answer

  1. Rahul Podila 1,395 Reputation points Microsoft Vendor
    2025-01-17T08:31:00.0633333+00:00

    Hi @Martin Mikkelsen

    Welcome to the Microsoft Q&A Platform! Thank you for asking your question here.

    The behavior you describe, where the first Data Collection Rule (DCR) collects fine but the second one only collects after the first is removed, typically happens when two DCRs on the same machine conflict with each other while sending data to the same Log Analytics workspace.

    To fix this, make sure each DCR is configured distinctly. First, give each DCR a unique stream name, for example Custom-Text-integration1_CL for the first and Custom-Text-integration2_CL for the second. This ensures that the agent does not mix the streams of the two DCRs.
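    As a sketch against your posted template, the second DCR's stream declaration could look like the fragment below; the same name must also appear in the streams arrays of its logFiles data source and its dataFlows. The exact names here are only examples.

    // Fragment of the second DCR: a stream name no other DCR on the machine uses
    streamDeclarations: {
      'Custom-Text-integration2_CL': {
        columns: [
          { name: 'TimeGenerated', type: 'datetime' }
          { name: 'RawData', type: 'string' }
          { name: 'FilePath', type: 'string' }
        ]
      }
    }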

    Second, give each DCR its own output stream so that logs from each DCR go into their own table rather than being mixed in the same one. For example, use Custom-IngestionTest1_CL for the first DCR and Custom-IngestionTest2_CL for the second (custom table names must end in _CL, so the distinguishing suffix goes before it, not after).
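    Also note that a DCR can only send to a custom table that already exists in the workspace. If the second table does not exist yet, it can be created in Bicep along these lines (the workspace name and schema here are examples):

    // Example: create the second custom table in the Log Analytics workspace
    resource workspace 'Microsoft.OperationalInsights/workspaces@2022-10-01' existing = {
      name: 'my-workspace' // placeholder
    }

    resource ingestionTest2 'Microsoft.OperationalInsights/workspaces/tables@2022-10-01' = {
      parent: workspace
      name: 'IngestionTest2_CL'
      properties: {
        schema: {
          name: 'IngestionTest2_CL'
          columns: [
            { name: 'TimeGenerated', type: 'datetime' }
            { name: 'RawData', type: 'string' }
            { name: 'FilePath', type: 'string' }
          ]
        }
      }
    }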

    Finally, if both DCRs read from the same file, they may interfere with each other. Make sure the file patterns in the two DCRs are unique and do not overlap. If you use wildcards (like *.log), make sure the two patterns cannot match the same files.
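    For example, the logFiles sections of the two DCRs could point at clearly separated patterns (paths are illustrative):

    // DCR #1 data source (fragment)
    filePatterns: [
      '/var/log/myapp/first-*.log'
    ]

    // DCR #2 data source (fragment)
    filePatterns: [
      '/var/log/myapp/second-*.log'
    ]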

    After updating the DCRs with unique stream names, output streams, and file paths, redeploy them and check Log Analytics to confirm that logs from both DCRs are arriving.

    Let me know if you need more help with this or have any other questions. I’m happy to help!

    For more background, please go through these links:

    https://learn.microsoft.com/en-us/azure/azure-monitor/essentials/data-collection-rule-overview

    https://learn.microsoft.com/en-us/azure/azure-monitor/logs/log-analytics-workspace-overview

    If you have any further queries, do let us know


    If the answer is helpful, please click "Accept Answer" and "Upvote it"

