Data ingestion into new Log Analytics custom table

Prashanth Kumar 0 Reputation points
2025-01-20T12:44:09.4266667+00:00

Hi Team,

I need your help. I am trying to ingest some custom logs into a Log Analytics custom table. When I send data through the REST API, the response indicates the data was ingested, but when I check the table I don't see any data.

However, if I ingest the same data into a custom table (Classic), the data loads fine.

I would like to understand the best way to deal with this, and whether anyone else has run into the same pain points.

Here is the code I based my implementation on: https://stackoverflow.com/questions/77369094/send-custom-logs-to-log-analytics-workspace-via-rest

I ran a Kusto query against my table name and see no data, but the same query against the custom table (Classic) shows the data. One more question: my table schema keeps changing, so how should I handle that scenario?
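
For reference, my REST call is roughly equivalent to the sketch below (the endpoint, rule ID, and stream name are placeholders, not my real values):

```python
import requests
from azure.identity import DefaultAzureCredential

# Placeholders for the DCE ingestion URI, DCR immutable ID, and stream name.
ENDPOINT = "https://my-dce.eastus-1.ingest.monitor.azure.com"
RULE_ID = "dcr-00000000000000000000000000000000"
STREAM = "Custom-MyTable_CL"

# The documented token scope for the Logs Ingestion API.
token = DefaultAzureCredential().get_token("https://monitor.azure.com//.default").token

resp = requests.post(
    f"{ENDPOINT}/dataCollectionRules/{RULE_ID}/streams/{STREAM}",
    params={"api-version": "2023-01-01"},
    headers={"Authorization": f"Bearer {token}", "Content-Type": "application/json"},
    json=[{"TimeGenerated": "2025-01-20T12:00:00Z", "Message": "test event"}],
)
# Returns 204 No Content, which I read as "ingested" -- yet the table stays empty.
print(resp.status_code)
```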


1 answer

  1. Sina Salam 16,446 Reputation points
    2025-01-20T15:07:06.5533333+00:00

    Hello Prashanth Kumar,

    Welcome to the Microsoft Q&A and thank you for posting your questions here.

    I understand that you are having issues with ingesting custom logs into a Log Analytics custom table and challenges with changing table schemas.

    To troubleshoot this, work through the steps below:

    1. Ensure that the data collection rule (DCR) is configured to map the incoming data to the custom table. Check the DCR in the Azure portal and confirm that its data flow's transformation and output stream actually point at your table.
    2. Verify that the schema of the data being sent matches the schema defined on the custom table. If necessary, use the DCR's ingestion-time transformation to reshape incoming records to fit the table (the third sketch below shows one way to do this).
    3. Make sure that the data is being sent to the correct data collection endpoint (DCE). Verify the endpoint URL, the DCR immutable ID, and the stream name in your REST API calls (see the first sketch below).
    4. Check that the application or service principal has permission to write through the DCR; it typically needs the Monitoring Metrics Publisher role on the DCR, in addition to access to the Log Analytics workspace. A missing role shows up as a 403 on upload (again, see the first sketch below).
    5. Since you mentioned running Kusto queries, ensure that the queries target the right table. DCR-based custom tables keep the _CL suffix, so verify the exact table name and schema as they appear under the workspace's Tables blade (the second sketch below shows a quick programmatic check).
    6. For the changing schema: when new fields appear, add the corresponding columns to the table and update the DCR's transformation to carry them through; a function such as column_ifexists() lets the transformation tolerate records that do not yet contain the new field (see the third sketch below).
    7. Make sure that the payload sent to the custom table is valid JSON (an array of records) and adheres to the stream schema declared in the DCR.
    8. If the issue persists, consider raising a support request in the Azure portal for further assistance.
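
    To illustrate points 3 and 4, here is a minimal sketch of an upload with error handling, using the azure-monitor-ingestion Python SDK (assuming that is what your REST code corresponds to; the endpoint, rule ID, stream name, and field names are placeholders, not values from your environment). Note that a successful call only means the payload was accepted; records can still be silently dropped afterwards if they fail the DCR transformation or do not match the table schema, which would explain "API says ingested, but the table is empty".

    ```python
    from azure.core.exceptions import HttpResponseError
    from azure.identity import DefaultAzureCredential
    from azure.monitor.ingestion import LogsIngestionClient

    # Placeholders -- replace with your DCE ingestion URI, DCR immutable ID,
    # and the stream name declared in your DCR.
    ENDPOINT = "https://my-dce.eastus-1.ingest.monitor.azure.com"
    RULE_ID = "dcr-00000000000000000000000000000000"
    STREAM = "Custom-MyTable_CL"

    client = LogsIngestionClient(endpoint=ENDPOINT, credential=DefaultAzureCredential())

    try:
        # upload() returns None on success (HTTP 204 = "accepted").
        client.upload(
            rule_id=RULE_ID,
            stream_name=STREAM,
            logs=[{"TimeGenerated": "2025-01-20T12:00:00Z", "Message": "test event"}],
        )
        print("Payload accepted (this does not yet guarantee it reached the table).")
    except HttpResponseError as e:
        # 403 usually means the caller lacks the Monitoring Metrics Publisher
        # role on the DCR; 404 usually points to a wrong DCE URL, rule ID,
        # or stream name.
        print(f"Upload failed: {e}")
    ```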
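
    For point 5, you can confirm programmatically what the portal query shows. A small sketch using the azure-monitor-query package (the workspace ID and table name are placeholders):

    ```python
    from datetime import timedelta

    from azure.identity import DefaultAzureCredential
    from azure.monitor.query import LogsQueryClient

    WORKSPACE_ID = "00000000-0000-0000-0000-000000000000"  # placeholder workspace GUID

    client = LogsQueryClient(DefaultAzureCredential())

    # DCR-based custom tables keep the _CL suffix; the name must match what
    # the workspace's Tables blade shows. Ingestion can lag by several
    # minutes, so query a generous timespan.
    response = client.query_workspace(
        WORKSPACE_ID,
        "MyTable_CL | take 10",
        timespan=timedelta(hours=24),
    )
    for table in response.tables:
        for row in table.rows:
            print(row)
    ```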
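
    For points 2 and 6, the schema handling lives in the DCR's data flow. Below is a sketch of the relevant fragment of a DCR definition, written as a Python dict so the transformation is easy to read (in practice this is JSON in the DCR itself, and all names here are placeholders). The column_ifexists() call lets older records that lack the new field still pass through:

    ```python
    # Fragment of a DCR definition (one "dataFlows" entry), expressed as a
    # Python dict for readability.
    data_flow = {
        "streams": ["Custom-MyTable_CL"],            # input stream declared in the DCR
        "destinations": ["myWorkspaceDestination"],  # placeholder destination name
        "outputStream": "Custom-MyTable_CL",         # target table in the workspace
        # Ingestion-time transformation: tolerate records that do not yet
        # carry NewField, and project only columns that exist on the table.
        "transformKql": (
            "source "
            "| extend NewField = tostring(column_ifexists('NewField', '')) "
            "| project TimeGenerated, Message, NewField"
        ),
    }
    ```

    When the schema changes again, add the column to the table first, then extend the transformation the same way. Keep in mind that records the transformation filters out, and fields it does not project, are dropped at ingestion, which is one common reason the API reports success while the table stays empty.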

    I hope this is helpful! Do not hesitate to let me know if you have any other questions.


    Please don't forget to close out the thread by upvoting and accepting this as the answer if it helped.

