@BalaKrishna Mahamkali Thanks for reaching out and posting your question on Microsoft Q&A.
Based on the shared information, I understand you want to store the output of the above KQL queries as blobs in a storage account and then read that data from a Databricks cluster. Correct me if my understanding is wrong.
You need to build a custom solution using Logic Apps or Function Apps, and it should include the actions below to meet your requirement.
For Logic Apps:
- Choose a trigger: either HTTP-based or a Recurrence (schedule) trigger if you want to run it daily.
- Use the Run query and list results action of the Azure Monitor Logs connector to run the above query.
- Perform any data operations needed to filter or transform the data into the required format.
- Use the Create blob action of the Azure Blob Storage connector to create a blob inside a container in the storage account.
For Function Apps:
- Create an HTTP trigger function and call the Log Analytics Query REST API to run the above KQL query.
- Use an output binding to create a blob in the storage account; if you want to store the data in a table instead, use the SDK.
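The Function App flow above can be sketched as follows, using only the standard library. This is a minimal illustration, not a full function: the workspace ID, AAD bearer token, and blob SAS URL are placeholders you must supply, and blob creation here uses the Put Blob REST operation rather than an output binding.

```python
import json
import urllib.request

# Log Analytics Query REST API endpoint (v1).
LOG_ANALYTICS_API = "https://api.loganalytics.io/v1/workspaces/{workspace_id}/query"

def build_query_request(workspace_id, token, kql):
    # Build the POST request that runs a KQL query against the workspace.
    body = json.dumps({"query": kql}).encode("utf-8")
    return urllib.request.Request(
        LOG_ANALYTICS_API.format(workspace_id=workspace_id),
        data=body,
        headers={
            "Authorization": f"Bearer {token}",
            "Content-Type": "application/json",
        },
        method="POST",
    )

def build_put_blob_request(blob_sas_url, payload):
    # Put Blob operation: a PUT to the blob's SAS URL with the BlockBlob type header.
    return urllib.request.Request(
        blob_sas_url,
        data=payload.encode("utf-8"),
        headers={"x-ms-blob-type": "BlockBlob", "Content-Type": "application/json"},
        method="PUT",
    )

def run(workspace_id, token, kql, blob_sas_url):
    # Run the query, then store the raw JSON result as a blob.
    with urllib.request.urlopen(build_query_request(workspace_id, token, kql)) as resp:
        result = resp.read().decode("utf-8")
    with urllib.request.urlopen(build_put_blob_request(blob_sas_url, result)) as resp:
        return resp.status
```

In a real Function App you would acquire the token via managed identity and wire the HTTP trigger around `run`.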
If you don't want to store the data in a storage account and would rather consume the data in the "AppServiceConsoleLogs" table directly:
- Call the Log Analytics REST API from Databricks (see this blog post on calling a REST API from Databricks). If you still want to store the queried data, you can create the blobs in the storage account directly using the container URLs, as described in this blog post.
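As a sketch of the Databricks side: the Log Analytics query response is JSON with a "tables" array, where each table carries "columns" and "rows". A small helper can flatten that into row dictionaries suitable for `spark.createDataFrame`; the Spark lines below are illustrative only.

```python
def rows_as_dicts(response: dict) -> list:
    # Flatten the first result table of a Log Analytics query response
    # into a list of {column_name: value} dictionaries.
    table = response["tables"][0]
    names = [col["name"] for col in table["columns"]]
    return [dict(zip(names, row)) for row in table["rows"]]

# In a Databricks notebook (illustrative):
# response = ...  # JSON returned by the Log Analytics REST API call
# df = spark.createDataFrame(rows_as_dicts(response))
# df.write.json("abfss://<container>@<account>.dfs.core.windows.net/app-logs/")
```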
Hope this helps, and let me know if you have any further questions on this.