Read AppServiceConsoleLogs and push to Blob Storage container/table

BalaKrishna Mahamkali 20 Reputation points
2024-11-20T09:31:14.9233333+00:00

Hello Team,

We have a Python Django-based web app. We generate custom logs for user actions using the code below.

settings.py:

```python
LOGGING = {
    "version": 1,
    "disable_existing_loggers": False,
    "formatters": {
        "formatter_ABC": {
            "format": "ABC {levelname};{message}",
            "style": "{",
        },
    },
    "handlers": {
        "console": {
            "class": "logging.StreamHandler",
        },
        "console_ABC": {
            "level": "INFO",
            "class": "logging.StreamHandler",
            "formatter": "formatter_ABC",
        },
    },
    "root": {
        "handlers": ["console"],
        "level": "WARNING",
    },
    "loggers": {
        "ABC": {
            "handlers": ["console_ABC"],
            "level": "INFO",
            "propagate": False,
        },
    },
}
```

app.py:

```python
import logging

ABC_logger = logging.getLogger("ABC")

# user action
ABC_logger.info("user action")
```

These user action logs are visible in "AppServiceConsoleLogs".

Currently, we are doing the following manual steps:

1. Log in to the Azure portal.
2. Select the proper orchestrator service depending on which environment you target (UAT: webapp-uat-test-python-django-app).
3. Select Logs in the Monitoring section.
4. Select the desired time range (one week).
5. Copy the query below in KQL mode:

```kusto
AppServiceConsoleLogs
| project TimeGenerated, ResultDescription
| where ResultDescription hasprefix "ABC INFO"
| order by TimeGenerated desc
```

6. Export the data as CSV once loaded and share it with the Qlik Sense team.

As per our security approach, we don't store the logs directly in Blob Storage.

We want to avoid this manual process and automate it as a daily job.

We have two storage accounts: one is a dedicated Blob Storage account and the other is an Azure Data Lake Storage account.

We created a container and a table in the Blob Storage account, and another ADLS container in the Data Lake Storage account.

So now I want to know how to read the AppServiceConsoleLogs logs, filter them by "ABC INFO", and push the consolidated logs to both storage accounts as a daily job.

From internal discussions, people have suggested using a triggered job or a Function App for this, but I am not sure which is the correct approach.

 

If there is an alternative approach that needs less configuration or fewer services, please let me know.

Please share example links if available.

Thanks,

Bala Krishna


Accepted answer
  1. VenkateshDodda-MSFT 23,686 Reputation points Microsoft Employee
    2024-11-21T05:52:05.7+00:00

    @BalaKrishna Mahamkali Thanks for reaching out and posting your question on Microsoft Q&A.

    Based on the shared information, I understand you want to store the output of the above KQL query as blobs in your storage accounts and then read the data from a Databricks cluster. Correct me if my understanding is wrong.

    You need to build a custom solution using Logic Apps or Function Apps, with the following actions, to meet your requirement.

    For Logic Apps:

    1. Use an HTTP trigger or, since you want to run it daily, a Recurrence (schedule) trigger.
    2. Use the Run query and list results action of the Azure Monitor Logs connector to run the above query.
    3. Perform data operations to filter or transform the data into the required format.
    4. Use the Create blob action of the Azure Blob Storage connector to create a blob inside a container in the storage account.

    For Function Apps:

    1. Create an HTTP-triggered (or timer-triggered, for a daily schedule) function and call the Log Analytics query REST API to run the above KQL query.
    2. Use an output binding (or the Blob SDK) to create a blob in the storage account; if you want to store the data in a table as well, use the Azure Tables SDK. A rough sketch is shown below.
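
    For illustration, a daily timer-triggered Python function using the azure-monitor-query and azure-storage-blob SDKs could look roughly like the sketch below. The workspace ID, storage account URL, container name, and schedule are placeholders; adjust authentication and error handling for your environment.

```python
# Sketch of a daily timer-triggered Azure Function (Python v2 programming model).
# Assumes the Function App's managed identity has "Log Analytics Reader" on the
# workspace and "Storage Blob Data Contributor" on the storage account.
import csv
import io
from datetime import datetime, timedelta, timezone

import azure.functions as func
from azure.identity import DefaultAzureCredential
from azure.monitor.query import LogsQueryClient, LogsQueryStatus
from azure.storage.blob import BlobServiceClient

app = func.FunctionApp()

WORKSPACE_ID = "<log-analytics-workspace-id>"             # placeholder
STORAGE_URL = "https://<account>.blob.core.windows.net"   # placeholder
CONTAINER = "<container-name>"                            # placeholder

KQL = """
AppServiceConsoleLogs
| project TimeGenerated, ResultDescription
| where ResultDescription hasprefix "ABC INFO"
| order by TimeGenerated desc
"""

@app.timer_trigger(schedule="0 0 1 * * *", arg_name="timer")  # every day at 01:00 UTC
def export_abc_logs(timer: func.TimerRequest) -> None:
    credential = DefaultAzureCredential()

    # 1. Run the KQL query against the Log Analytics workspace for the last day.
    logs_client = LogsQueryClient(credential)
    response = logs_client.query_workspace(WORKSPACE_ID, KQL, timespan=timedelta(days=1))
    if response.status != LogsQueryStatus.SUCCESS:
        raise RuntimeError(f"Log Analytics query failed: {response}")

    # 2. Convert the result rows to CSV in memory.
    buffer = io.StringIO()
    writer = csv.writer(buffer)
    table = response.tables[0]                 # PrimaryResult
    writer.writerow(table.columns)             # header: TimeGenerated, ResultDescription
    writer.writerows(table.rows)

    # 3. Upload the CSV as a blob (one file per day).
    blob_name = f"abc-logs/{datetime.now(timezone.utc):%Y-%m-%d}.csv"
    blob_service = BlobServiceClient(account_url=STORAGE_URL, credential=credential)
    blob_service.get_blob_client(CONTAINER, blob_name).upload_blob(
        buffer.getvalue(), overwrite=True
    )
```

    The same CSV could also be written to your Data Lake container: an ADLS Gen2 account exposes the Blob endpoint as well, so the same Blob SDK call (or the azure-storage-file-datalake SDK) can be used there.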

    If you don't want to store the data in a storage account and instead want to consume the data in the "AppServiceConsoleLogs" table directly:

    1. Call the Log Analytics REST API from Databricks (blog post on calling the REST API from Databricks). A minimal sketch of that call is shown below. If you still want to store the queried data, you can create the blobs in the storage account directly using the container URLs, as described in this blog post.
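
    For illustration, a minimal sketch of calling the Log Analytics query REST API from a Databricks notebook could look like this. The workspace ID is a placeholder, and DefaultAzureCredential is just one authentication option; a service principal with the client-credentials flow works as well.

```python
# Sketch: query AppServiceConsoleLogs via the Log Analytics REST API from Databricks.
# Assumes the identity used has "Log Analytics Reader" on the workspace.
import requests
from azure.identity import DefaultAzureCredential

WORKSPACE_ID = "<log-analytics-workspace-id>"  # placeholder

query = """
AppServiceConsoleLogs
| project TimeGenerated, ResultDescription
| where ResultDescription hasprefix "ABC INFO"
| order by TimeGenerated desc
"""

# Acquire an Entra ID (AAD) token for the Log Analytics API.
token = DefaultAzureCredential().get_token("https://api.loganalytics.io/.default").token

response = requests.post(
    f"https://api.loganalytics.io/v1/workspaces/{WORKSPACE_ID}/query",
    headers={"Authorization": f"Bearer {token}"},
    json={"query": query, "timespan": "P1D"},  # last 1 day
    timeout=60,
)
response.raise_for_status()

# The result comes back as tables of columns/rows; load it into a Spark DataFrame.
table = response.json()["tables"][0]
columns = [col["name"] for col in table["columns"]]
df = spark.createDataFrame(table["rows"], schema=columns)  # `spark` is provided by Databricks
df.show()
```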

    Hope this helps and let me know if you have any further questions on this.

