Read AppServiceConsoleLogs and push to Blob Storage container/table

BalaKrishna Mahamkali 0 Reputation points
2024-11-20T09:31:14.9233333+00:00

Hello Team,

We have a Python Django-based web app. We generate custom logs for user actions using the code below.

settings.py:

```python
LOGGING = {
    "version": 1,
    "disable_existing_loggers": False,
    "formatters": {
        "formatter_ABC": {
            "format": "ABC {levelname};{message}",
            "style": "{",
        },
    },
    "handlers": {
        "console": {
            "class": "logging.StreamHandler",
        },
        "console_ABC": {
            "level": "INFO",
            "class": "logging.StreamHandler",
            "formatter": "formatter_ABC",
        },
    },
    "root": {
        "handlers": ["console"],
        "level": "WARNING",
    },
    "loggers": {
        "ABC": {
            "handlers": ["console_ABC"],
            "level": "INFO",
            "propagate": False,
        },
    },
}
```

app.py:

```python
import logging

ABC_logger = logging.getLogger("ABC")

# user action
ABC_logger.info("user action")
```

These user action logs are visible in the AppServiceConsoleLogs table.

Currently, we follow this manual process:

1. Log in to the Azure portal.
2. Select the proper orchestrator service for the target environment (UAT: webapp-uat-test-python-django-app).
3. Select Logs in the Monitoring section.
4. Select the desired time range (one week).
5. Run the query below in KQL mode (a programmatic sketch of the same query follows these steps).
6. Once the results load, export the data as CSV and share it with the Qlik Sense team.

```kusto
AppServiceConsoleLogs
| project TimeGenerated, ResultDescription
| where ResultDescription hasprefix "ABC INFO"
| order by TimeGenerated desc
```
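For context, this is roughly how the same query could be run programmatically with the azure-monitor-query SDK — a minimal sketch only; the workspace ID placeholder and the one-day timespan are assumptions, not our actual setup:

```python
from datetime import timedelta

from azure.identity import DefaultAzureCredential
from azure.monitor.query import LogsQueryClient, LogsQueryStatus

QUERY = """
AppServiceConsoleLogs
| project TimeGenerated, ResultDescription
| where ResultDescription hasprefix "ABC INFO"
| order by TimeGenerated desc
"""

client = LogsQueryClient(DefaultAzureCredential())

# "<workspace-id>" is a placeholder for the Log Analytics workspace GUID
response = client.query_workspace(
    workspace_id="<workspace-id>",
    query=QUERY,
    timespan=timedelta(days=1),  # for a daily job: the last 24 hours
)

if response.status == LogsQueryStatus.SUCCESS:
    for table in response.tables:
        for row in table.rows:
            print(row)  # each row: [TimeGenerated, ResultDescription]
```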

As per our security approach, we do not store the logs directly in Blob Storage.

We want to avoid this manual process and automate it as a daily job.

We have two storage accounts: one is a dedicated Blob Storage account and the other is a Data Lake Storage account.

We created a container and a table in the Blob Storage account, and an ADLS container in the Data Lake Storage account.

So now I want to know how to read the AppServiceConsoleLogs logs, filter them by "ABC INFO", and push the consolidated logs to both storage accounts as a daily job.
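As I understand it, uploading the exported CSV to both accounts might look roughly like this — a sketch only; the account URLs, container names, and the csv_data variable are placeholders, not our real configuration:

```python
from datetime import date

from azure.identity import DefaultAzureCredential
from azure.storage.blob import BlobServiceClient
from azure.storage.filedatalake import DataLakeServiceClient

credential = DefaultAzureCredential()

# csv_data would be built from the Log Analytics query results above
csv_data = "TimeGenerated,ResultDescription\n"
file_name = f"abc-logs-{date.today():%Y-%m-%d}.csv"

# Upload to the Blob Storage account
blob_service = BlobServiceClient(
    account_url="https://<blob-account>.blob.core.windows.net",
    credential=credential,
)
blob_service.get_blob_client("<container>", file_name).upload_blob(
    csv_data, overwrite=True
)

# Upload the same file to the Data Lake Storage (ADLS Gen2) account
dl_service = DataLakeServiceClient(
    account_url="https://<adls-account>.dfs.core.windows.net",
    credential=credential,
)
dl_service.get_file_system_client("<adls-container>").get_file_client(
    file_name
).upload_data(csv_data, overwrite=True)
```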

From internal discussions, people suggest using a timer-triggered job or a Function App to do this.

But I am not sure which is the correct way.
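If a Function App is the right choice, I assume a timer-triggered function would tie these steps together, something like the sketch below (Python v2 programming model; the schedule and function name are hypothetical):

```python
# function_app.py - Azure Functions, Python v2 programming model
import azure.functions as func

app = func.FunctionApp()

# NCRONTAB schedule (sec min hour day month day-of-week): daily at 06:00 UTC
@app.timer_trigger(schedule="0 0 6 * * *", arg_name="timer")
def export_abc_logs(timer: func.TimerRequest) -> None:
    # 1. Query AppServiceConsoleLogs for the last 24 hours (see KQL sketch above)
    # 2. Keep rows prefixed with "ABC INFO" and serialize them to CSV
    # 3. Upload the CSV to the Blob and ADLS containers (see upload sketch above)
    ...
```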

 

If there is any alternative way with less configuration/services, please let me know.

Please share example links if any.

Thanks,

Bala Krishna
