We are looking for Spark Structured Streaming to consume data from ActiveMQ
We are looking for Spark Structured Streaming to consume data from ActiveMQ. However, we don't see any default connector or support available for this. Can you please advise whether there is any roadmap in place to support it?
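Spark Structured Streaming ships no ActiveMQ source out of the box; a common workaround is to bridge ActiveMQ into Kafka (a supported source) and read from there. A minimal PySpark sketch of that pattern, where the broker address `kafka:9092` and topic `activemq-bridge` are hypothetical names, not anything from the original question:

```python
from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("activemq-bridge-reader").getOrCreate()

# Read from the Kafka topic that an ActiveMQ->Kafka bridge publishes to.
# "kafka:9092" and "activemq-bridge" are placeholders for this sketch.
stream = (
    spark.readStream
    .format("kafka")
    .option("kafka.bootstrap.servers", "kafka:9092")
    .option("subscribe", "activemq-bridge")
    .load()
)

# Kafka delivers key/value as binary; cast the payload to a string for parsing.
messages = stream.selectExpr("CAST(value AS STRING) AS body")

query = (
    messages.writeStream
    .format("console")      # replace with the sink of choice
    .outputMode("append")
    .start()
)
query.awaitTermination()
```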
Tagging strategy implementation
Hello, when the Azure Data Lake design is being done, partitioning and tagging are key performance factors, as they ensure smooth transaction patterns. I am not quite clear on the tagging aspects: where and which components should be covered, and what needs…
Does Data Share support ADLS Gen2 with a private endpoint?
Does Azure Data Share support ADLS Gen2 that is deployed with a Private Endpoint?
Read AppServiceConsoleLogs and push to Blob Storage container/table
Hello Team, we have a Python Django-based web app. We are generating custom logs for user actions using the code below. Settings.py: LOGGING = { "version": 1, "disable_existing_loggers": False, "formatters": { …
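One way to get such custom logs into a Blob container is to write them to a file via the logging handler and upload that file with the azure-storage-blob SDK. A minimal sketch, assuming a connection string in the environment variable AZURE_STORAGE_CONNECTION_STRING and a container named app-logs (both hypothetical):

```python
import os
from azure.storage.blob import BlobServiceClient

def upload_log_file(log_path: str) -> None:
    # Connection string env var and container name are placeholders.
    service = BlobServiceClient.from_connection_string(
        os.environ["AZURE_STORAGE_CONNECTION_STRING"]
    )
    blob = service.get_blob_client(container="app-logs",
                                   blob=os.path.basename(log_path))
    with open(log_path, "rb") as fh:
        blob.upload_blob(fh, overwrite=True)

upload_log_file("/home/LogFiles/user_actions.log")  # hypothetical path
```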
Azure ML | Support for ADLS Gen2 Datastore with SAS Token Authentication
Hello, I'm currently working on an application where we're connecting various data sources, such as file shares and ADLS Gen2, to Azure Machine Learning. While we can create file-share datastores with SAS token authentication, I noticed that this option…
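For comparison, the v2 SDK does expose SAS credentials for blob datastores, which may help clarify what is and isn't currently available for ADLS Gen2. A sketch of that blob pattern, with every subscription, workspace, and storage name a placeholder:

```python
from azure.ai.ml import MLClient
from azure.ai.ml.entities import AzureBlobDatastore, SasTokenConfiguration
from azure.identity import DefaultAzureCredential

ml_client = MLClient(
    credential=DefaultAzureCredential(),
    subscription_id="<subscription-id>",      # placeholders
    resource_group_name="<resource-group>",
    workspace_name="<workspace>",
)

# SAS-based credentials are accepted for blob datastores in the v2 SDK.
store = AzureBlobDatastore(
    name="sas_blob_store",
    account_name="mystorageaccount",           # hypothetical
    container_name="mycontainer",              # hypothetical
    credentials=SasTokenConfiguration(sas_token="<sas-token>"),
)
ml_client.create_or_update(store)
```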
Getting "storage_root does not specify a URI scheme." while setting ADLS path in default metastore.
Hi Team, I am facing while setting the ADLS path for the existing default metastore in account console page in azure. Below is the image showing i have already created the container and have enabled Hierarchical namespace as well. But getting below…
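That error typically means the path was entered without a scheme; the metastore storage root has to be a full abfss:// URI including the container and account. A hedged illustration of the expected shape (container and account names hypothetical):

```python
# Bare path -> rejected with "storage_root does not specify a URI scheme."
storage_root = "mycontainer/metastore"

# Full URI with scheme, container, and account -> accepted.
storage_root = "abfss://mycontainer@myaccount.dfs.core.windows.net/metastore"
```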
Data Quality issue in Purview
Hello Team, we have configured ADLS Gen2 as a source and scanned it. For Data Quality, we did the following steps: create the governance domain and publish it; create the data product and add the tables; in the Data Quality section, add the…
Failure happened on 'destination' side. ErrorCode=DeltaInvalidCharacterInColumnName.
This exception occurred when I used a Data Factory pipeline to copy data from SQL Server to a lakehouse, but I didn't find any problems with the raw data.
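Delta tables reject column names containing characters such as spaces or ,;{}()= even when the rows themselves are clean, so this error usually points at the schema rather than the data. A minimal PySpark sketch of sanitizing names before the Delta write; the sample columns and table name are hypothetical:

```python
import re
from pyspark.sql import SparkSession, DataFrame

spark = SparkSession.builder.getOrCreate()

def sanitize_columns(df: DataFrame) -> DataFrame:
    # Replace the characters Delta disallows in column names with underscores.
    for name in df.columns:
        clean = re.sub(r"[ ,;{}()\n\t=]", "_", name)
        if clean != name:
            df = df.withColumnRenamed(name, clean)
    return df

# Stand-in for the data copied from SQL Server; "Order Id" would trip Delta.
raw_df = spark.createDataFrame([(1, "x")], ["Order Id", "Payload;Type"])
sanitize_columns(raw_df).write.format("delta").mode("overwrite") \
    .saveAsTable("lakehouse.my_table")  # table name hypothetical
```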
Error code 2011
I am testing a pipeline and introduced a repeated row in one of the files that I want to upload. I was expecting that the pipeline would run anyway, uploading the correct files and skipping the incorrect one. Instead, the entire pipeline did not work…
In MS Fabric, users are able to view data but cannot download it
In MS Fabric, I want users to be able to view data in the workspace but not be able to download it. Please provide clear steps, along with links to verify the authenticity of the solution provided.
Copy files from a SharePoint Online site to Azure Data Lake Storage
Hello, we are trying to set up a flow that will copy files from a SharePoint Online site to Azure Data Lake Storage. As per my understanding, there are two options: using ADF to pull the files as mentioned in the link below…
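Besides ADF, the copy can also be scripted against the Microsoft Graph drive API plus the ADLS SDK. A rough sketch under the assumption that an app-registration bearer token has already been acquired; every ID, token, and storage name below is a placeholder:

```python
import requests
from azure.storage.filedatalake import DataLakeServiceClient

GRAPH = "https://graph.microsoft.com/v1.0"
token = "<bearer-token-from-app-registration>"   # placeholder

# Download one file's content from the SharePoint site's default drive.
url = f"{GRAPH}/sites/<site-id>/drive/items/<item-id>/content"
resp = requests.get(url, headers={"Authorization": f"Bearer {token}"})
resp.raise_for_status()

# Upload the bytes into ADLS Gen2 (account/filesystem names hypothetical).
dls = DataLakeServiceClient(
    account_url="https://myaccount.dfs.core.windows.net",
    credential="<account-key-or-credential>",
)
file_client = dls.get_file_system_client("raw") \
                 .get_file_client("sharepoint/report.xlsx")
file_client.upload_data(resp.content, overwrite=True)
```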
504.0 GatewayTimeout & Invoking Azure function failed with HttpStatusCode - 499.
We've developed an Azure Function in Python that connects to Blob Storage, reads files, and writes into Azure Tables. It runs fine for small files (less than 100 MB). The problem is that, when…
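Long synchronous runs over large blobs tend to hit the HTTP gateway limits behind these 504/499 codes; streaming the blob in chunks (and moving the heavy work off the HTTP trigger, e.g., to a queue trigger) keeps per-call time and memory down. A minimal chunked-read sketch with azure-storage-blob, where the connection string, container, blob name, and per-chunk handler are all hypothetical:

```python
from azure.storage.blob import BlobClient

def process(chunk: bytes) -> None:
    # Placeholder for the real per-chunk work (e.g., writing rows to Azure Tables).
    print(f"got {len(chunk)} bytes")

blob = BlobClient.from_connection_string(
    conn_str="<connection-string>",       # placeholder
    container_name="input",
    blob_name="large-file.csv",
)

downloader = blob.download_blob()
for chunk in downloader.chunks():         # streams the blob piece by piece
    process(chunk)
```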
Alternative Methods for Capturing Data Lake Size in Less Time
Need assistance in capturing the size of the data lake per environment (e.g., Dev, SIT, Prod). Currently, a PowerShell script is used to fetch details, generating a CSV file for each environment with the medallion, folder, subfolder, and size. The…
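A faster alternative to per-folder PowerShell calls is to enumerate paths recursively with the Data Lake SDK and aggregate sizes in a single pass. A minimal sketch; the account URL, filesystem name, and credential are placeholders:

```python
from collections import defaultdict
from azure.storage.filedatalake import FileSystemClient

fs = FileSystemClient(
    account_url="https://myaccount.dfs.core.windows.net",  # hypothetical
    file_system_name="bronze",                             # hypothetical
    credential="<account-key-or-credential>",
)

# One recursive listing; sum file sizes by top-level folder.
sizes = defaultdict(int)
for path in fs.get_paths(recursive=True):
    if not path.is_directory:
        top_folder = path.name.split("/")[0]
        sizes[top_folder] += path.content_length or 0

for folder, total in sorted(sizes.items()):
    print(f"{folder}: {total / 1024**3:.2f} GiB")
```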
Transforming JSON files using data flow
Hello! I currently have about 60 JSON files inside a blob container, most of which have different fields and values. I have created a pipeline with a Get Metadata activity that points to the container, with the field list set to Child items. I have…
Data lake solutions
We are in the process of building a data lake, and going further down the line we are getting confused about whether to go for Delta Lake, a data lakehouse, or Synapse Analytics. The subtle nuances are not making things easier, such as "A Data Lake House merges…
Why the function could not find the file
Hi there, I built an Azure Function to process JSON data from external requests; it saved the JSON to a local file and then uploaded it to the container through the storage client. It worked well locally, but once deployed to Azure, it would prompt that…
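On Azure, the function's deployed package directory is read-only, so file writes that succeed locally fail after deployment; writing to the temp directory avoids that. A minimal sketch:

```python
import json
import os
import tempfile

def save_payload(payload: dict) -> str:
    # tempfile.gettempdir() resolves to a writable location both locally
    # and in the Azure Functions sandbox, unlike the app's own folder.
    path = os.path.join(tempfile.gettempdir(), "payload.json")
    with open(path, "w") as fh:
        json.dump(payload, fh)
    return path
```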
Data lake schema enforcement
Hello, in a data lake, data is processed or ingested as schema-on-read; that is, data is read in the format in which it comes from the source. But I read an article that says schema enforcement makes data lakes high-performance and data readable. Please…
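The article is most likely describing schema enforcement on write, as Delta Lake does: the table's schema is fixed, and appends that don't match it are rejected instead of silently polluting the lake. A small PySpark illustration of that behavior (the table path is hypothetical):

```python
from pyspark.sql import SparkSession

spark = SparkSession.builder.getOrCreate()

# Create a Delta table with a fixed (enforced) schema.
spark.createDataFrame([(1, "a")], ["id", "value"]) \
     .write.format("delta").save("/tmp/demo_table")

# An append with a mismatched schema raises an AnalysisException;
# that rejection is the "schema enforcement" the article refers to.
spark.createDataFrame([(2, "b", True)], ["id", "value", "extra"]) \
     .write.format("delta").mode("append").save("/tmp/demo_table")
```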
How to send a mail notification for a failed pipeline in Azure Synapse Analytics?
How can I send a notification email to a specific email address without using a logic app when one of my Synapse Analytics pipelines fails? I would like to include the error message in the email notification.
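One logic-app-free route is a small Azure Function that sends the mail (here over plain SMTP) and is invoked from the pipeline's on-failure path by a Web activity, with the error message passed through the request body. A minimal sketch of the function side; the SMTP host, addresses, credentials, and body fields are all hypothetical:

```python
import smtplib
from email.message import EmailMessage

import azure.functions as func

def main(req: func.HttpRequest) -> func.HttpResponse:
    body = req.get_json()                        # sent by the Web activity
    msg = EmailMessage()
    msg["Subject"] = f"Pipeline failed: {body.get('pipelineName')}"
    msg["From"] = "alerts@example.com"           # placeholders
    msg["To"] = "oncall@example.com"
    msg.set_content(body.get("errorMessage", "no error message supplied"))

    with smtplib.SMTP("smtp.example.com", 587) as smtp:
        smtp.starttls()
        smtp.login("alerts@example.com", "<app-password>")
        smtp.send_message(msg)
    return func.HttpResponse("sent", status_code=200)
```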
How can I restrict someone from downloading any data from a Fabric Lakehouse or Fabric Warehouse
As a data admin, I want to control data access for a user in my Microsoft Fabric Warehouse. The goal is to allow this user, who has a Contributor role, to view data directly in the workspace without being able to download it as a file. This scenario…
Getting an issue while uploading files to Azure Data Lake Storage
I have two applications. The first application, for the frontend, is responsible for taking files from local storage and sending them to Azure Functions via an API. The second application, for the backend, is responsible for taking the files from form-data (multipart) and…
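For the backend half, the multipart file can be read off the HTTP trigger and handed straight to the Data Lake SDK. A minimal sketch; the form field name, filesystem, account URL, and credential are placeholders, not the original setup:

```python
import azure.functions as func
from azure.storage.filedatalake import DataLakeServiceClient

def main(req: func.HttpRequest) -> func.HttpResponse:
    upload = req.files.get("file")               # form field name hypothetical
    if upload is None:
        return func.HttpResponse("no file in form-data", status_code=400)

    dls = DataLakeServiceClient(
        account_url="https://myaccount.dfs.core.windows.net",  # placeholder
        credential="<account-key-or-credential>",
    )
    file_client = dls.get_file_system_client("uploads") \
                     .get_file_client(upload.filename)
    file_client.upload_data(upload.stream.read(), overwrite=True)
    return func.HttpResponse(f"stored {upload.filename}", status_code=201)
```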