Can anyone tell me why there are temp tables in the lakehouse after a successful run of the notebook activity?
I am creating temp tables using a Copy activity. Then, after the Copy activity succeeds, I run a Notebook activity to process those temp tables. If a temp table exists, I rename it by removing the `_temp` suffix and delete the previous table. Below is…
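The rename step described above can be sketched as a small helper that turns one staging table name into the Spark SQL statements to run. This is a sketch only: the table names and the drop-then-rename ordering are assumptions based on the description, not the poster's actual notebook.

```python
def promote_temp_table(temp_name: str) -> list[str]:
    """Given a staging table like 'sales_temp', return the Spark SQL
    statements that replace 'sales' with the staged data."""
    if not temp_name.endswith("_temp"):
        raise ValueError(f"not a temp table: {temp_name}")
    final_name = temp_name[: -len("_temp")]
    return [
        f"DROP TABLE IF EXISTS {final_name}",               # delete the previous table
        f"ALTER TABLE {temp_name} RENAME TO {final_name}",  # drop the _temp suffix
    ]
```

In the notebook you would loop over the staged tables and run each statement with `spark.sql(...)`. If the notebook fails between the DROP and the RENAME, the temp table is left behind, which is one plausible explanation for leftover temp tables after a "successful" run of a retried pipeline.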
How to remove the escape character '\' in the output file of the JSON sink dataset
Good day community, I'm seeking your help in removing the escape character \ from the output of the data flow with JSON as the sink dataset. The objective is to produce a JSON output file with a row containing the value: ["all.be"] In the…
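One possible post-processing workaround, assuming the backslashes come from double-encoded JSON (an array serialized into a string, then serialized again by the sink): parsing the field one more time recovers the clean value. The function name and the double-encoding assumption are illustrative, not taken from the question.

```python
import json

def unescape_json_field(raw: str) -> str:
    # A double-encoded field such as '"[\\"all.be\\"]"' parses back
    # into the intended clean value '["all.be"]'.
    return json.loads(raw)
```

If this round-trip fixes the value, the real fix inside the data flow is to keep the column as an array/complex type rather than a pre-serialized string.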
Wildcard in ADF copy activity
Hello, is there a way with wildcard patterns in the ADF Copy activity to pick files from folders with a certain naming convention? For example, I have the following folder directory under bucket Myfolder **Main Folder** …
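For intuition, ADF's folder wildcards (`*` and `?`) behave roughly like glob patterns. The sketch below mirrors that matching with Python's `fnmatch`; the folder names and the pattern are made up for illustration, and note that `fnmatch`'s `*` also crosses `/` separators, which may be broader than ADF's per-segment matching.

```python
from fnmatch import fnmatch

# Illustrative paths under the bucket; only folders starting with "Main"
# and a 2024 subfolder should be picked up.
paths = [
    "Myfolder/MainFolder/2024-01/file.csv",
    "Myfolder/Archive/2024-01/file.csv",
]
pattern = "Myfolder/Main*/2024*"
matched = [p for p in paths if fnmatch(p, pattern)]
```

In the Copy activity itself, the equivalent would be a wildcard folder path such as `Main*` combined with a wildcard file name.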
Unable to create HDInsight On Demand Cluster in Azure Data Factory
I am getting an issue creating the on-demand HDInsight cluster in Azure Data Factory. The status is stuck at "PreparingCluster". Here's my LinkedService configuration. OS type: Linux
How do I dynamically read multiple file names?
I am trying to read the names of about 20+ CSV files and ingest the data dynamically. Below is a diagram of how I ingest data from each CSV file individually. So I have at least 20+ pipelines, one per CSV file. Below is how each CSV file's name looks: If I…
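Instead of 20+ near-identical pipelines, a single pipeline can take the file name as a parameter and derive the rest. A sketch of the naming logic, assuming a convention where the target table name is derived from the CSV file name (the convention shown is an assumption, since the actual file names are truncated above):

```python
def table_for_file(file_name: str) -> str:
    # Derive a target table name from a CSV file name,
    # e.g. "Sales-2024.csv" -> "sales_2024" (assumed convention).
    if not file_name.endswith(".csv"):
        raise ValueError(file_name)
    return file_name[:-4].lower().replace("-", "_")

files = ["Sales-2024.csv", "Customers.csv"]
tables = [table_for_file(f) for f in files]
```

In ADF terms this maps to a Get Metadata activity (childItems) feeding a ForEach, with a parameterized dataset receiving each file name.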
How to get return value from Oracle stored procedure?
Hi, I'm trying to use a Script activity to call a stored procedure in my on-prem Oracle database. I used script parameters to pass the inputs, which works fine. But when I add my output parameter to the Script activity, it is returning a…
Azure Data Factory Linked Service Connection Timeout Error with MongoDB
Experiencing an error while trying to connect to MongoDB from a linked service in Azure Data Factory: "Connection to MongoDB server is timeout. A timeout occurred after 30000ms selecting a server using CompositeServerSelector{ Selectors =…
Best way to read from MongoDB change streams from within Azure
Hello! We have an application that needs to read events from our MongoDB change stream and insert each change event into Data Lake storage. We cannot use MongoDB Triggers to call an Azure Function because we have no way of authenticating the trigger (we have been…
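The consumer side of this can be sketched as a small transform that shapes a change-stream event into a flat record before landing it in the lake. The event field names (`operationType`, `ns`, `documentKey`, `fullDocument`) follow MongoDB's change-stream document format; the target record schema here is an assumption.

```python
def to_lake_record(event: dict) -> dict:
    # Flatten a MongoDB change-stream event into a row suitable
    # for appending to Data Lake storage (assumed target schema).
    return {
        "operation": event["operationType"],
        "collection": event["ns"]["coll"],
        "document_id": str(event["documentKey"]["_id"]),
        "document": event.get("fullDocument"),  # absent for deletes
    }
```

With pymongo, the feeding loop would be `for event in collection.watch(): ...` running in a long-lived Azure host (e.g. a container or WebJob) that authenticates to both MongoDB and the lake with its own credentials, sidestepping the trigger-authentication problem.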
Extraction and Loading of OData to PostgreSQL: ADF pipeline issue with System Out Of Memory
Our requirement to perform extraction and loading of OData to PostgreSQL using an ADF pipeline is hitting System Out Of Memory exceptions randomly, and sometimes max buffer issues. For the same pipeline run, which worked and completed…
We are looking for Spark Structured Streaming to consume data from ActiveMQ
We are looking for a Spark Structured Streaming based solution to consume data from ActiveMQ. However, we don't see any default connector or support available for this. Can you please suggest whether there is any roadmap in place to support it?
Unable to copy payload from Snowflake SQL REST API to Azure Blob
Hi, I am using a Web activity to get an OAuth token, and passing the subscription key and token to a GET-method source dataset in a Copy activity. Postman gives the correct result, but I am getting the error below: "Code": 22755, "Message":…
How to use dynamic content for Compression type in a Copy activity
Hi, I created a dynamic dataset and linked service. In the Copy activity I used that dynamic dataset and want to pass the Compression type as a parameter/dynamic content, but when I click on Compression type there is a drop-down menu…
Auto update on on-premises self-hosted IR
I'm using the self-hosted IR for on-premises servers. I found that there is an auto-update setting in the IR, but the comment next to it is confusing. Can anyone tell me whether it actually performs auto updates?
Is there a way to deploy Azure Data Factory pipeline using DevOps CI/CD without using ARM template?
I want to deploy Azure Data Factory pipelines using DevOps release pipeline without using ARM template.
ADF for on-prem target
What is the recommended ETL approach from an on-prem source to an on-prem target/sink, inside an enterprise network, using ADF, without data movement to Azure/the cloud?
In ADF, cannot skip columns in Copy activity Upsert
I have a generic pipeline to upsert data into target Synapse tables. Column mapping is also dynamic (using JSON). I want to skip a few columns in the mapping, but I am getting a column count mismatch error.
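One way to avoid the count mismatch is to prune the skipped columns out of the dynamic mapping JSON before passing it to the Copy activity, so source and sink column counts stay aligned. The mapping shape below follows ADF's `TabularTranslator` format; the pruning helper and the "skip set" input are assumptions, not the poster's pipeline.

```python
def prune_mapping(mapping: dict, skip: set[str]) -> dict:
    # Drop any column pair whose source column is in the skip set,
    # keeping the TabularTranslator structure intact.
    kept = [m for m in mapping["mappings"] if m["source"]["name"] not in skip]
    return {"type": "TabularTranslator", "mappings": kept}

mapping = {
    "type": "TabularTranslator",
    "mappings": [
        {"source": {"name": "id"}, "sink": {"name": "id"}},
        {"source": {"name": "audit_ts"}, "sink": {"name": "audit_ts"}},
    ],
}
pruned = prune_mapping(mapping, {"audit_ts"})
```

The pruned JSON would then be fed to the Copy activity's mapping via dynamic content, so the skipped columns never enter the translator at all.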
I am getting Negsignal.SIGKILL in Airflow
I have multiple integrations running, but for one integration I extract a zip file of 2-3 GB and upload each extracted file to our storage, and it shows Negsignal.SIGKILL. It was running fine a month ago but now gives this error. The file is same…
Copy activity in ADF errors out while using dynamic range partitioning
Hey, I've been trying to copy some data using the Copy activity from SQL Server 16 to Snowflake through a self-hosted IR that has direct line of sight to the server. I'm trying to enable dynamic range partitioning on one of the columns so that the…
How to parametrize Email ID while sending notifications using ADF
Hi, we are sending email notifications on success and failure while executing ADF pipelines. We use these emails at multiple activities. How do we parametrize the Email ID while sending notifications using ADF? Thanks.
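One common pattern, sketched here under the assumption that the notifications go out through a Web activity calling a Logic App or similar mail endpoint: define a pipeline parameter such as `emailTo` and reference it with dynamic content in the activity body, so every notification activity reads the same parameter instead of a hard-coded address. The payload field names below are assumptions; the `@{pipeline().parameters.…}` interpolation is standard ADF expression syntax.

```json
{
    "body": {
        "value": "{\"to\":\"@{pipeline().parameters.emailTo}\",\"subject\":\"@{pipeline().parameters.mailSubject}\",\"message\":\"Pipeline @{pipeline().Pipeline} finished.\"}",
        "type": "Expression"
    }
}
```

The parameter can then be overridden per trigger or per environment, which keeps the email list out of the individual activities.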
Audit table
Hi, how do I set up an audit log table in an ADF pipeline? I know a single Copy activity can be logged using a stored procedure for failure and success, but what about when multiple Copy activities, or any other activity, fail? I have tried the below for a single copy…