How to find empty folder and empty files by using synapse
Hi, how can I find whether there are any empty folders or files in an ADLS Gen2 container using Synapse? Either SQL or a Python notebook. Thanks!
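A minimal notebook sketch for this kind of check, assuming the container can be listed with something like `mssparkutils.fs.ls` in Synapse; the directory lister is injected so the walk logic works with any listing function that yields `(name, is_dir, size)` tuples:

```python
# Sketch: find empty folders and zero-byte files given a directory lister.
# The lister is injected so the same walk works with mssparkutils.fs.ls
# in a Synapse notebook (assumption) or any local/test implementation.

def find_empty(path, list_dir):
    """Recursively collect empty folders and zero-byte files under `path`."""
    empty_dirs, empty_files = [], []
    entries = list_dir(path)
    if not entries:
        empty_dirs.append(path)
    for name, is_dir, size in entries:
        child = f"{path.rstrip('/')}/{name}"
        if is_dir:
            d, f = find_empty(child, list_dir)
            empty_dirs += d
            empty_files += f
        elif size == 0:
            empty_files.append(child)
    return empty_dirs, empty_files

# Example with an in-memory fake file system (no storage account needed):
tree = {
    "root": [("a", True, 0), ("x.csv", False, 10), ("y.csv", False, 0)],
    "root/a": [],  # empty folder
}
dirs, files = find_empty("root", lambda p: tree.get(p, []))
print(dirs, files)  # → ['root/a'] ['root/y.csv']
```

In Synapse you would wrap the real listing call (e.g. adapt each `FileInfo` returned by `mssparkutils.fs.ls` into the tuple shape above) and point `path` at the container root.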
How to use dynamic content for Compression type in a Copy activity
Hi, I created a dynamic dataset and linked service. In the Copy activity I used that dynamic dataset and want to pass a parameter, i.e. use the Compression type option as a parameter/dynamic content. When I click on Compression type there is a drop-down menu…
Synapse analytics Managed Identity authentication issue
Last year when my team tried to authenticate to dataverse from synapse workspace using managed identity, we hit an issue because we were not able to attach the synapse workspace managed identity (MSI) to the inner spark pool. With the SFIs requiring…
How to fix a dedicated SQL pool in Azure Synapse Analytics
Started executing query at Line 1 Database 'hj' on server 'resourcesynnapse3' is not currently available. Please retry the connection later. If the problem persists, contact customer support, and provide them the session tracing ID of…
Need to automate a Synapse Analytics pipeline to gather the previous day's information every day
I have a Synapse Analytics pipeline that gathers information from SharePoint and puts it into a storage account container in Azure. It is a Copy data activity. Right now, every time I need to get more data, I have to change the date on the Source /Date filter…
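A common way to remove the manual date change is to replace the hard-coded value with a pipeline expression that is evaluated at run time, then attach a daily schedule trigger. A sketch using the standard expression functions (the exact property the expression goes into depends on the source's filter field):

```
@formatDateTime(addDays(utcNow(), -1), 'yyyy-MM-dd')
```

With this in the source filter's dynamic content, each scheduled run picks up the previous day's date automatically; adjust the format string to whatever the SharePoint date filter expects.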
In Azure SQL Pool we have metrics like Failed Connections : System Errors and Failed Connections : User Errors, but how can we find out what exactly was the error?
In Azure SQL Pool we have metrics like Failed Connections : System Errors and Failed Connections : User Errors, but how can we find out what exactly was the error? How to find out which session was impacted and what error was associated with which…
How to read secrets from AKV and access an API using those secrets in Azure Synapse notebooks?
I have a use case: read secrets from an AKV in another subscription and use them to get data from an API in the same subscription. I have created a linked service for Azure Key Vault in Synapse Analytics that has the secrets for the API…
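In a Synapse notebook, the usual route is `mssparkutils.credentials.getSecret(vault, secretName, linkedServiceName)`, assuming the workspace identity has access to the vault via the linked service. A testable sketch that keeps the secret lookup and the HTTP call injectable (the vault, URL, and linked-service names below are illustrative):

```python
# Sketch: fetch a secret from Key Vault, then call an API with it.
# `get_secret` and `http_get` are injected; in Synapse (assumption) you
# could pass, e.g.:
#   get_secret = lambda name: mssparkutils.credentials.getSecret(
#       "my-vault", name, "AzureKeyVaultLinkedService")

def fetch_with_secret(url, secret_name, get_secret, http_get):
    """Read a secret and use it as a bearer token for the API call."""
    token = get_secret(secret_name)
    headers = {"Authorization": f"Bearer {token}"}
    return http_get(url, headers)

# Example with stubs (no real vault or API call):
fake_vault = {"api-key": "s3cr3t"}

def fake_http_get(url, headers):
    return {"status": 200, "auth": headers["Authorization"]}

resp = fetch_with_secret("https://api.example.com/data", "api-key",
                         fake_vault.__getitem__, fake_http_get)
print(resp)  # → {'status': 200, 'auth': 'Bearer s3cr3t'}
```

In the notebook itself, `http_get` would wrap a real HTTP client (e.g. `requests.get` with the headers passed through); injecting it keeps the secret handling separate from the transport.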
Select transformation output schema is empty after renaming columns
I've followed the example from the link: https://learn.microsoft.com/en-us/azure/data-factory/tutorial-data-flow-dynamic-columns It's exactly what I need for my project, but unfortunately, no matter what I do, the output schema of the select…
How to copy all tables of a Schema to .csv file format in respective folders
Hi, I am trying to copy all tables of schema SALES from SQL Server to .csv format in ADF, and the files for each table must be copied into its respective folder. For example, source (SQL Server tables): SALES.Customer, SALES.Address …
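A common pipeline pattern for this is Lookup → ForEach → Copy: the Lookup lists the tables in the schema, the ForEach iterates over them, and the sink dataset takes the table name as a folder parameter. A sketch, with illustrative activity and parameter names:

```
-- Lookup activity query (SQL Server source):
SELECT TABLE_NAME
FROM INFORMATION_SCHEMA.TABLES
WHERE TABLE_SCHEMA = 'SALES' AND TABLE_TYPE = 'BASE TABLE'

-- ForEach items:     @activity('LookupTables').output.value
-- Sink folder path:  @concat('SALES/', item().TABLE_NAME)
-- Sink file name:    @concat(item().TABLE_NAME, '.csv')
```

The Copy activity inside the ForEach uses a parameterized source (schema + table from `item()`) and the parameterized sink dataset above, so each table lands in its own folder.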
How to add an expression on the result of an aggregate function when creating a materialized view in Synapse
Hi there, when I create a materialized view in Synapse, I get an error saying "because the select list of the view contains an expression on result of aggregate function or grouping column. Consider removing expression on result of aggregate…
New Synapse/ADF Postgres Linked Service does not work with 'time without time zone' datatype.
The current legacy linked service driver supports tables that use the 'time without timezone' datatype. The new linked service driver gives the following error when trying to read tables with this…
Datetime2 error when querying parquet file produced from Synapse copy activity from OData Business Central Source (data seems to be changed from 0001-01-01 to 0000-12-30)
I have a pipeline running a copy activity that copies data from a Business Central OData source to a parquet file every day. Everything worked fine before 16/07/2024, but the problem started on 17/07/2024. The parquet file can no longer be queried from…
Get Pipeline Runs by Workspace API not functioning as expected
Calling the Get Pipeline Runs by Workspace API does not produce the expected results. Calling the API without any filters/sorts in the body produces a paged list of 100 pipeline runs, but as soon as any of the following are passed in the body, the result…
Dynamic Data Masking On Synapse Serverless SQL Database
Is it possible to implement dynamic data masking on an external Synapse serverless SQL pool? For context, we are implementing a delta lake and the analytics teams will access data via a serverless SQL pool.
How can I write to a Dataverse lookup using an alternate key?
Hi I want to write data to Dataverse via a Synapse Copy Data task in a pipeline. The destination tables all have alternate keys set up. So for example, I could have a Country entity, with CountryAlternateKey (e.g. ISO code) and Name, and a City with…
Identifying Unused Tables in AAS Cube Over the Last 30 Days
I have a question regarding identifying unused tables in Azure Analysis Services (AAS). In our organization, we’re using AAS, and due to limited storage, our cube refresh is failing. I’ve been tasked with finding a list of tables that haven’t been used…
How to reference an unpublished notebook in a pipeline in Synapse
Hi, I have a similar question to the post here which does not seem to be…
Dataflow debug only shows 100 rows even if the query returns >100 records. Row limit is set to 1000.
Dataflow debug only shows 100 rows in the web UI in tabular format even if the query returns >100 records, and the row limit is set to 1000. Is this the default behavior in Azure Synapse Analytics? I know this is a display-only behavior, as I am able to join…
Getting error "Login failed for user '
I have a resource group under which I have created a SQL pool and Synapse workspace. Whenever I run a query like "CREATE SCHEMA foodmart" it gives the following error: "Login failed for user '
How to save complete web activity output to ADLS Gen2 in JSON format? Azure Synapse Studio
I want to save the output of my Web activity (which hits an API and fetches data from it in JSON format) into a blob storage account container. Also, my Web activity is within a ForEach, so multiple API links are being hit and multiple files…
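One notebook-based option, sketched below: pass the Web activity's output (or re-fetch the payload) into a Synapse notebook and serialize it to JSON there. Locally this writes to a plain file path; in Synapse (assumption) you would target a mounted ADLS Gen2 path, or hand the string produced by `json.dumps` to `mssparkutils.fs.put` instead:

```python
import json

def save_json(payload, path):
    """Serialize a dict (e.g. a web activity's output) to a JSON file."""
    text = json.dumps(payload, indent=2)
    with open(path, "w", encoding="utf-8") as f:
        f.write(text)
    return text

# Example: one file per API response, mirroring the ForEach scenario.
responses = [{"id": 1, "value": "a"}, {"id": 2, "value": "b"}]
for i, resp in enumerate(responses):
    save_json(resp, f"response_{i}.json")
```

To keep files distinct across ForEach iterations, derive the file name from the loop item (as `response_{i}.json` does here), otherwise each iteration overwrites the last.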