Azure Data Factory
An Azure service for ingesting, preparing, and transforming data at scale.
I am experiencing an issue where the file sink of the Copy Data activity (`SnowflakeExportCopyCommand`) produces incomplete files when exporting data from Snowflake to Azure Blob Storage in our Azure Data Factory pipeline.
Observations:
- When exporting with the COPY command, the resulting Parquet files in Azure storage have incorrect sizes and row counts.
- We tried setting a maximum file size of 900000000, but the issue persists.

Has anyone encountered similar behavior, and are there any known solutions or workarounds?
I would appreciate any insights into possible causes or additional configurations that might resolve this.