Limitations of Microsoft Fabric Data Warehouse
Applies to: ✅ SQL analytics endpoint and Warehouse in Microsoft Fabric
This article details the current limitations of data warehousing in Microsoft Fabric.
These limitations apply only to Warehouse and SQL analytics endpoint items in Fabric Synapse Data Warehouse. For limitations of SQL Database in Fabric, see Limitations in SQL Database in Microsoft Fabric (Preview).
Limitations
Current general product limitations for data warehousing in Microsoft Fabric are listed in this article, with feature-level limitations called out in the corresponding feature articles. More functionality will land incrementally, building on the current performance and concurrency capabilities. For more information on the future of Microsoft Fabric, see the Fabric Roadmap.
- Data warehousing is not supported for multiple geographies at this time.
- Currently, parquet files that are no longer needed are not removed from storage by garbage collection.
For more limitations in specific areas, see:
- Clone table
- Connectivity
- Data types in Microsoft Fabric
- Semantic models
- Delta lake logs
- Pause and resume in Fabric data warehousing
- Share your data and manage permissions
- Limitations in source control
- Statistics
- Tables
- Transactions
- Visual Query editor
Limitations of the SQL analytics endpoint
The following limitations apply to SQL analytics endpoint automatic schema generation and metadata discovery.
- Data should be in Delta Parquet format to be autodiscovered in the SQL analytics endpoint. Delta Lake is an open-source storage framework that enables building a lakehouse architecture.
- Tables with renamed columns aren't supported in the SQL analytics endpoint. Delta column mapping by name is supported, but Delta column mapping by ID is not supported. For more information, see Delta Lake features and Fabric experiences.
  - Delta column mapping in the SQL analytics endpoint is currently in preview. A sketch for checking a table's column mapping mode follows this list.
- Delta tables created outside of the `/tables` folder aren't available in the SQL analytics endpoint. If you don't see a Lakehouse table in the warehouse, check the location of the table. Only the tables that reference data in the `/tables` folder are available in the warehouse. Tables that reference data in the `/files` folder in the lake aren't exposed in the SQL analytics endpoint. As a workaround, move your data to the `/tables` folder; a sketch of this move appears after this list.
- Some columns that exist in the Spark Delta tables might not be available in the tables in the SQL analytics endpoint. Refer to the Data types article for a full list of supported data types.
- If you add a foreign key constraint between tables in the SQL analytics endpoint, you won't be able to make any further schema changes (for example, adding new columns). If you don't see the Delta Lake columns with types that should be supported in the SQL analytics endpoint, check whether a foreign key constraint might be preventing updates on the table (a query sketch follows this list).
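Because the column mapping limitation depends on how a Delta table was written, it can help to inspect the table's mapping mode. The following is a minimal PySpark sketch, assuming a Fabric notebook attached to the Lakehouse; the table name `orders` is a hypothetical placeholder, and `spark` is the session the notebook provides.

```python
# Minimal sketch: read the Delta table properties for a hypothetical table named "orders".
props = {row["key"]: row["value"] for row in spark.sql("SHOW TBLPROPERTIES orders").collect()}

# 'name' mapping is supported (currently in preview); 'id' mapping isn't discovered
# by the SQL analytics endpoint. If the property is absent, no column mapping is set.
print(props.get("delta.columnMapping.mode", "none"))
```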
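To illustrate the `/tables` requirement and its workaround, here is a minimal PySpark sketch, again assuming a Fabric notebook attached to the Lakehouse; the source path and table name are hypothetical.

```python
# Hypothetical source data sitting under /Files, which the SQL analytics endpoint
# doesn't discover.
df = spark.read.parquet("Files/raw/orders")

# Writing a managed Delta table places it under /Tables, so the SQL analytics
# endpoint can autodiscover it.
df.write.format("delta").mode("overwrite").saveAsTable("orders")

# By contrast, writing Delta files to a path under /Files keeps them invisible
# to the endpoint:
# df.write.format("delta").mode("overwrite").save("Files/delta/orders")
```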
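To check for foreign key constraints that might be blocking schema updates, you can query the standard `sys.foreign_keys` catalog view, either directly in the SQL query editor or from client code. The following is a hedged pyodbc sketch; the server, database, and authentication values are placeholders, and pyodbc plus an ODBC driver for SQL Server are assumed to be installed.

```python
# Sketch only: list foreign key constraints on the SQL analytics endpoint so you can
# tell whether one is preventing further schema changes. Connection details are placeholders.
import pyodbc

conn = pyodbc.connect(
    "Driver={ODBC Driver 18 for SQL Server};"
    "Server=<endpoint>.datawarehouse.fabric.microsoft.com;"
    "Database=<lakehouse-or-warehouse>;"
    "Authentication=ActiveDirectoryInteractive;"
)
rows = conn.execute(
    "SELECT OBJECT_NAME(parent_object_id) AS table_name, name AS constraint_name "
    "FROM sys.foreign_keys;"
).fetchall()
for table_name, constraint_name in rows:
    print(table_name, constraint_name)
```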
For information and recommendations on performance of the SQL analytics endpoint, see SQL analytics endpoint performance considerations.
Known issues
For known issues in Microsoft Fabric, visit Microsoft Fabric Known Issues.