Get started with data warehousing using Databricks SQL

If you’re a data analyst who works primarily with SQL queries and your favorite BI tools, Databricks SQL provides an intuitive environment for running ad-hoc queries and creating dashboards on data stored in your data lake. These articles can help you get started.

Note

Databricks SQL Serverless is not available in Azure China. Databricks SQL is not available in Azure Government regions.

Basic Databricks SQL concepts

To start, familiarize yourself with some basic Databricks SQL concepts. See Databricks SQL concepts.

Interact with sample dashboards

Then, learn how to import and use dashboards from the Dashboard Samples Gallery and explore the queries they visualize. See Tutorial: Use sample dashboards.

Visualize queries and create a dashboard

Next, create visualizations from your queries and combine them into a dashboard that you can share. See Dashboards.
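
For example, a query like the following could back a bar-chart visualization on a dashboard. This is a minimal sketch that assumes the samples catalog available in many Databricks workspaces (samples.nyctaxi.trips); substitute your own table if that catalog is not present.

-- Aggregate the sample NYC taxi data by pickup ZIP code; suitable for a bar chart.
SELECT
  pickup_zip,
  COUNT(*) AS trip_count,
  ROUND(AVG(fare_amount), 2) AS avg_fare,
  ROUND(AVG(trip_distance), 2) AS avg_distance
FROM samples.nyctaxi.trips
GROUP BY pickup_zip
ORDER BY trip_count DESC
LIMIT 20;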

Use Databricks SQL in an Azure Databricks job

Next, use the SQL task type in an Azure Databricks job to create, schedule, operate, and monitor workflows that include Databricks SQL objects such as queries, legacy dashboards, and alerts. See SQL task for jobs.

Use Databricks SQL with a notebook

You can also attach a notebook to a SQL warehouse. See Notebooks and SQL warehouses for more information and limitations.
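
A notebook attached to a SQL warehouse runs SQL cells. As a minimal sketch, again assuming the samples catalog is available in your workspace, a notebook cell might look like this:

-- Monthly trip counts from the sample NYC taxi data, run as a notebook SQL cell.
SELECT
  date_trunc('MONTH', tpep_pickup_datetime) AS month,
  COUNT(*) AS trips
FROM samples.nyctaxi.trips
GROUP BY 1
ORDER BY 1;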

Use COPY INTO to load data

Next, learn how to load data into a table with the COPY INTO command in Databricks SQL. See Tutorial: Use COPY INTO with Databricks SQL.
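
The following is a minimal sketch of COPY INTO; the catalog, schema, table name, and source path are placeholders to replace with your own. It creates an empty Delta table and loads CSV files into it, merging the inferred schema into the table.

-- The table and source path below are placeholders; replace them with your own.
CREATE TABLE IF NOT EXISTS my_catalog.my_schema.sales_raw;

-- Load CSV files idempotently; files that were already loaded are skipped on re-run.
COPY INTO my_catalog.my_schema.sales_raw
FROM 'abfss://container@account.dfs.core.windows.net/raw/sales/'
FILEFORMAT = CSV
FORMAT_OPTIONS ('header' = 'true', 'inferSchema' = 'true')
COPY_OPTIONS ('mergeSchema' = 'true');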

Create a SQL warehouse

A SQL warehouse is the compute resource that runs your queries in Databricks SQL. To create one, see Configure SQL warehouse.

Work with technology partners

You can also connect your Azure Databricks workspace to a BI and visualization partner solution using Partner Connect. See Connect to BI partners using Partner Connect.