What is Databricks Apps?

Important

Databricks Apps is in Public Preview.

Databricks Apps lets developers create secure data and AI applications on the Databricks platform and share those apps with users.

Previously, creating data and AI applications that use data managed by a Databricks workspace and the data analytics features of the Databricks platform required deploying separate infrastructure to host the applications, ensuring compliance with data governance controls, and managing application security, including authentication and authorization. With Databricks Apps, Databricks hosts your apps, so you don’t need to configure or deploy additional infrastructure.

Your apps can use the resources and features of the Databricks platform, including Unity Catalog for governance, Databricks SQL to query data, AI features such as model serving, Databricks Jobs for ETL, and the already configured security rules in the workspace, including the rules that control access to the data used by your app. Authentication and authorization use existing Azure Databricks functionality, including OAuth and service principals.

Databricks designed Databricks Apps for developers. You develop your apps in Python using frameworks such as Dash, Streamlit, or Gradio. Because your apps are portable, you can create and debug them locally, deploy them to a Databricks workspace, and then move them to another workspace.

Requirements

Workspace requirements

To deploy and run apps in your Azure Databricks workspace, the workspace must meet the following requirements:

  • You must ensure that your firewall does not block the domain *.databricksapps.com.
  • Your Databricks workspace must be in a supported region. See serverless feature availability.

Development environment requirements

To create apps locally and deploy those apps to your Azure Databricks workspace, your development environment must meet the following requirements:

  • Python 3.11 or above.

  • The Databricks command-line interface (Databricks CLI), version 0.229.0 or above, configured to access your Databricks workspace. To install or update and configure the Databricks CLI, see Install or update the Databricks CLI and Authentication for the Databricks CLI.

  • The Databricks SDK for Python. You can install the SDK with pip3:

    pip3 install databricks-sdk

    See Databricks SDK for Python.

  • (Optional) If your app needs to access Databricks SQL, install the Databricks SQL Connector for Python. You can install the connector with pip3:

    pip3 install databricks-sql-connector
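
    Once installed, the connector can run queries against a Databricks SQL warehouse. The sketch below assumes connection details are supplied through environment variables (the variable names are illustrative, not required by the connector), and the import is deferred into the function so the module loads even where the connector is not installed:

    ```python
    import os

    def run_query(statement: str):
        """Run a SQL statement against a Databricks SQL warehouse and return all rows.

        Connection details come from environment variables; the variable names
        here are illustrative, not required by the connector.
        """
        # Deferred import so this module can be loaded without the connector installed.
        from databricks import sql

        with sql.connect(
            server_hostname=os.environ["DATABRICKS_SERVER_HOSTNAME"],
            http_path=os.environ["DATABRICKS_HTTP_PATH"],
            access_token=os.environ["DATABRICKS_TOKEN"],
        ) as connection:
            with connection.cursor() as cursor:
                cursor.execute(statement)
                return cursor.fetchall()
    ```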

Where do I develop a Databricks app?

You can write and test apps in any IDE that supports Python, such as PyCharm, IntelliJ IDEA, or Visual Studio Code. Databricks recommends developing your apps using Visual Studio Code and the Databricks extension for Visual Studio Code, but you can also use the Databricks notebook and file editor to edit your code directly in your Databricks workspace.

How do I develop and deploy a Databricks app?

To develop an app locally, the following is a typical workflow:

  • Develop your app in your preferred IDE such as Visual Studio Code.
  • Run your app locally at the command line and view it in your browser.
  • When the code is complete and tested, move the code and required artifacts to your Databricks workspace.

See Get started with Databricks Apps.

To create an app in the UI or using a pre-built example, see How do I create an app in the Databricks Apps UI?.

Can I use Python frameworks with my Databricks app?

You can develop your app using your favorite Python frameworks, such as Dash, Streamlit, or Gradio. You can see examples that use popular Python frameworks in the Databricks Apps UI. See How do I create an app in the Databricks Apps UI?.

How does Databricks Apps manage authorization?

The Databricks Apps authorization model includes the user accessing the app and an Azure Databricks managed service principal assigned to the app:

  • To access an app, a user must have CAN_USE or CAN_MANAGE permissions on the app. To learn more about assigning permissions to an app, see Configure permissions for your Databricks app.

  • When an app is created, Databricks Apps automatically creates an Azure Databricks managed service principal and assigns that service principal to the app. This service principal has access to only the workspace that the app is created in and is used to authenticate and authorize access to resources in the workspace, such as SQL warehouses, model serving endpoints, or securable objects in Unity Catalog. All access by an app to data or other workspace resources is performed on behalf of the service principal, not the app owner or user.

    Databricks Apps automatically grants the service principal permissions to any resources assigned to the app if the user deploying the app has CAN_MANAGE permission on those resources. If access by the service principal to other resources is required, for example, tables or workspace files, an account or workspace admin must grant the service principal access to those resources. When granting access to resources, Databricks recommends following the principle of least privilege and granting the service principal only the minimal permissions required. See Manage service principals.

    You can find the service principal name on the app details page in the App resources card. The service principal name includes the app name, for example, for an app named my-hello-world-app, the service principal name is app-22ixod my-hello-world-app.
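
Inside a running app, your code typically does not handle the service principal's credentials directly. For example, with the Databricks SDK for Python, the default credential chain reads the workspace host and the service principal's OAuth credentials from the environment that Databricks Apps provides, so constructing a client needs no explicit configuration. A minimal sketch (the deferred import keeps it loadable without the SDK installed):

```python
def get_workspace_client():
    """Return a WorkspaceClient authenticated as the app's service principal.

    Under Databricks Apps, the SDK's default credential chain picks up the
    workspace host and OAuth credentials from the environment, so no
    arguments are needed here.
    """
    # Deferred import so this sketch loads without the SDK installed.
    from databricks.sdk import WorkspaceClient
    return WorkspaceClient()
```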

Who can create Databricks apps?

Any user in a workspace can create apps. However, to manage the permissions of the service principal assigned to an app, you must be an account or workspace admin.

How do I configure my Databricks app?

Databricks Apps automatically sets several environment variables your app can access, such as the Databricks host on which your app is running. You can also set custom parameters using a YAML file. See Databricks Apps configuration.
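
For example, a minimal app.yaml might override the command used to start the app and define a custom parameter; the parameter name and values below are illustrative, and the exact schema is described in Databricks Apps configuration:

```yaml
# app.yaml (placed in the app's root directory)
command: ["python", "app.py"]
env:
  - name: "GREETING"            # illustrative custom parameter
    value: "Hello from config"
```

Your app can then read the parameter at runtime, for example with os.getenv("GREETING").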

How do I integrate my Databricks app with Azure Databricks services?

Your apps can use Databricks platform features such as Databricks SQL to query data, Databricks Jobs for data ingestion and processing, Mosaic AI Model Serving to access generative AI models, and Databricks secrets to manage sensitive information. When configuring your app, these Databricks platform features are referred to as resources.

However, because apps are designed to be portable, Databricks recommends that apps do not depend on specific resources. For example, your app should not be hardcoded to use a particular SQL warehouse. Instead, configure the SQL warehouse in the Databricks Apps UI when creating or updating an app.
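
For example, rather than hardcoding a warehouse's HTTP path in your source, an app can read it from an environment variable populated from the app's configuration. The variable name below is an assumption for illustration; use whatever name you define for your app:

```python
import os

def get_warehouse_http_path() -> str:
    """Return the HTTP path of the configured SQL warehouse.

    SQL_WAREHOUSE_HTTP_PATH is an illustrative variable name; set it via
    your app's configuration rather than hardcoding a specific warehouse.
    """
    http_path = os.getenv("SQL_WAREHOUSE_HTTP_PATH")
    if not http_path:
        raise RuntimeError(
            "SQL_WAREHOUSE_HTTP_PATH is not set; configure the SQL warehouse "
            "for this app instead of hardcoding one."
        )
    return http_path
```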

Additionally, because apps are configured to run with the least required privileges, they should not create new resources. Instead, they must rely on the Databricks platform to resolve existing dependent services. Each app has a Databricks service principal assigned. During app creation or update, the service principal is granted required permissions on defined resource dependencies.

To learn more about adding Databricks platform features as app resources, see Assign Databricks platform features to a Databricks app.

Where can I find audit logs for my Databricks apps?

To find audit events for apps, query the Azure Databricks system tables.

What is the cost for Databricks Apps?

For information on the pricing for Databricks Apps, see Compute for Apps.

The Databricks Apps system environment

Note

To view the environment for a specific app, including environment variables and installed packages, go to the Environment tab on the details page for the app. See View the details for a Databricks app.

The following describes the system environment your apps run in, the resources available to your app, and the versions of installed applications and libraries.

  • Operating System: Ubuntu 22.04 LTS
  • Python: 3.11.0. Your apps run in a Python virtual environment. All dependencies are installed in this virtual environment, including automatically installed libraries and any libraries you install, for example, with a requirements.txt file.
  • System resources: Your apps can use up to two virtual CPUs (vCPU) and 6 GB of memory. Your app might be restarted if it exceeds the allocated resources.
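
To add libraries beyond the preinstalled set, or to pin specific versions, include a requirements.txt file with your app; the dependencies listed there are installed into the app's virtual environment. An example (the pins shown are illustrative):

```
# requirements.txt - example pins; the versions shown match the preinstalled set
databricks-sql-connector==3.4.0
streamlit==1.38.0
```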

Installed Python libraries

Library                      Version
databricks-sql-connector     3.4.0
databricks-sdk               0.33.0
mlflow-skinny                2.16.2
gradio                       4.44.0
streamlit                    1.38.0
shiny                        1.1.0
dash                         2.18.1
flask                        3.0.3
fastapi                      0.115.0
uvicorn[standard]            0.30.6
gunicorn                     23.0.0
dash-ag-grid                 31.2.0
dash-mantine-components      0.14.4
dash-bootstrap-components    1.6.0
plotly                       5.24.1
plotly-resampler             0.10.0

Limitations

  • There is a limit of 50 apps in a Databricks workspace.

  • Files used by an app cannot exceed 10 MB in size. If a file in your app’s directory exceeds this limit, app deployment fails with an error.

  • Databricks Apps does not meet HIPAA, PCI, or FedRAMP compliance standards.

  • Logs created by an app are not persisted when the Azure Databricks compute hosting the app is terminated. See Logging from your Databricks app.

  • Because legacy regional URLs do not support OAuth, you cannot use them with Databricks Apps.