Introduction to Databricks notebooks
Notebooks are a common tool in data science and machine learning for developing code and presenting results. In Azure Databricks, notebooks are the primary tool for creating data science and machine learning workflows and collaborating with colleagues. Databricks notebooks provide real-time coauthoring in multiple languages, automatic versioning, and built-in data visualizations.
With Azure Databricks notebooks, you can:
- Develop code using Python, SQL, Scala, and R.
- Customize your environment with the libraries of your choice.
- Create regularly scheduled jobs to automatically run tasks, including multi-notebook workflows.
- Browse and access tables and volumes.
- Export results and notebooks in .html or .ipynb format.
- Use a Git-based repository to store your notebooks with associated files and dependencies.
- Build and share dashboards.
- Develop and debug Delta Live Tables pipelines in notebooks.
- (Experimental) Use advanced editing capabilities.
Notebooks are also useful for exploratory data analysis (EDA).
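As a quick illustration of developing code and doing EDA in a notebook, the Python cell below loads a table into a Spark DataFrame, prints a few summary statistics, and renders the result with the notebook's built-in display function. The table name is a placeholder; substitute a table you can access in your workspace.

```python
# spark and display are predefined in Databricks notebooks; no imports are needed for them.

# Load a table into a Spark DataFrame (placeholder table name).
df = spark.read.table("samples.nyctaxi.trips")

# Basic exploratory checks: row count and summary statistics for numeric columns.
print(f"Row count: {df.count()}")
df.describe().show()

# display() renders an interactive table with built-in visualization and charting options.
display(df.limit(1000))
```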
How to import and run example notebooks
The Azure Databricks documentation includes many example notebooks that are intended to illustrate how to use Databricks capabilities. To import one of these notebooks into a Databricks workspace:
1. Click Copy link for import at the upper right of the notebook preview that appears on the page (for example, the MLflow autologging quickstart Python notebook).
2. In the workspace browser, navigate to the location where you want to import the notebook.
3. Right-click the folder and select Import from the menu.
4. Click the URL radio button and paste the link you just copied in the field.
5. Click Import. The notebook is imported and opens automatically in the workspace. Changes you make to the notebook are saved automatically. For information about editing notebooks in the workspace, see Develop code in Databricks notebooks.
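If you prefer to script the import instead of using the UI, the Workspace Import REST API can upload a notebook file into your workspace. The sketch below assumes you have already downloaded a notebook in .ipynb format, that your workspace URL and a personal access token are available in the environment variables DATABRICKS_HOST and DATABRICKS_TOKEN, and that the file name and target path are placeholders to adjust for your environment.

```python
import base64
import os
import requests

# Workspace URL and personal access token (assumed environment variables).
host = os.environ["DATABRICKS_HOST"]    # e.g. https://adb-1234567890123456.7.azuredatabricks.net
token = os.environ["DATABRICKS_TOKEN"]

# Read a locally downloaded notebook in Jupyter (.ipynb) format (placeholder file name).
with open("mlflow-autologging-quickstart.ipynb", "rb") as f:
    content = base64.b64encode(f.read()).decode("utf-8")

# Import the notebook into the workspace with the Workspace Import REST API.
resp = requests.post(
    f"{host}/api/2.0/workspace/import",
    headers={"Authorization": f"Bearer {token}"},
    json={
        "path": "/Users/someone@example.com/mlflow-autologging-quickstart",  # placeholder target path
        "format": "JUPYTER",
        "language": "PYTHON",
        "content": content,
        "overwrite": True,
    },
)
resp.raise_for_status()
```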
To run the notebook, click Run all at the top of the notebook. For more information about running notebooks and individual notebook cells, see Run Databricks notebooks.
To create a new, blank notebook in your workspace, see Create a notebook.
Notebook orientation
Learn about the notebook interface and controls.
Start using Databricks notebooks
- Manage notebooks: create, rename, delete, get the notebook path, configure editor settings.
- Develop and edit code in notebooks.
- Get AI-assisted coding help.
- Use the interactive debugger.
- Work with cell outputs: download results and visualizations, control display of results in the notebook.
- Run notebooks and schedule regular jobs.
- Collaborate using notebooks: share a notebook, use comments in notebooks.
- Import and export notebooks.
- Test notebooks.
- Customize the libraries for your notebook.