Summary
Notebooks are one of the most common ways for data engineers and data analysts to implement data ingestion and processing logic in Azure Databricks. Using Azure Data Factory to run notebooks in a pipeline enables you to create data processing solutions that run on demand, at scheduled intervals, or in response to a specific event.
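To make that concrete, here is a minimal sketch of a pipeline definition in Azure Data Factory's JSON authoring format, containing a single Databricks Notebook activity. The pipeline name, the linked service name, the notebook path, and the `folder` parameter are all hypothetical placeholders, not values from this module:

```json
{
  "name": "ProcessDataPipeline",
  "properties": {
    "parameters": {
      "folder": { "type": "string" }
    },
    "activities": [
      {
        "name": "RunProcessingNotebook",
        "type": "DatabricksNotebook",
        "linkedServiceName": {
          "referenceName": "AzureDatabricksLinkedService",
          "type": "LinkedServiceReference"
        },
        "typeProperties": {
          "notebookPath": "/Shared/process-data",
          "baseParameters": {
            "folder": "@pipeline().parameters.folder"
          }
        }
      }
    ]
  }
}
```

The `baseParameters` property is how a pipeline passes values into the notebook; here the pipeline's own `folder` parameter is forwarded using an ADF expression.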
In this module, you learned how to:
- Describe how Azure Databricks notebooks can be run in a pipeline.
- Create an Azure Data Factory linked service for Azure Databricks (see the linked service sketch after this list).
- Use a Notebook activity in a pipeline.
- Pass parameters to a notebook (the notebook-side sketch after this list shows how those values are read).
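For the linked service step, a minimal sketch of an Azure Databricks linked service that runs notebooks on an existing cluster might look like the following. The workspace domain, the Key Vault linked service, the secret name, and the cluster ID are placeholder values you would replace with your own:

```json
{
  "name": "AzureDatabricksLinkedService",
  "properties": {
    "type": "AzureDatabricks",
    "typeProperties": {
      "domain": "https://adb-1234567890123456.7.azuredatabricks.net",
      "accessToken": {
        "type": "AzureKeyVaultSecret",
        "store": {
          "referenceName": "KeyVaultLinkedService",
          "type": "LinkedServiceReference"
        },
        "secretName": "databricks-access-token"
      },
      "existingClusterId": "0123-456789-abcdefgh"
    }
  }
}
```

Storing the access token in Azure Key Vault rather than inline keeps the credential out of the pipeline definition itself.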
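On the notebook side, values passed through `baseParameters` surface as notebook widgets. A minimal Python sketch, assuming the hypothetical `folder` parameter and data path from the pipeline example above:

```python
# Declare the widget with a default so the notebook also runs interactively;
# when triggered from the pipeline, the baseParameters value takes effect.
dbutils.widgets.text("folder", "data")
folder = dbutils.widgets.get("folder")

# Hypothetical processing step: load raw CSV files from the given folder.
df = spark.read.format("csv").option("header", "true").load(f"/mnt/raw/{folder}")

processed_count = df.count()

# Return a string result to the pipeline; it appears in the activity
# output as runOutput.
dbutils.notebook.exit(str(processed_count))
```

A downstream pipeline activity could then reference the returned value with an expression such as `@activity('RunProcessingNotebook').output.runOutput`.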