Delta Live Tables pipeline task for jobs

This article describes how you can schedule triggered Delta Live Tables pipelines to run as a task in a Databricks job.

Configure a Delta Live Tables pipeline task with the Jobs UI

A Delta Live Tables pipeline manages its source code and compute configuration in the pipeline definition, so the job task only needs to reference an existing pipeline.

To add a pipeline to a job, create a new task, give it a name, and select Delta Live Tables pipeline as the Type.

In the Pipeline drop-down menu, select an existing Delta Live Tables pipeline.

You can optionally trigger a full refresh of the Delta Live Tables pipeline, which reprocesses all data instead of performing an incremental update.
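The UI steps above can also be expressed programmatically. The following sketch shows a Jobs API 2.1 task payload that runs an existing triggered pipeline; the task key and pipeline ID are placeholder values, not taken from this article.

```python
# Hypothetical sketch: a Jobs API 2.1 task that runs a Delta Live Tables pipeline.
# "pipeline_id" must be the ID of an existing *triggered* pipeline in your workspace.
pipeline_task = {
    "task_key": "dlt_refresh",  # placeholder task name
    "pipeline_task": {
        "pipeline_id": "1234-example-pipeline-id",  # replace with your pipeline's ID
        "full_refresh": False,  # set True to reprocess all data in the pipeline
    },
}

# This task would be included in a job definition, e.g. sent to
# POST /api/2.1/jobs/create as {"name": "...", "tasks": [pipeline_task]}.
```

The same structure applies whether the job is created through the UI, the REST API, or the Databricks SDK; only the pipeline reference and the optional full-refresh flag are task-specific.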

Important

You can use only triggered pipelines with the Pipeline task. Continuous pipelines are not supported as a job task. To learn more about triggered and continuous pipelines, see Triggered vs. continuous pipeline mode.

Schedule a pipeline with the pipeline UI

Adding a schedule to a pipeline creates a job with a single pipeline task. You can only configure time-based schedule triggers using this UI. For more advanced triggering options, see Configure a Delta Live Tables pipeline task with the Jobs UI.

Configure a pipeline task in a scheduled job using the pipeline UI by completing the following steps:

  1. Click Delta Live Tables in the sidebar.
  2. Click the pipeline name. The pipeline UI appears.
  3. Click Schedule.
    • If no schedule exists for the pipeline, the New schedule dialog appears.
    • If one or more schedules already exist, click Add schedule.
  4. Enter a unique name for the job in the Job name field.
  5. (Optional) Update the schedule frequency.
    • Select Advanced for additional options, including cron syntax.
  6. (Optional) Under More options, configure one or more email addresses to receive alerts on pipeline start, success, or failure.
  7. Click Create.
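Because adding a schedule creates a job with a single pipeline task, the result can be represented with the Jobs API's schedule object. The sketch below shows one such schedule; the cron expression and names are illustrative, and note that Databricks schedules use Quartz cron syntax, which includes a leading seconds field.

```python
# Hypothetical sketch: the schedule the pipeline UI creates, expressed as a
# Jobs API cron schedule object. Quartz syntax: seconds minutes hours
# day-of-month month day-of-week.
schedule = {
    "quartz_cron_expression": "0 0 6 * * ?",  # every day at 06:00
    "timezone_id": "UTC",
    "pause_status": "UNPAUSED",  # "PAUSED" keeps the schedule without running it
}
```

Selecting Advanced in the dialog exposes this cron expression directly, so schedules created in the UI and via the API stay interchangeable.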

Note

If the pipeline is included in one or more scheduled jobs, the Schedule button shows the number of existing schedules, for example, Schedule (5).

Add a schedule to a materialized view or streaming table in Databricks SQL

Materialized views and streaming tables defined in Databricks SQL support time-based refresh schedules specified with the SCHEDULE clause in CREATE or ALTER statements.
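As a hedged illustration of such a schedule, the snippet below builds a CREATE MATERIALIZED VIEW statement with a SCHEDULE clause and would run it via `spark.sql` in a Databricks notebook. The view name, source table, and cron expression are hypothetical, and the exact SCHEDULE syntax should be checked against the linked references.

```python
# Hypothetical sketch: a materialized view refreshed on a time-based schedule.
# The schedule uses Quartz cron syntax (leading seconds field).
ddl = """
CREATE OR REPLACE MATERIALIZED VIEW daily_sales_mv
SCHEDULE CRON '0 0 2 * * ?'   -- refresh daily at 02:00
AS SELECT order_date, SUM(amount) AS total_amount
   FROM sales
   GROUP BY order_date
"""

# In a Databricks environment, `spark` is provided by the runtime:
# spark.sql(ddl)
```

Running the statement creates both the view and an underlying refresh schedule, so no separate job configuration is needed.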

For details, see the following articles: