Bundle configuration examples
This article provides example configuration for Databricks Asset Bundles features and common bundle use cases.
Tip
Some of the examples in this article, as well as many others, can be found in the bundle-examples repository.
Job that uses serverless compute
Databricks Asset Bundles support jobs that run on serverless compute. To configure this, you can either omit the clusters setting for a job or specify an environment, as shown in the following examples.
# A serverless job (no cluster definition)
resources:
  jobs:
    serverless_job_no_cluster:
      name: serverless_job_no_cluster
      email_notifications:
        on_failure:
          - someone@example.com
      tasks:
        - task_key: notebook_task
          notebook_task:
            notebook_path: ../src/notebook.ipynb
# A serverless job (environment spec)
resources:
  jobs:
    serverless_job_environment:
      name: serverless_job_environment
      tasks:
        - task_key: task
          spark_python_task:
            python_file: ../src/main.py
          # The key that references an environment spec in a job.
          environment_key: default
      # A list of task execution environment specifications that can be referenced by tasks of this job.
      environments:
        - environment_key: default
          # Full documentation of this spec can be found at:
          # https://docs.databricks.com/api/workspace/jobs/create#environments-spec
          spec:
            client: "1"
            dependencies:
              - cowsay
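The dependencies list in an environment spec accepts standard pip requirement specifiers, so versions can be pinned for reproducible runs. A minimal sketch, pinning the package from the example above (the version shown is an illustrative assumption):
      environments:
        - environment_key: default
          spec:
            client: "1"
            dependencies:
              # Pip-style specifier; the exact version is illustrative
              - cowsay==6.1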
Pipeline that uses serverless compute
Databricks Asset Bundles support pipelines that run on serverless compute. To configure this, set the pipeline serverless setting to true. The following example configuration defines a pipeline that runs on serverless compute and a job that triggers a refresh of the pipeline every hour.
# A pipeline that runs on serverless compute
resources:
  pipelines:
    my_pipeline:
      name: my_pipeline
      target: ${bundle.environment}
      serverless: true
      catalog: users
      libraries:
        - notebook:
            path: ../src/my_pipeline.ipynb
      configuration:
        bundle.sourcePath: /Workspace/${workspace.file_path}/src
# This defines a job to refresh a pipeline that is triggered every hour
resources:
  jobs:
    my_job:
      name: my_job
      # Run this job once an hour.
      trigger:
        periodic:
          interval: 1
          unit: HOURS
      email_notifications:
        on_failure:
          - someone@example.com
      tasks:
        - task_key: refresh_pipeline
          pipeline_task:
            pipeline_id: ${resources.pipelines.my_pipeline.id}
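If the refresh should run at fixed times rather than on a rolling interval, a job can use a cron-based schedule in place of the periodic trigger. A minimal sketch of an equivalent hourly schedule (the timezone choice is an assumption):
      # Alternative to the periodic trigger: fire at the top of every hour
      schedule:
        quartz_cron_expression: "0 0 * * * ?"
        timezone_id: UTC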
Job that uses a SQL notebook
The following example configuration defines a job that uses a SQL notebook.
resources:
  jobs:
    job_with_sql_notebook:
      name: Job to demonstrate using a SQL notebook with a SQL warehouse
      tasks:
        - task_key: notebook
          notebook_task:
            notebook_path: ./select.sql
            warehouse_id: 799f096837fzzzz4
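Rather than hard-coding the warehouse ID, you can declare it as a bundle variable and supply the value per target or at deployment time. A minimal sketch, using a hypothetical variable named warehouse_id:
variables:
  warehouse_id:
    description: The ID of the SQL warehouse that runs the notebook

resources:
  jobs:
    job_with_sql_notebook:
      name: Job to demonstrate using a SQL notebook with a SQL warehouse
      tasks:
        - task_key: notebook
          notebook_task:
            notebook_path: ./select.sql
            # Resolved from the bundle variable above
            warehouse_id: ${var.warehouse_id}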
Job with multiple wheel files
The following example configuration defines a bundle that contains a job with multiple *.whl files.
# job.yml
resources:
  jobs:
    example_job:
      name: "Example with multiple wheels"
      tasks:
        - task_key: task
          spark_python_task:
            python_file: ../src/call_wheel.py
          libraries:
            - whl: ../my_custom_wheel1/dist/*.whl
            - whl: ../my_custom_wheel2/dist/*.whl
          new_cluster:
            node_type_id: i3.xlarge
            num_workers: 0
            spark_version: 14.3.x-scala2.12
            spark_conf:
              "spark.databricks.cluster.profile": "singleNode"
              "spark.master": "local[*, 4]"
            custom_tags:
              "ResourceClass": "SingleNode"
# databricks.yml
bundle:
  name: job_with_multiple_wheels

include:
  - ./resources/job.yml

workspace:
  host: https://myworkspace.cloud.databricks.com

artifacts:
  my_custom_wheel1:
    type: whl
    build: poetry build
    path: ./my_custom_wheel1
  my_custom_wheel2:
    type: whl
    build: poetry build
    path: ./my_custom_wheel2

targets:
  dev:
    default: true
    mode: development
Job that uses a requirements.txt file
The following example configuration defines a job that uses a requirements.txt file.
resources:
  jobs:
    job_with_requirements_txt:
      name: Example job that uses a requirements.txt file
      tasks:
        - task_key: task
          job_cluster_key: default
          spark_python_task:
            python_file: ../src/main.py
          libraries:
            - requirements: /Workspace/${workspace.file_path}/requirements.txt
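The task above refers to a job cluster through job_cluster_key, but the example does not show that cluster's definition. A minimal sketch of what the referenced definition could look like within the same job (node type, Spark version, and worker count are illustrative assumptions):
resources:
  jobs:
    job_with_requirements_txt:
      # ... tasks as above ...
      job_clusters:
        - job_cluster_key: default
          new_cluster:
            spark_version: 14.3.x-scala2.12
            node_type_id: i3.xlarge
            num_workers: 1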
Bundle that uploads a JAR file to Unity Catalog
You can specify a Unity Catalog volume as the artifact path so that all artifacts, such as JAR files and wheel files, are uploaded to the Unity Catalog volume. The following example bundle builds and uploads a JAR file to Unity Catalog. For information about the artifact_path mapping, see artifact_path.
bundle:
  name: jar-bundle

workspace:
  host: https://myworkspace.cloud.databricks.com
  artifact_path: /Volumes/main/default/my_volume

artifacts:
  my_java_code:
    path: ./sample-java
    build: "javac PrintArgs.java && jar cvfm PrintArgs.jar META-INF/MANIFEST.MF PrintArgs.class"
    files:
      - source: ./sample-java/PrintArgs.jar

resources:
  jobs:
    jar_job:
      name: "Spark Jar Job"
      tasks:
        - task_key: SparkJarTask
          new_cluster:
            num_workers: 1
            spark_version: "14.3.x-scala2.12"
            node_type_id: "i3.xlarge"
          spark_jar_task:
            main_class_name: PrintArgs
          libraries:
            - jar: ./sample-java/PrintArgs.jar
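Because the PrintArgs main class prints the arguments it receives, you can pass arguments to it through the parameters field of the JAR task. A minimal sketch (the argument values are illustrative):
          spark_jar_task:
            main_class_name: PrintArgs
            # Arguments passed to the main method of PrintArgs
            parameters:
              - "Hello"
              - "World!"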