SparkJobEntry Class
Entry for a Spark job.
- Inheritance: azure.ai.ml.entities._mixins.RestTranslatableMixin → SparkJobEntry
Constructor
SparkJobEntry(*, entry: str, type: str = 'SparkJobPythonEntry')
Keyword-Only Parameters
Name | Description
---|---
entry | The file or class entry point.
type | The entry type. Accepted values are SparkJobEntryType.SPARK_JOB_FILE_ENTRY and SparkJobEntryType.SPARK_JOB_CLASS_ENTRY. Defaults to SparkJobEntryType.SPARK_JOB_FILE_ENTRY (the string "SparkJobPythonEntry").
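For reference, here is a minimal sketch of constructing a SparkJobEntry directly with each accepted entry type. The import path follows the class's documented module, and the entry values ("main.py", "org.example.MainApp") are placeholders, not names from this page.

```python
from azure.ai.ml.entities import SparkJobEntry, SparkJobEntryType

# File entry point (the default type): entry names a script in the job's code folder.
# "main.py" is a placeholder file name.
file_entry = SparkJobEntry(
    entry="main.py",
    type=SparkJobEntryType.SPARK_JOB_FILE_ENTRY,
)

# Class entry point: entry names a main class instead of a file.
# "org.example.MainApp" is a placeholder class name.
class_entry = SparkJobEntry(
    entry="org.example.MainApp",
    type=SparkJobEntryType.SPARK_JOB_CLASS_ENTRY,
)
```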
Examples
Creating a SparkComponent.
```python
from azure.ai.ml.entities import SparkComponent

component = SparkComponent(
    name="add_greeting_column_spark_component",
    display_name="Aml Spark add greeting column test module",
    description="Aml Spark add greeting column test module",
    version="1",
    inputs={
        "file_input": {"type": "uri_file", "mode": "direct"},
    },
    driver_cores=2,
    driver_memory="1g",
    executor_cores=1,
    executor_memory="1g",
    executor_instances=1,
    code="./src",
    entry={"file": "add_greeting_column.py"},
    py_files=["utils.zip"],
    files=["my_files.txt"],
    args="--file_input ${{inputs.file_input}}",
    base_path="./sdk/ml/azure-ai-ml/tests/test_configs/dsl_pipeline/spark_job_in_pipeline",
)
```
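The entry argument above is passed as a dict. Assuming SparkComponent also accepts a SparkJobEntry instance (the {"file": ...} dict corresponds to a file entry), the same entry point can be written explicitly; a trimmed-down sketch:

```python
from azure.ai.ml.entities import SparkComponent, SparkJobEntry, SparkJobEntryType

# Assumption: SparkComponent's entry parameter also accepts a SparkJobEntry
# instance; the dict form {"file": "add_greeting_column.py"} is shorthand
# for a file entry of the same script.
component = SparkComponent(
    name="add_greeting_column_spark_component",
    code="./src",
    entry=SparkJobEntry(
        entry="add_greeting_column.py",
        type=SparkJobEntryType.SPARK_JOB_FILE_ENTRY,
    ),
    driver_cores=2,
    driver_memory="1g",
    executor_cores=1,
    executor_memory="1g",
    executor_instances=1,
)
```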