az synapse spark job

Manage Synapse Spark batch jobs.

Commands

Name                           Description               Type   Status
az synapse spark job cancel    Cancel a Spark job.       Core   GA
az synapse spark job list      List all Spark jobs.      Core   GA
az synapse spark job show      Get a Spark job.          Core   GA
az synapse spark job submit    Submit a Spark job.       Core   GA

az synapse spark job cancel

Cancel a Spark job.

az synapse spark job cancel --livy-id
                            --spark-pool-name
                            --workspace-name
                            [--yes]

Examples

Cancel a Spark job.

az synapse spark job cancel --livy-id 1 --workspace-name testsynapseworkspace --spark-pool-name testsparkpool

Required Parameters

--livy-id

The id of the Spark job.

--spark-pool-name

The name of the Spark pool.

--workspace-name

The name of the workspace.

Optional Parameters

--yes -y

Do not prompt for confirmation.

Default value: False
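
To skip the confirmation prompt, pass --yes (a sketch; the Livy ID, workspace, and pool names below are the same placeholders used in the example above):

az synapse spark job cancel --livy-id 1 --workspace-name testsynapseworkspace --spark-pool-name testsparkpool --yes
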
Global Parameters
--debug

Increase logging verbosity to show all debug logs.

--help -h

Show this help message and exit.

--only-show-errors

Only show errors, suppressing warnings.

--output -o

Output format.

Accepted values: json, jsonc, none, table, tsv, yaml, yamlc
Default value: json
--query

JMESPath query string. See http://jmespath.org/ for more information and examples.

--subscription

Name or ID of subscription. You can configure the default subscription using az account set -s NAME_OR_ID.

--verbose

Increase logging verbosity. Use --debug for full debug logs.

az synapse spark job list

List all Spark jobs.

az synapse spark job list --spark-pool-name
                          --workspace-name
                          [--from-index]
                          [--size]

Examples

List all Spark jobs.

az synapse spark job list --workspace-name testsynapseworkspace --spark-pool-name testsparkpool

Required Parameters

--spark-pool-name

The name of the Spark pool.

--workspace-name

The name of the workspace.

Optional Parameters

--from-index

The index from which the returned list should begin.

--size

The size of the returned list. By default it is 20, which is also the maximum.
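
For example, to retrieve the second page of ten jobs (a sketch; the workspace and pool names are the placeholders used above), --from-index and --size can be combined:

az synapse spark job list --workspace-name testsynapseworkspace --spark-pool-name testsparkpool --from-index 10 --size 10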

Global Parameters
--debug

Increase logging verbosity to show all debug logs.

--help -h

Show this help message and exit.

--only-show-errors

Only show errors, suppressing warnings.

--output -o

Output format.

Accepted values: json, jsonc, none, table, tsv, yaml, yamlc
Default value: json
--query

JMESPath query string. See http://jmespath.org/ for more information and examples.

--subscription

Name or ID of subscription. You can configure the default subscription using az account set -s NAME_OR_ID.

--verbose

Increase logging verbosity. Use --debug for full debug logs.

az synapse spark job show

Get a Spark job.

az synapse spark job show --livy-id
                          --spark-pool-name
                          --workspace-name

Examples

Get a Spark job.

az synapse spark job show --livy-id 1 --workspace-name testsynapseworkspace --spark-pool-name testsparkpool
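
Show only the state of a Spark job (a sketch combining the global --query and --output parameters; it assumes the returned job object exposes a top-level state field, as in the Livy batch response).

az synapse spark job show --livy-id 1 --workspace-name testsynapseworkspace --spark-pool-name testsparkpool --query "state" --output tsv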

Required Parameters

--livy-id

The id of the Spark job.

--spark-pool-name

The name of the Spark pool.

--workspace-name

The name of the workspace.

Global Parameters
--debug

Increase logging verbosity to show all debug logs.

--help -h

Show this help message and exit.

--only-show-errors

Only show errors, suppressing warnings.

--output -o

Output format.

Accepted values: json, jsonc, none, table, tsv, yaml, yamlc
Default value: json
--query

JMESPath query string. See http://jmespath.org/ for more information and examples.

--subscription

Name or ID of subscription. You can configure the default subscription using az account set -s NAME_OR_ID.

--verbose

Increase logging verbosity. Use --debug for full debug logs.

az synapse spark job submit

Submit a Spark job.

az synapse spark job submit --executor-size {Large, Medium, Small}
                            --executors
                            --main-definition-file
                            --name
                            --spark-pool-name
                            --workspace-name
                            [--archives]
                            [--arguments]
                            [--configuration]
                            [--language {CSharp, PySpark, Python, Scala, Spark, SparkDotNet}]
                            [--main-class-name]
                            [--python-files]
                            [--reference-files]
                            [--tags]

Examples

Submit a Java Spark job.

az synapse spark job submit --name WordCount_Java --workspace-name testsynapseworkspace \
--spark-pool-name testsparkpool \
--main-definition-file abfss://testfilesystem@testadlsgen2.dfs.core.windows.net/samples/java/wordcount/wordcount.jar \
--main-class-name WordCount \
--arguments abfss://testfilesystem@testadlsgen2.dfs.core.windows.net/samples/java/wordcount/shakespeare.txt \
abfss://testfilesystem@testadlsgen2.dfs.core.windows.net/samples/java/wordcount/result/ \
--executors 2 --executor-size Small
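
Submit a PySpark job (a sketch using the documented --language parameter; the storage account, file system, and script paths are placeholders, so substitute your own).

az synapse spark job submit --name WordCount_Python --workspace-name testsynapseworkspace \
--spark-pool-name testsparkpool \
--language PySpark \
--main-definition-file abfss://testfilesystem@testadlsgen2.dfs.core.windows.net/samples/python/wordcount/wordcount.py \
--arguments abfss://testfilesystem@testadlsgen2.dfs.core.windows.net/samples/python/wordcount/shakespeare.txt \
abfss://testfilesystem@testadlsgen2.dfs.core.windows.net/samples/python/wordcount/result/ \
--executors 2 --executor-size Small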

Required Parameters

--executor-size

The executor size.

Accepted values: Large, Medium, Small
--executors

The number of executors.

--main-definition-file

The main file used for the job.

--name -n

The Spark job name.

--spark-pool-name

The name of the Spark pool.

--workspace-name

The name of the workspace.

Optional Parameters

--archives

The array of archives.

--arguments

Optional arguments to the job (Note: please use storage URIs for file arguments).

--configuration

The configuration of the Spark job.

--language

The Spark job language.

Accepted values: CSharp, PySpark, Python, Scala, Spark, SparkDotNet
Default value: Scala
--main-class-name

The fully qualified identifier of the main class in the main definition file.

--python-files

The array of files used for reference in the main Python definition file. Examples include custom .whl files and custom Python files. Multiple files may be passed, for example: "az synapse spark job submit <other_args> --python-files abfss://file1 abfss://file2".

--reference-files

Additional files used for reference in the main definition file.

--tags

Space-separated tags: key[=value] [key[=value] ...]. Use "" to clear existing tags.
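
For example, tags can be attached at submission time (the key/value pairs below are illustrative): "az synapse spark job submit <other_args> --tags project=wordcount env=test".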

Global Parameters
--debug

Increase logging verbosity to show all debug logs.

--help -h

Show this help message and exit.

--only-show-errors

Only show errors, suppressing warnings.

--output -o

Output format.

Accepted values: json, jsonc, none, table, tsv, yaml, yamlc
Default value: json
--query

JMESPath query string. See http://jmespath.org/ for more information and examples.

--subscription

Name or ID of subscription. You can configure the default subscription using az account set -s NAME_OR_ID.

--verbose

Increase logging verbosity. Use --debug for full debug logs.