SynapseSparkJobDefinitionActivity Class

Definition

Execute spark job activity.

C#

[System.Text.Json.Serialization.JsonConverter(typeof(Azure.Analytics.Synapse.Artifacts.Models.SynapseSparkJobDefinitionActivity+SynapseSparkJobDefinitionActivityConverter))]
public class SynapseSparkJobDefinitionActivity : Azure.Analytics.Synapse.Artifacts.Models.ExecutionActivity

F#

[<System.Text.Json.Serialization.JsonConverter(typeof(Azure.Analytics.Synapse.Artifacts.Models.SynapseSparkJobDefinitionActivity+SynapseSparkJobDefinitionActivityConverter))>]
type SynapseSparkJobDefinitionActivity = class
    inherit ExecutionActivity

Visual Basic

Public Class SynapseSparkJobDefinitionActivity
Inherits ExecutionActivity
Inheritance
Object → Activity → ExecutionActivity → SynapseSparkJobDefinitionActivity

Attributes
JsonConverterAttribute

Constructors

SynapseSparkJobDefinitionActivity(String, SynapseSparkJobReference)

Initializes a new instance of SynapseSparkJobDefinitionActivity.
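A minimal construction sketch in C#, assuming the Azure.Analytics.Synapse.Artifacts package; the SparkJobReferenceType.SparkJobDefinitionReference value and the job definition name "mySparkJobDef" are illustrative assumptions, and the exact SynapseSparkJobReference constructor shape may vary by package version:

```csharp
using Azure.Analytics.Synapse.Artifacts.Models;

// Reference an existing Spark job definition by name
// ("mySparkJobDef" is a hypothetical example name).
var sparkJob = new SynapseSparkJobReference(
    SparkJobReferenceType.SparkJobDefinitionReference, "mySparkJobDef");

// The two required constructor arguments: the activity name
// and the Spark job reference.
var activity = new SynapseSparkJobDefinitionActivity("RunSparkJob", sparkJob);
```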

Properties

AdditionalProperties

Additional Properties.

(Inherited from Activity)
Arguments

User-specified arguments to SynapseSparkJobDefinitionActivity.

ClassName

The fully qualified identifier of the main class in the main definition file, which overrides the 'className' of the spark job definition you provide. Type: string (or Expression with resultType string).

Conf

Spark configuration properties, which will override the 'conf' of the spark job definition you provide.

ConfigurationType

The type of the spark config.

DependsOn

Activity depends on condition.

(Inherited from Activity)
Description

Activity description.

(Inherited from Activity)
DriverSize

Number of cores and amount of memory to be used for the driver allocated in the specified Spark pool for the job, which overrides 'driverCores' and 'driverMemory' of the spark job definition you provide. Type: string (or Expression with resultType string).

ExecutorSize

Number of cores and amount of memory to be used for the executors allocated in the specified Spark pool for the job, which overrides 'executorCores' and 'executorMemory' of the spark job definition you provide. Type: string (or Expression with resultType string).

File

The main file used for the job, which will override the 'file' of the spark job definition you provide. Type: string (or Expression with resultType string).

Files

(Deprecated. Please use pythonCodeReference and filesV2) Additional files used for reference in the main definition file, which will override the 'files' of the spark job definition you provide.

FilesV2

Additional files used for reference in the main definition file, which will override the 'jars' and 'files' of the spark job definition you provide.

LinkedServiceName

Linked service reference.

(Inherited from ExecutionActivity)
Name

Activity name.

(Inherited from Activity)
NumExecutors

Number of executors to launch for this job, which will override the 'numExecutors' of the spark job definition you provide. Type: integer (or Expression with resultType integer).

OnInactiveMarkAs

Status result of the activity when the state is set to Inactive. This property is optional; if it is not provided when the activity is inactive, the status defaults to Succeeded.

(Inherited from Activity)
Policy

Activity policy.

(Inherited from ExecutionActivity)
PythonCodeReference

Additional python code files used for reference in the main definition file, which will override the 'pyFiles' of the spark job definition you provide.

ScanFolder

Scans subfolders from the root folder of the main definition file and adds the files found as reference files. Only folders named 'jars', 'pyFiles', 'files' or 'archives' are scanned, and the folder names are case-sensitive. Type: boolean (or Expression with resultType boolean).

SparkConfig

Spark configuration property.

SparkJob

Synapse spark job reference.

State

Activity state. This property is optional; if not provided, the state defaults to Active.

(Inherited from Activity)
TargetBigDataPool

The name of the big data pool which will be used to execute the spark batch job, which will override the 'targetBigDataPool' of the spark job definition you provide.

TargetSparkConfiguration

The spark configuration of the spark job.

UserProperties

Activity user properties.

(Inherited from Activity)
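The override properties above are set after construction. A hedged sketch, assuming the same package; all names, paths and sizes are purely illustrative, and the properties accept plain values because each is typed to hold either a literal or an Expression:

```csharp
using Azure.Analytics.Synapse.Artifacts.Models;

var activity = new SynapseSparkJobDefinitionActivity(
    "RunSparkJob",
    new SynapseSparkJobReference(
        SparkJobReferenceType.SparkJobDefinitionReference, "mySparkJobDef"))
{
    // Each assignment overrides the corresponding setting of the
    // referenced Spark job definition (all values are hypothetical).
    ClassName = "com.contoso.SparkApp",
    ExecutorSize = "Small",
    DriverSize = "Small",
    NumExecutors = 2
};

// User-specified arguments passed to the job (hypothetical values).
activity.Arguments.Add("--inputPath");
activity.Arguments.Add("abfss://data@account.dfs.core.windows.net/input");
```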
