AnalyticsSynapseArtifactsModelFactory.BigDataPoolResourceInfo Method

Definition

Initializes a new instance of BigDataPoolResourceInfo.

public static Azure.Analytics.Synapse.Artifacts.Models.BigDataPoolResourceInfo BigDataPoolResourceInfo (string id = default, string name = default, string type = default, System.Collections.Generic.IDictionary<string,string> tags = default, string location = default, string provisioningState = default, Azure.Analytics.Synapse.Artifacts.Models.AutoScaleProperties autoScale = default, DateTimeOffset? creationDate = default, Azure.Analytics.Synapse.Artifacts.Models.AutoPauseProperties autoPause = default, bool? isComputeIsolationEnabled = default, bool? sessionLevelPackagesEnabled = default, int? cacheSize = default, Azure.Analytics.Synapse.Artifacts.Models.DynamicExecutorAllocation dynamicExecutorAllocation = default, string sparkEventsFolder = default, int? nodeCount = default, Azure.Analytics.Synapse.Artifacts.Models.LibraryRequirements libraryRequirements = default, System.Collections.Generic.IEnumerable<Azure.Analytics.Synapse.Artifacts.Models.LibraryInfo> customLibraries = default, Azure.Analytics.Synapse.Artifacts.Models.LibraryRequirements sparkConfigProperties = default, string sparkVersion = default, string defaultSparkLogFolder = default, Azure.Analytics.Synapse.Artifacts.Models.NodeSize? nodeSize = default, Azure.Analytics.Synapse.Artifacts.Models.NodeSizeFamily? nodeSizeFamily = default, DateTimeOffset? lastSucceededTimestamp = default);
static member BigDataPoolResourceInfo : string * string * string * System.Collections.Generic.IDictionary<string, string> * string * string * Azure.Analytics.Synapse.Artifacts.Models.AutoScaleProperties * Nullable<DateTimeOffset> * Azure.Analytics.Synapse.Artifacts.Models.AutoPauseProperties * Nullable<bool> * Nullable<bool> * Nullable<int> * Azure.Analytics.Synapse.Artifacts.Models.DynamicExecutorAllocation * string * Nullable<int> * Azure.Analytics.Synapse.Artifacts.Models.LibraryRequirements * seq<Azure.Analytics.Synapse.Artifacts.Models.LibraryInfo> * Azure.Analytics.Synapse.Artifacts.Models.LibraryRequirements * string * string * Nullable<Azure.Analytics.Synapse.Artifacts.Models.NodeSize> * Nullable<Azure.Analytics.Synapse.Artifacts.Models.NodeSizeFamily> * Nullable<DateTimeOffset> -> Azure.Analytics.Synapse.Artifacts.Models.BigDataPoolResourceInfo
Public Shared Function BigDataPoolResourceInfo (Optional id As String = Nothing, Optional name As String = Nothing, Optional type As String = Nothing, Optional tags As IDictionary(Of String, String) = Nothing, Optional location As String = Nothing, Optional provisioningState As String = Nothing, Optional autoScale As AutoScaleProperties = Nothing, Optional creationDate As Nullable(Of DateTimeOffset) = Nothing, Optional autoPause As AutoPauseProperties = Nothing, Optional isComputeIsolationEnabled As Nullable(Of Boolean) = Nothing, Optional sessionLevelPackagesEnabled As Nullable(Of Boolean) = Nothing, Optional cacheSize As Nullable(Of Integer) = Nothing, Optional dynamicExecutorAllocation As DynamicExecutorAllocation = Nothing, Optional sparkEventsFolder As String = Nothing, Optional nodeCount As Nullable(Of Integer) = Nothing, Optional libraryRequirements As LibraryRequirements = Nothing, Optional customLibraries As IEnumerable(Of LibraryInfo) = Nothing, Optional sparkConfigProperties As LibraryRequirements = Nothing, Optional sparkVersion As String = Nothing, Optional defaultSparkLogFolder As String = Nothing, Optional nodeSize As Nullable(Of NodeSize) = Nothing, Optional nodeSizeFamily As Nullable(Of NodeSizeFamily) = Nothing, Optional lastSucceededTimestamp As Nullable(Of DateTimeOffset) = Nothing) As BigDataPoolResourceInfo

Parameters

id
String

Fully qualified resource ID for the resource. For example: /subscriptions/{subscriptionId}/resourceGroups/{resourceGroupName}/providers/{resourceProviderNamespace}/{resourceType}/{resourceName}.

name
String

The name of the resource.

type
String

The type of the resource, for example "Microsoft.Compute/virtualMachines" or "Microsoft.Storage/storageAccounts".

tags
IDictionary<String,String>

Resource tags.

location
String

The geo-location where the resource lives.

provisioningState
String

The state of the Big Data pool.

autoScale
AutoScaleProperties

Auto-scaling properties.

creationDate
Nullable<DateTimeOffset>

The time when the Big Data pool was created.

autoPause
AutoPauseProperties

Auto-pausing properties.

isComputeIsolationEnabled
Nullable<Boolean>

Whether compute isolation is required.

sessionLevelPackagesEnabled
Nullable<Boolean>

Whether session-level packages are enabled.

cacheSize
Nullable<Int32>

The cache size.

dynamicExecutorAllocation
DynamicExecutorAllocation

Dynamic executor allocation settings.

sparkEventsFolder
String

The Spark events folder.

nodeCount
Nullable<Int32>

The number of nodes in the Big Data pool.

libraryRequirements
LibraryRequirements

Library version requirements.

customLibraries
IEnumerable<LibraryInfo>

List of custom libraries/packages associated with the Spark pool.

sparkConfigProperties
LibraryRequirements

Spark configuration file to specify additional properties.

sparkVersion
String

The Apache Spark version.

defaultSparkLogFolder
String

The default folder where Spark logs will be written.

nodeSize
Nullable<NodeSize>

The level of compute power that each node in the Big Data pool has.

nodeSizeFamily
Nullable<NodeSizeFamily>

The kind of nodes that the Big Data pool provides.

lastSucceededTimestamp
Nullable<DateTimeOffset>

The time when the Big Data pool was updated successfully.

Returns

A new BigDataPoolResourceInfo instance for mocking.
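
Examples

The factory exists so that tests can build fully populated BigDataPoolResourceInfo instances without calling the service. The sketch below is illustrative only: it assumes the model exposes read-only properties that mirror the factory parameters (Name, NodeCount, SparkVersion, and so on), that the NodeSize extensible enum includes a Medium value, and it uses a hypothetical ListPoolSummaries helper standing in for your own code under test.

using Azure.Analytics.Synapse.Artifacts.Models;

// Build a model instance for a unit test; every parameter is optional,
// so only the values the test cares about need to be supplied.
BigDataPoolResourceInfo pool = AnalyticsSynapseArtifactsModelFactory.BigDataPoolResourceInfo(
    id: "/subscriptions/00000000-0000-0000-0000-000000000000/resourceGroups/rg/providers/Microsoft.Synapse/workspaces/ws/bigDataPools/pool1",
    name: "pool1",
    sparkVersion: "3.4",
    nodeCount: 10,
    nodeSize: NodeSize.Medium);

// Feed the fake model to the code under test instead of a live service response.
// ListPoolSummaries is a hypothetical method in your own project.
// var summary = ListPoolSummaries(new[] { pool });
// Assert.AreEqual("pool1", summary.Single().Name);

In a client-level test you would typically wrap the instance with Azure.Core's Response.FromValue and return it from a mocked client method; consult the BigDataPoolsClient surface for the exact method to mock.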

Applies to