AzureOpenAIPromptExecutionSettings Class

Definition

Execution settings for an AzureOpenAI completion request.

C#
[System.Text.Json.Serialization.JsonNumberHandling(System.Text.Json.Serialization.JsonNumberHandling.AllowReadingFromString)]
public sealed class AzureOpenAIPromptExecutionSettings : Microsoft.SemanticKernel.Connectors.OpenAI.OpenAIPromptExecutionSettings

F#
[<System.Text.Json.Serialization.JsonNumberHandling(System.Text.Json.Serialization.JsonNumberHandling.AllowReadingFromString)>]
type AzureOpenAIPromptExecutionSettings = class
    inherit OpenAIPromptExecutionSettings

VB
Public NotInheritable Class AzureOpenAIPromptExecutionSettings
Inherits OpenAIPromptExecutionSettings
Inheritance
Object → PromptExecutionSettings → OpenAIPromptExecutionSettings → AzureOpenAIPromptExecutionSettings

Attributes
JsonNumberHandlingAttribute

Constructors

AzureOpenAIPromptExecutionSettings()

Initializes a new instance of the AzureOpenAIPromptExecutionSettings class.
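
The constructor takes no arguments; properties are normally set through an object initializer and the settings are then passed to a prompt invocation. The sketch below is illustrative only: the deployment name, endpoint, and key are placeholders, and the AddAzureOpenAIChatCompletion registration is assumed to come from the Azure OpenAI connector package.

C#
using System;
using Microsoft.SemanticKernel;
using Microsoft.SemanticKernel.Connectors.AzureOpenAI;

// Build a kernel with an Azure OpenAI chat completion service (placeholder values).
var kernel = Kernel.CreateBuilder()
    .AddAzureOpenAIChatCompletion(
        deploymentName: "gpt-4o",
        endpoint: "https://contoso.openai.azure.com/",
        apiKey: "<api-key>")
    .Build();

// Configure the request with the parameterless constructor and an object initializer.
var settings = new AzureOpenAIPromptExecutionSettings
{
    ChatSystemPrompt = "You are a concise assistant.",
    MaxTokens = 256,
    Temperature = 0.7
};

// Pass the settings to a prompt through KernelArguments.
var result = await kernel.InvokePromptAsync(
    "Summarize the benefits of caching.",
    new KernelArguments(settings));

Console.WriteLine(result);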

Properties

Audio

Gets or sets the audio options to use for the completion when audio modality is enabled.

(Inherited from OpenAIPromptExecutionSettings)
AzureChatDataSource

An abstraction of additional settings for chat completion, see https://learn.microsoft.com/en-us/dotnet/api/azure.ai.openai.azurechatextensionsoptions. This property is compatible only with Azure OpenAI.
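
As an illustration only, the sketch below attaches an Azure AI Search index through this property. It assumes the AzureSearchChatDataSource and DataSourceAuthentication types from the Azure.AI.OpenAI.Chat namespace, which are marked experimental; member names, diagnostic IDs, and the exact shape of the data-source types can differ between SDK versions, and all values shown are placeholders.

C#
#pragma warning disable SKEXP0010, AOAI001 // both the property and the data-source types are experimental
using System;
using Azure.AI.OpenAI.Chat;
using Microsoft.SemanticKernel.Connectors.AzureOpenAI;

var settings = new AzureOpenAIPromptExecutionSettings
{
    // Ground chat completions on an Azure AI Search index (placeholder endpoint, index, and key).
    AzureChatDataSource = new AzureSearchChatDataSource
    {
        Endpoint = new Uri("https://contoso-search.search.windows.net"),
        IndexName = "hotels-index",
        Authentication = DataSourceAuthentication.FromApiKey("<search-api-key>")
    }
};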

ChatDeveloperPrompt

The developer prompt to use when generating text using a chat model. Defaults to "Assistant is a large language model."

(Inherited from OpenAIPromptExecutionSettings)
ChatSystemPrompt

The system prompt to use when generating text using a chat model. Defaults to "Assistant is a large language model."

(Inherited from OpenAIPromptExecutionSettings)
ExtensionData

Extra properties that may be included in the serialized execution settings.

(Inherited from PromptExecutionSettings)
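
For illustration, the dictionary below carries one made-up key ("my_experiment_tag") purely to show the shape of ExtensionData; entries placed here travel with the settings when they are serialized.

C#
using System.Collections.Generic;
using Microsoft.SemanticKernel.Connectors.AzureOpenAI;

var settings = new AzureOpenAIPromptExecutionSettings
{
    ExtensionData = new Dictionary<string, object>
    {
        // Hypothetical key used only to illustrate the dictionary shape.
        ["my_experiment_tag"] = "baseline-v2"
    }
};
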
FrequencyPenalty

Number between -2.0 and 2.0. Positive values penalize new tokens based on their existing frequency in the text so far, decreasing the model's likelihood to repeat the same line verbatim.

(Inherited from OpenAIPromptExecutionSettings)
FunctionChoiceBehavior

Gets or sets the behavior defining the way functions are chosen by LLM and how they are invoked by AI connectors.

(Inherited from PromptExecutionSettings)
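
A minimal sketch of enabling automatic function calling: FunctionChoiceBehavior.Auto() advertises the functions registered on the kernel and lets the connector invoke whichever ones the model selects.

C#
using Microsoft.SemanticKernel;
using Microsoft.SemanticKernel.Connectors.AzureOpenAI;

var settings = new AzureOpenAIPromptExecutionSettings
{
    // Let the model decide whether and which kernel functions to call,
    // and let the connector invoke the selected functions automatically.
    FunctionChoiceBehavior = FunctionChoiceBehavior.Auto()
};
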
IsFrozen

Gets a value that indicates whether the PromptExecutionSettings are currently modifiable.

(Inherited from PromptExecutionSettings)
Logprobs

Whether to return log probabilities of the output tokens or not. If true, returns the log probabilities of each output token returned in the content of message.

(Inherited from OpenAIPromptExecutionSettings)
MaxTokens

The maximum number of tokens to generate in the completion.

(Inherited from OpenAIPromptExecutionSettings)
Metadata

Developer-defined tags and values used for filtering completions in the OpenAI dashboard.

(Inherited from OpenAIPromptExecutionSettings)
Modalities

Gets or sets the response modalities to use for the completion.

(Inherited from OpenAIPromptExecutionSettings)
ModelId

Model identifier. This identifies the AI model these settings are configured for, e.g., gpt-4 or gpt-3.5-turbo.

(Inherited from PromptExecutionSettings)
PresencePenalty

Number between -2.0 and 2.0. Positive values penalize new tokens based on whether they appear in the text so far, increasing the model's likelihood to talk about new topics.

(Inherited from OpenAIPromptExecutionSettings)
ReasoningEffort

Gets or sets an object specifying the effort level for the model to use when generating the completion.

(Inherited from OpenAIPromptExecutionSettings)
ResponseFormat

Gets or sets the response format to use for the completion.

(Inherited from OpenAIPromptExecutionSettings)
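
A short sketch of requesting JSON output. The property is object-typed and, depending on the connector version, also accepts other shapes (for example a CLR type for structured output), so treat the assignment below as one common option rather than the only one.

C#
using Microsoft.SemanticKernel.Connectors.AzureOpenAI;

var settings = new AzureOpenAIPromptExecutionSettings
{
    // Ask the model to reply with a JSON object.
    ResponseFormat = "json_object"
};
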
Seed

If specified, the system will make a best effort to sample deterministically such that repeated requests with the same seed and parameters should return the same result. Determinism is not guaranteed.

(Inherited from OpenAIPromptExecutionSettings)
ServiceId

Service identifier. This identifies the service these settings are configured for, e.g., azure_openai_eastus, openai, ollama, huggingface, etc.

(Inherited from PromptExecutionSettings)
SetNewMaxCompletionTokensEnabled

Enabling this property enforces sending the new max_completion_tokens parameter to the Azure OpenAI API.
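
A brief sketch, assuming the connector forwards the MaxTokens value in max_completion_tokens when this switch is enabled (useful for newer models that no longer accept the legacy max_tokens parameter).

C#
using Microsoft.SemanticKernel.Connectors.AzureOpenAI;

var settings = new AzureOpenAIPromptExecutionSettings
{
    SetNewMaxCompletionTokensEnabled = true, // send max_completion_tokens instead of max_tokens
    MaxTokens = 2048                         // the output-token budget forwarded in that parameter
};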

StopSequences

Sequences where the completion will stop generating further tokens.

(Inherited from OpenAIPromptExecutionSettings)
Store

Whether or not to store the output of this chat completion request for use in the OpenAI model distillation or evals products.

(Inherited from OpenAIPromptExecutionSettings)
Temperature

Temperature controls the randomness of the completion. The higher the temperature, the more random the completion. Default is 1.0.

(Inherited from OpenAIPromptExecutionSettings)
TokenSelectionBiases

Modify the likelihood of specified tokens appearing in the completion.

(Inherited from OpenAIPromptExecutionSettings)
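
An illustrative sketch: the keys are token IDs from the model's tokenizer (the IDs below are placeholders, not real tokens) and the values range from -100 (effectively ban the token) to 100 (strongly favor it).

C#
using System.Collections.Generic;
using Microsoft.SemanticKernel.Connectors.AzureOpenAI;

var settings = new AzureOpenAIPromptExecutionSettings
{
    TokenSelectionBiases = new Dictionary<int, int>
    {
        [15496] = -100, // placeholder token ID to suppress
        [2159]  = 10    // placeholder token ID to favor slightly
    }
};
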
ToolCallBehavior

Gets or sets the behavior for how tool calls are handled.

(Inherited from OpenAIPromptExecutionSettings)
TopLogprobs

An integer specifying the number of most likely tokens to return at each token position, each with an associated log probability.

(Inherited from OpenAIPromptExecutionSettings)
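
Logprobs and TopLogprobs are typically used together, as in this small sketch.

C#
using Microsoft.SemanticKernel.Connectors.AzureOpenAI;

var settings = new AzureOpenAIPromptExecutionSettings
{
    Logprobs = true, // return log probabilities for each generated token
    TopLogprobs = 5  // also return the five most likely alternatives per position
};
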
TopP

TopP controls the diversity of the completion. The higher the TopP, the more diverse the completion. Default is 1.0.

(Inherited from OpenAIPromptExecutionSettings)
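
The sampling-related properties are usually tuned together; the values below are arbitrary examples, and a common rule of thumb is to adjust either Temperature or TopP rather than both.

C#
using Microsoft.SemanticKernel.Connectors.AzureOpenAI;

var settings = new AzureOpenAIPromptExecutionSettings
{
    Temperature = 0.2,      // lower values make output more deterministic
    TopP = 1.0,             // nucleus-sampling cutoff
    FrequencyPenalty = 0.5, // discourage verbatim repetition
    PresencePenalty = 0.0,  // leave topic drift unchanged
    Seed = 42               // best-effort reproducibility across requests
};
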
User

A unique identifier representing your end-user, which can help OpenAI monitor and detect abuse.

(Inherited from OpenAIPromptExecutionSettings)
WebSearchOptions

An object to allow models to search the web for the latest information before generating a response.

(Inherited from OpenAIPromptExecutionSettings)

Methods

Clone()

Creates a new PromptExecutionSettings object that is a copy of the current instance.

Clone<T>()

Clones the settings object.

(Inherited from OpenAIPromptExecutionSettings)
Freeze()

Makes the current PromptExecutionSettings unmodifiable and sets its IsFrozen property to true.

(Inherited from OpenAIPromptExecutionSettings)
FromExecutionSettings(PromptExecutionSettings, Nullable<Int32>)

Creates a new settings object with the values from another settings object.
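
For example, settings that arrive as the base PromptExecutionSettings type (say, read from configuration) can be converted to the Azure OpenAI-specific type; in this sketch the second argument is assumed to supply MaxTokens when the source settings do not specify one.

C#
using Microsoft.SemanticKernel;
using Microsoft.SemanticKernel.Connectors.AzureOpenAI;

PromptExecutionSettings generic = new() { ModelId = "gpt-4o" };

// Convert to the Azure OpenAI-specific settings type; 1024 is used as MaxTokens
// because the source settings did not provide one.
AzureOpenAIPromptExecutionSettings azureSettings =
    AzureOpenAIPromptExecutionSettings.FromExecutionSettings(generic, 1024);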

FromExecutionSettingsWithData(PromptExecutionSettings, Nullable<Int32>)
Obsolete.

Creates a new settings object with the values from another settings object.

ThrowIfFrozen()

Throws an InvalidOperationException if the PromptExecutionSettings are frozen.

(Inherited from PromptExecutionSettings)
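
A short sketch of the freeze/clone lifecycle: once Freeze has been called the instance rejects further changes, while Clone produces a copy that can still be modified.

C#
using System;
using Microsoft.SemanticKernel.Connectors.AzureOpenAI;

var settings = new AzureOpenAIPromptExecutionSettings { Temperature = 0.7 };

settings.Freeze();                    // make the instance read-only
Console.WriteLine(settings.IsFrozen); // True
// settings.Temperature = 0.9;        // would now throw InvalidOperationException

// Clone returns a copy of the current instance; the copy starts out modifiable.
var copy = (AzureOpenAIPromptExecutionSettings)settings.Clone();
copy.Temperature = 0.9;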

Applies to