AzureOpenAIPromptExecutionSettings Class

Definition

Execution settings for an AzureOpenAI completion request.

C#
[System.Text.Json.Serialization.JsonNumberHandling(System.Text.Json.Serialization.JsonNumberHandling.AllowReadingFromString)]
public sealed class AzureOpenAIPromptExecutionSettings : Microsoft.SemanticKernel.Connectors.OpenAI.OpenAIPromptExecutionSettings

F#
[<System.Text.Json.Serialization.JsonNumberHandling(System.Text.Json.Serialization.JsonNumberHandling.AllowReadingFromString)>]
type AzureOpenAIPromptExecutionSettings = class
    inherit OpenAIPromptExecutionSettings

Visual Basic
Public NotInheritable Class AzureOpenAIPromptExecutionSettings
Inherits OpenAIPromptExecutionSettings
Inheritance
Object → PromptExecutionSettings → OpenAIPromptExecutionSettings → AzureOpenAIPromptExecutionSettings
Attributes
JsonNumberHandlingAttribute

Constructors

AzureOpenAIPromptExecutionSettings()

Initializes a new instance of the AzureOpenAIPromptExecutionSettings class.

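The properties listed below are typically set through object initializer syntax on a new instance. A minimal sketch follows, assuming the Microsoft.SemanticKernel and Azure OpenAI connector NuGet packages are referenced; the `kernel` variable and prompt text are hypothetical placeholders, while the property names come from the reference on this page.

```csharp
using Microsoft.SemanticKernel;
using Microsoft.SemanticKernel.Connectors.AzureOpenAI;

// Configure execution settings for an Azure OpenAI chat completion request.
var settings = new AzureOpenAIPromptExecutionSettings
{
    Temperature = 0.2,       // lower temperature => less random output
    TopP = 0.9,              // nucleus sampling cutoff
    MaxTokens = 500,         // cap on tokens generated in the completion
    FrequencyPenalty = 0.0,
    StopSequences = new[] { "\n\n" },
    ChatSystemPrompt = "You are a concise assistant."
};

// The settings are passed along with the arguments when invoking a prompt:
// var result = await kernel.InvokePromptAsync(
//     "Summarize: {{$input}}",
//     new KernelArguments(settings) { ["input"] = text });
```

Because the class derives from OpenAIPromptExecutionSettings, all of the inherited properties below (Temperature, TopP, MaxTokens, and so on) are available alongside the Azure-specific AzureChatDataSource property.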
Properties

AzureChatDataSource

An abstraction of additional settings for chat completion, see https://learn.microsoft.com/en-us/dotnet/api/azure.ai.openai.azurechatextensionsoptions. This property is compatible only with Azure OpenAI.

ChatSystemPrompt

The system prompt to use when generating text using a chat model. Defaults to "Assistant is a large language model."

(Inherited from OpenAIPromptExecutionSettings)
ExtensionData

Extra properties that may be included in the serialized execution settings.

(Inherited from PromptExecutionSettings)
FrequencyPenalty

Number between -2.0 and 2.0. Positive values penalize new tokens based on their existing frequency in the text so far, decreasing the model's likelihood to repeat the same line verbatim.

(Inherited from OpenAIPromptExecutionSettings)
FunctionChoiceBehavior

Gets or sets the behavior defining the way functions are chosen by the LLM and how they are invoked by AI connectors.

(Inherited from PromptExecutionSettings)
IsFrozen

Gets a value that indicates whether the PromptExecutionSettings are currently modifiable.

(Inherited from PromptExecutionSettings)
Logprobs

Whether to return log probabilities of the output tokens or not. If true, returns the log probabilities of each output token returned in the content of the message.

(Inherited from OpenAIPromptExecutionSettings)
MaxTokens

The maximum number of tokens to generate in the completion.

(Inherited from OpenAIPromptExecutionSettings)
ModelId

Model identifier. This identifies the AI model these settings are configured for, e.g., gpt-4 or gpt-3.5-turbo.

(Inherited from PromptExecutionSettings)
PresencePenalty

Number between -2.0 and 2.0. Positive values penalize new tokens based on whether they appear in the text so far, increasing the model's likelihood to talk about new topics.

(Inherited from OpenAIPromptExecutionSettings)
ResponseFormat

Gets or sets the response format to use for the completion.

(Inherited from OpenAIPromptExecutionSettings)
Seed

If specified, the system will make a best effort to sample deterministically such that repeated requests with the same seed and parameters should return the same result. Determinism is not guaranteed.

(Inherited from OpenAIPromptExecutionSettings)
ServiceId

Service identifier. This identifies the service these settings are configured for, e.g., azure_openai_eastus, openai, ollama, or huggingface.

(Inherited from PromptExecutionSettings)
StopSequences

Sequences where the completion will stop generating further tokens.

(Inherited from OpenAIPromptExecutionSettings)
Temperature

Temperature controls the randomness of the completion. The higher the temperature, the more random the completion. Default is 1.0.

(Inherited from OpenAIPromptExecutionSettings)
TokenSelectionBiases

Modify the likelihood of specified tokens appearing in the completion.

(Inherited from OpenAIPromptExecutionSettings)
ToolCallBehavior

Gets or sets the behavior for how tool calls are handled.

(Inherited from OpenAIPromptExecutionSettings)
TopLogprobs

An integer specifying the number of most likely tokens to return at each token position, each with an associated log probability.

(Inherited from OpenAIPromptExecutionSettings)
TopP

TopP controls the diversity of the completion. The higher the TopP, the more diverse the completion. Default is 1.0.

(Inherited from OpenAIPromptExecutionSettings)
User

A unique identifier representing your end user, which can help OpenAI monitor and detect abuse.

(Inherited from OpenAIPromptExecutionSettings)

Methods

Clone()

Creates a new PromptExecutionSettings object that is a copy of the current instance.

Clone<T>()

Clone the settings object.

(Inherited from OpenAIPromptExecutionSettings)
Freeze()

Makes the current PromptExecutionSettings unmodifiable and sets its IsFrozen property to true.

(Inherited from OpenAIPromptExecutionSettings)
FromExecutionSettings(PromptExecutionSettings, Nullable<Int32>)

Create a new settings object with the values from another settings object.

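FromExecutionSettings is the usual way to obtain an Azure-typed settings object when a caller only has a base PromptExecutionSettings (for example, one deserialized from configuration). A hedged sketch follows; the dictionary keys ("temperature", "max_tokens") and the `defaultMaxTokens` parameter name are assumptions based on the connector's serialized property names, not confirmed by this page.

```csharp
using System.Collections.Generic;
using Microsoft.SemanticKernel;
using Microsoft.SemanticKernel.Connectors.AzureOpenAI;

// A generic settings object, e.g. loaded from configuration. Values that are
// not first-class properties travel in ExtensionData (see the property above).
PromptExecutionSettings generic = new()
{
    ExtensionData = new Dictionary<string, object>
    {
        ["temperature"] = 0.7,
        ["max_tokens"] = 256
    }
};

// Convert to the Azure OpenAI-specific type; the second argument supplies a
// default token limit used when the source settings do not specify one.
AzureOpenAIPromptExecutionSettings azureSettings =
    AzureOpenAIPromptExecutionSettings.FromExecutionSettings(generic, 1024);
```

If the input is already an AzureOpenAIPromptExecutionSettings instance, implementations of this pattern typically return it (or a clone) unchanged rather than rebuilding it from ExtensionData.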
FromExecutionSettingsWithData(PromptExecutionSettings, Nullable<Int32>)
Obsolete.

Create a new settings object with the values from another settings object.

ThrowIfFrozen()

Throws an InvalidOperationException if the PromptExecutionSettings are frozen.

(Inherited from PromptExecutionSettings)
