OnnxRuntimeGenAIPromptExecutionSettings Class

Definition

OnnxRuntimeGenAI Execution Settings.

public sealed class OnnxRuntimeGenAIPromptExecutionSettings : Microsoft.SemanticKernel.PromptExecutionSettings
type OnnxRuntimeGenAIPromptExecutionSettings = class
    inherit PromptExecutionSettings
Public NotInheritable Class OnnxRuntimeGenAIPromptExecutionSettings
Inherits PromptExecutionSettings
Inheritance
Object → PromptExecutionSettings → OnnxRuntimeGenAIPromptExecutionSettings

Constructors

OnnxRuntimeGenAIPromptExecutionSettings()

Initializes a new instance of the OnnxRuntimeGenAIPromptExecutionSettings class.
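
As a sketch, an instance can be created with the default constructor and configured through the properties documented below. The property names come from this page; the literal values and their exact numeric types are illustrative assumptions.

```csharp
using Microsoft.SemanticKernel.Connectors.Onnx;

// Sketch: default-constructed settings configured for beam search.
// Property names are from this reference; values and types are illustrative.
var settings = new OnnxRuntimeGenAIPromptExecutionSettings
{
    NumBeams = 4,            // beams explored during beam search
    EarlyStopping = true,    // stop when all beam candidates reach the end token
    LengthPenalty = 1.1f,    // length penalty applied to generated sequences
    NoRepeatNgramSize = 3,   // forbid repeated 3-grams in the output
    MaxTokens = 256,         // max tokens to generate, prompt included
};
```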

Properties

DiversityPenalty

The amount by which to penalize words that are common between beam groups.

DoSample

Whether to use random sampling rather than deterministic decoding.

EarlyStopping

Allows the generation to stop early if all beam candidates reach the end token

ExtensionData

Extra properties that may be included in the serialized execution settings.

(Inherited from PromptExecutionSettings)
FunctionChoiceBehavior

Gets or sets the behavior defining the way functions are chosen by LLM and how they are invoked by AI connectors.

(Inherited from PromptExecutionSettings)
IsFrozen

Gets a value that indicates whether the PromptExecutionSettings are currently modifiable.

(Inherited from PromptExecutionSettings)
LengthPenalty

Length penalty applied to generated sequences during beam search.

MaxTokens

Maximum number of tokens to generate, including the prompt.

MinTokens

Minimum number of tokens to generate, including the prompt.

ModelId

Model identifier. This identifies the AI model these settings are configured for, e.g., gpt-4, gpt-3.5-turbo.

(Inherited from PromptExecutionSettings)
NoRepeatNgramSize

If set, n-grams of this size occur at most once in the generated sequence.

NumBeams

The number of beams used during beam search.

NumReturnSequences

The number of independently computed returned sequences for each element in the batch

PastPresentShareBuffer

Whether the past/present KV-cache tensors are shared and allocated once, to max_length (CUDA only).

RepetitionPenalty

Repetition penalty to sample with

ServiceId

Service identifier. This identifies the service these settings are configured for, e.g., azure_openai_eastus, openai, ollama, huggingface, etc.

(Inherited from PromptExecutionSettings)
Temperature

Temperature to sample with

TopK

Top k tokens to sample from

TopP

Top p probability to sample with
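
The sampling-related properties above combine as in the following sketch. The values are illustrative, and the idea that DoSample must be enabled for Temperature, TopK, and TopP to take effect is an assumption based on common generation APIs.

```csharp
using Microsoft.SemanticKernel.Connectors.Onnx;

// Sketch: random-sampling configuration. Values are illustrative.
var settings = new OnnxRuntimeGenAIPromptExecutionSettings
{
    DoSample = true,          // enable random sampling
    Temperature = 0.7f,       // soften the token distribution
    TopK = 50,                // sample only from the 50 most likely tokens
    TopP = 0.9f,              // nucleus-sampling probability threshold
    RepetitionPenalty = 1.1f, // discourage repeated tokens
    MaxTokens = 512,          // max tokens to generate, prompt included
};
```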

Methods

Clone()

Creates a new PromptExecutionSettings object that is a copy of the current instance.

(Inherited from PromptExecutionSettings)
Freeze()

Makes the current PromptExecutionSettings unmodifiable and sets its IsFrozen property to true.

(Inherited from PromptExecutionSettings)
FromExecutionSettings(PromptExecutionSettings)

Converts a PromptExecutionSettings instance to an OnnxRuntimeGenAIPromptExecutionSettings.
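
A minimal conversion sketch. That ExtensionData entries are mapped onto matching properties during conversion is an assumption based on how Semantic Kernel connectors typically implement FromExecutionSettings; the key names used here are hypothetical.

```csharp
using System.Collections.Generic;
using Microsoft.SemanticKernel;
using Microsoft.SemanticKernel.Connectors.Onnx;

// Sketch: converting generic settings to the ONNX-specific type.
var generic = new PromptExecutionSettings
{
    ExtensionData = new Dictionary<string, object>
    {
        ["temperature"] = 0.5,   // assumed to map to Temperature
        ["max_tokens"] = 128,    // assumed to map to MaxTokens
    }
};

OnnxRuntimeGenAIPromptExecutionSettings onnxSettings =
    OnnxRuntimeGenAIPromptExecutionSettings.FromExecutionSettings(generic);
```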

ThrowIfFrozen()

Throws an InvalidOperationException if the PromptExecutionSettings are frozen.

(Inherited from PromptExecutionSettings)
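
The inherited freeze/clone lifecycle can be sketched as follows; the cast on Clone and the copy being modifiable again are assumptions, not guarantees from this page.

```csharp
using Microsoft.SemanticKernel.Connectors.Onnx;

var settings = new OnnxRuntimeGenAIPromptExecutionSettings { Temperature = 0.7f };

settings.Freeze();                // IsFrozen is now true
// settings.Temperature = 0.9f;   // setters call ThrowIfFrozen -> InvalidOperationException

// Clone returns a PromptExecutionSettings; casting back is assumed here.
var copy = (OnnxRuntimeGenAIPromptExecutionSettings)settings.Clone();
copy.Temperature = 0.9f;          // the copy is assumed to be modifiable
```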