

Kernel Class

The Kernel of Semantic Kernel.

This is the main entry point for Semantic Kernel. It provides the ability to run functions and manage filters, plugins, and AI services.

Initialize a new instance of the Kernel class.

Constructor

Kernel(plugins: KernelPlugin | dict[str, KernelPlugin] | list[KernelPlugin] | None = None, services: AI_SERVICE_CLIENT_TYPE | list[AI_SERVICE_CLIENT_TYPE] | dict[str, AI_SERVICE_CLIENT_TYPE] | None = None, ai_service_selector: AIServiceSelector | None = None, *, retry_mechanism: RetryMechanismBase = None, function_invocation_filters: list[tuple[int, Callable[[FILTER_CONTEXT_TYPE, Callable[[FILTER_CONTEXT_TYPE], None]], None]]] = None, prompt_rendering_filters: list[tuple[int, Callable[[FILTER_CONTEXT_TYPE, Callable[[FILTER_CONTEXT_TYPE], None]], None]]] = None, auto_function_invocation_filters: list[tuple[int, Callable[[FILTER_CONTEXT_TYPE, Callable[[FILTER_CONTEXT_TYPE], None]], None]]] = None)

Parameters

Name Description
plugins

The plugins to be used by the kernel; these are rewritten into a dict keyed by plugin name.

Default value: None
services

The services to be used by the kernel; these are rewritten into a dict keyed by service_id.

Default value: None
ai_service_selector

The AI service selector to be used by the kernel; by default, selection is based on the order of the execution settings.

Default value: None
**kwargs
Required

Additional fields to be passed to the Kernel model. These are limited to retry_mechanism, function_invoking_handlers, and function_invoked_handlers; the preferred way to add the handlers is via the add_function_invoking_handler and add_function_invoked_handler methods.

Keyword-Only Parameters

Name Description
retry_mechanism
Required
function_invocation_filters
Required
prompt_rendering_filters
Required
auto_function_invocation_filters
Required

Methods

add_embedding_to_object

Gather all fields to embed, batch the embedding generation and store.

invoke

Execute a function and return the FunctionResult.

invoke_function_call

Processes the provided FunctionCallContent and updates the chat history.

invoke_prompt

Invoke a function from the provided prompt.

invoke_prompt_stream

Invoke a function from the provided prompt and stream the results.

invoke_stream

Execute one or more stream functions.

If a list of functions is provided, they are executed in the order given as a pipeline; only the last function's output is streamed, while the preceding functions run to completion.

add_embedding_to_object

Gather all fields to embed, batch the embedding generation and store.

async add_embedding_to_object(inputs: TDataModel | Sequence[TDataModel], field_to_embed: str, field_to_store: str, execution_settings: dict[str, PromptExecutionSettings], container_mode: bool = False, cast_function: Callable[[list[float]], Any] | None = None, **kwargs: Any)

Parameters

Name Description
inputs
Required
field_to_embed
Required
field_to_store
Required
execution_settings
Required
container_mode
Default value: False
cast_function
Default value: None
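The gather/batch/store pattern this method describes can be sketched in plain Python; fake_generate_embeddings and Record below are stand-ins for illustration, not part of Semantic Kernel:

```python
import asyncio

async def fake_generate_embeddings(texts):
    # Hypothetical embedding service: one vector per input text.
    # (Here: a 1-dimensional vector holding the text length.)
    return [[float(len(t))] for t in texts]

async def add_embedding_to_object(inputs, field_to_embed, field_to_store):
    """Sketch of the documented pattern: gather all fields to embed,
    batch the embedding generation into one call, and store the results."""
    texts = [getattr(obj, field_to_embed) for obj in inputs]
    vectors = await fake_generate_embeddings(texts)  # single batched call
    for obj, vec in zip(inputs, vectors):
        setattr(obj, field_to_store, vec)
    return inputs

class Record:
    def __init__(self, text):
        self.text = text
        self.vector = None

records = asyncio.run(add_embedding_to_object(
    [Record("hi"), Record("hello")], "text", "vector"))
```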

invoke

Execute a function and return the FunctionResult.

async invoke(function: KernelFunction | None = None, arguments: KernelArguments | None = None, function_name: str | None = None, plugin_name: str | None = None, metadata: dict[str, Any] = {}, **kwargs: Any) -> FunctionResult | None

Parameters

Name Description
function
KernelFunction | None

The function or functions to execute. This value takes precedence when both it and function_name/plugin_name are supplied; if it is None, function_name and plugin_name are used and must not be None.

Default value: None
arguments
KernelArguments | None

The arguments to pass to the function(s), optional

Default value: None
function_name
str | None

The name of the function to execute

Default value: None
plugin_name
str | None

The name of the plugin to execute

Default value: None
metadata
dict[str, Any]

The metadata to pass to the function(s)

Default value: {}
kwargs
Required
dict[str, Any]

Arguments that can be used instead of supplying KernelArguments.

Exceptions

Type Description

If an error occurs during function invocation
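The precedence rule between function and function_name/plugin_name can be illustrated with a small stand-alone resolver (an illustrative sketch, not the library's code):

```python
def resolve_function(registry, function=None, function_name=None, plugin_name=None):
    """Sketch of the documented precedence: an explicit `function` wins;
    otherwise both plugin_name and function_name must be supplied."""
    if function is not None:
        return function
    if function_name is None or plugin_name is None:
        raise ValueError("function_name and plugin_name cannot be None "
                         "when no function is given")
    return registry[plugin_name][function_name]

# A plain dict stands in for the kernel's plugin collection.
registry = {"math": {"add": lambda a, b: a + b}}
fn = resolve_function(registry, function_name="add", plugin_name="math")
```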

invoke_function_call

Processes the provided FunctionCallContent and updates the chat history.

async invoke_function_call(function_call: FunctionCallContent, chat_history: ChatHistory, arguments: KernelArguments | None = None, function_call_count: int | None = None, request_index: int | None = None, function_behavior: FunctionChoiceBehavior = None) -> AutoFunctionInvocationContext | None

Parameters

Name Description
function_call
Required
chat_history
Required
arguments
Default value: None
function_call_count
Default value: None
request_index
Default value: None
function_behavior
Default value: None
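The described flow, executing the requested function and recording the result in the chat history, might look like this in outline; the dataclasses below are simplified stand-ins for the Semantic Kernel types of the same names:

```python
from dataclasses import dataclass, field

@dataclass
class FunctionCallContent:          # stand-in, not the SK class
    plugin_name: str
    function_name: str
    arguments: dict

@dataclass
class ChatHistory:                  # stand-in, not the SK class
    messages: list = field(default_factory=list)

def invoke_function_call(function_call, chat_history, registry):
    """Sketch: run the requested function and record its result
    in the chat history, as the method's summary describes."""
    fn = registry[function_call.plugin_name][function_call.function_name]
    result = fn(**function_call.arguments)
    chat_history.messages.append({"role": "tool", "content": str(result)})
    return result

registry = {"math": {"add": lambda a, b: a + b}}
history = ChatHistory()
invoke_function_call(FunctionCallContent("math", "add", {"a": 1, "b": 2}),
                     history, registry)
```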

invoke_prompt

Invoke a function from the provided prompt.

async invoke_prompt(prompt: str, function_name: str | None = None, plugin_name: str | None = None, arguments: KernelArguments | None = None, template_format: Literal['semantic-kernel', 'handlebars', 'jinja2'] = 'semantic-kernel', **kwargs: Any) -> FunctionResult | None

Parameters

Name Description
prompt
Required
str

The prompt to use

function_name
str

The name of the function, optional

Default value: None
plugin_name
str

The name of the plugin, optional

Default value: None
arguments
KernelArguments | None

The arguments to pass to the function(s), optional

Default value: None
template_format
str

The format of the prompt template

Default value: semantic-kernel
kwargs
Required
dict[str, Any]

Arguments that can be used instead of supplying KernelArguments.

Returns

Type Description

The result of the function(s)
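For illustration, a minimal sketch of what 'semantic-kernel' template rendering does with {{$variable}} placeholders; the real template engine supports far more (functions, nested blocks, etc.), so this is only a sketch of the basic substitution:

```python
import re

def render_semantic_kernel_template(prompt, arguments):
    """Replace {{$name}} placeholders with argument values.
    Minimal sketch of the 'semantic-kernel' template format;
    unknown variables render as empty strings here."""
    return re.sub(r"\{\{\$(\w+)\}\}",
                  lambda m: str(arguments.get(m.group(1), "")),
                  prompt)

rendered = render_semantic_kernel_template(
    "Summarize this text: {{$input}}", {"input": "Kernel docs"})
```

invoke_prompt wraps a rendered prompt like this in an ad-hoc KernelFunction and executes it.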

invoke_prompt_stream

Invoke a function from the provided prompt and stream the results.

async invoke_prompt_stream(prompt: str, function_name: str | None = None, plugin_name: str | None = None, arguments: KernelArguments | None = None, template_format: Literal['semantic-kernel', 'handlebars', 'jinja2'] = 'semantic-kernel', return_function_results: bool | None = False, **kwargs: Any) -> AsyncIterable[list[StreamingContentMixin] | FunctionResult | list[FunctionResult]]

Parameters

Name Description
prompt
Required
str

The prompt to use

function_name
str

The name of the function, optional

Default value: None
plugin_name
str

The name of the plugin, optional

Default value: None
arguments
KernelArguments | None

The arguments to pass to the function(s), optional

Default value: None
template_format
str

The format of the prompt template

Default value: semantic-kernel
return_function_results

If True, the function results are yielded as a list[FunctionResult] in addition to the streaming content.

Default value: False
kwargs
Required
dict[str, Any]

Arguments that can be used instead of supplying KernelArguments.

Returns

Type Description

The content of the stream of the last function provided.
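The streaming contract, content chunks first and, when return_function_results is True, a final list of results, can be modeled with a plain async generator; strings stand in for StreamingContentMixin and FunctionResult here:

```python
import asyncio

async def invoke_prompt_stream(chunks, return_function_results=False):
    """Sketch of the streaming contract: yield streaming content chunks
    one by one, then, if requested, the final function results."""
    collected = []
    for chunk in chunks:
        collected.append(chunk)
        yield [chunk]                      # streaming content
    if return_function_results:
        yield ["".join(collected)]         # stand-in for list[FunctionResult]

async def collect(gen):
    return [item async for item in gen]

streamed = asyncio.run(collect(invoke_prompt_stream(["Hel", "lo"], True)))
```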

invoke_stream

Execute one or more stream functions.

If a list of functions is provided, they are executed in the order given as a pipeline; only the last function's output is streamed, while the preceding functions run to completion.

async invoke_stream(function: KernelFunction | None = None, arguments: KernelArguments | None = None, function_name: str | None = None, plugin_name: str | None = None, metadata: dict[str, Any] = {}, return_function_results: bool = False, **kwargs: Any) -> AsyncGenerator[list[StreamingContentMixin] | FunctionResult | list[FunctionResult], Any]

Parameters

Name Description
function
KernelFunction | None

The function to execute. This value takes precedence when both it and function_name/plugin_name are supplied; if it is None, function_name and plugin_name are used and must not be None.

Default value: None
arguments
KernelArguments | None

The arguments to pass to the function(s), optional

Default value: None
function_name
str | None

The name of the function to execute

Default value: None
plugin_name
str | None

The name of the plugin to execute

Default value: None
metadata
dict[str, Any]

The metadata to pass to the function(s)

Default value: {}
return_function_results

If True, the function results are yielded as a list[FunctionResult] in addition to the streaming content; otherwise only the streaming content is yielded.

Default value: False

kwargs
Required
dict[str, Any]

Arguments that can be used instead of supplying KernelArguments.
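The pipeline semantics above can be sketched in plain Python: every function except the last runs to completion, each feeding its output to the next, and only the last one streams (stand-in functions, not Semantic Kernel code):

```python
import asyncio

async def run_pipeline_stream(functions, value):
    """Sketch of the documented pipeline semantics: all functions but the
    last are executed in order, each feeding the next; only the last
    function's output is streamed chunk by chunk."""
    for fn in functions[:-1]:
        value = fn(value)                  # executed, not streamed
    for chunk in functions[-1](value):     # last function streams
        yield chunk

upper = str.upper                          # an intermediate pipeline stage

def split_stream(text):                    # last stage yields chunks
    yield from text.split()

async def collect(gen):
    return [c async for c in gen]

chunks = asyncio.run(collect(
    run_pipeline_stream([upper, split_stream], "hello world")))
```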

Attributes

model_computed_fields

A dictionary of computed field names and their corresponding ComputedFieldInfo objects.

model_computed_fields: ClassVar[Dict[str, ComputedFieldInfo]] = {}

model_config

Configuration for the model; should be a dictionary conforming to pydantic's ConfigDict.

model_config: ClassVar[ConfigDict] = {'arbitrary_types_allowed': True, 'populate_by_name': True, 'validate_assignment': True}

model_fields

Metadata about the fields defined on the model: a mapping of field names to pydantic FieldInfo objects.

This replaces Model.fields from Pydantic V1.

model_fields: ClassVar[Dict[str, FieldInfo]] = {'ai_service_selector': FieldInfo(annotation=AIServiceSelector, required=False, default_factory=AIServiceSelector), 'auto_function_invocation_filters': FieldInfo(annotation=list[tuple[int, Callable[list, NoneType]]], required=False, default_factory=list), 'function_invocation_filters': FieldInfo(annotation=list[tuple[int, Callable[list, NoneType]]], required=False, default_factory=list), 'plugins': FieldInfo(annotation=dict[str, KernelPlugin], required=False, default_factory=dict), 'prompt_rendering_filters': FieldInfo(annotation=list[tuple[int, Callable[list, NoneType]]], required=False, default_factory=list), 'retry_mechanism': FieldInfo(annotation=RetryMechanismBase, required=False, default_factory=PassThroughWithoutRetry), 'services': FieldInfo(annotation=dict[str, AIServiceClientBase], required=False, default_factory=dict)}

function_invocation_filters

Filters applied during function invocation, from KernelFilterExtension.

function_invocation_filters: list[tuple[int, Callable[[FILTER_CONTEXT_TYPE, Callable[[FILTER_CONTEXT_TYPE], None]], None]]]

prompt_rendering_filters

Filters applied during prompt rendering, from KernelFilterExtension.

prompt_rendering_filters: list[tuple[int, Callable[[FILTER_CONTEXT_TYPE, Callable[[FILTER_CONTEXT_TYPE], None]], None]]]

auto_function_invocation_filters

Filters applied during auto function invocation, from KernelFilterExtension.

auto_function_invocation_filters: list[tuple[int, Callable[[FILTER_CONTEXT_TYPE, Callable[[FILTER_CONTEXT_TYPE], None]], None]]]
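Each filter is stored as a (priority, callable) tuple, where the callable receives the filter context and a next delegate. A hypothetical sketch of how such filters could wrap an invocation (illustrative only, not the library's dispatch code):

```python
def apply_filters(filters, context, inner):
    """Wrap `inner` with each (priority, filter) pair so that every
    filter sees the context and a `next` callable, then invoke the chain.
    (Sketch of the pattern, not Semantic Kernel's implementation.)"""
    handler = inner
    for _, flt in sorted(filters, key=lambda pair: pair[0], reverse=True):
        handler = (lambda nxt, f: lambda ctx: f(ctx, nxt))(handler, flt)
    return handler(context)

trace = []

def logging_filter(ctx, next_handler):
    trace.append("before")      # runs before the function
    next_handler(ctx)           # call through to the inner handler
    trace.append("after")       # runs after the function

def invoke(ctx):
    trace.append("invoke")

apply_filters([(0, logging_filter)], {"function": "demo"}, invoke)
```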

plugins

A dict with the plugins registered with the Kernel, from KernelFunctionExtension.

plugins: dict[str, KernelPlugin]

services

A dict with the services registered with the Kernel, from KernelServicesExtension.

services: dict[str, AIServiceClientBase]

ai_service_selector

The AI service selector to be used by the kernel, from KernelServicesExtension.

ai_service_selector: AIServiceSelector

retry_mechanism

The retry mechanism to be used by the kernel, from KernelReliabilityExtension.

retry_mechanism: RetryMechanismBase