KernelFunction Class

Semantic Kernel function.

Create a new model by parsing and validating input data from keyword arguments.

Raises pydantic_core.ValidationError if the input data cannot be validated to form a valid model.

self is explicitly positional-only to allow self as a field name.

Inheritance
KernelFunction

Constructor

KernelFunction(*, metadata: semantic_kernel.functions.kernel_function_metadata.KernelFunctionMetadata, invocation_duration_histogram: opentelemetry.metrics.Histogram = <opentelemetry.metrics._internal.instrument._ProxyHistogram object>, streaming_duration_histogram: opentelemetry.metrics.Histogram = <opentelemetry.metrics._internal.instrument._ProxyHistogram object>)

Keyword-Only Parameters

Name Description
metadata
Required
invocation_duration_histogram
Default value: <opentelemetry.metrics._internal.instrument._ProxyHistogram object>
streaming_duration_histogram
Default value: <opentelemetry.metrics._internal.instrument._ProxyHistogram object>

Methods

from_method

Create a new instance of the KernelFunctionFromMethod class.

from_prompt

Create a new instance of the KernelFunctionFromPrompt class.

function_copy

Copy the function, optionally overriding the plugin_name.

invoke

Invoke the function with the given arguments.

invoke_stream

Invoke the function as an asynchronous stream with the given arguments.

from_method

Create a new instance of the KernelFunctionFromMethod class.

from_method(method: Callable[[...], Any], plugin_name: str | None = None, stream_method: Callable[[...], Any] | None = None) -> KernelFunctionFromMethod

Parameters

Name Description
method
Required
plugin_name
Default value: None
stream_method
Default value: None
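
Example

A minimal sketch of wrapping a native method, assuming the kernel_function decorator exported from semantic_kernel.functions and a hypothetical echo function:

from semantic_kernel.functions import KernelFunction, kernel_function

# A plain Python function; the decorator attaches the metadata Semantic Kernel reads.
@kernel_function(name="echo", description="Echo the input text.")
def echo(text: str) -> str:
    return text

# Wrap the decorated method in a KernelFunction; plugin_name is optional.
echo_function = KernelFunction.from_method(method=echo, plugin_name="demo_plugin")
print(echo_function.name)         # "echo"
print(echo_function.plugin_name)  # "demo_plugin"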

from_prompt

Create a new instance of the KernelFunctionFromPrompt class.

from_prompt(function_name: str, plugin_name: str, description: str | None = None, prompt: str | None = None, template_format: Literal['semantic-kernel', 'handlebars', 'jinja2'] = 'semantic-kernel', prompt_template: PromptTemplateBase | None = None, prompt_template_config: PromptTemplateConfig | None = None, prompt_execution_settings: PromptExecutionSettings | list[PromptExecutionSettings] | dict[str, PromptExecutionSettings] | None = None) -> KernelFunctionFromPrompt

Parameters

Name Description
function_name
Required
plugin_name
Required
description
Default value: None
prompt
Default value: None
template_format
Default value: semantic-kernel
prompt_template
Default value: None
prompt_template_config
Default value: None
prompt_execution_settings
Default value: None
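
Example

A minimal sketch of a prompt-backed function, assuming the default semantic-kernel template format and a hypothetical summarize prompt:

from semantic_kernel.functions import KernelFunction

# Build a function from an inline prompt; {{$input}} is a template variable
# in the default "semantic-kernel" template format.
summarize = KernelFunction.from_prompt(
    function_name="summarize",
    plugin_name="demo_plugin",
    description="Summarize the input text in one sentence.",
    prompt="Summarize the following text in one sentence: {{$input}}",
)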

function_copy

Copy the function, optionally overriding the plugin_name.

function_copy(plugin_name: str | None = None) -> KernelFunction

Parameters

Name Description
plugin_name
str

The new plugin name.

Default value: None

Returns

Type Description

KernelFunction

The copied function.
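
Example

A short sketch, reusing the hypothetical summarize function from the from_prompt example, that copies it into a different plugin namespace:

# The copy is a separate function object; only the plugin name differs here.
report_summarize = summarize.function_copy(plugin_name="report_plugin")
print(report_summarize.plugin_name)  # "report_plugin"
print(summarize.plugin_name)         # "demo_plugin" (unchanged)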

invoke

Invoke the function with the given arguments.

async invoke(kernel: Kernel, arguments: KernelArguments | None = None, metadata: dict[str, Any] = {}, **kwargs: Any) -> FunctionResult | None

Parameters

Name Description
kernel
Required
Kernel

The kernel.

arguments
KernelArguments | None

The Kernel arguments.

Default value: None
metadata
dict[str, Any]

Additional metadata.

Default value: {}
kwargs
Required
Any

Additional keyword arguments that will be added to the KernelArguments.

Returns

Type Description

FunctionResult | None

The result of the function.
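
Example

A sketch of invoking the hypothetical summarize function from the from_prompt example; it assumes an AI service (for example, a chat completion connector) has already been registered on the kernel, since a prompt-backed function needs one to run:

import asyncio

from semantic_kernel import Kernel
from semantic_kernel.functions import KernelArguments

async def main() -> None:
    kernel = Kernel()
    # ... register an AI service (e.g. a chat completion connector) on the kernel ...

    # Arguments can be passed as KernelArguments or as extra keyword arguments.
    result = await summarize.invoke(kernel, KernelArguments(input="Long text to summarize."))
    if result is not None:
        print(result.value)

asyncio.run(main())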

invoke_stream

Invoke the function as an asynchronous stream with the given arguments.

async invoke_stream(kernel: Kernel, arguments: KernelArguments | None = None, metadata: dict[str, Any] = {}, **kwargs: Any) -> AsyncGenerator[FunctionResult | list[StreamingContentMixin | Any], Any]

Parameters

Name Description
kernel
Required
Kernel

The kernel.

arguments
KernelArguments | None

The Kernel arguments.

Default value: None
metadata
dict[str, Any]

Additional metadata.

Default value: {}
kwargs
Required
Any

Additional keyword arguments that will be added to the KernelArguments.
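
Example

A sketch of streaming the same hypothetical summarize function; per the signature, each yielded chunk is either a FunctionResult or a list of streaming content items. It again assumes an AI service is registered on the kernel:

import asyncio

from semantic_kernel import Kernel

async def main() -> None:
    kernel = Kernel()
    # ... register an AI service (e.g. a chat completion connector) on the kernel ...

    async for chunk in summarize.invoke_stream(kernel, input="Long text to summarize."):
        # Handle the common case of a list of streaming content items;
        # a FunctionResult can also be yielded (e.g. when an error occurs).
        if isinstance(chunk, list):
            for item in chunk:
                print(item, end="", flush=True)
        else:
            print(chunk)

asyncio.run(main())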

Attributes

description

The description of the function.

fully_qualified_name

The fully qualified name of the function.

is_prompt

Whether the function is based on a prompt.

model_computed_fields

A dictionary of computed field names and their corresponding ComputedFieldInfo objects.

model_computed_fields: ClassVar[Dict[str, ComputedFieldInfo]] = {}

model_config

Configuration for the model; should be a dictionary conforming to pydantic's ConfigDict.

model_config: ClassVar[ConfigDict] = {'arbitrary_types_allowed': True, 'populate_by_name': True, 'validate_assignment': True}

model_fields

Metadata about the fields defined on the model, mapping field names to pydantic FieldInfo objects.

This replaces Model.fields from Pydantic V1.

model_fields: ClassVar[Dict[str, FieldInfo]] = {'invocation_duration_histogram': FieldInfo(annotation=Histogram, required=False, default=<opentelemetry.metrics._internal.instrument._ProxyHistogram object>), 'metadata': FieldInfo(annotation=KernelFunctionMetadata, required=True), 'streaming_duration_histogram': FieldInfo(annotation=Histogram, required=False, default=<opentelemetry.metrics._internal.instrument._ProxyHistogram object>)}

name

The name of the function.

parameters

The parameters for the function.

plugin_name

The name of the plugin that contains this function.

return_parameter

The return parameter for the function.

stream_function

The stream function for the function.

function

The function to call.

prompt_execution_settings

The AI prompt execution settings.

prompt_template_config

The prompt template configuration.

metadata

The metadata for the function.

metadata: KernelFunctionMetadata

invocation_duration_histogram

invocation_duration_histogram: Histogram

streaming_duration_histogram

streaming_duration_histogram: Histogram