OpenAIPromptExecutionSettings.ResponseFormat Property

Definition

Gets or sets the response format to use for the completion.

C#

[System.Text.Json.Serialization.JsonIgnore(Condition=System.Text.Json.Serialization.JsonIgnoreCondition.WhenWritingNull)]
[System.Text.Json.Serialization.JsonPropertyName("response_format")]
public object? ResponseFormat { get; set; }

F#

[<System.Text.Json.Serialization.JsonIgnore(Condition=System.Text.Json.Serialization.JsonIgnoreCondition.WhenWritingNull)>]
[<System.Text.Json.Serialization.JsonPropertyName("response_format")>]
member this.ResponseFormat : obj with get, set

VB

Public Property ResponseFormat As Object

Property Value

Object

Attributes

JsonIgnoreAttribute, JsonPropertyNameAttribute
Remarks

An object specifying the format that the model must output.

Setting to { "type": "json_schema", "json_schema": { ... } } enables Structured Outputs, which ensures the model will match your supplied JSON schema. Learn more in the Structured Outputs guide.

Setting to { "type": "json_object" } enables JSON mode, which ensures the message the model generates is valid JSON. Important: when using JSON mode, you must also instruct the model to produce JSON yourself via a system or user message. Without this, the model may generate an unending stream of whitespace until the generation reaches the token limit, resulting in a long-running and seemingly "stuck" request. Also note that the message content may be partially cut off if finish_reason="length", which indicates the generation exceeded max_tokens or the conversation exceeded the max context length.

Possible values are (a usage sketch follows this list):

- String values: "json_object", "text";

- OpenAI.Chat.ChatResponseFormat object;

- Type object, which will be used to automatically create a JSON schema.
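A minimal C# sketch of the three options, assuming the Microsoft.SemanticKernel.Connectors.OpenAI package and the OpenAI .NET SDK are referenced; MovieResult is a hypothetical user-defined type introduced only for illustration, and the factory method used for option 2 is the one exposed by the OpenAI .NET SDK.

using Microsoft.SemanticKernel.Connectors.OpenAI;

// 1. String value: enables JSON mode. Also instruct the model to produce JSON
//    via a system or user message, as noted in the remarks above.
var jsonModeSettings = new OpenAIPromptExecutionSettings
{
    ResponseFormat = "json_object"
};

// 2. OpenAI.Chat.ChatResponseFormat object, created via the OpenAI .NET SDK.
var chatFormatSettings = new OpenAIPromptExecutionSettings
{
    ResponseFormat = OpenAI.Chat.ChatResponseFormat.CreateJsonObjectFormat()
};

// 3. Type object: a JSON schema is generated from the type automatically,
//    enabling Structured Outputs.
var structuredSettings = new OpenAIPromptExecutionSettings
{
    ResponseFormat = typeof(MovieResult)
};

// Hypothetical type used only to illustrate schema generation.
public sealed class MovieResult
{
    public string Title { get; set; } = string.Empty;
    public int Year { get; set; }
}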

Applies to