PromptManagerOptions interface

Options used to configure the prompt manager.
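Assembled from the property signatures documented below, the interface has roughly the following shape (a sketch for orientation, not generated source):

interface PromptManagerOptions {
    // Required: folder containing the application's prompt templates.
    promptsFolder: string;

    // Optional settings, each described under Property Details.
    role?: string;
    max_conversation_history_tokens?: number;
    max_history_messages?: number;
    max_input_tokens?: number;
}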

Properties

max_conversation_history_tokens

Optional. Maximum number of tokens of conversation history to include in prompts.

max_history_messages

Optional. Maximum number of messages to use when rendering conversation_history.

max_input_tokens

Optional. Maximum number of tokens of user input to include in prompts.

promptsFolder

Path to the filesystem folder containing all the application's prompts.

role

Optional. Message role to use for loaded prompts.

Property Details

max_conversation_history_tokens

Optional. Maximum number of tokens of conversation history to include in prompts.

max_conversation_history_tokens?: number

Property Value

number

Remarks

By default, conversation history consumes whatever remains of the prompt's max_input_tokens budget. Setting this to a value greater than 1 overrides that behavior, and all prompts will use a fixed token budget for conversation history.
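For example, the following sketch gives every prompt a fixed 1,000-token budget for conversation history; the folder path is a hypothetical placeholder:

const options: PromptManagerOptions = {
    promptsFolder: './prompts',              // hypothetical prompts location
    max_conversation_history_tokens: 1000    // fixed token budget for history
};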

max_history_messages

Optional. Maximum number of messages to use when rendering conversation_history.

max_history_messages?: number

Property Value

number

Remarks

This controls the automatic pruning of conversation history performed by the planner's LLMClient instance. It helps keep memory usage under control and defaults to a value of 10 messages (or 5 turns).
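As a sketch, this keeps only the most recent 6 messages (3 turns) of history; the folder path is again a hypothetical placeholder:

const options: PromptManagerOptions = {
    promptsFolder: './prompts',   // hypothetical prompts location
    max_history_messages: 6       // keep at most 6 messages (3 turns) of history
};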

max_input_tokens

Optional. Maximum number of tokens of user input to include in prompts.

max_input_tokens?: number

Property Value

number

Remarks

This defaults to unlimited but can be set to a value greater than 1 to limit the length of user input included in prompts. For example, if set to 100, any user input longer than 100 tokens will be truncated.
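A sketch of that example, with a hypothetical prompts folder:

const options: PromptManagerOptions = {
    promptsFolder: './prompts',   // hypothetical prompts location
    max_input_tokens: 100         // user input beyond 100 tokens is truncated
};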

promptsFolder

Path to the filesystem folder containing all the application's prompts.

promptsFolder: string

Property Value

string
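Since promptsFolder is the only required property, a minimal configuration just points at the folder of prompt templates. The '../prompts' location below is a hypothetical placeholder, resolved to an absolute path against the current module:

import * as path from 'path';

const options: PromptManagerOptions = {
    promptsFolder: path.join(__dirname, '../prompts') // hypothetical folder of prompt templates
};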

role

Optional. Message role to use for loaded prompts.

role?: string

Property Value

string

Remarks

Defaults to 'system'.
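For example, to have loaded prompts rendered as user messages instead of the default system role (prompts folder hypothetical):

const options: PromptManagerOptions = {
    promptsFolder: './prompts',   // hypothetical prompts location
    role: 'user'                  // render loaded prompts as user messages instead of 'system'
};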