TestModel class

A PromptCompletionModel used for testing.
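
Example

A minimal sketch of swapping a TestModel in for a real model during unit tests, assuming the class is exported from the @microsoft/teams-ai package:

import { TestModel } from '@microsoft/teams-ai';

// A model that always "completes" a prompt with the same fixed text.
const model = TestModel.returnContent('Hello from the test model!');

// Because TestModel implements PromptCompletionModel, it can be passed
// anywhere a real model (such as OpenAIModel) would normally be used.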

Constructors

TestModel((model: TestModel, context: TurnContext, memory: Memory, functions: PromptFunctions, tokenizer: Tokenizer, template: PromptTemplate) => Promise<PromptResponse<string>>)

Creates a new TestModel instance.

Properties

events

Events emitted by the model.

Methods

completePrompt(TurnContext, Memory, PromptFunctions, Tokenizer, PromptTemplate)

Completes a prompt using the handler the model was created with.

createTestModel((model: TestModel, context: TurnContext, memory: Memory, functions: PromptFunctions, tokenizer: Tokenizer, template: PromptTemplate) => Promise<PromptResponse<string>>)
returnContent(string, number)
returnError(Error, number)
returnRateLimited(Error, number)
returnResponse(PromptResponse<string>, number)
streamTextChunks(string[], number)

Constructor Details

TestModel((model: TestModel, context: TurnContext, memory: Memory, functions: PromptFunctions, tokenizer: Tokenizer, template: PromptTemplate) => Promise<PromptResponse<string>>)

Creates a new TestModel instance.

new TestModel(handler: (model: TestModel, context: TurnContext, memory: Memory, functions: PromptFunctions, tokenizer: Tokenizer, template: PromptTemplate) => Promise<PromptResponse<string>>)

Parameters

handler

(model: TestModel, context: TurnContext, memory: Memory, functions: PromptFunctions, tokenizer: Tokenizer, template: PromptTemplate) => Promise<PromptResponse<string>>
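
Example

A rough sketch of constructing a TestModel with a custom handler. The handler receives the same arguments as completePrompt plus the model itself; the response fields shown ({ status, message }) are assumptions based on the PromptResponse description below and are not confirmed by this page:

import { TestModel } from '@microsoft/teams-ai';

const model = new TestModel(async (_model, _context, _memory, _functions, _tokenizer, _template) => {
    // Fabricate a successful completion regardless of the prompt being completed.
    return {
        status: 'success',
        message: { role: 'assistant', content: 'stub completion' }
    };
});

A real handler can also inspect the memory, tokenizer, or template arguments to vary the response per test case.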

Property Details

events

Events emitted by the model.

events: PromptCompletionModelEmitter

Property Value

PromptCompletionModelEmitter

An event emitter for the model.

Method Details

completePrompt(TurnContext, Memory, PromptFunctions, Tokenizer, PromptTemplate)

Completes a prompt using the handler the model was created with.

function completePrompt(context: TurnContext, memory: Memory, functions: PromptFunctions, tokenizer: Tokenizer, template: PromptTemplate): Promise<PromptResponse<string>>

Parameters

context

TurnContext

Current turn context.

memory
Memory

An interface for accessing state values.

functions
PromptFunctions

Functions to use when rendering the prompt.

tokenizer
Tokenizer

Tokenizer to use when rendering the prompt.

template
PromptTemplate

Prompt template to complete.

Returns

Promise<PromptResponse<string>>

A PromptResponse with the status and message.
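
Example

A sketch of driving completePrompt directly in a test. The helper name and fixture parameters are illustrative: a real test would obtain the turn context, memory, prompt functions, tokenizer, and template from its own harness, and the response fields checked at the end are assumptions based on the return description above. The type imports assume these interfaces are exported from @microsoft/teams-ai:

import { TurnContext } from 'botbuilder';
import { Memory, PromptFunctions, PromptTemplate, TestModel, Tokenizer } from '@microsoft/teams-ai';

async function assertStubbedCompletion(
    context: TurnContext,
    memory: Memory,
    functions: PromptFunctions,
    tokenizer: Tokenizer,
    template: PromptTemplate
): Promise<void> {
    const model = TestModel.returnContent('stubbed completion');
    const response = await model.completePrompt(context, memory, functions, tokenizer, template);

    // Assumed shape: a successful completion carries the text in message.content.
    console.assert(response.status === 'success');
    console.assert(response.message?.content === 'stubbed completion');
}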

createTestModel((model: TestModel, context: TurnContext, memory: Memory, functions: PromptFunctions, tokenizer: Tokenizer, template: PromptTemplate) => Promise<PromptResponse<string>>)

Creates a new TestModel instance with the given handler.

static function createTestModel(handler: (model: TestModel, context: TurnContext, memory: Memory, functions: PromptFunctions, tokenizer: Tokenizer, template: PromptTemplate) => Promise<PromptResponse<string>>): TestModel

Parameters

handler

(model: TestModel, context: TurnContext, memory: Memory, functions: PromptFunctions, tokenizer: Tokenizer, template: PromptTemplate) => Promise<PromptResponse<string>>

Returns

TestModel
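
Example

A sketch equivalent to calling the constructor, using the static factory instead (the response fields are assumptions, as above):

import { TestModel } from '@microsoft/teams-ai';

const model = TestModel.createTestModel(async (_model, _context, _memory, _functions, _tokenizer, _template) => {
    return { status: 'success', message: { role: 'assistant', content: 'factory stub' } };
});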

returnContent(string, number)

Creates a TestModel that responds with the given content.

static function returnContent(content: string, delay?: number): TestModel

Parameters

content

string

delay

number

Returns

TestModel
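
Example

A sketch using the optional delay, which is assumed to be in milliseconds:

import { TestModel } from '@microsoft/teams-ai';

// Respond with fixed content after a short simulated latency.
const model = TestModel.returnContent('Hello!', 500);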

returnError(Error, number)

Creates a TestModel that responds with the given error.

static function returnError(error: Error, delay?: number): TestModel

Parameters

error

Error

delay

number

Returns

TestModel
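
Example

A sketch of a model that always fails, useful for exercising error-handling paths:

import { TestModel } from '@microsoft/teams-ai';

const model = TestModel.returnError(new Error('simulated model failure'));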

returnRateLimited(Error, number)

Creates a TestModel that simulates a rate limited response carrying the given error.

static function returnRateLimited(error: Error, delay?: number): TestModel

Parameters

error

Error

delay

number

Returns

TestModel
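
Example

A sketch of simulating a rate limit; presumably the error is surfaced with a rate-limited status rather than a plain error status, though this page does not confirm that:

import { TestModel } from '@microsoft/teams-ai';

const model = TestModel.returnRateLimited(new Error('429: too many requests'));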

returnResponse(PromptResponse<string>, number)

Creates a TestModel that responds with the given PromptResponse.

static function returnResponse(response: PromptResponse<string>, delay?: number): TestModel

Parameters

response

PromptResponse<string>

delay

number

Returns

TestModel
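
Example

A sketch of returning a fully specified response. The PromptResponse fields shown ({ status, message }) are assumptions based on the return description for completePrompt, and the example assumes PromptResponse is exported from the package:

import { PromptResponse, TestModel } from '@microsoft/teams-ai';

const response: PromptResponse<string> = {
    status: 'success',
    message: { role: 'assistant', content: 'canned answer' }
};

// Return the canned response after an assumed 250 ms delay.
const model = TestModel.returnResponse(response, 250);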

streamTextChunks(string[], number)

Creates a TestModel that streams back the given chunks of text.

static function streamTextChunks(chunks: string[], delay?: number): TestModel

Parameters

chunks

string[]

delay

number

Returns

TestModel
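
Example

A sketch of a model that streams its reply in pieces; the delay is assumed to be in milliseconds and applied per chunk:

import { TestModel } from '@microsoft/teams-ai';

// Streams "Hello ", "streaming ", and "world!" as separate chunks.
const model = TestModel.streamTextChunks(['Hello ', 'streaming ', 'world!'], 100);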