Extend OpenAI using Tools and execute a local Function with .NET

Get started with AI by creating a simple .NET 8 console chat application. The application runs locally and uses the OpenAI gpt-35-turbo model deployed into an Azure OpenAI account. It uses Tools to extend the model's capabilities by calling a local .NET method. Follow these steps to provision Azure OpenAI and learn how to use Semantic Kernel.

Prerequisites

  • .NET 8.0 SDK - Install the .NET 8.0 SDK.
  • An Azure subscription that azd can use to provision the Azure OpenAI resources for this sample.
  • The Azure Developer CLI (azd), which is used to provision and later clean up the Azure resources.
  • On Windows, PowerShell v7+ is required. To validate your version, run pwsh in a terminal. It should return the current version. If it returns an error, execute the following command: dotnet tool update --global PowerShell.
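For example, the following commands print the installed version and update it if needed (pwsh -Version prints the version and exits; the update command is the same one listed above):

pwsh -Version
dotnet tool update --global PowerShell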

Get the sample project

Clone the GitHub repository that contains the sample apps for all of the quickstarts:

git clone https://github.com/dotnet/ai-samples.git

Create the Azure OpenAI service

The sample GitHub repository is structured as an Azure Developer CLI (azd) template, which azd can use to provision the Azure OpenAI service and model for you.

  1. From a terminal or command prompt, navigate to the src\quickstarts\azure-openai directory of the sample repo.

  2. Run the azd up command to provision the Azure OpenAI resources. It might take several minutes to create the Azure OpenAI service and deploy the model.

    azd up
    

    azd also configures the required user secrets for the sample app, such as the Azure OpenAI access key. After the deployment completes, you can verify the secrets as shown following these steps.

    Note

    If you encounter an error during the azd up deployment, visit the troubleshooting section.
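After azd up completes, you can optionally confirm that the user secrets were created with the dotnet user-secrets tool. The key names below are the configuration values the app reads in the Understand the code section; treat the listing as an illustration, since the template's postprovision script determines exactly which secrets get set:

# Run from the project directory, for example azure-openai\04-HikerAIPro
dotnet user-secrets list

# Example output (your values will differ):
# AZURE_OPENAI_ENDPOINT = https://<your-resource>.openai.azure.com/
# AZURE_OPENAI_GPT_NAME = <your-deployment-name>
# AZURE_OPENAI_KEY = <your-key>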

Try the HikerAI Pro sample

  1. From a terminal or command prompt, navigate to the azure-openai\04-HikerAIPro directory.

  2. Use the dotnet run command to run the app:

    dotnet run
    

    Tip

    If you get an error message, the Azure OpenAI resources might not have finished deploying. Wait a couple of minutes and try again.

Understand the code

The application uses the Microsoft.SemanticKernel package to send requests to and receive responses from the Azure OpenAI service.

The entire application is contained within the Program.cs file. The first several lines of code load the secrets and configuration values that were set in the .NET user secrets for you during the application provisioning.

var config = new ConfigurationBuilder().AddUserSecrets<Program>().Build();
string endpoint = config["AZURE_OPENAI_ENDPOINT"];
string deployment = config["AZURE_OPENAI_GPT_NAME"];
string key = config["AZURE_OPENAI_KEY"];

The Kernel class facilitates the requests and responses with the help of the Azure OpenAI chat completion service, which is registered on the builder with AddAzureOpenAIChatCompletion.

// Create a Kernel containing the Azure OpenAI Chat Completion Service
IKernelBuilder b = Kernel.CreateBuilder();

Kernel kernel = b
    .AddAzureOpenAIChatCompletion(deployment, endpoint, key)
    .Build();
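If you prefer to call OpenAI directly instead of going through Azure OpenAI, the same pattern applies: build the kernel with the AddOpenAIChatCompletion extension, passing a model name and an OpenAI API key stored as an OpenAIKey user secret (set with dotnet user-secrets set OpenAIKey <your-openai-key>):

// Create a Kernel containing the OpenAI Chat Completion Service
string model = "gpt-3.5-turbo";
string key = config["OpenAIKey"];

Kernel kernel = Kernel.CreateBuilder()
    .AddOpenAIChatCompletion(model, key)
    .Build();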

The ImportPluginFromFunctions and CreateFromMethod functions define the local .NET function that the model can call as a tool.

// Add a new plugin with a local .NET function that should be available to the AI model.
// For convenience and clarity, this standalone local method handles the tool call:
// it fakes a call to a weather API and returns the current weather for the specified location.
kernel.ImportPluginFromFunctions("WeatherPlugin",
[
    KernelFunctionFactory.CreateFromMethod(
        ([Description("The city, e.g. Montreal, Sydney")] string location, string? unit = null) =>
    {
        // Here you would call a weather API to get the weather for the location
        return "Periods of rain or drizzle, 15 C";
    }, "get_current_weather", "Get the current weather in a given location")
]);
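The same tool can also be written as a class-based plugin. The following sketch isn't part of the sample, but it's a rough equivalent of the inline lambda above, using the KernelFunction and Description attributes and registered with ImportPluginFromType:

using System.ComponentModel;
using Microsoft.SemanticKernel;

// Hypothetical class-based equivalent of the inline weather function above.
public class WeatherPlugin
{
    [KernelFunction("get_current_weather")]
    [Description("Get the current weather in a given location")]
    public string GetCurrentWeather(
        [Description("The city, e.g. Montreal, Sydney")] string location,
        string? unit = null)
    {
        // Here you would call a real weather API for the location.
        return "Periods of rain or drizzle, 15 C";
    }
}

// Registered on the kernel with:
// kernel.ImportPluginFromType<WeatherPlugin>();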

Once the kernel is created, the code uses a system prompt to provide context and influence the completion's tone and content. Note how the importance of good weather is emphasized in the system prompt.

ChatHistory chatHistory = new("""
    You are a hiking enthusiast who helps people discover fun hikes in their area.
    You are upbeat and friendly. Good weather is important for a good hike. 
    Only make recommendations if the weather is good or if people insist.
    You introduce yourself when first saying hello. When helping people out,
    you always ask them for this information to inform the hiking recommendation you provide:

    1. Where they are located
    2. What hiking intensity they are looking for

    You will then provide three suggestions for nearby hikes that vary in length
    after you get that information. You will also share an interesting fact about the local
    nature on the hikes when making a recommendation.
    """);

The app also adds a user message using the AddUserMessage function. The chat completion service is then retrieved from the kernel, and its GetChatMessageContentAsync function sends the chat history to the model to generate a response based on the system and user prompts. The kernel and the ToolCallBehavior.AutoInvokeKernelFunctions setting are passed along so the model can automatically invoke the registered weather function when it needs it.

chatHistory.AddUserMessage("""
    Is the weather good today for a hike?
    If yes, I live in the greater Montreal area and would like an easy hike. 
    I don't mind driving a bit to get there. I don't want the hike to be over 10 miles round trip.
    I'd consider a point-to-point hike.
    I want the hike to be as isolated as possible. I don't want to see many people.
    I would like it to be as bug free as possible.
    """);

Console.WriteLine($"{chatHistory.Last().Role} >>> {chatHistory.Last().Content}");

// Retrieve the chat completion service that was registered on the kernel
IChatCompletionService service = kernel.GetRequiredService<IChatCompletionService>();

chatHistory.Add(await service.GetChatMessageContentAsync(
    chatHistory,
    new OpenAIPromptExecutionSettings()
    {
        MaxTokens = 400,
        // Allow the model to automatically invoke the registered kernel functions (tools)
        ToolCallBehavior = ToolCallBehavior.AutoInvokeKernelFunctions
    },
    kernel));

Console.WriteLine($"{chatHistory.Last().Role} >>> {chatHistory.Last().Content}");

Customize the system prompt and user message to see how the model responds to help you find a hike that you'll like.
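If you'd like to keep chatting instead of sending a single message, one option (not part of the sample) is to wrap the last two steps in a loop that reuses the same kernel, service, and chatHistory, so the model keeps its context and can keep calling the weather tool:

// Minimal interactive loop sketch: press Enter on an empty line to exit.
while (true)
{
    Console.Write("You >>> ");
    string? input = Console.ReadLine();
    if (string.IsNullOrWhiteSpace(input))
    {
        break;
    }

    chatHistory.AddUserMessage(input);

    chatHistory.Add(await service.GetChatMessageContentAsync(
        chatHistory,
        new OpenAIPromptExecutionSettings()
        {
            MaxTokens = 400,
            ToolCallBehavior = ToolCallBehavior.AutoInvokeKernelFunctions
        },
        kernel));

    Console.WriteLine($"{chatHistory.Last().Role} >>> {chatHistory.Last().Content}");
}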

Clean up resources

When you no longer need the sample application or resources, remove the corresponding deployment and all resources.

azd down

Troubleshoot

On Windows, you might get the following error messages after running azd up:

postprovision.ps1 is not digitally signed. The script will not execute on the system

The script postprovision.ps1 is executed to set the .NET user secrets used in the application. To avoid this error, run the following PowerShell command:

Set-ExecutionPolicy -Scope Process -ExecutionPolicy Bypass

Then re-run the azd up command.

Another possible error:

'pwsh' is not recognized as an internal or external command, operable program or batch file. WARNING: 'postprovision' hook failed with exit code: '1', Path: '.\infra\post-script\postprovision.ps1'. : exit code: 1 Execution will continue since ContinueOnError has been set to true.

The script postprovision.ps1 is executed to set the .NET user secrets used in the application. To avoid this error, manually run the script using the following PowerShell command:

.\infra\post-script\postprovision.ps1

The .NET AI apps now have the user secrets configured, and you can test them.

Next steps