Get insight about your data from a .NET AI chat app
Get started with AI development using a .NET 8 console app to connect to an OpenAI gpt-3.5-turbo
model. You'll connect to the AI model using Semantic Kernel to analyze hiking data and provide insights.
Prerequisites
- .NET 8.0 SDK - Install the .NET 8.0 SDK.
- An API key from OpenAI so you can run this sample.
- On Windows, PowerShell v7+ is required. To validate your version, run pwsh in a terminal. It should return the current version. If it returns an error, execute the following command:
dotnet tool update --global PowerShell
Get started with AI development using a .NET 8 console app to connect to an OpenAI gpt-3.5-turbo
model deployed on Azure. You'll connect to the AI model using Semantic Kernel to analyze hiking data and provide insights.
Prerequisites
- .NET 8 SDK - Install the .NET 8 SDK.
- An Azure subscription - Create one for free.
- Access to Azure OpenAI service.
- Azure Developer CLI (Optional) - Install or update the Azure Developer CLI.
Get the sample project
Clone the sample repository
You can create your own app and follow along with the steps in the sections ahead, or you can clone the GitHub repository that contains the completed sample apps for all of the quickstarts. The sample repo is also structured as an Azure Developer CLI template that can provision an Azure OpenAI resource for you.
git clone https://github.com/dotnet/ai-samples.git
Create the Azure OpenAI service
The sample GitHub repository is structured as an Azure Developer CLI (azd) template, which azd can use to provision the Azure OpenAI service and model for you.
1. From a terminal or command prompt, navigate to the src\quickstarts\azure-openai directory of the sample repo.
2. Run the azd up command to provision the Azure OpenAI resources. It might take several minutes to create the Azure OpenAI service and deploy the model.
azd up
azd also configures the required user secrets for the sample app, such as the Azure OpenAI endpoint and model name.
Try the hiking chat sample
1. From a terminal or command prompt, navigate to the src\quickstarts\openai\semantic-kernel\03-ChattingAboutMyHikes directory.
2. Run the following commands to configure your OpenAI API key as a secret for the sample app:
dotnet user-secrets init
dotnet user-secrets set OpenAIKey <your-openai-key>
3. Use the dotnet run command to run the app:
dotnet run
1. From a terminal or command prompt, navigate to the semantic-kernel\02-HikerAI directory.
2. Use the dotnet run command to run the app:
dotnet run
Tip
If you get an error message, the Azure OpenAI resources might not have finished deploying. Wait a couple of minutes and try again.
Explore the code
The application uses the Microsoft.SemanticKernel package to send and receive requests to an OpenAI service.
The entire application is contained within the Program.cs file. The first several lines of code set configuration values and get the OpenAI key that was previously set using the dotnet user-secrets command.
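// == Retrieve the OpenAI API key saved with the dotnet user-secrets command ==========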
var config = new ConfigurationBuilder().AddUserSecrets<Program>().Build();
string model = "gpt-3.5-turbo";
string key = config["OpenAIKey"];
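If the OpenAIKey secret hasn't been set, config["OpenAIKey"] returns null. The following guard isn't part of the sample; it's just a small defensive sketch that makes that failure obvious:
// Defensive sketch (not in the sample): fail fast if the secret is missing.
if (string.IsNullOrWhiteSpace(key))
{
    Console.WriteLine("Missing OpenAIKey. Run: dotnet user-secrets set OpenAIKey <your-openai-key>");
    return;
}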
The OpenAIChatCompletionService service facilitates the requests and responses.
// Create the OpenAI Chat Completion Service
OpenAIChatCompletionService service = new(model, key);
Once the OpenAIChatCompletionService client is created, the app reads the content of the file hikes.md and uses it to provide more context to the model by adding a system prompt. This influences model behavior and the generated completions during the conversation.
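If you're building the app yourself instead of running the sample, the snippets in this section assume using directives along these lines (namespace names are based on Semantic Kernel 1.x and the Microsoft.Extensions.Configuration packages; verify them against the versions you install):
// Assumed using directives for the snippets above (verify against your package versions).
using Microsoft.Extensions.Configuration;          // ConfigurationBuilder, AddUserSecrets
using Microsoft.SemanticKernel.ChatCompletion;      // ChatHistory
using Microsoft.SemanticKernel.Connectors.OpenAI;   // OpenAIChatCompletionService, OpenAIPromptExecutionSettings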
The application uses the Microsoft.SemanticKernel package to send and receive requests to an OpenAI service deployed in Azure.
The entire application is contained within the Program.cs file. The first several lines of code load the secrets and configuration values that were set with dotnet user-secrets for you during the application provisioning.
// == Retrieve the local secrets saved during the Azure deployment ==========
var config = new ConfigurationBuilder().AddUserSecrets<Program>().Build();
string endpoint = config["AZURE_OPENAI_ENDPOINT"];
string deployment = config["AZURE_OPENAI_GPT_NAME"];
The AzureOpenAIChatCompletionService service facilitates the requests and responses.
// == Create the Azure OpenAI Chat Completion Service ==========
AzureOpenAIChatCompletionService service = new(deployment, endpoint, new DefaultAzureCredential());
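DefaultAzureCredential comes from the Azure.Identity package and resolves a token from your local developer sign-in (for example, the Azure CLI, the Azure Developer CLI, or Visual Studio), so make sure you're signed in before running the app. If you're recreating the app outside the sample, the additional pieces look roughly like this (a sketch; the connector namespace can vary by Semantic Kernel version):
// Additional using directives the Azure variant relies on (assumed; verify package versions).
using Azure.Identity;                               // DefaultAzureCredential
using Microsoft.SemanticKernel.Connectors.OpenAI;   // AzureOpenAIChatCompletionService (namespace can vary by version)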
Once the AzureOpenAIChatCompletionService client is created, the app reads the content of the file hikes.md and uses it to provide more context to the model by adding a system prompt. This influences model behavior and the generated completions during the conversation.
// Provide context for the AI model
ChatHistory chatHistory = new($"""
You are upbeat and friendly. You introduce yourself when first saying hello.
Provide a short answer only based on the user hiking records below:
{File.ReadAllText("hikes.md")}
""");
Console.WriteLine($"{chatHistory.Last().Role} >>> {chatHistory.Last().Content}");
The following code adds a user prompt to the model using the AddUserMessage function. The GetChatMessageContentAsync function instructs the model to generate a response based on the system and user prompts.
// Start the conversation
chatHistory.AddUserMessage("Hi!");
Console.WriteLine($"{chatHistory.Last().Role} >>> {chatHistory.Last().Content}");
chatHistory.Add(
await service.GetChatMessageContentAsync(
chatHistory,
new OpenAIPromptExecutionSettings()
{
MaxTokens = 400
}));
Console.WriteLine($"{chatHistory.Last().Role} >>> {chatHistory.Last().Content}");
The app adds the response from the model to the chatHistory to maintain the chat history, or context.
// Continue the conversation with a question.
chatHistory.AddUserMessage(
"I would like to know the ratio of the hikes I've done in Canada compared to other countries.");
Console.WriteLine($"{chatHistory.Last().Role} >>> {chatHistory.Last().Content}");
chatHistory.Add(await service.GetChatMessageContentAsync(
chatHistory,
new OpenAIPromptExecutionSettings()
{
MaxTokens = 400
}));
Console.WriteLine($"{chatHistory.Last().Role} >>> {chatHistory.Last().Content}");
Customize the system or user prompts to provide different questions and context:
- How many times did I hike when it was raining?
- How many times did I hike in 2021?
The model generates a relevant response to each prompt based on your inputs.
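For example, a small helper like the following (a hypothetical AskAsync local function that isn't in the sample, reusing the service and chatHistory from the snippets above) keeps each follow-up question to a few lines:
// Hypothetical helper: send one user prompt and print the model's reply.
async Task AskAsync(string question)
{
    chatHistory.AddUserMessage(question);
    Console.WriteLine($"{chatHistory.Last().Role} >>> {chatHistory.Last().Content}");
    chatHistory.Add(await service.GetChatMessageContentAsync(
        chatHistory,
        new OpenAIPromptExecutionSettings() { MaxTokens = 400 }));
    Console.WriteLine($"{chatHistory.Last().Role} >>> {chatHistory.Last().Content}");
}
await AskAsync("How many times did I hike when it was raining?");
await AskAsync("How many times did I hike in 2021?");
Because the helper appends both the question and the reply to chatHistory, each later question still sees the full conversation.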
Clean up resources
When you no longer need the sample application or resources, remove the corresponding deployment and all resources.
azd down
Troubleshoot
On Windows, you might get the following error messages after running azd up:
postprovision.ps1 is not digitally signed. The script will not execute on the system
The script postprovision.ps1 is executed to set the .NET user secrets used in the application. To avoid this error, run the following PowerShell command:
Set-ExecutionPolicy -Scope Process -ExecutionPolicy Bypass
Then re-run the azd up command.
Another possible error:
'pwsh' is not recognized as an internal or external command, operable program or batch file. WARNING: 'postprovision' hook failed with exit code: '1', Path: '.\infra\post-script\postprovision.ps1'. : exit code: 1 Execution will continue since ContinueOnError has been set to true.
The script postprovision.ps1 is executed to set the .NET user secrets used in the application. To avoid this error, manually run the script using the following PowerShell command:
.\infra\post-script\postprovision.ps1
The .NET AI apps now have the user secrets configured and they can be tested.