Hello Asadbek Sindarov,
Welcome to the Microsoft Q&A and thank you for posting your questions here.
I understand that you would like to fix the unexpected backend response error.
Regarding your code and explanation:
- Since you're using the Azure.AI.OpenAI NuGet package version 2.2.0-beta.1, there may be compatibility issues with the Llama model. Check for updates or patches to the package that might address this issue.
- Make sure that the Llama-3.3-70B-Instruct model is correctly deployed and active in Azure AI Foundry. Deployment issues can cause backend errors - see https://learn.microsoft.com/en-us/azure/ai-studio/how-to/deploy-models-llama
- A misconfigured endpoint or key can also lead to unexpected responses from the backend - double-check both values thoroughly.
- You can also implement more detailed logging around the API calls to capture more information about the error.
- Here is a modified version of your code:
```csharp
using Azure;
using Azure.AI.OpenAI;
using OpenAI.Chat;
using static System.Environment;

async Task RunAsync()
{
    // Read the endpoint and key from environment variables rather than
    // passing the variable names themselves as literal strings.
    var endpoint = GetEnvironmentVariable("AZURE_OPENAI_ENDPOINT");
    var key = GetEnvironmentVariable("AZURE_OPENAI_KEY");

    AzureKeyCredential credential = new AzureKeyCredential(key);
    AzureOpenAIClient azureClient = new(new Uri(endpoint), credential);
    ChatClient chatClient = azureClient.GetChatClient("Llama-3.3-70B-Instruct");

    var messages = new List<ChatMessage>
    {
        new SystemChatMessage(""),
        new UserChatMessage("What are 3 things to visit in Seattle?")
    };

    var options = new ChatCompletionOptions
    {
        Temperature = 0.7f,
        MaxOutputTokenCount = 800,
        FrequencyPenalty = 0,
        PresencePenalty = 0,
    };

    try
    {
        ChatCompletion completion = await chatClient.CompleteChatAsync(messages, options);

        if (completion.Content != null && completion.Content.Count > 0)
        {
            Console.WriteLine($"{completion.Content[0].Kind}: {completion.Content[0].Text}");
        }
        else
        {
            Console.WriteLine("No response received.");
        }
    }
    catch (Exception ex)
    {
        // Log the full exception details to help diagnose backend errors.
        Console.WriteLine($"An error occurred: {ex.Message}");
        Console.WriteLine($"Stack Trace: {ex.StackTrace}");
    }
}

await RunAsync();
```
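- The configuration point above can be sanity-checked before calling the SDK at all. This is a minimal sketch, assuming the endpoint and key are stored in the `AZURE_OPENAI_ENDPOINT` and `AZURE_OPENAI_KEY` environment variables (adjust the names to your setup):

```csharp
using System;

// Sketch only: fail fast on missing or malformed configuration values
// before constructing the client. Variable names are assumptions.
var endpoint = Environment.GetEnvironmentVariable("AZURE_OPENAI_ENDPOINT");
var key = Environment.GetEnvironmentVariable("AZURE_OPENAI_KEY");
Console.WriteLine(OpenAIConfig.Validate(endpoint, key));

static class OpenAIConfig
{
    public static string Validate(string? endpoint, string? key)
    {
        if (string.IsNullOrWhiteSpace(endpoint) || string.IsNullOrWhiteSpace(key))
            return "Missing endpoint or key.";

        // The endpoint must be an absolute HTTPS URI,
        // not an environment-variable name or a bare hostname.
        if (!Uri.TryCreate(endpoint, UriKind.Absolute, out var uri) || uri.Scheme != Uri.UriSchemeHttps)
            return $"Endpoint is not a valid HTTPS URI: {endpoint}";

        return "OK";
    }
}
```

Running this once at startup turns a vague backend error into an immediate, readable configuration message.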
- If the issue persists, open a support request from the Azure Portal and include the detailed logs and error messages with it.
I hope this is helpful! Do not hesitate to let me know if you have any other questions or clarifications.
Please don't forget to close the thread by upvoting and accepting this as an answer if it is helpful.