Hi Taimoor Akhtar,
Welcome to Microsoft Q&A forum. Thank you for posting your query.
The error message suggests that the Microsoft Entra ID (formerly Azure AD) authentication flow for Azure OpenAI Service is failing when it goes through LangChain’s AzureAIChatCompletionsModel. Since AsyncAzureOpenAI works but LangChain fails, the problem is most likely in how LangChain is handling authentication.
Steps to Resolve the Issue
Verify Your Azure OpenAI Configuration:
Make sure you have:
Assigned the correct Microsoft Entra ID role to your identity (Cognitive Services OpenAI User or Cognitive Services User).
The correct endpoint and deployment name in your LangChain code.
An API version that supports chat completions (e.g., "2023-12-01-preview").
To check your role:
az ad signed-in-user show
az role assignment list --assignee "<your-client-id>"
Make sure your user or service principal has the required role.
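If the role is missing, you can assign it at the scope of your Azure OpenAI resource. This is a sketch; the resource ID is a placeholder for your own resource, and it assumes the built-in Cognitive Services OpenAI User role is the one you need:
az role assignment create --assignee "<your-client-id>" --role "Cognitive Services OpenAI User" --scope "<your-azure-openai-resource-id>"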
Confirm That Azure CLI Authentication Works:
Since you are using AzureCliCredential, verify that it actually retrieves a valid token:
Run:
az account get-access-token --resource https://cognitiveservices.azure.com
You should see a token in the response. If not, re-login:
az login
Then, confirm:
az account show
Ensure it’s using the correct subscription.
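You can also confirm from Python that AzureCliCredential can obtain a token for the Cognitive Services scope. A minimal sketch, using the same scope as the CLI command above:
from azure.identity import AzureCliCredential
# Ask the Azure CLI login for a token scoped to Cognitive Services
credential = AzureCliCredential()
token = credential.get_token("https://cognitiveservices.azure.com/.default")
# A token with a future expiry (epoch seconds) means the credential is usable
print("Token acquired, expires_on:", token.expires_on)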
Explicitly Pass the Credential to LangChain:
LangChain’s AzureAIChatCompletionsModel does not pick up your Azure CLI login on its own. You need to pass the credential (or a token obtained from it) explicitly.
Modify your code along these lines:
Pass the Azure CLI Credential to the Model
from azure.identity import AzureCliCredential
from langchain_azure_ai.chat_models import AzureAIChatCompletionsModel
# Use the Azure CLI login as the Entra ID token source
credential = AzureCliCredential()
# LangChain model setup - AzureAIChatCompletionsModel takes the azure.identity
# credential directly via its credential parameter; the endpoint, deployment
# name, and API version below are placeholders for your own resource, and the
# exact parameter names may vary slightly between langchain-azure-ai versions
llm = AzureAIChatCompletionsModel(
    endpoint="https://your-openai-instance.openai.azure.com",
    credential=credential,
    model_name="your-deployment-name",
    api_version="2023-12-01-preview",
)
response = llm.invoke("Hello, how are you?")
print(response.content)
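If your deployment is a standard Azure OpenAI chat deployment, another option is the AzureChatOpenAI class from the langchain-openai package, which accepts an Entra ID token provider so tokens are refreshed automatically. This is a sketch, assuming langchain-openai is installed and reusing the placeholder endpoint and deployment name from above:
from azure.identity import AzureCliCredential, get_bearer_token_provider
from langchain_openai import AzureChatOpenAI
# Wrap the CLI credential in a provider so the token is refreshed on expiry
token_provider = get_bearer_token_provider(
    AzureCliCredential(), "https://cognitiveservices.azure.com/.default"
)
llm = AzureChatOpenAI(
    azure_endpoint="https://your-openai-instance.openai.azure.com",
    azure_deployment="your-deployment-name",
    api_version="2023-12-01-preview",
    azure_ad_token_provider=token_provider,
)
print(llm.invoke("Hello, how are you?").content)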
Debug Using OpenAI Python SDK
Before testing in LangChain, confirm that authentication works with the Azure client in OpenAI’s openai package:
import asyncio
from azure.identity import AzureCliCredential
from openai import AsyncAzureOpenAI
# Fetch an Entra ID access token from the Azure CLI login
credential = AzureCliCredential()
token = credential.get_token("https://cognitiveservices.azure.com/.default").token
client = AsyncAzureOpenAI(
    api_version="2023-12-01-preview",
    azure_endpoint="https://your-openai-instance.openai.azure.com",
    azure_deployment="your-deployment-name",
    azure_ad_token=token,
)
# chat.completions.create is a coroutine, so call it from an async function
async def main():
    response = await client.chat.completions.create(
        model="your-deployment-name",
        messages=[{"role": "user", "content": "Your Content"}],
    )
    print(response.choices[0].message.content)
asyncio.run(main())
If this works but LangChain fails, the issue is LangChain-specific.
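Also note that the token retrieved above is short-lived (typically around an hour). For long-running applications, the openai package also accepts a token provider that refreshes it for you; a sketch of just the client setup, with the same placeholders:
from azure.identity import AzureCliCredential, get_bearer_token_provider
from openai import AsyncAzureOpenAI
# The provider re-fetches the Entra ID token whenever it is about to expire
token_provider = get_bearer_token_provider(
    AzureCliCredential(), "https://cognitiveservices.azure.com/.default"
)
client = AsyncAzureOpenAI(
    api_version="2023-12-01-preview",
    azure_endpoint="https://your-openai-instance.openai.azure.com",
    azure_ad_token_provider=token_provider,
)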
Hope this helps. Do let us know if you have any further queries.
-------------
If this answers your query, do click Accept Answer and Yes for "Was this answer helpful".
Thank you.