Unable to authenticate access to Azure OpenAI Service model using Microsoft Entra ID for LangChain?

Taimoor Akhtar 20 Reputation points Microsoft Employee
2025-03-03T03:56:15.6666667+00:00

I have set up Azure OpenAI Service and deployed a model for access using Microsoft Entra ID. I was able to connect to the model using Python's openai library (specifically AsyncAzureOpenAI). However, I have had no success when attempting the same for LangChain (specifically via the AzureAIChatCompletionsModel class in langchain_azure_ai.chat_models). I am certain that the api_version is correct for the model, and I use AzureCliCredential for authentication. Nonetheless, I get the following error:

ClientAuthenticationError: (None) Unauthorized. Access token is missing, invalid, audience is incorrect (https://cognitiveservices.azure.com), or have expired.

I understand that the above error has been reported in other tickets too. I tried the tips from those tickets but have not found a solution yet. Kindly provide assistance.

Azure OpenAI Service

Accepted answer
  1. Prashanth Veeragoni 795 Reputation points Microsoft External Staff
    2025-03-03T06:48:04.41+00:00

    Hi Taimoor Akhtar,

    Welcome to Microsoft Q&A forum. Thank you for posting your query.

    The error message suggests that the authentication flow for Azure OpenAI Service using Microsoft Entra ID (formerly Azure AD) with LangChain’s AzureAIChatCompletionsModel is failing. Since AsyncAzureOpenAI works but LangChain fails, this is likely an issue with how LangChain handles authentication.

    Steps to Resolve the Issue

    Verify Your Azure OpenAI Configuration:

    Make sure you have:

    Assigned the correct Microsoft Entra ID role to your identity (Cognitive Services OpenAI User or Cognitive Services User).

    The correct endpoint and deployment name in your LangChain code.

    The API version that supports AzureAIChatCompletionsModel (e.g., "2023-12-01-preview").

    To check your role:

    az ad signed-in-user show
    az role assignment list --assignee "<your-user-or-principal-id>"
    

    Make sure your user or service principal has the required role.
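    To sanity-check the JSON that `az role assignment list` returns, a short script can scan it for one of the data-plane roles. This is a minimal sketch with illustrative sample data (the role names in `sample` are examples, not your real assignments):

```python
import json

# Built-in roles that grant data-plane access to Azure OpenAI inference.
REQUIRED_ROLES = {"Cognitive Services OpenAI User", "Cognitive Services User"}

def has_required_role(assignments):
    """Return True if any assignment carries one of the required roles."""
    return any(a.get("roleDefinitionName") in REQUIRED_ROLES for a in assignments)

# Illustrative sample of `az role assignment list -o json` output.
sample = json.loads("""
[
  {"roleDefinitionName": "Reader", "scope": "/subscriptions/xxx"},
  {"roleDefinitionName": "Cognitive Services OpenAI User", "scope": "/subscriptions/xxx"}
]
""")

print(has_required_role(sample))  # True for the sample above
```

    If this prints False for your real output, add the missing role assignment before debugging anything else.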

    Confirm That Azure CLI Authentication Works:

    Since you are using AzureCliCredential, verify that it actually retrieves a valid token:

    Run:

    az account get-access-token --resource https://cognitiveservices.azure.com
    

    You should see a token in the response. If not, re-login:

    az login
    

    Then, confirm:

    az account show
    

    Ensure it’s using the correct subscription.
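    If a token is returned but the service still rejects it, the token’s `aud` (audience) claim may not match `https://cognitiveservices.azure.com`. Access tokens are JWTs, so the payload can be decoded locally with the standard library to inspect that claim (for debugging only — never log real tokens). The token below is a synthetic, unsigned example:

```python
import base64
import json

def jwt_claims(token: str) -> dict:
    """Decode the payload segment of a JWT without verifying its signature."""
    payload = token.split(".")[1]
    payload += "=" * (-len(payload) % 4)  # restore stripped base64 padding
    return json.loads(base64.urlsafe_b64decode(payload))

# Build a synthetic, unsigned token for illustration; locally you would
# pass the token string from credential.get_token(...) instead.
header = base64.urlsafe_b64encode(b'{"alg":"none"}').rstrip(b"=").decode()
body = base64.urlsafe_b64encode(
    b'{"aud":"https://cognitiveservices.azure.com","exp":1900000000}'
).rstrip(b"=").decode()

claims = jwt_claims(f"{header}.{body}.")
print(claims["aud"])  # https://cognitiveservices.azure.com
```

    The `aud` value should be exactly `https://cognitiveservices.azure.com`; anything else (e.g. a Graph or ARM audience) explains the Unauthorized error.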

    Explicitly Pass the Credential to LangChain:

    LangChain’s AzureAIChatCompletionsModel does not automatically pick up your Azure CLI login. You need to explicitly pass the credential (or a token retrieved from it).

    Modify your code like this:

    Pass the Azure CLI Credential Directly

    from azure.identity import AzureCliCredential
    from langchain_azure_ai.chat_models import AzureAIChatCompletionsModel
    
    # Use the Azure CLI login; the credential object handles token
    # acquisition and refresh automatically.
    credential = AzureCliCredential()
    
    # LangChain model setup. Parameter names follow the langchain-azure-ai
    # package (endpoint / credential / model_name); check your installed
    # version's signature if this differs.
    llm = AzureAIChatCompletionsModel(
        endpoint="https://your-openai-instance.openai.azure.com",
        credential=credential,
        model_name="your-deployment-name",
        api_version="2023-12-01-preview",
    )
    
    response = llm.invoke("Hello, how are you?")
    print(response)
    
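    A note on raw tokens: an Entra ID access token retrieved with get_token expires (typically after about an hour), so a long-running app must refresh it rather than fetch it once at startup. Below is a minimal, library-agnostic sketch of token caching with refresh, where fake_fetch stands in for credential.get_token(...); the names here are illustrative, not from any Azure SDK:

```python
import time

class CachedToken:
    """Cache a bearer token and refresh it shortly before expiry."""

    def __init__(self, fetch, skew=300):
        self._fetch = fetch      # callable returning (token, expires_on_epoch)
        self._skew = skew        # refresh this many seconds before expiry
        self._token = None
        self._expires_on = 0.0

    def get(self):
        # Fetch a fresh token on first use or when close to expiry.
        if self._token is None or time.time() >= self._expires_on - self._skew:
            self._token, self._expires_on = self._fetch()
        return self._token

# Demo with a fake fetcher; in real code, fetch would wrap
# AzureCliCredential().get_token("https://cognitiveservices.azure.com/.default").
calls = []
def fake_fetch():
    calls.append(1)
    return f"token-{len(calls)}", time.time() + 3600

cache = CachedToken(fake_fetch)
print(cache.get())  # token-1
print(cache.get())  # token-1 (still cached; only one fetch occurred)
```

    The azure-identity credential classes already do this internally, which is one more reason to pass the credential object rather than a one-shot token string where the API allows it.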

    Debug Using the OpenAI Python SDK

    Before testing in LangChain, ensure authentication works with the openai package directly:

    import asyncio
    
    from openai import AsyncAzureOpenAI
    from azure.identity import AzureCliCredential
    
    # Retrieve a bearer token for the Cognitive Services scope.
    credential = AzureCliCredential()
    token = credential.get_token("https://cognitiveservices.azure.com/.default").token
    
    client = AsyncAzureOpenAI(
        api_version="2023-12-01-preview",
        azure_endpoint="https://your-openai-instance.openai.azure.com",
        azure_deployment="your-deployment-name",
        azure_ad_token=token,
    )
    
    # `await` must run inside an async function (or a notebook cell).
    async def main():
        response = await client.chat.completions.create(
            model="your-deployment-name",
            messages=[{"role": "user", "content": "Your Content"}],
        )
        print(response)
    
    asyncio.run(main())
    

    If this works but LangChain fails, the issue is LangChain-specific.

    Hope this helps. Do let us know if you have any further queries.

    ------------- 

    If this answers your query, do click Accept Answer and Yes for was this answer helpful.

    Thank you.

    1 person found this answer helpful.

0 additional answers
