Getting data through an API to use it in a prompt in my application

Jasmeet Singh 0 Reputation points
2025-01-23T08:52:33.4266667+00:00

Hi,

I want to get data from an API and use it in my prompt, so that when I ask questions the model answers according to the data returned by the API. How can I do this? Please give me a step-by-step approach. I know how to use an OpenAI model in Azure; I have used it with a code approach in my Jupyter environment, and I can go with a no-code approach as well. Finally, I want to integrate the same into my web application.

Thanks
J. Singh

Azure AI Search
An Azure search service with built-in artificial intelligence capabilities that enrich information to help identify and explore relevant content at scale.

Azure OpenAI Service
An Azure service that provides access to OpenAI’s GPT-3 models with enterprise capabilities.

1 answer

  1. Manas Mohanty (Quadrant Resource LLC) 95 Reputation points Microsoft Vendor
    2025-01-27T09:16:20.4433333+00:00

    Hi Jasmeet Singh!

    Welcome to Microsoft Q&A Forum, thank you for posting your query here.

    I'm not sure about a Portal UI (no-code) way, but you can definitely use the respective Python SDK (or the requests library) to send GET and POST requests to your API and then use the returned data as the prompt in an Azure OpenAI chat completion or embedding call.

    Once you have finished testing that code, you can wrap it in a Flask API and deploy it through an Azure Web App (a minimal sketch follows the samples below).

    Attached is sample code for chat completion, along with documentation for reference.

    import os
    import requests
    from openai import AzureOpenAI
    # Set up Azure OpenAI client
    client = AzureOpenAI(
      azure_endpoint=os.getenv("AZURE_OPENAI_ENDPOINT"), 
      api_key=os.getenv("AZURE_OPENAI_API_KEY"),  
      api_version="2024-02-01"
    )
    # Function to fetch data from the API
    def fetch_api_data():
        api_key = "YOUR_API_KEY"
        endpoint = "API_ENDPOINT"
        headers = {"Authorization": f"Bearer {api_key}"}
        response = requests.get(endpoint, headers=headers)
        return response.json()
    
    # Fetch data from the API
    data = fetch_api_data()
    
    # Prepare the prompt with API data
    prompt = f"Given the following data: {data}, answer the question: 'Do other Azure AI services support customer managed keys?'"
    # Get the response from Azure OpenAI
    response = client.chat.completions.create(
        model="gpt-35-turbo",  # model = "deployment_name"
        messages=[
            {"role": "system", "content": "You are a helpful assistant."},
            {"role": "user", "content": "Does Azure OpenAI support customer managed keys?"},
            {"role": "assistant", "content": "Yes, customer managed keys are supported by Azure OpenAI."},
            {"role": "user", "content": prompt}
        ]
    )
    # Print the response
    print(response.choices[0].message.content)
    
    import os
    import requests
    from openai import AzureOpenAI

    api_key = "YOUR_API_KEY"
    endpoint = "API_ENDPOINT"
    headers = {"Authorization": f"Bearer {api_key}"}
    response = requests.get(endpoint, headers=headers)
    data = response.json()
    # Parse the JSON to extract the text you want to use in the prompt

    # Use the parsed data in your prompt flow
    openai_client = AzureOpenAI(
        azure_endpoint=os.getenv("AZURE_OPENAI_ENDPOINT"),
        api_key=os.getenv("AZURE_OPENAI_API_KEY"),
        api_version="2024-02-01"
    )

    # Data from the API call above is interpolated into the prompt below
    prompt = f"Given the following data: {data}, answer the question: 'What is the key insight?'"

    response = openai_client.chat.completions.create(
        model="gpt-35-turbo",  # model = "deployment_name"
        messages=[{"role": "user", "content": prompt}]
    )
    print(response.choices[0].message.content)
    
    
    

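    For the web application part, below is a minimal sketch (not from the official samples) of how the same logic could be wrapped in a Flask endpoint and deployed to an Azure Web App. The /ask route name, port, and request payload shape are illustrative assumptions, so adjust them to your application.

    import os
    import requests
    from flask import Flask, jsonify, request
    from openai import AzureOpenAI

    app = Flask(__name__)

    # Reuse the same Azure OpenAI client setup as in the samples above
    client = AzureOpenAI(
        azure_endpoint=os.getenv("AZURE_OPENAI_ENDPOINT"),
        api_key=os.getenv("AZURE_OPENAI_API_KEY"),
        api_version="2024-02-01"
    )

    def fetch_api_data():
        # Placeholder credentials and endpoint - replace with your own API details
        headers = {"Authorization": f"Bearer {'YOUR_API_KEY'}"}
        response = requests.get("API_ENDPOINT", headers=headers)
        return response.json()

    @app.route("/ask", methods=["POST"])
    def ask():
        # Expects a JSON body like {"question": "..."} (assumed payload shape)
        question = request.get_json().get("question", "")
        data = fetch_api_data()
        prompt = f"Given the following data: {data}, answer the question: '{question}'"
        completion = client.chat.completions.create(
            model="gpt-35-turbo",  # model = "deployment_name"
            messages=[
                {"role": "system", "content": "You are a helpful assistant."},
                {"role": "user", "content": prompt}
            ]
        )
        return jsonify({"answer": completion.choices[0].message.content})

    if __name__ == "__main__":
        app.run(host="0.0.0.0", port=8000)

    Your web application can then call this endpoint, for example with requests.post("https://<your-webapp>.azurewebsites.net/ask", json={"question": "What is the key insight?"}), and display the returned answer.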
    Kindly refer below for detailed SDK usage.

    programming-language-python

    Please don't forget to upvote this answer if it helped.

    Thank you.

