How to use parallel function calling in Azure AI Studio Prompt Flow with VS Code?

Diogo Guerreiro 5 Reputation points
2025-01-09T15:42:53.99+00:00

I'm working on implementing multiple function calls in response to a single user query using Azure OpenAI chat models in Prompt Flow. The functionality works as expected when I use Python directly (as shown in Microsoft's documentation), but there is no explanation of how to make it work the same way in Prompt Flow. I only see the option to use function_call, and that does not work for parallel function calling, since it can only pass a single function call.

The tool_calls field in the response_message remains null, so I assume this can work in Prompt Flow. I suspect the problem lies in the definition of the flow.dag.yaml file and in the LLM node. I haven't found any docs that cover this, but I believe some settings in the YAML and in the LLM node could make it work.
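For comparison, when calling the service directly in Python, parallel function calling goes through the `tools` request parameter (each legacy `functions` entry wrapped in a `{"type": "function", ...}` object) together with `tool_choice`, rather than `function_call`. A minimal sketch of that wrapping, reusing the same schemas as my flow (this is my understanding of the standard Chat Completions format, not a Prompt Flow setting):

```python
# Sketch: wrapping legacy `functions` definitions in the `tools` format
# that the newer Chat Completions API expects for parallel tool calling.
# The schemas mirror the ones defined in flow.dag.yaml below.
functions = [
    {
        "name": "get_current_weather",
        "description": "Get the current weather in a given location",
        "parameters": {
            "type": "object",
            "properties": {
                "location": {"type": "string",
                             "description": "The city name, e.g. San Francisco"},
                "unit": {"type": "string",
                         "enum": ["celsius", "fahrenheit"]},
            },
            "required": ["location"],
        },
    },
    {
        "name": "get_current_time",
        "description": "Get the current time in a given location",
        "parameters": {
            "type": "object",
            "properties": {
                "location": {"type": "string",
                             "description": "The city name, e.g. San Francisco"},
            },
            "required": ["location"],
        },
    },
]

# Each legacy entry becomes {"type": "function", "function": {...}}.
tools = [{"type": "function", "function": f} for f in functions]
```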

Below you can see my code. Thanks in advance.

flow.dag.yaml

id: use_functions_with_chat_models
name: Use Functions with Chat Models
environment:
  python_requirements_txt: requirements.txt
inputs:
  chat_history:
    type: list
    default:
    - inputs:
        question: What is the weather like in Boston?
      outputs:
        answer: '{"forecast":["sunny","windy"],"location":"Boston","temperature":"72","unit":"fahrenheit"}'
        llm_output:
          content: null
          function_call:
            name: get_current_weather
            arguments: |-
              {
                "location": "Boston"
              }
          role: assistant
    is_chat_input: false
    is_chat_history: true
  question:
    type: string
    default: How about London next week?
    is_chat_input: true
outputs:
  answer:
    type: string
    reference: ${run_function.output}
    is_chat_output: true
  llm_output:
    type: object
    reference: ${use_functions_with_chat_models.output}
nodes:
- name: run_function
  type: python
  source:
    type: code
    path: run_function.py
  inputs:
    response_message: ${use_functions_with_chat_models.output}
  use_variants: false
- name: use_functions_with_chat_models
  type: llm
  source:
    type: code
    path: use_functions_with_chat_models.jinja2
  inputs:
    deployment_name: gpt-4o
    temperature: 0.7
    top_p: 0.9
    response_format:
      type: text
    functions:
    - name: get_current_weather
      description: Get the current weather in a given location
      parameters:
        type: object
        properties:
          location:
            type: string
            description: The city name, e.g. San Francisco
          unit:
            type: string
            enum:
            - celsius
            - fahrenheit
        required:
        - location
    - name: get_current_time
      description: Get the current time in a given location
      parameters:
        type: object
        properties:
          location:
            type: string
            description: The city name, e.g. San Francisco
        required:
        - location
    function_call: auto
    chat_history: ${inputs.chat_history}
    question: ${inputs.question}
  provider: AzureOpenAI
  connection: my-connection
  api: chat
  module: promptflow.tools.aoai
  use_variants: false

use_functions_with_chat_models.jinja2

# system:
Don't make assumptions about what values to plug into functions. Ask for clarification if a user request is ambiguous.
If you identify that you need to call more than one function, please call them.

{% for item in chat_history %}
# user:
{{item.inputs.question}}

{% if 'function_calls' in item.outputs.llm_output and item.outputs.llm_output.function_calls is not none %}
# assistant:
Multiple function calls requested:
{% for call in item.outputs.llm_output.function_calls %}
Function = {{call.name}}, args = {{call.arguments}}
{% endfor %}

# function calls:
{% for call in item.outputs.llm_output.function_calls %}
## name:
{{call.name}}
## content:
{{item.outputs.answers[loop.index0]}}
{% endfor %}

{% else %}
# assistant:
{{item.outputs.llm_output}}

{% endif %}

{% endfor %}

# user:
{{question}}

run_function.py

from promptflow import tool
import json
from datetime import datetime
from zoneinfo import ZoneInfo

WEATHER_DATA = {
    "tokyo": {"temperature": "10", "unit": "celsius"},
    "san francisco": {"temperature": "72", "unit": "fahrenheit"},
    "paris": {"temperature": "22", "unit": "celsius"}
}

TIMEZONE_DATA = {
    "tokyo": "Asia/Tokyo",
    "san francisco": "America/Los_Angeles",
    "paris": "Europe/Paris"
}

def get_current_weather(location, unit=None):
    location_lower = location.lower()
    for key in WEATHER_DATA:
        if key in location_lower:
            weather = WEATHER_DATA[key]
            return {
                "location": location,
                "temperature": weather["temperature"],
                "unit": unit if unit else weather["unit"]
            }
    return {"location": location, "temperature": "unknown", "unit": unit or "unknown"}

def get_current_time(location):
    location_lower = location.lower()
    for key, timezone in TIMEZONE_DATA.items():
        if key in location_lower:
            current_time = datetime.now(ZoneInfo(timezone)).strftime("%I:%M %p")
            return {"location": location, "current_time": current_time}
    return {"location": location, "current_time": "unknown"}

@tool
def run_function(response_message: dict) -> str:
    """
    Handle multiple tool calls and return responses for each function call.
    """
    if "tool_calls" not in response_message or not response_message["tool_calls"]:
        return json.dumps({"error": "No tool calls were made by the model."})

    results = []
    
    # Iterate through all tool calls
    for tool_call in response_message["tool_calls"]:
        function_name = tool_call["function"]["name"]
        function_args = json.loads(tool_call["function"]["arguments"])
        
        if function_name == "get_current_weather":
            function_response = get_current_weather(
                location=function_args.get("location"),
                unit=function_args.get("unit")
            )
        elif function_name == "get_current_time":
            function_response = get_current_time(
                location=function_args.get("location")
            )
        else:
            function_response = {"error": f"Unknown function: {function_name}"}
        
        # Append the response for this call
        results.append({
            "function_name": function_name,
            "function_response": function_response
        })

    # Return all collected results
    return json.dumps(results)
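To sanity-check the handler locally, I exercise it with a hand-built message that mimics a parallel tool-call response (the dispatch loop below is a trimmed, self-contained stand-in for run_function, parsing the calls without executing the weather/time helpers):

```python
import json

# Hand-built assistant message mimicking a parallel tool-call response:
# two function calls returned in a single turn.
response_message = {
    "role": "assistant",
    "content": None,
    "tool_calls": [
        {"id": "call_1", "type": "function",
         "function": {"name": "get_current_weather",
                      "arguments": '{"location": "Paris"}'}},
        {"id": "call_2", "type": "function",
         "function": {"name": "get_current_time",
                      "arguments": '{"location": "Paris"}'}},
    ],
}

# Trimmed dispatch loop, same shape as run_function above: iterate over
# every tool call and parse its JSON-encoded arguments.
results = []
for tool_call in response_message["tool_calls"]:
    name = tool_call["function"]["name"]
    args = json.loads(tool_call["function"]["arguments"])
    results.append({"function_name": name, "arguments": args})

print(json.dumps(results, indent=2))
```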

Azure AI services