Emit metrics for consumption of large language model tokens
APPLIES TO: All API Management tiers
The llm-emit-token-metric policy sends custom metrics to Application Insights about consumption of large language model (LLM) tokens through LLM APIs. Token count metrics include: Total Tokens, Prompt Tokens, and Completion Tokens.
Note
Currently, this policy is in preview.
Note
Set the policy's elements and child elements in the order provided in the policy statement. Learn more about how to set or edit API Management policies.
Supported models
Use the policy with LLM APIs added to Azure API Management that are available through the Azure AI Model Inference API.
Limits for custom metrics
Azure Monitor imposes usage limits for custom metrics that can affect your ability to emit metrics from API Management. For example, Azure Monitor currently sets a limit of 10 dimension keys per metric, and a limit of 50,000 total active time series per region in a subscription (within a 12-hour period).
These limits have the following implications for configuring custom metrics in an API Management policy such as emit-metric or azure-openai-emit-token-metric:
- You can configure a maximum of 10 custom dimensions per policy.
- The number of active time series generated by the policy within a 12-hour period is the product of the number of unique values of each configured dimension during the period. For example, if three custom dimensions were configured in the policy, and each dimension had 10 possible values within the period, the policy would contribute 1,000 (10 x 10 x 10) active time series. A sketch of such a three-dimension configuration follows this list.
- If you configure the policy in multiple API Management instances that are in the same region in a subscription, all instances can contribute to the regional active time series limit.
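To make the arithmetic concrete, here's a minimal sketch of the policy with three dimensions. The namespace and the non-default dimension names are illustrative assumptions, not values from this reference. If each dimension takes 10 distinct values within a 12-hour period, this configuration contributes up to 1,000 active time series.

<llm-emit-token-metric namespace="contoso-llm-metrics">
    <!-- Default dimension: value is supplied automatically from the request context -->
    <dimension name="API ID" />
    <!-- Illustrative dimensions populated by policy expressions -->
    <dimension name="Client IP" value="@(context.Request.IpAddress)" />
    <dimension name="Gateway region" value="@(context.Deployment.Region)" />
</llm-emit-token-metric>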
Learn more about design limitations and considerations for custom metrics in Azure Monitor.
Prerequisites
- One or more LLM APIs must be added to your API Management instance.
- Your API Management instance must be integrated with Application Insights. For more information, see How to integrate Azure API Management with Azure Application Insights.
- Enable Application Insights logging for your LLM APIs.
- Enable custom metrics with dimensions in Application Insights. For more information, see Emit custom metrics.
Policy statement
<llm-emit-token-metric namespace="metric namespace">
    <dimension name="dimension name" value="dimension value" />
    ...additional dimensions...
</llm-emit-token-metric>
Attributes
Attribute | Description | Required | Default value |
---|---|---|---|
namespace | A string. Namespace of metric. Policy expressions aren't allowed. | No | API Management |
Elements
Element | Description | Required |
---|---|---|
dimension | Add one or more of these elements for each dimension included in the metric. | Yes |
dimension attributes
Attribute | Description | Required | Default value |
---|---|---|---|
name | A string or policy expression. Name of dimension. | Yes | N/A |
value | A string or policy expression. Value of dimension. Can be omitted only if name matches one of the default dimensions; in that case, the value is provided according to the dimension name. | No | N/A |
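For illustration, value can be a literal string or a policy expression. In this sketch, the dimension names are hypothetical; the expression uses the context variable available in policy expressions:

<dimension name="Environment" value="production" />
<dimension name="Client IP" value="@(context.Request.IpAddress)" />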
Default dimension names that may be used without value
- API ID
- Operation ID
- Product ID
- User ID
- Subscription ID
- Location
- Gateway ID
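For example, a default dimension can be declared by name alone, and API Management supplies its value from the request context (a sketch using two of the names above):

<dimension name="Subscription ID" />
<dimension name="Location" />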
Usage
- Policy sections: inbound
- Policy scopes: global, workspace, product, API, operation
- Gateways: classic, v2, consumption, self-hosted, workspace
Usage notes
- This policy can be used multiple times per policy definition; see the sketch after these notes.
- You can configure at most 10 custom dimensions for this policy.
- Where available, values in the usage section of the response from the LLM API are used to determine token metrics.
- Certain LLM endpoints support streaming of responses. When stream is set to true in the API request to enable streaming, token metrics are estimated.
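As a sketch of the first note, the policy can appear more than once in the same section, for example to emit token counts under two different namespaces with different dimensions. The namespaces here are illustrative assumptions:

<inbound>
    <!-- Illustrative: token counts split per API -->
    <llm-emit-token-metric namespace="llm-usage-by-api">
        <dimension name="API ID" />
    </llm-emit-token-metric>
    <!-- Illustrative: the same counts split per subscription -->
    <llm-emit-token-metric namespace="llm-usage-by-subscription">
        <dimension name="Subscription ID" />
    </llm-emit-token-metric>
</inbound>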
Example
The following example sends LLM token count metrics to Application Insights along with API ID as a custom dimension.
<policies>
    <inbound>
        <llm-emit-token-metric namespace="MyLLM">
            <dimension name="API ID" />
        </llm-emit-token-metric>
    </inbound>
    <outbound>
    </outbound>
</policies>
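With this configuration, the emitted token counts should surface in Application Insights as custom metrics under the MyLLM namespace, where they can be filtered or split by the API ID dimension (assuming custom metrics with dimensions are enabled, as described in the prerequisites).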
Related policies
- Logging
- emit-metric policy
- azure-openai-emit-token-metric policy
- llm-token-limit policy
Related content
For more information about working with policies, see:
- Tutorial: Transform and protect your API
- Policy reference for a full list of policy statements and their settings
- Policy expressions
- Set or edit policies
- Reuse policy configurations
- Policy snippets repo
- Azure API Management policy toolkit
- Author policies using Microsoft Copilot in Azure