Does azure-openai-emit-token-metric policy at API Management Service support cached tokens?

SS 5 Reputation points
2025-02-20T02:30:06.6733333+00:00

The Azure API Management service recently updated the azure-openai-emit-token-metric policy to support the GPT-4o model.

Does this policy support recording cached tokens? According to the official documentation on Microsoft Learn, this does not appear to be supported.

https://learn.microsoft.com/en-us/azure/api-management/azure-openai-emit-token-metric-policy

The documented token count metrics are: Total Tokens, Prompt Tokens, and Completion Tokens.
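For context, this is roughly how the policy is configured today (a minimal sketch based on the policy reference linked above; the `namespace` value and the custom dimensions shown are illustrative, and there is no documented attribute or dimension for cached tokens):

```xml
<policies>
    <inbound>
        <!-- Emits Total Tokens, Prompt Tokens, and Completion Tokens
             to Application Insights under the given namespace. -->
        <azure-openai-emit-token-metric namespace="openai-metrics">
            <!-- Optional custom dimensions (examples) -->
            <dimension name="API ID" />
            <dimension name="Client IP" value="@(context.Request.IpAddress)" />
        </azure-openai-emit-token-metric>
    </inbound>
</policies>
```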

However, I'm unsure whether this is because the documentation has not yet been updated or because the feature is genuinely not supported.


Azure API Management
