What metrics should you monitor on the Azure OpenAI analytics dashboard to assess product visibility and performance?
Last updated: 11/13/2025
To assess product visibility and performance on the Azure OpenAI analytics dashboard, you should monitor several key metrics. These include:
- Azure OpenAI Requests: The number of calls made to the Azure OpenAI API. (Microsoft Learn)
- Active Tokens: Total tokens minus cached tokens over a period of time. (Microsoft Learn)
- Generated Completion Tokens: The number of tokens generated (output) by an OpenAI model. (Microsoft Learn)
- Time to Response: The time taken for the first response to appear after a user sends a prompt. (Microsoft Learn)
- Prompt Token Cache Match Rate: The percentage of prompt tokens that hit the cache. (Microsoft Learn)
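Two of the metrics above are derived values rather than raw counters. As a minimal sketch based only on the definitions given here (the function names and the sample numbers are illustrative, not part of any Azure API), Active Tokens and Prompt Token Cache Match Rate can be computed from raw token counts like this:

```python
def active_tokens(total_tokens: int, cached_tokens: int) -> int:
    """Active Tokens: total tokens minus cached tokens over a period of time."""
    return total_tokens - cached_tokens


def cache_match_rate(cached_prompt_tokens: int, total_prompt_tokens: int) -> float:
    """Prompt Token Cache Match Rate: percentage of prompt tokens that hit the cache."""
    if total_prompt_tokens == 0:
        return 0.0
    return 100.0 * cached_prompt_tokens / total_prompt_tokens


# Illustrative window: 10,000 prompt tokens processed, 2,500 of them cache hits.
print(active_tokens(10_000, 2_500))       # 7500
print(cache_match_rate(2_500, 10_000))    # 25.0
```

A high cache match rate together with a low active-token count generally indicates that prompt caching is absorbing a large share of traffic, which tends to reduce both cost and time to response.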
References
- Azure OpenAI monitoring data reference - Microsoft Learn
- Monitor Azure OpenAI in Azure AI Foundry Models | Microsoft Learn
- 9 Metrics Every AI-Powered QA Dashboard Should Include - Insight7
- Monitor your OpenAI usage with Datadog
- Azure OpenAI in Azure AI Foundry Models abuse monitoring
- User Analytics for ChatGPT Enterprise and Edu - OpenAI Help Center
- Product Analytics Dashboard: Which Metrics to Track and How to ...