Observability Dashboard
The Observability Dashboard lets you examine the AI capabilities used by Copilot, Autopilot Knowledge, and Autopilot. The dashboard provides comprehensive insight into the performance of generative responses, helping you identify areas for improvement and optimize your operations.
Copilot
The Observability Dashboard for Copilot provides information about generative responses, Copilot for Agents Queries, and AutoSummary.
- Click the app selector and select Actions.
- On Actions, click Observability Dashboard. By default, the Copilot dashboard appears. It contains three charts.
- If needed, update the Date Range for the dashboard and select the interaction type in Channel. Choose to view details for Voice interactions, Chat interactions, or All.
- Click Run Query. The three graphs are updated to reflect your desired dates and channels.
Viewing Data About Generative Responses
Generative responses are answers automatically generated during calls.
- Click the Generative Responses label to drill down into the statistics. Three graphs appear:
  - Over Time: Shows the percentage of answers that were used, modified, ignored, or resulted in no answer over time.
  - By Category: Displays the details by category.
  - Average Kb Per Interaction: Shows the average number of knowledge base interactions per day.
- You can customize the data that appears in the graphs:
  - In the Over Time and By Category graphs, toggle the display of different answer statuses by clicking the legends.
  - In the Over Time and By Category graphs, click Absolute Numbers or Percentage to switch between percentages and absolute numbers.
  - In all three graphs, click Maximize to view the data in a table format.
- Scroll down to see the data grouped by categories. The Categories View organizes the knowledge base answers into different categories, offering a structured approach to analysis. Each category presents:
  - Total volume of knowledge base answers
  - Average adherence score
  - Average number of links and images provided
  - Average knowledge score (score assigned by the knowledge base)
- Clicking a category reveals the specific queries and their details, such as:
  - Query sent to the knowledge base
  - Suggested knowledge base answer
  - Agent's actual response
  - Number of links and images provided
  - Query feedback
  - Adherence score (the similarity between the suggested and actual response; see the sketch after this list)
  - Offset from the beginning of the interaction
- Click Play Interaction to listen to the audio of the interaction (if available).
- Click the Info button next to the query to view the query feedback for the interaction. In the Response details panel on the right, you can see whether the AI-generated response received positive or negative feedback, along with any comments and tags provided.
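The product documentation does not specify how the adherence score is calculated. As a rough, purely illustrative sketch of what "similarity between the suggested and actual response" can mean, the example below uses a simple word-overlap (Jaccard) measure; this is an assumption for illustration only, not the dashboard's actual scoring method.

```python
# Illustrative only: a simple word-overlap similarity between a suggested
# knowledge base answer and the agent's actual response. The dashboard's
# real adherence score may be computed very differently.

def word_overlap_similarity(suggested: str, actual: str) -> float:
    """Jaccard similarity over lowercase word sets, in the range 0.0 to 1.0."""
    a = set(suggested.lower().split())
    b = set(actual.lower().split())
    if not a and not b:
        return 1.0
    return len(a & b) / len(a | b)

suggested = "Refunds are processed within 5 business days after approval."
actual = "Your refund is processed within 5 business days once it is approved."
print(f"Illustrative adherence: {word_overlap_similarity(suggested, actual):.2f}")
```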
- You can switch the way you view this data. The default view is by category. Click Group By to change the grouping from Category to one of the following:
  - Master Contact
  - Team
  - Skill
  - Agent ID
  The data reappears based on the new grouping.
Viewing Data About Agent Queries
Agent Queries displays information about the knowledge base answers generated on-demand based on agent questions. It provides graphs that show the status of direct queries and adherence scores.
- Click the chart heading, Agent Queries, to drill down into the statistics. The first graph shows the percentage of responses and no responses Over Time. The second graph displays the details By Category.
- You can toggle the display of different answer statuses by clicking the legends.
- Click Absolute Numbers or Percentage to switch between percentages and absolute numbers.
- Click Maximize to view the graph in full screen.
- Scroll down to see the data grouped by categories. The Categories View organizes the knowledge base answers into different categories, offering a structured approach to analysis. Each category presents:
  - Total volume of responses
  - Number and average of no responses
  - Average number of links and images provided
  - Average knowledge score (score assigned by the knowledge base)
- Clicking a category reveals the specific queries and their details, such as:
  - Agent query sent to the knowledge base
  - Response to the agent query
  - Number of links and images provided
  - Date and time of the response
  - Average knowledge score
Viewing Data About AutoSummary Queries
AutoSummary provides a comprehensive view of summary performance. You can see graphs that track performance over time, with data grouped by intent and skill. Detailed tables display suggested summaries alongside actual summaries, complete with adherence scores to gauge accuracy. For more comprehensive details, you can play back specific interactions, which will give you a full picture of how summaries are generated and used in real conversations.
- Click the chart heading, AutoSummary, to drill down into the statistics.
- The first graph shows the percentage of summaries that were used over time. Summaries are identified as being used in one of the following ways: As Is, Revised, with Minor Revisions, and Ignored.
- The second graph displays the details By Intent.
- The third graph displays the details By Skill.
- The fourth graph displays the details By Team.
- You can toggle the display of different answer statuses by clicking the legends.
- Click Absolute Numbers or Percentage to switch between percentages and absolute numbers.
- Click Maximize to view the graph in full screen.
- Scroll down to see the data grouped by categories. The Categories View organizes the knowledge base answers into different categories, offering a structured approach to analysis. Each category presents:
  - Total volume of responses
  - Number and average of no responses
  - Average number of links and images provided
  - Average knowledge score (score assigned by the knowledge base)
- Clicking a category reveals the specific queries and their details, such as:
  - Agent query sent to the knowledge base
  - Response to the agent query
  - Overall feedback
  - Number of links and images provided
  - Date and time of the response
  - Average knowledge score
- Click the Info button next to the query to view the overall feedback for the interaction. In the Response details panel on the right, you can see whether the interaction received positive or negative feedback, along with any comments and tags provided.

As a CX leader, Mark wants to use the Observability Dashboard to identify knowledge gaps in the knowledge base. He notices that many knowledge base suggestions are being modified or ignored. Mark clicks the Generative Responses label to view data by category. He focuses on categories with low average adherence scores, which indicate misalignment between suggestions and actual responses. By expanding low-scoring categories such as Billing and Payments, Mark sees low adherence scores for queries about payment plans and refund policies, suggesting knowledge gaps in those areas.
Through the Observability Dashboard, Mark can pinpoint topics that need knowledge base enhancements. He can then work with the knowledge base team to address these gaps, improving the quality of suggestions for better customer interactions.
Autopilot Knowledge
The Observability Dashboard for Autopilot Knowledge shows data about how well your automated system handles customer questions. You'll see a graph that displays performance trends over time. This lets you track changes daily, weekly, or monthly.
- Click the app selector and select Actions.
- On Actions, click Observability Dashboard.
- Click the Autopilot Knowledge tab. Set the desired Date Range for the dashboard, and click Run Query. The dashboard displays three charts.
Viewing Data About Overall Effectiveness
This graph displays a high-level summary of the Autopilot Knowledge chatbot's performance and status.
- Click the Overall Effectiveness graph heading to drill down into the statistics. Four graphs appear; the sketch after this list shows how their counts relate:
  - Engaged: Displays the number of visitors who engaged with the chatbot, helping you understand engagement trends over time.
  - Contained: Displays the percentage and count of chatbot users who completed their conversation without needing escalation to a live agent. With this metric you can assess how effectively the chatbot resolves queries independently.
  - Elevated: Displays the percentage and count of chatbot users who escalated their conversation to a live agent, highlighting cases that required human intervention. With this metric you can monitor how often the chatbot hands off conversations to human agents.
  - Abandoned: Displays the percentage and count of chatbot users who abandoned an ongoing conversation. With this metric you can identify drop-off points and improve user engagement.
- You can customize the data that appears in the graphs:
  - Click Absolute Numbers or Percentage to switch between percentages and absolute numbers.
  - Click Maximize to view the graph in full screen.
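The Contained, Elevated, and Abandoned graphs report both a count and a percentage. The sketch below assumes those percentages are shares of the Engaged total; that reading, and all of the numbers used, are illustrative assumptions rather than a documented formula.

```python
# Sketch of how the Overall Effectiveness counts can relate to each other.
# Assumption (not from the product documentation): Contained, Elevated, and
# Abandoned percentages are expressed as shares of the Engaged total.
# All numbers below are invented example values.

engaged = 1200     # visitors who engaged with the chatbot
contained = 780    # finished without escalation to a live agent
elevated = 300     # escalated to a live agent
abandoned = 120    # left an ongoing conversation

for label, count in (("Contained", contained),
                     ("Elevated", elevated),
                     ("Abandoned", abandoned)):
    print(f"{label}: {count} ({100 * count / engaged:.1f}% of engaged sessions)")
```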
Viewing Data About GenAI Performance
This graph displays the percentage of user queries that were effectively addressed by the generative AI engine.
- Click the GenAI Performance label to drill down into the statistics. Three graphs appear:
  - Over Time: Displays the percentage of chatbot responses over time.
  - By Category: Displays the percentage of chatbot responses by category.
  - Queries to Generative Model: Displays the total number and percentage of chatbot queries processed by the generative engine. This metric provides insight into how frequently the generative engine is used to handle user interactions.
- You can customize the data that appears in the graphs:
  - In the Over Time and By Category graphs, toggle the display of different answer statuses by clicking the Response or No Response legends.
  - Click Absolute Numbers or Percentage to switch between percentages and absolute numbers.
  - Click Maximize to view the graphs in full screen.
- Scroll down to see the data grouped by categories. The Categories View organizes the knowledge base answers into different categories, offering a structured approach to analysis. Each category presents:
  - Total volume of responses
  - Total number of no responses
  - Average number of links and images provided
  - Average knowledge score (score assigned by the knowledge base)
- Clicking a category reveals the specific queries and their details, such as:
  - Contact number of the interaction
  - Query that initiated the chatbot interaction
  - Chatbot's reply based on the user's input, intent, and context
  - Number of links and images provided
  - Date and time of the response
  - Average knowledge score
- You can switch the way you view this data. The default view is by category. Click Group By to change the grouping from Category to Contact Number. The data reappears based on the new grouping.
Viewing Data About Bot Performance
This graph displays the distribution of chatbot intents, highlighting the top six most common user requests along with fallback occurrences. It helps you understand what users ask and how the chatbot responds.
- Click the Bot Performance label to drill down into the statistics. Two graphs appear:
  - All Bot Intent: Displays the most common user requests and fallback cases, helping you improve how your chatbot responds.
  - Abandonment Indicator: Displays which chatbot intents were most common before users abandoned the conversation. It helps you identify drop-off points and improve user retention.
- You can customize the data that appears in the graphs:
  - Click Absolute Numbers or Percentage to switch between percentages and absolute numbers.
  - Click Maximize to view the graphs in full screen.
Autopilot
The Observability Dashboard for Autopilot shows how well your knowledge base is able to handle customer questions. Use this dashboard to look for areas where you can add more articles to your knowledge base or customize the articles to better answer the customers' questions.
- Click the app selector and select Actions.
- On Actions, click Observability Dashboard.
- Click the Autopilot tab. It displays the GenAI Performance graph.
- Set the desired Date Range for the dashboard, and click Run Query.
Viewing Data About GenAI Performance
This graph shows the number of user questions that received relevant and complete answers from the AI engine.
- Click the GenAI Performance label to view details about the questions asked and the articles provided for the time period specified. Two graphs appear:
  - Over Time graph: Shows the number of successful responses and no-responses over the period of time. A response indicates the user was shown an article. A no-response indicates that no matching article was found.
  - By Category graph: Shows the number of successful responses and no-responses by category.
- You can customize the appearance of data in the graphs:
  - To switch between percentages and absolute numbers, click Absolute Numbers or Percentage.
  - To view the graph in full screen, click Maximize.
  - To toggle the display of different answer statuses, click the Response or No Response legends in the graphs.
- Scroll down beneath the graphs to see a table that provides details on the queries. The queries are grouped by categories.
- Clicking a category reveals the specific queries and the provided responses. Use this to identify areas where you might be missing articles in your knowledge base.
Generate AI Powered Knowledge Articles
- Click the app selector and select Actions.
- On Actions, click Observability Dashboard.
- Click the Generative Responses label to view detailed statistics.
- Scroll down to the data grouped by categories section. Click a category to view specific queries.
- Select a query and click the Info button.
- In the Response details panel on the right, click Create Article. An AI-generated article is drafted based on the transcript. You can edit the article as needed and then publish it. For complete information on editing and publishing an article, see the knowledge generation help.
- When an article is already published, the Create Article icon appears in purple with a checkmark. This means a knowledge article is available and you can view it, even if it was created by someone else.
Export Data from Observability Dashboard
- Click the app selector and select Actions.
- On Actions, click Observability Dashboard.
- Click the Generative Responses label to view detailed statistics.
- Scroll down to the data grouped by categories section. Click Export. You can download all data, both visible and hidden, based on the filters set in the query builder.
- When you export data from the Observability Dashboard, some fields in the spreadsheet are represented by numeric codes. These codes correspond to specific tags and feedback types, as shown below:
Tag values:
- Accurate: 1
- Inaccurate: 2
- Complete: 3
- Incomplete: 4
- Relevant: 5
- Irrelevant: 6
- Slow: 7
- Other: 8

Feedback type values:
- Positive: 1
- Negative: 2
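If you post-process the exported spreadsheet, you can translate these numeric codes back into labels. The sketch below assumes the export has been saved as a CSV file and that the coded columns are named Tag and Feedback type; the file name and column names are placeholders, so adjust them to match your actual export.

```python
# Sketch: map the numeric codes in an exported Observability Dashboard
# spreadsheet back to readable labels. The code-to-label tables come from
# the documentation above; the file name and the "Tag" / "Feedback type"
# column names are placeholders -- check them against your own export.
import csv

TAG_LABELS = {
    "1": "Accurate", "2": "Inaccurate", "3": "Complete", "4": "Incomplete",
    "5": "Relevant", "6": "Irrelevant", "7": "Slow", "8": "Other",
}
FEEDBACK_LABELS = {"1": "Positive", "2": "Negative"}

with open("observability_export.csv", newline="", encoding="utf-8") as f:
    for row in csv.DictReader(f):
        tag_code = row.get("Tag", "")
        feedback_code = row.get("Feedback type", "")
        print(TAG_LABELS.get(tag_code, tag_code),
              FEEDBACK_LABELS.get(feedback_code, feedback_code))
```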