Evaluations by Team
The Evaluations by Team report provides a clear, visual summary of team and agent performance across all contact channels. Designed for supervisors and managers, it helps identify trends, spot underperformance, and support quality improvement efforts. The report includes six interactive widgets, each offering a different perspective on team and agent evaluation data.
Each widget and the insights it provides are described in the Data Overview section below.

Eleanor Wood, a supervisor at Classics, Inc., oversees multiple teams and wants to identify agents who may be struggling with their evaluations. She opens the Evaluations by Team report, sets the desired time period, selects her teams, and chooses the relevant evaluation forms.
To quickly spot issues, Eleanor uses the Average Score by Agent widget to view individual agent performance and identify those with scores below their team average. She then clicks on specific agent names to open the Agent Details page. There, the Evaluations by Agents widget helps her pinpoint which evaluations the agent struggled with most, enabling her to plan targeted coaching sessions.

You must have these permissions:
- Dashboard > Dashboard Templates > Evaluations By Team: On
- Dashboard > Dashboards: View
- Dashboard > Dashboards: Edit (optional)
If you cannot access the reporting templates or the dashboard, check with your administrator. The administrator can find these permissions in CXone Mpower: go to Admin > Security Settings > Roles and Permissions and select the role.
Data Overview
This report template includes:
Average Quality Score by Team
Widget type: Quality Score
This widget displays each team's average evaluation score using color-coded circles. The color of each circle reflects performance based on threshold values defined in the settings menu, making it easy to quickly assess which teams are meeting expectations and which may need attention.

- View By: Team
- Default Days: 7 days
QM specific filters:
- Date Paradigm: Evaluation Start Date
- Groups: All
- Evaluation Form: All
Average Quality Score Over Time
Widget type: KPI Trend
This widget displays each team's average evaluation score as individual data points plotted over time. You can switch between daily and monthly views to explore performance trends and identify patterns or shifts in quality over a selected period.

- View By: Team
- Interval Unit: Daily & 1 day
- Default Days: 7 days
- Time Period drop-down: You can select Daily or Monthly. The default value for the report is Daily.
- Team drop-down: You can select a maximum of 5 teams from the drop-down. The default value for the report is the first 5 teams in alphabetical order.
QM specific filters:
- Date Paradigm: Evaluation Start Date
- Groups: All
- Evaluation Form: All
Average Agent Performance
Widget type: Quality Score
This widget displays the average evaluation score for agents in the team, offering a quick snapshot of overall performance. A colored circle represents this score visually, with its color determined by the threshold values configured in the Settings menu. This helps you quickly assess whether the team's performance meets expectations.

- View By: Agent
- Default Days: 7 days
QM specific filters:
- Date Paradigm: Evaluation Start Date
- Groups: All
- Evaluation Form: All
Average Score by Agent
Widget type: Quality Score
This widget displays the average evaluation scores for agents across different channels, grouped by their respective teams. You can expand each team to view individual agent scores.
In addition to the average scores, the widget also shows the variance of each agent's score from their team's average, helping to highlight performance differences within the team (see the example after the filter settings below).

- Default Days: 7 days
- View By: Agent
QM specific filters:
- Date Paradigm: Evaluation Start Date
- Groups: All
- Evaluation Form: All
- Widget View: Table View
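As a quick illustration of the variance calculation, using example values rather than data from the product: if an agent's average score for the selected period is 82% and their team's average is 78%, the widget shows a variance of +4% for that agent; an agent averaging 70% on the same team shows a variance of -8%.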
Average Score by Channels
Widget type: Quality Score
This widget displays the average quality evaluation scores for each communication channel: Chat, Voice, Email, Employee, and Work Item. You can view the data either by team or by individual agent, allowing for flexible performance analysis.
It supports filtering by time period, groups, and evaluation form, so you can customize the data view to your requirements. Clicking an agent's name drills down into a detailed report for that agent, providing deeper insight into individual performance.
By default, the widget displays data for the last 7 days.

- View By: Team
- Default Days: 7 days
QM specific filters:
- Date Paradigm: Evaluation Start Date
- Groups: All
- Evaluation Form: All
- Widget View: Channel Table View
Evaluations by Agents
Widget type: Quality Evaluations
This widget displays detailed evaluation records for a selected agent, using data from the Evaluation Details dataset. You can view evaluations for a specific agent by filtering the Agent Name column within the widget.
It also shows the variance of each evaluation score from the agent’s average score, helping to identify trends in performance. Drill-down reports are not available for this widget.
By default, the widget displays data for the last 7 days.

- View By: Agent
- Report Set: Evaluation Details
- Default Days: 7 days
You can customize the columns:
- Click the auto-size icon to auto-size a specific column or all the columns.
- Click the filter icon to select filter options.
- Click the column selection icon to select the columns you want to see on the widget.
You can personalize your column settings by adjusting the column size, sort order, filters, and arrangement, and then save these changes, even if you only have the Dashboards: View permission.
Sorting table columns
You can sort the data in the table by clicking a column header. To apply a secondary sort, hold down the Shift key and click another column header.
- The primary sort column displays a 1 next to the column title.
- The secondary sort column displays a 2 next to the column title.

| Column | Description |
|---|---|
| Agent | Name of the agent. |
| Master Contact | Unique ID number given to the contact by the ACD. |
| Plan | Name of the plan, if applicable. |
| Skill | Skill for the interaction. |
| Form | Name of the form used for this evaluation. |
| Submitted | Date (in UTC) the evaluation was submitted. |
| Evaluator | Name of the evaluator who performed the evaluation. |
| Evaluator User Name | User name of the evaluator. |
| Evaluation Subject | Possible values include: |
| Variance | Difference between this score and the agent's average score for all evaluations. Variance is not displayed at the team level. See the example following this table. |
| Score % | Evaluation score as a percentage. If the score is edited in the evaluation, it is updated in the report. |
| Total Score | Actual evaluation score. |
| Maximum Score | Maximum score an agent could have received for the entire form. |
| Channel Type | Channel for the interaction, such as voice, email, or chat. |
| Interaction Date | Date (in UTC) the evaluated interaction took place. |
| Interaction Duration | Duration of the evaluated interaction, in HH:MM:SS format. |
| Direction | Direction of the interaction. |
| Workflow ID | Unique ID of the evaluation. |
| Play | Click Play to play back the interaction. |
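To illustrate how the score columns relate, using example values rather than data from the product: an evaluation with a Total Score of 45 and a Maximum Score of 50 has a Score % of 45 ÷ 50 × 100 = 90%. If that agent's average score across all of their evaluations in the period is 85%, the Variance column for this evaluation shows +5%.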