QM Evaluation Analysis
This dataset is available for accounts that have Quality Management (QM) enabled.
By clicking the Create button in the Talkdesk Explore™ reporting tool, users can access the complete dataset described below, where every field can also be used as a pivot or a filter.
Desired Field | Example/Format | Type | Notes |
---|---|---|---|
Agent | Name Surname ([email protected]) [60643r3de2aef33b6e396105] | Text (string) | Includes the agent's name, email, and unique ID in a single field. |
Agent Email | [email protected] | Text (string) | |
Agent ID | 349485789f228e60070f9d28 | Text (string) | Unique agent ID. |
Agent Name | Agent name. | Text (string) | |
Branch To | | | When an answer has a branching option configured, the destination of the branching is identified here. |
Evaluation Date/Time | 2021-09-13 9:32:31 | Time | Date when the evaluation was submitted. |
Evaluation Element Text | The content of the free-text answer. | Text (string) | Same information as "Question Answer", but only for the "TextQuestion" question type. |
Evaluation ID | e928dhj57-2e5c-43a1-ab01-ccjasdh8ef4f0a | Text (string) | Unique evaluation ID. |
Evaluation Max Score | 100 | Number | Maximum score possible for this evaluation. |
Evaluation Obtained Score | 80 | Number | Score obtained by the agent. |
Evaluation Version | 2 | Number | Indicates the version of the evaluation. "0" means it is the original version and has never been edited; any other number "X" means the evaluation has been edited X times. |
Evaluator ID | 8475gh29j48afe45634a522f | Text (string) | Unique ID of the evaluator who performed the evaluation (if the evaluation was performed by the AI, it will be "AI Scored Evaluation"). |
Evaluator Name | | Text (string) | Name of the evaluator who performed the evaluation. If performed by the AI, this field is empty until the evaluation is edited by an evaluator, whose name is then registered here. |
Form ID | 93uejd8345h62854f801762e192416000a | Text (string) | Unique ID of the form used for the evaluation. |
Form Name | AI Demo Form 3.0 | Text (string) | Name of the form used for the evaluation. |
Interaction ID | | Text (string) | Unique ID of the interaction the evaluation is based on. It can be empty if the evaluation was made outside the platform. |
Interaction Reference | URL, Text | Text (string) | Reference added to ad-hoc evaluations. It can be "null" if the field was left empty or if the evaluation is of an interaction. |
Header ID | 2d0ef6fb-fdc3-4838-a2ff-23240eb9b353 | Text (string) | Unique identifier for a header section. |
Header Text | “Additional call information” | Text (string) | Text input added as identification of the header (the “Header” title). |
Option Text | | Text (string) | Text of the selected answer option. |
Question Answer | | Text (string) | Text of the selected answer to the question. |
Question ID | 8af2b55d8027456e018046a1a57f49d8 | Text (string) | Unique question ID. |
Question Max Score | 30 | Number | Maximum possible score for the question. |
Question Obtained Score | 20 | Number | Obtained score for the question. |
Question Text | Text (string) | Question text that is part of the form. | |
Question Type | "MultipleChoiceQuestion", "TextQuestion", "DropdownSingleChoice", "CheckBoxQuestion" | Text (string) | One of the question types available in the form builder. |
Ring Group IDs (Deprecated) | | Text (string) | This field is no longer used. |
Ring Group Names | | Text (string) | List of the ring group or queue names the interaction belongs to. |
Ring Group Names (Deprecated) | | Text (string) | This field is no longer used. |
Section ID | | Text (string) | Unique form section ID. |
Section Name | | Text (string) | Name/title given to the section. |
Section Max Score | 50 | Number | Maximum possible section score. |
Section Obtained Score | 45 | Number | Attained section score. |
Team ID | 585f9ab2786e4511980b195a7fb9d7ef | Text (string) | Unique team ID. Can be “null” if the person doesn’t belong to any teams. |
Team Name | | Text (string) | Name of the team the evaluated person belongs to. It can be "null". |
First Evaluated On | 2021-09-13 9:32:31 | Time | Date and time when the first version of the evaluation occurred. It can be “null” if there is only one version of the evaluation. |
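As the table above shows, the "Agent" field bundles the agent's name, email, and ID into one string, while "Agent Email", "Agent ID", and "Agent Name" expose them separately. If you export the dataset and only have the combined field, it can be split apart. The sketch below is a hypothetical helper (not part of Talkdesk Explore), assuming the `Name Surname (email)[agent_id]` layout shown in the example column; the email address used is made up.

```python
import re

# Hypothetical helper: splits the combined "Agent" field,
# "Name Surname (email)[agent_id]", into its three parts.
AGENT_PATTERN = re.compile(
    r"^(?P<name>.+?)\s*\((?P<email>[^)]+)\)\s*\[(?P<agent_id>\w+)\]$"
)

def parse_agent_field(value: str) -> dict:
    """Return {'name', 'email', 'agent_id'} parsed from the combined Agent field."""
    match = AGENT_PATTERN.match(value.strip())
    if match is None:
        raise ValueError(f"Unrecognized Agent field format: {value!r}")
    return match.groupdict()

# Example using a made-up email and the sample agent ID from the table above:
print(parse_agent_field("Jane Doe ([email protected])[60643r3de2aef33b6e396105]"))
```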
Measures
Below, you can find pre-made formulas that users can use in Explore. Some formulas apply to "Evaluation", some to "Question" only, and others to "Section".
“Evaluation” Measures
Metric Name | Description | Calculation | Metric Type | Perspective |
---|---|---|---|---|
% Distinct Evaluation Score | Gives the average performance across all evaluations as a whole, treating them as one big evaluation. It shows how well the agent did overall in all the evaluations combined. | As an example, let's assume four evaluations: - Evaluation 1: Obtained score = 80, Maximum score = 100 (Percentage score = 80%) - Evaluation 2: Obtained score = 40, Maximum score = 50 (Percentage score = 80%) - Evaluation 3: Obtained score = 60, Maximum score = 80 (Percentage score = 75%) - Evaluation 4: Obtained score = 90, Maximum score = 100 (Percentage score = 90%) The "% Distinct Evaluation Score" would be: (80 + 40 + 60 + 90) / (100 + 50 + 80 + 100) = 270 / 330 ≈ 0.8182, or 81.82% | Percentage | Evaluation |
AVG % Evaluation Score | Gives the performance score for each evaluation separately, then calculates the overall average of these individual percentages. It helps understand how well the agent performed on average in each evaluation. | As an example, let's assume four evaluations: - Evaluation 1: Obtained score = 80, Maximum score = 100 (Percentage score = 80%) - Evaluation 2: Obtained score = 40, Maximum score = 50 (Percentage score = 80%) - Evaluation 3: Obtained score = 60, Maximum score = 80 (Percentage score = 75%) - Evaluation 4: Obtained score = 90, Maximum score = 100 (Percentage score = 90%) The "AVG % Evaluation Score" would be: (80% + 80% + 75% + 90%) / 4 = 325% / 4 = 81.25% | Percentage | Evaluation |
Agent Pass Rate Evaluation | The % of individual agents that have been evaluated above the 80% threshold. This threshold is currently not editable. | Count of DISTINCT agents with a pass rate of 80% or more, divided by the total number of agents (the result can be interpreted as a percentage). | Percentage | Agent Evaluation |
Count Distinct Evaluation ID | Distinct Submitted Evaluation Count - Distinct count of submitted evaluations | Count Distinct Evaluation ID | Count | Evaluation |
Max % Evaluation | Highest Evaluation Score (in %) - The highest score obtained among all evaluations | MAX score = (evaluation_obtained_score / evaluation_max_score) | Percentage | Evaluation |
Min % Evaluation | Lowest Evaluation Score (in %) - The lowest score obtained among all evaluations | MIN score = (evaluation_obtained_score / evaluation_max_score) | Percentage | Evaluation |
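The difference between "% Distinct Evaluation Score" (pool all points, then take one ratio) and "AVG % Evaluation Score" (score each evaluation, then average the percentages) can be checked with the worked example from the table above. This is an illustrative sketch, not Explore's internal implementation; the scores are the four sample evaluations quoted in the calculations.

```python
# Worked example contrasting the two evaluation-level aggregations,
# using the four sample evaluations from the table above.

evaluations = [
    # (obtained_score, max_score)
    (80, 100),
    (40, 50),
    (60, 80),
    (90, 100),
]

# "% Distinct Evaluation Score": sum all obtained and maximum points,
# then take a single overall ratio.
distinct_pct = sum(o for o, _ in evaluations) / sum(m for _, m in evaluations) * 100

# "AVG % Evaluation Score": compute each evaluation's percentage first,
# then average those percentages.
avg_pct = sum(o / m for o, m in evaluations) / len(evaluations) * 100

print(f"% Distinct Evaluation Score: {distinct_pct:.2f}%")  # 81.82%
print(f"AVG % Evaluation Score: {avg_pct:.2f}%")            # 81.25%
```

The two measures diverge whenever evaluations have different maximum scores: the distinct version weights each evaluation by its maximum score, while the average version weights every evaluation equally.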
"Question" Measures
Metric Name | Description | Calculation | Metric Type | Perspective |
---|---|---|---|---|
Count Distinct Question ID | Distinct Question Count - Distinct count of questions from all submitted evaluations | Count Distinct Question ID | Count | Question |
Max % Question | Highest Question Score (in %) - The highest score obtained among all questions | MAX score = (question_obtained_score / question_max_score) | Percentage | Question |
Min % Question | Lowest Question Score (in %) - The lowest score obtained among all questions | MIN score = (question_obtained_score / question_max_score) | Percentage | Question |
Question AVG % | Average Question Score (in %) - Average percentage of scores obtained among all questions from submitted evaluations | AVG score = (question_obtained_score / question_max_score) | Percentage | Question |
Question Score % | Distinct Question Score (in %) - Percentage of score obtained from distinct questions | % distinct question = (question_obtained_score / question_max_score) | Percentage | Question |
"Section" Measures
Metric Name | Description | Calculation | Metric Type | Perspective |
---|---|---|---|---|
Count Distinct Section ID | Distinct Section Count - Distinct count of sections from all submitted evaluations | Count Distinct Section ID | Count | Section |
Max % Section | Highest Section Score (in %) - The highest score obtained among all sections | MAX score = (section_obtained_score / section_max_score) | Percentage | Section |
Min % Section | Lowest Section Score (in %) - The lowest score obtained among all sections | MIN score = (section_obtained_score / section_max_score) | Percentage | Section |
Section Score % | Distinct Section Score (in %) - Percentage of score obtained from distinct sections | % distinct section = (section_obtained_score / section_max_score) | Percentage | Section |
Section AVG % | Average Section Score (in %) - Average percentage of scores obtained among all sections from submitted evaluations | AVG score = (section_obtained_score / section_max_score) | Percentage | Section |
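The "Question" and "Section" measures follow the same Max/Min/AVG/Distinct pattern as the evaluation-level ones, just over per-question or per-section scores. The sketch below illustrates the section variants with made-up section scores (the question measures are identical with the question fields swapped in); it is an illustration of the formulas, not Explore's internal implementation.

```python
# Illustration of the "Section" measures using made-up section scores.
# The same pattern applies to the "Question" measures.

sections = [
    # (section_obtained_score, section_max_score)
    (45, 50),
    (20, 40),
    (30, 30),
]

ratios = [obtained / maximum for obtained, maximum in sections]

max_pct = max(ratios) * 100                 # Max % Section
min_pct = min(ratios) * 100                 # Min % Section
avg_pct = sum(ratios) / len(ratios) * 100   # Section AVG %
distinct_pct = (                            # Section Score % (pooled ratio)
    sum(o for o, _ in sections) / sum(m for _, m in sections) * 100
)

print(max_pct, min_pct, round(avg_pct, 2), round(distinct_pct, 2))
```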