If you are among the 93% of executives who say that improving customer experience is one of their organization’s top priorities - welcome, you’ve come to the right place. We’re about to explore the road that will take you there. The quality of your phone support is probably also high on your agenda, as 40% of customers prefer to seek out a real person over the phone as their issues get more complex. That’s why call center quality assurance will be our focus in this piece.
Call center quality assurance is the path to providing excellent phone support. It’s a systematic way of analyzing your team’s interactions, rating them in different categories, and providing feedback to agents. However, we prefer to call this process conversation reviews, a term that does a lot more justice to the practice. You can read more about the terminology quibble here.
So, we’re at least as obsessed with the quality of customer service as you are - we’ve even built a conversation review tool because of that. Hence, we sat down and wrote a complete guide for setting up a call center quality assurance program:
- Build a vision for your service
- Define Quality Assurance criteria and scorecard
- Set up your Quality Assurance procedure
- Make conversation reviews systematic
- Make use of your review feedback
Now, let’s dig into each of these steps one by one and create a quality assurance framework that aligns with your call center needs.
Before you rush into analyzing your call center interactions you should have a clear vision of what you want your customer service to look like. Some companies aim to provide a high level of support to all users across all platforms, while others put their focus on specific customers or channels. That doesn’t necessarily mean that the latter ones are lazy or doing it wrong.
Call centers are not a “one size fits all” type of service. Your support strategy should align with your company’s vision and goals. So, when you’re creating a conversation review process for your call center, start by defining who you are:
- What do we do? We’re guessing that the answer to this question begins with something along the lines of “We offer phone support...” - unless you’re in the same boat with Facebook, who is notorious for working under the slogan “Don't bother trying to call Facebook”
- Whom do we serve? Options range from helping all users equally, including trials and leads, to focusing specifically on paying or premium users. It’s mostly a business decision, but your branding team might also have a say in this, as all your customer-facing activities have an impact on your company image.
- How do we serve them? On one side of the scale, we have teams that focus strictly on giving quick, short, and to-the-point answers. It’s often the preferred strategy for B2C companies operating with large customer bases. On the opposite side, we have companies that see customer service as a major revenue driver. Teams focused on support-driven growth always go that extra mile, offering additional information to customers to increase engagement and upsell their products.
So, your call center vision might be something ranging from “We offer phone support to everyone, with a sharp focus on providing quick help” to “Our call center works with our premium customers by solving their issues, driving product engagement and upsells.”
It’s important to know where you want to see yourself so that you understand what to look for in conversation reviews. Once you’ve decided on your call center positioning, you’ll be able to create specific and actionable goals that make your support vision come to life.
Goal setting tends to be the part of the QA configuration that people often rush through. However, it plays a crucial role in aligning your conversation reviews with your support vision, so make sure you don’t skip this part.
This step consists of four components to help you build a scorecard that reflects how your team is performing based on your idea of excellent service:
- Set goals. Go back to your support vision and break it down to 2-4 specific goals. For example, are you looking for ways to boost your CSAT, provide faster solutions, or improve your agents’ product knowledge? Goals will direct you to the aspects of the conversations that you need to analyze, help you build team cohesion, and pivot quickly, if needed - read more about the benefits of goal setting.
- Create rating categories that reflect your goals. Match each of your goals with at least one rating category on your QA scorecard. Make sure all your goals are represented in your conversation reviews but try not to overdo it. 1-2 rating categories per target should be enough. For example, if you’re polishing agents’ product knowledge, include “provided accurate product information” as an assessment criterion. Those looking for ways to make their calls feel warmer and friendlier should include a category for “empathy”, and maybe also “appropriate opening and closing lines”.
- Prioritize your rating categories, if necessary. Turn to your vision and goals to understand which of your rating categories matter the most to you. Give those categories more weight in your internal assessments. For example, sharing correct product information might be more critical than the vocabulary that the agent used.
- Agree upon a rating scale. Implement a scoring system that is easily understandable to all reviewers, so that everyone assesses tickets in the same manner. For this reason, we’re using a simple 2-point scale at Klaus: reviewers can only give a thumbs up or a thumbs down in each rating category.
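To make the scorecard mechanics concrete, here is a minimal sketch in Python of how weighted rating categories and a 2-point (thumbs up/down) scale can combine into a single score per reviewed call. The category names and weights are illustrative assumptions, not a prescribed standard:

```python
# Illustrative scorecard: category names and weights are assumptions.
# Product knowledge is weighted higher, reflecting prioritization of
# the categories that matter most to your goals.
CATEGORIES = {
    "product_knowledge": 2.0,
    "empathy": 1.0,
    "opening_and_closing": 1.0,
}

def score_review(ratings: dict[str, bool]) -> float:
    """Return the weighted pass rate (0-100) for one reviewed call,
    given a thumbs-up/thumbs-down rating per category."""
    total_weight = sum(CATEGORIES.values())
    earned = sum(CATEGORIES[c] for c, passed in ratings.items() if passed)
    return round(100 * earned / total_weight, 1)

# Example: the agent passed product knowledge and empathy,
# but missed the appropriate opening and closing lines.
print(score_review({
    "product_knowledge": True,
    "empathy": True,
    "opening_and_closing": False,
}))  # 75.0
```

The binary scale keeps reviews fast and consistent, while the weights ensure a miss in a high-priority category costs more than one in a minor category.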
Your call center scorecard is the foundation of your QA process, so it’s worthwhile to dedicate time to creating a proper one. If you skip the vision and goal setting parts, you might end up with assessment criteria that are irrelevant to your call center.
Instead of overwhelming your reviews with dozens of questions, concentrate on the aspects of your team’s interactions that matter most to you. Read Automattic’s Happiness Team lead Valentina Thöner’s take on rating categories.
Before you get going with conversation reviews, you need to give this process a solid structure. This will make your QA consistent, transparent, and understandable for everyone.
Though some people see call center QA as a time-consuming and burdensome procedure, it doesn’t have to be like that. Keep efficiency in mind when designing your conversation reviews by finding answers to the following questions:
- Who will review? Choose between managers, QA specialists, peers, and self-reviews. They all have their pros and cons, so see which one works best for you: managers can only review a limited number of cases, so manager reviews suit smaller teams with feasible workloads; QA specialist reviews help bring agents to the same level of quality, and dedicated QA staff is often needed in large teams; peer reviews are the most time-efficient form of internal assessment and push agents to learn from each other; self-reviews help agents understand their areas of improvement, which is the key to their professional growth.
- How many calls to review? Express your goal as a percentage of the total volume to keep your conversation reviews statistically relevant as your company grows. Most teams aim to assess about 5-10% of all calls, while companies like PandaDoc also review all cases for new agents as a part of their onboarding process.
- Where to manage reviews? Most companies start by using spreadsheets to organize their internal feedback. With a small team and low conversation volume, a shared document can be enough. Conversation review tools like Klaus reduce the time spent on administrative work and streamline the quality assurance process. We’ve seen companies decrease the amount of time spent on QA by a jaw-dropping 70% when switching from spreadsheets to Klaus.
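The sampling logic described above can be sketched in a few lines of Python. This is a minimal illustration, not a production implementation; the field names (`id`, `agent`) and the helper itself are hypothetical:

```python
import random

def pick_reviews(calls, new_agents, sample_rate=0.05, seed=None):
    """Select calls to review: a random sample of the general call
    volume (e.g. 5-10%), plus every call handled by an agent who is
    still in onboarding."""
    rng = random.Random(seed)  # seed only for reproducible examples
    onboarding = [c for c in calls if c["agent"] in new_agents]
    rest = [c for c in calls if c["agent"] not in new_agents]
    k = max(1, round(len(rest) * sample_rate))
    return onboarding + rng.sample(rest, k)

# Hypothetical data: 100 calls, one new hire handling every 4th call.
calls = [{"id": i, "agent": "amy" if i % 4 else "new_hire"}
         for i in range(100)]
selected = pick_reviews(calls, {"new_hire"}, sample_rate=0.05, seed=42)
```

Expressing the sample as a rate rather than a fixed number means the review volume scales automatically as your call volume grows, keeping the results statistically relevant.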
Having a proper setup for your call center conversation reviews is crucial because it helps you keep track of your team’s performance over time and provide consistent feedback to your agents. The more convenient the procedure is for everybody, the more likely they are to participate in it.
As you start doing conversation reviews in your call center, keep in mind that consistency is your key to success. To get a picture of how your phone support is performing and to track it over time, you should approach it in a well-organized manner.
These four basic rules will help you stay on track:
- Pick a random sample of conversations. To get an understanding of how your team is doing, don’t just analyze the amazingly good or the worst phone calls. The vast majority of all cases fall between those two extremes and are handled OK. “That’s fine, continue” is great feedback and reassures your agents that they are doing everything right.
- Listen to the entire call. Though at times you might be tempted to skip the small talk and jump to the next question on your scorecard, you should always pay attention to the full conversation. Otherwise, you won’t get an accurate sense of the agents’ tone and style. Plus, you might miss some crucial information buried in the small talk. Giving incorrect ratings will do the reviewee an injustice and diminish the entire QA procedure.
- Do conversation reviews across channels. If you also offer customer service in other channels besides phone support, make sure you track data that is comparable across all platforms. This will help you maintain a consistent level of quality. 58% of customers are frustrated with getting different experiences depending on the channel, so it’s something that most companies cannot afford to ignore.
- Be consistent. Call center QA is not a one-time project. It’s an ongoing process that helps you boost and maintain the quality of your support team. Your products and teams are likely to change over time, so you need to make sure that both your new hires and old-timers are meeting your quality standards alike.
Systematic conversation reviews help you stay in control of what is happening in your customer service. If you notice a decrease in any of the support metrics that you track, internal call center evaluations will help you identify the areas that your team needs to work on.
Before diving into your team’s call recordings, plan out how you will use the information that you’re gathering. Conversation reviews don’t just help you pinpoint the mistakes your agents are making - they’re an opportunity to help your team grow through feedback. Utilize these resources to the maximum.
Here are three examples of how to make good use of your conversation review outcomes:
- Input for 1:1 meetings: Use call reviews as the basis for giving feedback to your agents. Find your agents’ areas of growth and set actionable and time-bound goals. Then see how your team progresses from call to call.
- Food for thought for self-reflection: Analyzing their own performance against internal quality standards helps agents improve their interactions. According to a study published in the Journal of Business Research, self-assessments boost the quality of customer service and can increase your NPS by 5%.
- Reporting Internal Quality Score (IQS): IQS is the conversation review metric that reflects how your team performs against your internal quality standards. It’s a perfect KPI to report to your managers, as it provides a different perspective to your customers’ opinions expressed in metrics like CSAT, NPS, and CES.
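One common way to aggregate thumbs-up/thumbs-down reviews into a single reportable number is the share of positive ratings across all reviews. The sketch below illustrates that idea; exact IQS formulas vary by tool, so treat this as an assumption, not Klaus’s definitive calculation:

```python
def internal_quality_score(reviews):
    """A simple IQS: the percentage of positive ratings across all
    reviewed calls. Each review is a list of pass/fail category
    ratings from the scorecard."""
    ratings = [r for review in reviews for r in review]
    return round(100 * sum(ratings) / len(ratings), 1)

# Three reviewed calls, each rated in three categories.
reviews = [
    [True, True, False],
    [True, True, True],
    [True, False, True],
]
print(internal_quality_score(reviews))  # 77.8
```

Tracked over time, a number like this shows whether your team is trending toward or away from your internal standards, independently of customer-side metrics like CSAT or NPS.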
A proper quality assurance process combines qualitative insight with quantitative data that helps you understand how your team performs over time. Use this information to help your team excel in their jobs and to report your successes to the higher-ups.
The quality of your call center is highly dependent on conversation reviews. You cannot know what is happening in your support interactions unless you listen to how your agents talk to your customers.
If you’ve set up a QA scorecard that reflects your company’s values and goals, you’ll be able to understand how your team performs against your internal quality standards. Not all companies have the same vision and strategy for their support, which is why all teams need a unique setup for their call center quality assurance.
Though most teams start doing support QA in spreadsheets, companies like Automattic, PandaDoc, and Figma have already switched to the conversation review tool Klaus. Using dedicated software saves heaps of time that you’d otherwise spend on setup, maintenance, and reporting.