
Case study: How Wistia rolled out peer reviews for customer service in Klaus


  • Customer since: 2019
  • Companies using Wistia: 500k
  • CSAT: 94%
  • Industry: Internet

Wistia is a video software provider widely admired in the customer service space for the solutions they’ve created for incorporating video into support interactions. The people behind Wistia believe in the power of amazing customer service – and that’s what they aim to provide to their customers, too.

Wistia recently rolled out a conversation review program in their customer service department to keep their quality, customer delight, and consistency trending upwards.

Stacy Justino, Director of Customer Happiness at Wistia, shares insight into how they built an efficient customer service quality program from the ground up.

Why did you choose peer reviews instead of manager or QA specialist reviews?

We decided to do conversation reviews in the form of peer reviews for two reasons, both tied to our team structure (a director with three direct reports, a manager with nine direct reports, two Senior Champs, and nine Customer Service Champs):

  • there wasn’t bandwidth for the manager, myself, or the Senior Champs to take on the reviews;
  • nor was there room to hire a dedicated QA specialist.

Moreover, we had already introduced productivity expectations and other workflow expectations over the past year, so I didn’t want to introduce another top-down performance system.


How did you create criteria for quality reviews?

I wrote the initial draft of the quality criteria myself, then shared it with the Customer Support Manager and our Senior Support Specialist. They gave me some feedback, and I incorporated our company values where I could.

After revising the rubric, I shared it with the rest of the team. I asked if the criteria made sense and if folks agreed with the following rating categories we’d initially decided on:

  • Completeness;
  • Correctness;
  • Not a robot;
  • Presentation and simplicity.

The team was on board with the criteria, so we moved forward with the peer-review process.
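
To make the rubric concrete, here is a minimal sketch of how a per-ticket peer review against these four categories could be modeled, using the thumbs up/down ratings mentioned later in the interview. The structure and names are purely illustrative, not Klaus’s actual data model:

    from dataclasses import dataclass, field

    # Wistia's four rating categories, as listed above.
    CATEGORIES = ["Completeness", "Correctness", "Not a robot",
                  "Presentation and simplicity"]

    @dataclass
    class TicketReview:
        """One peer review of one ticket: a thumbs up/down per category."""
        ticket_id: str
        reviewer: str
        agent: str
        ratings: dict = field(default_factory=dict)   # category -> True (up) / False (down)
        comments: dict = field(default_factory=dict)  # optional note per category

        def score(self) -> float:
            """Fraction of categories rated thumbs up."""
            return sum(self.ratings.values()) / len(self.ratings) if self.ratings else 0.0

    # Example: Champ A reviews one of Champ B's tickets, three thumbs up out of four.
    review = TicketReview("T-1042", reviewer="Champ A", agent="Champ B",
                          ratings={c: c != "Not a robot" for c in CATEGORIES})
    print(f"{review.score():.0%}")  # 75%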


Don’t miss out! Check out our Fireside Chat with Stacy Justino from Wistia.

How do you do peer reviews?

Our Customer Service Champs do peer reviews in groups of three. This format made the most sense given that our team consists of nine Champs, no one had done quality reviews before, and we wanted to stress-test the criteria.

I split folks into groups of three:

  • Champ A reviews tickets from Champ B,
  • Champ B reviews tickets from Champ C, and
  • Champ C reviews tickets from Champ A.

Each Champ reviews five tickets in Klaus, and then they meet to deliver the feedback in groups of three. In the 75-minute feedback session they go over the peer reviews as follows:

  • Champ A provides feedback to Champ B while Champ C acts as a facilitator;
  • Reviewers explain the reasons behind the thumbs up or down they gave, and agents have the opportunity to explain why they handled the conversation the way they did.

The same procedure repeats for two more feedback rounds.
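
The A-to-B-to-C rotation is a simple round-robin within each trio. Here is a short sketch of how the assignment could be generated; the random grouping is our assumption, since the interview doesn’t say how the trios are formed:

    import random

    def assign_peer_reviews(champs, group_size=3):
        """Split Champs into trios and rotate within each: every member
        reviews the next member's tickets, and the last reviews the first's."""
        champs = champs[:]        # don't mutate the caller's list
        random.shuffle(champs)    # assumption: trios are drawn at random
        assignments = {}
        for start in range(0, len(champs), group_size):
            group = champs[start:start + group_size]
            for i, reviewer in enumerate(group):
                assignments[reviewer] = group[(i + 1) % len(group)]
        return assignments

    # Example with a nine-Champ team, as at Wistia:
    team = [f"Champ {i}" for i in range(1, 10)]
    for reviewer, reviewee in assign_peer_reviews(team).items():
        print(f"{reviewer} reviews 5 tickets from {reviewee}")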


How do you make sure that all Champs rate tickets in the same way?

In the first two months, I acted as the facilitator in the feedback sessions. I made sure that each bullet point in the rubric was covered, asked questions to figure out whether folks disagreed or had questions about the feedback, and pointed out things that might have been missed or were worth calibrating on.

In the third month, I sat in on the feedback sessions but let the Champs facilitate on their own. For example, when Champ A gave feedback on Champ B’s tickets, Champ C facilitated that portion of the session, and I only stepped in if the facilitator missed something that needed to be addressed.

In the fourth month, the Champs held the feedback sessions without me present; instead, I read through the conversation reviews in Klaus and gave each reviewer individual feedback. There wasn’t much to comment on: the team was doing a great job applying the rubric objectively and consistently.

In the fifth month, we’ll go through one more round with me giving feedback to the reviewers. After that, we plan to run calibration sessions with the whole team at least once a quarter, on a handful of tickets, to keep evaluations consistent. We’ve also incorporated a training session on peer reviews into our Champ onboarding process, so new Champs can join the peer-review process smoothly.
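
One simple way to check whether the team stays calibrated is to have everyone rate the same ticket and measure the share of rubric categories on which all reviewers agree. The metric below is our illustration, not something the interview specifies:

    def agreement_rate(reviews):
        """Share of rubric categories on which every reviewer of the same
        ticket gave the same thumbs up/down rating."""
        categories = reviews[0].keys()
        agreed = sum(1 for c in categories if len({r[c] for r in reviews}) == 1)
        return agreed / len(categories)

    # Example: three Champs rate one ticket; they disagree only on "Not a robot".
    ratings = [
        {"Completeness": True, "Correctness": True, "Not a robot": True,  "Presentation and simplicity": False},
        {"Completeness": True, "Correctness": True, "Not a robot": False, "Presentation and simplicity": False},
        {"Completeness": True, "Correctness": True, "Not a robot": True,  "Presentation and simplicity": False},
    ]
    print(f"{agreement_rate(ratings):.0%}")  # 75%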


We’re happy to see that Klaus is helping Wistia maintain their excellent level of support.

If you’d like to start doing conversation reviews in your customer service team, give Klaus a go. It provides a great framework for peer reviews and other forms of feedback.

Written by

Stacy Justino
Director of Customer Happiness at Wistia
