Our Current Process for Handling Feedback


Yesterday, I read an article on NPR about performance reviews titled “Yay, It’s Time For My Performance Review! (Said No One Ever).” As you might expect from the title, the piece detailed where current annual performance reviews are falling short and how they need to adapt to the future of work.

The annual performance review has quite a few problems:

  • They’re too infrequent. Trying to summarize a year’s work in one meeting is insane.
  • They’re often lopsided, with feedback flowing in only one direction.
  • There are too many layers of abstraction. Managers might not work closely enough with an employee or on a project to fully appreciate the challenges.

Giving feedback in general is difficult, and no system is ever going to be perfect. We’ve been working hard on providing feedback at Automattic, and while we still face many of the challenges listed above, I think the system we have in place (for my team and a few others) works decently well.

Here’s the full system laid out in detail. If you have any thoughts, suggestions, or alternatives, I would love to hear them in the comments!

Before diving into the feedback system, I want to give a hat tip to Simon. I stole much of this current system from working with him at Automattic.

Our Current Feedback Schedule

First, we needed a way to address the issue of timeliness. Providing feedback once a year is just not enough. We operate on a continuous four-month cycle consisting of:

  1. Peer reviews – Having a colleague review and provide feedback on customer interactions.
  2. Leadback survey – A questionnaire providing me direct feedback on leading the team.
  3. 3-2-1-Oh – A more comprehensive deep dive into goals, performance, etc., between the team member and me.
  4. (Off)

So January through April is one cycle, and then we start back at the top. This cycle ensures that the team member gets direct feedback on their work every other month (in the peer review and 3-2-1-Oh).

In addition to this cycle, every team member has a 1-1 with me at least every other week if not every week. I detailed more about what goes into those 1-1’s here.

Peer Reviews

If you work in customer support, you likely have a similar setup at your organization. We pair up members of the team to review one another. If you’re the reviewer, you read through 20-30 of the reviewee’s customer interactions, chosen at random.

From those interactions, you’re looking for overall patterns or trends (both positive and negative). We’re not looking for a spelling error here and there; we’re looking for things like tone, approaches to a problem, etc. The reviewer and reviewee then meet for 45 minutes to go over the feedback. Oftentimes, the reviewer gets as much out of the session as the reviewee in terms of things they can take back to their own workflow.

Ideally, these peer reviews serve three purposes:

  1. Team members get valuable feedback from their peers and improve their craft.
  2. Team members get comfortable giving feedback. Like anything else, giving great feedback is a skill.
  3. We create a culture of feedback. When someone notices an error in a ticket, they don’t feel shy about bringing it up directly with the person.

Leadback Survey

It’s imperative for leaders to get feedback from their team members. I end each 1-1 with some variation of the following:

What can I be accountable to you for the next time we talk?

It adds a layer of accountability, and frequently, members of the team will give me things to do whether that’s clarifying the direction for the team or taking care of a specific roadblock.

Still, think of the last time your supervisor asked you for feedback on their leadership style. It’s super intimidating! There’s a strong desire to only provide positive remarks.

The leadback survey (again, stolen from Simon) is a Google Form generally consisting of three “sections,” each made up of a multiple-choice question and a follow-up explanation field. Here’s an example from the last leadback survey I sent out:

Question: I know exactly what I need to do when I come to work each day for Sparta to be successful as a team.

Follow-up: If you selected anything other than “Yes,” walk me through a recent scenario where you weren’t sure what to work on to push Sparta further as a team.

The responses are completely anonymous. At the end of the month, I read through all of the responses and write up a p2 post summarizing the answers and detailing how I’m going to try to address each point.

I’ve done two of these so far. I generally try to pick 1-2 themes to focus on and repeat a similar line of questioning in back-to-back leadback surveys to gauge improvement. Each survey then contains 1-2 returning focus areas and 1-2 new ones.

3-2-1-Oh’s

This is a popular feedback framework across Automattic that breaks down like this:

  • 3 things you do/have done well.
  • 2 areas or skills you’d like to develop further, the more specific the better.
  • 1 way your team lead and Automattic can support you.
  • And — oh! — a sentence or two on what most excites you and how you want your career to develop here.

(If you’re interested in more about the format, Andrew wrote about it here.)

I set up a 1-1.5 hour time block with every member of the team. Then, I ask the team member to do their own 3-2-1-Oh self-evaluation and send it over to me a week in advance. This allows me to read through their responses, make notes, and send them my thoughts in return. When we get together, we’ve already read over each other’s responses and have a better framework for the discussion.

The most difficult part of 3-2-1-Oh’s is condensing four months’ worth of work down into six bullet points. The goal is for these sessions to be iterative. They should build upon one another. Andrew elaborated a bit on this in his article linked above:

It’s better to keep someone on course through a series of small adjustments than through a U-turn. My goal is to have these conversations on a quarterly basis. Trying to improve a host of things about your work in that limited amount of time isn’t realistic. It’s better to narrow your focus and then regularly revisit and adjust goals.

Here’s what typically goes into a prep session as the team lead (when I’m reviewing someone else’s 3-2-1-Oh):

  • Randomly checking tickets/chats. I read through 20 support interactions to get a sense of how the individual is handling support. Yes, they get this through the peer review as well. This is more of an accessory to the peer review process.
  • p2 posts/Trac tickets/GitHub reports. I’m looking for everything from how much and how often they’re communicating to how detailed and well-organized their bug reports are.
  • Collecting feedback from peers. I reach out to everyone else on the team and those the individual interacts closely with asking for feedback (all anonymous).
  • Notes from previous 1-1’s. What kinds of larger tasks/projects did they attempt over the past few months? Did they deliver on the items we discussed in 1-1’s?

Coming out of a 3-2-1-Oh, we should have a shared sense of where we’re trying to head over the next four months and the incremental steps we need to take to get there. The feedback is then summarized and sent over to HR (the team member gives it a stamp of approval).

***

That’s a snapshot of how we’re approaching feedback currently on Sparta. The normal caveats apply. The process is likely to evolve; your mileage may vary; and other teams at Automattic handle feedback differently.

That said, I would love to hear about your feedback cycle if you have one! We’re always looking for ways to improve.
