We build goals and processes with the best intentions, but often the outcomes aren’t what we expected. In some cases, they actually run counter to the original intention.
Let’s look at a historical example before coming back to something a bit more practical.
Lowering the Cost of Housing
Due to various factors, many states and cities during the WWI and WWII eras adopted rent control policies. These policies effectively set a maximum price on rent. We can imagine the intention being something akin to “Make housing more affordable by setting a maximum price on rent.”
Unfortunately, intentions and outcomes aren’t always the same thing.
With the gift of hindsight, we can see that rent control often doesn’t make housing more affordable. When you put a cap on the prices landlords can charge, housing becomes more affordable for the consumer, yes, but you also limit the upside for landlords.
The end result is that you incentivize behavior that doesn’t fall in line with your intention, like landlords neglecting properties because the profit isn’t worth the work. Or even worse, burning the building down to reuse the lot for something else.
Measuring Output in Customer Support
Let’s turn to a more practical example. I don’t know much about the housing market, but I do know something about customer support.
As one part of measuring individual performance, support managers often look at output measured in replies per hour. We could even use this as a crude way to figure out staffing. For example, let’s say we have 1,000 tickets coming in per week that take 3 replies to solve on average. We need 3,000 total replies across our staff of 10 support pros. That’s 300 per pro per week, or 60 replies per day on a five-day week.
Bingo! We could draft our intention as “Handle incoming customer inquiries by setting a performance benchmark of 60 replies per day.”
A month down the road, we’re pleased to find that all of our support pros are hitting 60 replies per day, but tickets now take 4 replies to solve on average. Now, the queue gets more backed up with each passing day! What gives?
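The staffing arithmetic above can be sketched in a few lines of Python. The figures match the example; the five-day work week is my assumption:

```python
# Back-of-the-envelope staffing math from the example above.
tickets_per_week = 1_000
replies_per_ticket = 3       # average replies to resolve one ticket
support_pros = 10
workdays_per_week = 5        # assumption: a five-day work week

total_replies = tickets_per_week * replies_per_ticket          # 3,000 replies needed
per_pro_per_week = total_replies / support_pros                # 300 replies per pro
per_pro_per_day = per_pro_per_week / workdays_per_week         # 60 replies per day

print(per_pro_per_day)  # 60.0
```

Note how sensitive the benchmark is to `replies_per_ticket`: if that average creeps from 3 to 4, the same headcount can only resolve 750 of the 1,000 weekly tickets, which is exactly the backlog in the story.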
Well, our intention was pure, but we actually incentivized sending emails, not resolving customer problems. One hypothesis could be that our support pros are now solely focused on output. So, each ticket requires more replies to solve because replies are optimized for speed, not quality.
There’s More Complexity Than We Realize
I chose two examples above, one historical and one practical, to illustrate a broader point.
Incentivizing one behavior (like sending more replies per day) can change the behavior of the whole system and lead to an entirely counterproductive result. There are certainly tools to address this. For one, you can build goals and incentives around the desired outcome (and not a downstream proxy of it). Second, you can use shorter evaluation intervals to check whether you’re having the intended effect.
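To make the first tool concrete, here’s a hypothetical sketch of goal-setting around the outcome (resolved tickets) rather than the proxy (replies sent). Every name and number here is illustrative, not a real metrics system:

```python
# Hypothetical example: score a support pro against the outcome we want
# (tickets resolved), while tracking replies-per-resolution only as a
# health check, never as the goal itself.
from dataclasses import dataclass

@dataclass
class WeeklyStats:
    replies_sent: int
    tickets_resolved: int

def resolution_score(stats: WeeklyStats, resolved_target: int) -> float:
    """Fraction of the resolution target achieved this week."""
    return stats.tickets_resolved / resolved_target

def replies_per_resolution(stats: WeeklyStats) -> float:
    """Health-check metric: rising values hint at speed-over-quality replies."""
    return stats.replies_sent / stats.tickets_resolved

# Example week: 270 replies sent, 90 tickets resolved, target of 100.
week = WeeklyStats(replies_sent=270, tickets_resolved=90)
print(resolution_score(week, resolved_target=100))   # 0.9
print(replies_per_resolution(week))                  # 3.0
```

The design point is that the target lives on `tickets_resolved`; `replies_per_resolution` is only monitored, so there’s no incentive to inflate reply counts.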
Overall though, it’s something to keep top of mind when you think you fully understand a problem. Behind every simple problem and solution, there’s often more complexity than we can ever imagine.