
Never Trust a Windfall: A Lesson on Data Collection and Analysis

Last updated by Jeff Hajek on January 6, 2020

One of the things I teach people when doing data collection and analysis is to be suspicious whenever anything unexpected happens. If you suddenly see a major uptick in your productivity numbers or a monumental drop in your defect rate, make sure you understand why, or it WILL come back to bite you.

Here’s a case in point.

Wasatch County, in Utah, recently held an emergency council meeting to deal with a budget shortfall. Now, this was not an ordinary meeting. The shortfall was due to a typo.

Let me explain.

The property tax rate there is variable, with the intent of creating stability in governmental budgets. This is good in theory. They don’t want public services, for example, to depend on how well housing prices are holding up, and good fiscal practice says you shouldn’t spend more money just because more is available. So, the rates adjust to dial in on the projected budget.

And this works great until someone drops their phone on a keyboard while entering a property value. At least that’s the working theory about why a home that was valued at $302,000 suddenly became worth about a billion dollars—more precisely, $987 million and change.

So, with a windfall in its assessed value, the home was taxed much higher, something on the order of $4.4 million, if I interpret the story correctly.

That means that all the other homeowners got a collective $4.4 million discount on their taxes. Their tax bills were sent out and paid. Except for the big one. It turns out that someone was doing a routine review later on and noticed the discrepancy, which eventually turned into the emergency meeting.
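Just to make the arithmetic concrete, here is a rough back-of-the-envelope sketch in Python using the figures from the story. The implied tax rate and the exact shortfall are my own approximations, not official county numbers.

```python
# Rough arithmetic behind the shortfall, using the figures from the story.
# The implied tax rate is a back-of-the-envelope estimate, not an official
# county number.

typo_value = 987_000_000   # assessed value after the typo (approximate)
real_value = 302_000       # what the home was actually worth
phantom_bill = 4_400_000   # tax billed against the typo value (approximate)

implied_rate = phantom_bill / typo_value
print(f"Implied tax rate: {implied_rate:.3%}")  # roughly 0.45%

# The rate is set to hit a fixed budget, so the inflated tax base pushed
# everyone else's bills down by about the amount of the phantom bill.
# When that bill goes unpaid, the gap is essentially the whole thing,
# minus the small amount the home should have owed anyway.
shortfall = phantom_bill - real_value * implied_rate
print(f"Approximate shortfall: ${shortfall:,.0f}")
```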

Now there’s not going to be enough money coming in to pay the bills. There’s going to be a lot of belt-tightening and cancelled projects in the near future.

In the end, it looks like they are going to spread the shortfall out over the next three years through increased property taxes. Eventually, the cost to homeowners should be about the same, but it could create hardship for anyone who treated that money as a bonus and already spent it.

So, what is the lesson?

First of all, poka yoke. Do everything you can to make a process bulletproof. It will be hard to do that for every process, but if you are the county assessor, assessments are what you do, and this kind of error just can’t happen. You can’t rely on people not making data entry mistakes as your safety net.
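To make that a little more concrete, here is a minimal sketch of what error-proofing a data entry step can look like, assuming a simple interactive workflow. Nothing here reflects the county’s actual software; the double-entry confirmation and the plausibility limit are just illustrations of the poka yoke idea.

```python
# A minimal poka-yoke sketch for a value-entry step: require the value to be
# typed twice and refuse anything that fails a basic sanity check. The
# plausibility limit is an arbitrary illustration, not a real county rule.

def enter_assessed_value(max_plausible: float = 50_000_000) -> float:
    """Prompt until two matching, plausible entries are received."""
    while True:
        first = input("Assessed value: ").strip()
        second = input("Re-enter to confirm: ").strip()
        if first != second:
            print("Entries do not match; please try again.")
            continue
        try:
            value = float(first)
        except ValueError:
            print("Not a number; please try again.")
            continue
        if not 0 < value <= max_plausible:
            print("Value outside the plausible range; please try again.")
            continue
        return value
```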

Second, as I said at the start, make sure that any big aberration gets checked. It looks like they eventually did, but far too late, and it sounds like it wasn’t a mandated check, just one done by a curious individual.

And finally, make sure your technology supports you. In this case, the tech, as programmed, took in the bad data and gave a bad result. Actually, what happened is that the tech took in bad data and turned one data entry slip into a $4.4 million problem.

I’m sure someone has already thought of this, but there will probably be a safeguard added to flag large year-over-year increases for review, or to require confirmation for extremely high property values, something like the sketch below.
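If I were to sketch that safeguard, assuming nothing about the county’s actual system, it might look something like this. The thresholds are made up for illustration.

```python
# A sketch of the safeguard described above: hold any assessment for human
# review if it jumps too far year over year or exceeds a plausibility cap.
# Both thresholds are invented for illustration.

MAX_YOY_INCREASE = 0.50           # a >50% jump in one year triggers review
MAX_PLAUSIBLE_VALUE = 50_000_000  # confirmation required above this value

def needs_review(previous_value: float, new_value: float) -> bool:
    """Return True if the new assessment should be confirmed by a person."""
    if new_value > MAX_PLAUSIBLE_VALUE:
        return True
    if previous_value > 0:
        change = (new_value - previous_value) / previous_value
        if change > MAX_YOY_INCREASE:
            return True
    return False

# The typo from the story would have tripped both checks.
print(needs_review(302_000, 987_000_000))  # True
print(needs_review(302_000, 315_000))      # False
```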

I’ll leave you with this final thought.

There’s a tool called FMEA, or failure mode and effects analysis. Engineers use it on product designs to force them to think about the ways something can break. While you probably don’t need to go through all that rigor on your processes, the principle is valid. Look at how things can go wrong with your process and think about what happens when they do. Then focus on the ones that strike you as the biggest problems and make sure they never happen.
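For anyone who wants a flavor of how that prioritization works, here is a tiny sketch using the classic risk priority number (severity × occurrence × detection). The failure modes and ratings are my own guesses for illustration, not anything from the county’s actual process.

```python
# A lightweight FMEA-style ranking: rate each failure mode for severity,
# occurrence, and detection (1-10 each), multiply into a risk priority
# number (RPN), and work on the highest scores first. The failure modes
# and ratings below are illustrative guesses, not real assessments.

failure_modes = [
    # (description, severity, occurrence, detection)
    ("Typo in assessed value during data entry", 9, 4, 8),
    ("Tax rate calculated from an unverified total", 8, 3, 7),
    ("No mandated review before bills are mailed", 7, 5, 9),
]

for name, sev, occ, det in sorted(
    failure_modes, key=lambda fm: fm[1] * fm[2] * fm[3], reverse=True
):
    print(f"RPN {sev * occ * det:3d}  {name}")
```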
