Confidence ratings


It’s amazing how much the weather forecast for the same future period can change from one day to the next. For instance, forecasters originally expected this past weekend in Chicago to be in the mid-90s, but a few days later they revised the forecast down to the mid-80s. And when the weekend actually came around, it barely got above 80.

With this in mind, it would be helpful if the forecast were accompanied by some extra data. In particular, I’d like to see a “confidence rating” that shows how likely the forecast is to be accurate. In other words, forecasters would calculate how accurate past forecasts for this date ended up being, given the same amount of lead time before the actual date in question. Or if that type of historical data doesn’t exist, they could display some sort of standard deviation to illustrate how much the observed temperatures on a given date tend to vary from the long-term averages.
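To make the idea concrete, here’s a minimal sketch of both approaches. The records, the tolerance cutoff, and the function names are all hypothetical, invented purely for illustration; a real forecasting service would use a far richer dataset and error model.

```python
import statistics

# Hypothetical history: (lead_days, forecast_high_f, observed_high_f)
history = [
    (3, 95, 81), (3, 88, 84), (3, 90, 86),
    (3, 85, 85), (3, 92, 83), (3, 87, 88),
]

def confidence(history, lead_days, tolerance=5.0):
    """Fraction of past forecasts with the same lead time that landed
    within `tolerance` degrees of the observed value."""
    errors = [abs(f - o) for d, f, o in history if d == lead_days]
    if not errors:
        return None  # no comparable past forecasts to judge by
    return sum(e <= tolerance for e in errors) / len(errors)

def spread(observed_highs):
    """Fallback: standard deviation of observed highs for this date,
    showing how much the value tends to vary year to year."""
    return statistics.stdev(observed_highs)

print(confidence(history, lead_days=3))
```

With the toy data above, four of the six three-day forecasts landed within five degrees, so the rating comes out to about 0.67. The fallback `spread` function captures the second idea: even without forecast-accuracy records, high year-to-year variance on a date is itself a warning to take the estimate loosely.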

Either way, the goal is the same: give customers a way to evaluate an estimate based on the performance of similar estimates in the past. If the historical data says that the estimate isn’t very accurate under the given set of conditions, or that the value in question often fluctuates wildly, people will put less stock in it and plan accordingly.

If the folks making these estimates aren’t comfortable saying things like “We’re only 40% confident in today’s weather forecast”, they can just create a scale to express relative confidence. Using a range from A to C should do the trick. In any event, the additional data would make the predictions a lot more useful, and increase the trust that customers place in them — even when things turn out differently than expected.
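The relative scale could be as simple as bucketing a numeric confidence value behind the scenes. A sketch, with cutoffs that are entirely arbitrary and chosen only for illustration:

```python
def letter_grade(conf, cutoffs=(0.8, 0.5)):
    """Map a 0-1 confidence value onto a relative A-to-C scale.
    The cutoffs are arbitrary illustrative thresholds."""
    if conf >= cutoffs[0]:
        return "A"
    if conf >= cutoffs[1]:
        return "B"
    return "C"

print(letter_grade(0.9), letter_grade(0.6), letter_grade(0.4))  # A B C
```

The customer never sees the underlying percentage, only the grade, which sidesteps the awkwardness of publishing “40% confident” while still conveying how much stock to put in the forecast.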