From the course: Excel Data Analysis for Supply Chain: Forecasting
Measuring forecast errors - Microsoft Excel Tutorial
- [Instructor] Yep, we're doing this. I know, we have yet to cover even one forecasting method, and still the first thing we're going to discuss is forecasting errors. Why? Well, because forecasts are never, or at least almost never, perfect. All forecasts are flawed. So we need to understand errors. We need to compare real-life outcomes to the forecast, and we also need a way to measure which forecasts perform better than others. That's why in this chapter we'll explore three different measures related to forecasting errors: MAD, M-A-D, MAPE, and RMSE. Let me give you a really quick explanation of each measure before we jump into Excel and start applying these to data sets.

MAD stands for mean absolute deviation. Some also call it mean absolute error, but MAE isn't as interesting or easy to remember as MAD. So what is it? Well, suppose we show a picture of an adult man to two different people. We ask them to guess the man's age. One person guesses 42. The other guesses 38. So one guessed two years over the right value. The other person guessed two years younger than the right value. MAD doesn't care about over or under. It would rate these two guesses the same. Remember this when we apply it to larger data sets and forecasts.

M-A-P-E, MAPE, stands for mean absolute percentage error. This measure compares the size of the error to the size of the actual outcome. Remember our age-guessing example? The two people that made guesses were each two years off, not too bad. Two years is 5% of 40 years, the actual age of the man. Suppose we asked the two guessers to also guess the weight of the man in the picture. One guesses 202 pounds. The other guesses 198 pounds. Again, they are each off by two units, but these guesses are much more impressive. Two pounds is only 1% of the 200 pounds that is his actual weight.

Finally, we'll cover RMSE, the root mean squared error.
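To make MAD and MAPE concrete, here is a minimal Python sketch of the two formulas, using the age and weight guesses from the example above. (This is illustrative code, not the Excel steps the course demonstrates; the function names are my own.)

```python
def mad(actuals, forecasts):
    """Mean absolute deviation: the average of |actual - forecast|."""
    errors = [abs(a - f) for a, f in zip(actuals, forecasts)]
    return sum(errors) / len(errors)

def mape(actuals, forecasts):
    """Mean absolute percentage error: the average of |actual - forecast| / actual."""
    pct_errors = [abs(a - f) / a for a, f in zip(actuals, forecasts)]
    return sum(pct_errors) / len(pct_errors)

# Age guesses: the man is 40; the guesses were 42 and 38, each off by 2 years.
print(mad([40, 40], [42, 38]))    # 2.0
print(mape([40, 40], [42, 38]))   # 0.05, i.e. 5%

# Weight guesses: he weighs 200 lb; the guesses were 202 and 198, each off by 2 lb.
print(mad([200, 200], [202, 198]))   # 2.0  -- same MAD as the age guesses
print(mape([200, 200], [202, 198]))  # 0.01, i.e. 1% -- much smaller MAPE
```

Note how MAD rates the age and weight guesses identically (both off by two units), while MAPE captures why the weight guesses are more impressive.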
And when we consider squared errors, you need to realize that the bigger the error, the more dramatically squaring amplifies it. Here you can see three actual outcomes: 25, 30, and 40. The related forecasts for those three outcomes were 24, 33, and 35. This gives us errors of 1, -3, and 5. All seem pretty good, but when you square the errors, the error of five turns into 25. Okay, we now have a general idea of our three forecasting error metrics. So now it's time to head over to Excel. There we can see these metrics in action.
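The RMSE calculation on those three data points can be sketched the same way. (Again, illustrative Python rather than the course's Excel worksheet; the function name is my own.)

```python
import math

def rmse(actuals, forecasts):
    """Root mean squared error: the square root of the average squared error."""
    sq_errors = [(a - f) ** 2 for a, f in zip(actuals, forecasts)]
    return math.sqrt(sum(sq_errors) / len(sq_errors))

actuals = [25, 30, 40]
forecasts = [24, 33, 35]
# Errors: 1, -3, 5 -> squared: 1, 9, 25 -> mean: 35/3 -> square root: about 3.42
print(round(rmse(actuals, forecasts), 2))  # 3.42
```

Because the error of 5 becomes 25 after squaring, it dominates the average, which is exactly why RMSE penalizes large errors more heavily than MAD does.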