Comparing predictions against reality

See also: Model Validation introduction

In many cases, this might be akin to 'shutting the stable door after the horse has bolted': if you have already made an irreversible decision on the basis of a risk assessment, comparing its predictions against reality may be of limited value for that decision. Even so, analyzing which parts of the model turned out to be the most inaccurate will help you focus on how to improve your risk models for the next decision, or prepare you for the consequences of having got it wrong.

It may be possible to structure a decision into a series of steps, each informed by risk analysis, with the predictions at each step compared against what has actually happened so far. For example, an investment that starts with a pilot roll-out in a test market lets a company limit its risk exposure while also evaluating how well it was able to predict the initial level of success.

Project risk analysis models, in which the cost and duration of each element of a project are estimated, are an excellent example of where predictions can be continuously compared with reality. The uncertainty of the cost and time elements can be updated as each task is completed to re-estimate the remaining duration and cost, whilst a review of each task estimate against what actually happened will show whether your estimators have been systematically pessimistic or optimistic. A useful way of evaluating the performance of those who provide uncertainty estimates is to plot where each actual value fell on the cumulative distribution of its estimate. If the estimators are well-calibrated, these values should be distributed uniformly between 0% and 100%.
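The sketch below illustrates one way this calibration check could be carried out: it applies the probability integral transform (the value of each estimate's cumulative distribution at the actual outcome) and tests the results for uniformity. The task figures are hypothetical, and the estimates are assumed to have been expressed as normal distributions; triangular, PERT, or empirical distributions could be substituted in the same way.

import numpy as np
from scipy import stats

# Hypothetical task estimates given as (mean, standard deviation),
# paired with the actual observed cost or duration of each task.
estimates = [(10.0, 2.0), (25.0, 5.0), (8.0, 1.5), (40.0, 8.0), (15.0, 3.0)]
actuals = [12.5, 22.0, 8.2, 55.0, 14.1]

# Probability integral transform: where each actual value fell on the
# cumulative distribution of its estimate.
pit = [stats.norm.cdf(x, loc=m, scale=s) for (m, s), x in zip(estimates, actuals)]
print("PIT values:", np.round(pit, 3))

# For well-calibrated estimators these values should be uniform on [0, 1].
# A Kolmogorov-Smirnov test gives a rough check; in practice many more
# completed tasks are needed for the test to have useful power.
ks = stats.kstest(pit, "uniform")
print(f"KS statistic = {ks.statistic:.3f}, p-value = {ks.pvalue:.3f}")

# Systematic bias shows up as a shifted mean: for costs and durations,
# values clustered above 0.5 suggest optimistic estimates (actuals exceeded
# the predictions), while values clustered below 0.5 suggest pessimism.
print(f"Mean PIT = {np.mean(pit):.3f} (0.5 expected if unbiased)")

In practice the PIT values would also be plotted as a histogram or against the diagonal of a P-P plot, since the shape of any departure from uniformity (clustered in the middle, piled at the tails, shifted to one side) indicates whether the estimators are overconfident, underconfident, or biased.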

Read on: Informal auditing
