Roll the dice many times with Monte Carlo Simulation and achieve better decision making

22 July | by Kerry Mayes and Lance Rubin

Introduction to the co-author

Kerry Mayes has been working in the Decision Analytics space for over 30 years. Over that period it has taken different forms and had different names. Kerry’s training was in Operations Research (which also has many names – Management Science, Decision Science, Decision Analytics, Engineering Science, …). 

Along the way he independently developed many techniques only to find them in common practice in the Financial Modelling community: from index match to four-way modelling.

“It’s a cross between heartening and disheartening when you find that the index match you nutted out from first principles is standard practice!”

Until recently he was an independent financial consultant, but he has now joined WSP, where he focuses on infrastructure investment and valuation across New Zealand. New Zealand being such a small country, it doesn’t make sense to specialise too much, which often means he is also looking at other asset classes and businesses.

Why did Kerry select the topic and why is he passionate about it?

Kerry is an advocate of decision-led analysis and evidence-based decision making.

Decision-led analysis means defining the questions before working on the answers; this allows greater structure in the analysis, with resulting cost savings. Once the decisions have been effectively structured, evidence-based decision making allows an organisation to make the best decisions given the information available.

He likes to work on the difficult problems that require the tough decisions; those that involve uncertainty, multiple decision makers, or multiple criteria. 

He believes in a toolbox approach to analysis tools – the right tool for the job rather than “to the man with the hammer, every problem looks like a nail”.

He believes in developing systems that help organisations make better decisions both now and on an ongoing basis. Prioritisation systems, benefits realisation practices and optimisation systems all allow organisations to make incremental and step changes in their performance.

At the heart of every difficult problem lies a great deal of uncertainty. There are a few tools in his toolbox for uncertainty but the one he uses most often is Monte Carlo simulation. 

In the decision analytics journey it is still an exploratory tool rather than an optimisation tool (that is, it gives greater understanding of the problem but does not necessarily work out the best solution to that problem). In most situations, it gives the user enough understanding to formulate a good decision.

And he has a soft spot for Monte Carlo simulation having used it on and off for 30 years.

His first working stochastic simulation model (yes, stochastic simulation is a synonym for Monte Carlo simulation) after university was dismissed because he was told it “couldn’t be Monte Carlo” because he wasn’t using @Risk!

Uncertainty isn’t new. 

Decision Analytics isn’t new, and a wide range of tools for making optimal decisions on problems with uncertainty have been around for decades.

However, the level of uncertainty we are experiencing and being exposed to in our lives has been steadily increasing.

What we are therefore seeing is a greater appreciation for uncertainty and the need to make good decisions amid that uncertainty; we simply cannot ignore that uncertainty anymore. Here in Christchurch, New Zealand, in 2011 we had a huge earthquake involving one of the highest ground accelerations ever recorded.

New Zealand had expected Wellington to have a significant earthquake – not Christchurch. The need to take rare events into account in planning was thrust upon Christchurch residents by nature, where previously we had simply chosen to ignore them.

COVID-19 is similarly making things even more difficult; the past is no longer a good predictor of the future.

Christchurch was using the term “new normal” from about 2013 (a term only now being used commonly around the world), when it became clear that we would never return to the pre-earthquake normal.

We understand all too deeply that the post-COVID world will be different.

That there will be winners and losers, and that new approaches are going to be required even for the usual problems. 

In our toolbox for making decisions under uncertainty, Monte Carlo simulation is like the builder’s hammer or the plumber’s pipe wrench and is now becoming more important than ever along with the other fundamental skills of Financial Modelling in this series.

The combination of this need for decisions in uncertainty coupled with Kerry’s passion on Monte Carlo simulation is what drove him to select this particular topic.

Topic and context in a few sentences

Monte Carlo analysis is all about managing uncertainty. 

When you walk into a casino, they make you think you have as much chance of winning millions as the person walking in next to you, or as the house that bought the table or machine you are about to start playing at.

Whilst this might appear to be the case, the algorithms, rigorously tested using Monte Carlo simulations, ensure that the uncertainty you gamble on is skewed towards the casino and not you.

The house wins and you lose. Now imagine you are the house, using the same techniques to better understand the outcomes in your financial model.

Now you win.

If you had to teach this topic in a class to school kids what key tips would you give them to focus on?

The first one would be that you should avoid oversimplifying complexity by means of averages. There is an obvious reason why we would like to simplify uncertainty, so that we can better understand and manage it; however, this can undo even the best decisions.

Mentally, even when we have some understanding of probability, we tend to work with the average. There are many situations where that gives the wrong answer; this is commonly referred to as the flaw of averages.
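
A hypothetical illustration (the numbers are invented for this article): suppose a factory has capacity for 100 units and demand is equally likely to be 50 or 150 units. The average demand is 100, so a model built on the average forecasts 100 units sold. But actual sales are either 50 or 100, because the upside is capped by capacity, so expected sales are (50 + 100) / 2 = 75 units. The output of the average inputs is not the average of the outputs.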

When teaching graduates (that’s not too different from school kids, right?), I emphasise an Input (cooking ingredients) / Output (a meal) conceptual model.

If the model is well structured (ingredients cooked in the right way) then the model builder (cook) will be looking at all their assumptions (ingredients) and will find it easier to create a great meal (decision). By having this well-structured model it’s much easier to bolt on Monte Carlo simulation (the icing on the cake). Imagine trying to put icing on a cake whose middle has sunk; it isn’t pretty.

Then assign uncertainty distributions (the chance that the temperature or time is incorrectly applied) to those assumptions, look at the outputs sheet (the meal on the plate) and examine (taste) the resulting variances that are displayed.

Unlike cooking, where most good chefs follow the instructions precisely or risk burning or ruining the meal, reality simply cannot be followed that precisely.

Reality has a natural way of automatically throwing uncertainty and spanners into our well thought out, human-designed plans: spanners like a sudden change in temperature or time, and all sorts of variables we cannot even think of whilst we are cooking.

Makes for an interesting meal wouldn’t you say?

Wanna try some?

What practical steps can people take now to learn more?

While there are many ways to do Monte Carlo simulation, the quickest and easiest way to apply the techniques is to use one of the add-in tools like @Risk or Analytic Solver.

These two add-in tools (and probably others) have demo versions that you can use for a couple of weeks.

Take your well-structured financial model, download and install one of the tools, set up your assumptions with probability distributions, run the simulation and see what happens to your key results. 

This is a good way to see if the technique makes sense to you and is useful for your area of modelling. 

Once you’ve established that Monte Carlo is for you, you should do it “manually” to really get the idea of what is going on. 

There are a number of techniques within Excel for implementing Monte Carlo. 

They will all use Excel’s random number generator so that each recalculation produces new results.

In Excel, you should start with either a Data Table, the RANDBETWEEN formula, dynamic arrays, or a little bit of VBA.

Using VBA can be slow, depending on how efficiently you can code, but using dynamic arrays is game-changing and very quick – possibly as quick as some of the add-in tools, which you must buy, whilst dynamic arrays are free.

Set up the inputs to use one of the random number formulae linked to one of Excel’s statistical distribution formulae. Then use VBA to recalculate and copy results by value to a table of results.
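
As a minimal sketch of that approach (the named cell KeyResult, the “Results” sheet and the normal distribution below are assumptions made for this example, not part of any particular model), an input cell might contain something like =NORM.INV(RAND(), 100, 15), and the VBA loop could look like this:

    Sub RunMonteCarlo()
        ' Recalculate the workbook N times and copy the key output
        ' (a hypothetical named cell "KeyResult") by value into column A
        ' of a hypothetical "Results" sheet - one row per iteration
        Const N As Long = 1000
        Dim i As Long
        Dim wsResults As Worksheet

        Set wsResults = ThisWorkbook.Worksheets("Results")
        Application.ScreenUpdating = False

        For i = 1 To N
            Application.Calculate   ' forces every RAND()/RANDBETWEEN() to redraw
            wsResults.Cells(i, 1).Value = _
                ThisWorkbook.Names("KeyResult").RefersToRange.Value
        Next i

        Application.ScreenUpdating = True
    End Sub

Each call to Application.Calculate redraws all the random inputs, so each row of the Results sheet is one independent iteration of the model.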

Statistical analysis can then be done on the table of results, or the results can be presented as a distribution curve.
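
A sketch of that analysis using worksheet functions against the same hypothetical Results sheet (the percentile levels are chosen purely for illustration):

    Sub SummariseResults()
        ' Basic summary statistics on the simulated results in column A
        Dim ws As Worksheet
        Dim rng As Range

        Set ws = ThisWorkbook.Worksheets("Results")
        Set rng = ws.Range("A1", ws.Cells(ws.Rows.Count, 1).End(xlUp))

        With Application.WorksheetFunction
            Debug.Print "Mean  "; .Average(rng)
            Debug.Print "StDev "; .StDev_S(rng)
            Debug.Print "P10   "; .Percentile_Inc(rng, 0.1)
            Debug.Print "P90   "; .Percentile_Inc(rng, 0.9)
        End With
    End Sub

The same range can also feed a histogram or a chart to draw the distribution curve.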

The problem really starts when you decide on the number of iterations or samples you need in your simulation. How many iterations you need depends on the underlying uncertainty.
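
As a rough guide (standard sampling theory rather than anything specific to one model), the sampling error in an estimated mean shrinks only with the square root of the number of iterations, so halving the noise takes four times as many runs, and estimating rare events or extreme percentiles reliably takes far more samples again than estimating the mean does.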

Kerry’s first stochastic simulation was a model he “inherited” from consultants – a simple model of New Zealand’s electricity system.

At the time, New Zealand had about 60% of its electricity generated from hydroelectric dams but only about six weeks of storage. The price of electricity was determined almost entirely by the probability of a dry year resulting in shortage. The model, running in Excel in the early 1990s, took half an hour to run 100 iterations (clearly the VBA code wasn’t that efficient – but the model did have a lot of variables and autocorrelated rainfall).

However, as he started working with it, he found that the price was being determined by one iteration – one dry year in the 100. He needed to get a better estimate of the likelihood of shortage. It turned out that he needed more than 10,000 iterations to get a good approximation of the variance. 

To get that many iterations, you would think it would take 100 times 30 minutes – roughly 50 hours. Unfortunately, it was much worse.

The spreadsheet had to be bigger, so it took longer to recalculate (the bigger it got, the slower it ran). Kerry never ran it for the full 10,000 iterations before rewriting the model in C++.

So, don’t underestimate the number of iterations you need! 

Recently, Lance ran 10,000 simulations using dynamic arrays (the RANDARRAY formula) in 3 seconds. Video link
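
A sketch of that approach, assuming Excel 365 dynamic arrays and a hypothetical “Simulation” sheet (the mean of 100 and standard deviation of 15 are invented for the example): a single Formula2 assignment spills 10,000 normally distributed draws in one hit.

    Sub SetUpDynamicArraySimulation()
        ' One dynamic array formula spills 10,000 normal draws down column A
        ThisWorkbook.Worksheets("Simulation").Range("A1").Formula2 = _
            "=NORM.INV(RANDARRAY(10000),100,15)"
    End Sub

Because the whole spill recalculates as a single array operation, it is dramatically faster than looping and pasting results row by row in VBA.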

Below are two graphs showing the output distribution with 100 and 10,000 iterations, respectively. 

The estimate of the distribution smooths out considerably but notice how even the 10,000-iteration graph has areas that aren’t smooth – even more iterations may well be required.

Whichever tools or techniques you use, it is important to be able to depict the results visually, so that you can explain the outliers and also determine the most likely outcomes.

Be careful, however: when it comes to concepts like correlations and other complexities, even the most sophisticated simulation will need careful consideration.

Remember: “All models are wrong, but some are useful” (George E. P. Box).

Where are good places (links) to find out more on the topic?

www.palisade.com

www.solver.com

Dynamic Array article

How important is this skill in the context of learning Financial Modelling?

I think it’s fair to put Monte Carlo simulation down as an advanced technique. 

You can still do good models and support good decisions without ever using it.

But there are insights to be gained that can’t easily be gained otherwise. 

Kerry modelled the Christchurch Earthquake horizontal infrastructure rebuild for SCIRT (Stronger Christchurch Infrastructure Rebuild Team). 

Over $2.5B and 600 projects across the three waters and roading.

Monte Carlo simulation suggested that the portfolio effect would mean that the accuracy of the estimate of the total cost would be very high – that is, the range would be very small.
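
One way to see the portfolio effect (a standard statistical result, not specific to the SCIRT model): if a programme is made up of n independent projects of similar size, the relative spread of the total cost (standard deviation divided by mean) is roughly 1/√n times the relative spread of a single project. With 600 projects that factor is about 1/24, so even quite uncertain individual estimates can add up to a tightly bounded total, provided the project outcomes really are largely independent.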

Management didn’t believe this, so the variance was artificially raised for reporting purposes. However, the model was correct: the final outcome showed that the estimate was accurate to within 3%.

What wasn’t modelled as accurately was the schedule. SCIRT never got a good handle on estimating the schedule. Schedules do not have the portfolio effect; in fact, they tend to have queueing-like behaviour, where planning on the average time results in congested systems and long delays. (The flaw of averages struck with a vengeance.)
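
A textbook illustration of why averages fail for schedules (standard queueing theory, not taken from the SCIRT analysis): in a simple single-server queue, the average time a job spends in the system is 1 / (service rate - arrival rate). If work arrives at 9 jobs a week and can be processed at 10 a week, the system looks comfortably under-loaded “on average”, yet the expected time in the system is 1 / (10 - 9) = 1 week, ten times the 0.1-week average service time, and it grows without bound as the arrival rate approaches capacity.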

In reality, SCIRT didn’t finish the programme of work. 

These insights come from doing the modelling and finding the results that aren’t expected.

How does all this disruption, AI and automation talk impact this topic?

Modelling disruption or modelling under threat of disruption is what Monte Carlo simulation is all about. 

Exploring the consequences of uncertainty and demonstrating them to clients are both enhanced by Monte Carlo simulation.

Monte Carlo is sensitivity analysis automated. Automation isn’t new!

Monte Carlo analysis should be used before considering AI. 

In fact, there are a lot of problems that are better addressed by “HI” – Human Intelligence – than by AI!

We are quick to jump at AI for any problem that we don’t know how to solve.

Better to first check that there isn’t already a tool developed for this problem. 

Going back to my toolbox analogy, we are at risk of inventing the adjustable wrench to use where a pipe wrench has been the correct tool for generations.

If you want to find out more and follow the rest of the article series, be sure to download the Financial Modelling App.

If you want more information on financial modelling and related content, visit the Model Citizn website.
