(This article was originally published in February 2018 as part of the CILT Simulation Best Practices series. Mark Fogerty, Challenge Projects Ltd, shares his view of best practice on the simulation approach, focussing on the Model Build & Validation element of the series.)

Simulation projects can be broken down into five main elements:
1. Model definition
2. Data collection
3. Model build and validation
4. Experimentation and analysis
5. Communication and implementation

In this Simulation Forum newsletter, Part 3a focuses on the criticality of the simulation approach. This is the first of a number of parts that comprise the Model Build & Validation element of the Simulation Best Practices series.

Criticality of the simulation approach

The reference to simulation, and indeed the previous articles in this series, suggests advanced algorithms and tools. But are these the keys to success? Simulation is not only about a tool: the approach and experience are possibly even more important, and simulation is not always the answer.

Selecting the right tool for the job will certainly play a role in the outcome. Consider the following parallel with simulation tool selection. If you hire the best car, will it necessarily get you to the right destination? Of course not. You are also reliant upon fuelling the car (data), steering and control (modelling governance), the driver (user skills) and a range of other factors such as congestion and other drivers (data integrity, business strategy, operations).

Perhaps the answer lies in defining the objective. If an exercise is focussed upon understanding complex interdependencies over time, then the tool probably plays a more significant role in success. We of course need to recognise that different simulation algorithms, of equal merit, may well deliver somewhat different results, although hopefully directionally the same. However, the complexity of the algorithms and the number of user parameters are, in themselves, unlikely to materially impact the outcome, especially for the many organisations who would benefit from any advancement in decision support, rather than from the best tool.
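To make the "interdependencies over time" point concrete, even a few lines of code can sketch what the simplest discrete-event simulation captures and a static spreadsheet average cannot: waiting times emerge from the interaction of random arrivals and service times. The toy single-server queue below is purely illustrative (the function, rates and seeds are hypothetical examples, not taken from any tool discussed here); the two seeds stand in for two equally valid algorithms giving different numbers but the same directional story.

```python
import random

def simulate_queue(arrival_rate, service_rate, n_customers, seed=0):
    """Toy single-server queue: returns the mean customer wait time.

    Illustrative sketch only -- a minimal discrete-event simulation,
    not the method of any specific commercial tool.
    """
    rng = random.Random(seed)
    server_free_at = 0.0   # time at which the server next becomes idle
    arrival_time = 0.0
    total_wait = 0.0
    for _ in range(n_customers):
        arrival_time += rng.expovariate(arrival_rate)   # random arrival gap
        start_service = max(arrival_time, server_free_at)
        total_wait += start_service - arrival_time       # time spent queuing
        server_free_at = start_service + rng.expovariate(service_rate)
    return total_wait / n_customers

# Two seeds (stand-ins for two equally valid simulation approaches)
# yield different numbers but the same direction: higher utilisation
# of the server means markedly longer average waits.
low_util = [simulate_queue(0.5, 1.0, 5000, seed=s) for s in (1, 2)]
high_util = [simulate_queue(0.9, 1.0, 5000, seed=s) for s in (1, 2)]
```

The exact averages differ between seeds, but both runs agree that pushing utilisation from 50% to 90% degrades waiting times sharply, which is the directional insight a decision-maker actually needs.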

We must also consider that there is a polarisation between users and approaches:

1. A small proportion of large organisations champion advanced simulation.
2. At the other end, there are organisations who do some spreadsheet-based (.xls) modelling, or "do nothing" and rely on intuition.
3. Then there is a huge tranche in the middle that lacks the capability, the inclination or, in many cases, the right tools.

Let’s focus on this huge middle tranche.

Many organisations in this tranche may not be familiar with how simulation differs from other modelling approaches. Similarly, such organisations are unlikely to be able to attract and retain the required skill sets. And of course, the skill sets go beyond the modeller, and need the sponsorship, and thus the understanding, of senior management. Given that so many organisations are still working hard on the basics (data management, supply chain skills, and balancing the issues of today against the strategic agenda for tomorrow), simulation may not even feature on the agenda. A simulation expert trying to convince senior management of the merits of the best tool may be a lone voice competing for attention, let alone budget.

The tool is important as an incubator for solutions: a means to challenge existing thinking and easily test options. However, many organisations are not focussed on the ideal state that may be identified through optimisation and simulation. Instead, they are focussed upon the need for like-for-like cost and performance improvements and a competitive edge. Whilst many of us buy into supply chain being the new competitive edge, where excellence makes all the difference, there may be other areas of lower-hanging fruit. Perhaps 80% of the benefit can be identified through 20% of the simulation effort and tool functionality. Maybe we are better off balancing simulation tool capability against other critical factors, such as the excellent and accelerated delivery and operation of a solution. Whilst some might argue that the simulation results are then less optimal, is this definitely worse than a perfect model that is less well executed, sponsored or operated?

If we are to argue that the tool is the most critical component in the mix, then perhaps we risk compromising other critical factors: stakeholder engagement, data validity, the blue-sky thinking of people, and assessing the operability and implementability of solutions, which could cost us more dearly. In such an event, even the best tool will deliver something that stakeholders don't understand, and which then gets morphed into something that differs from the ideal identified through simulation anyway.

The best tool won’t necessarily deliver the best business outcomes. In fact, the right tool in the wrong hands, with the wrong data and the wrong approach, is very dangerous. Rarely does the modelling consider the breadth and depth of implementation considerations: elapsed time, skills required and available, risk to business as usual and to the project, people and change (culture), and the lack of appetite for multi-year complex programmes. For all the functionality of simulation, the biggest barrier is often getting the model launched out of the incubator, out of the Boardroom and into implementation, which comes down to people with the right skills and experience of delivering change.

Perhaps the biggest component in the mix is pragmatism: knowing how to deliver, not just simulate, the biggest value for the least effort.

Mark Fogerty
Challenge Projects Ltd – Supply Chain Consulting