Points of View

Simulation Best Practices - Model Build & Validation

(This article was originally published in February 2018 as part of the CILT Simulation Best Practices series. Mark Fogerty, Challenge Projects Ltd, shares his view of best practice on simulation approach, focussing on the Model Build & Validation element of the series.)

Simulation projects can be broken down into five main elements:
1. Model definition
2. Data collection
3. Model build and validation
4. Experimentation and analysis
5. Communication and implementation

In this Simulation Forum newsletter, Part 3a focusing on the criticality of the simulation approach, Mark Fogerty, Challenge Projects Ltd, shares his view of best practice on simulation approach. This is the first of a number of parts that comprise the Model Build & Validation element of the Simulation Best Practices series.

Criticality of the simulation approach

The reference to Simulation, and indeed previous articles in this series, suggest advanced algorithms and tools. But are these the keys to success? Simulation is not only about a tool. The approach and experience are possibly even more important…and simulation is not always the answer.

Selecting the right tool for the job will certainly play a role in the outcome. Consider the following context and parallels to simulation tool selection. If you hire the best car, will it necessarily get you to the right destination? Of course not. It is also reliant upon fuelling the car (data), steering and control (modelling governance), the driver (“user skills”) and a range of other factors, for example congestion and other drivers (data integrity, business strategy, operations).

Perhaps the answer lies in defining the objective. If an exercise is focussed upon understanding complex interdependencies over time, then the tool probably plays a more significant role in success. We of course need to recognise that different Simulation algorithms, of equal merit, may well deliver some different results, although hopefully directionally the same. However, the complexity of the algorithms, and the number of user parameters, is in itself unlikely to materially impact the outcome, especially for the many organisations who may benefit from any advancement in decision support, rather than needing the best tool.

We must also consider that there is a polarisation between users and approaches:

1. A small proportion of large organisations champion advanced simulation.
2. At the other end, there are organisations who do some .xls based modelling or “do nothing” to support intuition.
3. Then, there is a huge tranche in the middle that has neither the capability, the inclination nor, in many cases, the right tools.

Let’s focus on this huge middle tranche.

Many organisations in this tranche may not be familiar with how simulation differs from other modelling approaches. Similarly, such organisations are unlikely to be able to attract and retain the required skill sets. And of course, the skill sets go beyond the modeller, and need the sponsorship, and thus understanding, of senior management. Given that so many organisations still work hard on adopting the basics, data management, supply chain skills and balancing the issues of today vs the strategic agenda for tomorrow, then simulation may not even feature on the agenda. A simulation expert trying to convince seniors of the merits of the best tool may be a lone voice trying to compete for attention, let alone budget.

The tool is important as an incubator for solutions, to challenge existing thinking and easily test options. However, many organisations are not focussed on the ideal state that may be identified through optimisation and simulation. Instead, they are focussed upon the need for like-for-like cost and performance improvements and a competitive edge. Whilst many of us buy into supply chain being the new competitive edge where excellence makes all the difference, there may be other areas of lower-hanging fruit. Perhaps 80% of the benefit can be identified through 20% of the simulation effort and tool functionality. Maybe we are better balancing simulation tool capability with other critical factors such as the excellent and accelerated delivery and operating of a solution. Whilst some might argue that the simulation results are less optimal, is this definitely worse than a perfect model that is perhaps less well executed, sponsored or operated?

If we are to argue that the tool is the most critical component in the mix, then perhaps we risk compromising other critical factors: stakeholder engagement, data validity, blue-sky thinking of people, assessing the operability and implementability of solutions, which could cost us more dearly. In such an event, even the best tool will deliver something that stakeholders don’t understand, and which thus gets morphed into something that differs from the ideal that was identified through simulation anyway.

The best tool won’t necessarily deliver the best business outcomes. In fact, the right tool in the wrong hands, with the wrong data and the wrong approach, is very dangerous. Rarely does the modelling consider the breadth and depth of implementation considerations: elapsed time, skills required and available, risk to business as usual and project, people & change (culture) and the lack of appetite for multi-year complex programmes. With all the functionality of simulation, the biggest barrier is often getting the model launched out of the incubator and out of the Boardroom and into implementation, which comes down to people with the right skills and experience of delivering change.

Maybe, the biggest component in the mix is pragmatism – know how to deliver, not just simulate, the biggest value for the least effort.

Mark Fogerty
Challenge Projects Ltd – Supply Chain Consulting


Exploring the Associate Consulting Model

Attracting and retaining the breadth and depth of skills within an internal team is becoming increasingly difficult. This is even the case in the Professional Services firms, who have for a long time operated with a very heavy reliance on internal resource and have huge global scale.
Traditionally, taking on additional resources in project environments has been speculative, binary and a step increment. Building capability is now about blending core in-house capability with specialist flexible external resources, to provide tailored solutions when and where needed. Organisations are increasingly looking to trusted Associates who can provide flexible, focussed, specialist experience to complement internal teams and can generate value on shorter lead times.
The core Associate value-proposition is premised upon deep and broad industry subject matter experience, a structured approach and consulting skills. However, part of the resistance to using external Associates is a perceived loss of control, culture and ways-of-working fit. The typical maturity of Associates, along with a vested interest by both parties, usually far outweighs any “control” concerns. Whilst Associates need to form part of a cohesive team, part of the role of the Associate, and of the client, is to allow a different perspective to flourish that isn’t available within the in-house team. The availability of an Associate can often be proposed as a barrier, given the imperative for the Associate to keep fee-earning. However, this is no different to internal pressures to minimise “bench-time”.
Perhaps the success of an “Associate” relationship lies within its meaning, whereby an individual is “project-ready” and both parties nurture, and invest in, a trusted and meaningful relationship over a period of time, resulting in a seamless but complementary extension of capability.


The Problem with Data

For all the talk of Big Data, advanced analytics and sophisticated simulation algorithms, many organisations still struggle with producing a historical snapshot, let alone a forward projection, of activity, stock, costs and the key supply chain flows and infrastructure. Managing data, having confidence in its validity and the capability to quickly extract and process it, is possibly one of the most important business capabilities for understanding root causes, specifying tactical solutions and assessing options, benefits and risks going forward. Once a good data person is identified, they all too often become overwhelmed with demands, resulting in risks to data validity and inappropriate application and interpretation.


Today's Fulfilment or Tomorrow's Improvement?

In many organisations, the challenge remains about executing the basics to a consistent and, at least, parity performance level, whilst also planning the next generation capabilities. Backfilling existing resource with short term, mid-level interim resource can provide the head-room to release internal resource to design and deliver the change for tomorrow. The longer term change can also be facilitated and expedited by experienced external advisors who have cross sector experience, tried and tested approaches, vendor relationships and subject matter expertise. External advisors come in many forms, so getting the right fit for the requirement, deliverables and culture is critical – do you want “developers of solutions and doers” or senior management strategic steering?


Marrying experiences of 3PL users with providers

There remain markedly different views from users vs providers of 3PL services around the prescriptive nature of relationships, collaboration and innovation. Often this has resulted from the way the requirement was defined, procured, documented in a contract and contract managed. In any event, there are organisational and cultural ways of working and behaviours that can positively or negatively impact on the performance of an agreement. In challenging times, this can often be positively impacted by an experienced independent advisor who understands the information, commercial and operational needs of both parties.


The Challenge of Delivering Supply Chain Transformation

Evidence suggests that there is still a challenge of delivering supply chain transformation. There remains a challenge pairing great strategic ideas and concepts, analysis and design, with the ability to validate solutions and implement them. This is often exacerbated by discrete project phases and teams. Independent external advisors can be a cost effective way of complementing internal capability with focus and experience, transferring skills and providing assurance.


Back to basics or Cutting Edge Initiatives?

I do enjoy and recognise the benefits of conferences and on-line webinars. They are a great opportunity to take some precious moments away from the bustle of the day-job, think about the wider picture and think differently. Invariably, the most useful part is hearing about other organisations’ application of a new approach, technology or change.

Typical examples of some of these cutting edge practices include: bigger and more granular models, more advanced network optimisation algorithms, Big Data, Control Towers, and even effectively merging product in transit.
At the opposite end of the spectrum, there is the reality that many operations still rely upon immature processes such as vehicle key and fuel card control, routing system parameter configuration, KPI data capture and appropriate transit packaging. Perhaps the two biggest issues, however, that I have yet to see performed well in all but a handful of operations, are:
1) Capturing, maintaining and protecting (handling) Product Data
2) Managing the Returns process

One might expect that the organisations presenting their cutting edge practices are totally separate from those who have some very immature practices. However, my experience suggests they are invariably the same organisations, exhibiting a few leading edge practices, many on-parity and some very outdated practices. A key difference is that the large cutting edge initiatives will attract the funding, senior management attention and effective project management. This can result in further divergence of practices within the organisation. We therefore end up with situations where we are deploying transport optimisation tools and processes with feedback loops from telematics (cutting edge), but lack appropriate Product and Vendor address data (the basics) to make the optimisation valuable. Or, we may have the best planning system and the best people to use it (cutting edge), but if we haven’t got the multi-site IT networking sorted (the basics), then the benefits will be compromised.

So how do we reconcile cutting edge processes with the very immature processes?
Where should organisations focus their efforts?

Logically, one would advocate “sort out the basics”. If we can’t get some of the basics right, then does an organisation really have the foundations and proven capability to adopt the cutting edge practices? However, the cutting edge practices are the items that will drive the strategic development and the customer experience. There are clearly dependencies, but the two are quite different in their complexity, scale of change and the skills and capabilities needed to deliver.
It’s impossible here to quantify the relative benefits. The accepted norm is that there is greater value to be driven out at the strategic design level. However, I wonder if this assumes operations and tactics are on-parity.

Driving Cutting edge initiatives and Operations Excellence must remain a valid aspiration, but we must also focus on the very tangible “back to basics” agenda. We must recognise that Operations are rarely anything like “steady-state”. Instead Operations are in a constant state of flux. Capacity and capability for continuous improvement must be considered rather than assumed to be part of the day-to-day Operation.


The nth reason projects fail

There are many different lists of reasons why projects fail. Typically they include 8-12 reasons, such as sponsorship, governance, risk management, poor strategic alignment and even the wrong attitude. However, I wonder if there is an nth reason?

Projects, being about change, carry uncertainty. Even with the best planning, we will only really understand the finest detail of the solution as we go through the implementation, operate and troubleshoot it. This is never more true than of complex projects, with multiple streams and the integration of equipment, solutions and systems from multiple partners. We can pore over specifications, do reference site visits and process mapping; however, there comes a point where we have to accept some risk and move forward to actually deliver a solution.

Almost always, there remains a gap between what we understand will be delivered, and what is required to implement it, versus reality. The issue is we don’t know there is a gap. This isn’t as simple as tightening-up specifications or contracts. In summary, we are ignorant. Using Donald Rumsfeld’s words, we have “unknown unknowns”.

The purist may argue that unknown unknowns should appear as a catch-all on the risk register; risk plans and actions should always be specific, but with unknown unknowns we can’t be specific, else they’d be “knowns”.
Unknown unknowns are a major source of risk. We should constantly be striving to uncover them. We can at least then make assumptions and mitigations. The project governance should pay attention to this area; where and when will the curve-balls come? This is difficult because all we can do is ask “what don’t we know?” or “what have we missed?” This question is so intangible it is invariably met with silence, for obvious reasons. But, experience shows unknown unknowns exist.

So what’s the solution?
• A constant review of the dependencies, processes and specifications
• A critique of every single component, in isolation and in the context of its handshakes with other components
• Integration of partners’ solutions
• Heed the advice and warnings of practitioners and vendors who will speak from experience about the detail and the less obvious considerations, often outside of their direct remit
• Assurance to bring challenge, experience and a different perspective

So perhaps, another trick of the governance organisation is the ability to juggle invisible balls that have the potential to become very visible.


The Pragmatic Programme Manager

A large organisation was recruiting recently for a Programme Manager (PM) requiring a pragmatic approach. Who wouldn’t describe themselves as “pragmatic”? It’s a given, isn’t it?

“Pragmatic” refers to dealing with practical matters and lessons, dealing with things sensibly and realistically, rather than theoretically.

A requirement for a pragmatic PM sounds like a nod to the need for structured governance, but possibly with some degree of reticence. One could liken this to buying insurance. I believe the pragmatic PM relates to minimum, simple and effective project process. But how do we know what these are? The only reference sites we have are the standard project methodologies. They exist because they have been proved to work, and they are designed to be tailored. Pragmatic governance will be subjective. There comes a point where diluting the process becomes rather meaningless; we realise we have gone too far and the dilution has cost us dearly. During a long programme, governance is at risk of becoming diluted in any event, with pressure to “just do it”. Diluting the process from the outset, surely, is just a faster track to uncontrolled change.

On significant endeavours, the PM is often the “minister without portfolio”. He/she doesn’t have BAU responsibilities or usually a large team. Beyond the change process, the PM often picks up all the activity that falls between established functional departments. Given that change is multi-functional, this activity is usually considerable. Is this being pragmatic? Is this understood?
Perhaps a requirement for a pragmatic PM somehow relates to the person spec. Do you want someone who has just passed the project management exam and so is theoretically capable, or do you want a Gantt jockey, or do you want someone with functional content, control and structure, with broad and deep business context? The latter is most likely to be the pragmatic PM, but it is a difficult balance. If I’m in the detail, who’s managing the project? What takes priority: managing the process or doing? If I’m “doing”, I’m highly likely to lean towards the areas of my subject matter expertise, which may lead to an unbalanced view of the project across the streams.
Nearly 20 years ago, in an interview, I was given a situation where there was a heap of time critical orders to get out of the door. Would I stand back and manage, or roll my sleeves up and get stuck in? I replied, “get stuck in”, at which point I realised I couldn’t win this one. The interviewer suggested I dress for work in overalls. I did get the job.

More recently, during a particularly sticky implementation, we had everyone including senior management, for many weeks, out on the warehouse floor. I was still trying to juggle the much diluted project process with on-floor operations. Part of the “project process” included negotiating overdue payments for GNFR suppliers, managing late delivery from the equipment and systems delivery partners, approving invoices for hotel bills so the project team had somewhere to stay each night etc. As is the case with so much of PM’s roles, this was invisible.

On a recent project, that I joined part way through, I proclaimed right from the outset that there were no clear deliverables, specification, or plans for one of the key streams. The push-back was that we needed to be more Agile. The project was compromised as we had no idea of what was coming, when, the risks, how to adopt it or mitigate. Was this pragmatic project management or a collective failure to adopt appropriate governance?

Tailoring governance is critical. It relies upon the experience of the PM, with the agreement and support of the Steering team. Light touch (pragmatic) governance has its place, possibly on non-business critical changes, whereby, if they fail or get delayed, they will not impact the overall business success. The root issue here is perhaps the lack of understanding of the breadth that a PM will typically be immersed in, and the perceived value. Ultimately, by diluting governance standards, possibly too far, will the business accept increased risk of delays, cost overruns and customer service failures?