Figure 1 - Choose your path wisely
Don’t be vulnerable to complex sources of risk just because the link between cause and effect is hard to see. Probe your capability to eliminate this threat.
The cartoon is part of The Business Illustrator’s story ‘Why Organisations Have to Change – The Rise of Complexity’ (narrated by yours truly at https://www.screencast.com/t/6PidU5dxr). This post explains why we tend to ignore sources of risk that are not obvious, and how you can build the capability in your organisation to respond to them.
‘Complex sources of risk’ are easier to see in hindsight, when all or more of the facts are known and the links between cause and effect become clearer. For example, Commissions of Inquiry might cause us to ask ‘why didn’t the people responsible for Aged Care in Australia do something about it earlier?’ Even corporations with well-established risk management systems are vulnerable – did BP ‘let’ the Deepwater Horizon disaster in the Gulf of Mexico happen? They didn’t ‘let’ it happen. It happened due to multiple factors, not all of which were apparent before the disaster.
These sources of risk are not new, of course. Consider the numerous clichés we use to describe them: ‘unintended consequences’, ‘the elephant in the room’ and ‘the emperor with no clothes’ are some examples. What’s more, in hindsight, after a risk event, these risk factors can look obvious – hence the ‘Blame Game’. I realised the problem isn’t about actioning complex risks; it’s about identifying them.
What we need is ‘prior hindsight’. How do we operate our organisations in a way that will stand scrutiny? This is a higher test of a CRO’s effectiveness than applies today. If we take ‘Governance, Risk and Compliance’, what we want is for people to make decisions appropriate to the context – whether it’s obvious, complicated or complex.
The Cynefin framework was the key to my understanding, in practical terms, of when people should comply with ‘best practice’ (the ‘no-brainers’), when I needed to get expert help, and when to ‘get prior hindsight’ of complex sources of risk. If you are not familiar with the different decision-making contexts, listen to David Snowden by clicking on the link.
This example from the Cynefin website illustrates how ‘soft’ issues in a ‘human activity system’ can have a profound effect on the system. In Figure 2 below, see how these are clustered in the ‘Complex’ part of the framework. You can see that this includes a Blame Culture; the question is how did the system evolve such a culture? Or who drove it, and why?
Figure 2 - Cynefin Framework - Mine Safety - Deloitte
If you were the CRO of this organisation, what would you infer from this diagram? Why do they have a ‘blame culture and accountability issues’? Why are all the issues clustered in the top left-hand corner, when it is likely most of the compliance effort is on the right-hand side of the diagram? What will enable you to ‘get a handle on’ how to improve compliance, make incentives work, and eliminate the ‘say–do’ gap?
In practice, we look at organisations as structures rather than systems – hence the cliché ‘silos’, and the problem of seeing where risks are interconnected. If we treat an organisation as a ‘system’, we can see how people, policies and processes make the organisation ‘tick’, using its facilities, buildings and assets. If we add ‘Human Activity’ to the word ‘Systems’ (to get Human Activity Systems, or HASs), it is easy to recognise a) that we humans do things for all sorts of reasons, hence the complex nature of such a system, and b) that we can influence how people think and act, for better or for worse.
Let’s take a simpler system: a ‘traffic system’ comprising the infrastructure, and the ‘traffic’ comprising the vehicles, cyclists, drivers, riders and pedestrians. A traffic system is dynamic and, to some extent, it can be influenced – or rather, the people in the system can be influenced. For example, drivers change their routes and/or when they travel to avoid congestion. However, a traffic system can’t really be ‘controlled’ short of shutting it down. For a great TED talk on this, listen to Jonas Eliasson on YouTube.
What ‘human activity’ can we influence in a system? ‘Mindsets’ are one of the factors listed in Figure 2. Just like drivers in a traffic system, people’s mindsets, and how they exercise their judgement, are influenced by – and in turn influence – the performance of the organisation as a system. The mix of complex factors will vary; however, just as traffic systems differ yet share common factors that influence how they ‘behave’, so do organisations. Over six years we have refined some of these common organisational factors, giving insights into where and how to build the capability in the system to self-adapt.
Figure 3 - Capability in Uncertainty Survey
You can explore this in the context of your own organisation by using our Preview Capability Survey which you can access here.
The mix of factors that affects how a system behaves includes those which may not register in a conventional ERM system – for example, ‘sensitive issues’ like inappropriate leadership styles and/or decisions, ‘elephants in the room’ and so on. Responses to the full survey give insights into what people think, in a way that allows them to describe what stops them doing their job, what they worry about and so on.
For the CRO, this feedback can be the basis for addressing the challenge of operating their organisation in a way that will stand scrutiny – a higher test of a CRO’s effectiveness than applies today.
Next in this series: ‘Governance, Risk and Compliance’ – effective Operational Risk, Organisational Risk and Strategic Risk capability.