How often is a control failure to blame when things go wrong? I’d be stunned if it nudged much over 30%.
In 20-too-many years working in risk management – from counter-terrorism, investigations, and political risk via crisis management to ethics & compliance – when things go wrong, the main culprits tend to be:
- We failed to consider external factors (events, actions, threat levels).
- We didn’t know what was happening in our (organizational) homes.
I’d love to write about the first point, but I’ll focus on internal behavioral failures, as we can analyze them quickly. I use a four-step process to determine what’s going wrong: access, accountability, understanding, and trust.
Access denied
Knowledge management in your average organization looks like a disgruntled teenager’s bedroom. What you need is in there, but finding it won’t be quick, pleasant, or without peril.
A recent survey of a multinational operating across 29 countries found that access to compliance (and broader risk) management materials falls away once we leave headquarters. Frontline operational folk in many industries don’t have the time (or the access) to scour an intranet built for desktops in 2003. Training that won’t display correctly on smartphones isn’t smart, and don’t get me started on the new expenses or procurement system.
Check that your people can access the tools they need to manage risk. For those with (many or most) employees in the field, it’s imperative.
Above the law
“Do as I say, not as I do” rarely leads to good things. Most of us think, “that’s not me; I’m a paragon of virtue”. In numerous sessions, I ask:
- I am ethical – with options including “all the time, most of the time, some of the time, rarely, never”.
- Other people are ethical (same options).
Generally, 65-75% of us say we’re ethical. Those pesky other people? Only 30-40% of the time. Where does this dissonance come from?
We judge ourselves on our intentions and others by their actions. If I cut you off in traffic during your commute, my ethical conscience may remain intact: I rationalize the decision because my kid is waiting for me (and it’s raining). You just see a jerk cutting you off.
At an organizational level, the distortion between our delusions, rationalizations, and intentions is magnified, especially where there is a “them and us”. Asking your people’s opinions on accountability – fair enforcement, walking the talk, adherence to the rules, and so on – will often help you identify the cynical, the disengaged, and the otherwise high-risk. When you find them, they typically don’t need judgment; they need to be heard.
A little bit of knowledge
Have you heard of the curse of knowledge? We in risk are often guilty of it – we can’t quite believe someone would defraud the company to pay off-the-books bribes while convinced they were doing the company a favor (no bungs on the books – true story). We must check our assumption that people “know the right thing to do”.
Many maladies contribute to a weak understanding of risk, but the big ones include:
- Not communicating in plain language (preferably that of the recipient).
- Not making the Call To Action clear – what needs to be done, by whom, when, and how.
- Not understanding where our rules hit their world.
We must make risk content relatable, realistic, and relevant to bridge these gaps. Not easy, but it becomes attainable when we understand the basics of user experience and design thinking. Ideally, our knowledge should extend to a basic understanding of behavioral analysis – notably memory and how we learn.
Trust
We avoid many risks when people can say:
- I don’t know/understand.
- I made a mistake.
- I saw something (wrong or not right).
But why would they? Most organizations run on cultures of “don’t bring me problems, bring me solutions”. In risk, we need people to do the opposite: things go wrong when they wrestle with complex issues alone. We also know many whistleblowers suffer significantly for raising their voices.
We need to know where people don’t feel safe. Relying on the number of calls to our reporting line without understanding where people aren’t using it will see us blindsided. Many factors – personal, cultural, organizational, and functional – impact our readiness to speak. But, in my view, those factors are overstated. Humans do as others do (often). If those around you keep quiet as bad things happen, you will, too (usually).
Ask questions about trust – including employee faith that if they raise a concern, you’ll a) do something about it and b) protect them from retaliation. Better still, invest in polling or survey data that shows not only the responses but also the drop-off rates and how long people take to respond. In a recent survey, employees took an average of 50 seconds to answer a simple question about their confidence in protection from retaliation (an age in survey terms).
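That kind of signal is easy to extract once the raw responses are exported. Here’s a minimal sketch in Python – assuming a hypothetical CSV export with respondent_id, question_id, answer, and seconds_to_answer columns, none of which come from the survey above – that computes per-question drop-off rates and average response times:

```python
# Minimal sketch: per-question drop-off and hesitation signals from a
# hypothetical survey export (columns are assumed, not from any real tool).
import csv
from collections import defaultdict
from statistics import mean

def survey_signals(path: str) -> dict:
    shown = defaultdict(int)     # times each question was presented
    answered = defaultdict(int)  # times it was actually answered
    seconds = defaultdict(list)  # time spent on answered questions

    with open(path, newline="") as f:
        for row in csv.DictReader(f):
            q = row["question_id"]
            shown[q] += 1
            if row["answer"].strip():  # blank answer = respondent dropped off
                answered[q] += 1
                seconds[q].append(float(row["seconds_to_answer"]))

    return {
        q: {
            "drop_off_rate": round(1 - answered[q] / shown[q], 2),
            "avg_seconds": round(mean(seconds[q]), 1) if seconds[q] else None,
        }
        for q in shown
    }

# A question people hesitate over and then skip tells you something
# the answers alone won't.
print(survey_signals("survey_export.csv"))
```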
Details matter
Some of you may feel this is too soft. I disagree; risk is a human discipline. I want to know what people think and feel.
Most investigations I’ve worked on involved people doing things they shouldn’t have – under pressure, through ignorance, by rationalizing the unethical, or by simple mistake.
Gathering predictive intelligence about access, accountability, understanding, and trust is easier now than ever. It’s changed the way I approach risk management altogether. I recognize that if I am to prevent issues, I need to know who needs what (kind of) support. To do that, I need human risk data.