It’s 2 in the morning.  You are stopped at a well-lit, completely empty intersection looking up at a red light.  If you’re like me, you will wait till that light turns green before taking your foot off the brake, but there is a nagging little part of you that bristles at the thought that the rule, for that moment, seems arbitrary.  The rule should know better: the light should change for me not on its schedule but on mine.  Technology should solve this problem.

It’s 2 in the afternoon.  Your phone rings and on the other end is a colleague, relative, friend or, perhaps, even your boss.  They are worked up (and you never find out what set them off).  In an animated voice full of concern, they blurt out, “What are we going to do about those advanced persistent threats, about Heartbleed, GHOST, BASH, PLAGUE, HURRICANES?  WHAT ARE WE GOING TO DO?”  If you’re like me, you DO take risks seriously, but there’s a part of you that says to the person “Keep calm and test your controls.”  Panic never got anyone anywhere.

In my last article for Security Current, “Risk Based vs. Rule Based,” I discussed how people’s attraction to seeing security as either dealing with risks or complying with rules can create blind spots in their security program.

What I’m talking about here may seem entirely different.

But it’s not.  It’s a further look at how bias can impact how effectively a security program works in an organization.

In the introduction to his article “Decision behaviour – Improving expert judgement,” Geir Kirkebøen, a professor of Psychology and Computer Science at the University of Oslo, puts it this way:

“People’s judgements are not perfect. Even experts make mistakes. Interestingly, mistakes in judgement and decision-making are not random. Professionals deviate systematically from normative standards. Per definition, systematic biases result in reduced decision quality, and accordingly, may have serious consequences. However, the fact that judgement biases are systematic also gives hope. If the biases and their causes can be identified, it should be possible to correct them.”

Just like people are attracted to certain ways of seeing the world around them, they are also repulsed by certain ways of seeing the world.

Security professionals deal with these aversions all the time, and may harbor a few of their own.  Let’s look at them one at a time.

Rule aversion is a bias towards minimizing the importance of complying with rules.  Whether it is the IT Ops person who explains that they don’t need to follow the enterprise patching policy for those servers because “they’re not in production,” or the salesperson who tells you that the rule about not sharing their ID and password with their entire team does not apply to them because they “gotta make my numbers,” rule-averse people often sound more lazy or driven than they sound defiant.

After all, in an enterprise with any governance or command and control structure at all, saying you’re a complete renegade who will break the rules whenever you see fit does not always go over well.  But portraying yourself as someone who will “do anything to get the job done” can make you look heroic.

For the IT Ops folks, the cowboy streak expresses itself most often in the argument that they are resource constrained.  In the absence of any empirical evidence, they will justify being out of compliance with policies and rules because the overhead involved in compliance is supposedly too high (a claim that, as the sketch after these examples shows, often takes only minutes to actually test).  Examples:

  • “We can’t encrypt, it will impact performance (well no, I have not measured that)”
  • “I had to give them local admin rights, they need to install software on their machines all the time and we’re not staffed to support that (well no, I have not gone back to check how many tickets they opened to have software installed)”
  • “I know what the policy says, but this is an exception (well no, I did not follow the policy exception process, there wasn’t time)”
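To be clear about how little it would take to back up (or retire) that first excuse, here is a minimal sketch of the kind of measurement that settles it.  It assumes Python with the widely used cryptography package installed; the buffer size and iteration count are arbitrary placeholders, and a real test would use the actual workload and hardware in question.

    import os
    import time
    from cryptography.hazmat.primitives.ciphers.aead import AESGCM

    # Encrypt 100 one-MiB buffers with AES-256-GCM and report throughput.
    key = AESGCM.generate_key(bit_length=256)
    aesgcm = AESGCM(key)
    payload = os.urandom(1024 * 1024)

    start = time.perf_counter()
    for _ in range(100):
        nonce = os.urandom(12)  # GCM needs a fresh nonce for every message
        aesgcm.encrypt(nonce, payload, None)
    elapsed = time.perf_counter() - start

    print(f"Encrypted 100 MiB in {elapsed:.2f} s ({100 / elapsed:.0f} MiB/s)")

Ten minutes with something like that, run against the real workload, either proves the exception is warranted or takes the excuse off the table.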

At first glance, this may seem to be a type of laziness.  And, as Ezra Pound pointed out almost a century ago, laziness is “not the least incompatible with being very busy along habitual lines.”  Try calling a busy engineer who is on call 24/7, worked 15 hours last Friday night to support a go-live, and still needs to do a ton of maintenance tasks “lazy,” and you will get laughed at.  Truth is, it is probably bias, not laziness, that is the problem.

Sometimes, the end user who takes shortcuts and cuts corners and the IT Ops engineer who makes policy exceptions on the fly do so just because they have a bias against playing by the rules.

Maybe they have that bias because they sometimes look like a hero when they “throw out the rulebook” and charge ahead.  Maybe they have that bias because that’s the way they grew up (“Mary, Mary, Quite Contrary” syndrome, if you will).

Getting to the bottom of why someone is rule averse is generally out of scope for the security professional.  To apply the remedy that Professor Kirkebøen recommends above, the only thing you need to recognize is that the policies are not being followed because the person you keep finding out of compliance is simply rule averse.

How to influence their behavior depends on lots of factors: the severity of their rule breaking and their position in the company being two important ones that come to mind.  The best way to get a rule averse person to play by the rules that matter to you is to demonstrate to them that these are not the kind of rules they’re averse to.

In other words, show them that these are “controls,” “protective measures,” “common sense,” anything but “the rules.”  You might even have to throw a different rule under the bus to make your point.  For example: “You know how stupid it feels when you’re stopped at a red light in the middle of the night and no one’s around?  This isn’t like that; we’re just trying to protect ourselves from the hackers, and they never sleep.”

Then there are those who are risk averse.

By “risk averse” here, I don’t mean those who avoid taking risks.  I am referring to those who are in denial that there are any risks, or who minimize the risks so they can sleep better at night.  Most of us fall somewhere between Chicken Little (risk is everywhere and huge) and Pollyanna (everything is fine).

The bias towards denying risk is as dangerous to the organization as the bias against complying with rules.  At this point, the vast majority of individuals working in the corporate world have seen something about avoiding phishing scams.  But every year, the news covers hacks that succeeded because someone clicked on an infected file sent in an email from a stranger.  And since it is an electronic exchange, the victim cannot even claim “he had trusting eyes” or any of the other traditional things one might hear from victims of in-person scams.

How do you get a risk averse person to recognize that there are risks out there?  It usually starts with telling them that, unfortunately, not all lemons make good lemonade.  In other words, you need to convince them that, for better or worse, some risks are real and have to be acknowledged.  And the reason they need to acknowledge them is simple: the consequence of ignoring a risk is experiencing its impact.

It’s not as hopeless as it sounds.  They have usually had to admit, at some point in their lives, that something bad might happen, and at least during your conversation with them, they “get it.”  But they usually need a lot of reminding.

And that’s the point about all bias: it is when you are not pointing it out to someone, when they are distracted with other things, that their biases are most likely to influence their judgment.

The hardest part of the security professional’s job, with respect to all this, is encountering someone who is both risk averse and rule averse.  Nothing justifies breaking a rule like denying there is any risk to mitigate.  Nothing “justifies” running a red light like insisting there was no one around when you did it.

Finally, there are people out there whose biases are really hard to correct for.  The only way to counter some people’s rule aversion is to tell them “because you have to,” and the only way to weaken someone’s risk aversion is to tell them “we can’t afford not to take this seriously.”
