Remember when the FBI initiated an investigation of the classified emails of State Department employee Huma Abedin found on the laptop computer of her (now estranged) husband, former (and disgraced) New York Congressman Anthony Weiner?

Former FBI Director James Comey later testified that he was obligated to tell members of Congress about these emails during the election cycle because they had the potential to conflict with statements he had previously made that the investigation had been closed.

Put aside politics here for a few minutes.  Let’s talk infosec.  And behavior.

A recent report by CNN indicates that the Abedin emails found on Weiner’s laptop “were forwarded to Anthony Weiner’s computer via a backup system for her phone,” and that some emails were forwarded from Abedin’s account to Weiner’s computer (it’s not clear whether this means Abedin’s account on Weiner’s computer, or Weiner’s email account on his computer) “for printing.”  The FBI sent a letter on May 9 to the Committee chair clarifying that some 49,000 of Abedin’s emails were on Weiner’s computer because Abedin apparently synced her BlackBerry with her (then) husband’s computer, and that a smaller number of emails were forwarded from Abedin to Weiner “for printing.”

The letter concludes that, of the roughly 49,000 backed-up emails, 10 email chains contained classified information, and that two of the email chains forwarded for printing also contained classified information.

Let’s put aside the question of misuse of classified information here because it is both inflammatory and unnecessary.  We don’t know if the 12 email chains were classified at the time, whether Abedin knew they were classified at the time, or whether they were marked classified at the time.

We don’t know if the email chains which contained the classified information were originated by Abedin, or whether she was simply copied on them.  For infosec purposes, it doesn’t matter.  What does matter is that 49K emails which “belonged” to the State Department found their way to a non-State Department device.  And that happened because of a very human thing.  A human being being a human being.

I believe that the vast majority of infosec “problems” arise from people attempting to do their jobs in the right way.  It’s not so much ignorance of the right thing, or ignorance of the consequences, but more that people are trying to get their jobs done.

So if you lock down a laptop and disable the ability of users to add printers or print drivers, and disable USB or other devices to secure the data, users will inevitably and predictably email documents they need to print to other, non-secure and non-controlled accounts, and print from there.

People will find a way to get their job – or what they think is their job – done. If possible, they will store corporate or government documents on Dropbox or other cloud services so they can access them during non-work hours or using non-work devices (e.g., their iPad, phone, etc.).

They will offload documents onto USB thumb drives (with greater and greater storage capacity) so they can work on them in other ways or on other machines.  If you give someone a restricted, poorly configured, or simply “old” laptop on which to work, they will purchase their own computer at home and find a way to work from that machine.

If you restrict access to social media, users will use personal devices and networks to access this information – inevitably finding a way to merge personal and employer data.  If websites are blocked, they will find a way over, under, around or through.

This doesn’t mean that users are ignorant or malicious. It means that they are humans. We can educate them, train them, make them aware of risks.  We can reward them, punish them, cajole them, and restrict them. But the vast majority of infosec issues – including successful phishing and DDoS campaigns – begin with humans being human. You cannot train or educate your way out of this.  No technology exists to make people behave better.  You can’t make things idiot-proof – only idiot-resistant – you simply wind up with more clever idiots.

Each of us has done something “stupid” from an infosec perspective. Most of us have done something stupid from an infosec perspective at least once a week.  You log in to the Wi-Fi at a local Starbucks even though you know you shouldn’t because – well, you NEED to.  You create a tethered connection from your laptop through your phone because – well, what could happen?  You open a port on your home router because the SlingBox that streams the Washington Nationals baseball games to your cell phone says to do it.  You open an email from a trusted friend because – well, just this one time it looked legitimate. Remember, your enterprise has to be right every time – the bad guy only has to be right once.

It’s easy to castigate someone for bad (even abysmally bad) security in hindsight.  What’s more important is to examine WHY – from a human behavior standpoint – the system failed.  Some years ago, a university in England looked at the paved paths between buildings and found that students were not generally using them.  So they planted grass between all of the buildings – no paths at all.  Then they waited to see where the students walked, and later put paths where the grass had been worn down.  Rather than conforming people to rules, they conformed rules to people.

It is axiomatic that, if a CISO (or a lawyer) is perceived as being “Dr. No,” their advice will not be sought and will be avoided.  That’s why CISOs (and lawyers) pay lip service to saying “I don’t want to be Dr. No, I want to be Dr. How…”  But in practice, particularly in regulated environments, they act as Dr. No, and users then circumvent their rules, creating greater security problems than the rules were intended to solve.

So infosec starts first and foremost with understanding organizational and human behavior – something most infosec people know little about, and an area which is ripe for research.  I profess my own scathing ignorance here as well, and welcome comments and case studies.  But I firmly believe that, in information security, for the most part, the problem lies not with the Silicon, but with the Carbon.  And most people aren’t trying to be evil, reckless or negligent.  They’re trying to do their jobs.