The U.S. House Energy and Commerce Committee held hearings on November 16, 2016, on the security of the Internet of Things in the wake of the massive distributed denial of service (DDoS) attacks perpetrated against certain DNS servers via a host of unsecured IoT devices.  So, what’s a government to do?

That is, not what SHOULD government do, but what CAN it do?

In the area of security, governments can (1) conduct basic security research; (2) establish and impose security standards on an industry; (3) adjust or manipulate the private tort or liability system to impose or remove liability for such devices; (4) mandate certain security (and privacy) warranties for IoT devices; (5) support security research by others; (6) regulate (compel industry sectors to meet certain requirements); (7) reward (provide incentives for those who do meet certain standards); (8) use the power of the purse (purchase only those devices that meet certain standards); (9) develop international consensus on standards and regulation; (10) remove institutional and other barriers to IoT security; (11) subsidize both IoT security research and the production of secure devices; (12) publish and disseminate IoT security “best” or “good” practices; and (13) provide patent, copyright, or other legal or monopolistic protection to incentivize the creation of new technologies (though this might prove counterproductive).

Those are just the ones I can think of off the top of my head.

Then there’s the point where theory meets Congress.  If “pro” is the opposite of “con,” then the opposite of “progress” may be “Congress,” right?  And here’s where politics comes in – you know politics, right?  From the root “poli,” meaning many, and “ticks,” meaning blood-sucking creatures.

The problem with trying to tackle a technical problem with legislation is, of course, that it’s not just a technical problem, and there are limits to what a government can – or should – do.  IoT security problems arise not only from a technical inability to secure a particular device, but from a fundamental lack of understanding of what the device itself does.  For example, researchers at the University of Illinois demonstrated that they could hack a Fitbit or similar wearable and use it as a keylogger.  A second problem is cost – both price and opportunity cost.  Security generally requires things like access control, authentication, prevention of data leakage, and encryption of data at rest and in transit.  These things add cost, and potentially weight, shortened battery life, and altered functionality.  In a “rush to market” strategy, anything that doesn’t immediately sell the product is jetsam.  Any security solution that does not at least work with market forces will face an uphill battle.
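To make that cost point concrete, here is a minimal sketch – in Python, using only the standard library – of what “encryption of data in transit” looks like for a hypothetical sensor phoning home.  The host name, port, and payload format are made up for illustration; the point is that even this bare-bones version drags in a certificate store, a TLS handshake, and extra CPU, RAM, and battery drain that a plain-text socket would not need.

```python
# A minimal sketch of "encryption in transit" for a hypothetical IoT sensor,
# using only the Python standard library.  The endpoint, port, and payload
# fields below are illustrative placeholders, not a real service.
import json
import socket
import ssl

SENSOR_ENDPOINT = "telemetry.example.com"  # hypothetical ingestion host
SENSOR_PORT = 8883                         # TLS port commonly used for IoT telemetry


def send_reading(temperature_c: float) -> None:
    """Send one sensor reading over a TLS-protected TCP connection."""
    payload = json.dumps({"device_id": "sensor-01", "temp_c": temperature_c}).encode()

    # This is where the real-world cost shows up: the device must carry
    # trusted CA certificates, perform a handshake (CPU and battery), and
    # keep extra buffers in RAM for the TLS record layer.
    context = ssl.create_default_context()

    with socket.create_connection((SENSOR_ENDPOINT, SENSOR_PORT), timeout=10) as raw_sock:
        with context.wrap_socket(raw_sock, server_hostname=SENSOR_ENDPOINT) as tls_sock:
            tls_sock.sendall(payload)


if __name__ == "__main__":
    send_reading(21.5)
```

Roughly a dozen extra lines of code – but on a $5 device built in a rush to market, the certificates, handshake time, and battery drain are exactly the kind of “cost” that gets thrown overboard first.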

At the hearings, Kevin Fu of the University of Michigan suggested that governments could establish security milestones, support NIST and NSF research, set up independent cybersecurity testing labs for IoT devices, and further fund and support academic and industrial research into IoT security (maybe at the University of Michigan?).  All good things, but will the government do them?

Not clear.  Rep. Michael Burgess noted, “Government is never going to have the manpower or resources to address all of these [cybersecurity] challenges as they come up—which is why we need industry to take the lead.”  Rep. Greg Walden went further, asking, “The knee-jerk reaction might be to regulate the Internet of Things, and while I am not taking that off the table, the question is whether we need a more holistic solution. The United States can’t regulate the world. Standards applied to American-designed, American-manufactured, or American-sold devices won’t capture the millions of devices purchased by the billions of people around the world.”  So, a more “hands off,” “you guys fix it” solution?  Not so sure.

Much of this problem comes from the notion that security is a “cost” and an impediment to functionality – something the government forces people to do that they otherwise wouldn’t.  On that analysis, we have, at any given time, precisely the amount of security the marketplace demands.  The marketplace didn’t “demand” seat belts in my mom’s 1961 AMC Rambler, either, as we slid across the Harlem River Drive into a steel pillar under what was once the Polo Grounds.  Fortunately, we exceeded what the marketplace demanded and had seat belts (air bags were years off).

Marketplaces can be changed – through a combination of education, awareness, technology, regulation (and the threat of regulation), rewards, and simply making security an integral part of functionality.  And if we lead, others will follow.  And if they don’t, at least WE will have more secure devices – for a while.  And since these devices will be just about everywhere – from biomedical applications to transportation, communication, and home appliances, to, well, everything – we really do need to get it right.  Right now.

As Bruce Schneier noted in his prepared testimony:

If the United States and perhaps a few other major markets implement strong Internet-security regulations on IoT devices, manufacturers will be forced to upgrade their security if they want to sell to those markets. And any improvements they make in their software will be available in their products wherever they are sold, simply because it makes no sense to maintain two different versions of the software. This is truly an area where the actions of a few countries can drive worldwide change.

Regardless of what you think about regulation vs. market solutions, I believe there is no choice. Governments will get involved in the IoT, because the risks are too great and the stakes are too high. Computers are now able to affect our world in a direct and physical manner.

And that’s democracy.  And we know the origin of that word, right?  “Demos,” for people, and “crazy,” for insane.  But to not have security on the IoT?  Now that’s crazy!
