On 60 Minutes last night, Lesley Stahl was shocked, shocked to see that modern automobiles collect vast amounts of information about their owners and drivers, have myriad sensors that gather that information and allow remote access to it, and have little if any security either for the sensors or for the data they collect.
Also shocking is the fact that there is almost no law or regulation about who owns this data, who can access it, and for what purposes. The same people were shocked in 2013, 2011, and 1999 when the same things were revealed.
In fact, people are voluntarily placing devices in their cars to allow insurers to monitor their movements, car lenders are putting devices in cars to track borrowers, and rental car companies are tracking renters.
That’s in addition to the license plate readers, E-ZPass scanners, and other devices tracking drivers’ movements, and the move, prompted by lower gas prices and higher fuel efficiency as well as hybrid and electric cars, to impose commuter taxes based not on gasoline usage but on distance travelled. To collect those taxes, the government has to know where you are going.
The 60 Minutes outrage was fueled by a report released by Massachusetts Democratic Senator Ed Markey. The report, called “Tracking & Hacking,” shows that modern cars generate an awful lot of data, much of which can be remotely accessed (by whom is anyone’s guess), and much of which has little if any security on it.
So what did the report find? The results should not surprise most security professionals:
1. Nearly 100% of cars on the market include wireless technologies that could pose vulnerabilities to hacking or privacy intrusions.
2. Most automobile manufacturers were unaware of or unable to report on past hacking incidents.
3. Security measures to prevent remote access to vehicle electronics are inconsistent and haphazard across all automobile manufacturers, and many manufacturers did not seem to understand the questions posed by Senator Markey.
4. Only two automobile manufacturers were able to describe any capabilities to diagnose or meaningfully respond to an infiltration in real-time, and most say they rely on technologies that cannot be used for this purpose at all.
5. Automobile manufacturers collect large amounts of data on driving history and vehicle performance.
6. A majority of automakers offer technologies that collect and wirelessly transmit driving history data to data centers, including third-party data centers, and most do not describe effective means to secure the data.
7. Manufacturers use personal vehicle data in various ways, often vaguely to “improve the customer experience” and usually involving third parties, and retention policies – how long they store information about drivers – vary considerably among manufacturers.
8. Customers are often not explicitly made aware of data collection and, when they are, they often cannot opt out without disabling valuable features, such as navigation.
The car report, coupled with another recent report that Samsung “smart TVs” are smart enough to collect what you are saying in front of them, illustrates one of the biggest problems with how we do both security and privacy in this country.
The Samsung TV apparently collected information about a viewer’s movements (well, irrespective of whether you were actually viewing), and its voice recognition feature captured the contents of a consumer’s conversations if they took place in the same room as the TV.
The “privacy policy” stated, “Please be aware that if your spoken words include personal or other sensitive information, that information will be among the data captured and transmitted to a third party through your use of Voice Recognition.”
If you were planning to turn off or unplug your Samsung Smart TV, I wouldn’t recommend going to a quiet place to discuss it. At least not if your TV can lip read.
The problem here is that there are no generally accepted “norms” of behavior. Sure, we can put whatever we want into a privacy policy. But what information collection is “acceptable” and for what purpose?
And should we expect people participating in the Internet of Things (and a car and a TV are, at least for now, still “things”) to read and understand privacy policies for their car, their TV, their phone, their watch, their shoes, their appliances, and even their clothes? The old paradigm – disclose and allow opt out – simply does not work for devices like this.
My car collects a lot of data. My engine needs to know if I am accelerating or braking, if I use regular or premium gas, if I am parking or just slowing down. It’s not a problem for it to measure these things and to collect and use this data to get me safely from point A to point B. That’s what I expect it to do. And I don’t mind if my Xbox is measuring my body movements while I am playing a game. I expect Siri to listen to me when I am talking to it.
But this data is available for many other purposes and to many other people. Remote-start and remote-control programs for my car can allow others to start it, stop it, and monitor information about it. They can track me and others. They can hack control systems that were never designed to survive such attacks. Lawyers, law enforcement, intelligence agencies, insurance companies and others can access the data these devices throw off. There are no rules.
And there need to be. When you get in an accident, can the cops access the “black box” on your car? Can they do it without a warrant? Can OnStar sell or give your location data to others? What about to ex-boyfriends or creditors?
Did I mention that there are no rules? So self-regulation hasn’t worked. We will have to rely on Congress for reasonable discourse. And if that doesn’t frighten you, nothing will.