A recent report indicates that police have subpoenaed records from Amazon, seeking the cloud-stored audio files from an Amazon Echo to help solve a murder case. This points to a disturbing trend in privacy.

It’s not that the government is surveilling us. It’s that we are surveilling ourselves, in newer and more intimate ways, and in ways that fail to take into account all of the unintended consequences of data collection and analytics. We deploy new products and technologies, new features and feature sets, new devices and applications, without any consideration of the privacy implications, intended or unintended, near term or long term, of the technology. By the time we actually consider these implications (and the security implications as well), it may be too late. That genie ain’t going back in the bottle.

That’s why both privacy and security must be integrated into the design of every technology that collects, stores, processes or transmits information. That’s why data has to have an intended use, and an expiration date. And finally, that’s why we need to decide who gets to decide these things.
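To make that concrete, here is a minimal sketch, in Python, of what tagging data with an intended use and an expiration date could look like. The CollectedRecord and DataStore names are hypothetical, invented for this illustration, not any vendor’s actual API:

    from dataclasses import dataclass
    from datetime import datetime, timedelta, timezone

    @dataclass(frozen=True)
    class CollectedRecord:
        # One piece of user data, tagged at collection time with the single
        # purpose it was collected for and a hard deletion deadline.
        payload: bytes
        purpose: str
        expires_at: datetime

    class DataStore:
        # A store that refuses off-purpose reads and purges expired records.
        def __init__(self) -> None:
            self._records: list[CollectedRecord] = []

        def collect(self, payload: bytes, purpose: str, ttl: timedelta) -> None:
            # Every record carries its expiration date from the moment it exists.
            deadline = datetime.now(timezone.utc) + ttl
            self._records.append(CollectedRecord(payload, purpose, deadline))

        def read(self, purpose: str) -> list[bytes]:
            # Data is usable only for the purpose it was collected for.
            self._purge_expired()
            return [r.payload for r in self._records if r.purpose == purpose]

        def _purge_expired(self) -> None:
            # Delete it when done.
            now = datetime.now(timezone.utc)
            self._records = [r for r in self._records if r.expires_at > now]

    store = DataStore()
    store.collect(b"voice clip", purpose="wake-word detection", ttl=timedelta(days=7))
    print(store.read("ad targeting"))         # [] (purpose binding blocks the read)
    print(store.read("wake-word detection"))  # [b'voice clip']

The engineering here is trivial; deciding the purposes and the deadlines, and who gets to decide them, is the hard part.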

The “Alexa murder” case is relatively straightforward. In the course of a murder investigation in Arkansas, the police determined that a suspect had “a bevy of ‘smart home’ devices, including a Nest thermostat, a Honeywell alarm system, a wireless weather monitoring system and an Amazon Echo.”

Just as the police might subpoena phone records, parking records, or computer records, the Arkansas police issued a subpoena to Amazon for any surreptitious recordings the Amazon Echo may have made that were relevant to the murder.

I have previously written about the dangers that such “always on” and “always listening” devices pose from a privacy and data security perspective. These devices create a risk of electronic discovery and inspection by police, lawyers, courts, judges or other third parties.

Is It Legal?

Amazon hasn’t indicated whether or not any such Echo files exist in the Arkansas case, and it has resisted complying with the government’s demand for the information. However, the general rule for evidence is that, absent some privilege, if the information exists, it must be produced. Amazon’s privacy policy notes that:

We release account and other personal information when we believe release is appropriate to comply with the law; enforce or apply our Conditions of Use and other agreements; or protect the rights, property, or safety of Amazon.com, our users, or others. This includes exchanging information with other companies and organizations for fraud protection and credit risk reduction.

So if there’s a valid subpoena for data – including the voice data – and the data is relevant and not privileged, it is subject to production.

But, of course, most people using an Amazon Echo or its doppelgängers don’t realize that the device is listening to everything they are saying and uploading that data to a third-party cloud server.

It’s not that they don’t KNOW it, it’s that they don’t REALIZE it. The risks and benefits of always listening are simply not incorporated into their purchasing or use decisions. When you buy a Fitbit, you don’t consider the fact that it can be used to capture keystrokes as you type. When you buy a Nest thermostat, you don’t consider the fact that it can be used to capture your location data at all times.

The Echo is just a convenient way to jot down reminders, play music, or settle bets. And there are more unintended consequences. Because the device is always on and always recording, a lawyer with one in an office or home (ahem) may be deemed to have waived attorney-client privilege by failing to protect communications and by “sharing” them with a third party (e.g., Amazon).
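To see why “always listening” matters, here is a toy sketch of the loop such a device might run. It is a deliberately simplified illustration with invented names (assistant_loop, WAKE_WORD), not Amazon’s actual implementation:

    from collections import deque

    WAKE_WORD = b"alexa"  # hypothetical wake token in a toy byte stream

    def assistant_loop(frames, upload, pre_roll=3):
        # The microphone never stops: every frame of audio enters a rolling
        # local buffer. When the wake word appears, the buffered audio around
        # it is shipped to a third-party cloud server. That cloud copy is
        # what a subpoena can reach.
        ring = deque(maxlen=pre_roll)
        for frame in frames:
            ring.append(frame)
            if WAKE_WORD in frame:
                upload(list(ring))

    # Example: a fake microphone stream and a fake cloud endpoint.
    captured = []
    assistant_loop(
        frames=[b"private chatter", b"hey alexa", b"remind me at noon"],
        upload=captured.extend,
    )
    print(captured)  # [b'private chatter', b'hey alexa']

Note the rolling buffer: in this toy model, audio captured before the wake word is uploaded along with it, because the device is listening the whole time, not just after you address it.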

Many years ago, Google offered an alternative to traditional telephone directory assistance services called Goog411.  By dialing 1-800-GOOG411, you could get a free directory assistance call.  Pizza in Pittsburgh?  Fine!

Google was, of course, collecting data about user preferences, locations, and even the effectiveness of advertising campaigns, and linking the voice data with Gmail and other services. OK, fine. But Google could also use the data to compile specific voiceprint identifications of individual users, background sounds, and other information. We give up a LOT of information when we don’t know we are giving it up.

And all of this information is subject to hacking, discovery or inspection.  It’s not just criminal investigations.  Civil attorneys, regulators, and anyone with a desire to find out can find a way to get access to the data.  If it exists, it will be used.  And used in ways other than expected or intended.

Nothing to Hide

But if you’re not doing anything wrong, what do you need to worry about? You have nothing to hide. After all, you didn’t murder anyone, right? Probably. But experience teaches that, once data is available, it gets used in all kinds of ways.

For example, when municipalities in the UK installed surveillance cameras in public spaces pursuant to the Regulation of Investigatory Powers Act (RIPA), they cited terrorism as the justification for the intrusion into privacy.

Armed with these powers of investigation, how did these municipalities use the new surveillance powers? They used them to monitor dog barking, pooper-scooper violators, and unruly pigeons. So we can imagine a time in the not-too-distant future (e.g., 2017) when Amazon Echo records are demanded for similarly trivial violations, and worse: to investigate crimes like unlawful cohabitation, or to collect information about protected political activities.

If it exists, it will be discovered.

So we need to implement privacy and security by design. Collect only what is needed. Use it only for the purposes for which it is collected. Secure it. And delete it when done. Simple in theory, difficult in practice. Next time you are watching TV, just remember: the TV is watching you too.
