How would you feel if your email provider were routinely reading your emails, looking for illegal content, and then turning over the contents of your email to law enforcement agencies anywhere in the world?

Well, guess what. They are, right now. Only to a somewhat limited extent so far, but enough that the U.S. government has argued that people have NO expectation of privacy in the contents of their emails, and therefore that the government can read them without a search warrant.

The case of alleged Houston pedophile John Henry Skillern should be a wake-up call. It has recently been reported that Skillern, a registered sex offender, was arrested after Google trolled through his Gmail and found images of child pornography.

Google turned this information over to the National Center for Missing and Exploited Children, which in turn alerted law enforcement agents, who arrested Skillern. Score one for catching predators and for killing the Internet.

Everybody knows that Google “scans” the contents of emails. In fact, that’s a key part of its business model. It uses an automated process to look for keywords in emails and then delivers targeted ads to the customer based upon the contents of the email.

So if you send or receive an email with the term “Lamborghini,” you might find yourself receiving ads at the bottom of the email for high-end sports cars.  Moreover, Google can use the aggregate of your email to create a profile of you, which is useful to Google and to its advertisers.
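To make the mechanics concrete, here is a minimal sketch of that kind of keyword-to-ad matching, in Python. The keyword list, the ads, and the function name are all invented for illustration; this is not Google’s actual system, which is far more sophisticated.

# Toy illustration of keyword-based ad targeting (hypothetical keywords and ads).
AD_INVENTORY = {
    "lamborghini": "Ad: exotic sports car dealers near you",
    "fine wine": "Ad: join a premium wine club",
}

def pick_ads(email_body):
    """Return the ads whose trigger keyword appears in the email text."""
    text = email_body.lower()
    return [ad for keyword, ad in AD_INVENTORY.items() if keyword in text]

print(pick_ads("Thinking of trading the minivan for a Lamborghini this spring."))
# ['Ad: exotic sports car dealers near you']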

Based on the contents of your email (and your search history and other information Google gathers), the company determines that you are a single, risk-taking, sports-car-driving, upwardly mobile city dweller with high disposable income and a penchant for fine wines.

It’s a compromise. The user gets free email from Google, but doesn’t actually reveal the contents to anyone. Google gets to charge Porsche a premium to target ads to you (and not to the suburban lacrosse mom in the market for a Chrysler minivan). Ads are delivered, but privacy is (somewhat) protected.

There are flaws in this theoretically privacy-enhancing process. While the contents of Gmail are theoretically protected from “human” disclosure, the ability to aggregate the results of the algorithm means that an awful lot of intimate personal information is being collected from the contents of your email.

A useful experiment might be to add the words “lung cancer” to every outgoing email for a few years and see what happens to your health insurance rates. The other problem is that, while the contents of emails are protected from disclosure without legal process (more on that below), the metadata about those emails is not.

That metadata belongs to Google or its advertisers to do with as they want, and is subject to disclosure by Google to the cops, or to subpoena by the cops, under less exacting standards than those required by the Stored Communications Act, which provides:

(1) a person or entity providing an electronic communication service to the public shall not knowingly divulge to any person or entity the contents of a communication while in electronic storage by that service; and

(2) a person or entity providing remote computing service to the public shall not knowingly divulge to any person or entity the contents of any communication, which is carried or maintained on that service—

(A) on behalf of, and received by means of electronic transmission from (or created by means of computer processing of communications received by means of electronic transmission from), a subscriber or customer of such service;

(B) solely for the purpose of providing storage or computer processing services to such subscriber or customer, if the provider is not authorized to access the contents of any such communications for purposes of providing any services other than storage or computer processing; and

(3) a provider of remote computing service or electronic communication service to the public shall not knowingly divulge a record or other information pertaining to a subscriber to or customer of such service (not including the contents of communications covered by paragraph (1) or (2)) to any governmental entity.

But the Stored Communications Act applies only to the “contents” of stored communications. Given enough metadata, however, I can determine or infer the contents of the communications.

Knowing what ads were delivered to you, when, and in response to which emails allows me to infer that you had certain terms in your email.

For example, suppose a Tea Party group organized a mass “write-in” campaign to Members of Congress using a form letter, and the members who sent it then received ads based on the content of that form letter. A big-data analysis would allow Google to infer that the contents of the letters were the same (based also on file size, source and destination) and what those contents were.
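Here is a toy sketch, in Python, of how that kind of metadata-only inference might work. The message records, field names and size bucketing are all made up for illustration; nothing here reflects how Google actually stores or analyzes this data.

# Cluster messages by metadata alone: recipient, approximate size, and which ads fired.
from collections import defaultdict

messages = [
    {"sender": "alice@example.com", "to": "rep.smith@mail.house.gov", "size": 14202, "ads": ("tax reform",)},
    {"sender": "bob@example.com", "to": "rep.smith@mail.house.gov", "size": 14198, "ads": ("tax reform",)},
    {"sender": "carol@example.com", "to": "rep.jones@mail.house.gov", "size": 8911, "ads": ("wine club",)},
]

def metadata_key(msg, bucket=1000):
    # Round the size into coarse buckets so near-identical messages land together.
    return (msg["to"], msg["size"] // bucket, msg["ads"])

clusters = defaultdict(list)
for msg in messages:
    clusters[metadata_key(msg)].append(msg["sender"])

# A cluster with many senders is probably the same form letter, even though
# no human ever read the body of any message.
for key, senders in clusters.items():
    if len(senders) > 1:
        print("Probable form letter:", key, "from", senders)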

It’s like Johnny Carson’s Carnac the Magnificent. No need to open the envelope – just hold it up to your temple! So even without seeing the contents, I can know the contents.

But what Google did in the Skillern case, and probably does all the time, is much worse. Google was examining the contents of Skillern’s emails (and attachments) for the sole and exclusive purpose of determining whether or not he was committing a crime, so that it could turn him in to the police. They were surveilling him.

They are doing it to you if you have a Gmail account, or ever send an email to someone who does. Oh, and don’t think you can avoid having the contents of your emails read for illegal stuff by moving to another provider. Everyone is doing it.

But it’s an invasion of privacy. And it’s probably illegal, and probably a deceptive trade practice as well. And fraud. But why should we care – it’s only creepy pedophiles, right?

To Scan and Protect

What Google and other providers do is routinely scan the contents of emails and attachments for the MD5 hashes of “known” child pornography. Every time someone is found with kiddie porn, a copy of that kiddie porn is sent to the National Center for Missing and Exploited Children (a private, non-governmental organization) to build a database of that porn.

The porn is analyzed, and a hash of that porn is created. This database of hashes is available to certain entities (but not all), like law enforcement agencies, ISPs and email providers.

By comparing the hashes of files on a seized computer, on a website, or being transmitted from one person to another by email, FTP or otherwise against the hashes of known child porn, the ISP or provider can see whether any of the files match – a “hit” for kiddie porn.

That information can then be shared with law enforcement agencies that presumably then get a warrant for the contents of the suspected offender’s account. Just like what happened in Skillern’s case.
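A minimal sketch of that hash-matching step, in Python, might look like the following. The hash list here is a placeholder (it contains only the MD5 of an empty file, so the demo produces a match); real hash lists are distributed by NCMEC to participating providers, and this is not a description of Google’s actual implementation.

import hashlib

# Hypothetical hash list; the single entry is just the MD5 of an empty file,
# included so that the demo below produces a match.
KNOWN_BAD_MD5 = {"d41d8cd98f00b204e9800998ecf8427e"}

def attachment_is_flagged(attachment_bytes):
    """True if the attachment's MD5 matches a hash on the known-contraband list."""
    return hashlib.md5(attachment_bytes).hexdigest() in KNOWN_BAD_MD5

# The provider never renders or "views" the image; it only compares fingerprints.
print(attachment_is_flagged(b""))        # True  (matches the demo hash above)
print(attachment_is_flagged(b"hello"))   # False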

Think of it as a giant “dog sniff” for kiddie porn, except the sniffing is done by a packet sniffer rather than a dog. The Supreme Court has held that it’s OK to have a dog sniff a car for drugs during a lawful traffic stop, noting:

The legitimate expectation that information about perfectly lawful activity will remain private is categorically distinguishable from respondent’s hopes or expectations concerning the non-detection of contraband in the trunk of his car. A dog sniff conducted during a concededly lawful traffic stop that reveals no information other than the location of a substance that no individual has any right to possess does not violate the Fourth Amendment.

Child porn, like drugs, is contraband. People have no right to possess it (mostly). It’s unlawful activity. Therefore, when Google on its own, or on behalf of the government, or presumably the government itself, scans for this contraband that people have no right to possess, no harm, no foul, right?

Not so fast, kemosabe.

Make no mistake about it. Google (and other ISPs and email providers) is reading the contents of your communications. Generally, Google has no more “right” to read your emails than does your average Estonian hacker.

Not only does the Stored Communications Act create a general right to privacy in the contents of email, but the federal wiretap law, 18 USC 2511, also makes it illegal to “intercept” the contents of communications in most circumstances. There are three exceptions. The first is if there is a warrant, subpoena or other lawful court order. When Google was “sniffing” Skillern’s porn, no warrant had yet been issued.

Second, there is the so-called “provider” exception to the wiretap law.  The statute provides:

It shall not be unlawful under this chapter for an operator of a switchboard, or an officer, employee, or agent of a provider of wire or electronic communication service, whose facilities are used in the transmission of a wire or electronic communication, to intercept, disclose, or use that communication in the normal course of his employment while engaged in any activity which is a necessary incident to the rendition of his service or to the protection of the rights or property of the provider of that service, except that a provider of wire communication service to the public shall not utilize service observing or random monitoring except for mechanical or service quality control checks.

There are a few problems with Google (or others) relying on the provider exception as grounds for scanning (um, reading) the contents of Skillern’s emails.

First, this was not random quality monitoring – it was a deliberate examination for specific files, unrelated to ensuring that the files were transferred. Second, the scanning was not “a necessary incident to the rendition of his service.” Google can transmit files without scanning their content. Third, the only way the scanning is necessary for the “protection of the rights or property” of Google is if Google were to be held liable for the contents of people’s emails – which it is not.

The law makes it a crime to “knowingly” possess or transfer child porn. In fact, by scanning the emails and “learning” their contents, Google becomes criminally liable when it stores or transmits the kiddie porn! The scanning creates liability rather than reducing it.

Duty to Scan/Report

Contrary to popular belief, Google is NOT required to do this. ISPs and email providers are not required to scan or look for child porn.

At least 10 states–Arkansas, California, Illinois, Missouri, North Carolina, Oklahoma, Oregon, South Carolina, South Dakota and Texas–have enacted laws requiring computer technicians or information technology workers to report child pornography if they encounter it in the scope of their work.

In addition, Michigan law provides confidentiality and immunity from civil liability for computer technicians who report child pornography encountered in the scope of their work.

The laws don’t require technicians or service providers to search for the illegal material, only to report it if they find it. Similar laws in some states apply to film developers who encounter child pornography on the job. But these laws don’t require the Geek Squad to actually scan your computer for kiddie porn while diagnosing a USB drive malfunction. What they say is that, if during the course of work you FIND kiddie porn, you have to report it.

The laws also don’t address what should happen if the computer technician is hired by counsel for the purpose of evaluating a person’s (or company’s) liability. Suppose a technician hired by counsel to conduct e-discovery (and therefore enjoying a privileged relationship) finds child porn on a client’s computer.

The technician (and possibly the lawyer who retained the technician) is now required to turn in the client, even though the knowledge of the alleged crime occurred solely as a result of the privileged relationship.

The federal laws regarding the duty of ISPs and email providers are similar. They do NOT require (or even suggest) that companies scan for or look for kiddie porn. For example, one law requires reporting of child sexual abuse on federal facilities or by federal workers with actual knowledge of the abuse (including child porn).

For ISPs and email providers, the law, 18 USC 2258A, says that if the ISP or email provider has actual knowledge of kiddie porn, it must disclose that to the cops. The statute says:

(a) Duty To Report. —

(1) In general.— Whoever, while engaged in providing an electronic communication service or a remote computing service to the public through a facility or means of interstate or foreign commerce, obtains actual knowledge of any facts or circumstances described in paragraph (2) shall, as soon as reasonably possible—

(A) provide to the CyberTipline of the National Center for Missing and Exploited Children, or any successor to the CyberTipline operated by such center, the mailing address, telephone number, facsimile number, electronic mail address of, and individual point of contact for, such electronic communication service provider or remote computing service provider; and

(B) make a report of such facts or circumstances to the CyberTipline, or any successor to the CyberTipline operated by such center.

Putting the provider exception and the duty to report together means that if Google comes across a child porn file during the “ordinary course of business” while protecting its rights and property, it must turn that over to the cops, but it has absolutely no right to routinely scan the contents of emails to LOOK for child porn.

There’s a big difference between the “ordinary course of business” related to the providing of services and “routinely” doing something.  It’s like “cruel and unusual” punishment.  Routinely executing prisoners by dipping them in hot oil is no less “unusual” simply because it becomes routine.

Put simply, scanning files for kiddie porn (as distinguished from scanning for spam or viruses) might be a great idea and a great public service, but it has nothing to do with the providing of communications services.

Consent

There is a third reason that Google can scan Skillern’s Gmail for kiddie porn.  He (like everyone else) consented to it.  Hell, everyone knows that Google “reads” your email.  How could Skillern possibly have any “reasonable expectation of privacy” in the contents of his email?  Right?

That’s where the danger lies.

If, by using a commercial email provider with Terms of Service or Terms of Use which permit the provider to “read” the contents of email, the user abandons privacy expectations, then we have effectively gutted the Stored Communications Act, the Fourth Amendment to the Constitution, and all privacy rights.

In fact, in a case called United States v. Warshak, the Government argued that the routine scanning of emails by entities like Yahoo for viruses or spam meant that users no longer had a “reasonable expectation of privacy” in the contents of their emails. No privacy means that the government doesn’t need a warrant to get the contents. Got it? Because your ISP scans your email, the government needs no warrant to read it.

So, did Skillern really consent to having Google scan the contents of his emails and turn the result over to the cops?  I think not.

Google’s Terms of Service provides:

Your Content in our Services

Our automated systems analyze your content (including emails) to provide you personally relevant product features, such as customized search results, tailored advertising, and spam and malware detection. This analysis occurs as the content is sent, received, and when it is stored.

Hmm. I don’t think that having the cops kick in the door with an arrest warrant constitutes a “personally relevant product feature…”  The way I read Google’s Terms of Service is that Google’s robots will scan my email and deliver up ads to me.

NOT that Google will (directly or indirectly) read my email and disclose the contents to anyone else – including advertisers, third parties, or the cops.  In fact, Google’s disclosure policy states:

We will share personal information with companies, organizations or individuals outside of Google if we have a good-faith belief that access, use, preservation or disclosure of the information is reasonably necessary to:

– meet any applicable law, regulation, legal process or enforceable governmental request.

– enforce applicable Terms of Service, including investigation of potential violations.

– detect, prevent, or otherwise address fraud, security or technical issues.

– protect against harm to the rights, property or safety of Google, our users or the public as required or permitted by law.

Note that Google’s unified privacy policy relates not only to email, but to everything stored on or transmitted through any Google product or service. Thus, the policy relates to files stored on Google Drive, Google Cloud, and so on.

There is a huge difference between having ads delivered to me based on the contents of my files and having the cops kick in my door based on the contents of my files.

It’s not the case that Google is scanning for MD5 hashes of child porn “to provide personally relevant product features…”  They are scanning to look for pedophiles and to turn them in.  A laudable goal.  But not one disclosed in their terms of service and privacy policy.

This is the camel’s nose in the tent. It’s easy not to have sympathy for people like John Henry Skillern, a convicted pedophile and registered sex offender. Oh, and while we are using automated algorithms to look for kiddie porn, let’s also use them to scan the contents of documents and emails for information about terrorism, drug dealing, organized crime, tax evasion, and weapons trafficking.

The same algorithm that can find that I am interested in the new iPhone 6 can also find that I am interested in Ecstasy or Uzis. Google can then “tip off” the cops to subpoena my documents, and I have “consented” to Google’s search.

Now Google can fix this “problem.”  They can change their terms of service to say, “Google routinely scans the contents of your documents or emails to look for evidence that you are committing a crime.

If we think that you are, we will turn this information over to the police, the secret intelligence service, or a paramilitary organization in the country in which you reside.  Thank you for using Google.”  We will see how that works out for them.
