The New York Times recently reported that ride-sharing service Uber used a tactic, approved by its lawyers, called “Greyballing” to thwart efforts by law and code enforcement agencies to catch Uber drivers and the company operating in prohibited areas.

Without commenting on the legality of the particular service itself, the case raises the question of the extent to which private entities may use fraud, deception, traps, snares, big data and data surveillance (you know, the techniques used by cops) in furtherance of their own business objectives.

In information security terms: to what extent can companies use techniques like honeypots, honeynets, fake servers, pretexting, spoofing, and data analytics for their own purposes?

Greyballing

According to The New York Times, the technique used by Uber tried to identify the cops and enforcement officials who were after the company. Uber used geofencing in its apps (to see who was opening the app near local enforcement agency offices), coupled with identifying people who opened and closed the apps frequently (“eyeballing”), and searching for people who linked the apps to new “throwaway” phones, credit cards (particularly cheap credit cards linked to police unions), and e-mail accounts.
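The signals the Times describes amount to a risk-scoring heuristic. Here is a minimal sketch of what such a heuristic could look like; every field name, weight, and threshold below is invented for illustration, since Uber's actual Greyball code has never been published.

```python
# Hypothetical signal-scoring sketch of a Greyball-style heuristic.
# All field names, weights, and thresholds are invented for illustration.
from dataclasses import dataclass


@dataclass
class AccountSignals:
    opens_app_near_enforcement_office: bool = False  # geofence hit
    rapid_open_close_count: int = 0                  # "eyeballing" behavior
    card_linked_to_police_institution: bool = False  # cheap card, institutional link
    device_is_cheap_burner_model: bool = False       # throwaway phone
    social_profile_matches_official: bool = False    # public-profile cross-check


def greyball_score(sig: AccountSignals) -> int:
    """Sum weighted signals; a high score flags a suspected enforcement account."""
    score = 0
    if sig.opens_app_near_enforcement_office:
        score += 3
    if sig.rapid_open_close_count > 10:
        score += 2
    if sig.card_linked_to_police_institution:
        score += 3
    if sig.device_is_cheap_burner_model:
        score += 1
    if sig.social_profile_matches_official:
        score += 3
    return score


def should_greyball(sig: AccountSignals, threshold: int = 5) -> bool:
    """Flagged accounts get served the fake 'ghost cars' view instead of real data."""
    return greyball_score(sig) >= threshold
```

The same scoring pattern, with different signals, is exactly what fraud and abuse teams use every day, which is the point of the comparison.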

They would search the “public” portions of social media to see if the identities used by the enforcement officials matched or did not match public profiles. Funny thing, though. These are the same techniques companies and security programs within those companies use to identify fraud and violations of Terms of Service. Know your enemy, right?

Having identified possible “adversaries,” Uber reportedly seeded its apps with false data. If a suspected enforcement official tried to hail an Uber, the app would show cars converging on the location, but no car would be hailed, and no driver would show up (to be arrested and have their car impounded). Cat, meet mouse.

But these are some of the very same techniques frequently employed by information security professionals. We set up fake servers to lure in suspected fraudsters (in the Uber case, the cops are the “fraudsters”).

We seed P2P networks with fabricated data, including data that runs without the express authorization of the downloader. This may include pushing IP beacons, corrupted data, or – depending on how aggressive we are – data which may self-delete, or delete information on the computers to which it is downloaded. We browse hacker sites using spoofed IP addresses, make calls to suspected hackers using spoofed phone numbers, and create spoofed identities in spoofed locations, with spoofed credentials. We conduct massive data collection and analysis, often surreptitiously, to identify targets, and we link and analyze disparate databases in the name of threat profiling.
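The fake-server trick is easy to demonstrate in miniature. The sketch below is a toy honeypot, assuming nothing beyond the standard library: it listens on a port, logs who connects (the intelligence), and serves a decoy banner for a service that doesn't exist (the deception). The banner and port choice are invented; real deployments use purpose-built tools.

```python
# Toy honeypot sketch: log connecting addresses and serve a decoy banner.
# The FTP-style banner is invented for illustration.
import socket
import threading


def run_honeypot(host="127.0.0.1", port=0, banner=b"220 FTP ready\r\n",
                 max_conns=1, log=None):
    """Listen, record each connecting address, send a decoy banner, disconnect.

    Returns the bound port (port=0 lets the OS pick one) and the shared log list.
    """
    if log is None:
        log = []
    srv = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
    srv.bind((host, port))
    srv.listen()
    actual_port = srv.getsockname()[1]

    def serve():
        for _ in range(max_conns):
            conn, addr = srv.accept()
            log.append(addr)      # the intelligence: who poked at us
            conn.sendall(banner)  # the deception: a service that isn't there
            conn.close()
        srv.close()

    threading.Thread(target=serve, daemon=True).start()
    return actual_port, log
```

Anyone who connects is, by definition, probing something they were never invited to use, which is what makes the log valuable.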

We lie.  We cheat.  We steal.

All in the name of security.  And all in ways that (hopefully) have been approved by our lawyers. We do this in the name of catching hackers, child pornographers, copyright infringers, intellectual property thieves, foreign intelligence agents, DDoS’ers, pirates, thieves, and hacktivists. You know. Bad guys. We assemble databases of stolen credentials, stolen IDs, hacker handles, “bad” IP addresses, and (hopefully) share them to make the world a better place.

Goose/Gander

One of the problems with the law is that, in order to permit law enforcement agencies to do their jobs, we frequently tinker with what we consider to be private and what is public. When the police use ALPRs (Automated License Plate Readers) to catch speeders, stolen cars, or people with warrants, we say “that’s OK, license plates are not personal information, and you have no expectation of privacy in your license plate.”  Fine.

That also means that commercial entities can set up ALPRs on roadsides, at shopping malls, on bridges, in parking lots, or on roving vehicles, and sell a database (they do this) where you can put in anyone’s name and track their movements over time. If we use that database to catch bail jumpers, or to repossess cars, that’s one thing. But what if we use that same database to – in real time – track where every police car in the city is located? The motives and impact are different, but the privacy rights would be the same. Data is data. Like the honey badger, it doesn’t care how it is used.
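The "data is data" point is clearest when you model the database itself. The toy below, with an invented schema, is the entire core of a commercial plate-sightings database: timestamped reads, queryable by plate. Nothing in the data structure knows, or cares, whether the query comes from a repo agent or from someone mapping patrol cars.

```python
# Toy model of a commercial ALPR sightings database.
# The schema is invented; real vendor systems expose similar lookups.
from collections import defaultdict
from datetime import datetime


class PlateDB:
    def __init__(self):
        # plate -> list of (timestamp, location) sightings
        self._sightings = defaultdict(list)

    def record(self, plate: str, when: datetime, where: str) -> None:
        """Store one camera read; cameras don't know why they're recording."""
        self._sightings[plate].append((when, where))

    def movements(self, plate: str):
        """All sightings of one plate in time order: a travel history."""
        return sorted(self._sightings[plate])
```

The `movements` query is identical whether the plate belongs to a bail jumper or a squad car; only the caller's purpose differs, which is the article's point about purpose and motive.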

The same is true for facial recognition. The FBI is spending billions of dollars to create and implement a facial recognition database. The same technology could be used by bad guys. Just take pictures of everyone frequently going into and out of the police station or courthouse (in public view), run them through a facial recognition program, and voilà!

A database of undercover cops and informants. Add photos from the Massachusetts State Police Academy graduating class (Departed fans, anyone?) and you enhance the database.  This “public/private” thing really doesn’t work in binary: privacy rights (and infringements of them) are not all-or-nothing. Purpose, intent, and motive for data collection and use are important.

The same analogies apply to other forms of data collection. Stingrays and Dirtboxes (telephone cell tower replicators), GPS data collectors, apps that collect data, traffic cameras, drones, etc. are all effective tools for (and therefore against) law enforcement.

IT Security

It’s important to note that the Uber program was also created to enforce the ride-sharing service’s VTOS (“violation of terms of service”) program. The same tools and techniques were used to identify and block (or deny rides to) people who were likely to be violent or abusive, or to fraudulently fail to pay. It was also used to block competitors’ use of the service – cab companies or limousine services whose people would hail an Uber, and then beat the driver or destroy the car.

Which brings us back to IT security. When protecting our business interests, our Terms of Service or Terms of Use, our intellectual or other property, or the privacy and security of our customers’ data, we may use many techniques that are similar to those used by the Greyball program. Collect data.  Import data. Analyze data. Spoof data. It’s called “Security Through Deception.” We use so-called “continuous deception” and continuous detection to lure bad guys into safe spots where we can monitor them.
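One common building block of continuous deception is the honeytoken: plant decoy credentials that no legitimate user would ever submit, then treat any use of them as a high-confidence alert. The sketch below illustrates the idea; the usernames and return values are invented, and real password checking is out of scope.

```python
# Honeytoken tripwire sketch. The decoy usernames are invented for
# illustration; in practice they are accounts that were never issued.
HONEYTOKENS = {"svc_backup_old", "jdoe_legacy"}


def check_login(username: str, alerts: list) -> str:
    """Classify a login attempt against the honeytoken list.

    A honeytoken hit is almost certainly an attacker replaying planted
    or stolen credentials, so we deny access and raise an alert.
    """
    if username in HONEYTOKENS:
        alerts.append(f"ALERT: honeytoken account used: {username}")
        return "deny-and-alert"
    return "normal-auth-path"  # real credential verification would go here
```

Like Greyball, the trap works precisely because the target cannot tell decoy data from the real thing.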

So, is it legal? Magic 8 ball says…. Ask again later.  Did you really think you would get free legal advice?  Whether any individual technique is “legal” depends on a wide variety of factors – how the technique works, what it does, where it operates, and how much you pay your lawyer. You know, clear as mud. But recognize that the world is not black and white. It’s grey. Like Greyballing.
