The IRS, the Social Security Administration and other government agencies have a problem when dealing with the public. Scammers — often organized criminal groups or even state sponsors — impersonate genuine beneficiaries of entitlement programs (e.g., unemployment insurance, tax refunds or tax credits, Social Security retirement or disability benefits) and claim those benefits for themselves. The IRS has a publication devoted to helping people recover from identity theft involving tax returns. The Treasury Department’s Office of Inspector General (its internal police department and auditor) found in 2020 that (as of 2018) “the IRS estimates it prevented the issuance of between $6.03 billion and $6.08 billion in fraudulent tax refunds (referred to as protected revenue). However, the IRS also reported that identity thieves were still successful in receiving an estimated $90 million to $380 million in fraudulent tax refunds (referred to as unprotected revenue).”

That’s a lot of cheese.

In 2019, the FTC reported that the losses to the Social Security Administration from identity fraud exceeded those suffered by the IRS.

So curbing ID fraud – particularly with respect to online filing – can save taxpayers a lot of money, and can save beneficiaries a lot of hassle. Strong authentication is good, multi-factor authentication is better, and multi-factor authentication with a strong biometric is better still.
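To make the “multi-factor” part concrete, here is a minimal sketch of the kind of second factor an agency could layer on top of a password: a time-based one-time password (TOTP) per RFC 6238, using only the Python standard library. The secret and the expected value come from the RFC’s published test vector; everything else is illustrative, not any agency’s actual implementation.

```python
import base64
import hashlib
import hmac
import struct
import time


def totp(secret_b32, for_time=None, step=30, digits=6):
    """RFC 6238 time-based one-time password (HMAC-SHA1 variant)."""
    key = base64.b32decode(secret_b32, casefold=True)
    counter = int((for_time if for_time is not None else time.time()) // step)
    msg = struct.pack(">Q", counter)              # 8-byte big-endian counter
    digest = hmac.new(key, msg, hashlib.sha1).digest()
    offset = digest[-1] & 0x0F                    # dynamic truncation (RFC 4226)
    code = struct.unpack(">I", digest[offset:offset + 4])[0] & 0x7FFFFFFF
    return str(code % 10 ** digits).zfill(digits)


# RFC 6238 test vector: ASCII secret "12345678901234567890" at T=59 seconds
secret = base64.b32encode(b"12345678901234567890").decode()
print(totp(secret, for_time=59))  # 287082 (last 6 digits of the RFC's 94287082)
```

A code like this proves possession of a second factor (the enrolled secret) without requiring anyone to hand over a biometric at all.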

The IRS came up with a plan to allow a private company — ID.me — (as distinguished from the IRS’s own authenticator) to collect the data needed to authenticate users, including the biometric data that binds them to their credentials, and then to use those biometrically authenticated credentials to grant access to the government sites. In that way, the government agencies would not themselves collect the trove of biometric data, but would benefit from its collection.

On February 7, in response to outcry from the public, privacy advocates, and data security professionals, the IRS abandoned this plan, announcing that it would “transition away” from private biometric authentication. This was in direct response to concerns about the massive collection, storage and use of an online biometric database. The system not only collected biometric data; it also used a biometric challenge/response (requiring the person seeking to be authenticated to interact on video with an employee) to demonstrate identity.

Under the now-rejected proposal, if you wanted to file your taxes online, get Social Security benefits, or apply for a driver’s license, you would have to provide a biometric sample (facial, voice or other biometric) to a private company, which would then authenticate you (maybe) and provide you a token you could use (instead of a weak userid and password) to log into the IRS or SSA website. While it’s clear that simple single-factor authentication (userid and password) is insufficient for these financial transactions, those who are not comfortable providing a biometric online to a private company that promises it won’t sell it to anyone else might just have had to file their taxes with an envelope and a postage stamp.

Years ago, at the main post office in Washington, D.C. (on Capitol Hill across from Union Station) a game was played on April 15. Taxpayers would drive to the main post office (now a postal museum) and postal workers would be outside with big canvas carts on wheels. Motorists and pedestrians would toss their completed tax forms into the basket, secure in the knowledge that, even though it was 11:30 at night, the returns would be postmarked April 15, and they would have been filed timely. Then the taxpayers would stop at the Irish pub (the Dubliner) for a pint (or more) of Guinness to celebrate. More than a few filers would actually print their 1040 forms — or their checks — on their T-shirts, and toss those into the bin (literally giving the government the shirt off their back). It was a simpler — and stupider — time.

However, there’s a new ritual for those seeking to file their taxes, or interact with the IRS or the Social Security Administration. According to a recent article by Brian Krebs on his blog Krebs on Security, the proposal was to require users of the IRS or SSA websites to authenticate themselves through the commercial entity ID.me. As Krebs notes, “Some 27 states already use ID.me to screen for identity thieves applying for benefits in someone else’s name, and now the IRS is joining them. The service requires applicants to supply a great deal more information than typically requested for online verification schemes, such as scans of their driver’s license or other government-issued ID, copies of utility or insurance bills, and details about their mobile phone service. When an applicant doesn’t have one or more of the above — or if something about their application triggers potential fraud flags — ID.me may require a recorded, live video chat with the person applying for benefits.”

So I popped over to the ID.me site. The first thing it does is ask to serve me cookies. If you aren’t a Girl Scout or a DoubleTree by Hilton, I don’t want your cookies. ID.me’s Privacy Bill of Rights notes that:

You have the right to privacy. ID.me has built rigorous security and privacy requirements into our technology from inception. We are an ethical steward of your personal information and are committed to supporting your rights:

You are solely in control of your own data.

You must provide explicit consent before we will share any information.

You can see all authorized apps and data elements shared in your My Account portal.

You can revoke access to your data for any authorized app at any time.

You may destroy your credential and associated data at any time. (Some data related to NIST 800-63-3 credentials will be retained after account deletion solely for fraud prevention and government auditing purposes.)

Sounds good. But even at the outset, inconceivable. Or, more accurately, “I do not think that word means what you think it means.” Once I turn over my biometric and authentication data to ID.me, I am, almost by definition, NOT in control of my own data. If ID.me (or any of its technological or business partners) is hacked, subpoenaed, or served with a search warrant, I am NOT in control of my data. Sure, I can direct which entities I want them to give my credentials to (to authenticate me), but I certainly am not in CONTROL of the data that I transferred to them. And unless that data is stored on their servers in a manner that is both encrypted AND for which I — and I alone — hold the decryption key (assuming the key is strong enough), or the data is forensically wiped from their machines, I most certainly am NOT in control of the data.

The same is true of the statement that “You must provide explicit consent before we will share any information.” Have they never heard of a search warrant? A FISA warrant? A National Security Letter? A writ under the All Writs Act? Governments compel entities to produce information ALL THE TIME, and government agencies and courts routinely compel the entity to which the orders are directed NOT to tell its customers that the data has been sought or produced. Moreover, if a sophisticated hacker were to fake your identity to fraudulently obtain services or money that was yours, this policy seems to suggest that ID.me will not provide the information necessary to demonstrate that the person who faked your identity was not you, since ID.me has agreed not to share THEIR data without THEIR consent. So the thief’s data is protected from disclosure? I hope not.

Additionally, there are concerns about the type of authentication that ID.me does. Your iPhone or Android device has biometric authentication, but the biometric is stored and compared on the device itself. As far as has been reported, the biometric is never sent to Apple or Google. The device asks the question, “Are you John Smith?” and Mr. Smith provides a biometric which is scanned and compared to the one stored, encrypted, on the device to answer that question. If the answer is “yes,” then access to some protected credentials is unlocked. But Apple and Google could not provide a fingerprint or face analysis to the government if compelled, because they don’t have it.
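The on-device pattern can be sketched in a few lines. This is a toy model, not Apple’s or Google’s actual implementation: the “biometric” is a made-up embedding vector, the match threshold is arbitrary, and the “enclave” is just a Python object whose template never needs to leave the process. Only the unlocked credential (or a refusal) ever comes out.

```python
import math

MATCH_THRESHOLD = 0.9  # illustrative; real systems tune this empirically


def cosine_similarity(a, b):
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(x * x for x in b))
    return dot / (norm_a * norm_b)


class SecureEnclave:
    """Toy stand-in for on-device storage: the template stays local."""

    def __init__(self, enrolled_template, credential):
        self._template = enrolled_template  # never exported
        self._credential = credential       # released only on a successful match

    def verify(self, fresh_sample):
        # The 1:1 question: "is this the enrolled user?" -> credential or None
        if cosine_similarity(fresh_sample, self._template) >= MATCH_THRESHOLD:
            return self._credential
        return None


enclave = SecureEnclave([0.9, 0.1, 0.4], credential="unlock-token")
print(enclave.verify([0.88, 0.12, 0.41]))  # similar sample -> unlock-token
print(enclave.verify([0.1, 0.9, 0.2]))     # different person -> None
```

The design point the paragraph makes is visible here: there is nothing for a subpoena to reach at Apple or Google, because the template lives only inside the object on the device.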

The ID.me mechanism is different: the private company would collect and store the biometric, making it vulnerable not only to hacking and theft but to compelled production. Not good for privacy.

The other problem is the 1-1 vs. 1-many problem. A 1-1 authentication merely asks the question “Are you John Smith?” — yes, no or maybe (Magic 8-Ball says, ask again later). A 1-many facial recognition search says “here’s this unknown guy — who is that?” or “here’s John Smith — tell me every photo and surveillance video that has him in it.” A very different proposition from a privacy standpoint. While ID.me denied doing “1-many” facial recognition, the CEO of ID.me posted on his LinkedIn page that “ID.me uses a specific ‘1 to Many’ check on selfies tied to government programs targeted by organized crime to prevent prolific identity thieves and members of organized crime from stealing the identities of innocent victims en masse. This step is internal to ID.me and does not involve any external or government database.” It starts with organized crime, then moves to international terrorism, then domestic terrorism, then child molesters, then thieves, robbers, tax cheats, and other “enemies of the state.” It’s not that there aren’t appropriate uses for facial recognition technology — it’s that, once created, such a database becomes too much of what the law would call an “attractive nuisance.”
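The distinction is easy to see in code. Below is a hedged sketch (made-up embedding vectors, arbitrary similarity threshold, hypothetical names): verification does one comparison against the claimed identity’s template, while identification scans every enrolled template, which is exactly why the database itself is the privacy hazard.

```python
import math

THRESHOLD = 0.9  # illustrative only


def cosine(a, b):
    dot = sum(x * y for x, y in zip(a, b))
    return dot / (math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(x * x for x in b)))


def verify_1_to_1(probe, claimed_template):
    """Verification: 'Are you John Smith?' -> one comparison, yes/no."""
    return cosine(probe, claimed_template) >= THRESHOLD


def identify_1_to_many(probe, gallery):
    """Identification: 'Who is this?' -> scan every enrolled person."""
    return [name for name, tmpl in gallery.items() if cosine(probe, tmpl) >= THRESHOLD]


gallery = {
    "john.smith": [0.9, 0.1, 0.4],
    "jane.doe":   [0.1, 0.9, 0.2],
    "a.n.other":  [0.5, 0.5, 0.5],
}
probe = [0.88, 0.12, 0.41]
print(verify_1_to_1(probe, gallery["john.smith"]))  # True
print(identify_1_to_many(probe, gallery))           # ['john.smith']
```

The first function needs only the claimant’s own template; the second is only possible because someone assembled a searchable gallery of everyone.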

Third-Party ID Verification

Another problem with using the ID.me model for authentication is that many of the documents relied upon to establish identity — driver’s license, passport, etc. — were themselves generated using authentication credentials created or validated by — you guessed it — ID.me.

The idea of third-party ID verification, whereby you prove your identity to one party with strong ID verification and then obtain from them a credential that can be securely transmitted to authenticate you, is nothing new. When you prove to your state government that you have the skills necessary to operate a motor vehicle, you provide your local DMV with some evidence of identity (birth certificate, baptismal record, naturalization record) as well as some evidence of residence (lease, utility bill, etc.), and they create a biometric (your picture) and issue you a reasonably strong identification document (a driver’s license). The purpose of that ID was to show that it was YOU who was able to navigate a 1969 Dodge Dart through the crowded streets of Yonkers, New York in the summer of 1973 (actually, my first driver’s license was on unlaminated paper with no photo), but over time the driver’s license has morphed into a kind of universal ID. Now, if you want to vote, get into a bar, get a gun license (in states that require it) or get on a plane, you need to present the “Real ID.”
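The third-party model described above can be sketched as an issuer signing an assertion that a relying party later verifies without ever seeing the underlying documents. For brevity this sketch uses an HMAC shared between issuer and verifier; real federated-identity systems (SAML, OpenID Connect) use public-key signatures so the verifier never holds a signing secret. All names and claims here are hypothetical.

```python
import hashlib
import hmac
import json
import time

ISSUER_KEY = b"demo-shared-secret"  # stand-in; real systems sign with a private key


def issue_credential(subject, verified_claims):
    """Identity provider: after vetting documents once, sign an assertion."""
    payload = json.dumps(
        {"sub": subject, "claims": verified_claims, "iat": int(time.time())},
        sort_keys=True,
    )
    sig = hmac.new(ISSUER_KEY, payload.encode(), hashlib.sha256).hexdigest()
    return {"payload": payload, "sig": sig}


def relying_party_verify(credential):
    """Relying party (e.g. a website): check the signature, not the raw documents."""
    expected = hmac.new(ISSUER_KEY, credential["payload"].encode(),
                        hashlib.sha256).hexdigest()
    return hmac.compare_digest(expected, credential["sig"])


cred = issue_credential("jane.doe", {"over_21": True, "licensed_driver": True})
print(relying_party_verify(cred))  # True: untampered assertion
cred["payload"] = cred["payload"].replace("jane", "john")  # tampering
print(relying_party_verify(cred))  # False: signature no longer matches
```

Note what the relying party never receives: the birth certificate, the utility bill, the selfie. The trust question the article raises is entirely about what the issuer keeps.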

The ID.me model differs in a few ways. First, ID.me is a private company collecting massive amounts of identity information, with only a patchwork quilt of data privacy laws protecting that data from disclosure or use. Of course, there are few legal constraints on how your local DMV uses your driver data, which is routinely shared with law enforcement agencies and others. In fact, local DMVs used to sell this data to marketers and others until celebrities were stalked, and one was killed, through the use of DMV data. Second, because ID.me attempts to authenticate individuals remotely and digitally, it had to come up with a scheme to determine (to some degree) the identity of the individual. If you can fake a digital ID, you can effectively “be” that person for many different purposes. ID.me says that the process of authentication is “simple.” They note that “The user takes a photo of their identity document (driver’s license, passport, or state ID) and a quick selfie. ID.me uses advanced facial recognition to compare the picture of the applicant on the ID document to the selfie.”

Not so simple. First, we assume that the driver’s license used is actually valid. It’s trivial to get a fake driver’s license from China. So either ID.me is using the embedded security features of the license itself, or it has access to a database of driver’s license information to “validate” the license. If the latter, then what’s the point of “presenting” the driver’s license? If you have access to the picture taken at the time of issuance, use that as the token. Second, this process depends on the authenticity of the DMV records. Guess what? ID.me is the one that collects the authentication documents needed for the DMV. So it sits at both ends of the authentication transaction — providing the documentation needed to get the strong ID, and then relying on that strong ID to issue a credential. How does ID.me authenticate my birth certificate? How does it authenticate my water bill?

Identity management is tough. In fact, even a DNA sample would not necessarily distinguish me, a lawyer in Bethesda, Maryland, from, say, a doctor in Pearl River, New York. (Sometimes it helps to have an identical twin, amirite?) The idea of a massive database of biometrics is too much even for the IRS. And that’s saying a lot.


Mark Rasch is an attorney and author of computer security, Internet law, and electronic privacy-related articles. He created the Computer Crime Unit at the United States Department of Justice, where he led efforts aimed at investigating and prosecuting cyber, high-technology, and white-collar crime.