When NSA security researchers learned that the method Microsoft Windows 10 machines use to validate digitally signed code (such as the code used to install patches) had a vulnerability that would have allowed the Agency to slip in malware, they had to debate the best way to protect the nation. On the one hand, they could have kept the vulnerability secret and used it to infiltrate and compromise the computers and networks of our adversaries: terrorist organizations, criminal enterprises, nation-states, and others. The NSA could have used the vulnerability to further its mission, to prevent terrorist attacks on U.S. soil, and possibly to learn of other nations’ nuclear ambitions or efforts. By keeping the vulnerability (and exploit) secret, however, the Agency would expose American citizens and companies (and everyone else) to the possibility that those same criminals, terrorists, and foreign nations would exploit the vulnerability themselves and cause harm.
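The flaw class at issue can be sketched in miniature. The Windows bug, at bottom, involved trusting cryptographic parameters that arrived with the very certificate being verified. The toy code below (textbook RSA with tiny made-up numbers and a stand-in "digest" — not real cryptography, and not Microsoft's actual CryptoAPI logic) shows why a verifier must pin trusted key material rather than accept key parameters shipped alongside the package it is checking.

```python
# Toy sketch only: textbook RSA with tiny, made-up numbers. It shows the
# general flaw class -- a verifier that trusts key parameters delivered
# with the package itself can be handed an attacker's own key.

def toy_digest(data: bytes, n: int) -> int:
    # Stand-in for a real cryptographic hash, reduced into the toy modulus.
    return sum(data) % n

# Vendor's key pair, pinned in the verifier (n = 61 * 53).
VENDOR_N, VENDOR_E = 3233, 17
VENDOR_D = 2753  # private exponent: 17 * 2753 % 3120 == 1

def sign(data: bytes, d: int, n: int) -> int:
    return pow(toy_digest(data, n), d, n)

def verify_pinned(data: bytes, sig: int) -> bool:
    # Correct shape: only the pinned vendor key is ever trusted.
    return pow(sig, VENDOR_E, VENDOR_N) == toy_digest(data, VENDOR_N)

def verify_naive(data: bytes, sig: int, e: int, n: int) -> bool:
    # Flawed shape: (e, n) arrive with the package, so an attacker
    # simply supplies a key pair they control.
    return pow(sig, e, n) == toy_digest(data, n)

patch = b"legitimate update"
good_sig = sign(patch, VENDOR_D, VENDOR_N)

# The attacker signs malware with a key pair of their own (n = 67 * 71).
ATK_N, ATK_E, ATK_D = 4757, 17, 3533  # 17 * 3533 % 4620 == 1
malware = b"malicious update"
forged_sig = sign(malware, ATK_D, ATK_N)

print(verify_pinned(patch, good_sig))                   # True
print(verify_naive(malware, forged_sig, ATK_E, ATK_N))  # True: forgery accepted

# The pinned verifier, by contrast, checks the forgery against the
# vendor's key and (barring a fluke collision in this toy) rejects it.
print(verify_pinned(malware, forged_sig))
```

The actual Windows flaw involved elliptic-curve certificate parameters rather than RSA, but the pattern is the same: verification material chosen by the thing being verified is no verification at all.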

At the same time, the Department of Justice, in the course of investigating an attack by a Saudi national at Naval Air Station Pensacola in Florida, derided Apple’s refusal to deliberately engineer its billions of devices so that access to them was no longer secure. The DOJ’s aim was to get access to whatever files were on the attacker’s phone that Apple had not already provided when it turned over the contents of the attacker’s iCloud and other accounts. Note that there is no current indication that the files on the attacker’s iPhone could be used to prevent a future attack, or that they exist only on the phone. Every e-mail, text message, SMS, or social networking post (indeed, every communication to or from that device) could conceivably be retrieved from the carrier or platform that transmitted or received it. But the DOJ believes the nation would be safer if devices like the iPhone were insecure.

Of course, that’s not how they would put it. DOJ would argue that they are merely requesting “technical assistance” from the phone manufacturer in enforcing the provisions of a search warrant issued by a neutral and detached magistrate after a finding of probable cause, and that no person or technology should be permitted to evade the legal responsibility of complying with a lawful court order.

At the end of the day, however, technology is either secure or it’s not. The goal of security is to let the “good” guys in (or at least the “authorized” ones) and keep the “bad” (“unauthorized”) guys out. The FBI argues that a technology should be developed that essentially says that law enforcement agents with a warrant (and maybe with a subpoena, and maybe with an administrative subpoena, and maybe with a different court order, and maybe with just a showing of need and exigency, and maybe with the consent of one party, and maybe when the record would inevitably be discovered, and maybe when the file is in “plain view,” and maybe when the file is subject to regulatory review… you get the idea) should be included on the list of “authorized” guys, and that the technology should be designed, from the ground up, to give them access to files, documents, data, or communications. Apple argues that technology doesn’t work that way. Creating a “master key” (one ring to rule them all) inevitably corrupts the wearer of the ring, and runs the risk not only that the master key will fall into the wrong hands (my precious), but that it can be duplicated, reproduced, spoofed, or evaded. In fact, some of the very technologies at the core of the Windows 10 vulnerability NOT exploited by the NSA (signature spoofing) could have been used by malicious actors to digitally sign or impersonate the holder of a “master key” in the Apple case. Not the same vulnerability, but a similar issue.

So much of our economy depends on the security of our electronic infrastructure. The NSA was willing to forgo the chance to conduct both offensive cyber attacks and surveillance-related exploits, which arguably would have made us safer, in favor of disclosing the vulnerability to Microsoft, which released a patch that undoubtedly DID make us safer. The FBI, on the other hand, wants to make all phones less safe so that it can investigate a handful of crimes (albeit serious ones). In the words of Spock, son of Sarek: “The needs of the many outweigh the needs of the few.”

Mark Rasch is an attorney and the author of articles on computer security, Internet law, and electronic privacy. He created the Computer Crime Unit at the United States Department of Justice, where he led efforts to investigate and prosecute cyber, high-technology, and white-collar crime.