In the debate over government "exceptional access" to encrypted communications, opponents with a technical bent (and that includes me) have said that it won’t work: any such scheme would inevitably lead to security problems. The response—from the policy side, not from technical folk—has been to assert that perhaps more effort would suffice. FBI Director James Comey has said, "But my reaction to that is: I’m not sure they’ve really tried." Hillary Clinton wants a "Manhattan-like project, something that would bring the government and the tech communities together". More effort won’t solve the problem, but this misunderstanding goes to the heart of why exceptional access is so hard.
The Manhattan Project had to solve one problem. It was a very hard problem, one they didn’t even know could be solved, but it was just one problem. Exceptional access is a separate problem for every app, every service, every device. Possibly, a few will get it right. More likely, they’ll fail even more abysmally than they have at ordinary encryption. Study ("Developers have botched encryption in seven out of eight Android apps and 80 percent of iOS apps") after study ("10,327 out of 11,748 applications that use cryptographic APIs—88% overall—make at least one mistake") after study ("root causes are not simply careless developers, but also limitations and issues of the current SSL development paradigm") after study ("We demonstrate that SSL certificate validation is completely broken in many security-critical applications and libraries.") has shown the same thing: programmers don’t get the crypto right—and these are primarily studies of apps that use standardized, well-understood protocols and APIs.
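To make the pattern concrete, here is a minimal Python sketch of the single most common mistake the SSL studies report: disabling certificate validation to get past errors during development, and then shipping that way. This is a hypothetical illustration, not code drawn from any of the studied apps.

```python
import ssl


def insecure_context() -> ssl.SSLContext:
    """The classic botch: 'it works now', but any attacker's cert is accepted."""
    ctx = ssl.SSLContext(ssl.PROTOCOL_TLS_CLIENT)
    ctx.check_hostname = False       # skips matching the cert to the server name
    ctx.verify_mode = ssl.CERT_NONE  # accepts any certificate at all
    return ctx


def secure_context() -> ssl.SSLContext:
    """The library default already validates the chain and the hostname."""
    return ssl.create_default_context()
```

Note that the fix is shorter than the bug: the API's default behavior is correct, and the insecure version required extra work. Both versions connect successfully to a legitimate server, which is exactly why the failure goes unnoticed.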
Oppenheimer and company had the advantage of effectively unlimited resources. When confronted with two design choices, they could try both. This let them avoid dead ends, such as trying to build a gun-type bomb with plutonium. They could try gaseous diffusion, thermal diffusion, centrifuges, and electromagnetic separation for uranium enrichment. (The last required more than $1 billion worth of silver wire—and they got it.) App developers don’t have that luxury. Even if one or two do, they don’t share their source code with each other. Besides, most developers don’t know if they’ve gotten it right or wrong; the failure mode here is silent insecurity. Most of the time, holes like these are found by people who do a serious penetration study—and those people are generally the attackers.
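Silent insecurity is easy to demonstrate. In the hypothetical sketch below, both verification functions return the right answers on every functional test a developer would run, but one compares MAC tags with `==`, which stops at the first differing byte and leaks timing information; the other uses the constant-time comparison the standard library provides for exactly this reason. No test suite that only checks return values will tell them apart.

```python
import hashlib
import hmac

KEY = b"demo-key"  # illustrative only; a real key comes from a KDF or keystore


def sign(msg: bytes) -> bytes:
    """Compute an HMAC-SHA256 tag over the message."""
    return hmac.new(KEY, msg, hashlib.sha256).digest()


def verify_wrong(msg: bytes, tag: bytes) -> bool:
    # Subtle bug: '==' short-circuits on the first mismatched byte,
    # so response time reveals how much of a forged tag is correct.
    return sign(msg) == tag


def verify_right(msg: bytes, tag: bytes) -> bool:
    # Constant-time comparison, provided by the API for this purpose.
    return hmac.compare_digest(sign(msg), tag)
```

Both functions accept every valid tag and reject every invalid one, so the flaw is invisible to ordinary testing; only a deliberate timing analysis, i.e., a penetration study, would surface it.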
One size doesn’t fit all when it comes to cryptography; that’s a large part of why cryptographic APIs are so complex. We can’t solve the exceptional access problem once and for all, and per-app efforts to solve it for particular cases will mostly get it wrong.