December 2017
Voluntary Reporting of Cybersecurity Incidents (4 December 2017)
Bitcoin—The Andromeda Strain of Computer Science Research (30 December 2017)

Voluntary Reporting of Cybersecurity Incidents

4 December 2017

One of the problems with trying to secure systems is the lack of knowledge in the community about what has or hasn’t worked. I’m on record as calling for an analog to the National Transportation Safety Board: a government agency that investigates major outages and publishes the results.

In the current, deregulatory political climate, though, that isn’t going to happen. But how about a voluntary system? That has worked well in aviation—could it work for computer security? Per a new draft paper with Adam Shostack, Andrew Manley, Jonathan Bair, Blake Reid, and Pierre De Vries, we think it can.

While there’s a lot of detail in the paper, there are two points I want to mention here. First, the aviation system is supposed to guarantee anonymity. That’s easier in aviation, where, say, many planes land at O’Hare on any given day, than in the computer realm. For that reason (among others), we’re focusing on "near misses"—it’s less revelatory to say "we found an intruder trying to use the Struts hole" than to say "someone got in via Struts and personal data for 145 million people was taken".

From a policy perspective, there’s another important aspect. The web page for ASRS is headlined "Confidential. Voluntary. Non-Punitive"—with the emphasis in the original. Corporate general counsels need assurance that they won’t be exposing their organizations to more liability by making such disclosures. That in turn requires buy-in from regulators. (It’s also another reason for focusing on near misses: you avoid the liability question entirely if the attack was fended off.)

All this is discussed in the full preprint, available at LawArXiv or SSRN.

Bitcoin—The Andromeda Strain of Computer Science Research

30 December 2017

Everyone knows about Bitcoin. Opinions are divided: it’s either a huge bubble, best suited for buying tulip bulbs, or, as one Twitter user rather hyperbolically expressed it, "the most important application of cryptography in human history". I personally am in the bubble camp, but I think there’s another lesson here, on the difference between science and engineering. Bitcoin and the blockchain are interesting ideas that escaped the laboratory without proper engineering—and it shows.

Let’s start with the upside. Bitcoin was an impressive intellectual achievement. Digital cash has been around since Chaum, Fiat, and Naor’s 1988 paper. There have been many other schemes since then, with varying properties. All of the schemes had one thing in common, though: they relied on a trusted party, i.e., a bank.

Bitcoin was different. "Satoshi Nakamoto" conceived of the block chain, a distributed way to keep track of coins, spending, etc. Beyond doubt, his paper would have been accepted at any top cryptography or privacy conference. It was never submitted, though. Why not? Without authoritative statements directly from "Nakamoto", it’s hard to say; my own opinion is that Bitcoin originated in the anarchist libertarian wing of the cypherpunk movement. Cypherpunks believe in better living through cryptography; a privacy-preserving financial mechanism that is independent of any government fulfilled one of the ideals of the libertarian anarchists. (Some of them seemed to believe that the existence of such a mechanism would inherently cause governments to disappear. I don’t know why they believed this, or why they thought it was a good idea, but the attitude was unmistakable.) In any event, they were more interested in running code than in academic credit.
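
To make the core idea concrete, here is a toy sketch of a hash chain in Python. This is my illustration, not Nakamoto’s actual data structures: real Bitcoin blocks carry Merkle trees of transactions, proof-of-work fields, and more, but the essential trick is the same: each block commits to its predecessor’s hash, so no earlier record can be altered without invalidating everything that follows.

    import hashlib
    import json
    import time

    class Block:
        def __init__(self, prev_hash, transactions):
            self.prev_hash = prev_hash        # hash of the previous block
            self.transactions = transactions  # payload: who paid whom
            self.timestamp = time.time()

        def hash(self):
            # A block's hash covers its contents *and* its predecessor's
            # hash; changing any earlier block breaks every later link.
            payload = json.dumps([self.prev_hash, self.transactions,
                                  self.timestamp]).encode()
            return hashlib.sha256(payload).hexdigest()

    # A tiny chain: each block commits to the one before it.
    genesis = Block("0" * 64, ["coinbase -> alice: 50"])
    second = Block(genesis.hash(), ["alice -> bob: 1"])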

So what went wrong? What happened to a system designed as an alternative to, e.g., credit cards, where the "cost of mediation increases transaction costs, limiting the minimum practical transaction size and cutting off the possibility for small casual transactions"? Instead, today the Bitcoin network is overloaded, leading to high transaction costs. The answer is a lack of engineering.

When you engineer a system for deployment you build it to meet certain real-world goals. You may find that there are tradeoffs, and that you can’t achieve all of your goals, but that’s normal; as I’ve remarked, "engineering is the art of picking the right trade-off in an overconstrained environment". For any computer-based financial system, one crucial parameter is the transaction rate. For a system like Bitcoin, another goal had to be avoiding concentrations of power. And of course, there’s transaction privacy.
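
The transaction-rate constraint is easy to quantify. Here is a back-of-the-envelope calculation using rough 2017-era figures of my own choosing, not spec values: roughly 1 MB blocks, one block every ten minutes, 250-byte average transactions.

    # All three figures are rough 2017-era approximations, not spec values.
    BLOCK_BYTES = 1_000_000       # ~1 MB block size limit
    BLOCK_INTERVAL_SEC = 600      # one block every ~10 minutes
    AVG_TX_BYTES = 250            # typical transaction size

    tx_per_block = BLOCK_BYTES / AVG_TX_BYTES        # ~4,000 transactions
    tx_per_sec = tx_per_block / BLOCK_INTERVAL_SEC   # ~7 per second
    print(f"~{tx_per_sec:.0f} transactions per second, worldwide")

A global payment system capped at a handful of transactions per second has only one rationing mechanism when demand exceeds supply: fees.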

There are less obvious factors, too. These days, "mining" for Bitcoins requires enormous amounts of computation, which translates directly into electrical power consumption. One estimate is that the Bitcoin network uses more electricity than many countries. There’s also the question of governance: who makes decisions about how the network should operate? It’s not a question that naturally occurs to most scientists and engineers, but production systems need some path for change.
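
The power consumption is a direct consequence of how mining works. Here is a deliberately simplified sketch (real Bitcoin mining hashes a block header twice with SHA-256 and uses a compact difficulty encoding, but the structure is the same): miners grind through nonces until a hash falls below a target, and each additional bit of difficulty doubles the expected work.

    import hashlib

    def mine(header: bytes, difficulty_bits: int) -> int:
        """Grind nonces until SHA-256(header || nonce) falls below the
        target; expected cost is about 2**difficulty_bits hashes."""
        target = 1 << (256 - difficulty_bits)
        nonce = 0
        while True:
            digest = hashlib.sha256(header + nonce.to_bytes(8, "big")).digest()
            if int.from_bytes(digest, "big") < target:
                return nonce
            nonce += 1

    # The real network retargets difficulty so that blocks keep arriving
    # every ~10 minutes no matter how much hardware joins; more miners
    # means more difficulty, more hashing, and more electricity.
    print(mine(b"toy block header", 20))  # roughly a million hashes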

On all of these counts, Bitcoin has failed. The failures weren’t inevitable; there are solutions to these problems in the academic literature. But Bitcoin was deployed by enthusiasts who, in essence, let experimental code escape from the lab into the world without thinking about the engineering issues—and now they’re stuck with it. Perhaps another, better cryptocurrency can displace it, but it’s always much harder to displace something that exists than to fill a vacuum.