A recent news story noted that a U.S. government agency had asked some researchers to withhold crucial details about an experiment that showed that the avian flu strain A(H5N1) could be changed to permit direct ferret-to-ferret spread. While the problem the government is trying to solve is obvious, it's far from clear that suppression is the right answer, especially in this particular case.
There are a few obvious parallels to other situations. Most notably, in 1940 American physicists decided to stop publishing papers on nuclear fission. That fact itself — the absence of published research — convinced at least one Soviet scientist, G.N. Flyorov, that the Americans and British were working on a bomb. Arguably, this was the crucial factor in the Soviet decision to proceed with their own project; certainly Flyorov mentioned this aspect in his letter to Stalin, and Stalin took the point very seriously. (I apologize for pointing to a paper behind a paywall; it's the most authoritative reference I know of. You can find other discussion here and on Flyorov's Wikipedia page.) In this case, while secrecy may have concealed important details, it gave away "the high-order bit": the area looked promising enough to investigate, despite the exigencies of wartime.
That moratorium was voluntary. In the 1960s and 1970s, though, the NSA tried to suppress outside knowledge of cryptography and of its own work; more to the point, it also tried to suppress civilian academic research on cryptography. There were obvious constitutional problems with that, but the Public Cryptography Study Group (formed by the American Council on Education in response to the NSA's call for a dialog) recommended a voluntary system: researchers could submit their papers to the NSA; it in turn could request (but not demand) that certain things not be published.
As a vehicle for stopping or even slowing research, this notion was a failure. Possibly, the NSA's intelligence-gathering efforts have been hurt by widespread knowledge of cryptography; certainly, there's far more information out there today than there was a generation ago. In a very strong sense, though, they've won by losing: their real mission of protecting the country has been helped by the flourishing of cryptography for civilian use. To give just one example, cell phone cloning in the 1990s was largely done for drug dealers who wanted to be able to make and receive calls anonymously. Today, though, cryptographic authentication is used, eliminating an entire class of attacks.
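The cloning point is worth making concrete. Analog-era phones authenticated with static identifiers that anyone with a scanner could capture and replay; modern networks instead use challenge-response authentication, so a captured transcript is useless. Here is a minimal sketch of the idea — the function names and HMAC-SHA256 construction are illustrative only, not the actual cellular protocols (CAVE, AKA, etc.):

```python
import hashlib
import hmac
import os

def sign_challenge(secret_key: bytes, challenge: bytes) -> bytes:
    """Handset proves possession of its secret key without ever sending it."""
    return hmac.new(secret_key, challenge, hashlib.sha256).digest()

# Provisioned in the subscriber module; never transmitted over the air.
subscriber_key = os.urandom(32)

# Network side: a fresh random challenge for every authentication attempt.
challenge = os.urandom(16)
response = sign_challenge(subscriber_key, challenge)

# The network holds a copy of the key and recomputes the expected response.
expected = hmac.new(subscriber_key, challenge, hashlib.sha256).digest()
assert hmac.compare_digest(response, expected)

# A cloner who captured the phone's identifiers — and even old responses —
# cannot answer a *new* challenge without the key itself.
cloner_key = os.urandom(32)
forged = sign_challenge(cloner_key, challenge)
assert not hmac.compare_digest(forged, expected)
```

Because the challenge is fresh each time, replaying an eavesdropped response fails, which is precisely why the cloning attacks of the 1990s died out.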
It's also worth pointing to the tremendous achievements of academic cryptographers, who have shown how to do far more with modern cryptography than exchange keys and encrypt and sign messages. James Ellis, the GCHQ researcher who invented non-secret encryption — what today is called public key cryptography — once put it accurately to Whit Diffie: "You did more with it than we did". Yet the NSA had tried to suppress the entire field.
A third example is more recent still: the full disclosure of the details of security holes in software. Whether it's a net benefit is still debated: do we gain if the bad guys also learn of the attacks? On the other hand, it's indisputable that many holes are closed (or closed promptly) solely because of disclosure or the threat thereof. Too many companies respond to reports of attacks by denying them, questioning the competence or integrity of the discoverer, or even using legal means to try to suppress the report. Far too often, it seems, bugs are fixed only because of this public disclosure; without it, they'd remain unfixed, leaving systems vulnerable to anyone who rediscovered the attack.
The conclusion, then, is that suppression has greater costs than it might seem. But what about this case? As before, there are costs and benefits, as an interview with one of the scientists involved, Ron A. M. Fouchier, makes clear. For one thing, what these researchers did can't easily be replicated in a garage lab by amateurs: "You need a very sophisticated specialist team and sophisticated facilities to do this." Terrorists have easier ways to launch bioattacks:
"You could not do this work in your garage if you are a terrorist organization. But what you can do is get viruses out of the wild and grow them in your garage. There are terrorist opportunities that are much, much easier than to genetically modify H5N1 bird flu virus that are probably much more effective."

And finally, there's the cost of suppression. It is clear from the interview that public health officials need to know the details, so they know which flu mutations to watch for. Too many people need to know for secrecy to be effective:
"We would be perfectly happy if this could be executed, but we have some doubts. We have made a list of experts that we could share this with, and that list adds up to well over 100 organizations around the globe, and probably 1,000 experts. As soon as you share information with more than 10 people, the information will be on the street. And so we have serious doubts whether this advice can be followed, strictly speaking."

(I have personal experience with this. Some 20 years ago, I invented DNS cache contamination attacks. After talking with various people, I decided not to publish, choosing instead to share the paper with trusted colleagues and with CERT. These colleagues, in Washington and elsewhere, undoubtedly shared it further still. Perhaps someone shared it imprudently, perhaps it was stolen by hacking, or perhaps the bad guys rediscovered the attack, but eventually the attack showed up in the wild — at which point I published. I concluded that the real effect of the delay was to hinder the development of countermeasures. In other words, I was wrong to have held back the paper.)
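For readers unfamiliar with the attack class in that anecdote: early resolvers would cache any record a reply happened to carry, so a malicious server could slip in bogus records for unrelated domains. The countermeasure eventually deployed everywhere is the "bailiwick check" — accept a record only if it falls within the zone the queried server speaks for. A simplified sketch (real resolvers implement considerably more, including trust ranking of answer sections):

```python
def in_bailiwick(record_name: str, zone: str) -> bool:
    """Accept a DNS record only if its name lies within the given zone."""
    record_name = record_name.lower().rstrip(".")
    zone = zone.lower().rstrip(".")
    return record_name == zone or record_name.endswith("." + zone)

# A server authoritative for example.com may answer for its own names...
assert in_bailiwick("www.example.com", "example.com")
# ...but a gratuitous extra record for another zone must be discarded,
# or the attacker's server could contaminate the cache entry for victim.org.
assert not in_bailiwick("victim.org", "example.com")
# String-suffix tricks must also fail: note the leading dot in the check.
assert not in_bailiwick("evilexample.com", "example.com")
```

Had the attack been public from the start, checks of this kind would likely have been deployed years earlier — which is exactly the cost of delay the parenthetical describes.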
The ultimate decision may rest on personal attitudes. To quote Fouchier one more time, "The only people who want to hold back are the biosecurity experts. They show zero tolerance to risk. The public health specialists do not have this zero tolerance. I have not spoken to a single public health specialist who was against publication."