Steven M. Bellovin, Susan Landau, and Herbert S. Lin. Limiting the undesired impact of cyber weapons: Technical requirements and policy implications. Journal of Cybersecurity, 3(1), 2017. [ bib | http ]
Steven M. Bellovin and Adam Shostack. Input to the Commission on Enhancing National Cybersecurity, September 2016. [ bib | .pdf ]
Steven M. Bellovin. Comments on “Protecting the privacy of customers of broadband and other telecommunications services”, July 2016. [ bib | .pdf ]
Steven M. Bellovin, Matt Blaze, and Susan Landau. Insecure surveillance: Technical issues with remote computer searches. IEEE Computer, 49(3):14--24, March 2016. An earlier version is available at https://www.cs.columbia.edu/~smb/papers/rsearch.pdf. [ bib | .pdf ]
Steven M. Bellovin, Matt Blaze, Susan Landau, and Stephanie Pell. It's too complicated: How the Internet upends Katz, Smith, and electronic surveillance law. Harvard Journal of Law and Technology, 30(1):1--101, Fall 2016. [ bib | .pdf ]
For more than forty years, electronic surveillance law in the United States developed under constitutional and statutory regimes that, given the technology of the day, distinguished content from metadata with ease and certainty. The stability of these legal regimes and the distinctions they facilitated was enabled by the relative stability of these types of data in the traditional telephone network and their obviousness to users. But what happens to these legal frameworks when they confront the Internet? The Internet's complex architecture creates a communication environment where any given individual unit of data may change its status---from content to non-content or vice versa---as it progresses through the Internet's layered network stack while traveling from sender to recipient. The unstable, transient status of data traversing the Internet is compounded by the fact that the content or non-content status of any individual unit of data may also depend upon where in the network that unit resides when the question is asked. In this IP-based communications environment, the once-stable legal distinction between content and non-content has steadily eroded to the point of collapse, destroying in its wake any meaningful application of the third party doctrine. Simply put, the world of Katz and Smith and the corresponding statutes that codify the content/non-content distinction and the third party doctrine are no longer capable of accounting for and regulating law enforcement access to data in an IP-mediated communications environment. Building on a deep technical analysis of the Internet architecture, we define new terms, communicative content, architectural content, and architectural metadata, that better reflect the structure of the Internet, and use them to explain why and how we now find ourselves bereft of the once reliable support these foundational legal structures provided.
Ultimately, we demonstrate the urgent need for development of new rules and principles capable of regulating law enforcement access to IP-based communications data.
Steven M. Bellovin. The danger of `exceptional access'. CNN.com, November 18, 2015. [ bib | .html ]
Harold Abelson, Ross Anderson, Steven M. Bellovin, Josh Benaloh, Matt Blaze, Whitfield Diffie, John Gilmore, Matthew Green, Susan Landau, Peter G. Neumann, Ronald L. Rivest, Jeffrey I. Schiller, Bruce Schneier, Michael A. Specter, and Daniel J. Weitzner. Keys under doormats: Mandating insecurity by requiring government access to all data and communications. Journal of Cybersecurity, 1(1), September 2015. [ bib | DOI | http ]
Twenty years ago, law enforcement organizations lobbied to require data and communication services to engineer their products to guarantee law enforcement access to all data. After lengthy debate and vigorous predictions of law enforcement channels “going dark,” these attempts to regulate security technologies on the emerging Internet were abandoned. In the intervening years, innovation on the Internet flourished, and law enforcement agencies found new and more effective means of accessing vastly larger quantities of data. Today, there are again calls for regulation to mandate the provision of exceptional access mechanisms. In this article, a group of computer scientists and security experts, many of whom participated in a 1997 study of these same topics, has convened to explore the likely effects of imposing extraordinary access mandates. We have found that the damage that could be caused by law enforcement exceptional access requirements would be even greater today than it would have been 20 years ago. In the wake of the growing economic and social cost of the fundamental insecurity of today's Internet environment, any proposals that alter the security dynamics online should be approached with caution. Exceptional access would force Internet system developers to reverse “forward secrecy” design practices that seek to minimize the impact on user privacy when systems are breached. The complexity of today's Internet environment, with millions of apps and globally connected services, means that new law enforcement requirements are likely to introduce unanticipated, hard to detect security flaws. Beyond these and other technical vulnerabilities, the prospect of globally deployed exceptional access systems raises difficult problems about how such an environment would be governed and how to ensure that such systems would respect human rights and the rule of law.
Steven M. Bellovin, Matt Blaze, and Susan Landau. Comments on proposed remote search rules, October 2014. [ bib | .pdf ]
Steven M. Bellovin, Matt Blaze, Sandy Clark, and Susan Landau. Lawful hacking: Using existing vulnerabilities for wiretapping on the Internet. Northwestern Journal of Technology & Intellectual Property, 12(1), 2014. [ bib | http ]
For years, legal wiretapping was straightforward: the officer doing the intercept connected a tape recorder or the like to a single pair of wires. By the 1990s, though, the changing structure of telecommunications---there was no longer just “Ma Bell” to talk to---and new technologies such as ISDN and cellular telephony made executing a wiretap more complicated for law enforcement. Simple technologies would no longer suffice. In response, Congress passed the Communications Assistance for Law Enforcement Act (CALEA), which mandated a standardized lawful intercept interface on all local phone switches. Technology has continued to progress, and in the face of new forms of communication---Skype, voice chat during multiplayer online games, many forms of instant messaging, etc.---law enforcement is again experiencing problems. The FBI has called this “Going Dark”: their loss of access to suspects' communication. According to news reports, they want changes to the wiretap laws to require a CALEA-like interface in Internet software. CALEA, though, has its own issues: it is complex software specifically intended to create a security hole---eavesdropping capability---in the already-complex environment of a phone switch. It has unfortunately made wiretapping easier for everyone, not just law enforcement. Congress failed to heed experts' warnings of the danger posed by this mandated vulnerability, but time has proven the experts right. The so-called “Athens Affair”, where someone used the built-in lawful intercept mechanism to listen to the cell phone calls of high Greek officials, including the Prime Minister, is but one example. In an earlier work, we showed why extending CALEA to the Internet would create very serious problems, including the security problems it has visited on the phone system. 
In this paper, we explore the viability and implications of an alternative method for addressing law enforcement's need to access communications: legalized hacking of target devices through existing vulnerabilities in end-user software and platforms. The FBI already uses this approach on a small scale; we expect that its use will increase, especially as centralized wiretapping capabilities become less viable. Relying on vulnerabilities and hacking poses a large set of legal and policy questions, some practical and some normative. Among these are:
* Will it create disincentives to patching?
* Will there be a negative effect on innovation? (Lessons from the so-called “Crypto Wars” of the 1990s, and, in particular, the debate over export controls on cryptography, are instructive here.)
* Will law enforcement's participation in vulnerabilities purchasing skew the market?
* Do local and even state law enforcement agencies have the technical sophistication to develop and use exploits? If not, how should this be handled? A larger FBI role?
* Should law enforcement even be participating in a market where many of the sellers and other buyers are themselves criminals?
* What happens if these tools are captured and repurposed by miscreants?
* Should we sanction otherwise-illegal network activity to aid law enforcement?
* Is the probability of success from such an approach too low for it to be useful?
As we will show, these issues are indeed challenging. We regard them, on balance, as preferable to adding more complexity and insecurity to online systems.
Steven M. Bellovin, Renée M. Hutchins, Tony Jebara, and Sebastian Zimmeck. When enough is enough: Location tracking, mosaic theory, and machine learning. NYU Journal of Law and Liberty, 8(2):555--628, 2014. [ bib | .pdf ]
Steven M. Bellovin. Why healthcare.gov has so many problems. CNN.com, October 15, 2013. [ bib | http ]
Steven M. Bellovin. Submission to the Privacy and Civil Liberties Oversight Board: Technical issues raised by the Section 215 and Section 702 surveillance programs, July 2013. [ bib | .pdf ]
Steven M. Bellovin, Matt Blaze, Sandy Clark, and Susan Landau. Going bright: Wiretapping without weakening communications infrastructure. IEEE Security & Privacy, 11(1):62--72, January--February 2013. [ bib | DOI | .pdf ]
Mobile IP-based communications and changes in technologies, including wider use of peer-to-peer communication methods and increased deployment of encryption, have made wiretapping more difficult for law enforcement, which has been seeking to extend wiretap design requirements for digital voice networks to IP network infrastructure and applications. Such an extension to emerging Internet-based services would create considerable security risks as well as cause serious harm to innovation. In this article, the authors show that the exploitation of naturally occurring weaknesses in the software platforms being used by law enforcement's targets is a solution to the law enforcement problem. The authors analyze the efficacy of this approach, concluding that such law enforcement use of passive interception and targeted vulnerability exploitation tools creates fewer security risks for non-targets and critical infrastructure than do design mandates for wiretap interfaces.
Steven M. Bellovin, Scott O. Bradner, Whitfield Diffie, Susan Landau, and Jennifer Rexford. Can it really work? Problems with extending EINSTEIN 3 to critical infrastructure. Harvard National Security Journal, 3, 2012. [ bib | .pdf ]
In 2004 the increasing number of attacks on U.S. federal civilian agency computer systems caused the government to begin an active effort to protect federal civilian agencies against cyber intrusions. This classified program, EINSTEIN, sought to do real-time, or near real-time, automatic collection, correlation, and analysis of computer intrusion information as a first step in protecting federal civilian agency computer systems. EINSTEIN grew into a series of programs, EINSTEIN, EINSTEIN 2, and EINSTEIN 3, all based on intrusion-detection and intrusion-prevention systems (IDS and IPS). Then there was public discussion of extending the EINSTEIN system to privately held critical infrastructure.
Extending an EINSTEIN-like program to the private sector raises serious technical and managerial issues. Scale matters, as do the different missions of the private sector and the public one. Expanding EINSTEIN-type technology to critical infrastructure is complicated by the complex legal and regulatory landscapes for such systems. There are simply fundamental differences between communication networks supporting the U.S. federal government and those supporting the private-sector critical infrastructures that create serious difficulties in attempting to extend EINSTEIN-type technologies beyond the federal sector. This paper examines the technology's limitations, pointing out the problems involved in expanding EINSTEIN beyond its original mandate.
Steven M. Bellovin, Scott O. Bradner, Whitfield Diffie, Susan Landau, and Jennifer Rexford. As simple as possible---but not more so. Communications of the ACM, 2011. Note: this is a shorter version of “Can it really work?”. [ bib | .pdf ]
Maritza L. Johnson, Steven M. Bellovin, and Angelos D. Keromytis. Computer security research with human subjects: Risks, benefits and informed consent. In Financial Cryptography and Data Security, Lecture Notes in Computer Science. Springer Berlin / Heidelberg, 2011. [ bib | .pdf ]
Computer security research frequently entails studying real computer systems and their users; studying deployed systems is critical to understanding real-world problems, as is having would-be users test a proposed solution. In this paper we focus on three key concepts in regard to ethics: risks, benefits, and informed consent. Many researchers are required by law to obtain the approval of an ethics committee for research with human subjects, a process which includes addressing the three concepts focused on in this paper. Computer security researchers who conduct human subjects research should be concerned with these aspects of their methodology regardless of whether they are required to by law; it is our ethical responsibility as professionals in this field. We augment previous discourse on the ethics of computer security research by sparking the discussion of how the nature of security research may complicate determining how to treat human subjects ethically. We conclude by suggesting ways the community can move forward.
Steven M. Bellovin, Matt Blaze, Whitfield Diffie, Susan Landau, Peter G. Neumann, and Jennifer Rexford. Risking communications security: Potential hazards of the “Protect America Act”. IEEE Security & Privacy, 6(1):24--33, January--February 2008. [ bib | .pdf ]
Steven M. Bellovin, Matt Blaze, Whitfield Diffie, Susan Landau, Peter G. Neumann, and Jennifer Rexford. Internal surveillance, external risks. Communications of the ACM, 50(12), December 2007. [ bib ]
Paula Hawthorn, Barbara Simons, Chris Clifton, David Wagner, Steven M. Bellovin, Rebecca Wright, Arnold Rosenthal, Ralph Poore, Lillie Coney, Robert Gellman, and Harry Hochheiser. Statewide databases of registered voters: Study of accuracy, privacy, usability, security, and reliability issues, February 2006. Report commissioned by the U.S. Public Policy Committee of the Association for Computing Machinery. [ bib | http ]
Steven M. Bellovin, Matt Blaze, Ernest Brickell, Clinton Brooks, Vint Cerf, Whitfield Diffie, Susan Landau, Jon Peterson, and John Treichler. Security implications of applying the Communications Assistance to Law Enforcement Act to Voice over IP, 2006. [ bib | .pdf ]
Steven M. Bellovin, Matt Blaze, and Susan Landau. The real national-security needs for VoIP. Communications of the ACM, 48(11), November 2005. “Inside RISKS” column. [ bib | .pdf ]
Steven M. Bellovin. Cybersecurity research needs, July 2003. Testimony before the House Select Committee on Homeland Security, Subcommittee on Cybersecurity, Science, Research, & Development, hearing on “Cybersecurity---Getting it Right”. Transcript at https://archive.org/details/gov.gpo.fdsys.CHRG-108hhrg98150. [ bib | .ps | .pdf ]
Steven M. Bellovin, Matt Blaze, David Farber, Peter Neumann, and Gene Spafford. Comments on the Carnivore system technical review draft, December 2000. [ bib | .html ]
Matt Blaze and Steven M. Bellovin. Tapping on my network door. Communications of the ACM, 43(10), October 2000. [ bib | .html ]
Matt Blaze and Steven M. Bellovin. Open Internet wiretapping, July 2000. Written testimony for a hearing on “Fourth Amendment Issues Raised by the FBI's `Carnivore' Program” by the Subcommittee on the Constitution, House Judiciary Committee. [ bib | .html ]
Steven M. Bellovin. Wiretapping the Net. The Bridge, 20(2):21--26, Summer 2000. [ bib | .ps | .pdf ]
Fred Schneider, Steven M. Bellovin, and Alan Inouye. Critical infrastructures you can trust: Where telecommunications fits. In Telecommunications Policy Research Conference, October 1998. [ bib | .ps | .pdf ]
Hal Abelson, Ross Anderson, Steven M. Bellovin, Josh Benaloh, Matt Blaze, Whitfield Diffie, John Gilmore, Peter G. Neumann, Ronald L. Rivest, Jeffrey I. Schiller, and Bruce Schneier. The risks of key recovery, key escrow, and trusted third-party encryption, May 1997. A report by an ad hoc group of cryptographers and computer scientists. [ bib | .pdf ]
Yakov Rekhter, Paul Resnick, and Steven M. Bellovin. Financial incentives for route aggregation and efficient address utilization in the Internet. In Proceedings of Telecommunications Policy Research Conference, 1997. [ bib | .html ]