Fall 2008

DISTINGUISHED LECTURE SERIES

Integrity of Elections
Dr. Peter G. Neumann
SRI International
Monday, October 6, 2008
ABSTRACT: Elections demand end-to-end integrity of voting processes, with additional trustworthiness requirements such as system security, privacy, usability, and accessibility. The overall system aspects present a paradigmatic hard problem. In today’s systems and procedures, essentially everything is a potential weak link. The pervasive nature of the risks is astounding, with a serious lack of system architecture, good software engineering practice, and understanding of security problems. This talk will discuss limitations in existing systems, processes, standards, and evaluations. It will also consider some possible alternatives — including nontechnological approaches, computer-based systems, and possible roles for cryptography.
BIOGRAPHY: Peter G. Neumann, Principal Scientist in SRI International’s Computer Science Laboratory (where he has been since 1971), is concerned with computer systems and networks; trustworthiness with respect to security, reliability, survivability, and safety; and risks-related issues such as voting-system integrity (for over 20 years), crypto policy, social implications, and privacy. A computer professional since 1953, he was a member of the technical staff at Bell Laboratories in Murray Hill, New Jersey, throughout the 1960s, where he was heavily involved in the Multics system development from 1965 to 1969. His 1995 book, Computer-Related Risks, is still timely. He is a Fellow of the ACM, IEEE, and AAAS. In 1985 he created the online ACM Risks Digest (comp.risks, www.risks.org). He created ACM SIGSOFT’s Software Engineering Notes in 1976 and edited it for 19 years. He has doctorates from Harvard and Darmstadt, and has taught at Darmstadt, Stanford, U.C. Berkeley, and the University of Maryland. See his website at http://www.csl.sri.com/neumann.

 

Measuring Scale Before Simplification
Herbert Edelsbrunner
Duke University
Monday, November 10, 2008
ABSTRACT: Nature is inherently multi-scalar, and this talk presents an attempt to measure this aspect mathematically. Rooted in algebraic topology, this idea has ramifications inside and outside mathematics. A particularly important application is coping with noise in scientific data. As suggested by the title, we advocate measuring noise, but not necessarily removing it from the data, because doing so has side effects.
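The scale measure alluded to here is persistence, which Edelsbrunner co-developed. As an illustration only (not code from the talk), the following Python sketch computes zero-dimensional sublevel-set persistence of a 1-D signal: each valley is born as a component, components merge as the threshold rises, and a feature’s lifetime (death minus birth) quantifies its scale, so small-scale “noise” can be measured without being removed. The example signal is invented.

```python
def persistence_pairs(values):
    """Return (birth, death) pairs for the sublevel-set filtration
    of a 1-D signal; death - birth is the scale of each feature."""
    n = len(values)
    parent = list(range(n))       # union-find forest over signal indices
    birth = [None] * n            # birth value, read at component roots
    entered = [False] * n
    pairs = []

    def find(i):
        while parent[i] != i:
            parent[i] = parent[parent[i]]   # path halving
            i = parent[i]
        return i

    # Sweep the threshold upward: vertices enter in order of value.
    for v in sorted(range(n), key=lambda i: values[i]):
        birth[v] = values[v]
        entered[v] = True
        for u in (v - 1, v + 1):            # neighbours on the path graph
            if 0 <= u < n and entered[u]:
                ru, rv = find(u), find(v)
                if ru == rv:
                    continue
                if birth[ru] > birth[rv]:   # elder rule: younger dies
                    ru, rv = rv, ru
                if values[v] > birth[rv]:   # skip zero-persistence pairs
                    pairs.append((birth[rv], values[v]))
                parent[rv] = ru
    return pairs                  # the global minimum never dies

signal = [0.0, 2.0, 0.5, 3.0, 0.2, 2.5, 1.0]
for b, d in persistence_pairs(signal):
    print(f"feature born at {b}, dies at {d}: scale {d - b:.2f}")
```

On this toy signal the sketch reports three finite features, with scales 1.5, 1.5, and 2.8; thresholding on scale, rather than smoothing the data, is the kind of noise measurement the abstract advocates.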
BIOGRAPHY: Herbert Edelsbrunner is a Professor at Duke University and Founder and Principal at Geomagic, a software company in the field of Digital Shape Sampling and Processing. He graduated from the Graz University of Technology, Austria, in 1982, and was on the faculty of the University of Illinois at Urbana-Champaign from 1985 through 1999. His research areas are algorithms, computational geometry and topology, and structural molecular biology as well as systems biology. He has published two textbooks in the general area of computational geometry and topology. In 1991 he received the Alan T. Waterman Award from the National Science Foundation, in 2005 he was elected a member of the American Academy of Arts and Sciences, in 2006 he received an honorary degree from the Graz University of Technology, and in 2008 he was elected a member of the German Academy of Sciences, the Leopoldina. Herbert Edelsbrunner specializes in combining computing and advanced mathematics to solve problems in applications. His methodology is to search out the mathematical roots of application problems and to combine mathematical with computational structure to obtain working solutions.

 

Quo Vadis System Design?
Alberto Sangiovanni-Vincentelli
University of California, Berkeley
Wednesday, November 19, 2008
ABSTRACT: The electronics industry ecosystem is undergoing a radical change driven by an emerging three-layered architecture characterized by:

1. A computing and communication infrastructure that will offer increasingly faster data transfer and manipulation via powerful data centers, compute farms, and wired interconnection;

2. Access devices such as PDAs, cell phones, and laptops, which allow users, whether humans or any of the intelligent physical systems below, to leverage the immense capabilities of the infrastructure;

3. A swarm of sensors, actuators, and local computing capabilities “immersed in all kinds of physical systems that offer a wide variety of personal or broad-use services, e.g., a mechanical system such as an automobile, a train, a plane, an electrical system such as an electrical motor or generator, a chemical system such as a distillation plant, health-care equipment such as a pacemaker, a distributed environment monitoring and control system, or a security system for access control to protected areas”.

Most refer to these swarms as embedded systems. Recently there has been growing interest in Cyber Physical Systems (CPS), which emphasize the interaction between the computing and electronic elements and the physical systems in which they are immersed. CPS will enable a wide span of applications thanks to the availability of a new generation of sensors, actuators, and local computing that leverages novel interconnect capabilities and centralized computation.

Because of the incredible number of devices that will be available (a recent forecast by the Wireless World Research Forum talks about 7 trillion devices serving 7 billion people in 2017, i.e., a thousand devices per person!), novel applications will emerge to leverage the massive amount of sensing, computational, communication, and actuation power. A few examples in the transportation and health sectors have already been proposed and intensively studied. A key technical challenge in these domains is to design components and communication infrastructure so as to provide 100% connectivity and working services 100% of the time in a safe, efficient, reliable, and trustworthy way.

Dealing with system-level problems requires more than simply developing new tools, although of course they are essential to advancing the state of the art in design. Rather, the focus must be on understanding the principles of system design, the necessary changes to design methodologies, and the dynamics of the supply chain. Developing this understanding is necessary to define a sound approach that meets the needs of the system and component industries as they try to serve their customers better and develop their products more quickly and with higher quality.

I will present directions, challenges, and potential solutions for the design of future systems, in which heterogeneous subsystems such as mechanical and electrical components must be designed concurrently. The possible scenarios pose fundamental questions to the engineering and scientific worlds about how to deal with the design and management of global systems of such huge complexity. The ultimate enabling technology is a unified design methodology that extends from cyber physical systems (CPS) all the way down to chips, boards, and mechanical components, with general environments capable of hosting specific design flows for individual industry segments. I will present a potential approach to such a unified design methodology, called platform-based design (PBD), and some examples of its use. Finally, the advent of this new generation of systems requires a fresh look at engineering education. I will present some considerations and outline the directions that UC Berkeley is taking.

BIOGRAPHY: Alberto L. Sangiovanni-Vincentelli holds the Buttner Chair of Electrical Engineering and Computer Sciences at the University of California at Berkeley. He was a cofounder of Cadence and Synopsys, the two leading companies in the area of electronic design automation. He is the chief technology adviser of Cadence. He is a member of the board of directors of Cadence, UPEK (a company he helped spin off from ST Microelectronics), Sonics, and Accent (an ST Microelectronics-Cadence joint venture he helped found). He was a member of the HP Strategic Technology Advisory Board and is a member of the Science and Technology Advisory Board of General Motors. He has consulted for many companies, including Bell Labs, IBM, Intel, United Technology, COMAU, Magneti Marelli, Pirelli, BMW, Daimler-Chrysler, Fujitsu, Kawasaki Steel, Sony, and Hitachi. He is the founder and Scientific Director of PARADES, a European Group of Economic Interest supported by Cadence and ST Microelectronics. He is a member of the High-Level Group and of the steering committee of the EU Artemis Technology Platform. In 1981, he received the Distinguished Teaching Award of the University of California. He received the worldwide 1995 Graduate Teaching Award of the IEEE for “inspirational teaching of graduate students.” In 2001, he received the prestigious Kaufman Award of the Electronic Design Automation Council for pioneering contributions to EDA, and in 2002 the Aristotle Award of the Semiconductor Research Corporation. He is the author of more than 800 papers and 15 books in the area of design tools and methodologies, large-scale systems, embedded controllers, hybrid systems, and innovation. Dr. Sangiovanni-Vincentelli has been a Fellow of the IEEE since 1982 and a member of the National Academy of Engineering since 1998.

 

Cognitive Crash Dummies: Where We Are and Where We’re Going
Bonnie John
Carnegie Mellon University
Monday, December 1, 2008
ABSTRACT: Crash dummies in the auto industry save lives by testing the physical safety of automobiles before they are brought to market. “Cognitive crash dummies” save time, money, and potentially even lives by allowing computer-based system designers to test their design ideas before implementing them in products and processes. In this talk, I will review the uses of cognitive models in system design and the current state of research and practice. I will also present some exciting new research directions that promise to make predictive human performance modeling even more useful. Along the way, I will discuss the role of applications in science and the tension between validity and useful approximation.
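One of the simplest predictive models in this family is the Keystroke-Level Model (KLM) of Card, Moran, and Newell, which estimates expert task time by summing standardized operator durations. The Python sketch below is a hypothetical illustration, not material from the talk: the operator values are the published KLM estimates, but the task encoding is invented for the example.

```python
# Keystroke-Level Model (KLM) sketch: predict expert task time by
# summing standard operator durations (Card, Moran & Newell, 1983).
KLM_SECONDS = {
    "K": 0.28,  # keystroke (average skilled typist)
    "P": 1.10,  # point with the mouse to a target
    "B": 0.10,  # press or release a mouse button
    "H": 0.40,  # home hands between keyboard and mouse
    "M": 1.35,  # mental preparation
}

def predict_time(operators):
    """Total predicted time for a sequence of KLM operators."""
    return sum(KLM_SECONDS[op] for op in operators)

# Hypothetical task: move hand to mouse, think, point at a menu item,
# click (press + release), return to keyboard, type a 3-letter command.
task = ["H", "M", "P", "B", "B", "H", "K", "K", "K"]
print(f"predicted time: {predict_time(task):.2f} s")   # 4.29 s
```

Even this back-of-the-envelope model lets a designer compare two interface layouts before either is built, which is the “cognitive crash dummy” idea in miniature.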
BIOGRAPHY: Dr. Bonnie E. John, a psychologist (PhD, Carnegie Mellon University, 1988) and engineer (BEng, The Cooper Union, 1977; MS, Stanford, 1978), has more than 25 years’ experience in usability analysis and design. She is a full professor at CMU and heads the Master’s Program in Human-Computer Interaction. She researches both human performance modeling and software engineering and consults regularly for government and industry. Dr. John has been doing research in human performance modeling for 25 years and has published over 100 papers on the topic. She is a member of the ACM CHI Academy, recognized for her contributions to HCI through her work in cognitive modeling and the implications of usability concerns for the design of software architecture.