Are you willing to trust computers with your life? Many new technologies are too complex or too time-critical to be controlled by humans or electromechanical devices, and so computers have been given the task.
But about 15 years ago, UW computer science and engineering professor Nancy Leveson recognized that the growing use of computers in safety-critical applications (such as commercial aircraft, chemical and nuclear power plants, medical devices, and public transportation systems) was potentially leading society down a dangerous path. As a result, she established the field of software safety to provide a formal basis for evaluating and eliminating or controlling software-related hazards.
At that time, only a few potentially dangerous systems were controlled by computers, but in the intervening decade, computers have taken over a large number of control functions—and serious accidents have indeed occurred. Leveson and her students have developed improved design methods for identifying software-related hazards, along with techniques for verifying software safety.
Leveson is concerned not only with technical details but also with the management and organizational issues that lead to accidents in high-tech systems. From being a lone voice in the wilderness a decade ago, she has become the leader of a field that now has research institutes and new degree programs being established worldwide. Recently, Leveson was cited by the American Institute of Aeronautics and Astronautics for "developing the field of software safety and for promoting responsible software and system engineering practices where life and property are at stake."
Leveson's modeling and analysis techniques have been used on many real systems, including an airborne collision avoidance system required on most commercial aircraft in the U.S., called TCAS II. The goal of TCAS II is to eliminate mid-air collisions between airplanes, but the system is so complex that it has been difficult to be sure TCAS II would not itself cause accidents. Leveson's modeling techniques allowed the Federal Aviation Administration and the aircraft industry to understand thoroughly what this system does, and to establish confidence in its performance.
Leveson's safety analysis procedures were also used in the certification of the first computer-based nuclear power plant shutdown system in Canada. Many of the techniques originally developed and experimentally verified at the UW have now been successfully transferred to industry.
Human-computer interaction has been the focus of Leveson's most recent work. Serious accidents have occurred in aircraft and other shared-control systems in which the cause stemmed from how humans and computers interacted, rather than from a failure or error by either one alone. Solving this type of safety problem requires techniques that span multiple fields: engineering, software engineering, cognitive psychology, and organizational sociology.
Leveson has investigated serious accidents and raised awareness among engineers and scientists of the potential dangers. For example, her study of a series of massive radiation overdoses delivered by a computer-controlled radiation therapy machine has been widely circulated in the medical physics community. And the results of a study she chaired on software in the Space Shuttle have influenced software development processes in several NASA programs, including Space Station Freedom.
Government agencies around the world have used Leveson's results in establishing standards for the design of computer-controlled systems. In one instance, for example, she showed that techniques intended to assure high reliability in control software did not perform as expected. As a result of her efforts, government standards for software in aircraft and nuclear power plants have had to be changed.