
ESD 83

Normal Accidents: Living with High-Risk Technologies1

Charles Perrow

It takes just the right combination of circumstances to produce a catastrophe, just as it takes the right combination of inevitable errors to produce an accident. (Perrow)

Introduction

The theme of this book is vividly articulated by a brief passage about a DC-10 crash. The author quotes from a National Transportation Safety Board (NTSB) report2:

    The loss of control of the aircraft was caused by the combination of three events: the retraction of the left wing's outboard leading edge slats; the loss of the slat disagreement warning system; and the loss of the stall warning system – all resulting from the separation of the engine pylon assembly. Each by itself would not have caused a qualified flight crew to lose control of the aircraft, but together during a critical portion of the flight, they created a situation which afforded the flightcrew an inadequate opportunity to recognize and prevent the ensuing stall of the aircraft.

This paragraph and the apparently contradictory title of the book capture the core theme of Charles Perrow's work. He argues that for complex systems, accidents are "normal events." This school of thought is now known as Normal Accident Theory (NAT). Accidents in complex systems are presumed to be unavoidable because innocent and seemingly unrelated events accumulate and align to create major malfunctions that produce disastrous results. Perrow presents numerous detailed examples, such as Three Mile Island (TMI), aircraft crashes, marine accidents, dams, nuclear weapons, and DNA research. Through detailed reconstruction of events, he shows that these accidents are not only "unexpected, but are incomprehensible …" to those responsible for the safe operation of those systems. "In part, this is because in these human-machine systems the interactions literally cannot be seen."3 For these reasons, Perrow considers "normal accidents" and "system accidents" to be synonymous.

This is the second edition of the 1984 book; it was republished in 1999 and has been updated with summaries of new research. Perrow identifies himself as "an organizational theorist."4 He writes, "As an organizational sociologist, I was interested in the system characteristics of large accidents, but the discipline also made me aware of the organizational genesis of man-made disasters."5 The vantage point, thesis, and overall conclusions of this book reflect the author's strong sociological bent. He began writing this book in 1979. The State University of New York and the National Science Foundation provided the initial funding. Subsequently, the Behavioral Sciences Center of Stanford University and the Office of Naval Research supported this work, and the manuscript was finally completed at Yale. The book is well researched, tightly reasoned, and analytic to a surprising degree. Perrow defines all his terms and frames his arguments in generally consistent conceptual structures from which he is able to derive insightful conclusions. He concludes, "… these failures are inevitable," and takes up the challenge of proposing a set of broad social policy alternatives to answer the questions: with what risky systems can we live, and why?

1 1999, Princeton University Press, Princeton, New Jersey.
2 page 138.
3 page 9.
4 page 10.
5 page 364.

To answer these questions, Perrow frames Normal Accident Theory (NAT). We will now present a brief summary of the theory, followed by a discussion of its core concepts.

Normal Accident Theory

The "essence of the normal accident [is]: the interaction of multiple failures that are not in a direct operational sequence." "Most normal accidents have a significant degree of incomprehensibility."6 This criterion of incomprehensibility is important to the framework because, to Perrow, normal accidents are an inherent property of complex systems. This means that engineers, operations process designers, human factors specialists, and the like are completely helpless to prevent normal accidents. "Failures … can interact with other failures, and thus be a source of system accidents …"7

What are the causes of accidents? Perrow identifies two interacting variables that specify a space which fully characterizes accidents: coupling and interactions. Interactions are the reciprocal actions among elements of the system. These interactions can be tightly coupled or loosely coupled. Tightly coupled interactions are those that do not tolerate delay; they have invariant sequences and negligible slack. Loosely coupled interactions have the opposite characteristics. Interactions themselves are either linear or complex, where "linear" essentially means simple and "complex" is its opposite. With these definitions, Perrow creates the following framework to classify systems.

                        interactions
                        linear                 complex

    coupling   tight    dams,                  nuclear plant, DNA,
                        power grids            nuclear weapons

               loose    assembly line,         mining,
                        cars                   R&D firms

6 page 23.
7 page 35.
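
To make the taxonomy concrete, the chart can be read as a small lookup plus NAT's central prediction. The sketch below is our illustration, not Perrow's; the quadrant placements are taken from the chart above, and all code names are our own.

    # A minimal sketch (ours, not Perrow's): the interaction/coupling taxonomy
    # as a lookup table, with NAT's prediction that system accidents are
    # inherent only where complex interactions meet tight coupling.
    SYSTEMS = {
        "nuclear plant":   ("complex", "tight"),
        "DNA research":    ("complex", "tight"),
        "nuclear weapons": ("complex", "tight"),
        "dams":            ("linear", "tight"),
        "power grids":     ("linear", "tight"),
        "mining":          ("complex", "loose"),
        "R&D firms":       ("complex", "loose"),
        "assembly line":   ("linear", "loose"),
        "cars":            ("linear", "loose"),
    }

    def normal_accidents_expected(system: str) -> bool:
        """NAT's claim: accidents are 'normal' in the complex/tight quadrant."""
        interactions, coupling = SYSTEMS[system]
        return interactions == "complex" and coupling == "tight"

    for name in SYSTEMS:
        verdict = "expected" if normal_accidents_expected(name) else "not expected"
        print(f"{name}: normal accidents {verdict}")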


Using this taxonomy, Perrow then develops a set of policy recommendations targeted at specific systems. The three policies are: abandon; restrict; and tolerate and improve. The mapping of these policies is shown below.

    [Figure: Perrow's policy mapping, with catastrophic potential (low/high) on one axis and cost of alternatives (low/high) on the other. Nuclear weapons and nuclear plants fall in the "abandon" region; DNA in "restrict"; and dams, mining, and chemicals in "tolerate & improve." Space also appears in the figure.]
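
Read as code, the figure amounts to a simple decision rule over its two axes. The following sketch is our reading of the mapping, with categorical conditions standing in for Perrow's qualitative judgments:

    # Our reading of the policy figure, not Perrow's formulation; the region
    # boundaries are qualitative in the book and placeholders here.
    def policy(catastrophic_potential: str, cost_of_alternatives: str) -> str:
        if catastrophic_potential == "high" and cost_of_alternatives == "low":
            return "abandon"            # nuclear weapons, nuclear plants
        if catastrophic_potential == "high":
            return "restrict"           # e.g., DNA research
        return "tolerate & improve"     # e.g., dams, mining, chemicals

    print(policy("high", "low"))   # abandon
    print(policy("high", "high"))  # restrict
    print(policy("low", "high"))   # tolerate & improve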

The author stakes the validity of these policies on two arguments. One is his field research, which indicates that the systems in the "abandon" category are unacceptable to people: people "dread" those systems because they have no control over them, and because they fear the potential accidents and the catastrophic human losses. Two, Perrow rejects the argument that scientists or engineers are capable of designing accident-free systems, or of controlling "abandon-type" tightly coupled complex systems. The infinitesimal probabilities of catastrophe are dismissed as "… like saying that in a war, only a small fraction of bullets kill anyone."8 Likewise, in a set of corollary arguments he dismisses "bounded rationality," arguing that it makes people reason with limited information, so that they form uninformed opinions. Perrow claims that his model of social "dread" trumps both bounded rationality and the "perfect rationality" of scientists and statisticians. This he calls "social rationality."9
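
A bit of assumed arithmetic makes the "bullets" retort concrete: a per-operation catastrophe probability that sounds infinitesimal still implies a near-certain catastrophe once exposure is large. The numbers below are ours, chosen purely for illustration:

    # Back-of-the-envelope exposure arithmetic; both figures are assumptions.
    p = 1e-7           # assumed probability of catastrophe per operation
    n = 1_000_000_000  # assumed operations over the system's lifetime

    expected = p * n          # expected catastrophes: 100.0
    p_any = 1 - (1 - p) ** n  # probability of at least one: ~1.0
    print(f"expected: {expected:.0f}, P(at least one): {p_any:.6f}")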

To seal his arguments and close logical escape hatches, he identifies and justifies his dismissal of non-causes of accidents. He emphatically absolves "dumb operators," whom his research shows to be the most convenient and traditional scapegoats for system accidents. He dismisses the other usual suspects as well: technology, capitalism, and greed. Consistently and relentlessly, Perrow places the causes of system accidents on the system itself. Accidents are embedded in the system. Accidents are an inherent property of complex and tightly coupled systems. Accidents are normal.

Discussion

8 page 55.
9 page 315.


It is our view that the author has an accurate view of what comprises a complex system, one very consistent with our own characterization: a system has identifiable boundaries, it has a structure and purposeful behavior, and it is comprised of a large number of elements that interact with each other. Furthermore, Perrow elaborates on the nature of the interactions to develop a taxonomy of complex systems from which he derives a set of policies. Perrow identifies and frames the problem, and proposes solutions. As engineers, we find this intellectually satisfying. The author's system perspective is broad and comprehensive. For example, the system boundaries in his analysis of aircraft and airways extend to include the FAA, the NTSB, the aircraft, and the pilot. Likewise, the discussion of marine shipping accidents even includes poor weather conditions. The system boundaries of his discussion of the collapse of the Teton Dam are drawn such that the physical landscape and the incompetent organizations that contributed to its collapse are all included. By taking a broadly scoped system perspective, Perrow's analyses give us a finely grained view of system accidents. The addenda to the 1984 edition, which include a summary of recent research, are a welcome addition to the content of the book. The author appears at this time to feel much more confident about the sociological dimensions of Normal Accident Theory.

Perrow has brought a fresh and thorough study of system accidents. The social rationality approach to system accidents is innovative. It is also our belief that Perrow has framed the issues quite logically, and he has proposed a policy framework that is consistent with his mental model of the attitude of people and society toward high-risk systems they dread, cannot control, or do not understand. "We can't even predict the behavior of totally man-made systems such as nuclear power plants, say, much less can we predict the behavior of very much more complicated systems such as [...], of which we only know about half the working parts."10 However, there are some irritating inconsistencies that erode the forceful and convincing case he is making.

The author in several passionate passages absolves human error and "dumb operators" as key causes of system accidents. But then he presents nine typical behavior patterns in situations of system accidents, and fully six of the nine patterns are human errors.11 He also absolves capitalism and greed as causes of system accidents. However, in the discussion of the Bhopal disaster in India, he writes, "we might look at plain old free-market capitalism that allowed the Indian plant to be starved and run down …"12 In addition, there is a discernible unevenness in the depth of research, possibly due to a lack of access to documentation. For example, for information on the Apollo 13 system failures, Perrow relies largely on Tom Wolfe's The Right Stuff.

There are some assertions that are not fully developed or explained. Perrow claims "… elites – decide that certain technological possibilities are to be financed and put into place."13 Elsewhere he concludes, "They are systems that elites have constructed, and thus can be changed or abandoned."14 Unfortunately, he does not elaborate on how he reached these conclusions.

10 page 303.
11 page 277.
12 page 360.
13 page 339.

The addenda for the 1999 edition of the book are in need of an experienced editor. The author's excitement in reviewing recent research that confirms and validates his Normal Accident Theory suffers from a self-congratulatory tone; a more conservative and modest approach would have been more effective. Unfortunately, this exuberance spills over into the final chapter of the book, which deals with the Y2K problem. He describes the history and the potential problems quite accurately, but he writes with a slightly alarmist tone, which is somewhat jarring, especially because alarm is explicitly absent from the rest of the book. On Y2K, it almost appears that he is positioning himself for an "I told you so."

We think that he is too pessimistic about the creative and innovative capabilities of people and organizations. Rather than "abandon," we think it is preferable to adopt a policy of limited, over-managed, and closely monitored pilots from which to learn. Over time, people will learn the mysteries of complex systems, in the same way that scientists can now model the interactions of leptons and social scientists like Cyert and March have modeled decisions in firms.15 It is true that people have made a mess of things in the past; they have also shown themselves to be noble and principled. We have created immensely complex systems that are resilient, loosely coupled, and enormously beneficial to learning and commerce; the Internet is an example.

Fortunately, these shortcomings do not significantly affect the overall persuasiveness of Normal Accident Theory, although we think that Perrow errs by recommending an "abandon" policy. On balance, "Normal Accidents" is logically reasoned, and it presents us with a useful framework to analyze and discuss difficult issues about complex systems. "Normal Accidents" should be required reading for anyone engaged in technology policy and system design.

14 page 352.
15 Cyert, R.M. and J.G. March, 1992. A Behavioral Theory of the Firm. Blackwell Business, Oxford, UK.
