But that view is deeply flawed; most
organizations face all kinds of unpredictable challenges - large and
small - that collectively place huge demands on people's creativity
and imaginations. Indeed, in an ever-changing, rough-and-tumble business
environment, the assumption that the corporation is something stable
and secure becomes dangerous. When the unpredictable does happen, and
the world as we know it unravels, we are all the more likely to become
so paralyzed that we cannot survive the experience.
What can we do to better recognize and
manage the unpredictable? Few people are more qualified to answer that
question than Karl E. Weick, the Rensis Likert Distinguished University
Professor of Organizational Behavior and Psychology at the University
of Michigan Business School at Ann Arbor, and a professor of psychology
at the university. Over the course of his career, Weick has become world
renowned for his insights into why people in organizations act the way
they do. His book The Social Psychology of Organizing, first
published in 1969, turned organizational psychology on its head by praising
the advantages of chaos, demonstrating the pitfalls of planning, and
celebrating the rewards of "sensemaking." These insights were expanded
in a later book, Sensemaking in Organizations (1995). Most recently,
Weick - along with University of Michigan colleague Kathleen M. Sutcliffe
- has turned his attention to Managing the Unexpected (2001).
Weick has journeyed widely in his search
for organizational meaning - from jazz orchestras to firefighters, from
the Skylab crew to Native American hunting parties - and his findings
stand in sharp contrast to most of the literature on business organizations.
Weick's view of corporations is as complex as the people who populate
them. His organizations chat, dissemble, disguise, mobilize, and "galumph."
In other words, they are alive. Not surprisingly, while most management
writers advise businesspeople to simplify and streamline, Weick challenges
executives to complicate themselves. For him, reality is not some black-and-white
matter "out there," but rather a fluid entity that organizations half
imagine and half create. In the following edited conversation, Weick
offers fresh perspectives on managing surprise, focusing on failure,
and surviving what he calls "cosmology episodes."
Your most recent research focuses on
high-reliability organizations. What are HROs, and why are they important?
An HRO is, for instance, a nuclear power plant, an aircraft carrier,
an air-traffic-control team, a firefighting unit, or a hospital's emergency
department. You could even think of restaurant kitchens, with orders
coming in rapid-fire and knives flying all over the place, as high-reliability
organizations. HROs operate under very trying conditions all the time
and still manage to have fewer than their fair share of accidents. An
aircraft carrier, for example, could have a disaster every time a plane
lands or takes off. But it doesn't, and the question is, Why not?
The key difference between HROs and other
organizations is the sensitivity or mindfulness with which people in
most HROs react to even very weak signs that some kind of change or
danger is approaching. In contrast to HROs, most companies today are
hugely unprepared for the unpredictable. Managers are under the illusion
that they know more or less what's going to happen next or how other
people are likely to act. That's both arrogant and dangerous. Not only
do those managers ignore the possibility that something unexpected will
happen, but they also forget that the decisions they do make can have
unintended consequences. Consider the launch of New Coke in 1985. Immediately
after the product was introduced, the company got as many as 8,000 letters
a day from angry consumers. Clearly, Coca-Cola had failed to accurately
predict people's behavior. To its credit, however, the company came
back with Coke Classic within just three months. But as the story shows,
you have to take action at the earliest sign of danger, or you may get
killed. Everyday problems escalate to disaster status very quickly when
people don't respond appropriately to signs of trouble. HROs distinguish
themselves by their ability to detect incredibly weak warning signs and
then take strong, decisive action.
How might an HRO respond to a weak signal?
Consider board operators in the control room of a nuclear power plant.
They pay close attention to small, unexpected events that may foreshadow
larger system problems - for instance, they note when an automatic system
doesn't respond as expected or when unusual data regarding plant parameters
crops up. They recognize when a procedure is inappropriate and navigate
to a different one. This watchful updating facilitates management of
the unexpected, and I believe it results in large part from a preoccupation
with failure. Think about it: Concerns about failure are what give nuclear
power plants their distinctive quality. But since complete failures
in nuclear power plants are extremely rare, the people working there
are preoccupied with something they seldom see. And this requires a
special kind of alertness. Workers in these facilities do not monotonously
watch dials, read printouts, or manipulate graphic displays and then
breathe wearily at the end of the day: "Terrific - I've just had another
dull, normal day." On the contrary, these workers make judgments and
adjustments and comparisons to keep their days dull and normal. Of course,
there is undoubtedly a kind of obsessiveness in all this, which is true
of all HROs and which can make them unpleasant places to work in. But
the minute a nuclear-plant worker says, "Hey, this job is boring," there
is the danger that he'll stop making the fine-tuned adjustments needed
to keep the job unexciting. And we all know how catastrophic it can
be when things get exciting in a nuclear power plant.
For a classic example of a company misreading
or ignoring a weak signal, you might consider the staffers at Ford's
recall office during the Pinto crisis in the 1970s. They were aware
that the Pinto could sometimes catch fire in low-speed, rear-end collisions.
But they saw no need to recall the car, because they couldn't find a
"traceable cause" for the incidents. They missed the fact that bolts
on the cars' rear axles had punctured the gas tanks of the Pintos involved
in those crashes. Their inability to pick up on weak signals spelled disaster.
Can organizations learn to be more reliable?
They can, by adopting some of the practices that high-reliability organizations
use. For instance, besides being fixated on failure, HROs are also fiercely
committed to resilience and sensitive to operations. Managers at these
organizations keep their attention focused on the front line, where
the work really gets done. For example, among wildland firefighters,
the most successful incident commanders are those who listen best to
the people out there actually fighting the fires. HROs also defer to
expertise, and they refuse to simplify reality. This last point is particularly
important because it has profound implications for executives. As I
have often written, leaders must complicate themselves in order to keep
their organizations in touch with the realities of the business world.
My worry when executives say, "Keep it simple, stupid," is that they're
underestimating the complexity of their own organizations and environments.
But contrary to how we often think about them, organizations are not
at all passive; they are extremely active, and they half create their
environments. So part of the solution to managing the unanticipated
is to get executives to step back and acknowledge just how messy reality
can sometimes be.
That reminds me of your famous battle
cry: "Believing is seeing."
Simple as it sounds, I really do
think that's the case more often than not. By inverting the cliché,
I'm trying to communicate that we can only see what we are prepared
to see. There are many illustrations of this fact, but the one that
really drove it home for me was the story of how child abuse first came
to be recognized in this country. Child abuse was "discovered" and the
treatment of it accelerated only in the 1960s when, in Boulder, Colorado,
pediatricians and radiologists who were treating children added social
workers to their teams. Until then, the pediatricians and radiologists
wouldn't even allow the possibility that parents could be hurting their
own kids because they didn't know what to do next. But when the social
workers came on board, they said, "Sure, child abuse happens, and we
know how to handle it by providing protective services." It was only
at this point that the physician teams could afford to see child abuse,
because then they knew how to deal with it. The moral, of course, is
that the greater the repertoire of responses you have on your team,
the more things you can do. And ultimately, the more ready you are to
deal with reality, the more you can acknowledge its complexity. That's
one of the reasons, I think, that we are seeing more concern about greed
and CEO conduct in the United States right now - because we now feel
we have a better idea of what to do about it through governance.
You say HROs are obsessed with failure.
But don't most organizations
marginalize leaders who fail?
There is a strong tendency in companies that aren't high-reliability
organizations to isolate failure, to blame the culprit, and to not learn
from mistakes. And that's idiotic, because few failures can be traced
to a single individual. Consider excess surgical deaths in hospitals.
Typically they are the consequences of understaffing, poor handoffs
of information about the patient as he is moved from the surgical suite
to the recovery room and then to the ward, and the low frequency of
performing a particular operation. But no matter how many people may
be involved in them, failures are easier to recover from if they are
spotted early on, when they are small. If you can catch a failure right
away, it's less difficult to say, "Look, there's been some kind of mistake
here, but it might just be a sign that the system has gone a little awry."
Organizations can do a lot to encourage
their members to face up to failure, even to become preoccupied with
it. There is an interesting story that one of my colleagues tells about
the great German scientist Wernher von Braun. When a Redstone missile
went out of control during prelaunch testing, von Braun sent a bottle
of champagne to an engineer who confessed that he might have inadvertently
short-circuited the missile. An investigation revealed that the engineer
was right, which meant that expensive redesigns could be avoided. You
don't get a lot of admissions like that in organizations today. But
all it takes is one such story to make an individual in the company
buck up and say, "Hey, these folks are serious about facing up to failures,
so I'm going to take a chance and speak up."
I've also repeatedly found that employees
at HROs cultivate a fascination with failure by refusing to take shortcuts
or simplify reality. Let's say the workers at a nuclear power plant
have to shut down the plant's air supply system in response to some
emergency signal. They won't treat the plant blueprints as a reliable
guide for the system - which a businessperson might do in the interest
of getting the job done quickly. Instead, they will check the whole
system for valves, piping, or reroutes that may have been added since
the drawings were completed. They know that it's what's missing from
the blueprints that could cause the really serious surprises. In other
industries as well, successful companies often turn out to be those
that refuse to simplify reality - that go behind the blueprints. I'm
thinking of companies like retail giant Wal-Mart, with its legendary
attention to detail; the California-based design group Ideo; and Francis
Ford Coppola's American Zoetrope Productions.
Is there one kind of leader who's particularly
good at managing the unexpected?
Not surprisingly, newcomers to an organization catch a lot of stuff
that old-timers miss, which is one reason there is such a huge temptation
to bring outsiders into an organization during crises. But newcomers,
for good reason, also tend to shut up about what they see, lest they
come across as really dumb. That's why I place a lot of trust in executives
who are generalists. People who study liberal arts tend to get exposed
to a wider variety and greater richness of values than people normally
get in professional schools. At the same time, though, when I speak
of generalists, I mean more than those people who have studied literature
or art in college. I'm talking mainly about executives who have heterogeneous
work and industry experiences. Because of their diverse work histories,
these executives are in a good position to cope with problems in original
ways. I'm thinking here of Lou Gerstner, who landed at IBM with the
experiences he had gained at RJR Nabisco, a consumer products company;
American Express, a financial services company; and McKinsey, a consultancy.
Also consider the late Mike Walsh, who moved from Cummins Engine to
Union Pacific Railroad and Tenneco, and Larry Bossidy, who joined Allied
Signal from General Electric. Generalists such as these can often construct
a richer, more useful version of what's going on than specialists can.
At the very least, their broad experiences can help these executives
not to get paralyzed by what I call a "cosmology episode."
That's an intriguing term. Can you explain?
Think back to 1993. That's when the Centers for Disease Control first
came up against hantavirus in the Southwest. The virus made no sense:
It had never appeared in landlocked regions before, and it was killing
people by attacking their lungs rather than their kidneys, the virus's
usual target. It seemed to defy explanation. And that's as close a parallel
to a cosmology episode as I can describe. Basically, a cosmology episode
happens when people suddenly feel that the universe is no longer a rational,
orderly system. What makes such an episode so shattering is that people
suffer from the event and, at the same time, lose the means to recover
from it. In this sense, a cosmology episode is the opposite of a déjà
vu experience. In moments of déjà vu, everything suddenly feels familiar,
recognizable. By contrast, in a cosmology episode, everything seems
strange. A person feels like he has never been here before, has no idea
of where he is, and has no idea who can help him. An inevitable state
of panic ensues, and the individual becomes more and more anxious until
he finds it almost impossible to make sense of what is happening to him.
The continual merging and divesting and
recombining and changing of responsibilities and bosses over the years
has created intense cosmology episodes for many businesspeople. Even
senior executives are unsure of whom they're working for and why. If
you compound that with more globalization and high-velocity change in
the environment, it's not surprising that nobody seems to have a firm
sense of who they really are anymore. Many people even have trouble
locating themselves on organizational charts. So I think it's fair to
say that in the course of their careers, most managers will have at
least one cosmology episode; their worlds will get turned upside down.
Having the kind of alertness to weak signals that we see at HROs can
help managers avoid this particular psychological crisis. In the case
of the hantavirus, for example, the puzzle was eventually solved when
epidemiologists discovered that recent climatic changes had produced
an explosion in the rodent population that carried the virus, which
increased the likelihood that humans might be exposed to hantavirus.
In cosmology episodes, paying very close attention to details can
definitely restore a sense of mastery.
So people can convert a cosmology episode
into something positive?
What I've repeatedly noticed is that the people who really get in trouble
during these crises are those who try to think everything through before
taking any action. The problem with defining and refining your hypotheses
without testing them is that the world keeps changing, and your analyses
get further and further behind. So you've got to constantly update your
thinking while you're sitting there and reflecting. And that's why I'm
such a proponent of what I call "sensemaking." There are many definitions
of sensemaking; for me it is the transformation of raw experience into
intelligible worldviews. It's a bit like what mapmakers do when they
try to make sense of an unfamiliar place by capturing it on paper. But
the crucial point in cartography is that there is no one best map of
a particular terrain. Similarly, sensemaking lends itself to multiple,
conflicting interpretations, all of which are plausible. If an organization
finds itself unsure of where it's going, or even where it's been, then
it ought to be wide open to a lot of different interpretations, all
of which can lead to possible action. The action and its consequence
then begin to edit the list of interpretations down to a more manageable size.
And this is the point I wish to underscore:
Action, tempered by reflection, is the critical component in recovering
from cosmology episodes. Once you start to act, you can flesh out your
interpretations and rework them. But it's the action itself that gets
you moving again. That's why I advise leaders to leap in order to look,
or to leap while looking. There's a beautiful example of this: Several
years ago, a platoon of Hungarian soldiers got lost in the Alps. One
of the soldiers found a map in his pocket, and the troops used it to
get out safely. Subsequently, however, the soldiers discovered that
the map they had used was, in fact, a drawing of another mountain range,
the Pyrenees. I just love that story, because it illustrates that when
you're confused, almost any old strategic plan can help you discover
what's going on and what should be done next. In crises especially,
leaders have to act in order to think - and not the other way around.
One of the cruelest things about organizations
today is that they hold executives to standards of rationality, clarity,
and foresight that are unobtainable. Most leaders can't meet such standards
because they're only human, facing a huge amount of unpredictability
and all the fallible analyses that we have in this world. Unfortunately,
the result is that many executives feel they just can't measure up.
That triggers a vicious psychological circle: Managers have rotten experiences
because they keep coming up short, which reinforces low self-esteem.
In the end, they get completely demoralized and don't contribute what
they actually could - and otherwise would.
But if you tried telling today's leaders
to accept the fact that they're not quite as rational, deliberate, and
intentional as they claim to be - and that that's okay, because that's
the way humans are - I think most executives wouldn't understand. They've
internalized the pressure to be perfect. Caught in a nasty cycle of
insecurity that is covered up by hubris, many executives place a lot
of hope in unrealistic goals. Meanwhile, it is the people further down
in the organization who are actually doing all the improvising and patching
and scrambling to make plans work. And the people at the top don't have
any idea how much the people in the middle are breaking their backs
to keep the organization going.
What does sensemaking have to do with
our instinct to create stories to explain the unexpected?
As the writer Joan Didion once put it, "We tell ourselves stories in
order to live." In business, we tell ourselves stories in order to know
more and compete better. In a crisis, stories help us not to panic.
As reality unfolds, everyone starts asking themselves, "Do you have
any idea what's going on here?" Then someone spins a story, and the
moral is something like, "Don't worry, I have seen something vaguely
like this before." And that's more than comforting; it's motivating.
People don't need much to get moving - just a little kernel of meaning.
Even if the company is in a quite serious situation, someone will be
able to use that tiny core of meaning to convert their interpretations into action.
In any organization, the most powerful
stories are created and spread through informal gossip. Indeed, I don't
think there's a fundamental difference between gossiping and storytelling.
Gossiping is just a way to rehearse different stories before they become
formalized and spread out across the organization. It can help employees
process information that might not otherwise make it into the "official"
story. At the same time, because it is mostly made up of exaggerations
and bluster, gossip can help prepare an organization for the unexpected
and, in this way, can serve as a prelude to sensemaking and action.
Indeed, I'm always surprised by how little factual information leaders
really need to get going.
Let me give you an example. One organization
that has struggled with reliability is Union Pacific. Back in the 1990s,
the company suffered repeatedly from managerial paralysis - even the
employees began to call it the Utterly Pathetic railroad. At that time,
the following story started circulating among employees and customers:
A locomotive engineer got so fed up with the railroad's incompetence
that he decided to commit suicide. So he went outside, lay down on the
railroad tracks - and starved to death. That kind of urban myth was
a perfect way to express just how frustrated people had become with
the railroad not doing anything during a period of intense upheaval.
You've often said that plans are overrated,
that they can actually make things worse for organizations.
Yes, I usually urge executives to fight their tendency to want to plan
everything. Most plans are too specific, and the details create the
illusion that the plan grasps everything that is going on and therefore
can be trusted. As a result, when you have a plan, you tend not to look
for things that disconfirm it. Plans are the opposite of gossip in that
they lure us into the trap of overlooking the unexpected. They also
deceive us into thinking that we know more than we do. The worst aspect
of plans is that they heighten the tendency to postpone action when
something unexpected happens. People do nothing while they stand around
asking themselves, "What was I supposed to do in this kind of emergency?"
I learned this lesson while watching some
training at a nuclear power plant. This particular firm had a mock-up
of a control room where they trained people, and they were very proud
of the fact that it was such an accurate copy of the real thing. And
it was great - a real knockout. But the unanticipated consequence of
the verisimilitude is that when people got out of the training facility
and went into the actual control room, they were hesitant to deal with
emergencies. In one instance when something went wrong, employees waited
for a long time before taking action. They just sat there, searching
their memories for where they had seen this situation before in the
training session. And it was the very fidelity of the mock-up to the
real control rooms that caused their delayed reactions. Meanwhile, the
reactor was getting hotter and hotter and hotter. The company would
have been better off if its employees had only had a few guidelines,
just enough to keep them moving in times of crisis.
All this is not to say that plans are
unimportant in organizations. They are important, but not for the reasons
that people think. Plans are signals, games, excuses for interactions;
they are not good for micromanaging the unexpected.
You've said companies need to encourage
their employees to "galumph." What is that, and why is it important?
It doesn't match the dictionary's definition, but I use the term to
mean a kind of purposeful playfulness. It is not frivolous or aimless
play but a kind of improvisation whereby organizations try out different
possibilities. In this sense, galumphing keeps people from becoming
too complacent; it helps executives see things in a new way. Consider
wildland firefighters: Did you know they are most likely to get killed
or injured in their tenth year on the job? That's just about the time
they start to think they've seen it all. They've adapted extremely well
to past challenges but have become less open to new information that
would allow them to adapt to new challenges. That's why firefighters,
like people in other organizations, should constantly be encouraged
to imagine different possibilities.
In wilderness fire training, for example,
it is crucial to learn how to escape from flames when you are in danger
of entrapment. One way to do this is to drop your tools so that you
can pick up speed. The problem is, it feels very unnatural to firefighters
to drop their tools - for them, it is almost like losing their identities.
In very recent training, therefore, firefighters play at dumping their
packs; they explore what it feels like to run both encumbered and unencumbered.
The crucial point in this exercise is that firefighters learn not to
take things for granted. If they understand that survival literally
depends on the ability to see things differently, they will learn to
be more mindful. It's the same for executives: Galumphing helps them
enlarge their repertoires and gain confidence in alternative ways of
acting. It is particularly critical in high-reliability organizations,
where the last thing anyone wants is for people to let down their guard
because they think they've seen everything.
Diane L. Coutu is a senior editor
at Harvard Business Review. Karl E. Weick is the Rensis Likert
Distinguished University Professor of Organizational Behavior and Psychology
at the University of Michigan Business School at Ann Arbor.
Copyright © 2003 Harvard
Business School Publishing.