Systems principles indicate that we must give real weight to the collective aspect of reality ... to the "primacy of the whole."
First, systems have emergent properties: properties that none of the individual parts possess.
An example of an emergent property is wetness. Neither hydrogen nor oxygen has this property, but in combination as H2O, water is wet. We can consider consciousness itself to be an emergent property … examining individual neurons in a person's brain doesn't reveal consciousness.
As Peter Senge notes: "Dividing an elephant in half does not produce two small elephants. Living systems have integrity. Their character depends on the whole. The same is true for organizations; to understand the most challenging managerial issues requires seeing the whole system that generates the issues."
Second, a fundamental principle of system dynamics states that the structure of the system gives rise to its behavior. The tendency to overlook the power of systemic effects is so pervasive that it has a name. It's known as the "fundamental attribution error."
- "... blaming individuals instead of attributing the behavior to the system."
From The Fifth Discipline, The Art & Practice of the Learning Organization by Peter Senge, p. 42
- ... people have a strong tendency to attribute the behavior of others to dispositional rather than situational factors -- the so-called 'fundamental attribution error'. In complex systems the same policy (decision rule) can lead to very different behavior (decisions) as the state of the system changes. When we attribute differences in behavior to differences in personality we lose sight of the role of system structure in shaping our choices. The attribution of behavior to individuals and special circumstances rather than system structure systematically diverts our attention from the high leverage points where redesign of the system or governing policy can have significant, sustained, beneficial effects on performance. When we attribute behavior to people rather than system structure the focus of management becomes the search for extraordinary people to do the job rather than designing the job so that ordinary people can do it.
From "Learning in and about complex systems" by John Sterman, System Dynamics Review, Vol. 10, Summer-Fall 1994, p. 308; also in Business Dynamics: Systems Thinking and Modeling for a Complex World, Irwin/McGraw-Hill, 2000, by John Sterman, p. 28.
Dr. Sterman is Jay W. Forrester Professor of Management and Director, MIT System Dynamics Group.
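Sterman's point that "the same policy (decision rule) can lead to very different behavior (decisions) as the state of the system changes" can be sketched with a toy simulation. This is my own illustration, not a model from either book: a single fixed ordering rule ("order enough to close half the gap to a target stock of 100") behaves completely differently depending only on whether the system's structure includes a delivery delay. An observer who sees only the outcomes would blame the decision-maker; the structure is what changed.

```python
# Toy stock-adjustment model (hypothetical, for illustration only).
# The decision rule is identical in both runs; only the structure
# (presence of a delivery delay) differs.

def simulate(delay_steps, steps=60):
    target, stock = 100.0, 0.0
    pipeline = [0.0] * delay_steps      # orders already in transit
    history = []
    for _ in range(steps):
        # Fixed policy: order half the gap between target and stock.
        order = max(0.0, (target - stock) / 2.0)
        if delay_steps:
            pipeline.append(order)
            stock += pipeline.pop(0)    # arrival from delay_steps ago
        else:
            stock += order              # instant delivery
        history.append(stock)
    return history

no_delay = simulate(0)     # stock glides smoothly up to the target
with_delay = simulate(4)   # same rule badly overshoots the target,
                           # because orders already in the pipeline
                           # keep arriving after the target is reached
```

With no delay the stock approaches 100 and never exceeds it; with a four-step delay the identical policy overshoots well past 100. Blaming the "decision-maker" in the second run would be exactly the attribution error Sterman describes.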
The "primacy of the whole" is illustrated by this question: "Why does a bell ring?"
Stop for a minute. Think about that. How would you answer?
Most people recognize immediately that this is a "trick question" even as they say the obvious, "Well, because someone strikes a bell."
That's true, but more fundamentally, a bell rings because it's designed to ring, not simply because it's struck. It's shaped like a bell and made of a material that will ring. Most tables don't "ring like a bell" when we strike them. Tables go "thunk" because their structure isn't designed to ring.
From The Fifth Discipline by Peter Senge, p. 78 (the italics are his).
From the systems perspective, the human actor is part of the feedback process, not standing apart from it. This represents a profound shift in awareness. It allows us to see how we are continually both influenced by and influencing our reality. It is the shift in awareness so ardently advocated by ecologists in their cries that we see ourselves as part of nature, not separate from nature. ...
In addition, the feedback concept complicates the ethical issue of responsibility. ... A linear view always suggests a simple locus of responsibility. When things go wrong, this is seen as blame -- "he, she, it did it" -- or guilt -- "I did it." At a deep level there is no difference between blame and guilt, for both spring from linear perceptions. From the linear view, we are always looking for someone or something that must be responsible ... .
In mastering systems thinking, we give up the assumption that there must be an individual, or individual agent, responsible. The feedback perspective suggests that everyone shares responsibility for problems generated by a system. That doesn't necessarily imply that everyone involved can exert equal leverage in changing the system, but it does imply that the search for scapegoats -- a particularly alluring pastime in individualistic cultures such as ours in the United States -- is a blind alley.
This is particularly debilitating in organizations and our society.
So in organizations we tend to look for "who screwed up," find that person, fire them, spend a lot of time searching for the perfect replacement, and then find that the new person screws up, too. In too many cases the problem is that no matter whom we hire, that person will be subject to the same process or system failures and is doomed to screw up as well. This is a tragedy.
It's the same in social systems. We blame people for being poor and unemployed instead of seeing that government policy (actually Federal Reserve policy, and the Fed is not part of the government) dictates that many will be unemployed, with poverty as a consequence. See There's no 'free market' for Labor.
We blame individuals for making the selfish decision to shop at Wal-Mart for cheap Chinese goods, as if they're to blame for the exponentially-increasing trade deficit: "All they have to do is stop shopping there!" Instead, the structure of the system is responsible. See The Trade Deficit and the Fallacy of Composition.
Yes, people do screw up. Yes, sometimes individuals are to blame. But most often, "blaming" is a clue that systems effects are involved. Unless we look for systemic causes, we deny people the chance not to screw up.
On the power and ubiquity of systems effects ("side effects," "unintended consequences," and "behavior is a result of endogenous system structure"), here is an excerpt from "All models are wrong: reflections on becoming a systems scientist":
John D. Sterman, Jay Wright Forrester Prize Lecture, 2002
System Dynamics Review, Volume 18, Number 4 Winter 2002
The paper is based on the talk the author delivered at the 2002 International System Dynamics Conference upon presentation of the Jay W. Forrester Award.
While it's hard to define what system dynamics is, I don't have any trouble answering why it is valuable. As the world changes ever faster, thoughtful leaders increasingly recognize that we are not only failing to solve the persistent problems we face, but are in fact causing them. All too often, well-intentioned efforts to solve pressing problems create unanticipated "side effects." Our decisions provoke reactions we did not foresee. Today's solutions become tomorrow's problems. The result is policy resistance, the tendency for interventions to be defeated by the response of the system to the intervention itself. From California's failed electricity reforms, to road building programs that create suburban sprawl and actually increase traffic congestion, to pathogens that evolve resistance to antibiotics, our best efforts to solve problems often make them worse.
At the root of this phenomenon lies the narrow, event-oriented, reductionist worldview most people live by. We have been trained to see the world as a series of events, to view our situation as the result of forces outside ourselves, forces largely unpredictable and uncontrollable. The concept of unanticipated events and "side effects" I just mentioned provides a good illustration. People frequently talk about unexpected surprises and side effects as if they were a feature of reality. A doctor may say, "The patient was responding well to treatment, but died from unanticipated side effects." Our political leaders blame recession on unanticipated shocks such as corporate fraud or terrorism. Managers blame any difficulty on events outside their firms and (they want us to believe) outside their control, as for example when Cisco Systems blamed their record $2.2 billion inventory writeoff and massive layoffs on "reduced capital spending and the global macroeconomic environment, which resulted in the reduction in our workforce and inventory charges we announced." (Cisco Systems 2001 Annual Report). In fact, there is compelling evidence that, like other firms in the high-tech/telecommunications sector, Cisco's own policies -- from the design of its supply chain to pricing, production planning, and even the credit terms it offered customers -- were central to the inflation and implosion of the great demand bubble (Goncalves 2002; Shi 2002).
There are no side effects -- only effects. Those we thought of in advance, the ones we like, we call the main, or intended, effects, and take credit for them. The ones we didn't anticipate, the ones that came around and bit us in the rear -- those are the "side effects". When we point to outside shocks and side effects to excuse the failure of our policies, we think we are describing a capricious and unpredictable reality. In fact, we are highlighting the limitations of our mental models. System dynamics helps us expand the boundaries of our mental models so that we become aware of and take responsibility for the feedbacks created by our decisions.
(Almost) nothing is exogenous
It is hard to overestimate the power of the feedback view. Indeed, almost nothing is exogenous. If you ask people to name processes that strongly affect human welfare but over which we have no control, many people name the weather, echoing Mark Twain's famous quip that "Everybody talks about the weather, but nobody does anything about it." But today even the weather is endogenous. We shape the weather around the globe, from global warming to urban heat islands, the Antarctic ozone hole to the "Asian brown cloud." For those who feel that global warming, ozone holes, and the brown cloud are too distant to worry about, consider this: Human influence over the weather is now so great that it extends even to the chance of rain on the weekend. Cerveny and Balling (1998) showed that there is a seven-day cycle in the concentration of aerosol pollutants around the eastern seaboard of the United States. Pollution from autos and industry builds up throughout the workweek, and dissipates over the weekend. They further show that the probability of tropical cyclones around the eastern seaboard also varies with a seven-day cycle. Since there are no natural seven-day cycles, they suggest that the weekly forcing by pollutant aerosols affects cloud formation and hence the probability of rain. Their data show that the chance of rain is highest on the weekend, while on average the nicest day is Monday, when few are free to enjoy the out of doors. Few people understand that driving that SUV to work helps spoil their weekend plans.
In similar fashion, we are unaware of the majority of the feedback effects of our actions. Instead, we see most of our experience as a kind of weather: something that happens to us but over which we have no control. Failure to recognize the feedbacks in which we are embedded, the way in which we shape the situation in which we find ourselves, leads to policy resistance as we persistently react to the symptoms of difficulty, intervening at low leverage points and triggering delayed and distant, but powerful feedbacks. The problem intensifies, and we react by pulling those same policy levers with renewed vigor, at the least wasting our talents and energy, and all too often, triggering an unrecognized vicious cycle that carries us farther and farther from our goals. Policy resistance breeds a sense of futility about our ability to make a difference, a creeping cynicism about the possibility of changing our world for the better. One of the main challenges in teaching system dynamics is helping people to see themselves as part of a larger system, one in which their actions feed back to shape the world in ways large and small, desired and undesired. The greater challenge is to do so in a way that empowers people rather than reinforcing the belief that we are helpless, mere leaves tossed uncontrollably by storm systems of inscrutable complexity and scope.
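Sterman's road-building example can be caricatured in a few lines of code. This is a toy model of my own construction, not Sterman's, and every number in it is an assumption: a policy of adding road capacity whenever congestion gets high is repeatedly defeated by induced demand, so decades of building leave congestion roughly where it began. The intervention addresses the symptom while the feedback loop (spare capacity attracts latent drivers) restores the problem.

```python
# Toy "policy resistance" model (hypothetical parameters throughout).
# Balancing loop: high congestion -> build roads -> congestion drops.
# Defeating loop: spare capacity -> induced demand -> congestion returns.

def simulate(years=50):
    capacity, traffic = 100.0, 90.0
    for _ in range(years):
        congestion = traffic / capacity
        if congestion > 0.85:            # the intervention: build roads
            capacity *= 1.05
        # Induced demand: latent drivers fill 10% of any spare capacity.
        traffic += 0.10 * (capacity - traffic)
    return capacity, traffic

capacity, traffic = simulate()
```

After fifty simulated years, capacity has more than doubled, yet congestion sits in the same band it started in: the policy lever was pulled again and again, and the system's response absorbed it each time. That is the pattern Sterman calls policy resistance, and it is why he argues for intervening on structure (here, the induced-demand loop) rather than on symptoms.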