Source: Continuous Improvement Associates
http://www.exponentialimprovement.com/cms/command.shtml

Libertarian Objections
Command and Control
By Bob Powell, 9/4/06

The conservative view encapsulated:

If our world indeed is ordered in accordance with a divine idea, we ought to be cautious in our tinkering with the structure of society; for though it may be God's will that we serve as his instruments of alteration, we need first to satisfy our consciences on that point. Again, Burke states that a universal equality among men exists; but it is the equality of Christianity, moral equality, or, more precisely, equality in the ultimate judgment of God; equality of any other sort we are foolish, even impious, to covet.

The Conservative Mind from Burke to Eliot
by Russell Kirk, 1953, p. 34

This "don't tinker with it" mentality has led conservatives and libertarians to this false belief: "You're better off to just leave it alone. Let the 'free market' take care of it, because 'government' will just screw things up!"

One important point is that the natural structure of "path dependence" (also known as the "Success to the Successful" archetype in systems thinking) leads to inequalities in the distribution of wealth, even when everyone starts out equal and with equal abilities (see Wealth Happens). In a way then, it's not surprising that conservatives would see this as the "hand of God" in action.
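The "Success to the Successful" dynamic is easy to demonstrate with a toy model (a sketch of my own, not code from the cited "Wealth Happens" article; the function names `gini` and `simulate` are illustrative): when each increment of new wealth is awarded with probability proportional to wealth already held, substantial inequality emerges even though every agent starts out identical.

```python
import random

def gini(wealth):
    """Gini coefficient: 0 = perfect equality, approaching 1 = one agent holds all."""
    w = sorted(wealth)
    n = len(w)
    total = sum(w)
    # Standard discrete formula using cumulative weighted sum
    cum = sum((i + 1) * x for i, x in enumerate(w))
    return (2 * cum) / (n * total) - (n + 1) / n

def simulate(agents=100, rounds=5000, seed=42):
    """Everyone starts with equal wealth; each round, one unit of new wealth
    is awarded with probability proportional to wealth already held."""
    rng = random.Random(seed)
    wealth = [1.0] * agents
    for _ in range(rounds):
        winner = rng.choices(range(agents), weights=wealth)[0]
        wealth[winner] += 1.0
    return wealth

if __name__ == "__main__":
    final = simulate()
    print(f"Gini at start: 0.000; Gini after 5000 rounds: {gini(final):.3f}")
```

No agent is smarter or more virtuous than any other; the skew comes purely from the reinforcing feedback of chance early gains.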

Conservatives praise inequality in the distribution of wealth and embrace social Darwinism. Here's a stark example from Wealth by Andrew Carnegie, June 1889:

The price which society pays for the law of competition, like the price it pays for cheap comforts and luxuries, is also great; but the advantages of this law are also greater still, for it is to this law that we owe our wonderful material development, which brings improved conditions in its train. But, whether the law be benign or not, we must say of it, as we say of the change in the conditions of men to which we have referred: It is here; we cannot evade it; no substitutes for it have been found; and while the law may be sometimes hard for the individual, it is best for the race, because it insures the survival of the fittest in every department. We accept and welcome therefore, as conditions to which we must accommodate ourselves, great inequality of environment, the concentration of business, industrial and commercial, in the hands of a few, and the law of competition between these, as being not only beneficial, but essential for the future progress of the race.

But here's a story, a Gardening Analogy, conveying another view:

A man was walking by a church parish one day and saw a priest tending the garden beside it. He remarked to the priest what a beautiful garden he and God had created. The priest stood back and looked around, saying, "You know, you're right; it is beautiful. But you should have seen it when God had it all to Himself."

Find versions of this story on the internet.
Google "you should have seen it when God"

Conservatives and libertarians don't believe that man has the ability to use reason to design social systems that ameliorate social conditions. They oppose attempts to do so as "social engineering." Anyone who seeks to improve social conditions faces the charge of attempting "command and control" -- a phrase used to raise the specter of the failed "command and control" Soviet economy.

So the conservative viewpoint is "don't mess with it." If you do, it will just provoke "unintended consequences." Somehow they manage to ignore that large corporations, which they praise and allow to get larger and larger, are "command and control" dictatorships.

They would oppose instituting policies to improve social conditions even if they thought the policies would succeed, because they do not believe that improving conditions for all citizens is a good idea.

Social and political equality, he declared, do not fall within the category of the real rights of man; on the contrary, hierarchy and aristocracy are the natural, the original, framework of human life; if we modify their influence, it is from prudence and convention, not in obedience to "natural right." These are the postulates for his praise of natural aristocracy and his condemnation of leveling.

The Conservative Mind from Burke to Eliot
by Russell Kirk, 1953, p. 58

Russell Kirk explained in The Conservative Mind that conservatives believe poverty is part of the human condition, and that even to try to eliminate it is impious. That's because "the human heart, in reality, is the fountain of evil" and it's wrong to "think that established institutions must be the source of our afflictions," and in any case "Reason ... [is] a tool weak at best [and] ... frequently treacherous" for figuring out what to do.

Though flawed, these arguments are powerful because there's actually good reason to be concerned about the ability of humans to design effective policies. Throughout human history, men have found that policies very often, perhaps most often, do not produce the desired results. The reason for this is known in systems thinking as "policy resistance" -- a result of living in a world of high dynamic complexity, which is fraught with what are called "side effects" and "unintended consequences" (some conservatives even think they came up with the concept ... see more on this below).

John Sterman, in Learning in and about complex systems, reflects on this very real and daunting challenge in the section on

Misperceptions of Feedback

Effective management is difficult in a world of high dynamic complexity. Our decisions may create unanticipated side effects and delayed consequences. Our attempts to stabilize the system may destabilize it. Our decisions may provoke reactions by other agents seeking to restore the balance we upset. Our decisions may move the system into a new regime of behavior where unexpected and unfamiliar dynamics arise because the dominant feedback loops have changed. Forrester (1971) calls such phenomena the "counter-intuitive behavior of social systems." It often leads to "policy resistance," the tendency for interventions to be delayed, diluted or defeated by the response of the system to the intervention itself (Meadows 1982). No less an organizational theorist than Machiavelli discussed policy resistance at length, observing in The Discourses (1979, 240 - 241):

    "... when a problem arises either from within a republic or outside it, one brought about either by internal or external reasons, one that has become so great that it begins to make everyone afraid, the safest policy is to delay dealing with it rather than trying to do away with it, because those who try to do away with it almost always increase its strength and accelerate the harm which they feared might come from it."
    Machiavelli, The Discourses, 1519
    quoted in John Sterman, "Learning in and about Complex Systems"
    System Dynamics Review 10, No 2 - 3, Summer-Fall 1994

Recent experimental studies confirm these observations. Human performance in complex dynamic environments is poor relative to normative standards, and even compared to simple decision rules.

    • Subjects, including experienced managers, in a simple production-distribution system (the Beer Distribution Game) generate costly fluctuations, even when consumer demand is constant. Average costs were more than ten times greater than optimal (Sterman 1989b).
    • Subjects responsible for capital investment in a simple multiplier-accelerator model of the economy generate large amplitude cycles even though consumer demand is constant. Average costs were more than thirty times greater than optimal (Sterman 1989a).
    • In experimental asset markets, prices rise above fundamental value, then plummet when a 'greater fool' can no longer be found to buy. These speculative bubbles do not disappear when the participants are investment professionals, when monetary incentives are provided, or when short-selling is allowed (Smith, Suchanek, and Williams 1988).
    • In a forest fire simulation, many people allow their headquarters to burn down despite their best efforts to put out the fire (Brehmer 1989).
    • In a medical setting, subjects playing the role of doctors order more tests while the (simulated) patients sicken and die (Kleinmuntz and Thomas 1987).

... These studies led me to suggest that the observed dysfunction in dynamically complex settings arises from 'misperceptions of feedback'. I argued that the mental models people use to guide their decisions are dynamically deficient. Specifically, people generally adopt an event-based, 'open-loop' view of causality, ignore feedback processes, fail to appreciate time delays between action and response and in the reporting of information, do not understand stocks and flows, and are insensitive to nonlinearities that may alter the strengths of different feedback loops as a system evolves. ...

Though subjects improved with experience, they learned little: Subjects accumulated fifty years of simulated experience in an environment with perfect, immediate outcome feedback. Yet in the last trial the naive strategy still outperformed 83% of the subjects. Most important, subjects did not learn how to improve their performance in the dynamically complex conditions. Even in the last trial, the stronger the feedback complexity of the environment, the lower profits were relative to potential. The degradation of performance relative to potential caused by high feedback complexity is not moderated by experience. Estimation of subject decision rules showed subjects actually became less responsive to critical variables and more vulnerable to forecasting errors -- their learning hurt their ability to perform well in the complex conditions. ...

The robustness of the misperceptions of feedback and the poor performance they lead us to create across many domains are due to two basic and related deficiencies in our mental models of complexity. First, our cognitive maps of the causal structure of systems are vastly simplified compared to the complexity of the systems themselves. Second, we are unable to infer correctly the dynamics of all but the simplest causal maps. Both are direct consequences of bounded rationality (Simon 1979, 1982); that is, the many limitations of attention, memory, recall, information processing, and time that constrain human decision making.
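The 'open-loop' misperception Sterman describes can be illustrated with a toy stock-management simulation (my own sketch, not code from his paper; `manage_stock` and its parameters are illustrative): a manager who ignores the supply line of orders already placed overshoots badly and oscillates, while a manager who accounts for it settles smoothly onto the target.

```python
def manage_stock(correct_for_supply_line, delay=4, steps=80):
    """Inventory manager: orders arrive after `delay` periods; stock drains
    at a constant rate. The classic misperception of feedback is to ignore
    the supply line of orders already placed but not yet arrived."""
    target, loss = 100.0, 10.0
    stock = 50.0
    pipeline = [0.0] * delay              # orders in transit
    desired_supply_line = loss * delay    # enough in transit to cover drainage
    history = []
    for _ in range(steps):
        gap = target - stock
        if correct_for_supply_line:
            # Closed-loop view: adjust for what is already on order
            order = max(0.0, loss + gap + (desired_supply_line - sum(pipeline)))
        else:
            # Open-loop, event-based view: react to the visible gap only
            order = max(0.0, loss + gap)
        pipeline.append(order)
        stock += pipeline.pop(0) - loss   # oldest order arrives; demand drains
        history.append(stock)
    return history
```

Running both variants shows the corrected policy reaching the target of 100 and staying there, while the naive policy piles up orders during the delay and overshoots to several times the target before swinging back -- a miniature version of the costly fluctuations in the Beer Distribution Game.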

So it's true that many attempts to design effective policies have failed. But just because that's been the case doesn't mean it must continue to be the case (see the "is-ought problem" below). And just because Machiavelli and conservatives have concluded that it's better to leave things alone and not interfere doesn't mean that systems thinking cannot give us an approach to understand system behavior and determine what to do to improve system functioning -- to change or stabilize behavior.

As with any endeavor, we may not get it right the first time, but the systems approach applies the scientific method, iterating between theory and data collection (experiment), to guide the development of appropriate policies and actions.

As John Sterman writes in All models are wrong: reflections on becoming a systems scientist

The ... challenge is to ... [help people to see themselves as part of a larger system, one in which their actions feed back to shape the world in ways large and small, desired and undesired] in a way that empowers people rather than reinforcing the belief that we are helpless, mere leaves tossed uncontrollably by storm systems of inscrutable complexity and scope.

The "is-ought problem" or the "naturalistic fallacy." 

One flaw in Kirk's, Burke's, and conservatives' reverence for tradition is the "is-ought problem," also known as the "naturalistic fallacy."

The question of whether it is possible to derive a moral value purely from a statement of fact. ... In [David] Hume's view, the facts of a case do not in themselves dictate a conclusion about the way things ought to be; that step requires the interjection of an opinion about the facts. ... Around this problem has grown a major philosophical debate: can we derive values from facts? Hume's position was that we cannot; indeed, he stated the problem as a part of his argument that morality arises not from REASON but from an innate "moral sense."

From A World of Ideas -- A Dictionary of Important Theories, Concepts, Beliefs, and Thinkers by Chris Rohmann, 1999

In other words, the fact that things are a certain way by tradition and history does not imply they should remain that way.

For example, the natural state of a plot of land may be overgrown with weeds, but a farmer is not compelled to keep it that way. The farmer can cultivate the soil and nurture the conditions to grow crops. One can argue that physicians do not heal; they "set the bones" to create the conditions that allow the body to heal itself.

The systems approach is analogous to what the farmer does in a garden. For reinforcing processes we provide sun, water, and food. For balancing processes that limit growth or needed change, we pull the weeds. If balancing feedbacks are needed to stabilize the system, we foster their ability to do so. This organic perspective is the systems view of leadership: designing the system to produce a desired future, not reacting to a predicted set of external events and influences. This view of leadership is "designing the plane to fly stably under turbulent conditions."
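The interplay of reinforcing and balancing loops in the gardening analogy can be sketched in a minimal simulation (an illustrative toy of mine; the function and parameter names are not from the article): growth proportional to the stock is the reinforcing loop, and crowding as the stock approaches its limit is the balancing loop, yielding the familiar S-shaped logistic pattern.

```python
def grow(steps=60, growth_rate=0.1, capacity=100.0):
    """Reinforcing loop (growth proportional to the stock) checked by a
    balancing loop (crowding as the stock nears capacity). 'Gardening' the
    system means strengthening or weakening these loops, e.g. raising the
    capacity by pulling weeds."""
    stock = 1.0
    history = []
    for _ in range(steps):
        reinforcing = growth_rate * stock                     # sun, water, food
        balancing = growth_rate * stock * (stock / capacity)  # crowding, weeds
        stock += reinforcing - balancing
        history.append(stock)
    return history
```

The stock rises steadily but never exceeds the capacity; intervening on the loops (changing `growth_rate` or `capacity`) changes the trajectory without any step-by-step "command and control" of the stock itself.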

So just because government policies have failed in the past does not mean they must always fail. In fact, in many cases not only does the lack of government intervention lead to less than optimal results, it leads the system to fail completely (see The Trade Deficit and the Fallacy of Composition for examples).

This organic view, designing feedback structures to produce a desired future, is similar to, but more appropriate than, the mechanistic "tangible mechanisms" analogy used by Collins & Porras in Built to Last: Successful Habits of Visionary Companies: "The single most important point ... is the critical importance of creating tangible mechanisms aligned to preserve the core and stimulate progress. This is the essence of clock building [as opposed to 'telling the time']." [Note: Flaws in the analogy aside, this is an excellent book; I learned a lot from it.]

So using systems thinking to design the system is not "command and control" any more than is gardening or farming. It's true that one cannot employ "command and control" in complex social systems; that's the reason many corporations, which are "command and control" dictatorships, fail much sooner than they should; their operational view of themselves is mechanistic, rather than organic (see The Living Company by Arie De Geus, 1997).

So the idea is to "garden," not use "command and control" ... that is, foster the system's ability to produce desired outcomes -- to herd it in the right direction.

 

For more on unintended consequences read the article on Jay Forrester from "strategy+business" magazine:
"The Prophet of Unintended Consequences" by Lawrence M. Fisher; it even describes an example that has offended liberals. To get access there, one must register with email address, a password, and name. It's available without that here on this site.

Conservatives are incapable of seeing the unintended consequences of what they do. Examples:

  • Conservative policies of inadequate regulation and taxation of growth create pressure for tax increases ... blamed on liberals. See the Growth Facts of Life.
  • Conservative Federal Reserve policies causing vast unemployment and underemployment and a need for welfare and minimum wage ... blamed on liberals. See There's no 'free market' for Labor.
  • Conservatives starve education funding, not realizing that capitalism systematically underinvests in education and failing to appreciate the need to take advantage of positive externalities ... and then blame the failures on liberals. Described in Explaining Liberal Principles.

It's vitally important to avoid short-term, myopic thinking. From the Counterintuitive Behavior of Social Systems (pdf) by Jay Forrester:

  • There are no utopias in social systems.
  • ... social systems exhibit a conflict between short-term and long-term consequences of a policy change.
  • A policy that produces improvement in the short run is usually one that degrades a system in the long run. Likewise, policies that produce long-run improvement may initially depress behavior of a system.
  • This is especially treacherous. The short run is more visible and more compelling. Short-run pressures speak loudly for immediate attention.
  • However, sequences of actions all aimed at short-run improvement can eventually burden a system with long-run depressants so severe that even heroic short-run measures no longer suffice.
  • Many problems being faced today are the cumulative result of short-run measures taken in prior decades.

The result of this is addictive behavior, which is rampant in our society due to conservative dominance. As examples, see the papers on Addiction and The Crisis Syndrome on drug addiction and organizational addiction to the "quick fix," Addiction to Prisons, and the Growth Facts of Life that describes addiction to growth.

© 2003 Continuous Improvement Associates
