Objection:
"... the bottom line is that just about whenever the Government sticks their nose into any financial area, they do far more damage then they help."
Response:
Libertarians object to government in general, and especially to its interference in economic activity. The fundamental libertarian view is that when government does something, it's always harmful. While it's true that government policies are often wrong, it's also true that corporate policies are wrong just as often.
Thinking that "policies that work" can't be designed is so 16th century ... as illustrated by this quote:
"When a problem arises either from
within a republic or outside it,
one brought about either by
internal or external reasons,
one that has become so great that
it begins to make everyone afraid,
the safest policy is to delay dealing with it
rather than trying to do away with it,
because those who try to do away with it
almost always increase its strength and accelerate
the harm which they feared might come from it."
Niccolo Machiavelli, The Discourses, 1519 ... quoted in
Sterman, John, "Learning in and about Complex Systems", System Dynamics Review 10, No 2 - 3, Summer-Fall 1994,
and in
Sterman, John, Business Dynamics: Systems Thinking and Modeling for a Complex World, Irwin/McGraw-Hill, 2000, p. 8.
Professor Sterman read this quote in his keynote speech at the 1994 "Systems Thinking in Action" conference. Here is a portion of what he said as he continued:
Now, I don't agree with Machiavelli's policy prescription, which is "Don't touch it with a ten-foot pole," but unfortunately his observation about what happens when people do intervene is correct. And a lot of research has recently shown just how deeply embedded these problems are. How many people here have played the "beer distribution game?" Great. Most people.
Well you remember in that game that what happens is, even though it's a very simple system, performance is extremely poor. In fact average costs are 10 times worse than the potential. Ten times, not 10%, ten times worse. And people understand that very well after they play. They're angry; they're upset; they blame their colleagues. Well that system is far, far simpler than any of the real production-distribution systems we're dealing with in real life. The only difference is, in real life outside the learning laboratory, we don't know how well you could have done and so the potential for that embarrassment is not usually there.
Other experiments confirm the same results. I created a management flight simulator called B&B Enterprises, for Boom and Bust. It simulates the introduction and management of a new durable good into a competitive market. We find that our students at MIT, as well as experienced managers such as yourself, playing this game usually go bankrupt, even after they've been given the opportunity to generate about 50 years' worth of personal experience in an environment that's far simpler and far easier to deal with than the real world.
Other people's research has shown the same thing. My colleague Vernon Smith in Arizona has done experiments with simulated stock markets. He gives people simulated assets that they buy and sell just like on the New York Stock Exchange. What he finds is that people consistently generate speculative bubbles. They bid the prices of these assets way above their fundamental value, buying not because of that value but because they expect some greater fool to buy from them later in the expectation that prices will continue to rise. Then finally buyers become reluctant, and the whole thing collapses and there's a panic in their market. Now this behavior doesn't disappear when he uses investment professionals. It doesn't disappear when he pays people significant real money in proportion to their profits in the game. We found the same thing. ...
My colleague Don Kleinmuntz of Illinois designed a medical simulation. He had subjects playing the role of doctors, and they could either test the patients to find out what's wrong with them, treat them, or do nothing. Just like real doctors, they had a range of options there. What he found is that most people would test and test and test and test, trying to figure out what was wrong with these poor patients, and meanwhile the patients got sicker and sicker and finally died.
So what's going on here, in study after study after study, is that people, even when placed in situations that are quite simple compared to real life, just don't have the cognitive wherewithal to understand that dynamic complexity, to interpret what it means, and then make good decisions.
There are two fundamental reasons for this. And both need to be overcome if we're going to learn effectively.
The first is that our mental maps are flawed. Bob Axelrod at the University of Michigan studied the mental maps of political leaders. ... What he found is that there are no feedback loops in their mental models. They view the world as a sequence of events. If I do this, this other leader will do this; if I do this, this other leader will do that. Events causing events. No feedback loops.
Well, a fundamental principle of system dynamics is that the structure of the system is what gives rise to its dynamic behavior. This is a basic lesson of the beer distribution game. Get away from the idea that it's events causing events to see those underlying patterns and the structures that create them.
It's worse than that. What happens is that people tend to blame whatever problems there are on the other people in the system. This also emerges in the beer distribution game; the first thing people do in that game is point the finger at other people ... blame other people. This is so pervasive that psychologists have come to call it the "fundamental attribution error." That is, blaming our problems on dispositional rather than situational features of the environment. ...
When we attribute our problems to individuals and special circumstances rather than to the system structure, we systematically divert our attention from the high leverage points in the system. There's no potential for any significant learning. And worse, we then grow cynical about the intentions and the capabilities of our fellow people.
That's not the only problem, though, with our mental models. There's an even more serious set of issues having to do with the poor quality of our inquiry skills. We don't have good abilities as intuitive scientists. That is, we don't bring the basic principles of scientific reasoning to our everyday reasoning tasks.
He goes on to describe work such as that by Kahneman and Tversky on what's called "anchoring and adjustment," in which decisions are anchored on irrelevant information ... Kahneman won a Nobel Prize after Tversky died.
Among the failures to think scientifically, the one most inimical to learning is what's called the "confirmation bias." This is the tendency to seek evidence that's consistent with our prior beliefs, instead of testing to see whether those beliefs could possibly be wrong or in need of improvement.
He also talks about Wade Boggs eating chicken before games because it seemed to be working for him. And he talks about Dr. Benjamin Rush, who defended bloodletting by rationalizing that those who survived did so because of the efficacy of the treatment, while those who died were simply too sick to be saved. Rush believed in it so strongly that he himself died from bloodletting he prescribed for himself.
The other fundamental reason he addresses for why we can't deal with dynamic complexity and make good decisions is that we can't run accurate mental simulations in our heads. More on this is in the paper on Languages, Brains, & Skills, in the section on "A Brain," which explains that humans have not evolved to successfully run mental simulations of systems with multiple feedbacks and long delays.
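To make "feedbacks and long delays" concrete, here is a minimal sketch of the kind of structure the beer distribution game contains: a single ordering loop with a shipment delay. It is illustrative only, not taken from Sterman's talk or paper, and the parameter values and the naive ordering rule are assumptions chosen for the example.

    # Illustrative sketch (not from Sterman): one ordering loop with a
    # shipment delay, the structure behind the beer game's oscillations.
    # The naive rule ignores orders already in the pipeline, the same
    # mistake people make when simulating such a system in their heads.
    DELAY = 4        # periods between placing an order and receiving it
    TARGET = 100     # desired inventory level

    inventory = 100
    pipeline = [10] * DELAY              # orders already in transit

    for week in range(30):
        demand = 10 if week < 5 else 15  # small step in demand at week 5
        arriving = pipeline.pop(0)       # oldest order finally arrives
        inventory += arriving - demand   # receive goods, ship to customers
        # Naive rule: replace demand and close the inventory gap,
        # ignoring the supply line of orders not yet delivered.
        order = max(0, demand + (TARGET - inventory))
        pipeline.append(order)
        print(f"week {week:2d}  inventory {inventory:4d}  order {order:3d}")

Running this, a modest step in demand sends inventory and orders into large swings that persist long after the step, the same overshoot-and-collapse pattern players produce in the beer game. Nothing in the system is hidden, yet almost no one can predict its trajectory in their head.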
So the belief that government must always "do far more damage then they help" is based on 16th century thinking that is unaware of advances in our ability to understand how the world works and to design policies that improve the health and wealth of our society.
See also comments at Command and Control.