Phil Schrodt, a very wise leader in the sub-field of quantitative political methodology, recently wrote a “suffer no fools and take no prisoners” review of the state of the sub-field (“Seven Deadly Sins of Contemporary Quantitative Political Analysis”). Needless to say, I work in a different sub-field, that of *qualitative* political methodology. But Schrodt’s critique of (my words) mathematically sophisticated studies erected on a scientific foundation of sand contains lessons both for those of us who focus on cognitive processes for thinking intelligently about the future and for the number-crunchers who have the luxury of using real data. Whether your analytical problem comes with hard data or not, your science (i.e., your cognitive rigor) needs to be valid.

Here, I wish to consider just the following point from Schrodt’s review:

> the political analyst typically confronts a situation where an assortment of equally plausible theories suggest several closely related (and therefore highly correlated) variables as possible causal factors.
>
> Linear models do not deal well with such situations…. Leave out a relevant variable–the omitted variable bias problem–and its explanatory power is reflected in whatever related variables happen to be in the equation.

Schrodt has, in these words, provided a very nice rationale for such non-linear methods as scenario analysis and complex adaptive systems. That is not to say, however, that these approaches are immune to the “omitted variable bias problem.”

Scenario analysis is, at bottom, an unscientific approach, one justified precisely by the recognition that we have little theoretical understanding of causality. Complexity theory is a perspective on how the world is organized that sits at a very high, domain-independent level: it tells us that our system is composed of interrelated, adapting parts and that counter-intuitive behavior at one level of organization can result from behavior at another, but it so far offers no real theory for predicting the specific behavior of a particular government, say, at a particular point in time. A third method, system dynamics, is technically a linear method (its smooth, exponential curves are generated by linear equations). Each of these methods helps us think about possible political futures, and each seems vulnerable to the danger of assuming that one has discovered “the explanation” simply because one has discovered one plausible explanation.

Let us start with the least rigorous approach, scenario analysis. In the standard approach, one identifies the two drivers that appear to be the most important causal variables and, on that basis, derives four scenarios. The exercise stimulates thinking but contains few guard posts against alternative causal patterns that might produce the same outcome: your scenario may be accurate, but for the wrong reasons. One defensive tactic would be to run a second scenario analysis with two different drivers and see whether the new pair yields a plausible (remember, this is scenario analysis, not science: there is no proof, only, at best, plausibility) scenario matching one of the scenarios generated by the initial pair of causal variables (drivers). It could be quite useful to discover that even if Drivers A and B are eliminated in order to prevent the occurrence of the much-feared Bad Scenario, Drivers C and D could still generate the same outcome!

The technical innovation to address the omitted variable bias problem within scenario analysis, then, is as follows:

1) do the standard scenario analysis, asking “what are the possible futures?”;

2) repeat with a new set of drivers, asking “how else might we get there?”;

3) analysis: you should expect that a new set of drivers will generate new scenarios (probably ridiculous ones, since you already selected the two most important drivers for the initial exercise), so if the second set of drivers also generates one or more of the same scenarios, you should feel very ill at ease!
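The three steps above can be sketched in code. This is a minimal, purely illustrative Python sketch under stated assumptions: the driver names, the high/low states, and the outcome judgments are hypothetical placeholders, since in real scenario analysis the analyst, not an algorithm, decides which quadrant maps to which future.

```python
from itertools import product

def scenarios(driver_a, driver_b, outcome):
    """Cross the high/low states of two drivers into four scenarios,
    labelling each quadrant with an analyst-supplied outcome judgment."""
    return {states: outcome(*states) for states in product(
        [f"{driver_a}: high", f"{driver_a}: low"],
        [f"{driver_b}: high", f"{driver_b}: low"])}

# Hypothetical outcome judgments for illustration only.
first = scenarios("Driver A", "Driver B",
                  lambda a, b: "Bad Scenario"
                  if "high" in a and "high" in b else "other future")
second = scenarios("Driver C", "Driver D",
                   lambda c, d: "Bad Scenario"
                   if "high" in c and "high" in d else "other future")

# Step 3: flag any outcome that BOTH driver pairs can generate --
# the warning sign that eliminating Drivers A and B alone will not
# prevent the feared outcome.
shared = set(first.values()) & set(second.values())
if "Bad Scenario" in shared:
    print("Two independent driver pairs reach the Bad Scenario: be ill at ease.")
```

The point of the sketch is only the final comparison: if the second, independently chosen pair of drivers reproduces a scenario from the first exercise, the analyst has evidence of an alternative causal pathway to the same future.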

For those who have looked carefully at the Scenario Evolution page of this site, it should be obvious that I advocate the integration of system dynamics into scenario analysis—to place the focus squarely on “how” rather than “what”—in part to address this possibility. But studying the dynamics that drive only one set of scenarios clearly falls short of guarding against the possibility that a completely independent set of drivers could generate the same outcome. CIESD’s innovative concept of a map of all drivers operating in a domain may well provide the best approach developed so far for prodding researchers to consider alternative explanations, because a visual map in front of your face showing the myriad pathways in a complex domain practically forces the question, “How many different roads to Future X exist?”

As for how all this relates to complex adaptive systems, that is a question for…future analysis.

Schrodt has offered a warning from quantitative, linear analysis that is equally useful for qualitative, non-linear analysis. Suggestions on how to minimize the danger of omitted variable bias when doing blue-sky scenario analysis, or when contemplating the probable evolution of, say, the U.S.-Turkish-Iranian-Israeli complex system (yes, dear physicists, this is a four-body problem), would make a needed contribution to putting these mental exercises on a scientific foundation.