Chapter 12. Visions and Challenges – Priorities for Professionals

Bias Against Probable Foresight

We introduced this critical topic in the section, Valuing Probabilistic Foresight in Chapter 1. We called it Underdetermination Bias in the list of common Emotional-Cognitive Biases in Chapter 2. We return to it now as one of the top insights we hope you gain from this Guide.

It is unfortunately common for some practitioners to have an emotional or cognitive bias against probable futures. They ignore all the predictors (predictive analytics, prediction markets, individual predictors), forecasters, trend extrapolators, scientists, developmentalists, risk managers, intelligence professionals, investors, and others seeking to discover, describe, and take advantage of what is most likely to happen next. Some even argue these folks aren’t part of our field, a denial strategy that only serves to reduce insight, method-sharing and professionalization of our practice.

From a Three Ps perspective, folks who hold this view understand foresight as only two thirds (possible and preferable) of what it truly is. Both their ability to effect change and the case they can make for the value of foresight are seriously weakened as a result. From an evo devo perspective, this view is even more damaging. In living systems, predictable developmental processes, few though they are, are just as important as unpredictable evolutionary ones in keeping the system adaptive. In many ways, each process acts in opposition to and balances the other.

We’ve all been guilty of believing, sometimes firmly, that most of what is relevant to our clients is intrinsically unpredictable. We can appreciate the advances of science and statistics to date, but we often don’t trust that the application of scientific and analytical methods, including modeling, data collection, hypothesis testing, and forecasting, will reveal important probable and actionable futures for our clients. We may be happy to propose options, consider alternatives, and help surface motivating goals, visions, and preferences, especially if such thinking styles fit our strengths. But if we have insufficient interest in helping clients see probable futures, this aspect of our practice suffers. Assessing the most probable future, and using any kind of quantitative process, often just seems too hard, especially if our training and culture have failed to educate us in such methods.

If you find yourself ignoring the probable future in your thinking, strategy, planning, and action, consider getting additional training in forecasting: basic qualitative or “judgmental” forecasting (surveys, interviews, Delphi with bias correction, etc.), basic quantitative forecasting (prediction via time series methods, basic modeling, etc.), or both. Alternatively, you might need more work on intelligence (assessing the product, service, or strategy a competitor is employing now and next) or risk management (quantifying and minimizing predictable threats), or simply a better understanding of the current (always imperfect) state of science and systems theory and thinking in the area in question.
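To make the quantitative side feel less daunting, here is a minimal sketch of one of the simplest time series methods, simple exponential smoothing, in Python. The function name, the smoothing constant, and the sample demand series are all illustrative, not from any real engagement:

```python
# Simple exponential smoothing: a basic quantitative forecasting method.
# The smoothed "level" blends each new observation with the prior estimate.

def ses_forecast(series, alpha=0.5):
    """Return a one-step-ahead forecast after smoothing the whole series.

    alpha near 1 weights recent data heavily; alpha near 0 changes slowly.
    """
    level = series[0]  # initialize the level with the first observation
    for y in series[1:]:
        level = alpha * y + (1 - alpha) * level  # blend new data into the level
    return level

demand = [100, 104, 101, 110, 108, 115]  # hypothetical monthly demand figures
print(round(ses_forecast(demand, alpha=0.5), 2))  # → 110.94
```

Even a toy method like this forces an explicit, checkable probable-futures claim, which is exactly the habit this kind of training builds.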

In some aspects of social systems, like accelerating technology or demographic shifts, predictions can be so easy to make that we ignore them, the way US business and government ignored the predictable needs of baby boomers until shortage crises occurred. Shortages of hospitals, housing, schools, and more all happened in predictable succession and were ignored in most US cities until the crises hit. Many people currently ignore the implications of accelerating digital technologies for their jobs and industries. Fortunately, our prediction failures often fall into common biases and patterns, and the better we understand those, the better we can correct for them, gaining great economic and competitive value as a result.

We’ve mentioned Philip Tetlock’s Superforecasting (2015) as a great place to start to convince yourself of the value of quantitative prediction. Another leader in evidence-based anticipatory foresight is Scott Armstrong. We need many more scientists, systems theorists, statisticians, modelers, and programmers in our profession to improve the quantitative quality of our STEEPS foresight. Online platforms that help us annually review our predictions and probability statements, and that incorporate accelerating levels of analytics and machine intelligence, are going to be key tools for professional development in coming years.
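An annual review of probability statements needs a scoring rule. One standard choice, used in the forecasting tournaments Tetlock describes, is the Brier score. A hedged sketch in Python follows; the year-end review data is hypothetical:

```python
# Brier score: mean squared error between stated probabilities and outcomes.
# Lower is better: 0.0 is perfect, and a constant 50% forecaster scores 0.25.

def brier_score(forecasts):
    """forecasts: list of (stated probability, outcome) pairs, outcome 1 or 0."""
    return sum((p - outcome) ** 2 for p, outcome in forecasts) / len(forecasts)

# Hypothetical year-end review of five predictions made twelve months ago.
review = [(0.9, 1), (0.7, 1), (0.6, 0), (0.2, 0), (0.8, 1)]
print(round(brier_score(review), 3))  # → 0.108
```

Tracking a score like this year over year turns vague self-assessment into a trend line a practitioner can actually improve against.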

In the evo devo (Three Ps) foresight model, training in probable futures should represent at least a third (along with possible and preferable futures) of our modern foresight degrees and certificate programs. We’ve mentioned specialty associations for Forecasting (the International Institute of Forecasters, which conducts annual forecasting methods workshops), Intelligence (SCIP), and Risk Management (RIMS), all of which can help us improve our treatment of probable futures. Pick one of these and start learning. Get comfortable with numbers and probability estimates, and use them whenever you can in your client engagements.