Chapter 6. Methods and Frameworks – Building Adaptive Foresight Skills

2. Anticipation (Convergent thinking)

Key Practice Specialties and Communities for Adaptive Foresight


Key Anticipation-Associated Practitioner Methods

Actuarial Science
Risk data collection, reference class formation, and other methods of quantitative risk assessment.

Analytic Hierarchy Process
Use of hierarchical mapping and pairwise comparisons for quantitative decision-making, modeling, and forecasting.

Bias Identification and Bias Mitigation
Finding cultural and cognitive biases in the foresight environment, and exercises to mitigate them.

Causal Modeling, Systems Analysis, and Simulation
Representing system actors and behaviors in causal or computer models (e.g., agent-based models).

Delphi Method
Classic method for seeking convergence from groups via successive cycles of opinion and feedback.

Developmental Foresight
Anticipating optimal, convergent, irreversible trends and emergences, at multiple systems levels.

Discontinuity and Wildcard Anticipation
Finding key trend reversals/discontinuities and low probability, high impact (positive or negative) events.

Evolutionary Foresight
Identifying processes of creative, divergent, unpredictable change, at multiple systems levels.

Forecast Value Added (FVA) Analysis
Predictive evaluation relative to the null hypothesis, to see if the team's forecast truly beats a naive model.
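As a minimal illustration (all numbers invented), FVA can be computed by comparing the team's forecast error against a naive "last value" forecast:

```python
import numpy as np

# Monthly demand actuals and the team's forecasts (illustrative numbers).
actuals   = np.array([100, 104, 98, 110, 107, 115], dtype=float)
team_fcst = np.array([102, 103, 101, 108, 109, 113], dtype=float)

# Naive model: forecast each month as the previous month's actual.
naive_fcst = np.concatenate(([actuals[0]], actuals[:-1]))

def mape(forecast, actual):
    """Mean absolute percentage error."""
    return np.mean(np.abs((forecast - actual) / actual)) * 100

# FVA = naive error minus team error; positive means the team adds value.
fva = mape(naive_fcst, actuals) - mape(team_fcst, actuals)
print(f"FVA: {fva:.2f} percentage points")
```

If FVA is negative, the team's process is destroying value relative to doing nothing at all.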

Foresight Workshops
Facilitative and normative methods used in groups to generate desirable future states for the firm.

Genius Forecasting (Genius Visioning)
Gifted and respected experts are asked for predictions or aspirational visions, often outside their fields.

Intellectual Property Strategy
Defensive or offensive techniques to create or protect a firm’s intellectual property.

Learning Curves
Modeling exponential, power-law, S-curve, U-curve & experience curves, while seeking discontinuities.
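A sketch of S-curve fitting in Python, using SciPy's curve_fit on synthetic adoption data (the logistic form and all parameters here are illustrative, not from the text):

```python
import numpy as np
from scipy.optimize import curve_fit

def logistic(t, L, k, t0):
    """S-curve: saturation level L, growth rate k, midpoint t0."""
    return L / (1 + np.exp(-k * (t - t0)))

# Synthetic adoption data: a noisy S-curve (illustrative).
rng = np.random.default_rng(0)
t = np.arange(0, 20)
y = logistic(t, L=100, k=0.8, t0=10) + rng.normal(0, 2, t.size)

# Fit the curve; p0 gives rough starting guesses for the optimizer.
(L, k, t0), _ = curve_fit(logistic, t, y, p0=[y.max(), 0.5, t.mean()])
print(f"Estimated saturation level: {L:.1f}, midpoint: {t0:.1f}")
```

Discontinuity seeking then amounts to watching for systematic residuals between the fitted curve and new data.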

Prediction Analysis
Examining past predictions and assessing their methods, bias, accuracy, and utility (benefit to cost).

Predictive Analytics
Techniques from statistics, modeling, data mining, and machine learning to make quantitative predictions.

Prediction Markets and Prediction Platforms
Markets and platforms for making predictions and finding the best predictors by subject area.

Psychological Trait Assessment (Personality Typing)
Diagnostic models for psychological traits with predictive value (OCEAN, StrengthsFinder, MBTI, DISC, etc.).

Reference Class Forecasting
Quantitative method of predicting the future by comparing to similar past outcomes (a reference class).
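A minimal Python sketch of the idea, using invented overrun data and the practice of budgeting at a chosen percentile of the reference class:

```python
import numpy as np

# Cost overruns (% over budget) observed in a reference class of
# similar past projects (illustrative numbers).
reference_class = np.array([5, 12, 18, 25, 30, 34, 41, 55, 62, 80], dtype=float)

# To be ~80% confident of staying within budget, plan for the
# 80th-percentile overrun seen in the reference class.
uplift_p80 = np.percentile(reference_class, 80)

base_estimate = 10.0  # the project's own "inside view" estimate, in $M
adjusted = base_estimate * (1 + uplift_p80 / 100)
print(f"80th-percentile uplift: {uplift_p80:.1f}% -> budget ${adjusted:.2f}M")
```

The outside view (the reference class) corrects the optimism of the inside view (the project's own estimate).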

Retrodiction (Hindcasting)
Predicting a past event with your forecasting model, then seeking evidence for it. Good validation tool.

Resiliency Analysis and Resilient Control Systems
Infrastructure, policies and strategies to make a system resilient to damage. (Or better yet, to benefit from damage – see Antifragile, Taleb, 2014)

Risk Avoidance, Risk Reduction and Risk Insurance Analysis
Risk prioritization, risk avoidance, reduction, and acceptance/insurance options and plans.

Risk Models and Risk Prediction
Building statistical models of risk occurrence, making them causally predictive.

Statistical Models
Probabilistic relationships between variables in math models, e.g. Demographic & Econometric models.

Trend Extrapolation and Regression Analysis
Acquisition and projection of historical time-series data as a forecast, subject to error and uncertainty.
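A minimal sketch in Python with invented revenue figures: fit a linear trend by least squares, project it forward, and use the residual spread as a crude error band:

```python
import numpy as np

# Historical time series: annual revenue in $M (illustrative numbers).
years = np.arange(2015, 2025)
revenue = np.array([3.1, 3.4, 3.9, 4.1, 4.6, 5.0, 5.2, 5.8, 6.1, 6.5])

# Ordinary least-squares linear trend, then project two years ahead.
slope, intercept = np.polyfit(years, revenue, deg=1)
forecast_2026 = slope * 2026 + intercept

# Residual standard deviation gives a crude error band on the projection.
residuals = revenue - (slope * years + intercept)
band = residuals.std(ddof=2)
print(f"2026 forecast: {forecast_2026:.2f} +/- {band:.2f}")
```

The error band widens in reality the further out you project; this constant band is the simplest possible treatment of that uncertainty.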

Vulnerability Assessment
Qualitative risk assessment regarding potential accidents, crime, lawsuits, and other adverse events.

Wargaming
Strategy games that deal with threat and security operations of various types, real or fictional.

  • Alex Teselkin

    Time to bombard our readers with math! An example of an anticipation analysis workflow:


    Getting rid of useless data — Filtering, Noise Reduction
    – Fourier Transformation: Converts time-series to frequency domain and vice versa
    – High-pass filter: high frequencies get a pass; low frequencies (including steady and non-periodic signals) are filtered out
    – Low-pass filter: low frequencies get a pass; high frequencies are filtered out
    – Binary masking/gating: take a noise sample, then zero the frequency bins whose energy is below that of the noise profile
    – Averaging: removes random noise and leaves the steady signal
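    The filtering steps above can be sketched in Python (the contributor suggests MATLAB; NumPy is used here, with an illustrative 2 Hz signal and 60 Hz noise):

```python
import numpy as np

# One second of data at 500 Hz: a slow signal plus fast noise (illustrative).
t = np.arange(500) / 500.0
signal = np.sin(2 * np.pi * 2 * t)        # 2 Hz component we want to keep
noise = 0.5 * np.sin(2 * np.pi * 60 * t)  # 60 Hz component to remove

# Fourier transform, zero the bins above a cutoff, transform back.
spectrum = np.fft.rfft(signal + noise)
freqs = np.fft.rfftfreq(t.size, d=t[1] - t[0])
spectrum[freqs > 10] = 0                  # low-pass filter at 10 Hz
filtered = np.fft.irfft(spectrum, n=t.size)
```

A high-pass filter is the same operation with the inequality reversed (`spectrum[freqs < cutoff] = 0`).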

    Making sense of the rest of the data — Representations (graphs), Periodicity, Statistics
    – Time-frequency trade-off: the uncertainty principle (finer time resolution means coarser frequency resolution)
    – Welch Periodogram: Fourier transformation window by window with overlaps
    – Spectrogram: Fourier transformation with time AND frequency information
    – Wavelet Transformation: adaptive representation of frequency domain signals
    – Cross-correlation: a way to measure the correlation between two signals, a “likelihood” of match
    – Auto-correlation: correlation of a signal with lagged copies of itself, hence periodicity
    – Zero crossings: count the number of zero crossings after normalizing the data set
    – Logarithmic representations: log-x, log-y, and log-log plots
    – Distributions: Uniform distribution – Binomial distribution – Normal distribution – Poisson distribution
    – Error ranges of distributions and what they mean
    – Significance tests: t-test, z-test, etc.
    – Relations: derivatives and integrals, i.e., differences and sums over time
    – Stochastic processes: modelling the unknown; how to deal with data whose intricate rules humans cannot fully identify, how to set an appropriate time window, how to give appropriate estimates, etc.
    – Numerical methods: Iterative calculations
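    One of the tools above, autocorrelation for periodicity detection, can be sketched in Python with NumPy on an invented noisy periodic signal:

```python
import numpy as np

# Noisy signal with a hidden 25-sample period (illustrative).
rng = np.random.default_rng(1)
n, period = 500, 25
x = np.sin(2 * np.pi * np.arange(n) / period) + rng.normal(0, 0.4, n)

# Autocorrelation: correlate the de-meaned signal with lagged copies of itself.
x = x - x.mean()
acf = np.correlate(x, x, mode="full")[n - 1:]   # lags 0..n-1
acf /= acf[0]                                   # normalize so acf[0] == 1

# The first strong peak after lag 0 estimates the period (search lags 10..39).
lag = np.argmax(acf[10:40]) + 10
print(f"Estimated period: {lag} samples")
```

The same `np.correlate` call with two different signals gives the cross-correlation described above.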

    Categorizing data — Clustering, Correlation
    – Multiple Linear Regression
    – Cluster Analysis
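    A minimal multiple linear regression sketch in Python, recovering invented coefficients with NumPy's least-squares solver:

```python
import numpy as np

# Two driver variables and an outcome with known coefficients (illustrative).
rng = np.random.default_rng(2)
n = 200
x1, x2 = rng.normal(size=n), rng.normal(size=n)
y = 3.0 * x1 - 1.5 * x2 + 2.0 + rng.normal(0, 0.1, n)

# Least-squares fit of y = b1*x1 + b2*x2 + b0 via the design matrix.
X = np.column_stack([x1, x2, np.ones(n)])
coef, *_ = np.linalg.lstsq(X, y, rcond=None)
print(f"b1={coef[0]:.2f}, b2={coef[1]:.2f}, b0={coef[2]:.2f}")
```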

    Making decision — Machine Learning, Neural Networks
    – Neural Concepts: Hodgkin-Huxley Model (a squid-axon mathematical model) – Neural Adaptation – Soma, Dendrites and Axons – Time-Frequency Conversion and Thresholding – Fractal Complexity
    – Information Theory: Shannon’s framework describing information as entropy (uncertainty), with a focus on defining transmission and storage, and attempts to extend it not only to sensory communications but to neural principles in general
    – Game Theory: an algorithmic approach using a combination of stochastic and deterministic processes, trying to arrive at a decision based on such estimations
    – Machine Learning: using neural concepts, model each node as a summing location. Every network has features (input nodes), intermediate nodes (which might or might not have interpretable meaning), and final outcomes. The training stage trains the node weights; the application stage applies the already-trained nodes.
    – Example: 5 features — colour, shape, taste, size, weight. 2 decisions — like, don’t like. 5 training items — apple, orange, banana, monkey, rock. Apple is red, round, sweet, small, and not-heavy, and you like it: all those nodes +1. Orange is orange, round, sweet, small, and not-heavy, and you like it: all those nodes +1. Banana is yellow, long, sweet, small, and not-heavy, and you like it: all those nodes +1. Monkey is brown, humanoid, chewy, big, and heavy, and you don’t like it: all those nodes -1. Rock is grey, round, hard, small, and heavy, and you don’t like it: all those nodes -1. In the end, “sweet” is the most prominent node and will have the biggest weight. You have trained the system; now pass something like “pineapple” into it. It breaks the pineapple into features, compares each feature to the weights, and arrives at a conclusion: “you like it”.
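    The apple/orange/pineapple example above can be sketched directly in Python (pineapple's feature values here are invented for illustration):

```python
# Toy version of the feature-weighting example: each (feature, value) pair is
# a node whose weight goes up with every liked item and down with every
# disliked one.
from collections import defaultdict

features = ["colour", "shape", "taste", "size", "weight"]
training = [
    ("apple",  ("red",    "round",    "sweet", "small", "not-heavy"), +1),
    ("orange", ("orange", "round",    "sweet", "small", "not-heavy"), +1),
    ("banana", ("yellow", "long",     "sweet", "small", "not-heavy"), +1),
    ("monkey", ("brown",  "humanoid", "chewy", "big",   "heavy"),     -1),
    ("rock",   ("grey",   "round",    "hard",  "small", "heavy"),     -1),
]

# Training stage: accumulate +1/-1 on every (feature, value) node.
weights = defaultdict(int)
for _, values, label in training:
    for feature, value in zip(features, values):
        weights[(feature, value)] += label

# Application stage: sum the weights of the new item's feature values.
pineapple = ("yellow", "round", "sweet", "big", "heavy")
score = sum(weights[(f, v)] for f, v in zip(features, pineapple))
decision = "like" if score > 0 else "don't like"
print(f"score={score}, decision: you {decision} it")
```

As in the text, "sweet" ends up with the largest weight (+3) and dominates the decision.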

    All of these methods can be run iteratively, if needed.
    All of them can be done in MATLAB, some in just a one-liner. That’s why I encourage you to use them.
