How can organisations proactively look forward to identify future opportunities and threats? This series of five short think pieces examines some of the key ways that organisations can reduce the level of uncertainty surrounding what’s next.

#2: Hidden assumptions

Author: Richard Watson

Given the dangers of simplistic thinking that were discussed in the last post, a systems approach to thinking can clearly pay dividends. In a globalised, hyper-connected world, few important things exist in isolation, and one of the main reasons that long-term planning can go so spectacularly wrong is the oversimplification of complex systems and relationships.

Another major cause of failure is assumption, especially hidden assumptions about how industries or technologies will evolve or how individuals will behave in relation to new ideas or events. The historical hysteria about Peak Oil might be a case in point. Putting to one side the assumption that we’ll need oil in the future, the amount of oil that’s available depends upon its price. If the price is high, there’s more incentive to discover and extract more oil. A high oil price fuels the search for alternative energy sources, but it also incentivises behavioural change at both an individual and a governmental level. It’s not an equal and opposite reaction, but the dynamic tensions inherent within powerful trends mean that balancing forces do often appear over time.

Thus we should always think in terms of technology plus psychology, or one factor combined with others. In this context, one should also consider wildcards: forces that appear out of nowhere or that blindside us because we’ve discounted their importance.

For example, who could have foreseen the importance of the climate change debate in the early 2000s, or its relative disappearance in the 2010s during the economic crisis?

Similarly, it can often be useful to think in terms of future plus past. History gives us clues about how people have behaved before and may behave again. Therefore it’s often worth travelling backwards to explore the history of industries, products or technologies before travelling too far forwards.

If hidden assumptions, the extrapolation of recent experience and the interplay of multiple factors are three traps, cognitive biases are a fourth. The human brain is a marvellous thing, but too often it tricks us into believing that something personal or subjective is objective reality. For example, unless you are aware of confirmation bias, it’s difficult to unmake your mind once it’s made up. Back to Peak Oil and climate change scepticism, perhaps.

Once you have formed an idea about something – or someone – your conscious mind will seek out data to confirm your view, while your subconscious mind will block anything that contradicts it. Of course, being subconscious, you don’t realise what’s going on. This is why couples argue, why companies steadfastly refuse to evolve their strategies and why countries accidentally go to war. A related bias explains why we persistently assume that what we have experienced recently will continue into the future (recency bias). Other biases mean that we stick to strategies long after they should have been abandoned (loss aversion) or fail to see things hidden in plain sight (inattentional blindness).

Next post: Group Think: The good, the bad and the unexpected

This post was written by Richard Watson, who works with the Technology Foresight Practice at Imperial College.


Feature photo from Pexels.