This post is the beginning of a series of explorations starting with a single question:
What might a complexity-informed foresight practice look like?
What does ‘complexity-informed foresight practice’ even mean? And why should you care?
I don't expect the question will tickle everyone's fancy right at first glance. But hear me out before you decide to make like a tree. Because, in my opinion, the answer to this question should matter to every futurist and foresight practitioner, every business and government leader, and every educator and community changemaker.
Here are the basic assumptions of this exploratory series, a jumping-off point if you like, that gives a sense of why it might matter to you:
Many of the challenges leaders in government, business and wider society confront today are complex, i.e. they involve complex adaptive systems with hard-to-predict dynamics;
Complex systems demand a different approach from the bureaucratic, mechanistic and linear approaches to problem-solving that have dominated decision-making since the industrial revolution began (and, in fact, failing to make this shift can have catastrophic consequences);
We humans, both individually and collectively, have some degree of agency in this world, a capacity to perceive, to think, and to act;
The way we use our agency can affect - for better or worse - the kinds of futures that emerge, and we should therefore strive to use that agency (both to act and not-act) as wisely as possible.
So, if that sounds like stuff you can vibe with (or at least entertain for a while), then please read on.
A tale of complex systems and unintended consequences
I'd like to enter this exploration with a story. I should warn you that this retelling of events is accurate to the best of my knowledge, but very likely incomplete, representing only a shallow flitting across the surface of the actual complexity of events.
Picture this. It's the early 1990s in India. The country's 80-million-strong vulture population is being pushed to the brink of extinction. At the same time, human deaths from rabies are exploding across the country, leopard attacks on people living on the urban fringes are increasing, and the Zoroastrian community known as the Parsis is abandoning an ancient funeral rite it has upheld for two and a half millennia.
What could these events possibly have in common?
Bovine inflammation (of course!).
We enter the story in the early 1990s, when Indian farmers began to treat their sick cows with a new anti-inflammatory drug called diclofenac (also sold under the brand name Voltaren). Soon after, vultures began to die in rapidly accelerating numbers. Given the sacred status of the cow in India, almost all cattle are bred for dairy production rather than meat, so when cows die their carcasses are often discarded and left for the vultures to feast on. The vulture population functioned as an enormous and efficient waste-disposal system. But despite their otherwise lead-lined stomachs, vultures had a particular sensitivity to diclofenac: even the smallest trace of the drug caused their kidneys to fail, and within just a few short years India's vulture population dropped by 95%, with the remaining birds numbering only in the tens of thousands.
Then began the cascade of consequences. With vultures almost extinct but cattle production still high, the suddenly abundant food source of unpicked bovine carcasses saw the rodent and wild dog population explode. More wild dogs led to more rabid dog attacks on humans, producing a spike in deaths from rabies infections. Human deaths from anthrax and plague also surged into the thousands because, unlike vultures whose digestive system is a dead-end for dangerous pathogens, rodent and canine scavengers become carriers and spreaders of disease.
An estimated 18 million dogs were now on the loose in India's streets, and the leopard population observing from jungled urban fringes took note. The dogs were an abundant food source for the big cats, so they ventured more frequently into human habitats to prey on the dogs, presenting new hazards for both people and leopards as the former sought to violently repel the latter's presence.
But perhaps the most surprising consequence was that felt by the Parsis, an Indian Zoroastrian community. According to their ancient creed, cremation or burial of the dead is sacrilegious. The Parsis believe that vultures are sacred creatures who act as intermediaries between earth and heaven, and in their ancient custom they would place their dead on a Tower of Silence, where vultures would liberate the soul of the deceased by consuming the body. Ordinarily, vultures could skeletonise a body within mere hours. But with the vultures approaching extinction, the bodies of the dead were taking months rather than hours to be consumed by the few remaining birds, presenting public hygiene risks that forced the Parsis to find alternatives to their age-old practice.
In addition to the unquantifiable displacement of cultural rites, the cost of dealing with the consequences of the loss of India's vultures is estimated at US$25 billion per year.
Yet all the farmers had ever wanted to do was treat sick cows. And the solution had seemed so simple, so obvious.
The crux of this inquiry
As a foresight practitioner, I believe the discipline of foresight can help us perceive, think and act in the present with greater wisdom, in ways that support the emergence of futures in which life flourishes, human and non-human alike. But there's a caveat: our approach must be fit-for-system.
In my own practice over the past decade, I've often sensed the desire among leaders across a range of corporate settings to instrumentalise foresight completely, to pull it into the strategic planning machine that prizes the linear conversion of foresight into insight into action into roadmap, all of which is aimed like an arrow at some rigidly pre-defined outcome. Sometimes this is fine. When we're dealing with a complicated or clear system (as opposed to a complex one), the creation of a linear roadmap might be both appropriate and effective (more on Dave Snowden's Cynefin Framework in later posts).
But I've also witnessed diclofenac-type solutions imposed on complex systems, and have seen the surprise and frustration when such approaches produce unanticipated resistance in the system, or unintended consequences that are counter-productive, or at worst, downright destructive. In complex systems, because dynamics are non-linear, it's often difficult if not impossible to foresee how a small intervention can have a wildly disproportionate effect. In the case of the vultures, one study constructed a simulation model showing that Indian vulture populations would fall by 60–90 percent even if only 1 percent of bovine carcasses had been contaminated with diclofenac. In reality, contamination was detected in around 10 percent of bovine carcasses, and consequently, the Indian vulture population today sits at around 19,000 birds, a 99 percent decline since the 1990s. In complex systems, the stakes are high.
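The intuition behind that disproportion is worth making concrete. The sketch below is a deliberately simple toy model, not the published study's model, and the figures in it (100 feeding visits per vulture per year, a modest 4 percent intrinsic growth rate) are illustrative assumptions only. The point it demonstrates is that even a tiny contamination rate compounds across every carcass a bird feeds from, so the effect on the population is wildly out of proportion to the 1 percent input:

```python
# Toy model of vulture exposure to a contaminated food supply.
# ASSUMPTIONS (illustrative, not from the study): each vulture feeds
# from ~100 carcasses a year, and any single contaminated carcass is lethal.

def annual_survival(contamination: float, meals: int = 100) -> float:
    """Probability a vulture avoids every contaminated carcass in a year."""
    return (1 - contamination) ** meals

def population_after(years: int, contamination: float,
                     meals: int = 100, growth: float = 1.04) -> float:
    """Population relative to the start, with modest intrinsic growth."""
    pop = 1.0
    for _ in range(years):
        pop *= growth * annual_survival(contamination, meals)
    return pop

for c in (0.001, 0.01, 0.10):
    print(f"contamination {c:.1%}: "
          f"annual survival {annual_survival(c):.2f}, "
          f"population after 10 years {population_after(10, c):.2%} of start")
```

Under these assumptions, a 0.1 percent contamination rate is survivable, but at 1 percent each bird has only about a one-in-three chance of surviving a given year, and the population collapses within a decade. That is the signature of non-linearity: a tenfold change in the input produces a qualitatively different system outcome.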
One of the causes of the failure to differentiate between system types may be found in the DNA of the modern organisation. Many organisations and leaders today evidently remain in the firm grip of the strategic management approach, which presumes the world is largely mechanistic. In that context, the primary (perhaps only) use of foresight is to identify a preferred destination and the best strategic levers to pull to manipulate the system in such a way that we might control the future that emerges, or, as Riel Miller, former Head of Futures Literacy at UNESCO, puts it, to "colonise the future". Miller summed it up this way:
The traditional, bureaucratic structure adopted by organisations and institutions derives from an understanding of systems and problems that precedes the discovery of complexity. These structures are tailored to addressing 'complicated' – not 'complex' – systems and problems: they work as if problems could be addressed individually and in a piecemeal way, without outputs systematically proportionate to relevant inputs, and aim to manage and control underlying systems.1
Then Miller cuts right to the crux of the inquiry I'm wading into:
The problem that surfaces here is dramatically urgent: while there is considerable expertise and experience with the invention and implementation of bureaucratic structures meant to act within the existing framework of agency – how to 'use-the-future' for optimization and contingency – we are still in the deepest fog about how to build up anticipatory structures able to organically deal with complex problems and systems.
I'm in furious agreement with Riel on this issue. This is not entirely unexplored territory, of course (see the work of Riel Miller, Frank Spencer, Wendy Schultz, John Sweeney, their intellectual forebears, and probably more I haven't acquainted myself with yet). But the integration of complexity theory into foresight practice has much room for development. And if we accept the near-cliché that we live in an age of increasing complexity (not to mention exponential technological risk), and if we care to use our agency with greater wisdom, then continuing to explore how foresight and complexity might be synthesised into a coherent practice does indeed seem a 'dramatically urgent' matter.
Read the next instalment in this series.
Hit subscribe below to follow this exploration. And please drop me a line. I welcome all comments, contributions and references.
Miller, R. (Ed.), Transforming the Future: Anticipation in the 21st Century, Paris: UNESCO/Routledge, 2018, p. 61.
Interesting to see you start this series. You might find some of the work in the EU Field Guide of interest - that takes a complexity (not a systems thinking) approach to creating human sensor networks for both real time situational assessment and micro-scenario generation. That of itself is a switch from foresight in normal use, to creating capability to better understand where we are and what might happen next. It also uses diverse networks to generate potential unintended consequences rather than assuming that can be done.
The more radical work is just emerging, building on constructor theory in physics to measure the energy/time nature of constraints in order to get better indications of the future, both counterfactuals and also the general principle that whatever has the lowest energy gradient is likely to win. Wendy Schultz and I are planning a workshop in the new year on that.
The third area is the switch from anticipation to anticipatory triggers, but that is for another day
EU Field Guide link here : https://publications.jrc.ec.europa.eu/repository/handle/JRC123629
Thanks James, I enjoyed reading this and I agree there is a real need to rethink our collective practice in this space. I thought I’d share the immediate thoughts that came to mind from reading this and which I need to explore further! (1) Embedding a systems thinking approach - having a more mature understanding of the operating environment and broader system we work in will help us get better at the ‘situational vs contextual’ conversations and hopefully the ‘what role should I play in this bigger picture’? type questions. (2) Future risk profiles - building in unintended consequences from possible solutions in a way that helps us better understand future risk can help us think through the ‘quick fixes’ more deeply.