Black Swan Events: preparing for the indefensible - Professor Paul Arbon AM

Professor Paul Arbon AM
Torrens Resilience Institute, 2010
What are Black Swans?

The metaphor of the black swan corresponds to the ancient idea of a 'rare bird'. The Roman poet Juvenal refers to 'a bird as rare as the black swan'.

Modern philosophers (Popper, Mill, Hume and others) used this concept to describe the fundamental risk in the use of modern empirical science.

(Taleb 2010)
Before the discovery of Australia, natural historians were
 convinced that all swans were white, an unassailable belief
 completely confirmed by repeated observations (empirical
 evidence).

 The sighting of the first black swan… illustrates a severe
 limitation to our learning from observations or experience
 and the fragility of our (factual) knowledge.

 One single observation can invalidate a general statement
 derived from millennia of confirmatory sightings … All you
 need is one single black bird.
(Taleb 2010)
Why Black Swan thinking?

Mainstream risk management assumes risks are foreseeable.

In Black Swan thinking we focus on 'unforeseeable' risks - events that cannot be predicted.

(Cavallo 2010)
Two main underpinnings of black swan thought:

• the futility of approaching complex systems purely empirically (quantitatively): they are too complex to be reduced to reliable models or formulas

• the inadequacy of experience (the qualitative approach), because the context may change

(Cavallo 2010)
“Let’s face it, the universe is messy. It is nonlinear,
 turbulent and chaotic. It is dynamic. It spends its time
 in transient behaviour on its way to somewhere else,
 not in mathematically neat equilibria. It self-organises
 and evolves. It creates diversity, not uniformity. That’s
 … what makes it work.” (Meadows).

• In the context of high uncertainty and complexity, normal
   tools of risk management are not enough.
 • A normal approach involves identifying hazards, their likelihood and their consequences in order to deal with the larger risks by avoiding them, accepting them, transferring them to another party (e.g. through insurance) or reducing them by applying more efficient controls (see the sketch below).
 • This cannot be done with unforeseeable risks because there is not enough information about them.

(Cavallo 2010)
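To make the conventional approach concrete, here is a minimal sketch in Python of the identify-score-treat loop described above. The hazards, scoring scale and treatment thresholds are all hypothetical illustrations, not values from the presentation; the point is that every step presupposes a risk that can be named and estimated in advance, which is exactly what a Black Swan denies us.

```python
# Minimal sketch of conventional risk management: score each foreseeable hazard
# by likelihood x consequence and assign a standard treatment. Hazards, scales
# and thresholds are hypothetical; every step assumes the risk can be
# identified and estimated before it occurs.

hazards = [
    # (name, likelihood 0-1, consequence 1-5)
    ("river flooding", 0.30, 4),
    ("power outage",   0.60, 2),
    ("novel pandemic", 0.05, 5),
]

def treatment(score):
    """Map a risk score to one of the standard treatment options."""
    if score >= 1.5:
        return "avoid"        # redesign so the risk cannot arise
    if score >= 0.8:
        return "reduce"       # apply more efficient controls
    if score >= 0.3:
        return "transfer"     # e.g. insure against it
    return "accept"           # live with the residual risk

for name, likelihood, consequence in hazards:
    score = likelihood * consequence
    print(f"{name:15s} score={score:.2f} -> {treatment(score)}")

# A Black Swan never appears in `hazards`: with no basis for estimating its
# likelihood or consequence, the procedure has nothing to work on.
```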
Are these Black Swans?
 Today we are continually confronted with questions about the validity of traditional risk management in complex systems globally:
 • MERS (novel) coronavirus
 • Multi-resistant infections
 • The Icelandic volcano Eyjafjallajökull
 • The Asian tsunami
 • The impact of "natural disasters" on global work and the economy
 • Interdependent system crashes
 • Climate variability impacts
 • The Japan "triple" disaster

(Cavallo 2010)
Key attributes of the Black Swan:

        It is an outlier, outside the realm of regular
        expectation(s)...

        It carries an extreme impact.

        It is rare and is often understood
        after the fact.

(Taleb 2010)
Key Question
Is modern risk management a process that excludes the possibility of the Black Swan?

(Taleb 2010)
Black Swan thinking:

 • Black Swan logic argues that what you don't know is far more relevant than what you do know.

 • Many Black Swans are caused and exacerbated by being unexpected.

 • Think of the terrorist attack of September 11, 2001: had the risk been reasonably conceivable on September 10, it might not have happened.

(Taleb 2010)
The Black Swan is the result of collective and individual knowledge limitations (or distortions), mostly confidence in knowledge; it is not an objective phenomenon.

(Taleb 2010)
The most severe mistake … is to try to define an 'objective Black Swan' that would be invariant in the eyes of all observers.

The events of September 11, 2001, were a Black Swan for the victims, but certainly not for the perpetrators.

(Taleb 2010)
Prediction is firmly institutionalized in our world

(Taleb 2010)
Being a Turkey:

Consider a turkey that is fed every day. Every single feeding will firm up the bird's belief that it is the general rule of life to be fed every day by friendly members of the human race 'looking out for its best interests'. On the afternoon of the Wednesday before Thanksgiving, something unexpected will happen to the turkey. It will incur a revision of belief.

(Taleb 2010)
The Turkey Problem: the problem of inductive knowledge

Consider that the feeling of safety reached its maximum when the risk was at the highest. But the problem is even more general than that; it strikes at the nature of empirical knowledge itself. Something has worked in the past, until - well, it unexpectedly no longer does, and what we have learned from the past turns out to be at best irrelevant or false, at worst viciously misleading.

(Taleb 2010)
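A small, hypothetical simulation of the turkey's situation may help: each uneventful day raises a naive, observation-based confidence that feeding will continue, and that confidence is near its maximum on the day the pattern breaks. The day count and the rule-of-succession confidence measure are illustrative assumptions, not part of Taleb's text.

```python
# The turkey problem: confidence built purely from past observations is
# highest exactly when the hidden risk is greatest. The day count and the
# confidence measure are illustrative assumptions.

DAYS = 1000  # hypothetical length of the turkey's life, ending at Thanksgiving

def confidence(days_fed, days_observed):
    """Rule-of-succession estimate that feeding will continue tomorrow."""
    return (days_fed + 1) / (days_observed + 2)

fed_days = 0
for day in range(1, DAYS + 1):
    fed = day < DAYS          # the pattern breaks only on the very last day
    fed_days += int(fed)
    belief = confidence(fed_days, day)
    if day in (1, 10, 100, DAYS - 1, DAYS):
        status = "fed" if fed else "NOT fed"
        print(f"day {day:4d}: {status:7s} belief feeding continues = {belief:.3f}")

# The belief climbs towards 1.0 and peaks on the day before Thanksgiving -
# the feeling of safety is greatest precisely when the risk is highest.
```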
The Problem of Inductive Knowledge:

      • How can we logically go from specific instances to
        reach general conclusions?
      • How do we know what we know?
      • How do we know that what we have observed from
        given objects and events suffices to enable us to
        figure out their other properties?
      • There are traps built into any kind of knowledge
        gained from observation.
      • Predictions are helpful but may mislead us…

(Taleb 2010)
The Idea of Robustness:

 • Why do we formulate theories leading to projections
   and forecasts without focusing on the robustness of
   these theories and the consequences of errors?

 • It is much easier to deal with the Black Swan
   problem if we focus on our robustness to errors
   rather than improving the predictions.

(Taleb 2010)
Evidence-based practice: the new holy grail…

As much as it is ingrained in our habits and conventional wisdom, confirmation can lead to dangerous error.

(Taleb 2010)
Learning and what we know:

       • We tend to learn the precise, not the general.

       • We are conditioned to be specific.

       • The French, after the Great War, built a wall along the previous German invasion route to prevent re-invasion; Hitler just (almost) effortlessly went around it. The French had been excellent students of history; they just learned with too much precision. They were too practical and exceedingly focused for their own safety.

(Taleb 2010)
Defending against the Black Swan

It is necessary … to stand conventional wisdom on its head and to show how inapplicable it is to our modern, complex, and increasingly recursive environment.

(Taleb 2010)
Recursive here means that the world in which we live has an increasing number of feedback loops, causing events to be the cause of more events.

(Taleb 2010)
We live in a world where impact flows globally and rapidly, and where inter-connections and dependencies among systems are common.

(Taleb 2010)
What can we do?
 Assume that a US legislator with courage, influence,
 intellect, vision, and perseverance manages to enact a
 law that goes into universal effect and employment on
 September 10, 2001; it imposes continuously locked
 bulletproof doors in every cockpit (at high costs to the
 struggling airlines) - just in case terrorists decide to
 use planes to attack the World Trade Center in New
 York City.

                               This is a thought experiment.

(Taleb 2010)
The person who imposed locks on cockpit doors gets no statues in public squares, not so much as a quick mention of his contribution in his obituary.

Seeing how superfluous his measure was, and how it squandered resources, the public, with great help from airline pilots, might well boot him out of office.

(Taleb 2010)
Perception and understanding

Additional knowledge of the minutiae of daily business can be useless, even actually toxic.

(Taleb 2010)
• Show two groups of people a blurry image of a fire
  hydrant, blurry enough for them not to recognise what it
  is. For one group, increase the resolution slowly, in ten
  steps. For the second, do it faster, in five steps.
• Stop at a point where both groups have been presented
  an identical image and ask each of them to identify what
  they see. The members of the group that saw fewer
  intermediate steps are likely to recognise the hydrant
  much faster.
• The more information you give someone, the more
  hypotheses they will formulate along the way. They see
  more random noise and mistake it for information.
(Taleb 2010)
Our ideas are sticky!

      • Once we produce a theory or make a decision, we are less likely to change our minds.
      • Those who delay developing their theories are better off.
      • When you develop your opinions on the basis of weak evidence, you will have difficulty interpreting subsequent information that contradicts these opinions, even if this new information is obviously more accurate.

(Taleb 2010)
Sticky ideas lead to confirmation bias… and belief perseverance:

We treat ideas like possessions, and it is hard for us to part with them.

(Taleb 2010)
The Expert Problem

You invoke the outlier. Something happened that was outside the system, outside the scope of your science. Given that it was not predictable, you are not to blame. It was a Black Swan and you are not supposed to predict Black Swans and certainly cannot be blamed for their occurrence.

(Taleb 2010)
The thinking problem
Heuristics and biases (adapted from Esgate et al., 2005)

 • Representativeness: the belief that a limited finding is representative of a wider population. E.g. yesterday and today it was rainy, hence tomorrow it is going to rain for sure.
 • Confirmation bias: the tendency to look for information confirming your own opinion while neglecting information that may contradict it.
 • Gambler's fallacy: the belief that if an event occurs consecutively and independently more than once, then the event that has not occurred until now is going to happen next. E.g. playing roulette: after three reds, believing that black must come up next.
 • Conjunction fallacy: estimating the probability of a joint event as higher than that of one of the single events. This is impossible, as the probability of the joint event can never exceed the probability of either single event. E.g. if A and B each have a probability of 0.6, the probability of both A and B occurring is 0.36 (worked through below).
 • Availability: the more readily information comes to mind, the more weight it carries, e.g. "If I can think of it, then it must be important."

(Cavallo 2010)
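The conjunction-fallacy row can be stated precisely; using the table's own example figures (and assuming, as the example does, that A and B are independent):

```latex
% A joint event can never be more probable than either of its parts:
P(A \cap B) \le \min\{P(A),\, P(B)\}
% With the example's independent events, P(A) = P(B) = 0.6:
P(A \cap B) = P(A)\,P(B) = 0.6 \times 0.6 = 0.36
```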
Dancing with complexity

 • 'We can't control [complex] systems or figure them out. But we can dance with them!' (Meadows 2002).
 • We can be adaptable as long as we are trained
   to face ‘complex’ uncertainty.
 • Personal characteristics are very important.
 • Many organisations invest in contingency plans,
   rather than development of their human
   resources.

(Cavallo 2010)
Training for adaptability
(Meadows 2002, Snowden 1999)

Adaptability comprises:

 • Responsiveness: the ability to react in a timely manner to change in the environment - i.e. in the context of a set of strategies that are already being executed, and a change occurring that creates a new threat or opportunity, responsiveness is the capacity to recognise and deal with the new threat or opportunity as effectively as if there were ample time to plan and prepare for it.

 • Resilience: the ability to recover from or adjust to misfortune or damage, and the ability to degrade gracefully under attack or as a result of partial failure - i.e. the core functions of the force continue to achieve essential levels of capability when individual elements are disabled one or more at a time, and there is quick recovery from damage with minimal interruption or loss of capability.

 • Agility: the ability to recognise when to shift from one strategy to another and to do so easily - i.e. producing a rapid change of tack to more effective behaviours when significant external and/or internal changes arise requiring significant and different responses.

 • Flexibility: the ability to create and maintain effectiveness across a wide range of tasks, situations, and conditions - i.e. the structure and capability of the force can be reconfigured in different ways to do different things, under different sets of conditions, and the need to do this can be recognised and implemented rapidly.

(Cavallo 2010)
Guidelines to increase adaptability

 • History: starting from the system's history allows a more objective view; particularly at the outset of a project, facts are more reliable than theories.
 • Creativity: encouraging the challenging of opinions and models is very important for noticing emerging system behaviour.
 • Outline: drafting is a good technique to correct mistakes, admit uncertainties and hence stay flexible.
 • Speak up: nobody in the team should pretend things are under control if they are not.
 • Humbleness: particularly at the outset of a project, it is important to avoid absolutist and rigid approaches.
 • Protect: true and complete information should be a value to honour in a system.
 • Feedback: creating feedback loops allows a team to stay dynamic and to adjust continuously to the changing environment of a system of systems (SoS).
 • Whole system: while managing risks in a complex project it is crucial to consider consequences across the whole SoS in both its spatial and time-related components, i.e. in the short and the long term.
 • Interdisciplinary: experts should be willing to learn further, admit ignorance and be taught by other specialists without being obsessed by the 'academically correct' cast of mind.
 • Care: independently of their specific task, team players should expand their interests, competencies and personal feelings to the areas of the project that would be affected by their decisions.

(Cavallo 2010)
Responding to Black Swans

In these circumstances we have to move from planning to preparations. The high level of uncertainty means that our plans will be redundant, and we must rely on the ability of the organisation - at a cellular level - to respond consistently in the face of uncertainty.

(Cavallo 2010)
[Diagram: the pathway from hazard to risk event, damage, changes in function, needs, and ultimately disaster and emergency, together with the capacities that build resilience at each stage - risk reduction and hazard modification, absorbing capacity, buffering capacity and preparedness, and local and outside response capacity.]
Building resilient systems

           • Risk management
                    • Identify hazards and treat risks
           • Absorbing capacity
                    • Absorb impact and limit damage
           • Buffering capacity
                    • Overcome change in function - redundancy
           • Response Capacity
                    • Rescue and relief, surge capacity

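The four capacities above can be read as successive layers of defence. The sketch below is a hypothetical illustration of that layering (the fractions are invented for the example, not taken from the presentation): each capacity removes part of whatever impact the previous layer let through, so resilience does not depend on any single layer being perfect.

```python
# Sketch of the layered capacities behind a resilient system. All figures
# are hypothetical; the point is that each layer limits what is left for
# the next one to deal with, so no single layer has to be perfect.

def remaining_impact(initial_impact,
                     risk_reduction=0.3,     # hazard modification / treated risks
                     absorbing_capacity=0.4, # damage absorbed on impact
                     buffering_capacity=0.2, # redundancy covering lost function
                     response_capacity=0.2): # rescue, relief and surge capacity
    impact = initial_impact
    for layer in (risk_reduction, absorbing_capacity,
                  buffering_capacity, response_capacity):
        impact *= (1 - layer)   # each layer removes a fraction of what remains
    return impact

print(f"residual impact: {remaining_impact(100):.1f} out of 100")
# -> residual impact: 26.9 out of 100
```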
Questions

                 www.torrensresilience.org
                 www.csarn.org.au
