Climate extinction & causal statistics
Using indeterminate calculations to produce insights on the likelihood of human annihilation by global warming
Extinction Rebellion’s cofounder Roger Hallam spreads the fearful message that the youngest on Earth will be annihilated by climate change. They will starve or be killed by gangs of starving people. This has prompted some grisly public demonstrations in protest of this macabre future.
Is this scenario probable? Humans often rely on their gut or intuition when estimating probabilities. Kahneman and Tversky delighted in revealing just how wrong those intuitions can be, and together they named over a dozen biases and shortcuts humans are apt to draw upon. In one famous test, they gave subjects a description of a shy, meek, and detailed person (SMD for short), then asked: “what is the likelihood this person is a librarian?” Most people had never met a librarian whose personality wasn’t SMD, so they judged the probability relatively high, say 50%. But this neglects the relatively small number of librarians that exist in a society, so the actual probability of meeting an SMD individual with the profession of librarian is more like 0.05%. Subjects tended to overestimate the probability that an SMD person is a librarian by three orders of magnitude (1000x). Why?
One explanation is they answered the wrong question. In statistical notation “Given an SMD individual, what is the % chance this is a librarian” is written as: P(librarian|SMD). What most people actually answered was the inverse: “given a librarian, what is the % chance this person is SMD?” written: P(SMD|librarian). These are completely different questions; the former is 0.05% and the latter is perhaps as high as 50%.
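The inversion can be made concrete with Bayes’ rule. A minimal sketch, where all three base rates are assumptions chosen purely for illustration (not survey data):

```python
# Bayes' rule sketch with hypothetical base rates (all three numbers
# below are assumptions chosen for illustration, not survey data).
p_librarian = 0.0002            # assume ~1 in 5,000 people is a librarian
p_smd_given_librarian = 0.50    # P(SMD|librarian): the question people answer
p_smd_given_other = 0.20        # P(SMD|not librarian): SMD people are common

# Law of total probability: overall chance of meeting an SMD person
p_smd = (p_smd_given_librarian * p_librarian
         + p_smd_given_other * (1 - p_librarian))

# Bayes' rule inverts the conditional: P(librarian|SMD)
p_librarian_given_smd = p_smd_given_librarian * p_librarian / p_smd
print(f"P(librarian|SMD) = {p_librarian_given_smd:.4%}")  # ~0.05%, not 50%
```

The 50% intuition survives in the numerator, but dividing by the overall frequency of SMD people collapses it by three orders of magnitude.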
This librarian story is an example of a single-level of conditional logic, but many problems have several levels. Consider the example from Pearl et al1: “What is the likelihood that the pavement is slippery given that there are clouds, no rain, and the pavement is currently dry?” P(slippery|clouds, no rain, dry pavement). A reasonable guess might be 10% but using the product rule, this question can be decomposed into several probabilities that are each individually easier to estimate:
P(clouds) × P(no rain|clouds) × P(dry pavement|no rain) × P(slippery|dry pavement) (1)
Suppose we estimate P(clouds) = 50%, meaning local weather is cloudy half the time. The probability that it is raining when there are clouds is perhaps only 15%, therefore P(no rain|clouds) is (1-15%) = 85%. The probability that the pavement is dry at a moment when there is no rain is likely to be high, perhaps 90%. Finally, the probability that pavement is slippery given that it is dry must be quite low: 5%. Substituting each of these probabilities into (1) gives: 0.5*0.85*0.9*0.05 = 0.0191 = 1.91%. Each estimated probability has error bars, and this decomposition allows for sensitivity testing to determine which factor has the greatest contribution to the total uncertainty of the product.
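The decomposition and a crude sensitivity test can be sketched in a few lines. The error bars below are assumptions for illustration only:

```python
# Point estimates from the text, each paired with an assumed error bar
# (the error bars are illustrative, not measured).
factors = {
    "P(clouds)":                  (0.50, 0.10),
    "P(no rain | clouds)":        (0.85, 0.05),
    "P(dry pavement | no rain)":  (0.90, 0.05),
    "P(slippery | dry pavement)": (0.05, 0.03),
}

product = 1.0
for p, _err in factors.values():
    product *= p
print(f"P(slippery | clouds, no rain, dry) = {product:.4f}")  # 0.0191

# The relative error of a product is roughly the sum of the relative
# errors of its factors, so the factor with the largest relative error
# dominates the total uncertainty.
for name, (p, err) in factors.items():
    print(f"{name}: relative error {err / p:.0%}")
```

With these assumed error bars, the slipperiness term (60% relative error) dwarfs the others, so refining that one estimate would do the most to tighten the result.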
Conditional probabilities in climate
Suppose we wanted to rationally engage XR’s claim and estimate the probability that humans will be annihilated (go extinct) as a direct result of anthropogenic global warming (AGW) induced by our own anthropogenic greenhouse gas (AGHG) emissions: P(extinction|AGHG). This is hard to estimate directly, but it can be decomposed as follows:
P(AGHG) × P(↑GHG|AGHG) × P(↑RF|↑GHG) × P(↑T|↑RF) × P(danger|↑T) × P(extinction|danger) (2)
In plain English, you can read this as:
The probability humans are emitting GHGs into the atmosphere: P(AGHG) x
The probability that human emissions are causing atmospheric concentrations to rise (e.g., the ocean isn’t absorbing it all, or there is a non-human cause for the rise): P(↑GHG|AGHG) x
The probability that radiative forcing (RF) increases, given the higher concentrations of GHGs: P(↑RF|↑GHG) x
The probability that surface temperatures will rise, given the increased RF: P(↑T|↑RF) x
The probability that conditions become dangerous for humans, given the surface temperature rise: P(danger|↑T) x
The probability humans will go extinct, given a dangerous surface temperature rise: P(extinction|danger)
It’s not at all intuitive that estimating one probability (the chance of extinction given anthropogenic emissions) actually means computing the product of six distinct probabilities. The person making such an estimate is scarcely aware of this hidden complexity. Let us now attempt crude estimates for these values.
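The six-term chain can be sketched directly. The first four values use the crude estimates developed below in the text; the last two terms span enormous ranges, so the single values here are purely hypothetical midpoints chosen only to make the product concrete:

```python
# Illustrative point values for the six-term chain.  The first four use
# the crude estimates from the text; the last two are hypothetical
# midpoints (the text gives only wide ranges for them).
chain = [
    ("P(AGHG)",                1.0),
    ("P(higher GHG | AGHG)",   0.999),
    ("P(higher RF | GHG)",     0.99999),
    ("P(higher T | RF)",       0.99),
    ("P(danger | higher T)",   0.10),   # hypothetical midpoint
    ("P(extinction | danger)", 1e-4),   # hypothetical midpoint
]

p_extinction = 1.0
for _name, p in chain:
    p_extinction *= p
print(f"P(extinction|AGHG) ~ {p_extinction:.2e}")
```

Notice that the first four near-certain terms barely move the product; whatever number comes out is set almost entirely by the last two.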
P(AGHG) = 100%
The probability humans are adding GHGs to the atmosphere P(AGHG) = 100%. You can verify this is true by driving a petrol-powered car or flying in a commercial airliner. The energy source was a fossil fuel and the combustion products are now in the atmosphere.
P(↑GHG|AGHG) = 99.9±0.1%
It is virtually certain that the atmospheric CO₂ concentration is rising. There are many instruments actively measuring CO₂, but the two considered to be the highest quality are Mauna Loa (northern hemisphere) and Cape Grim (southern hemisphere).
No measurement can be 100% accurate, so this 99.9% estimate is a declaration of very high confidence. Now, nature’s annual CO₂ emission and absorption is more than 10x the annual anthropogenic emission quantity, so if nature happens to emit 10% more than usual, it would be difficult to distinguish from anthropogenic emissions. But it seems unlikely such an imbalance would persist for decades. Whether the added CO₂ is natural or anthropogenic can be determined through a study of the changing isotopic ratio of carbon in the atmosphere: the declining ¹³C ratio in atmospheric CO₂ is most readily explained by the burning of very old carbon (fossil fuels).
P(↑RF|↑GHG) = 99.999±0.001%
The probability that clear-sky radiative forcing is higher, given the elevated concentration of GHGs, is 99.999±0.001%. This can be measured at the surface using the Atmospheric Emitted Radiance Interferometer in multiple locations. It can also be measured from space, and the consilience of these measurements gives an extremely high level of confidence that RF is increasing. It also matches theory. Note that current estimates put the RF change over the last two centuries at +3 W/m² above the natural (pre-industrial) greenhouse effect of 150 W/m².
P(↑T|↑RF) = 99±1%
The probability that surface temperatures will rise where we live and grow crops, as the primary response to increased RF, is extremely likely (99%) but not ‘virtually certain’ (99.999%). Surface heating is indeed the most likely result, but there is a small chance of an unexpected response in the thermal structure of the atmosphere: more or higher-albedo clouds, or perhaps a shift in the Atlantic thermohaline circulation that cools Europe. Jet streams move unpredictably, and all of this can be summarized as a 1% uncertainty on the expectation of surface warming.
P(danger|↑T) = 0.1% to 99%
The probability that (gradual) surface warming will be dangerous is the second-hardest number to estimate. Generally, a cold spell induces excess mortality with a time lag of 7-21 days. In a heat wave, the excess deaths emerge within a few days. Compared to their preferred temperature, humans can tolerate 10-15 degrees of cooling better2 than 10-15 degrees of warming (note in fig. 2 that the slope of the red curve is much steeper than that of the blue curve).
“Dangerous” is also a subjective term. Some consider it dangerous to fly in an airplane (1 chance in 10⁹ of death by crash) while others routinely ride motorcycles and do not regard them as dangerous (1 chance in 10⁵ of death by crash, 10,000x greater than flying). Therefore, we must acknowledge everyone has a different perspective on what “dangerous” means.
Having just been through a pandemic, perhaps we can borrow the CDC’s definition of a “dangerous” pathogen and apply it to climate: COVID-19 was a CDC Category 4/5 pathogen with an unmitigated death toll of >0.3% of the population. But what level of surface temperature change would induce this amount of excess mortality? CarbonBrief summarizes published literature and reports that excess deaths due to heat+cold will decline as temperatures warm by +2C, but at +4C the increase in heat-related deaths exceeds the decline in cold-related deaths, producing more overall loss of life. Will we even reach 4C of warming? Or will current stabilization efforts succeed at 2.0-2.5C? Lacking a crystal ball, we are left with a wide range of feasible probabilities for this term.
P(extinction|danger) = 0.000 000 001% to 10%.
Now for the hardest number to estimate: the probability that so much dangerous warming occurs that the planet becomes unlivable. It may not even be possible to get this amount of warming; credible scientists state there aren’t enough extractable fossil fuels to get us to this point, in which case this probability should be 0%. However, let’s entertain the concept for a moment: before going extinct, there would doubtless have been a myriad of atmospheric geoengineering attempts with sulfate aerosols to increase albedo. Humans have always fought extermination through unrelenting innovation and ingenuity. Wielding the most fantastic body of scientific knowledge in history combined with cutting-edge engineering capability, I expect our collective resilience will rise to a level heretofore unseen. This is no guarantee of success, but whether the chance of failure to survive is one in a thousand, a million, or a trillion, who can say? It is a difficult number to fathom.
A useless or useful result?
The product of the terms above puts the probability of human extinction between 0.000 000 000 001% and 10%, which does not appear to be helpful. We tend to prefer a tidy, singular number like “0.034%.” Yet the calculation was useful for removing any mistaken probability inversions (P(librarian|SMD) vs P(SMD|librarian)), and it is useful for another reason:
When estimating the probability of extinction, one need not focus on the reliability of the CO₂ radiative forcing science at all, since that is not where the uncertainty resides.
There just isn’t enough uncertainty (e.g., 99±1%) in the physics of GHG radiative transfer when calculating the survival of the species; all of the uncertainty about survival is in the last two terms. This insight is analogous to what the Drake equation revealed about the chances of there being extraterrestrial life: seven different parameters are multiplied, but the majority of the uncertainty was in a single estimate, the “length of time a technologically-advanced civilization broadcasts signals into space.” No one knows if that length of time is a century, a millennium, or a hundred million years. Since there is such a wide range of credible answers, this produced the following insight: detection of an extraterrestrial civilization implies that intelligent civilizations tend to be long-lived. The corresponding insight from the climate equation is: predictions of human extinction imply a very low estimate of human resilience.
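As a sanity check, the quoted range can be reproduced by multiplying the lower bounds and the upper bounds of each term separately (valid interval arithmetic here, since every bound is nonnegative):

```python
# Lower/upper bounds for each term, expressed as fractions (taken from
# the estimates in the text).  Because every bound is nonnegative, the
# product of the lower bounds and the product of the upper bounds
# bracket the true product.
bounds = [
    (1.0,     1.0),    # P(AGHG)
    (0.998,   1.0),    # P(higher GHG | AGHG): 99.9 +/- 0.1%
    (0.99998, 1.0),    # P(higher RF | GHG):   99.999 +/- 0.001%
    (0.98,    1.0),    # P(higher T | RF):     99 +/- 1%
    (0.001,   0.99),   # P(danger | higher T): 0.1% to 99%
    (1e-11,   0.10),   # P(extinction | danger): 1e-9 % to 10%
]

lo = hi = 1.0
for l, h in bounds:
    lo *= l
    hi *= h
# roughly 1e-14 to 0.099 as fractions, i.e. ~1e-12 % to ~10 %
print(f"P(extinction|AGHG) is between {lo:.1e} and {hi:.1e}")
```

The lower bound is set almost entirely by the two human-response terms; the four physics terms together shave off less than 3% of either bound.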
If you encounter anyone attempting to connect the reliability of peer-reviewed science with the likelihood of human extinction, that person is not properly considering the conditionals and the chain of probabilities involved. When it comes to extinction (or extermination/annihilation), it is the probability of a successful human response that matters, not the reliability of the science measuring the warming. Having completed this exercise, I find I am less interested in further reading on radiative transfer and more interested in reading the body of literature (if it exists) on human resilience and related topics. If you have suggestions on what to read, please comment below.
1. Glymour, M., Jewell, N. P., & Pearl, J. (2016). Causal Inference in Statistics: A Primer. Wiley.
2. Fu, S. H., Gasparrini, A., Rodriguez, P. S., & Jha, P. (2018). Mortality attributable to hot and cold ambient temperatures in India: a nationally representative case-crossover study. PLoS medicine, 15(7), e1002619.