Top Climate Model Improved to Show ENSO Skill

Previous posts (linked at end) discuss how the climate model from the Russian Academy of Sciences (RAS) has evolved through several versions. Interest arose because of its superior ability to replicate the past temperature history. The model participates in the CMIP program, which is now moving to the next phase, CMIP7, and it is one of the first to be tested in a new round of climate simulations. Improvements in the latest version, INMCM60, show an enhanced ability to replicate ENSO oscillations in the Pacific Ocean, which have significant climate impacts worldwide.

This news comes by way of a new paper published in the Russian Journal of Numerical Analysis and Mathematical Modelling in February 2024. The title is ENSO phase locking, asymmetry and predictability in the INMCM Earth system model, by Seleznev et al. (2024). Excerpts in italics with my bolds and images from the article.

Abstract:

Advanced numerical climate models are known to exhibit biases in simulating some features of El Niño–Southern Oscillation (ENSO) which is a key mode of inter-annual climate variability. In this study we analyze how two fundamental features of observed ENSO – asymmetry between hot and cold states and phase-locking to the annual cycle – are reflected in two different versions of the INMCM Earth system model (state-of-the-art Earth system model participating in the Coupled Model Intercomparison Project).

We identify the above ENSO features using conventional empirical orthogonal function (EOF) analysis applied to both observed and simulated upper ocean heat content (OHC) data in the tropical Pacific. We find that the observed tropical Pacific OHC variability is described well by two leading EOF modes which roughly reflect the fundamental recharge–discharge mechanism of ENSO. These modes exhibit strong seasonal cycles associated with ENSO phase locking, while the revealed nonlinear dependencies between the amplitudes of these cycles reflect ENSO asymmetry.

We also assess and compare the predictability of observed and simulated ENSO based on linear inverse modeling. We find that the improved INMCM6 model has significant benefits in simulating the described features of observed ENSO as compared with the previous INMCM5 model. The improvements of the INMCM6 model providing such benefits are discussed. We argue that a proper cloud parametrization scheme is crucial for accurate simulation of ENSO dynamics with numerical climate models.

Introduction

El Niño–Southern Oscillation (ENSO) is the most prominent mode of inter-annual climate variability; it originates in the tropical Pacific but has a global impact [41]. Accurately simulating ENSO is still a challenging task for global climate modelers [3,5,15,25]. In the comprehensive study [35], large-ensemble climate model simulations provided by the Coupled Model Intercomparison Project phases 5 (CMIP5) and 6 (CMIP6) were analyzed. It was found that the CMIP6 models significantly outperform those from CMIP5 for 8 out of 24 ENSO-relevant metrics, especially regarding the simulation of ENSO spatial patterns, diversity and teleconnections. Nevertheless, some important aspects of observed ENSO are still not satisfactorily simulated by most state-of-the-art models [7,38,49]. In this study we aim to examine how two such aspects – ENSO asymmetry and ENSO phase-locking to the annual cycle – are reflected in the INMCM Earth system model [44,45].

The asymmetry between hot (El Niño) and cold (La Niña) states is a fundamental feature of observed ENSO [39]. El Niño events are often stronger than La Niña events, while the latter tend to be more persistent [10]. Such asymmetry is generally attributed to nonlinear feedbacks between sea surface temperatures (SSTs), the thermocline and winds in the tropical Pacific [2,19,28]. Alternative conceptions highlight the role of tropical instability waves [1] and fast atmospheric processes associated with irregular zonal wind anomalies [24]. ENSO phase-locking is identified as the tendency of ENSO events to peak in boreal winter.

Several studies [11,17,34] argue that the phase-locking is associated with seasonal changes in thermocline depth, ocean upwelling velocity, and cloud feedback processes. These processes collectively contribute to the modulation of coupling strength between ocean and atmosphere, which, in the context of conceptual ENSO models [4,18], provides seasonal modulation of the stability (in the sense of decay rate) of the “ENSO oscillator”. Another theory [20,42] supposes that phase-locking results from nonlinear interactions between the seasonal forcing and the inherent ENSO cycle. Both the asymmetry and phase-locking effects are typically captured by low-dimensional data-driven ENSO models [14,21,26,29,37].

In this work we identify the ENSO features discussed above via the analysis of upper ocean heat content (OHC) variability in the tropical Pacific. The recent study [37] analyzed a high-resolution reanalysis dataset of tropical Pacific (10°N–10°S, 120°E–80°W) OHC anomalies in the 0–300 m depth layer using the standard empirical orthogonal function (EOF) decomposition [16]. It was found that observed OHC variability is effectively captured by two leading EOFs, which roughly describe the fundamental recharge–discharge mechanism of ENSO [18]. The time series of the corresponding principal components (PCs) demonstrate strong seasonal cycles, reflecting ENSO phase-locking, while the revealed inter-annual nonlinear dependencies between these cycles can be associated with ENSO asymmetry [37].
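For readers who want to see the mechanics, here is a minimal sketch of an EOF (principal component) decomposition of a gridded anomaly field, the technique the paper applies to OHC data. The array name and shape are my own illustrations; the paper uses an ensemble EOF variant, and a full treatment would also weight grid cells by the square root of the cosine of latitude.

```python
import numpy as np

def leading_eofs(ohc, n_modes=2):
    # ohc: anomalies with shape (time, lat, lon); EOFs via SVD of the
    # time x space data matrix
    nt, nlat, nlon = ohc.shape
    X = ohc.reshape(nt, nlat * nlon)
    X = X - X.mean(axis=0)                      # remove the time mean per point
    U, s, Vt = np.linalg.svd(X, full_matrices=False)
    pcs = U[:, :n_modes] * s[:n_modes]          # principal component time series
    eofs = Vt[:n_modes].reshape(n_modes, nlat, nlon)  # spatial patterns
    explained = (s**2 / np.sum(s**2))[:n_modes]       # variance fractions
    return pcs, eofs, explained

# toy demo on synthetic data standing in for a 40-year monthly OHC field
ohc = np.random.default_rng(0).normal(size=(480, 20, 60))
pcs, eofs, var = leading_eofs(ohc)
print(pcs.shape, eofs.shape, var)
```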

Here we apply a similar analysis to the OHC data simulated by two different versions of the INMCM Earth system model. The first is the INMCM5 model [45] from CMIP6, and the second is the prospective INMCM6 model [44] with improved parameterization of clouds, large-scale condensation and aerosols. Along with the traditional EOF decomposition, we invoke linear inverse modeling to assess and compare the predictability of ENSO from observed and simulated data.
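Linear inverse modeling, mentioned here as the predictability tool, has a compact standard form: fit a linear propagator from lagged covariances of the principal components, then forecast by matrix exponentiation. The sketch below is a generic textbook LIM, not the paper's exact implementation; the names, the one-step lag, and the toy data are illustrative.

```python
import numpy as np
from scipy.linalg import logm, expm

def fit_lim(pcs, lag=1):
    # pcs: (time, n_modes) principal components; fit dx/dt = L x + noise
    x0, x1 = pcs[:-lag].T, pcs[lag:].T
    C0 = x0 @ x0.T / x0.shape[1]            # contemporaneous covariance
    Cl = x1 @ x0.T / x0.shape[1]            # lag covariance
    G = Cl @ np.linalg.inv(C0)              # propagator over `lag` steps
    return logm(G).real / lag               # continuous-time operator

def forecast(L, x, steps):
    return expm(L * steps) @ x              # deterministic LIM forecast

# toy stable AR(1) series standing in for the two leading PCs
rng = np.random.default_rng(0)
pcs = np.zeros((600, 2))
for t in range(599):
    pcs[t + 1] = 0.9 * pcs[t] + rng.normal(size=2)

L = fit_lim(pcs)
print(forecast(L, pcs[-1], steps=6))        # six-step-ahead sketch
```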

The paper is organized as follows. Sect. 2 describes the datasets we analyze: the OHC reanalysis dataset and OHC data obtained from ensemble simulations of global climate with two versions of the INMCM model. Data preparation, including separation of the forced and internal variability, is also discussed. The ensemble EOF analysis, used for identifying the meaningful processes contributing to observed and simulated ENSO dynamics, is presented. Sect. 3 presents the results we obtain in analyzing both observed and simulated OHC data. In Sect. 4 we summarize and discuss the obtained results, particularly regarding the significant benefits of the new version of the INMCM model (INMCM6) in simulating key features of observed ENSO.

Fig. 1: Two leading EOFs of the observed tropical Pacific upper ocean heat content (OHC) variability

Fig. 2: Two leading EOFs of the INMCM5 ensemble of tropical Pacific upper ocean heat content simulations

Fig. 3: The same as in Fig. 2 but for INMCM6 model simulations

The corresponding spatial patterns in Fig. 1 have a clear interpretation. The first contributes to the central and eastern tropical Pacific, where the most significant variations of sea surface temperature (SST) during El Niño/La Niña events occur [9]. The second predominates mainly in the western tropical Pacific and can be associated with the OHC accumulation and discharge before and during El Niño events [48].

What we can see from Fig. 2 is that the two leading EOFs of OHC variability simulated by the INMCM5 model do not correspond to the observed ones. The corresponding time series and spatial patterns exhibit smaller-scale features compared to those we obtain from the reanalysis data, indicating their noisier spatio-temporal nature.

The two leading EOFs of the improved INMCM6 model (Fig. 3), by contrast, capture well both the spatial and temporal features of the observed EOFs. In the next section we focus on further analysis of these EOFs, assuming that they contain the most meaningful information about ENSO dynamics.

Discussion

In this study we have analyzed how two different versions of the INMCM model [44,45] (a state-of-the-art Earth system model participating in the Coupled Model Intercomparison Project, CMIP) simulate some features of El Niño–Southern Oscillation (ENSO), a key mode of the global climate. We identified the ENSO features via EOF analysis applied to both observed and simulated upper ocean heat content (OHC) variability in the tropical Pacific. It was found that the observed tropical Pacific OHC variability is captured well by two leading modes (EOFs) which reflect the fundamental recharge–discharge mechanism of ENSO, involving a recharge and discharge of OHC along the equator caused by a disequilibrium between zonal winds and zonal mean thermocline depth. These modes are phase-shifted and exhibit the strong seasonal cycles associated with ENSO phase locking. The inter-annual dependencies between the amplitudes of the revealed ENSO seasonal cycles are strongly nonlinear, which reflects the asymmetry between hot (El Niño) and cold (La Niña) states of observed ENSO. We found that the INMCM5 model (the previous version of the INMCM model from CMIP6) poorly reproduces the leading modes of observed ENSO and reflects neither the observed ENSO phase locking nor the asymmetry. At the same time, the prospective INMCM6 model demonstrates significant improvement in simulating these key features of observed ENSO. The analysis of ENSO predictability based on linear inverse modeling indicates that the improved INMCM6 model reflects well the ENSO spring predictability barrier and therefore could potentially have an advantage in long-range prediction as compared with INMCM5.

Such benefits of the new version of the INMCM model (INMCM6) in simulating observed ENSO dynamics can be attributed to a more relevant parametrization of sub-grid-scale processes. In particular, the difference between INMCM5 and INMCM6 in the amplitude of the OHC anomaly associated with ENSO, shown in Figs. 2–3, can be explained mainly by the difference in cloud parameterization between these models. In short, in INMCM5 an El Niño event leads to an increase of middle and low clouds over the central and eastern Pacific, which causes cooling through a decrease in incoming surface shortwave radiation.

In INMCM6, by contrast, a decrease in low clouds and an increase in high clouds over the El Niño region during the positive phase of ENSO lead to further upper-ocean warming [43]. This is consistent with the recent study [36], which argued that erroneous cloud feedback arising from a dominant contribution of low-level clouds may lead to a heat flux feedback bias in the tropical Pacific, which plays a key role in ENSO dynamics. The fast decrease in OHC in the central Pacific after the El Niño maximum in INMCM6 probably occurs because of a too-shallow mixed layer in the equatorial Pacific in the model, which leads to fast surface cooling after the renewal of upwelling and a further increase of the trade winds. Summarizing the above, we conclude that a proper cloud parameterization scheme is crucial for accurate simulation of observed ENSO with numerical climate models.

Background on INMCM6

New 2023 INMCM RAS Climate Model First Results

The INMCM60 model, like the previous INMCM48 [1], consists of three major components: atmospheric dynamics, aerosol evolution, and ocean dynamics. The atmospheric component incorporates a land model including surface, vegetation, and soil. The oceanic component also encompasses a sea-ice evolution model. In the atmosphere, both versions have a spatial resolution of 2° × 1° in longitude by latitude and 21 vertical levels up to 10 hPa. In the ocean, the resolution is 1° × 0.5° with 40 levels.

The following changes have been introduced into the model compared to INMCM48.

Parameterization of clouds and large-scale condensation is identical to that described in [4], except that the tuning parameters differ from any of the versions outlined in [3], being, however, closest to version 4. The main difference from it is that the cloud water flux forming boundary-layer clouds is estimated not only from the development of boundary-layer turbulence, but also from the condition of moist instability, which, under deep convection, results in fewer clouds in the boundary layer and more in the upper troposphere. The equilibrium sensitivity of this version to a doubling of atmospheric CO2 is about 3.3 K.
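For context on what an equilibrium sensitivity of 3.3 K per doubling means in practice, here is a back-of-envelope sketch. It combines the stated sensitivity with the standard logarithmic dependence of CO2 forcing on concentration; the 280 ppm preindustrial baseline and the sample concentrations are my illustrations, not values from the paper.

```python
from math import log

ECS = 3.3                                   # K per CO2 doubling, as stated above

def equilibrium_warming(c_ppm, c0_ppm=280.0):
    # warming scales with the number of doublings: ECS * log2(C/C0)
    return ECS * log(c_ppm / c0_ppm) / log(2.0)

print(equilibrium_warming(560.0))           # exactly one doubling -> 3.3 K
print(equilibrium_warming(420.0))           # ~1.9 K at roughly today's level
```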

The aerosol scheme has also been updated by including a change in the calculation of natural emissions of sulfate aerosol [5] and wet scavenging, as well as the influence of aerosol concentration on the cloud droplet radius, i.e., the first indirect effect [6]. Numerical values of the constants, however, were taken to be a little different from those used in [5]. Additionally, an improved scheme of snow evolution taking into account refreezing, and a new calculation of snow albedo [7], were introduced to the model. The calculation of universal functions in the atmospheric boundary layer in stable stratification has also been changed: in the latest model version, such functions allow turbulence even at large gradient Richardson numbers [8].

 

Hot Climate Models Not Fit For Policymaking

Roy Spencer has published a study at Heritage Global Warming: Observations vs. Climate Models.  Excerpts in italics with my bolds.

Summary

Warming of the global climate system over the past half-century has averaged 43 percent less than that produced by computerized climate models used to promote changes in energy policy. In the United States during summer, the observed warming is much weaker than that produced by all 36 climate models surveyed here. While the cause of this relatively benign warming could theoretically be entirely due to humanity’s production of carbon dioxide from fossil-fuel burning, this claim cannot be demonstrated through science. At least some of the measured warming could be natural. Contrary to media reports and environmental organizations’ press releases, global warming offers no justification for carbon-based regulation.

KEY TAKEAWAYS
  1. The observed rate of global warming over the past 50 years has been weaker than that predicted by almost all computerized climate models.
  2. Climate models that guide energy policy do not even conserve energy, a necessary condition for any physically based model of the climate system.
  3. Public policy should be based on climate observations—which are rather unremarkable—rather than climate models that exaggerate climate impacts.

For the purposes of guiding public policy and for adaptation to any climate change that occurs, it is necessary to understand the claims of global warming science as promoted by the United Nations Intergovernmental Panel on Climate Change (IPCC).  When it comes to increases in global average temperature since the 1970s, three questions are pertinent:

  1. Is recent warming of the climate system materially attributable to anthropogenic greenhouse gas emissions, as is usually claimed?
  2. Is the rate of observed warming close to what computer climate models—used to guide public policy—show?
  3. Has the observed rate of warming been sufficient to justify alarm and extensive regulation of CO2 emissions?

While the climate system has warmed somewhat over the past five decades,
the popular perception of a “climate crisis” and resulting calls for economically
significant regulation of CO2 emissions is not supported by science.

Discussion Points

Temperature Change Is Caused by an Imbalance Between Energy Gain and Energy Loss.

Recent Warming of the Climate System Corresponds to a Tiny Energy Imbalance.

Climate Models Assume Energy Balance, but Have Difficulty Achieving It.

Global Warming Theory Says Direct Warming from a Doubling of CO2 Is Only 1.2°C.

Climate Models Produce Too Much Warming.

Climate models are not only used to predict future changes (forecasting), but also to explain past changes (hindcasting). Depending on where temperatures are measured (at the Earth’s surface, in the deep atmosphere, or in the deep ocean), it is generally true that climate models have a history of producing more warming than has been observed in recent decades.

This disparity is not true of all the models: two models (both Russian) produce warming rates close to what has been observed, but those are not the ones used to promote the climate crisis narrative. Instead, those producing the greatest amount of climate change usually make their way into, for example, the U.S. National Climate Assessment, the congressionally mandated evaluation of what global climate models project for climate in the United States.

The best demonstration of the tendency of climate models to overpredict warming is a direct comparison between models and observations for global average surface air temperature, shown in Chart 1.

In this plot, the average of five different observation-based datasets (blue) is compared to the average of 36 climate models taking part in the sixth Coupled Model Intercomparison Project (CMIP6). The models have produced, on average, 43 percent faster warming than has been observed from 1979 to 2022. This is the period of the most rapid increase in global temperatures and anthropogenic greenhouse gas emissions, and it also corresponds to the period for which satellite observations exist (described below). This discrepancy between models and observations is seldom mentioned, despite the fact that it is, roughly speaking, the average of the models (or even the most extreme models) that is used to promote policy changes in the U.S. and abroad.
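The comparison behind the 43 percent figure is just two linear trends and a ratio. A hedged sketch, with synthetic series standing in for the real datasets (the observational mean and the 36-model CMIP6 average):

```python
import numpy as np

rng = np.random.default_rng(0)
years = np.arange(1979, 2023)
# synthetic stand-ins: obs trending ~0.18 C/decade, models ~0.26 C/decade
obs = 0.018 * (years - 1979) + rng.normal(0, 0.05, years.size)
models = 0.026 * (years - 1979) + rng.normal(0, 0.05, (36, years.size))

def trend(series):
    return np.polyfit(years, series, 1)[0]      # degrees C per year

ratio = np.mean([trend(m) for m in models]) / trend(obs)
print(f"model-to-observed trend ratio: {ratio:.2f}")  # ~1.4 by construction
```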

Summertime Warming in the United States

While global averages produce the most robust indicator of “global” warming, regional effects are often of more concern to national and regional governments and their citizens. For example, in the United States large increases in summertime heat could affect human health and agricultural crop productivity. But as Chart 2 shows, surface air temperatures during the growing season (June, July, and August) over the 12-state Corn Belt for the past 50 years reveal a large discrepancy between climate models and observations, with all 36 models producing warming rates well above what has been observed and the most extreme model producing seven times too much warming.  

The fact that global food production has increased faster than population growth in the past 60 years suggests that any negative impacts due to climate change have been small. In fact, “global greening” has been documented to be occurring in response to more atmospheric CO2, which enhances both natural plant growth and agricultural productivity, leading to significant agricultural benefits.

These discrepancies between models and observations are never mentioned when climate researchers promote climate models for energy policy decision-making. Instead, they exploit exaggerated model forecasts of climate change to concoct exaggerated claims of a climate crisis.

Global Warming of the Lower Atmosphere

While near-surface air temperatures are clearly important to human activity, the warming experienced over the low atmosphere (approximately the lowest 10 kilometers of the “troposphere,” where the Earth’s weather occurs) is also of interest, especially given the satellite observations of this layer extending back to 1979.

Satellites provide the only source of geographically complete coverage of the Earth, except very close to the North and South Poles.

Chart 3 shows a comparison of the temperature of this layer as produced by 38 climate models (red) and how the same layer has been observed to warm in three radiosonde (weather balloon) datasets (green), three global reanalysis datasets (which use satellites, weather balloons, and aircraft data; black), and three satellite datasets (blue).

Conclusion

Climate models produce too much warming when compared to observations over the past fifty years or so, which is the period of most rapid warming and increases in carbon dioxide in the atmosphere. The discrepancy ranges from over 40 percent for global surface air temperature to about 50 percent for global lower atmospheric temperatures, and up to a factor of two to three for the United States in the summertime. This discrepancy is never mentioned when those same models are used as the basis for policy decisions.

Also not mentioned when discussing climate models is their reliance on the assumption that there are no natural sources of long-term climate change. The models must be “tuned” to produce no climate change, and then a human influence is added in the form of a very small, roughly 1 percent change in the global energy balance. While the resulting model warming is claimed to prove that humans are responsible, clearly this is circular reasoning. It does not necessarily mean that the claim is wrong—only that it is based on faith in assumptions about the natural climate system that cannot be shown to be true from observations.

Finally, possible chaotic internal variations will always lead to uncertainty in both global warming projections and explanation of past changes. Given these uncertainties, policymakers should proceed cautiously and not allow themselves to be influenced by exaggerated claims based on demonstrably faulty climate models.

Roy W. Spencer, PhD, is Principal Research Scientist at the University of Alabama in Huntsville.

 

 

Climate Models Hide the Paleo Incline

Figure 1. Anthropogenic and natural contributions. (a) Locked scaling factors, weak Preindustrial Climate Anomalies (PCA). (b) Free scaling, strong PCA

In 2009, the iconic email from the Climategate leak included a comment by Phil Jones about the “trick” used by Michael Mann to “hide the decline” in his Hockey Stick graph, referring to tree proxy temperatures cooling rather than warming in modern times. Now we have an important paper demonstrating that climate models insist on man-made global warming only by hiding the incline of natural warming in pre-industrial times. The paper is From Behavioral Climate Models and Millennial Data to AGW Reassessment by Philippe de Larminat. H/T No Tricks Zone. Excerpts in italics with my bolds.

Abstract

Context. The so-called AGW (Anthropogenic Global Warming) is based on thousands of climate simulations indicating that human activity is virtually solely responsible for the recent global warming. The climate models used are derived from the meteorological models used for short-term predictions. They are based on the fundamental and empirical physical laws that govern the myriad of atmospheric and oceanic cells integrated by the finite element technique. Numerical approximations, empiricism and the inherent chaos in fluid circulations make these models questionable for validating the anthropogenic principle, given the accuracy required (better than one part per thousand) in determining the Earth's energy balance.

Aims and methods. The purpose is to quantify and simulate behavioral models of weak complexity, without referring to predefined parameters of the underlying physical laws, but relying exclusively on generally accepted historical and paleoclimate series.

Results. These models perform global temperature simulations that are consistent with those from the more complex physical models. However, the apportionment of contributions to the present warming depends strongly on the retained temperature reconstructions, in particular the magnitudes of the Medieval Warm Period and the Little Ice Age. It also depends on the level of the solar activity series. It follows from these observations and climate reconstructions that the anthropogenic principle only holds for climate profiles assuming almost no PCA and no significant variations in solar activity. Otherwise, it reduces to a weak principle whereby global warming is not only the result of human activity, but is largely due to solar activity.

Discussion

GCMs (short for AOGCM: Atmosphere–Ocean General Circulation Models, or Global Climate Models) are fed by series related to climate drivers. Some are of human origin: fossil fuel combustion, industrial aerosols, changes in land use, condensation trails, etc. Others are of natural origin: solar and volcanic activities, Earth's orbital parameters, geomagnetism, internal variability generated by atmospheric and oceanic chaos. These drivers, or forcing factors, are expressed in their own units: total solar irradiance (W m–2), atmospheric concentrations of GHG (ppm), optical depth of industrial or volcanic aerosols (dimensionless), oceanic indexes (ENSO, AMO…), or annual growth rates (%). Climate scientists have introduced a metric in order to characterize the relative impact of the different climate drivers on climate change. This metric is that of radiative forcings (RF), designed to quantify climate drivers through their effects on the terrestrial radiation budget at the top of the atmosphere (TOA).

However, independently of the physical units and associated energy properties of the RFs, one can recognize their signatures in the output and deduce their contributions. For example, volcanic eruptions are identifiable events whose contributions can be quantified without reference either to their assumed radiative forcings or to physical modeling of aerosol diffusion in the atmosphere. Similarly, the Preindustrial Climate Anomalies (PCA), gathering the Medieval Warm Period (MWP) and the Little Ice Age (LIA), show a profile similar to that of the solar forcing reconstructions. Per the methodology proposed in this paper, the respective contributions of the RF inputs are quantified through behavioral, or black-box, models.

Now, Figures 1-a and 1-b present simulations obtained from the models identified under two different sets of assumptions, detailed in sections 6 and 7 respectively.

Figure 1. Anthropogenic and natural contributions. (a) Locked scaling factors, weak Preindustrial Climate Anomalies (PCA). (b) Free scaling, strong PCA

In both cases, the overall result for the global temperature simulation (red) fits fairly well with the observations (black).  Curves also show the forcing contributions to modern warming (since 1850). From this perspective, the natural (green) and anthropogenic (blue) contributions are in strong contradiction between panels (a) and (b). This incompatibility is at the heart of our work.

Simulations in panel (a) are calculated per section 6, where the scaling multipliers provided in the model are locked to unity, so that the radiative forcing inputs strictly comply with the IPCC quantification. The remaining parameters of the black-box model are adjusted to minimize the deviation between the observations (black curve) and the simulated outputs (red). Under these assumptions, the resulting contributions (blue vs. green) comply with the AGW principle. Also, the conformity of the results with those of the CMIP models supports the validity of the type of behavioral model adopted for our simulations.
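The role of the scaling multipliers can be illustrated with a tiny least-squares stand-in for the paper's behavioral model. Real black-box identification includes dynamics (lags, ocean response); this static version, on synthetic data of my own making, only shows the difference between "locked" (multipliers fixed at one) and "free" scaling:

```python
import numpy as np

rng = np.random.default_rng(1)
n = 170                                    # years, e.g. 1850-2019
F = rng.normal(size=(n, 3))                # toy forcing contributions:
                                           # anthropogenic, solar, volcanic
true_scale = np.array([0.9, 0.5, 0.3])     # "true" multipliers in this toy world
T_obs = F @ true_scale + rng.normal(0, 0.1, n)

# free scaling: fit the multipliers to the observed temperature
scale, *_ = np.linalg.lstsq(F, T_obs, rcond=None)
print(scale)                               # recovers ~[0.9, 0.5, 0.3]

# locked scaling: multipliers forced to unity, as in panel (a)
resid_locked = T_obs - F.sum(axis=1)
resid_free = T_obs - F @ scale
print(np.std(resid_locked), np.std(resid_free))   # locked case fits worse
```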

Paleoclimate Temperatures

Although historically documented, the Medieval Warm Period (MWP) and the Little Ice Age (LIA) command no consensus about their amplitudes and geographic extents [2,3]. In Fig. 7.1-c of the First Assessment Report of the IPCC, a reconstruction showed a peak PCA amplitude of about 1.2 °C [4]. Later, the so-called ‘hockey stick’ reconstruction was reproduced five times in the IPCC Third Assessment Report (2001), wherein there was no longer any significant MWP [5].

After the 2003 controversies, reference to this reconstruction disappeared from subsequent IPCC reports: it is not included among the fifteen paleoclimate reconstructions covering the millennial period listed in the fifth report (AR5, 2013) [6]. Nevertheless, AR6 (2021) revived a hockey-stick reconstruction from a consortium initiated by the network “PAst Global changES” (PAGES) [7,8]. The IPCC assures (AR6, 2.3.1.1.2): “this synthesis is generally in agreement with the AR5 assessment”.

Figure 2 below puts this claim into perspective. It shows the fifteen reconstructions covering the preindustrial period accredited by the IPCC in AR5 (2013, Figs. 5.7 to 5.9 and Table 5.A.6), compiled (Pangaea database) by [7]. Visibly, the claimed agreement of the PAGES2k reconstruction (blue) with the AR5 green lines does not hold.

Figure 2. Weak and strong preindustrial climate anomalies, respectively from AR5 (2013) in green and AR6 (2021) in blue.

Conclusion

In section 8 above, a set of consistent climate series is explored, from which solar activity appears to be the main driver of climate change. To rule out this hypothesis, the anthropogenic principle requires four simultaneous assessments:

♦  A strong anthropogenic forcing, able to account for all of the current warming.
♦  A low solar forcing.
♦  A low internal variability.
♦  The nonexistence of significant pre-industrial climate anomalies, which could indeed be explained by strong solar forcing or high internal variability.

None of these conditions is strongly established, either by theoretical knowledge or by historical and paleoclimatic observations. On the contrary, our analysis challenges them through a weak-complexity model, fed by accepted forcing profiles which are recalibrated against climate observations. The simulations show that solar activity contributes to current climate warming in proportions depending on the assessed pre-industrial climate anomalies.

Therefore, adherence to the anthropogenic principle requires that, when reconstructing climate data, the Medieval Warm Period and the Little Ice Age be reduced to nothing, and that any series of strongly varying solar forcing be discarded.

Background on Disappearing Paleo Global Warming

The first graph appeared in the IPCC 1990 First Assessment Report (FAR), credited to H. H. Lamb, first director of CRU-UEA. The second graph, the famous hockey stick credited to M. Mann, was featured in the 2001 IPCC Third Assessment Report (TAR).

Rise and Fall of the Modern Warming Spike

 

Self Imposed Energy Poverty Coming to Canada

Jock Finlayson describes how climate change policies are depleting Canadians’ financial means in his article Millions of Canadians May Face ‘Energy Poverty’.  Excerpts in italics with my bolds and added images.

The term “energy poverty” is not yet part of day-to-day political debate in Canada, but that’s likely to change in the next few years. In Europe, the high and rising cost of energy has become a political lightning rod in several countries including Britain and France. Something similar may be in store for Canada.

The Trudeau government and some of the provinces are
aggressively pursuing the holy grail of decarbonization.

To achieve this, they’re engineering dramatic increases in carbon and other taxes on fossil fuels and promising to pour vast sums of money into building new electricity generation and transmission infrastructure to help reduce reliance on oil, refined petroleum products, natural gas and coal. Both strategies point to higher energy costs.

Tax advocates say it is a small percentage of GDP. But it is still $10 billion extracted from Canadian households.

The Trudeau government has legislated a national minimum carbon tax set to reach $170 per tonne of emissions by 2030, up from $50 in 2022 and $65 currently. Ottawa has also imposed a “clean fuel standard” that will further raise the cost of fuel. These policies are driven by concerns over climate change, which is a risk, to be sure, but so is the prospect of rapidly escalating energy prices for Canadian households and businesses.

Energy poverty arises when households and families must devote a significant fraction of their after-tax income to cover the cost of energy used for transportation, home heating and cooking, and the provision of electricity. In 2022, the United Kingdom government estimated that 13.4 percent of households were in energy poverty, which it defined as needing to spend more than 10 percent of income to cover the cost of directly consumed energy.

There’s no single agreed methodology for assessing the prevalence of energy poverty. A recent Canadian study reports that in 2017, between 6 percent and 19 percent of Canadian households experienced some form of energy poverty, with an above-average incidence in rural areas, Atlantic Canada and among people living in older single-family homes. If accurate, this finding suggests that many more Canadians will soon become acquainted with the term as taxes on fossil fuels climb and governments impose new regulations affecting the energy efficiency of buildings, vehicles, industrial equipment, appliances and agricultural operations.
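The UK-style threshold is easy to state precisely. A toy check, with made-up household figures, of the "more than 10 percent of after-tax income on directly consumed energy" definition quoted above:

```python
# illustrative only: incomes and energy spends below are invented
def energy_poor(after_tax_income, energy_spend, threshold=0.10):
    return energy_spend / after_tax_income > threshold

households = [(62_000, 4_100), (38_000, 4_600), (95_000, 5_200)]
share = sum(energy_poor(i, e) for i, e in households) / len(households)
print(f"{share:.0%} of this toy sample is in energy poverty")  # 33%
```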

Canada is blessed with plentiful and diverse supplies of energy. Over time, we have become an important global producer and exporter of energy, with oil, natural gas and electricity together expected to account for one-quarter of Canada’s merchandise exports in 2023. Canada is also an intensive consumer of energy, in part because of our cold climate, dispersed population and relatively high living standards.

80% of the Other Renewables is solid biomass (wood), which leaves at most 1% of Canadian total energy supply coming from wind and solar.

End-use energy demand in Canada is around 13,000 petajoules. Of this, industry is responsible for about half, followed by transportation, residential buildings, commercial buildings and agriculture. Refined petroleum products—all based on oil—are the largest fuel type consumed in Canada (around 40 percent of the total), followed by natural gas (36 percent) and electricity (16 percent). Biofuels and other smaller sources comprise the rest. These data underscore Canadians’ overwhelming dependence on fossil fuels to meet their energy needs.

Politicians in a hurry to slash greenhouse gas emissions via higher taxes
and more regulations must be alert to the risk that millions of Canadians
could find themselves in energy poverty by the end of the decade.

Jock Finlayson is a Senior Fellow at the Fraser Institute.

See Also Canada Budget Officer Quashes Climate Alarm

 

IPCC Guilty of “Prosecutor’s Fallacy”

IPCC made an illogical argument in a previous report as explained in a new GWPF paper The Prosecutor’s Fallacy and the IPCC Report.  Excerpts in italics with my bolds and added images.

London, 13 September – A new paper from the Global Warming Policy Foundation reveals that the IPCC’s 2013 report contained a remarkable logical fallacy.

The author, Professor Norman Fenton, shows that the authors of the Summary for Policymakers claimed, with 95% certainty, that more than half of the warming observed since 1950 had been caused by man. But as Professor Fenton explains, their logic in reaching this conclusion was fatally flawed.

“Given the observed temperature increase, and the output from their computer simulations of the climate system, the IPCC rejected the idea that less than half the warming was man-made. They said there was less than a 5% chance that this was true.”

“But they then turned this around and concluded that there was a 95% chance
that more than half of observed warming was man-made.”

This is an example of what is known as the Prosecutor’s Fallacy, in which the probability of a hypothesis given certain evidence, is mistakenly taken to be the same as the probability of the evidence given the hypothesis.

As Professor Fenton explains

“If an animal is a cat, there is a very high probability that it has four legs.
However, if an animal has four legs, we cannot conclude that it is a cat.
It’s a classic error, and is precisely what the IPCC has done.”

Professor Fenton’s paper is entitled The Prosecutor’s Fallacy and the IPCC Report.

What the number does and does not mean

Recall that the particular ‘climate change number’ that I was asked to explain was the number 95: specifically, relating to the assertion made in the IPCC 2013 Report of ‘at least 95% degree of certainty that more than half the recent warming is man-made’.  The ‘recent warming’ related to the period 1950–2010. So, the assertion is about the probability of humans causing most of this warming.

Before explaining the problem with this assertion, we need to make clear that (although superficially similar) it is very different from another, more widely known assertion (still promoted by NASA) that ‘97% of climate scientists agree that humans are causing global warming and climate change’. That assertion was simply based on a flawed survey of authors of published papers and has been thoroughly debunked.

The 95% degree of certainty is a more serious claim.
But the  case made for it in the IPCC report is also flawed.

[Comment: In the short video above, Norman Fenton explains the fallacy the IPCC committed. Synopsis of his example: A man dies in a very rowdy gathering of young men. A size 13 footprint is found on the body. Fred is picked up by the police. He admits to being there but not to killing anyone, despite wearing size 13 shoes. Since statistics show that only 1% of young men have size 13 feet, the prosecutor claims a 99% chance that Fred is guilty. But the crowd was reported to be on the order of 1,000, so there were likely about 10 men with size 13 shoes. In fact there is only about a 10% chance that Fred is guilty.]
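The synopsis can be worked in a few lines. All numbers come from the example itself; the point is that the 1 percent figure describes the probability of the evidence given innocence, not the probability of guilt given the evidence:

```python
p_size13 = 0.01              # share of young men with size 13 feet
crowd = 1000                 # attendees at the gathering
matches = p_size13 * crowd   # ~10 men fit the footprint
p_guilty = 1 / matches       # the one guilty man is among the ~10 matches
print(p_guilty)              # 0.1 -- not the prosecutor's 0.99
```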

The flaw in the IPCC summary report

It turns out that the assertion that ‘at least 95% degree of certainty that more than half the recent warming is man-made’ is  based on the same fallacy. In my article about the programme, I highlighted this concern as follows:

The real probabilistic meaning of the 95% figure. In fact it comes from a classical hypothesis test in which observed data is used to test the credibility of the ‘null hypothesis’. The null hypothesis is the ‘opposite’ statement to the one believed to be true, i.e. ‘Less than half the warming in the last 60 years is man-made’. If, as in this case, there is only a 5% probability of observing the data if the null hypothesis is true, the statisticians equate this figure (called a p-value) to a 95% confidence that we can reject the null hypothesis.

But the probability here is a statement about the data given the hypothesis. It is not generally the same as the probability of the hypothesis given the data (in fact, equating the two is often referred to as the ‘prosecutor’s fallacy’, since it is an error often made by lawyers when interpreting statistical evidence).
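[Comment: In symbols, with H the hypothesis and D the data, Bayes' theorem relates the two conditionals; they are equal only in special cases, since the prior probabilities enter the relation:]

```latex
P(H \mid D) \;=\; \frac{P(D \mid H)\,P(H)}{P(D)}
\qquad\Rightarrow\qquad
P(H \mid D) \neq P(D \mid H) \ \text{in general}
```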

IPCC defined ‘extremely likely’ as at least 95% probability. The basis for the claim is found in Chapter 10 of the detailed Technical Summary, which describes various climate change simulation models that reject the null hypothesis (that more than half the warming was not man-made) at the 5% significance level. Specifically, in the simulation models, if you assumed that there was little man-made impact, then there was less than a 5% chance of observing the warming that has been measured. In other words, the models do not support the null hypothesis of little man-made climate change. The problem is that, even if the models were accurate (and it is unlikely that they are), we cannot conclude that there is at least a 95% chance that more than half the warming was man-made, because doing so commits the fallacy of the transposed conditional.

The illusion of confidence in the coin example comes from ignoring the ‘prior probability’ of how rare double-headed coins are. Similarly, in the case of climate change no allowance is made for the prior probability of man-made climate change, i.e. how likely it is that humans rather than other factors such as solar activity cause most of the warming. After all, previous periods of warming certainly could not have been caused by increased greenhouse gases from humans, so it seems reasonable to assume – before we have considered any of the evidence – that the probability that humans caused most of the recent increase in temperature is very low.

Only the assumptions of the simulation models are allowed,
and other explanations are absent.

In both of these circumstances, classical statistics can then be used to deceive you into presenting an illusion of confidence when it is not justified.

See Also 

Beliefs and Uncertainty: A Bayesian Primer

 

You pick one unopened door. Monty opens one other door. Do you stay with your choice or switch?

Monty Hall Problem Simulator
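For readers who want to replicate the simulator's result, here is a minimal simulation of the game described in the caption (names and structure are my own): staying wins about one time in three, switching about two in three.

```python
import random

def play(switch, trials=100_000):
    wins = 0
    for _ in range(trials):
        prize, pick = random.randrange(3), random.randrange(3)
        # Monty opens a door that is neither the pick nor the prize
        opened = next(d for d in range(3) if d != pick and d != prize)
        if switch:
            pick = next(d for d in range(3) if d != pick and d != opened)
        wins += pick == prize
    return wins / trials

print(play(switch=False))   # ~0.33: staying wins 1/3 of the time
print(play(switch=True))    # ~0.67: switching wins 2/3 of the time
```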

 

Koonin’s Climate Honesty

Steven Koonin shared his honest and wise perspective on global warming/climate change in the interview above.  For those who prefer reading, an excerpted transcript from the closed captions provides the highlights in italics with my bolds and added images.

PR: Welcome to Uncommon Knowledge; I'm Peter Robinson. Now a professor at New York University and a fellow at the Hoover Institution, Steven Koonin received a Bachelor of Science degree at Caltech and a doctorate in physics at MIT, during a career in which he published more than 200 peer-reviewed scientific papers and a textbook on computational physics. Dr. Koonin rose to become Provost of Caltech. In 2009 President Obama appointed him Under Secretary for Science at the Department of Energy, a position Dr. Koonin held for some two and a half years. During that time he found himself shocked by the misuse of climate science in politics and the press. In 2021 Dr. Koonin published Unsettled: What Climate Science Tells Us, What It Doesn't, and Why It Matters.

In Unsettled you write of a 2014 workshop for the American Physical Society, which means it's you and a bunch of other people whom I cannot even begin to follow. Serious professional scientists such as you and several colleagues were asked to subject current climate science to a stress test: to push it, to prod it, to test it, to see how good it was. From Unsettled, I'm quoting you now Steve:

“I’m a scientist; I work to understand the world through measurements and observations. I came away from the workshop not only surprised but shaken by the realization that climate science was far less mature than I had supposed.”

Let’s start with the end of that. What had you supposed?

SK: Well I had supposed that humans were warming the globe; carbon dioxide was accumulating in the atmosphere causing all kinds of trouble, melting ice caps, warming oceans and so on. And the data didn’t support a lot of that. And the projections of what would happen in the future relied on models that were, let’s say, shaky at best.

PR: All right. Former Senator John Kerry is now President Biden’s special Envoy for climate. Let me quote from John Kerry in a 2021 address to the UN Security Council:

“Net zero emissions by 2050 or earlier is the only way that science tells us we can limit this planet’s warming to 1.5 degrees Celsius. Why is that so crucial? Because overwhelming evidence tells us that anything more will have catastrophic implications. We are marching forward in what is tantamount to a mutual suicide pact.”

Overwhelming evidence, science tells us. What's wrong with that?

SK: Well, you should look at the actual science, which I suspect Ambassador Kerry has not done. The U.N. puts out assessment reports every five or six years. Those are by the IPCC, the Intergovernmental Panel on Climate Change, and are meant to survey, assess and summarize the state of our knowledge about the climate. The most recent one came out about a year ago, in 2022; the previous one in 2014 or so.

Those reports are massive: the latest one is three thousand pages, and it took 300 scientists a couple of years to write. And you really need to be a scientist to understand them. I have a background in theoretical physics; I can understand this stuff. But still it took me a couple of years to really understand what goes on. Ambassador Kerry and other politicians certainly have not done that.

Likely he's getting his information from the Summary for Policymakers, or more likely from an even further boiled-down version. And as you boil down the good assessment into the summary, and then into more condensed versions, there's plenty of room for mischief. That mischief is evident when you compare what comes out the end of that game of telephone with what the actual science really is.

PR: All right: what we know and what we don't. Let's start with what we know. I'm quoting you again, Steve, from Unsettled: “Not everything you've heard about climate science is wrong.” In particular you grant in this book two of the central premises or conclusions of climate science that the press is always telling us about. Here's one, and again I'm going to quote you:

“Surely we can all agree that the globe has gotten warmer
over the last several decades.”

SK: No debunking. In fact it's gotten warmer over the last four centuries. Now that's a different assertion, but it's equally supported by the assessment reports. We'll have to come back to that, because the time scale is important. It's one thing to say this about the climate of the surface of this planet within my own lifetime, and it's an entirely different thing to say that temperatures began to rise 150 years before this nation was founded.

PR: Yes, it's a different statement, but it's equally true and has some bearing on the warming that we've seen over the last century. Here's the second premise that you do grant; again I'm going to quote Unsettled:

“There is no question that our emission of greenhouse gases, in particular CO2, is exerting a warming influence on the planet. We're pumping CO2 into the atmosphere, CO2 is a greenhouse gas, it must be having some effect of course.”

Absolutely, that's as far as you're willing to go. So those are two pretty benign premises that you grant: the Earth has been warming, and it's been warming for a long time; CO2 is a greenhouse gas, it must be having some effect, and it's coming from human activities, mostly fossil fuels. Now on to what we don't know. Again from Unsettled:

“Even though human influences could have serious consequences for the climate, they are small in relation to the climate system as a whole. That sets a very high bar for projecting the consequences of human influences.”

That is so counter to the general understanding that informs the headlines, particularly this hot summer we've had. So explain that.

SK: Human influences, as described by the IPCC, are a one percent effect on the radiation flow – the flow of heat radiation and sunlight in the atmosphere. That means your understanding had better be at the one percent level or better if you're going to predict how the climate system is going to respond. And the one percent makes sense, because the changes in temperature we're talking about are three kelvin, whereas the average temperature of the earth is about 300 kelvin.
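[Comment: the arithmetic behind the one percent:]

```latex
\frac{\Delta T}{T} \;\approx\; \frac{3\,\mathrm{K}}{300\,\mathrm{K}} \;=\; 1\%
```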

PR: So human influences are a one percent effect on a complicated, chaotic, multi-scale system for which we have poor observations. Yet you seem to be quite relaxed about the underlying science.

SK: The underlying science is expressed in the data and in the research literature: the journals, the research papers that people produce, the conference proceedings and so on. The IPCC takes those and assesses and summarizes them, and in general it does a pretty good job at that level. And there's not going to be much politics in that, although they might quibble among themselves about adjectives and adverbs – this is extremely certain, or this is unlikely, or highly unlikely, and so on. But by and large it's pretty good; this is done by fellow professionals in a professional manner.

Now things begin to go wrong. The next step, because nobody who isn't deeply in the field is going to read all that stuff, is a formal process to create a Summary for Policymakers, which is initially drafted by the governments, not by the scientists. Well, not all of them of course; there's some subcommittee to do the Summary for Policymakers, and that gets drafted and passed by the scientists for comment. In the end it's the governments who have approved the Summary for Policymakers line by line, and that's where the disconnect happens.

For the disconnect I'll give you an example. Look at the most recent report: the Summary for Policymakers talks about deaths from extreme heat – incremental deaths – and it says that extreme heat or heat waves have contributed to mortality, and that's a true statement. But they forgot to tell you that the warming of the planet decreases the incidence of extreme cold events. And since nine times as many people around the globe die from extreme cold as from extreme heat, the warming of the planet has actually cut the number of deaths from extreme temperatures by a lot. That's not in there at all.

So the statement was completely factual, but factually incomplete
in a way meant to alarm, not to inform. 

And then John Kerry stands up and gives a speech. Maybe he read the SPM, I don't know, or his staff read it and prepared some talking points. And so you get Kerry saying that, and you get the Secretary General of the U.N., Guterres, saying we're on a highway to climate hell with our foot on the accelerator. But these statements are preposterous, of course – even by the IPCC reports they're preposterous. The climate scientists are negligent for not speaking up and saying that's not okay.

PR: Another of the things going wrong you write about in a way that I have never seen before: computer models. I have never seen anybody make computer models interesting. So congratulations, Steve, you did something special, as far as I know, in the entire corpus of the English language.

Here I’m going to quote from a piece you published in the Wall Street Journal not long ago:

“Projections of future climate and weather events rely on models
demonstrably unfit for the purpose.”

SK: Well, to make a projection of future climate you need to build this big complicated computer model which is really one of the grand computational challenges of all time.

This is something I know: I wrote a textbook in the 1980s, when the first PCs came out, about how to do physics modeling on computers. I do know what I'm talking about. And then you have to feed into the model what you think future emissions are going to be, and the IPCC has five or six different scenarios: high emissions, low emissions. Then you take a particular scenario and feed it into the roughly 50 different models that exist, developed by groups around the world.

So Caltech has a model, Harvard has a model, Oxford too. The Chinese have several models, the Russians and so on. When you feed the same scenario into those different models you get a range of answers, and the range is as big as the change you're trying to describe. We can go into the reasons for that uncertainty. And in the latest generation of models, about 40 percent of them were deemed too sensitive to be of much use.

Too sensitive meaning that when you add the carbon dioxide, the temperature goes up too fast compared to what we've seen already. So that's really disheartening: the world's best models are trying as hard as they can, and they get it very wrong at least 40 percent of the time.

This is not only my assessment. You can look at papers published by Tim Palmer and Bjorn Stevens, who are serious modelers within the consensus, and their own phrase is that these models are not fit for purpose, at least at the regional or more detailed global level.

PR: Quoting Unsettled again, and this is one of the most astonishing passages in the book, writing about the effects of the increases in computing power over the years:

“Having better tools and information to work with should make the models
more accurate and more in line with each other. This has not happened.
The spread in results among different computer models is increasing.”

This one you're going to have to explain to me. As our modeling power, as our processing power increases, we should be closing in on reliable conclusions, and yet they seem to be receding faster than we can approach them. If I've got that correct, how can that be?

SK: Because as the models become more sophisticated, that means either you made the grid boxes in the model a little bit smaller, so there are more of them, or you made your description more sophisticated.

The whole globe is divided into about 10 million slabs, really. The average size of a grid box in the current generation is 100 kilometers – 60 miles – and within that 60 miles there's a lot that goes on that we can't describe explicitly in the computer, because clouds are maybe five kilometers big, and rain happens here and not there within the grid box. We can't describe all that.

One day we'll be able to, but not very soon, and let me explain why. The current grid boxes are 100 kilometers, so you might say, well, why not make them 10? Well, suddenly the number of boxes has gone up by a hundred, so you need a hundred-times-more-powerful computer. But it's worse than that, because the time steps have to be smaller also – things shouldn't move more than a grid box in one time step – and so the processing power actually goes up as the cube of the grid refinement. If you want to go from 100 kilometers to 10 kilometers, that's a factor of 10, and the processing power required goes up by a factor of a thousand. It's going to be a long time before we have a computer that's a thousand times more powerful than what we have.
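[Comment: Koonin's scaling argument in code form. Refining the horizontal grid by a factor r multiplies the box count by r², and the time-step constraint he describes shrinks the step by r, so cost grows roughly as r³ (vertical resolution held fixed). The function name is mine:]

```python
def relative_cost(old_km, new_km):
    r = old_km / new_km          # grid refinement factor
    return r**2 * r              # r^2 more boxes, r more time steps

print(relative_cost(100, 10))    # 1000.0x, as stated in the interview
```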

PR: You and I are speaking in the middle of August. I've been collecting headlines, thinking I'll just read them to Steve and see what he says.

CBS News this past May “Scientists say climate change is making hurricanes worse.”

Koonin in Unsettled:  “Hurricanes and tornadoes show no changes attributable to human influences.” 

[The graph above shows exhibit 2a from Truchelut and Staehling overlaid with the record of atmospheric CO2 concentrations, from NOAA, combining Mauna Loa with earlier datasets.]
To determine Integrated Storm Activity Annually over the Continental U.S. (ISAAC) from 1900 through 2017, we summed this landfall ACE spatially over the entire continental U.S. and temporally over each hour of each hurricane season. We used the same methodology to calculate integrated annual landfall ACE for five additional geographic subsets of the continental U.S.
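[Comment: The quantity being summed here, accumulated cyclone energy (ACE), has a standard definition: 10⁻⁴ times the sum of squared six-hourly maximum sustained winds (in knots) while a storm is at tropical-storm strength or above. A sketch of that ingredient; the ISAAC study then integrates landfall ACE over space and the season, bookkeeping omitted here:]

```python
def ace(six_hourly_winds_kt):
    # ACE: 1e-4 * sum of squared max sustained winds (knots) at >= 34 kt
    return 1e-4 * sum(v**2 for v in six_hourly_winds_kt if v >= 34)

print(ace([30, 40, 65, 80, 70, 45, 30]))   # ~1.92 for this toy storm track
```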

Well what do you think you’re doing taking on CBS?

SK: Well, what science does CBS know? The media get their information from reporters who have little or no scientific training. (PR: You mean you didn't graduate people from Caltech who went to work there?) Probably one or so, and they do a good job. But they have reporters on a climate beat who have to produce stories, the more dramatic the better: if it bleeds, it leads. And so you get that kind of stuff.

When I say something about hurricanes, I quote right from the IPCC reports, and they don't say that at all. Actually, the most recent report said something like that based on a paper which was subsequently corrected.

PR: Floods. Here's a 2020 headline from an article or press release published by the UN Environment Programme – this is the U.N. now, not the IPCC, but it is a U.N. agency:

UNEP: “Climate change is making record-breaking floods.”

Steve Koonin in Unsettled:  “We don’t know whether floods globally are increasing, decreasing or doing nothing at all.”

SK: I would say the U.N. needs to be consistent, and they should check their press releases against the IPCC reports before they say anything.

When I wrote Unsettled I tried very hard to stick with the gold standard, which was the IPCC report at the time, plus the subsequent research literature. When I wrote the book I had available to me only the fifth assessment report, which came out in 2014, as we've discussed.

The sixth assessment report came out about a year ago, and I'm proud to say there's essentially nothing in there that needs to be changed in the paperback edition. I will do an update of course, but the paperback edition is not going to be totally rewritten.

PR: All right, agriculture. Here's a 2019 headline:

New York Times: “Climate change threatens world's food supply, United Nations warns.”

Steve Koonin in Unsettled: “Agricultural yields have surged during the past century even as the globe has warmed. And projected price impacts from future human-induced climate changes through 2050 should hardly be noticeable among ordinary market dynamics.”

SK: It's not what I said but what the IPCC said. Take current media and almost any climate story: I can write a very effective counter – it's like shooting fish in a barrel. I've actually gotten to the point where I say, oh no, not another one, do I have to do that too? So this is endemic to a media that is ill-informed and has an agenda to set.

The agenda is to promote alarm and induce governments to decarbonize.

I think the primary agenda is probably to get clicks and eyeballs. But there are organizations; there's an organization called Covering Climate Now, a non-profit membership organization. It's got The Guardian, various other media, NPR I believe, and their mission is to promote the narrative. They will not allow anything to be broadcast or written that is counter to the narrative. The narrative is: we've already broken the climate.

PR: These are headlines from July of 2023, last month as you and I tape this.

New York Times on July 6th: “Heat records are broken around the globe as Earth warms fast from north to south. Temperatures are surging as greenhouse gases combine with the effects of El Niño.”

New York Times on July 18: “Heat waves grip three continents as climate change warms Earth. Across North America, Europe and Asia, hundreds of millions endured blistering conditions. A U.S. official called it a threat to all humankind.”

Wall Street Journal on July 25th:  “July heat waves nearly impossible without climate change study says.  Record temperatures have been fueled by decades of fossil fuel emissions.” 

New York Times on July 27th: “This looks like Earth's warmest month; hotter ones appear to be in store. July is on track to break all records for any month, scientists say, as the planet enters an extended period of exceptional warmth.”

Unsettled came out in April 2021, so we will forgive you for not knowing in April 2021 what would happen in July of 2023. But now July 2023 is in the record books, and doesn't it prove that climate science is settled?

SK: That statement, together with all those headlines, confuses weather and climate. Weather is what happens every day or maybe even every season; climate, by the official definition, is a multi-decade average of weather properties. That's what the IPCC and another U.N. agency, the World Meteorological Organization (WMO), say.

We have satellites that are continually monitoring the temperature of the atmosphere, and they report out every month what the monthly temperature is, or more precisely what the monthly temperature anomaly is, namely how much warmer or colder it is than what would have been expected for that month. We have data that go back to about 1979, so we have good monthly measures of the global temperature of the lower atmosphere for forty-some years.

You see month-to-month variations of course, but there is a long-term trend that's going up, no question about it. I won't get the number exactly right, but it's going up at about 0.13 degrees per decade. That's some combination of natural variability and greenhouse gases; human influences are more gradual. And then every couple of years you see a sharp spike going up, and that's El Niño. It's weather, and so it goes up and then goes back down.
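A minimal sketch (my illustration with synthetic numbers, not the satellite processing itself) of the two computations Koonin describes: subtracting each calendar month's long-term average to get an anomaly, then fitting a linear trend in degrees per decade.

```python
import numpy as np

# Synthetic monthly lower-atmosphere temperatures, 1979-2022 (deg C):
# a 0.13 deg/decade ramp plus weather noise.
rng = np.random.default_rng(0)
n_years = 44
t = np.arange(12 * n_years)
series = (0.13 / 120) * t + rng.normal(0, 0.15, t.size)

# Anomaly: subtract each calendar month's long-term mean, removing the seasonal cycle.
climatology = series.reshape(n_years, 12).mean(axis=0)
anomaly = series - np.tile(climatology, n_years)

# Least-squares linear trend, converted from per-month to per-decade.
slope_per_month = np.polyfit(t, anomaly, 1)[0]
print(f"trend: {slope_per_month * 120:.2f} deg/decade")  # recovers ~0.13
```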

So there's a long-term trend, which is greenhouse gases and natural variability, and then there's a natural spike every once in a while: an eruption goes off, you see something; an El Niño happens, you see something. And so last month, in July, there was another spike in the anomaly, about as large as we've ever seen, but not unprecedented.

The real question is: why did it spike so much?
Nothing to do with CO2.

SK: CO2, or human influences generally, is kind of the base on which this phenomenon occurs. Even if you stipulate that CO2 is causing some large proportion of this warming, it's a slow, steady process: you would not expect to see spikes, you would not expect to see sudden step functions, absolutely not. There are various hypotheses, but we don't know yet why we've seen the spike in the last month.
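Why a CO2-driven signal should be a smooth ramp rather than a spike can be seen from the standard simplified forcing formula (Myhre et al. 1998), which is logarithmic in concentration while concentration itself creeps up by only a few ppm per year. A quick sketch (my illustration; the ppm values are round numbers, not measurements):

```python
import math

def co2_forcing(ppm: float, baseline_ppm: float = 280.0) -> float:
    """Approximate CO2 radiative forcing (W/m^2) vs. preindustrial (Myhre et al. 1998)."""
    return 5.35 * math.log(ppm / baseline_ppm)

for ppm in (400, 402, 404):  # roughly three consecutive years of CO2 growth
    print(ppm, round(co2_forcing(ppm), 3))
# Successive years differ by only ~0.03 W/m^2: a slow, steady ramp, not a step.
```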

PR: You had better take just a moment to explain: what is El Niño?

SK: El Niño is a phenomenon in the climate system that happens once every four or five years. Heat builds up in the equatorial western Pacific, around Indonesia, and when enough of it builds up it surges across the Pacific, changing the currents and the winds as it moves toward South America. It was discovered in the 19th century and is fairly well understood at this point. Discovery in the 19th century means the phenomenon has nothing to do with CO2.

Now people talk about changes in that phenomenon as a result of CO2, but it's there in the climate system already, and when it happens it influences weather all over the world. We feel it: it gets rainier in Southern California, for example. We have been in the opposite of an El Niño, a La Niña, for the last three years, part of the reason people think the West Coast has been in drought, and it is shifting.

It has now shifted in recent months to an El Niño condition that warms the globe and is thought to contribute to this spike we have seen. But there are other contributions as well. One of the most surprising is that back in January of 2022 an enormous underwater volcano went off in Tonga and put a lot of water vapor into the upper atmosphere, increasing upper-atmosphere water vapor by about 10 percent. That's a warming effect, and it may be contributing to why the spike is so high.

PR: Back to New York, since you spent July there. I happened to visit in July, and we had Canadian wildfires and the press telling us that the wildfires are because of climate change. For the first time anybody I know could remember, the smoke was so heavy, and it got blown into New York, that the sky felt as though a solar eclipse were taking place; for three days it was that dark in New York.

Meanwhile New York is hot, really hot, and we're reading reports that Europe is hot, sweltering even in Madrid, a culture built around midday heat, where they take siestas. Even in Madrid they don't quite know how to handle this heat, and it's perfectly normal for people to say, wait a minute, this is getting scary. It feels for the first time as though the Earth is threatening, unsafe, in New York of all places, where you didn't have to worry about earthquakes. But the other thing you didn't have to worry about was breathing the air, and suddenly you can't breathe the air; it feels uncomfortable, it's scary. Your response to that is what?

SK: We have two responses. First, we have a very short memory for weather. Go back into the newspaper archives and you can read, even from the 19th century on the East Coast, descriptions of so-called yellow days, when the atmosphere was clouded by smoke from Canadian fires. So look at the historical record first, and if it happened before human influences were significant, you have a much higher bar to clear to say that's CO2.

Secondly, there's a lot of variability. Here in California we had two decades of drought, and the governor was screaming New Normal, New Normal. And then what happened last year? Torrential rains, called historic and record-setting because people forgot about the 1860-something event when the Central Valley was under many feet of water.

PR: So climate is not weather, and the weather can really fool you. All right, Steve, some last questions. From Unsettled:

“Humans have been successfully adapting to changes in climate for millennia.
Today’s society can adapt to climate changes whether they are
natural phenomena or the result of human influences.”

So you draw the distinction between adapting to climate change on the one hand and the John Kerry approach on the other, which is trying to stop climate change. Explain that distinction and why you favor one over the other.

SK: Okay. I would take issue, though, with your description of Kerry's approach. It's not trying to stop climate change; it's to reduce human influences on the climate, because the climate will keep changing even if we reduce emissions. (PR: All right, go ahead.)

Let me talk about adaptation a little bit and give you some examples that are probably not well known; at least they weren't really known to me until I looked into it. If you go from 1900 till today, the globe warmed by about 1.3 degrees Celsius. That's the global temperature record that everybody more or less agrees upon. And before we get to the consequences, the other statement is that the IPCC projects about the same amount of warming over the next hundred years. You might ask what's going to happen over the next hundred years as that warming happens.

We can look at the past to get some sense of how we might fare:
not perfect, but a good indication.

Since 1900 until now:

♦  The global population has gone up by a factor of five, we’re now 8 billion people.
♦  The average lifespan or life expectancy went from 32 years to 73 years
♦  The GDP per capita in constant dollars went up by a factor of seven
♦  The literacy rate went up by a factor of four
♦  Nutrition improved, and so on.

The greatest flourishing of human well-being ever as the globe warmed by 1.3 degrees. And the kicker of course is that the death rate from extreme weather events fell by a factor of 50, due to better prediction, better resilience of infrastructure, and so on. So to think that another 1.3 or 1.4 whatever degrees over the next century is going to significantly derail that beggars belief.

Okay, so not an existential threat; perhaps some drag on the economy, a little bit; the IPCC says not very much at all. So the notion that the world is going to end unless we stop greenhouse gas emissions is just nonsense. This is not a mutual suicide pact, not at all.

PR: On August 16th of last year, a year ago, President Biden signed legislation that included some $360 billion of climate spending over the next decade, or at least the Biden Administration claimed it was climate spending. President Biden:

“The American people won and the climate deniers lost, and the Inflation Reduction Act takes the most aggressive action to combat climate change ever.”

Curiously enough, they called it the Inflation Reduction Act, while it seems to have prompted inflation rather than reduced it. Good legislation or not?

SK: It would be if it focused on useful adaptation, but it's aimed at mitigation by and large, namely reducing emissions. I think there are parts of it that are good, in particular the spur to innovate. New technologies are the only way we're going to reduce emissions, if that is the goal. We need to develop energy technologies that are no more expensive than fossil fuel technologies.

PR: But our low-emission or zero-emission goals? Let's take that one. Because here I have the former Provost of Caltech; let's ask what we can reasonably hope for and what we cannot. You and I are talking some ten days after the internet went crazy over a claim of, not cold fusion, but room-temperature superconductivity. Is this a problem we can crack?

SK: So I think it's going to be really difficult. There is one existing solution, and that's nuclear power, fission (fusion is a separate matter). Fission exists; it can be done. It's more expensive than other methods because of the regulatory burden and the long lead time, but also because, at least in the U.S., we build every plant to a custom design. So one of the things I helped catalyze when I was in the Department of Energy was small modular reactors. These are about a tenth the size of the big ones; you can build them in a factory and put them on a flatbed truck, and this is not a crazy dream. Venture money is going in, and there are companies on the verge of a test deployment of commercially constructed power plants.

PR: So why isn't John Kerry going to one of these hot new startups and doing a photo shoot? SK: I don't follow the Ambassador, but you know, the nuclear word is a political hot potato in some quarters. Not to get too much into politics, but I think there is a faction of the left wing that just sees it as anathema and not a solution at all. Meanwhile the Chinese are doing it.

So I like the technology parts of the IRA; I do not like the subsidies for wind and solar. One of the things you didn't mention is that I was chief scientist for BP, the oil company, for five years, so I learned the energy industry. I never had to make any money in it, but I helped to strategize and systematize thinking for them. So I know from the inside about subsidies to solar and wind. Everybody thinks that's a solution, but of course wind and solar are intermittent sources of electricity: solar obviously doesn't produce at night or when it's cloudy, and wind does not produce when the wind doesn't blow. If you're going to build a grid that's entirely wind and solar, you had better have some way of filling in the times when they're not producing.

Now if it's only eight or twelve hours you're trying to fill in, that's not so hard; you can build batteries and so on. But sometimes you need to fill in a couple of weeks, as at times in Europe, Texas, and California when the wind goes still and the solar is clouded out. So you need something else, and that might be batteries, although I think that's unlikely; more likely gas with carbon capture or nuclear. The backup has to be at least as capable as the wind and solar, and since the wind and solar feeds are the cheapest, the backup system is going to be more expensive. So you wind up running two parallel systems, making electricity at least twice as expensive.

So I say that wind and solar can be an ornament on the real electrical system
but they can never be the backbone of the system.

Let me explain: the biggest problem in trying to reduce emissions is not the one and a half billion people in the developed world; it's the six and a half billion people who don't have enough energy. And you're telling them that because of some vague, distant threat that we in the developed world are worried about, they're going to have to pay more for energy or get less reliable sources. They should be able to make their own choices about whether they're willing to tolerate whatever threat there might be from the climate versus having round-the-clock lighting, adequate refrigeration, transportation, and so on. Millions of people in India, six and a half billion people worldwide, are energy starved.

Three billion of the planet's 8 billion people use less electricity every year than the average U.S. refrigerator. So first fix that problem, which is existential and immediate and solvable, and then we can talk about some vague climate thing that might happen 50 years from now.

But scientists must tell the truth, absolutely completely lay it all out,
and we’re not getting that out of the scientific establishment.

PR: Unsettled has been out for more than two years now. How have your colleagues responded?

SK: Many colleagues who are not climate scientists say, thanks for writing the book; it gives me a framework to think about these things and points me to some of the problems that we're seeing in the popular discussion. I got some rather awful reviews from mainstream climate scientists, which disappointed me. Not because they found anything wrong in the book; they didn't. But the quality of the discussion, the ad hominem attacks, the putting words in my mouth and so on, that wasn't so good. Their argument was: Steve Koonin, you're one of us; you shouldn't be saying this. It may be true, but you shouldn't be saying it. Steve, how could you?

First of all, I've been involved in science advice in other aspects of public policy, particularly national defense, together with some former Stanford colleagues now passed on. I was taught that you tell the whole truth and you let the politicians make the value judgments and the cost-effectiveness trade-offs. My sense of that balance is no better than anybody else's, but I can bring the scientific facts to the table. If you trust democracy, you trust people to elect politicians who, over time, will make a mistake here and a mistake there.

But over time you trust them. Now there are colleagues who say: no, don't tell them the truth; we can't trust them to make the right decision. That's fundamentally what's going on. I know scientists who know better than everybody else, and it's even worse because these are scientists in the developed world. If you ask scientists in Nigeria or India and so on, you get a very different values calculus: the primary concern is getting enough energy for folks.

PR: According to a Harris poll in January 2022, a little over a year and a half ago now, 84% of teenagers in the United States agreed with both of the following statements:

♦  Climate change will impact everyone in my generation through global political instability.
♦  If we don't address climate change today, it will be too late for future generations, making some parts of the planet unlivable.

John Kerry, Al Gore, Greta Thunberg, and on and on, countless voices warning that climate change represents a genuine danger to life on the planet. And now millions of young Americans are really scared. Surely this has some role to play in what we see: the suicidal ideation and the increasing unhappiness.

SK: I'm sure there are all kinds of social factors, but surely this is part of what's going on. There are two immoralities here. One is the immoral treatment of the developing world, which we talked about. The other immorality is scaring the bejesus out of the younger generation. And it's doubly dangerous because it's mostly in the West and not in China or India. I've tried: I go out and talk at universities, and of course the audiences I talk to tend to be quantitative and factually driven. So the minds get opened up if the eyes get opened up.

I think in the U.S. the problem will eventually solve itself, because the route we are headed down is starting to impact people's daily lives. Electricity is getting more expensive, and you won't be able to buy an internal combustion car in 10 or 15 years if you're here in California. People are going to say, wait a second, as they already are in Europe: in the UK, Germany, France. And I think there will be a coming down to Earth of all of this at some point, and we will get more sensible.

PR: Let's say your audience now is not a colleague of yours but an 18-to-24-year-old American, pretty bright, maybe in college, maybe not, but bright. Reads newspapers, or at least reads them online. Speaking to that person, to an American kid or young adult: do they need to be scared?

SK: No, absolutely not. I would cite the 1900-to-now flourishing as an example. And I would say, you probably believe that hurricanes are getting worse; then point them to the IPCC line and say, you know, you were misinformed about that by the media; don't you think there are other things about which you've been misinformed? You can read the book and find out many of them, and then go ask your climate friends how come it says one thing in the IPCC report but you're telling me something else.

Why Climate Models Can’t Be Right

Vic Hughes explains in his American Thinker article The Blunt Truth about Global Warming Models.  Excerpts in italics with my bolds and added images.

I may be one of the first scientists in the country to know that
predicting long-term temperatures is not possible.

Almost 50 years ago, while in grad school, I had a contract from an Army research lab to use a state-of-the-art model to predict long-term temperatures. I quickly realized that the goal of the project, to forecast the temperature accurately long-term, was impossible, because small errors in data inputs could result in huge forecast errors. Equally important, errors compounded so quickly that the error ranges exploded. The results were junk.

As an example, what good is a temperature forecast with an error range
of plus or minus one hundred degrees?

I give university speeches to scientists and tell them: if you ever see some data or forecasts, your first question has to be, what's the error range? If you don't know the error range, the data are almost useless. It's not coincidental that the Climate Mafia don't highlight this problem.

So what about modern technology solving these problems? These error problems are still true today. It’s not that the long-term temperature forecasts are wrong; it’s that they can’t be right. All global warming modelers know this, or they are incredibly stupid, or they just lie about it for money or power.

When the U.N. Intergovernmental Panel on Climate Change made even a pretense of being science-based, they used to admit it. From the 2001 IPCC Third Assessment Report:

“The climate system is a coupled non-linear chaotic system, and therefore
the long-term prediction of future climate states is not possible.”

The weather is a coupled, non-linear chaotic system. Chaos theory says very small changes in inputs can result in totally different outcomes. This concept is counterintuitive for most people. We intrinsically think that if you’re a little off at the beginning, you should be a little off at the end. Try that on a mountain trail next to a cliff.
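A minimal illustration of that sensitive dependence (my sketch, using the logistic map, a standard toy chaotic system, not a climate model): two starting values that differ by one part in a million end up in completely different places.

```python
def logistic(x: float, r: float = 4.0, steps: int = 50) -> float:
    """Iterate the chaotic logistic map x -> r*x*(1-x)."""
    for _ in range(steps):
        x = r * x * (1 - x)
    return x

a, b = 0.300000, 0.300001   # inputs differing by one part in a million
print(logistic(a), logistic(b))
# After 50 iterations the two trajectories bear no resemblance to each other:
# a tiny input error has become a totally different outcome.
```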

The Climate Mafia know that this is true, but they still want money and power. They argue that even though you can't make a real temperature forecast, you can create a completely bogus forecasting approach: take a bunch of different climate models that don't agree (so much for settled science) and combine their outputs. They then say voilà, we have a correct prediction, and they use pseudo-statistics to get around the error problem. The way I visualize it: take a bunch (an “ensemble” sounds more scientific) of wrong answers, combine them, and call the result the right answer. Absurd.

Since your input data are critical to forecasting a chaotic future, fully understanding past temperatures is also critical. The Climate Mafia created the entirely bogus concept of an “average Earth temperature” to build a bogus base dataset for their bogus models. The Warming Scammers like to use a garbage temperature history that starts about 1850. The Scammers say that temperatures rising since 1850, just coincidentally the end of a three-hundred-year cooling cycle, represent the rise of the industrial pollution age. But in 1850, and even in 1950, only a small percentage of the world's population could even be considered close to industrialized. Look at India, Africa, and China then: almost medieval energy-use patterns until really recently. Humans have been around in their current form for many tens of thousands of years. To say the weather since the 1850s is representative of anything from a statistical perspective is a joke.

So what kind of temperature data do we have since 1850?

With oceans and ice caps covering 80+% of the world's surface, we have virtually no reliable long-term data on any of that, other than the last few decades. Even then, you are talking about a relatively small number of measuring devices in all those places. (Do you check the weather a few hundred miles away to know if you need an umbrella?) How about the temperature trends in deserts, on mountains, in the middle of Africa, South America, Siberia, at sea level, at a hundred feet of elevation, at a thousand feet? The data are so bad in all of the Southern Hemisphere, half the globe, that there are only a few datasets even close to reliable since the 1850s. There are almost no real, reliable, and complete long-term data globally, and particularly none reliable enough to create a model of a chaotic system entirely dependent on very accurate input data.

 

The concept of “average Earth temperature” is critical to their bogus forecasting, but because we think “average” generally means something useful, it gets a mental pass. As an example, describe the “average” human.

Let me offer a thought experiment. What is the average temperature
of your house within one degree?

How many sensors, with what degree of accuracy, reading the temperature how often, would you need to know the average temperature within one degree? One sensor won't do it. Would ten sensors (a hundred? a thousand?) be needed to cover the ceilings, floors, six feet up, each corner of every room, near every door and window and heat source, measuring the temperature every hour, minute, or second for a period of years, to get an average temperature within one degree? How do you weight a gauge in a big room versus a small one, or in the ceiling versus the floor?
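A toy version of the author's thought experiment (all numbers invented): the same three sensor readings yield different "house averages" depending on how the rooms are weighted, so the weighting scheme matters as much as the sensors themselves.

```python
# Three hypothetical room sensors (deg C) and the floor areas they represent (m^2).
readings = {"living_room": 21.5, "attic": 27.0, "basement": 15.0}
floor_area = {"living_room": 40.0, "attic": 25.0, "basement": 35.0}

unweighted = sum(readings.values()) / len(readings)
weighted = (sum(readings[r] * floor_area[r] for r in readings)
            / sum(floor_area.values()))

print(f"unweighted: {unweighted:.2f} C, area-weighted: {weighted:.2f} C")
# ~21.17 C vs ~20.60 C: the weighting choice alone shifts the "average"
# by about half a degree, comparable to the global signal being debated.
```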

If we can’t even figure out the long-term average temperature of one building, it is complete hubris to think we can create accurate enough global temperature inputs to predict 100 years out.

I won’t even get into outright data fraud, like lowering the hot 1930s or selectively changing input locations to make the current temperatures look hotter. The data are bad enough on their own. Using faulty data to predict the future creates faulty forecasts. Garbage in, garbage out.

Perhaps the greatest part of the Global Warming Scam is that it requires a complete disregard for common sense. We all know that weather forecasters can't predict next week's weather within 1 degree, but the Scammers push the lie that they can predict the temperature a hundred years from now within a degree or two. That strains credulity.

More importantly, say you had perfect data and a perfect model. How could we possibly know what impact the predicted warming would have? Another thought experiment: how much do temperatures vary where you live in a day? Ten degrees? Twenty? Thirty? How much in a year? For most of the U.S., that number might be 50 degrees, or 100 degrees. So plants and animals have adapted to 20-degree temperature changes in a day and 100-degree temperature changes in a year.

Daily average temperature variability of Bolu City, Turkey, and its 365-day moving average.

Somehow a 1- or 2-degree temperature change in a hundred years
is going to take them out? That’s ridiculous.

Finally, when somebody offers me a forecast, my first question is, how right have your other forecasts been? We are now in the third or fourth ten-year period of the last forty years when the world is going to end in ten years. That tells you all you need to know about global warming forecasting.

The fact that any counter-narratives have to be censored is also damning. A final fact to consider is one my mother taught me at a young age: when the other side starts calling you names (deniers, anti-vaxxers), you know they have lost the argument.

And yet we should totally restructure society based on those impossible models. This was probably the greatest scientific lie in history, and people believed it. After that, selling the lie that mRNA vaccines, which in thirty years had never worked and generally killed their test subjects, are now safe and effective for humans, including babies, after two months of limited testing, is child's play. Joseph Goebbels, a proponent of the “Big Lie,” would be proud of both lies.

Before we restructure the world based on models, we must realize they can’t be right.

The blue background obscures that these are estimates in TRILLIONS of dollars.

How Climate Models Get Clouds Wrong

Why Did IMF Disinvite Nobel Laureate?

CO2 Coalition explains: Nobel Laureate (Physics 2022) Dr. John Clauser was to present a seminar on climate models to the IMF on Thursday, and now his talk has been summarily cancelled. According to an email he received last evening, the Director of the Independent Evaluation Office of the International Monetary Fund, Pablo Moreno, had read the flyer for John's July 25 Zoom talk and immediately canceled it. Technically, it was “postponed.”

Dr. Clauser had previously criticized the awarding of the 2021 Nobel Prize for work in the development of computer models predicting global warming and told President Biden that he disagreed with his climate policies. Dr. Clauser has developed a climate model that adds a significant new dominant process to existing models. The process involves the visible light reflected by cumulus clouds that cover, on average, half of the Earth. Existing models greatly underestimate this cloud feedback, which provides a very powerful, dominant thermostatic control of the Earth's temperature.

More recently, he addressed the Korea Quantum Conference where he stated, “I don’t believe there is a climate crisis” and expressed his belief that “key processes are exaggerated and misunderstood by approximately 200 times.” Dr. Clauser, who is recognized as a climate change skeptic, also became a member of the board of directors of the CO2 Coalition last month, an organization that argues that carbon dioxide emissions are beneficial to life on Earth.

What Difference Clouds Make in Climate Models

Obviously the Clauser presentation is not accessible, and I don't find a link to a publication concerning his treatment of clouds in climate models. But we can see how the models react to clouds by means of an important paper, The Mechanisms of Cloudiness Evolution Responsible for Equilibrium Climate Sensitivity in Climate Model INM-CM4-8, by Evgeny Volodin, AGU, 03/12/2021. Excerpts in italics with my bolds.

Abstract

Current climate models demonstrate large discrepancy in equilibrium climate sensitivity (ECS). The effects of cloudiness parameterization changes on the ECS of the INM-CM4-8 climate model were investigated. This model shows the lowest ECS among CMIP6 models. Reasonable changes in the parameterization of the degree of cloudiness yielded ECS variability of 1.8–4.1 K in INM-CM4-8, which was more than half of the interval for the CMIP6 models.

The three principal mechanisms responsible for the increased ECS were increased cloudiness dissipation in warmer climates due to the increased water vapor deficit in the non-cloud fraction of a cell, decreased cloudiness generation in the atmospheric boundary layer in warm climates, and the instantaneous cloud response to CO2 increases due to stratification changes.

Introduction

In CMIP6 the lowest and highest ECS (Equilibrium Climate Sensitivity) values are 1.8 and 5.6 K, respectively (Zelinka et al., 2020). Climate response to some external forcing produces feedbacks. Positive feedback enhances the response to forcing, negative feedback weakens it. Analysis of climate feedback shows that cloud feedback is the principal reason for the broad range of ECS (Zelinka et al., 2020). Clouds (especially low clouds) are significantly reduced with global warming in models with high ECS, resulting in positive feedback. Models with low sensitivity show small cloudiness changes with global warming; some models feature an increase in low clouds in warmer climates, creating a negative feedback.

Clouds produce shortwave and longwave radiative effects. The shortwave cloud radiative effect (SW CRE) is generally negative, because cloudiness reflects solar radiation that would otherwise be absorbed by the climate system. The shortwave effect is usually strongest for low clouds that have high amounts of liquid water and high albedos. The longwave cloud radiative effect (LW CRE) is generally positive, because cloud tops are usually much colder than the surface of the Earth; thus, thermal radiation from the cloud top is much lower than that from the surface. Negative/positive CRE produces cooling/warming from clouds.
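A minimal sketch of those sign conventions (my illustration; the flux values are invented but chosen near the CERES-EBAF estimates quoted later in this post):

```python
# Cloud radiative effect (CRE) at the top of the atmosphere: clear-sky outgoing
# flux minus all-sky outgoing flux. Clouds reflect extra sunlight (negative SW
# CRE) and trap thermal radiation with their cold tops (positive LW CRE).

sw_out_clear, sw_out_all = 53.0, 100.0   # reflected shortwave, W/m^2 (illustrative)
lw_out_clear, lw_out_all = 265.0, 237.0  # outgoing longwave, W/m^2 (illustrative)

sw_cre = sw_out_clear - sw_out_all   # -47 W/m^2: cooling by reflection
lw_cre = lw_out_clear - lw_out_all   # +28 W/m^2: warming by trapped heat
net_cre = sw_cre + lw_cre            # -19 W/m^2: clouds cool on balance

print(sw_cre, lw_cre, net_cre)
```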

The goal of this study is to switch off individual mechanisms responsible for large-scale cloud evolution that increase or decrease ECS; doing so changes ECS by a factor of more than 2, and the role of each chosen mechanism in the increase or decrease of ECS can be clearly seen. At the same time, all model versions show a preindustrial climate with systematic biases comparable to those of the version used in CMIP6. The impact of parameterization changes on cloud feedback is estimated realistically by keeping the cloud mean state realistic in all model versions and by running 4xCO2 experiments rather than uniform +4 K experiments.

Table 1. Summary of Model Versions

Note. Equilibrium climate sensitivity ECS (K), effective radiative forcing ERF (W m−2), climate feedback parameter λ (W m−2 K−1), shortwave cloud radiative feedback CRF_SW, longwave cloud radiative feedback CRF_LW, net cloud radiative feedback CRF_NET (W m−2 K−1), and instantaneous cloud radiative forcing change ΔCRE_INST (W m−2).

The ECS estimation method is the one commonly used in CMIP5 and CMIP6, proposed by Gregory et al. (2004). Two model runs were performed: a control run, in which all forcings were fixed at preindustrial levels, and a run in which the concentration of CO2 in the atmosphere was four times higher than in the control run (the 4CO2 run). The initial state for both runs was the same, taken from a sufficiently long control run. Each run had a length of 150 years. Subsequently, the global mean differences of surface temperature (GMST) and of the heat balance at the top of the atmosphere (THB) between the 4CO2 run and the control run were calculated for each model year.
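A minimal sketch of the Gregory regression just described (synthetic numbers loosely matching version 1's quoted ERF and λ, not INMCM output): regress the yearly top-of-atmosphere imbalance against the yearly warming, then extrapolate to zero imbalance.

```python
import numpy as np

rng = np.random.default_rng(1)
erf_4x, lam = 5.4, -1.46   # 4xCO2 forcing (W/m^2) and feedback (W/m^2/K), assumed

# 150 "model years" of warming dT and TOA imbalance N relative to the control run.
dT = np.linspace(0.5, 3.5, 150) + rng.normal(0, 0.1, 150)
N = erf_4x + lam * dT + rng.normal(0, 0.3, 150)

slope, intercept = np.polyfit(dT, N, 1)  # slope ~ lambda, intercept ~ ERF
dT_eq = -intercept / slope               # warming at which the imbalance vanishes
print(f"ECS ~ {dT_eq / 2:.1f} K")        # halve the 4xCO2 equilibrium for a doubling
```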

Results of the sensitivity experiments performed with the five climate model versions.

Version 1 shows a very low ECS of 1.8 K due to a low negative climate feedback parameter value of −1.46 W m−2 K−1 (interval from −0.6 to −1.8 W m−2 K−1 for CMIP5 and CMIP6) and a low ERF value of 2.7 W m−2 (intervals of 2.6–4.4 and 2.7–4.3 W m−2 for CMIP5 and CMIP6, respectively, Zelinka et al., 2020). The low ECS was accompanied by mostly negative CRF in both the SW and LW spectral intervals (Figure 2 below).
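These numbers hang together through the usual linear energy-balance relation: at equilibrium the forcing is balanced by the feedback, so the equilibrium warming is ERF divided by the magnitude of λ (assuming the quoted ERF is the doubling-equivalent value). A quick check using only the figures quoted above:

```python
erf = 2.7    # W/m^2, version 1 (as quoted above)
lam = -1.46  # W/m^2/K, version 1 (as quoted above)
print(round(-erf / lam, 1))  # 1.8 K, matching version 1's quoted ECS
```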

Figure 2  Shortwave (top), longwave (middle) and net (bottom) cloud radiation feedback (Wm−2 K−1) for model version 1 (purple), 2 (yellow), 3 (red), 4(green), and 5 (blue). Data are multiplied by cosine of latitude.

Replacing the cloudiness parameterization scheme in version 2 dramatically changed all the parameters, and the ECS more than doubled to 3.8 K. ERF increased to 3.8 W m−2 without changes to the radiation code, because ΔCRE_INST changed from −0.88 W m−2 to −0.13 W m−2. Additionally, the climate feedback parameter increased from −1.46 W m−2 K−1 to −1.0 W m−2 K−1. In version 2, global warming was associated with decreased cloudiness at all levels, and the net cloud radiative feedback became positive. Version 2 yielded significantly increased net and SW cloud radiative feedbacks at all latitudes compared with version 1. Analysis of the sensitivity experiments of versions 3–5 helps explain the mechanisms of these significant changes.

In version 3 the mechanism that reduces high tropical cloudiness through decreased convective mass flux is suppressed, and ECS is higher than in version 2 (4.1 K); however, the change is not very pronounced. The LW CRF in the tropics increases in version 3 compared to version 2. The decrease in SW CRE is not very pronounced; therefore, the increased net CRF increases ECS. This confirms our hypothesis that suppressing the decrease in tropical cloudiness should increase ECS and that the impact of this mechanism on ECS is noticeable but not very strong.

ECS is noticeably lower in version 4 (2.9 K) than in version 2. Thus, the mechanism of the decrease in boundary layer cloudiness due to decreased cloudiness generation by boundary layer turbulence is crucial for ECS. SW and LW CRF decreased in version 4 compared to version 2, primarily in the tropics and subtropics.

The mechanism of increased cloud dissipation under global warming conditions was suppressed in version 5; ECS was reduced to 2.5 K, and the climate feedback parameter decreased to −1.56 W m−2 K−1. Additionally, the SW CRF decreased in version 5 compared with version 4, primarily in the tropics and subtropics. In this version, all the mechanisms that decrease clouds with increased temperature, as raised in the previous section, are suppressed. The principal reason for the ECS difference between versions 1 and 5 is the instantaneous adjustment rather than the feedback (see Table 1): ΔCRE_INST values in versions 1 and 5 were −0.88 and 0.16 W m−2, respectively.

Conclusion

All model versions demonstrate similar model bias values for annual mean CRE, near-surface temperature, and precipitation; thus, determining a relation between present-day climate simulation model quality and ECS is difficult. Version 1 has slightly better quality (because it is a CMIP6 version) due to extensive tuning. The cloudiness scheme used in version 1 contained the dependence of low clouds on stratification. An increase in CO2 leads to more stable stratification and more low clouds and may be the primary cause of the low ECS. Bretherton (2015) and Geoffroy et al. (2017) obtained similar results. The decrease in clouds in warmer climates due to the mixing of cloud air with the unsaturated environment was also stated in a review by Gettelman and Sherwood (2016). Our results confirm those by Bony et al. (2006), Brient and Bony (2012), and others that a significant change in the response of low clouds to global warming leads to significant changes in cloud radiative feedback and ECS.

Comment

The main finding: if warming increases low clouds, then absorbed shortwave (incoming solar) radiation is reduced, counteracting the warming, in effect a negative feedback. That is consistent with Clauser's position.

New 2023 INMCM RAS Climate Model First Results

Previous posts (linked at end) discuss how the climate model from RAS (Russian Academy of Science) has evolved through several versions. The interest arose because of its greater ability to replicate the past temperature history. The model is part of the CMIP program, which will soon take the next step to CMIP7, and is one of the first to test with a new climate simulation.

This synopsis is made possible thanks to the lead author, Evgeny M. Volodin, providing me with a copy of the article published May 11, 2023 in Izvestiya, Atmospheric and Oceanic Physics. Those with institutional research credentials can access the paper at Simulation of Present-Day Climate with the INMCM60 Model by E. M. Volodin, et al. (2023). Excerpts are in italics with my bolds and added images and comment.

Abstract

A simulation of the present-day climate with a new version of the climate model developed at the Institute of Numerical Mathematics of the Russian Academy of Sciences (INM RAS) is considered. This model differs from the previous version by a change in the cloud and condensation scheme, which leads to a higher sensitivity to the increase in СО2. The changes are also included in the calculation of aerosol evolution, the aerosol indirect effect, land snow, atmospheric boundary-layer parameterization, and some other schemes.

The model is capable of reproducing near-surface air temperature, precipitation, sea-level pressure, cloud radiative forcing, and other parameters better than the previous version. The largest improvement can be seen in the simulation of temperature in the tropical troposphere and at the polar tropopause, and of the surface temperature of the Southern Ocean. The simulation of climate changes in 1850–2021 by the two model versions is discussed.

Introduction

A new version has been developed on the basis of the climate system model described in [1]. It was shown [2] that introducing changes only to the cloud parameterization would produce climate models with different equilibrium sensitivities to a doubling of СО2, in a range of 1.8 to 4.1 K. The INMCM48 version has the lowest sensitivity of 1.8 K among Coupled Model Intercomparison Project Phase 6 (CMIP6) models. The natural question then arises as to how parameterization changes that increase the equilibrium sensitivity affect the simulation of modern climate and of its changes observed in recent decades.

The INMCM48 version simulates modern climate quite well, but it has some systematic biases common to many of the current climate models, and biases specific only to this model. For example, most climate models overestimate surface temperatures and near surface air temperatures at southern midlatitudes and off the east coast of the tropical Pacific and Atlantic oceans and underestimate surface air temperatures in the Arctic (see, e.g., [3]). A typical error of many current models, as well as of INMCM48, is the cold polar tropopause and the warm tropical tropopause, resulting in an overestimation of the westerlies in the midlatitude stratosphere. Possible sources of such systematic biases are the errors in the simulation of cloud amount and optical properties.

In the next version, therefore, changes were first made in cloud parameterization. Furthermore, the INMCM48 exhibited systematic biases specific solely to it. These are, for example, the overestimation of sea-level pressure, as well as of geopotential at any level in the troposphere, over the North Pacific. The likely reason for such biases seems to be related to errors in the heat sources located southward, over the tropical Pacific.

In this study, it is shown how changes in physical parameterizations,
including clouds, affect systematic biases in the simulation of
modern climate and its changes observed in recent decades.

Model and Numerical Experiments

The INMCM60 model, like the previous INMCM48 [1], consists of three major components: atmospheric dynamics, aerosol evolution, and ocean dynamics. The atmospheric component incorporates a land model including surface, vegetation, and soil. The oceanic component also encompasses a sea-ice evolution model. Both versions in the atmosphere have a spatial 2° × 1° longitude-by-latitude resolution and 21 vertical levels up to 10 hPa. In the ocean, the resolution is 1° × 0.5° and 40 levels.

The following changes have been introduced into the model
compared to INMCM48.

Parameterization of clouds and large-scale condensation is identical to that described in [4], except that the tuning parameters differ from any of the versions outlined in [3], being, however, closest to version 4. The main difference from it is that the cloud water flux generating boundary-layer clouds is estimated not only from the development of boundary-layer turbulence, but also from the condition of moist instability, which, under deep convection, results in fewer clouds in the boundary layer and more in the upper troposphere. The equilibrium sensitivity of this version to a doubling of atmospheric СО2 is about 3.3 K.

The aerosol scheme has also been updated, including a change in the calculation of natural emissions of sulfate aerosol [5] and of wet scavenging, as well as the influence of aerosol concentration on cloud droplet radius, i.e., the first indirect effect [6]. Numerical values of the constants, however, were taken to be slightly different from those used in [5]. Additionally, an improved scheme of snow evolution, taking refreezing into account, and a revised calculation of snow albedo [7] were introduced into the model. The calculation of universal functions in the atmospheric boundary layer under stable stratification has also been changed: in the latest model version, such functions allow turbulence even at large gradient Richardson numbers [8].

A numerical model experiment to simulate a preindustrial climate was run for
180 years,
not including the 200 years during which equilibrium was reached.

All climate forcings in this experiment were held at their 1850 level. Along with a preindustrial experiment, a numerical experiment was run to simulate climate change in 1850–2029, for which forcings for 1850–2014 were prescribed consistent with observational estimates [9], while forcings for 2015–2029 were set according to the Shared Socioeconomic Pathway (SSP3-7.0) scenario [10].

To verify the simulation of present-day climate, the data from the experiment with a realistic forcing change for 1985–2014 were used and compared against the European Centre for Medium-Range Weather Forecasts (ECMWF) Reanalysis fifth generation (ERA5) data [11], the Global Precipitation Climatology Project, version 2.3 (GPCP 2.3) precipitation data [12], and the Clouds and the Earth’s Radiant Energy System (CERES) Energy Balanced and Fitted Edition 4.1 (CERES-EBAF 4.1) top-of-atmosphere (TOA) outgoing radiation fluxes [13].

The root-mean-square deviation of the annual and monthly averages of modeled and observed fields was used as the measure of the deviation of model data from observations; for this purpose the observed fields were interpolated onto the model grid. For calculating the sea-level pressure and 850-hPa temperature errors, grid points with a height over 1500 m were excluded. The modeled surface air temperatures were compared with the Met Office Hadley Centre/Climatic Research Unit version 5 (HadCRUT5) dataset [14].
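A minimal sketch of such an error norm (my reconstruction for illustration, not the INM code): an RMS difference between model and observed fields on a shared latitude-longitude grid, weighted by cos(latitude) so equal areas count equally, with high-terrain points masked out as described above.

```python
import numpy as np

def rms_error(model, obs, lat_deg, surface_height_m=None, max_height=1500.0):
    """Area-weighted RMS model-minus-observation error on a lat-lon grid."""
    weights = np.cos(np.radians(lat_deg))[:, None] * np.ones_like(model)
    if surface_height_m is not None:
        weights = np.where(surface_height_m > max_height, 0.0, weights)  # mask terrain
    return np.sqrt((weights * (model - obs) ** 2).sum() / weights.sum())

# Toy 3x4 (lat x lon) example with a latitude-dependent bias:
lat = np.array([-45.0, 0.0, 45.0])
model = np.full((3, 4), 288.0)                   # kelvin, uniform model field
obs = model + np.array([[1.0], [0.5], [-1.0]])   # observations differ by latitude
print(rms_error(model, obs, lat))                # ~0.83 K
```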

Results

Below are some results of the present-day climate simulation. Because changes in the model were introduced mainly into the scheme of atmospheric dynamics and land surface and there were no essential changes in the oceanic component, we shall restrict our discussion to atmospheric dynamics.

Table 1 demonstrates that the norms of errors in most fields were reduced. Changing the calculation of cloud cover and properties has improved the cloud radiative forcing, and the norm of errors for both longwave and shortwave forcing decreased by 10–20% in the new version compared to its predecessor. The global average of short-wave cloud radiative forcing is –47.7 W/m2 in the new version, –40.5 W/m2 in the previous version, and about –47 W/m2 in CERES-EBAF. The average TOA longwave radiative forcing is 29.5 W/m2 in the new version, 23.2 W/m2 in the previous version, and 28 W/m2 in CERES-EBAF. Thus, the average longwave and shortwave cloud radiative forcing in the new model version has proven to be much closer to observations than in the previous version.

From Table 1, the norm of the systematic bias has decreased significantly for 850-hPa temperatures and 500-hPa geopotential height. It was reduced mainly because average values of these fields approached observations, whereas the averages of both fields in INMCM48 were underestimated.

Figure 3 Volodin et al (2023)

We now consider the simulation of climate changes for 1850–2021. The 5-year mean surface temperature from HadCRUT5 (black), INMCM48 (blue), and INMCM60 (red) is shown in Fig. 3. The average for the period 1850 to 1899 is subtracted for each of the three datasets. The model data are slightly extended to the future, so that the most recent value matches the 2025–2029 average. It can be seen that warming in both versions by 2010 is about 1 K, approximately consistent with observations. The observed climate changes, such as the warmer 1940s and 1950s and the slower warming, or even a small cooling, in the 1960s and 1970s, are also obtained in both model versions. However, the warming after 2010–2014 turns out to be far larger in the new version than in the previous one, with differences reaching 0.5 K in 2025–2029. The discrepancies between the two versions are most distinct in the rate of temperature rise from 1990–1994 to 2025–2029. In INMCM48, the temperature rises by about 0.8 K, while the increase for INMCM60 is about 1.5 K. The discrepancy appears to have been caused primarily by a different sensitivity of the models, but a substantial contribution may also come from natural variability, so a more reliable conclusion could be made only by running ensemble numerical experiments.

Figure 4 Volodin et al. (2023)

Figure 4 displays the difference in surface air temperature between 2000–2021 and 1979–1999, from HadCRUT5 (top), the new version (middle), and the previous version (bottom). This is the interval where the warming was the largest, as seen in Fig. 3. The observational data show that the largest warming, above 2 K, was in the Arctic; there was a warming of about 1 K in the Northern Hemisphere midlatitudes, and there was hardly any warming over the Southern Ocean. The pattern associated with a transition of the Pacific Decadal Oscillation (PDO) from the positive phase to the negative one appears over the Pacific Ocean. The new model version simulates the temperature rise at high and middle northern latitudes more closely to observations, whereas the previous version underestimates the rise in temperature in that region.

At the same time, over the tropical oceans where the observed warming is small, the data from the previous model version agree better with observations, while the new version overestimates warming. Both models failed to reproduce the Pacific Ocean temperature changes resulting from a positive to negative phase transition of PDO, as well as near zero temperature changes at southern midlatitudes.  Large differences of temperature change in the Atlantic sector of the Arctic, where there is some temperature decrease in the INMCM48 model and substantial increase in the INMCM60, are most probably caused by natural climate fluctuations in this region, so a reliable conclusion regarding the response of these model versions to observed forcing could also be drawn here only by running ensemble numerical experiments.

Conclusions

The INMCM60 climate model is able to simulate present-day climate better than the previous version. The largest decrease can be seen in systematic biases connected with an overestimation of surface temperatures at southern midlatitudes, an underestimation of surface air temperatures in the Arctic, and an underestimation of temperatures at the polar tropopause and in the tropical troposphere. The simulation of the cloud radiative forcing has also improved.

Despite different equilibrium sensitivities to doubled СО2, both model versions show approximately the same global warming by 2010–2015, similar to observations. However, projections of global temperature for 2025–2029 already differ between the two model versions by about 0.5 K. A more reliable conclusion regarding the difference in the simulation of current climate changes by the two versions could be made by running ensemble simulations, but this is likely to be done later because of the large amount of computational time and computer resources it will take.

My Comments
1.  Note that INMCM60 runs hotter than version 48 and HadCRUT5. However, as the author points out, this is only a single simulation run, and a truer result will come later from an ensemble of multiple runs. There were several other references to tentative findings awaiting ensemble runs yet to be done. For example, see the comparable ensemble performance of the previous version (then referred to as INMCM5):

Figure 1. The 5-year mean GMST (K) anomaly with respect to 1850–1899 for HadCRUTv4 (thick solid black); model mean (thick solid red). Dashed thin lines represent data from individual model runs: 1 – purple, 2 – dark blue, 3 – blue, 4 – green, 5 – yellow, 6 – orange, 7 – magenta. In this and the next figures numbers on the time axis indicate the first year of the 5-year mean.

2.  Secondly, this study confirms the warming impacts of cloud parameters that appear in all the CMIP6 models. They all run hotter primarily because of changes in cloud settings. The author explains how INMCM60 performance improved in various respects, but it came with increased CO2 sensitivity. That value rose from 1.8C per doubling to 3.3C, shifting the model from lowest to middle of the range of CMIP6 models. (See Climate Models: Good, Bad and Ugly)

Figure 8: Warming in the tropical troposphere according to the CMIP6 models. Trends 1979–2014 (except the rightmost model, which is to 2007), for 20°N–20°S, 300–200 hPa.

3.  Thirdly, the temperature record standard has changed, with a warming bias. See below Clive Best's comparison between HadCRUT4.6 and HadCRUT5.

The HadCRUT5 data show about a 0.1C increase in annual global temperatures compared to HadCRUT4.6. There are two reasons for this.

♦  The change in sea surface temperatures moving from HadSST3 to HadSST4
♦  The interpolation of nearby station data into previously empty grid cells

Here I look into how large each effect is. Shown above is a comparison of HadCRUT4.6 with HadCRUT5.

Coincidentally or not, with the temperature standard shifting to HadCRUT5, model parameters shifted to show more warming to match. Skeptics of climate models are not encouraged by seeing warming added into the temperature record, followed by models tuned to increase CO2 warming.

4. The additional warming in both the model and in HadCRUT5 is mostly located in the Arctic. However, those observations include a warming bias derived from using datasets of anomalies rather than actual temperature readings.

Clive Best provides this animation of recent monthly temperature anomalies, which demonstrates how most variability in anomalies occurs over northern continents.

See Temperature Misunderstandings

The main problem with all the existing observational datasets is that they don't actually measure the global temperature at all. Instead they measure the global average temperature ‘anomaly’. . . . The use of anomalies introduces a new bias because they are now dominated by the larger ‘anomalies’ occurring at cold places in high latitudes. The reason for this is obvious, because all extreme seasonal variations in temperature occur in northern continents, with the exception of Antarctica. Increases in anomalies are mainly due to an increase in minimum winter temperatures, especially near the Arctic Circle.

A study of temperature trends recorded at weather stations around the Arctic showed the same pattern as the rest of NH.  See Arctic Warming Unalarming

5. The CMIP program specifies that participating models include CO2 forcing and exclude solar forcing. Aerosols are the main parameter for tuning models to match observations. Scafetta has shown recently that models perform better when a solar forcing proxy is included. See Empirical Proof Sun Driving Climate (Scafetta 2023).

• The role of the Sun in climate change is hotly debated with diverse models.

• The Earth’s climate is likely influenced by the Sun through a variety of physical mechanisms.

• Balanced multi-proxy solar records were created and their climate effect assessed.

• Factors other than direct TSI forcing account for around 80% of the solar influence on the climate.

• Important solar-climate mechanisms must be investigated before developing reliable GCMs.

This issue may well become crucial if we go into a cooling period due to a drop in solar activity.

Summation

I appreciate very much the diligence and candor shown by the INM team in pursuing this monumental modeling challenge. The many complexities are evident, as well as the exacting attention to details in the attempt to dynamically and realistically represent Earth’s climate. It is also clear that clouds continue to be a major obstacle to model performance, both hindcasting and forecasting. I look forward to their future results.

Resources

Top Climate Model Gets Better

Best Climate Model: Mild Warming Forecasted

Temperatures According to Climate Models

What If Climate is Self-Regulating?

Andy Kessler writes at WSJ Can the Climate Heal Itself?  Excerpts in italics with my bolds and added images.

Dissenters from the catastrophe consensus on warming are worth listening to.

Stop with all the existential-crisis talk. President Biden said, “Climate change is literally an existential threat to our nation and to the world.” Defense Secretary Lloyd Austin also talks about the “existential threat” of climate change. National security adviser Jake Sullivan identifies an “accelerating climate crisis” as one reason for a “new consensus” for government picking winners and losers in the economy. Be wary of those touting consensus.

But what if the entire premise is wrong? What if the Earth is self-healing? Before you hurl the “climate denier” invective at me, let’s think this through. Earth has been around for 4.5 billion years— living organisms for 3.7 billion. Surely, an enlightened engineer might think, the planet’s creator built in a mechanism to regulate heat, or we wouldn’t still be here to worry about it.

The theory of climate change is that excess carbon dioxide and methane trap the sun's radiation in the atmosphere, and these man-made greenhouse gases reflect more of that heat back to Earth, warming the planet. Pretty simple. Eventually we reach a tipping point when positive feedback loops form (less ice to reflect sunlight, warm oceans that can no longer absorb carbon dioxide), and then we fry, existentially. So lose those gas stoves and carbon-spewing Suburbans.

Note that nearly half of incoming solar energy is not absorbed by Earth's surface.

But nothing is simple. What about negative feedback loops? Examples: human sweat and its cooling evaporation, or our irises dilating or constricting based on the amount of light coming in. Clouds, which can block the sun or trap its radiation, are rarely mentioned in climate talk.

Why? Because clouds are notoriously difficult to model in climate simulations. Steven Koonin, a New York University professor and author of “Unsettled,” tells me that today’s computing power can typically model the Earth’s atmosphere in grids 60 miles on a side. Pretty coarse. So, Mr. Koonin says, “the properties of clouds in climate models are often adjusted or ‘tuned’ to match observations.” Tuned!
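
Some back-of-envelope arithmetic shows why clouds slip through that grid; the numbers below are rough illustrations, with the 60-mile cell taken as about 100 km.

```python
EARTH_SURFACE_KM2 = 5.1e8    # Earth's surface area, km^2
cell_km = 100                # ~60-mile grid cell edge

n_columns = EARTH_SURFACE_KM2 / cell_km ** 2
print(f"Grid columns covering the globe: ~{n_columns:,.0f}")   # ~51,000

# A cumulus cloud is on the order of 1 km across, so a single grid cell's
# footprint could hold ~10,000 such clouds. The model cannot resolve them
# individually and must represent their bulk effect with tuned parameters.
cloud_km = 1
print(f"Cloud-sized patches per cell: ~{(cell_km // cloud_km) ** 2:,}")  # ~10,000
```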

Last month the coddling modelers at the United Nations’ World Meteorological Organization stated that “warming El Niño” and “human-induced climate change” mean there is a “66% likelihood that annual average global temperatures will exceed the threshold of 1.5 degrees Celsius above preindustrial levels by 2027.” Notice that El Niño is mentioned first.


Richard Lindzen, a professor at the Massachusetts Institute of Technology and lead author of an early Intergovernmental Panel on Climate Change report, told me, “Temperatures in the tropics remain relatively constant compared with changes in the tropics-to-pole temperatures. The tropics-polar difference is about 40 degrees Celsius today but was 20 degrees during the warm Eocene Epoch and 60 degrees during Ice Ages.” This difference has more to do with changes in the Earth’s rotation, like wobbling, than anything else. According to Mr. Lindzen, this effect is some 70 times as great as that of human-made greenhouse gases.

OK, back to clouds. Cumulus clouds, the puffy ones often called thunderclouds, are an important convection element, carrying heat from the Earth’s surface to the upper atmosphere. Above them are high-altitude cirrus clouds, which trap outgoing heat and radiate it back toward the surface. A 2001 Lindzen paper, however, suggests that high-level cirrus clouds in the tropics dissipate as temperatures rise. These thinning cirrus clouds allow more heat to escape. It’s called the Iris Effect, like a temperature-controlled vent opener for an actual greenhouse so you don’t (existentially) fry your plants. Yes, Earth has a safety valve.

Mr. Lindzen says, “This more than offsets the effect of greenhouse gases.” As you can imagine, theories debunking the climate consensus are met with rebuttals and more papers. Often, Mr. Lindzen points out, critics, “to maintain the warming narrative, adjust their models, especially coverage and reflection or albedo of clouds in the tropics.” More tuning.

A 2021 paper co-authored by Mr. Lindzen shows strong support for an Iris Effect. Maybe Earth really was built by an engineer. Proof? None other than astronomer Carl Sagan described the Faint Young Sun Paradox: 2.5 billion years ago the sun’s energy was 30% less, yet Earth’s climate was basically the same as today. Cirrus clouds likely formed to trap heat, a closed iris and a negative feedback loop at work.

Figure 2: At higher temperatures there are more thunderstorms over the ocean, and the area without high-level clouds (dry and clear) expands, allowing more heat to radiate off into space (strong OLR) than when temperatures are lower, i.e., when the iris is smaller. Source: Figure 1 from MS15.

In a 2015 Nature Geoscience paper, Thorsten Mauritsen and Bjorn Stevens of the Max Planck Institute for Meteorology reran climate models with the Iris Effect included and found they matched historical observations better. No need for tuning. Wouldn’t it be nice if the U.N. used realistic cloud and climate models?
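
As a cartoon of why an iris-like response damps warming, consider a zero-dimensional energy balance in which equilibrium warming satisfies ΔT = ΔF/λ, and a thinning-cirrus term adds to the restoring feedback λ. The parameter values below are invented for illustration and are not taken from MS15 or Lindzen’s papers.

```python
# Zero-dimensional energy balance: a forcing dF is balanced at equilibrium by
# dF = lambda_total * dT, where lambda_total sums the restoring feedbacks.
# An iris effect adds an extra stabilizing longwave term: thinning tropical
# cirrus lets more OLR escape as temperature rises.

dF = 3.7          # W/m^2, roughly a CO2 doubling
lam_base = 1.2    # W/m^2 per K, net restoring feedback without an iris (assumed)
lam_iris = 0.8    # W/m^2 per K, extra OLR per K from thinning cirrus (assumed)

dT_no_iris = dF / lam_base
dT_iris = dF / (lam_base + lam_iris)
print(f"Warming without iris: {dT_no_iris:.1f} K")  # ~3.1 K
print(f"Warming with iris:    {dT_iris:.1f} K")     # ~1.9 K
```

In this sign convention a larger λ means a stronger restoring response and therefore less warming, which is the sense in which an open iris acts as a safety valve.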

Earth has warmed, but I’m convinced negative feedback loops will save us. Dismissing the Iris Effect or detuning it isn’t science. Sadly, climate science has morphed into climate rhetoric. And note, Treasury Secretary Janet Yellen explained in April that green spending “is, at its core, about turning the climate crisis into an economic opportunity.” Hmmm. “Catastrophic,” “existential” and “crisis” are cloudy thinking. Negative feedback is welcome. Dissenters from the catastrophe consensus on warming are worth listening to.

Footnote–Phanerozoic Temperatures

Maurice Lavigne commented that the best evidence of our self-regulating climate is found in the Phanerozoic temperature record.  I had to find out what he meant, which led me to discover this:

The PhanSST global database of Phanerozoic sea surface temperature proxy data

And this graph from Nir Shaviv and Jan Veizer:

Cosmic radiation and temperature through the Phanerozoic according to Nir Shaviv and Jan Veizer. The vertical axis on the left shows temperature as deviations from the present temperature. The vertical axis on the right shows cosmic radiation as multiples of today’s radiation (today’s radiation is set to 1); note that the right scale is inverted so that strong radiation lines up with low temperature. The red curve represents temperature and the blue curve cosmic radiation; the two appear to correlate very well. The horizontal axis represents time through the Phanerozoic’s more than 500 million years. Note that the Carboniferous is divided into “Mississippian” and “Pennsylvanian”, an American custom referring to different types of coal from the coal mines.

The image above comes from Christopher Scotese’s PaleoMAP Project, showing the dramatic temperature and climate shifts, hothouse to icehouse and everything in between.  Finally, a graph shows these temperature cycles to be unrelated to CO2 concentrations.

See Also More Evidence of Nature’s Sunscreen

Greenhouse with adjustable sun screens to control warming.