I may be one of the first scientists in the country to know that
predicting long-term temperatures is not possible.
Almost 50 years ago, while in grad school, I had a contract from an Army research lab to use state-of-the-art models to predict long-term temperatures. I quickly realized that the goal of the project, to forecast the temperature accurately long-term, was impossible because small errors in data inputs could result in huge forecast errors. Equally important, errors compounded so quickly that the error ranges exploded. The results were junk.
As an example, what good is a temperature forecast with an error range
of plus or minus one hundred degrees?
I give university speeches to scientists and tell them: if you ever see data or forecasts, your first question has to be “what’s the error range?” If you don’t know the error range, the data are almost useless. It’s not coincidental that the Climate Mafia don’t highlight this problem.
So what about modern technology solving these problems? These error problems are still true today. It’s not that the long-term temperature forecasts are wrong; it’s that they can’t be right. All global warming modelers know this, or they are incredibly stupid, or they just lie about it for money or power.
When the U.N. Intergovernmental Panel on Climate Change made even a pretense of being science-based, they used to admit it. From the 2001 IPCC Third Assessment Report:
“The climate system is a coupled non-linear chaotic system, and therefore
the long-term prediction of future climate states is not possible.”
The weather is a coupled, non-linear chaotic system. Chaos theory says very small changes in inputs can result in totally different outcomes. This concept is counterintuitive for most people. We intrinsically think that if you’re a little off at the beginning, you should be a little off at the end. Try that on a mountain trail next to a cliff.
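The sensitivity being described can be demonstrated with the logistic map, a standard toy chaotic system; this is an illustration of the principle only, not a climate model:

```python
# Logistic map x_{n+1} = r*x*(1-x) in its chaotic regime (r = 4):
# two trajectories starting one part in a hundred million apart
# end up completely unrelated after a few dozen steps.

def logistic_trajectory(x0, r=4.0, steps=50):
    xs = [x0]
    for _ in range(steps):
        xs.append(r * xs[-1] * (1.0 - xs[-1]))
    return xs

a = logistic_trajectory(0.20000000)
b = logistic_trajectory(0.20000001)  # inputs differ by 1e-8

print(abs(a[5] - b[5]))    # still tiny: "a little off at the beginning"
print(abs(a[50] - b[50]))  # order-one: a totally different outcome
```

The divergence rate is the Lyapunov exponent; for this map the initial error roughly doubles every step, which is the mountain-trail-by-the-cliff effect in miniature.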
The Climate Mafia know that this is true, but they still want money and power. They argue that even though you can’t make a real temperature forecast, they can create a completely bogus forecasting approach: they take a bunch of different climate models that don’t agree (so much for settled science) and combine their outputs. They then say voilà, we have a correct prediction, and they use pseudo-statistics to get around the error problem. The way I visualize it: take a bunch (an “ensemble” sounds more scientific) of wrong answers, combine them, and call that the right answer. Absurd.
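Mechanically, the ensemble approach being criticized is nothing more exotic than an average across model outputs. A toy sketch, with hypothetical numbers chosen to resemble the CMIP6 sensitivity spread:

```python
# Hypothetical "multi-model ensemble": warming sensitivities (K) from
# models that disagree with one another. The numbers are invented,
# with a spread like the CMIP6 ECS range.
projections = [1.8, 2.9, 3.3, 4.1, 5.6]

ensemble_mean = sum(projections) / len(projections)
spread = max(projections) - min(projections)

print(round(ensemble_mean, 2))  # 3.54
print(round(spread, 1))         # 3.8 K of disagreement among the members
```

The mean is well defined arithmetically; the dispute is over whether averaging disagreeing models says anything about the real climate.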
Since your input data are critical to forecasting the Chaotic Future, fully understanding past temperatures is also critical. The Climate Mafia invented the entirely bogus concept of an “average Earth temperature” to create a bogus base data set for their bogus models. The Warming Scammers like to use a garbage temperature history that starts about 1850. The Scammers say that the temperature increases since 1850, just coincidentally at the end of a three-hundred-year cooling cycle, represent the rise of the industrial pollution age. In 1850, and even in 1950, only a small percentage of the world’s population could even be considered close to industrialized. Look at India, Africa, and China then: almost medieval energy use patterns until really recently. Humans have been around in their current form for many tens of thousands of years. To say the weather since the 1850s is representative of anything from a statistical perspective is a joke.
So what kind of temperature data do we have since 1850?
With oceans and ice caps covering more than 80% of the world’s surface, we have virtually no reliable long-term data on any of that, other than the last few decades. Even then, you are talking about a relatively small number of measuring devices in all those places. (Do you check the weather a few hundred miles away to know if you need an umbrella?) How about the temperature trends in deserts, on mountains, in the middle of Africa, South America, Siberia — at sea level, a hundred feet elevation, a thousand feet elevation? The data are so bad in all of the Southern Hemisphere — half the globe — that there are only a few datasets even close to reliable since the 1850s. There are almost no real, reliable, and complete long-term data globally, and particularly none reliable enough to create a model of a chaotic system entirely dependent on very accurate input data.
The concept of “average Earth temperature” is critical to their bogus forecasting, but because we think “average” generally means something useful, it gets a mental pass. As an example, describe the “average” human.
Let me offer a thought experiment. What is the average temperature
of your house within one degree?
How many sensors, reading the temperature with what accuracy and how often, would you need to know the average temperature within one degree? One sensor won’t do it. Would ten sensors (a hundred? a thousand?) be needed to cover the ceilings, floors, six feet up, each corner of every room, near every door, window, and heat source, measuring the temperature every hour, minute, or second for a period of years, to get an average temperature within one degree? How do you weight a gauge in a big room versus a small one, or at the ceiling versus the floor?
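The weighting question does have a textbook answer in principle: a volume-weighted mean, where each sensor stands in for a share of the space. A minimal sketch with invented readings and weights (the weights themselves are the hard part):

```python
# Hypothetical sensors: (reading in degrees F, fraction of house volume
# each sensor "owns"). Both the readings and the weights are invented.
sensors = [
    (75.0, 0.50),  # large living room
    (68.0, 0.10),  # small bedroom
    (80.0, 0.05),  # near a heat source
    (64.0, 0.05),  # floor level in a cold corner
    (71.0, 0.30),  # remaining rooms
]

# The weights must cover the whole house exactly once.
assert abs(sum(w for _, w in sensors) - 1.0) < 1e-9

weighted_avg = sum(t * w for t, w in sensors)
naive_avg = sum(t for t, _ in sensors) / len(sensors)

print(round(weighted_avg, 2))  # 72.8  (volume-weighted mean)
print(round(naive_avg, 2))     # 71.6  (naive mean of the same sensors)
```

Even in this five-sensor toy, the weighted and unweighted answers differ by more than the one-degree target, which is the point of the thought experiment.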
If we can’t even figure out the long-term average temperature of one building, it is complete hubris to think we can create accurate enough global temperature inputs to predict 100 years out.
I won’t even get into outright data fraud, like lowering the hot 1930s or selectively changing input locations to make the current temperatures look hotter. The data are bad enough on their own. Using faulty data to predict the future creates faulty forecasts. Garbage in, garbage out.
Perhaps the greatest part of the Global Warming Scam is that it requires a complete disregard for common sense. We all know that weather forecasters can’t predict next week’s weather within 1 degree, but the Scammers push the lie that they can predict the temperature a hundred years from now within a degree or two. That literally defies credulity.
More importantly, say you had perfect data and a perfect model. How could we possibly know what impact the predicted warming will have? Another thought experiment: How much will the temperature vary where you live today? Ten degrees? Twenty? Thirty? How much does it vary in a year? For most of the U.S., that number might be 50 degrees, even 100 degrees. So plants and animals have adapted to 20-degree temperature changes in a day and 100-degree temperature changes in a year.
Daily average temperature variability of Bolu City, Turkey, and its 365-day moving average.
Somehow a 1- or 2-degree temperature change in a hundred years
is going to take them out? That’s ridiculous.
Finally, when somebody offers me a forecast, my first question is: how right have your other forecasts been? We are now in the third or fourth ten-year period of the last forty years in which the world is going to end in ten years. That tells you all you need to know about global warming forecasting.
The fact that any counter-narratives have to be censored is also damning. A final fact to consider is one my mother taught me at a young age: when the other side starts calling you names (deniers, anti-vaxxers), you know they have lost the argument.
And yet we should totally restructure society based on those impossible models. This was probably the greatest scientific lie in history, and people believed it. After that, selling the lie that mRNA vaccines, which had never worked in thirty years and generally killed their test subjects, are now safe and effective for humans, including babies, after two months of limited testing, is child’s play. Joseph Goebbels, a proponent of the “Big Lie,” would be proud of both lies.
Before we restructure the world based on models, we must realize they can’t be right.
The blue background obscures that these are estimates of TRILLIONS of dollars.
CO2 Coalition explains: Nobel Laureate (Physics 2022) Dr. John Clauser was to present a seminar on climate models to the IMF on Thursday, and now his talk has been summarily canceled. According to an email he received last evening, Pablo Moreno, Director of the Independent Evaluation Office of the International Monetary Fund, had read the flyer for John’s July 25 Zoom talk and immediately canceled it. Technically, it was “postponed.”
Dr. Clauser had previously criticized the awarding of the 2021 Nobel Prize for work in the development of computer models predicting global warming and told President Biden that he disagreed with his climate policies. Dr. Clauser has developed a climate model that adds a new significant dominant process to existing models. The process involves the visible light reflected by cumulus clouds that cover, on average, half of the Earth. Existing models greatly underestimate this cloud feedback, which provides a very powerful, dominant thermostatic control of the Earth’s temperature.
More recently, he addressed the Korea Quantum Conference where he stated, “I don’t believe there is a climate crisis” and expressed his belief that “key processes are exaggerated and misunderstood by approximately 200 times.” Dr. Clauser, who is recognized as a climate change skeptic, also became a member of the board of directors of the CO2 Coalition last month, an organization that argues that carbon dioxide emissions are beneficial to life on Earth.
Current climate models demonstrate large discrepancies in equilibrium climate sensitivity (ECS). The effects of cloudiness parameterization changes on the ECS of the INM-CM4-8 climate model were investigated. This model shows the lowest ECS among CMIP6 models. Reasonable changes in the parameterization of the degree of cloudiness yielded ECS variability of 1.8–4.1 K in INM-CM4-8, which spans more than half of the interval for the CMIP6 models.
The three principal mechanisms responsible for the increased ECS were increased cloudiness dissipation in warmer climates due to the increased water vapor deficit in the non-cloud fraction of a cell, decreased cloudiness generation in the atmospheric boundary layer in warm climates, and the instantaneous cloud response to CO2 increases due to stratification changes.
Introduction
In CMIP6 the lowest and highest ECS (equilibrium climate sensitivity) values are 1.8 and 5.6 K, respectively (Zelinka et al., 2020). Climate response to an external forcing produces feedbacks: positive feedback enhances the response to forcing, negative feedback weakens it. Analysis of climate feedback shows that cloud feedback is the principal reason for the broad range of ECS (Zelinka et al., 2020). Clouds (especially low clouds) are significantly reduced with global warming in models with high ECS, resulting in positive feedback. Models with low sensitivity show small cloudiness changes with global warming; some models feature an increase in low clouds in warmer climates, creating a negative feedback.
Clouds produce shortwave and longwave radiative effects. The shortwave cloud radiative effect (SW CRE) is generally negative, because cloudiness reflects solar radiation that would otherwise be absorbed by the climate system. The shortwave effect is usually strongest for low clouds that have high amounts of liquid water and high albedos. The longwave cloud radiative effect (LW CRE) is generally positive, because cloud tops are usually much colder than the surface of the Earth; thus, thermal radiation from the cloud top is much lower than that from the surface. Negative/positive CRE produces cooling/warming from clouds.
In this study, we turn off some of the mechanisms responsible for large-scale cloud evolution that increase or decrease ECS, and ECS changes by a factor of more than 2; the role of each chosen mechanism in increasing or decreasing ECS can thus be clearly seen. At the same time, all model versions simulate the preindustrial climate with systematic biases comparable to those of the version used in CMIP6. This study estimates the impact of parameterization changes on cloud feedback realistically, by keeping the cloud mean state realistic in all model versions and by running 4xCO2 experiments rather than uniform +4K experiments.
Table 1. Summary of Model Versions
Note. Equilibrium climate sensitivity ECS (K), effective radiative forcing ERF (W m−2), climate feedback parameter λ (W m−2 K−1), shortwave cloud radiative feedback CRFSW, longwave cloud radiative feedback CRFLW, net cloud radiative feedback CRFNET (W m−2 K−1), and instantaneous cloud radiative forcing change ΔCREINST (W m−2).
The ECS estimation method, commonly used in CMIP5 and CMIP6, was proposed by Gregory et al. (2004). Two model runs were performed: a control run, in which all forcings were fixed at preindustrial levels, and a run in which the concentration of CO2 in the atmosphere was four times higher than in the control run (the 4CO2 run). The initial state for both runs was the same, taken from a sufficiently long control run. Each run had a length of 150 years. Subsequently, the differences in global mean surface temperature (GMST) and in the heat balance at the top of the atmosphere (THB) between the 4CO2 run and the control run were calculated for each model year.
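The Gregory method amounts to a linear regression: annual-mean TOA imbalance N is regressed against warming ΔT, the x-intercept gives the equilibrium warming for 4xCO2, and halving it gives ECS. A sketch with synthetic data; the forcing and feedback values are assumptions loosely modeled on the version 1 numbers in Table 1 (4xCO2 forcing taken as twice the quoted 2.7 W m−2 ERF):

```python
# Gregory et al. (2004) ECS estimation, sketched on synthetic data.
# N = F + lam * dT; at equilibrium N = 0, so dT_eq = -F / lam,
# and ECS = dT_eq / 2 for a 4xCO2 run.
import random

random.seed(0)
F, lam = 5.4, -1.46            # assumed 4xCO2 forcing (W m-2) and feedback (W m-2 K-1)
true_dT_eq = -F / lam          # ~3.7 K for 4xCO2, i.e. ECS ~1.85 K

# 150 "model years": warming relaxing toward equilibrium, noisy TOA imbalance.
dT = [true_dT_eq * (1 - 0.97 ** (y + 1)) for y in range(150)]
N = [F + lam * t + random.gauss(0, 0.3) for t in dT]

# Ordinary least squares of N against dT.
n = len(dT)
mx, my = sum(dT) / n, sum(N) / n
slope = sum((x - mx) * (y - my) for x, y in zip(dT, N)) / sum((x - mx) ** 2 for x in dT)
intercept = my - slope * mx

ecs = (-intercept / slope) / 2  # x-intercept of the regression line, halved
print(round(ecs, 2))            # recovers a value close to the true 1.85 K
```

The slope estimate is the climate feedback parameter λ and the intercept is the effective forcing, which is how the λ and ERF columns of Table 1 are obtained.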
Results of the sensitivity experiments performed with the five climate model versions.
Version 1 shows a very low ECS of 1.8 K due to a low negative climate feedback parameter value of −1.46 W m−2 K−1 (interval from −0.6 to −1.8 W m−2 K−1 for CMIP5 and CMIP6) and a low ERF value of 2.7 W m−2 (intervals of 2.6–4.4 and 2.7–4.3 W m−2 for CMIP5 and CMIP6, respectively, Zelinka et al., 2020). The low ECS was accompanied by mostly negative CRF in both the SW and LW spectral intervals (Figure 2 below).
Figure 2. Shortwave (top), longwave (middle), and net (bottom) cloud radiative feedback (W m−2 K−1) for model versions 1 (purple), 2 (yellow), 3 (red), 4 (green), and 5 (blue). Data are multiplied by the cosine of latitude.
The parameterization replacement scheme for cloudiness in version 2 dramatically changed all the parameters, and the ECS more than doubled to 3.8 K. ERF increased to 3.8 W m−2 without any change to the radiation code, because ΔCREINST changed from −0.88 W m−2 to −0.13 W m−2. Additionally, the climate feedback parameter increased from −1.46 W m−2 K−1 to −1.0 W m−2 K−1. In version 2, global warming was associated with decreased cloudiness at all levels. The net cloud radiative feedback became positive. Version 2 yielded significantly increased net and SW cloud radiative feedbacks at all latitudes compared with version 1. Analysis of the sensitivity experiment results of versions 3–5 helps explain the mechanisms of these significant changes.
Version 3 features a suppressed mechanism of high tropical cloudiness due to decreased convective mass flux and higher ECS than version 2 (4.1 K); however, the change is not very pronounced. The LW CRF in the tropics increases in version 3 compared to version 2. The decrease in SW CRE is not very pronounced; therefore, increased net CRF increases ECS. This confirms our hypothesis that suppressing the decrease in tropical cloudiness should increase ECS and that the impact of this mechanism on ECS is noticeable but not very strong.
ECS is noticeably lower in version 4 (2.9 K) than in version 2. Thus, the mechanism of the decrease in boundary layer cloudiness due to decreased cloudiness generation by boundary layer turbulence is crucial for ECS. SW and LW CRF decreased in version 4 compared to version 2, primarily in the tropics and subtropics.
The mechanism of increased cloud dissipation under global warming conditions was suppressed in version 5, ECS was reduced to 2.5 K, and the climate feedback parameter decreased to −1.56 W m−2 K−1. Additionally, the SW CRF decreased in version 5 compared with version 4, primarily in the tropics and subtropics. In this version, all the mechanisms that decrease clouds with increased temperature, as raised in the previous section, are suppressed. The principal reason for the ECS difference between versions 1 and 5 is the instantaneous adjustment rather than the feedback (see Table 1). ΔCREINST values in versions 1 and 5 were −0.88 and 0.16 W m−2, respectively.
Conclusion
All model versions demonstrate similar model bias values for annual mean CRE, near-surface temperature, and precipitation; thus, determining a relation between present-day climate simulation model quality and ECS is difficult. Version 1 has slightly better quality (because it is a CMIP6 version) due to extensive tuning. The cloudiness scheme used in version 1 contained the dependence of low clouds on stratification. An increase in CO2 leads to more stable stratification and more low clouds and may be the primary cause of the low ECS. Bretherton (2015) and Geoffroy et al. (2017) obtained similar results. The decrease in clouds in warmer climates due to the mixing of cloud air with the unsaturated environment was also stated in a review by Gettelman and Sherwood (2016). Our results confirm those by Bony et al. (2006), Brient and Bony (2012), and others that a significant change in the response of low clouds to global warming leads to significant changes in cloud radiative feedback and ECS.
Comment
The main finding: if warming increases low clouds, then absorbed SW (incoming solar radiation) is reduced, counteracting the warming, in effect a negative feedback. That is consistent with Clauser’s position.
Previous posts (linked at the end) discuss how the climate model from RAS (Russian Academy of Sciences) has evolved through several versions. The interest arose because of its greater ability to replicate the past temperature history. The model is part of the CMIP program, which will soon take the next step to CMIP7, and it is one of the first to test with a new climate simulation.
This synopsis is made possible thanks to the lead author, Evgeny M. Volodin, providing me with a copy of the article published May 11, 2023 in Izvestiya, Atmospheric and Oceanic Physics. Those with institutional research credentials can access the paper at Simulation of Present-Day Climate with the INMCM60 Model by E. M. Volodin, et al. (2023). Excerpts are in italics with my bolds and added images and comment.
Abstract
A simulation of the present-day climate with a new version of the climate model developed at the Institute of Numerical Mathematics of the Russian Academy of Sciences (INM RAS) is considered. This model differs from the previous version by a change in the cloud and condensation scheme, which leads to a higher sensitivity to the increase in CO2. The changes are also included in the calculation of aerosol evolution, the aerosol indirect effect, land snow, atmospheric boundary-layer parameterization, and some other schemes.
The model is capable of reproducing near-surface air temperature, precipitation, sea-level pressure, cloud radiative forcing, and other parameters better than the previous version. The largest improvement can be seen in the simulation of temperature in the tropical troposphere and at the polar tropopause, and of the surface temperature of the Southern Ocean. The simulation of climate changes in 1850–2021 by the two model versions is discussed.
Introduction
A new version has been developed on the basis of the climate system model described in [1]. It was shown [2] that introducing changes only to the cloud parameterization would produce climate models with different equilibrium sensitivities to a doubling of CO2, in a range of 1.8 to 4.1 K. The INMCM48 version has the lowest sensitivity of 1.8 K among Coupled Model Intercomparison Project Phase 6 (CMIP6) models. The natural question then arises as to how parameterization changes that increase the equilibrium sensitivity affect the simulation of modern climate and of its changes observed in recent decades.
The INMCM48 version simulates modern climate quite well, but it has some systematic biases common to many current climate models, as well as biases specific only to this model. For example, most climate models overestimate surface temperatures and near-surface air temperatures at southern midlatitudes and off the east coasts of the tropical Pacific and Atlantic oceans, and underestimate surface air temperatures in the Arctic (see, e.g., [3]). A typical error of many current models, as well as of INMCM48, is a cold polar tropopause and a warm tropical tropopause, resulting in an overestimation of the westerlies in the midlatitude stratosphere. Possible sources of such systematic biases are errors in the simulation of cloud amount and optical properties.
In the next version, therefore, changes were first made in cloud parameterization. Furthermore, the INMCM48 exhibited systematic biases specific solely to it. These are, for example, the overestimation of sea-level pressure, as well as of geopotential at any level in the troposphere, over the North Pacific. The likely reason for such biases seems to be related to errors in the heat sources located southward, over the tropical Pacific.
In this study, it is shown how changes in physical parameterizations,
including clouds, affect systematic biases in the simulation of
modern climate and its changes observed in recent decades.
Model and Numerical Experiments
The INMCM60 model, like the previous INMCM48 [1], consists of three major components: atmospheric dynamics, aerosol evolution, and ocean dynamics. The atmospheric component incorporates a land model including surface, vegetation, and soil. The oceanic component also encompasses a sea-ice evolution model. Both versions in the atmosphere have a spatial 2° × 1° longitude-by-latitude resolution and 21 vertical levels up to 10 hPa. In the ocean, the resolution is 1° × 0.5° and 40 levels.
The following changes have been introduced into the model
compared to INMCM48.
Parameterization of clouds and large-scale condensation is identical to that described in [4], except that the tuning parameters differ from any of the versions outlined in [3], being, however, closest to version 4. The main difference from it is that the cloud water flux generating boundary-layer clouds is estimated not only from the development of boundary-layer turbulence, but also from the condition of moist instability, which, under deep convection, results in fewer clouds in the boundary layer and more in the upper troposphere. The equilibrium sensitivity of such a version to a doubling of atmospheric CO2 is about 3.3 K.
The aerosol scheme has also been updated by including a change in the calculation of natural emissions of sulfate aerosol [5] and wet scavenging, as well as the influence of aerosol concentration on the cloud droplet radius, i.e., the first indirect effect [6]. Numerical values of the constants, however, were taken to be a little different from those used in [5]. Additionally, the improved scheme of snow evolution taking into account refreezing and the calculation of the snow albedo [7] were introduced to the model. The calculation of universal functions in the atmospheric boundary layer in stable stratification has also been changed: in the latest model version, such functions assume turbulence at even large gradient Richardson numbers [8].
A numerical model experiment to simulate a preindustrial climate was run for 180 years, not including the 200 years of spin-up during which equilibrium was reached.
All climate forcings in this experiment were held at their 1850 level. Along with a preindustrial experiment, a numerical experiment was run to simulate climate change in 1850–2029, for which forcings for 1850–2014 were prescribed consistent with observational estimates [9], while forcings for 2015–2029 were set according to the Shared Socioeconomic Pathway (SSP3-7.0) scenario [10].
To verify the simulation of present-day climate, the data from the experiment with a realistic forcing change for 1985–2014 were used and compared against the European Centre for Medium-Range Weather Forecasts (ECMWF) Reanalysis fifth generation (ERA5) data [11], the Global Precipitation Climatology Project, version 2.3 (GPCP 2.3) precipitation data [12], and the Clouds and the Earth’s Radiant Energy System (CERES) Energy Balanced and Fitted Edition 4.1 (CERES-EBAF 4.1) top-of-atmosphere (TOA) outgoing radiation fluxes [13].
The root-mean-square deviation of the annual and monthly averages of modeled and observed fields was used as a measure for the deviation of model data from observations, for which the observed fields were interpolated into a model grid. For calculating the sea-level pressure and 850-hPa temperature errors, grid points with a height over 1500 m were excluded. The modeled surface air temperatures were compared with the Met Office Hadley Center/Climatic Research Unit version 5 (HadCRUT5) dataset [14].
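The error norm described here is a root-mean-square deviation over grid points; on a latitude-longitude grid the points should be weighted by the cosine of latitude so the shrinking polar cells do not dominate. A minimal sketch with invented fields (the helper and the numbers are hypothetical, not the paper's code):

```python
import math

def rms_deviation(model, obs, lats):
    """Area-weighted RMS deviation of two gridded fields whose rows
    are centered on the given latitudes (degrees). Hypothetical helper."""
    num, den = 0.0, 0.0
    for m_row, o_row, lat in zip(model, obs, lats):
        w = math.cos(math.radians(lat))   # grid-cell area shrinks toward the poles
        for m, o in zip(m_row, o_row):
            num += w * (m - o) ** 2
            den += w
    return math.sqrt(num / den)

# Tiny invented 3x4 temperature fields (K); rows at 60N, 0, 60S.
model = [[271.0, 272.0, 273.0, 272.0], [299.0, 300.0, 301.0, 300.0], [268.0, 269.0, 270.0, 269.0]]
obs   = [[270.0, 271.0, 272.0, 271.0], [299.5, 300.5, 301.5, 300.5], [268.0, 269.0, 270.0, 269.0]]
lats  = [60.0, 0.0, -60.0]

print(round(rms_deviation(model, obs, lats), 3))  # 0.612
```

Here the 1 K errors at 60° latitude count half as much as the 0.5 K tropical errors, which is why the weighted norm lands between the two.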
Results
Below are some results of the present-day climate simulation. Because changes in the model were introduced mainly into the scheme of atmospheric dynamics and land surface and there were no essential changes in the oceanic component, we shall restrict our discussion to atmospheric dynamics.
Table 1 demonstrates that the norms of errors in most fields were reduced. Changing the calculation of cloud cover and properties has improved the cloud radiative forcing, and the norm of errors for both longwave and shortwave forcing decreased by 10–20% in the new version compared to its predecessor. The global average of short-wave cloud radiative forcing is –47.7 W/m2 in the new version, –40.5 W/m2 in the previous version, and about –47 W/m2 in CERES-EBAF. The average TOA longwave radiative forcing is 29.5 W/m2 in the new version, 23.2 W/m2 in the previous version, and 28 W/m2 in CERES-EBAF. Thus, the average longwave and shortwave cloud radiative forcing in the new model version has proven to be much closer to observations than in the previous version.
From Table 1, the norm of the systematic bias has decreased significantly for 850-hPa temperatures and 500-hPa geopotential height. It was reduced mainly because average values of these fields approached observations, whereas the averages of both fields in INMCM48 were underestimated.
Figure 3 Volodin et al (2023)
We now consider the simulation of climate changes for 1850–2021. The 5-year mean surface temperature from HadCRUT5 (black), INMCM48 (blue), and INMCM60 (red) is shown in Fig. 3. The average for the period 1850 to 1899 is subtracted for each of the three datasets. The model data are slightly extended to the future, so that the most recent value matches the 2025–2029 average. It can be seen that warming in both versions by 2010 is about 1 K, approximately consistent with observations. The observed climate changes, such as the warmer 1940s and 1950s and the slower warming, or even a small cooling, in the 1960s and 1970s, are also obtained in both model versions. However, the warming after 2010–2014 turns out to be far larger in the new version than in the previous one, with differences reaching 0.5 K in 2025–2029. The discrepancies between the two versions are most distinct in the rate of temperature rise from 1990–1994 to 2025–2029. In INMCM48, the temperature rises by about 0.8 K, while the increase for INMCM60 is about 1.5 K. The discrepancy appears to have been caused primarily by a different sensitivity of the models, but a substantial contribution may also come from natural variability, so a more reliable conclusion could be made only by running ensemble numerical experiments.
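The construction behind Fig. 3 (5-year means, anomalies relative to the 1850–1899 average) is easy to sketch. The series below is an invented toy, not HadCRUT or model data:

```python
def anomalies_5yr(annual, start_year=1850, base_end=1899):
    """5-year-mean anomalies relative to the start_year..base_end average.
    `annual` is a list of annual global mean temperatures from start_year on."""
    n_base = base_end - start_year + 1
    baseline = sum(annual[:n_base]) / n_base
    out = []
    for i in range(0, len(annual) - len(annual) % 5, 5):
        pentad = annual[i:i + 5]
        out.append((start_year + i, sum(pentad) / 5 - baseline))
    return out

# Invented toy series: flat at 14.0 C for 50 years, then a slow 0.02 C/yr rise.
series = [14.0] * 50 + [14.0 + 0.02 * k for k in range(1, 51)]
for year, anom in anomalies_5yr(series)[-2:]:
    print(year, round(anom, 2))
```

Note that every curve in Fig. 3 depends on this baseline choice: subtracting a different reference period shifts all three datasets up or down together.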
Figure 4 Volodin et al. (2023)
Figure 4 displays the difference in surface air temperature between 2000–2021 and 1979–1999 from HadCRUT5 (top), the new version (middle), and the previous version (bottom). This is the interval where the warming was the largest, as seen in Fig. 3. The observational data show that the largest warming, above 2 K, was in the Arctic; there was a warming of about 1 K in the Northern Hemisphere midlatitudes; and there was hardly any warming over the Southern Ocean. The pattern associated with a transition of the Pacific Decadal Oscillation (PDO) from the positive phase to the negative one appears over the Pacific Ocean. The new model version simulates the temperature rise at high and middle northern latitudes more closely to observations, whereas the previous version underestimates the rise in temperature in that region.
At the same time, over the tropical oceans where the observed warming is small, the data from the previous model version agree better with observations, while the new version overestimates warming. Both models failed to reproduce the Pacific Ocean temperature changes resulting from the positive-to-negative phase transition of the PDO, as well as the near-zero temperature changes at southern midlatitudes. Large differences in temperature change in the Atlantic sector of the Arctic, where there is some temperature decrease in the INMCM48 model and a substantial increase in INMCM60, are most probably caused by natural climate fluctuations in this region, so a reliable conclusion regarding the response of these model versions to observed forcing could also be drawn here only by running ensemble numerical experiments.
Conclusions
The INMCM60 climate model is able to simulate present-day climate better than the previous version. The largest decrease can be seen in systematic biases connected with an overestimation of surface temperatures at southern midlatitudes, an underestimation of surface air temperatures in the Arctic, and an underestimation of polar tropopause and tropical tropospheric temperatures. The simulation of the cloud radiative forcing has also improved.
Despite different equilibrium sensitivities to doubled CO2, both model versions show approximately the same global warming by 2010–2015, similar to observations. However, projections of global temperature for 2025–2029 already differ between the two model versions by about 0.5 K. A more reliable conclusion regarding the difference in the simulation of current climate changes by the two model versions could be made by running ensemble simulations, but this is likely to be done later because of the large amount of computational time and computer resources it will take.
My Comments
1. Note that INMCM60 runs hotter than version 48 and HadCRUT5. However, as the author points out, this is only a single simulation run, and a truer result will come later from an ensemble of multiple runs. There were several other references to tentative findings awaiting ensemble runs yet to be done.
For example, see the comparable ensemble performance of the previous version (then referred to as INMCM5).
Figure 1. The 5-year mean GMST (K) anomaly with respect to 1850–1899 for HadCRUTv4 (thick solid black); model mean (thick solid red). Dashed thin lines represent data from individual model runs: 1 – purple, 2 – dark blue, 3 – blue, 4 – green, 5 – yellow, 6 – orange, 7 – magenta. In this and the next figures numbers on the time axis indicate the first year of the 5-year mean.
2. This study confirms the warming impact of the cloud parameters that appear in all the CMIP6 models. They all run hotter primarily because of changes in cloud settings. The author explains how INMCM60 performance improved in various respects, but the improvement came with increased CO2 sensitivity. That value rose from 1.8C per doubling to 3.3C, shifting the model from the lowest to the middle of the range of CMIP6 models. (See Climate Models: Good, Bad and Ugly)
Figure 8: Warming in the tropical troposphere according to the CMIP6 models. Trends 1979–2014 (except the rightmost model, which is to 2007), for 20°N–20°S, 300–200 hPa.
3. The HadCRUT5 data show about a 0.1C increase in annual global temperatures compared to HadCRUT4.6. There are two reasons for this:
• The change in sea surface temperatures moving from HadSST3 to HadSST4.
• The interpolation of nearby station data into previously empty grid cells.
Here I look into how large each effect is. Shown above is a comparison of HadCRUT4.6 with HadCRUT5.
Coincidentally or not, with the temperature standard shifting to HadCRUT5, model parameters shifted to show more warming to match. Skeptics of climate models are not encouraged by seeing warming added into the temperature record, followed by models tuned to increase CO2 warming.
4. The additional warming in both the model and in HadCRUT5 is mostly located in the Arctic. However, those observations include a warming bias derived from using datasets of anomalies rather than actual temperature readings.
Clive Best provides this animation of recent monthly temperature anomalies, which demonstrates how most variability in anomalies occurs over northern continents.
The main problem with all the existing observational datasets is that they don’t actually measure the global temperature at all. Instead they measure the global average temperature ‘anomaly’. . .The use of anomalies introduces a new bias because they are now dominated by the larger ‘anomalies’ occurring at cold places in high latitudes. The reason for this is obvious, because all extreme seasonal variations in temperature occur in northern continents, with the exception of Antarctica. Increases in anomalies are mainly due to an increase in the minimum winter temperatures, especially near the arctic circle.
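Best's point about high-latitude dominance can be sketched numerically. The two bands and anomaly values below are hypothetical illustrations (not real data); the cos(latitude) weights are the standard area weighting used for gridded products:

```python
from math import cos, radians

# Hypothetical illustration (not real data): two latitude bands averaged
# with the same cos(latitude) area weighting used for gridded products.
# High-latitude anomalies are far larger than tropical ones, so even after
# down-weighting for area they dominate the global-mean anomaly.
bands = [
    {"lat": 10.0, "anomaly": 0.1},  # tropical band: small anomaly
    {"lat": 70.0, "anomaly": 2.0},  # arctic band: large winter-minimum anomaly
]

weights = [cos(radians(b["lat"])) for b in bands]
global_mean = sum(w * b["anomaly"] for w, b in zip(weights, bands)) / sum(weights)

# Share of the weighted anomaly sum contributed by the arctic band
arctic_share = (weights[1] * bands[1]["anomaly"]) / sum(
    w * b["anomaly"] for w, b in zip(weights, bands))
print(f"global mean anomaly: {global_mean:.2f} K, arctic share: {arctic_share:.0%}")
```

With these illustrative numbers, the arctic band supplies most of the global-mean anomaly despite its small area weight, which is the bias Best describes.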
A study of temperature trends recorded at weather stations around the Arctic showed the same pattern as the rest of NH. See Arctic Warming Unalarming
5. The CMIP program specifies that participating models include CO2 forcing and exclude solar forcing. Aerosols are the main parameter for tuning models to match. Scafetta has shown recently that models perform better when a solar forcing proxy is included. See Empirical Proof Sun Driving Climate (Scafetta 2023)
• The role of the Sun in climate change is hotly debated with diverse models.
• The Earth’s climate is likely influenced by the Sun through a variety of physical mechanisms.
• Balanced multi-proxy solar records were created and their climate effect assessed.
• Factors other than direct TSI forcing account for around 80% of the solar influence on the climate.
• Important solar-climate mechanisms must be investigated before developing reliable GCMs.
This issue may well become crucial if we go into a cooling period due to a drop in solar activity.
Summation
I appreciate very much the diligence and candor shown by the INM team in pursuing this monumental modeling challenge. The many complexities are evident, as well as the exacting attention to detail in the attempt to dynamically and realistically represent Earth's climate. It is also clear that clouds continue to be a major obstacle to model performance, in both hindcasting and forecasting. I look forward to their future results.
Dissenters from the catastrophe consensus on warming are worth listening to.
Stop with all the existential-crisis talk. President Biden said, “Climate change is literally an existential threat to our nation and to the world.” Defense Secretary Lloyd Austin also talks about the “existential threat” of climate change. National security adviser Jake Sullivan identifies an “accelerating climate crisis” as one reason for a “new consensus” for government picking winners and losers in the economy. Be wary of those touting consensus.
But what if the entire premise is wrong? What if the Earth is self-healing? Before you hurl the "climate denier" invective at me, let's think this through. Earth has been around for 4.5 billion years, living organisms for 3.7 billion. Surely, an enlightened engineer might think, the planet's creator built in a mechanism to regulate heat, or we wouldn't still be here to worry about it.
The theory of climate change is that excess carbon dioxide and methane trap the sun's radiation in the atmosphere, and these man-made greenhouse gases reflect more of that heat back to Earth, warming the planet. Pretty simple. Eventually, we reach a tipping point when positive feedback loops form—less ice to reflect sunlight, warm oceans that can no longer absorb carbon dioxide—and then we fry, existentially. So lose those gas stoves and carbon-spewing Suburbans.
Note that nearly half of incoming solar energy is not absorbed by Earth's surface.
But nothing is simple. What about negative feedback loops? Examples: human sweat and its cooling condensation or our irises dilating or constricting based on the amount of light coming in. Clouds, which can block the sun or trap its radiation, are rarely mentioned in climate talk.
Why? Because clouds are notoriously difficult to model in climate simulations. Steven Koonin, a New York University professor and author of “Unsettled,” tells me that today’s computing power can typically model the Earth’s atmosphere in grids 60 miles on a side. Pretty coarse. So, Mr. Koonin says, “the properties of clouds in climate models are often adjusted or ‘tuned’ to match observations.” Tuned!
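Koonin's grid figure is easy to sanity-check. The sketch below assumes a uniform grid of 60-mile squares covering the whole globe (a simplification; real model grids vary with latitude):

```python
# Rough check of Koonin's "60 miles on a side" figure, assuming a uniform
# grid of square cells covering the whole globe (a simplification).
EARTH_SURFACE_KM2 = 510.1e6      # total surface area of the Earth, km^2
CELL_SIDE_KM = 60 * 1.609344     # 60 miles in kilometres (~96.6 km)

cells = EARTH_SURFACE_KM2 / CELL_SIDE_KM**2
print(f"~{cells:,.0f} grid cells of {CELL_SIDE_KM:.1f} km per side")
# Individual cumulus clouds are on the order of 1 km across, so each cell
# must parameterize ("tune") the sub-grid cloud behaviour it cannot resolve.
```

On the order of 55,000 cells for the whole atmosphere at the surface: coarse indeed compared with the scale of actual clouds.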
Last month the coddling modelers at the United Nations’ World Meteorological Organization stated that “warming El Niño” and “human-induced climate change” mean there is a “66% likelihood that annual average global temperatures will exceed the threshold of 1.5 degrees Celsius above preindustrial levels by 2027.” Notice that El Niño is mentioned first.
Richard Lindzen, a professor at the Massachusetts Institute of Technology and lead author of an early Intergovernmental Panel on Climate Change report, told me, “Temperatures in the tropics remain relatively constant compared with changes in the tropics-to-pole temperatures. The tropics-polar difference is about 40 degrees Celsius today but was 20 degrees during the warm Eocene Epoch and 60 degrees during Ice Ages.” This difference has more to do with changes in the Earth’s rotation, like wobbling, than anything else. According to Mr. Lindzen, this effect is some 70 times as great as human-made greenhouse gases.
OK, back to clouds. Cumulus clouds, the puffy ones often called thunderclouds, are an important convection element, carrying heat from the Earth’s surface to the upper atmosphere. Above them are high-altitude cirrus clouds, which can reflect heat back toward the surface. A 2001 Lindzen paper, however, suggests that high-level cirrus clouds in the tropics dissipate as temperatures rise. These thinning cirrus clouds allow more heat to escape. It’s called the Iris Effect, like a temperature-controlled vent opener for an actual greenhouse so you don’t (existentially) fry your plants. Yes, Earth has a safety valve.
Mr. Lindzen says, “This more than offsets the effect of greenhouse gases.” As you can imagine, theories debunking the climate consensus are met with rebuttals and more papers. Often, Mr. Lindzen points out, critics, “to maintain the warming narrative, adjust their models, especially coverage and reflection or albedo of clouds in the tropics.” More tuning.
A 2021 paper co-authored by Mr. Lindzen shows strong support for an Iris Effect. Maybe Earth really was built by an engineer. Proof? None other than astronomer Carl Sagan described the Faint Young Sun Paradox: 2.5 billion years ago, the sun's energy was 30% less, but Earth's climate was basically the same as today. Cirrus clouds likely formed to trap heat, a closed Iris and a negative feedback loop at work.
Figure 2: At higher temperatures there are more thunderstorms over the ocean and the area without high level clouds (dry and clear) expands further and thus allows more heat to radiate off into space (strong OLR) than when temperatures are lower, i.e. when the iris is smaller. Source: Figure 1 from MS15.
In a 2015 Nature Geoscience paper, Thorsten Mauritsen and Bjorn Stevens at the Max Planck Institute for Meteorology reran climate models using the Iris Effect and found them better at modeling historic observations. No need for tuning. Wouldn't it be nice if the U.N. used realistic cloud and climate models?
Earth has warmed, but I’m convinced negative feedback loops will save us. Dismissing the Iris Effect or detuning it isn’t science. Sadly, climate science has morphed into climate rhetoric. And note, Treasury Secretary Janet Yellen explained in April that green spending “is, at its core, about turning the climate crisis into an economic opportunity.” Hmmm. “Catastrophic,” “existential” and “crisis” are cloudy thinking. Negative feedback is welcome. Dissenters from the catastrophe consensus on warming are worth listening to.
Footnote–Phanerozoic Temperatures
Maurice Lavigne commented that the best evidence of our self-regulating climate is found in the Phanerozoic temperature record. I had to find out what he meant, which led me to discover this:
Cosmic radiation and temperature through the Phanerozoic according to Nir Shaviv and Jan Veizer. The vertical axis on the left represents the temperature as deviations from present temperature. The vertical axis on the right shows the cosmic radiation as multiples of radiation today – today's radiation is set to 1. Note that the right scale is inverted so that strong radiation can be compared to low temperature. The red curve represents the temperature and the blue curve the radiation. Temperature and cosmic radiation appear to have a very good correlation. The horizontal axis represents time through the Phanerozoic's more than 500 million years. Note that the Carboniferous is divided into "Mississippian" and "Pennsylvanian"; that is an American custom, referring to different types of coal from the coal mines.
The image above comes from Christopher Scotese's PaleoMAP project, showing the dramatic temperature and climate shifts, hothouse to icehouse and everything in between. Finally, a graph showing these temperature cycles unrelated to CO2 concentrations.
There are various answers to the title question. IPCC doctrine asserts that not only does more CO2 induce warming, it also triggers a water vapor positive feedback that triples the warming. Many other scientists, including some skeptical of any climate “emergency,” agree some CO2 warming is likely, but doubt the positive feedback, with the possibility the sign is wrong. Still others point out that increases of CO2 lag temperature increases on all time scales, from ice core data to last month’s observations. CO2 can hardly be claimed to cause warming, when CO2 changes do not precede the effect. [See Temps Cause CO2 Changes, Not the Reverse. ]
Below is a post describing how CO2 warming is not only lacking, but more CO2 actually increases planetary cooling. The mathematical analysis reveals a fundamental error made in the past and only now subjected to correction.
In 1896, Svante Arrhenius proposed a model predicting that increased concentration of carbon dioxide and water vapour in the atmosphere would result in a warming of the planet. In his model, the warming effects of atmospheric carbon dioxide and water vapour in preventing heat flow from the Earth's surface (now known as the "Greenhouse Effect") are counteracted by a cooling effect where the same gasses are responsible for the radiation of heat to space from the atmosphere. His analysis found that there was a net warming effect and his model has remained the foundation of the Enhanced Greenhouse Effect—Global Warming hypothesis.
This paper attempts to quantify the parameters in his equations, but on evaluation his model cannot produce thermodynamic equilibrium. A modified model is proposed which reveals that increased atmospheric emissivity enhances the ability of the atmosphere to radiate heat to space, overcoming the warming effect and resulting in a net cooling of the planet. In consideration of this result, there is a need for greenhouse effect—global warming models to be revised.
1. Introduction
In 1896 Arrhenius proposed that changes in the levels of “carbonic acid” (carbon dioxide) in the atmosphere could substantially alter the surface temperature of the Earth. This has come to be known as the greenhouse effect. Arrhenius’ paper, “On the Influence of Carbonic Acid in the Air upon the Temperature of the Ground”, was published in Philosophical Magazine. Arrhenius concludes:
“If the quantity of carbonic acid in the air should sink to one-half its present percentage, the temperature would fall by about 4˚; a diminution to one-quarter would reduce the temperature by 8˚. On the other hand, any doubling of the percentage of carbon dioxide in the air would raise the temperature of the earth’s surface by 4˚; and if the carbon dioxide were increased fourfold, the temperature would rise by 8˚ ” [ 2 ].
It is interesting to note that Arrhenius considered this greenhouse effect a positive thing if we were to avoid the ice ages of the past. Nevertheless, Arrhenius’ theory has become the foundation of the enhanced greenhouse effect―global warming hypothesis in the 21st century. His model remains the basis for most modern energy equilibrium models.
2. Arrhenius’ Energy Equilibrium Model
Arrhenius proposed a two-part energy equilibrium model in which the atmosphere radiates the same amount of heat to space as it receives and, likewise, the ground transfers the same amount of heat to the atmosphere and to space as it receives. The model contains the following assumptions:
• Heat conducted from the center of the Earth is neglected.
• Heat flow by convection between the surface and the atmosphere and throughout the atmosphere remains constant.
• Cloud cover remains constant. This is questionable but allows the model to be quantified.
Part 1: Equilibrium of the Air
The balance of heat flow to and from the air (or atmosphere) has four components as shown in Figure 1. The arrow labelled S1 indicates the solar energy absorbed by the atmosphere. R indicates the infra-red radiation from the surface of the Earth to the atmosphere, M is the quantity of heat “conveyed” to the atmosphere by convection and Q1 represents heat loss from the atmosphere to space by radiation. All quantities are measured in terms of energy per unit area per unit time (W/m2).
Figure 1. Model of the energy balance of the atmosphere. The heat received by the atmosphere ( R+M+S1 ) equals the heat lost to space (Q1). In this single layer atmospheric model, the absorbing and emitting layers are one and the same.
Part 2: Thermal Equilibrium of the Ground
In the second part of his model, Arrhenius describes the heat flow equilibrium at the “ground” or surface of the Earth. There are four contributions to the surface heat flow as shown in Figure 2. S2 is the solar energy absorbed by the surface, R is the infra-red radiation emitted from the surface and transferred to the atmosphere, N is the heat conveyed to the atmosphere by convection and Q2 is the heat radiated to space from the surface. Note: Here Arrhenius uses the term N for the convective heat flow. It is equivalent to the term M used in the air equilibrium model.
Figure 2. The energy balance at the surface of the Earth. The energy received by the ground is equal to the energy lost.
3. Finding the Temperature of the Earth
Arrhenius combined these equations and, by eliminating the temperature of the atmosphere (which according to Arrhenius "has no considerable interest"), arrived at the following relationship:
ΔTg is the expected change in the temperature of the Earth for a change in atmospheric emissivity from ε1 to ε2. Arrhenius determined that the current transparency of the atmosphere was 0.31 and, therefore the emissivity/absorptivity ε1 = 0.69. The current mean temperature for the surface of the Earth can be assumed to be To = 288 K.
Figure 3. Arrhenius’ model is used to determine the mean surface temperature of the Earth as a function of atmospheric emissivity ε. For initial conditions, ε = 0.69 and the surface temperature is 288 K. An increase in atmospheric emissivity produces an increase in the surface temperature of the Earth.
Arrhenius estimated that a doubling of carbon dioxide concentration in the atmosphere would produce a change in emissivity from 0.69 to 0.78, raising the temperature of the surface by approximately 6 K. This value would be considered high by modern climate researchers; however, Arrhenius' model has become the foundation of the greenhouse-global warming theory today. Arrhenius made no attempt to quantify the specific heat flow values in his model. At the time of his paper there was little quantitative data available relating to heat flow for the Earth.
4. Evaluation of Arrhenius’ Model under Present Conditions
More recently, Kiehl and Trenberth (K & T) [ 3 ] and others have quantified the heat flow values used in Arrhenius’ model. K & T’s data are summarised in Figure 4.
The reflected solar radiation, which plays no part in the energy balance described in this model, is ignored. R is the net radiative transfer from the ground to the atmosphere derived from K & T’s diagram. The majority of the heat radiated to space originates from the atmosphere (Q1 > Q2). And the majority of the heat lost from the ground is by means of convection to the atmosphere (M > R + Q2).
Figure 4. Model of the mean energy budget of the earth as determined by Kiehl and Trenberth.
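The two-part balance can be checked arithmetically against the rounded Kiehl & Trenberth values quoted in this post (M = 102 W/m², Q2 = 40 W/m², total outgoing 235 W/m²). The R value below (26 W/m², net surface-to-atmosphere radiation) is my own reading of the K & T diagram, not a figure stated in the post:

```python
# Kiehl & Trenberth (1997) global-mean heat flows, W/m^2 (rounded).
# R = 26 is the net radiative transfer ground -> atmosphere (350 absorbed
# by the atmosphere minus 324 back radiation): my reading of the diagram.
S1 = 67    # solar absorbed by the atmosphere
S2 = 168   # solar absorbed by the surface
M = 102    # convection (thermals 24 + evapotranspiration 78)
R = 26     # net infra-red transfer, surface to atmosphere
Q2 = 40    # surface radiation escaping directly to space (the "window")

Q1 = R + M + S1          # Part 1: the atmosphere radiates what it receives
total_out = Q1 + Q2      # total outgoing long-wave radiation
total_in = S1 + S2       # total solar absorbed
print(Q1, total_out, total_in)   # 195 235 235
```

The numbers close exactly, and confirm both inequalities stated above: Q1 > Q2 and M > R + Q2.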
Equation (5): Q2 = (1 − ε)σνTg^4
Substituting ε = 0.567, ν = 1.0 and Tg = 288 K we get: Q2=149.2 W/m2
Using Arrhenius' value of 0.69 for the atmospheric emissivity, Q2 = 120.9 W/m2.
Both values are significantly more than the 40 W/m2 determined by K & T.
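As a quick sketch, Equation (5) can be evaluated directly. The ε = 0.69 case reproduces the 120.9 W/m² quoted above; the ε = 0.567 figure of 149.2 W/m² evidently depends on values of ν and the temperature used in the original paper that are not reproduced in this post, so only the Arrhenius value is checked here:

```python
SIGMA = 5.67e-8   # Stefan-Boltzmann constant, W/m^2/K^4
T_G = 288.0       # assumed mean surface temperature, K

def q2(emissivity, nu=1.0, t=T_G):
    """Surface radiation escaping to space per Equation (5)."""
    return (1 - emissivity) * SIGMA * nu * t**4

print(round(q2(0.69), 1))   # ~120.9 W/m^2, matching the text
# Either quoted value dwarfs the ~40 W/m^2 "atmospheric window" flux in K & T.
```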
The equation will not balance; something is clearly wrong.
Figure 5 illustrates the problem.
Equation (5) is based on the Stefan-Boltzmann law, an empirical relationship describing the amount of radiation from a hot surface passing through a vacuum to a region of space at a temperature of absolute zero. This is clearly not the case for radiation passing through the Earth's atmosphere, and as a result the amount of heat lost by radiation has been grossly overestimated.
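The point can be illustrated with the two-body form of the Stefan-Boltzmann law, in which net exchange depends on the difference of the fourth powers of the two temperatures. The 255 K sink temperature below is an assumed illustrative value for an effective radiating atmosphere, not a figure from the post:

```python
SIGMA = 5.67e-8   # Stefan-Boltzmann constant, W/m^2/K^4

def net_flux(t_hot, t_cold):
    """Net radiative exchange between two black surfaces, W/m^2."""
    return SIGMA * (t_hot**4 - t_cold**4)

# Radiating to absolute zero (the naive single-body picture) versus
# radiating to an assumed 255 K atmosphere: the net loss shrinks sharply.
to_space_at_0K = net_flux(288.0, 0.0)     # ~390 W/m^2
to_atmosphere = net_flux(288.0, 255.0)    # ~150 W/m^2
print(round(to_space_at_0K), round(to_atmosphere))
```

Replacing the absolute-zero sink with any realistic finite temperature cuts the computed radiative loss dramatically, which is the overestimate being described.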
No amount of adjusting parameters will allow this relationship to produce sensible quantities and the required net heat flow of 40 W/m2.
This error affects the equilibrium heat flow values in Arrhenius’ model and the model is not able to produce a reasonable approximation of present day conditions as shown in Table 1. In particular, the convective heat flow takes on very different values from the two parts of the model. The values M and N in the table should be equivalent.
5. A New Energy Equilibrium Model
A modified model is proposed which will determine the change in surface temperature of the Earth caused by a change in the emissivity of the atmosphere (as would occur when greenhouse gas concentrations change). The model incorporates the following ideas:
1) The total heat radiated from the Earth (Q1 + Q2) will remain constant and equal to the total solar radiation absorbed by the Earth (S1 + S2).
2) Convective heat flow M remains constant. Convective heat flow between two regions is dependent on their temperature difference, as expressed by Newton's Law of Cooling. The temperature difference between the atmosphere and the ground is maintained at 8.9 K (see Equation 7(a)). M = 102 W/m2 (K & T).
3) A surface temperature of 288 K and an atmospheric emissivity of 0.567 (Equation (7b)) is assumed for initial or present conditions.
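Assumption 2 can be made concrete with a short calculation. The convective coefficient h derived below is my own arithmetic from the stated M and temperature difference, not a value given in the post:

```python
M = 102.0        # convective heat flow, W/m^2 (Kiehl & Trenberth)
DELTA_T = 8.9    # assumed surface-minus-atmosphere temperature gap, K

# Newton's Law of Cooling: M = h * (Tg - Ta), so the implied coefficient is
h = M / DELTA_T
print(f"h = {h:.2f} W/m^2/K")   # ~11.46

# Holding DELTA_T fixed is what keeps M constant as the whole temperature
# profile shifts up or down together.
```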
Equation (9) represents the new model relating the emissivity of the atmosphere ε to the surface temperature Tg. Results from this model are shown in Table 2. The table shows the individual heat flow quantities and the temperature of the surface of the Earth that is required to maintain equilibrium:
The table shows that as the value of the atmospheric emissivity ε is increased, less heat flows from the Earth's surface to space: Q2 decreases. This is what would be expected. As well, more heat is radiated to space from the atmosphere: Q1 increases. This is also expected. The total energy radiated to space is Q1 + Q2 = 235 W/m2. A plot of the resultant surface temperature Tg versus the atmospheric emissivity ε is shown in Figure 6 below.
Figure 6. Plot of the Earth’s mean surface temperature as a function of the atmospheric emissivity. This model predicts that the temperature of the Earth will decrease as the emissivity of the atmosphere increases.
6. Conclusion
Arrhenius identified the fact that the emissivity/absorptivity of the atmosphere increased with increasing greenhouse gas concentrations and this would affect the temperature of the Earth. He understood that infra-red active gases in the atmosphere contribute both to the absorption of radiation from the Earth’s surface and to the emission of radiation to space from the atmosphere. These were competing processes; one trapped heat, warming the Earth; the other released heat, cooling the Earth. He derived a relationship between the surface temperature and the emissivity of the atmosphere and deduced that an increase in emissivity led to an increase in the surface temperature of the Earth.
However, his model is unable to produce sensible results for the heat flow quantities as determined by K & T and others. In particular, his model, like all similar recent models, grossly exaggerates the quantity of radiative heat flow from the Earth's surface to space. A new energy equilibrium model has been proposed which is consistent with the measured heat flow quantities and maintains thermal equilibrium. This model predicts the changes in the heat flow quantities in response to changes in atmospheric emissivity and reveals that Arrhenius' prediction is reversed: increasing atmospheric emissivity due to increased greenhouse gas concentrations will have a net cooling effect.
It is therefore proposed by the author that any attempt to curtail emissions of CO2 will have no effect in curbing global warming.
Summary:
If Stannard is right, then the unthinkable, inconvenient truth is: more CO2 cools, rather than warms, the planet. As noted before, we have enjoyed a modern warming period with the recovery of temperatures ending the Little Ice Age. But cold is the greater threat to human life and prosperity, as well as to the biosphere. Society's priorities should be to ensure reliable, affordable energy and robust infrastructure to meet the demands of future cooling, which will eventually bring down CO2 concentrations in its wake.
Footnote:
A comment below refers to the cartoon image at the top, which was an older version of the K & T diagram. The more recent version was used by the author and has slightly different numbers. Below is the actual model he analyzed:
I agree that these energy budgets oversimplify the real world, and the author's intention is not to correct the details, but to show that the models fail when taken at face value. He is focusing on the imbalance arising from applying the Stefan-Boltzmann law to a planet with an atmosphere. As noted below, there are other challenging issues, such as using the average frequency of visible light for calculating W/m^2, which is not realistic for Earth's LW radiation.
In the end the two (satellite) series were similar but RSS has consistently exhibited more warming than UAH. Then a little more than a decade ago, the group at NOAA headed by Zou produced a new data product called STAR (Satellite Applications and Research). They used the same underlying microwave retrievals but produced a temperature record showing much more warming than either UAH or RSS, as well as all the weather balloon records. It came close to validating the climate models, although in my paper with Christy we included the STAR data in the satellite average and the models still ran too hot. Nonetheless it was possible to point to the coolest of the models and compare them to the STAR data and find a match, which was a lifeline for those arguing that climate models are within the uncertainty range of the data.
Until now. In their new paper Zou and his co-authors rebuilt the STAR series based on a new empirical method for removing time-of-day observation drift and a more stable method of merging satellite records. Now STAR agrees with the UAH series very closely — in fact it has a slightly smaller warming trend. The old STAR series had a mid-troposphere warming trend of 0.16 degrees Celsius per decade, but it’s now 0.09 degrees per decade, compared to 0.1 in UAH and 0.14 in RSS. For the troposphere as a whole they estimate a warming trend of 0.14 C/decade.
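Trends quoted in degrees per decade, like the 0.09 C/decade above, are ordinary least-squares slopes fitted to monthly anomaly series. A minimal sketch with synthetic data (not real STAR or UAH values):

```python
def trend_per_decade(years, values):
    """Ordinary least-squares slope of values vs. time, in units per decade."""
    n = len(years)
    mean_t = sum(years) / n
    mean_v = sum(values) / n
    num = sum((t - mean_t) * (v - mean_v) for t, v in zip(years, values))
    den = sum((t - mean_t) ** 2 for t in years)
    return 10.0 * num / den   # per year -> per decade

# Synthetic monthly anomalies, 1979-2023, rising at exactly 0.09 C/decade
years = [1979 + m / 12 for m in range(45 * 12)]
anoms = [0.009 * (t - 1979) for t in years]
print(round(trend_per_decade(years, anoms), 3))   # 0.09
```

Real series differ from this sketch only in the noise around the line; the small differences between the 0.09, 0.1 and 0.14 C/decade figures quoted above come entirely from how each group constructs its anomaly series before this fit is applied.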
Figure 14 Global mean Temperature Total Troposphere (TTT, TMT adjusted by TLS) time series (blue lines) and its smoothed time series (red lines). The locally weighted regression method (Cleveland, 1979) is used for the smoothing. Both TMT and TLS for TTT generation are from STAR V5.0.
In 1896, Svante Arrhenius proposed a model predicting that increased concentration of carbon dioxide and water vapour in the atmosphere would result in a warming of the planet. In his model, the warming effects of atmospheric carbon dioxide and water vapour in preventing heat flow from the Earth’ s surface (now known as the “Greenhouse Effect”) are counteracted by a cooling effect where the same gasses are responsiblefor the radiation of heat to space from the atmosphere. His analysis found that there was a net warming effect and his model has remained the foundation of the Enhanced Greenhouse Effect—Global Warming hypothesis.
This paper attempts to quantify the parameters in his equations but on evaluation his model cannot produce thermodynamic equilibrium. A modified model is proposed which reveals that increased atmospheric emissivity enhances the ability of the atmosphere to radiate heat to space overcoming the cooling effect resulting in a net cooling of the planet. In consideration of this result, there is a need for greenhouse effect—global warming models to be revised.
1. Introduction
In 1896 Arrhenius proposed that changes in the levels of “carbonic acid” (carbon dioxide) in the atmosphere could substantially alter the surface temperature of the Earth. This has come to be known as the greenhouse effect. Arrhenius’ paper, “On the Influence of Carbonic Acid in the Air upon the Temperature of the Ground”, was published in Philosophical Magazine. Arrhenius concludes:
“If the quantity of carbonic acid in the air should sink to one-half its present percentage, the temperature would fall by about 4˚; a diminution to one-quarter would reduce the temperature by 8˚. On the other hand, any doubling of the percentage of carbon dioxide in the air would raise the temperature of the earth’s surface by 4˚; and if the carbon dioxide were increased fourfold, the temperature would rise by 8˚ ” [ 2 ].
It is interesting to note that Arrhenius considered this greenhouse effect a positive thing if we were to avoid the ice ages of the past. Nevertheless, Arrhenius’ theory has become the foundation of the enhanced greenhouse effect―global warming hypothesis in the 21st century. His model remains the basis for most modern energy equilibrium models.
2. Arrhenius’ Energy Equilibrium Model
Arrhenius’ proposed a two-part energy equilibrium model in which the atmosphere radiates the same amount of heat to space as it receives and, likewise, the ground transfers the same amount of heat to the atmosphere and to space as it receives. The model contains the following assumptions:
• Heat conducted from the center of the Earth is neglected.
• Heat flow by convection between the surface and the atmosphere and throughout the atmosphere remains constant.
• Cloud cover remains constant. This is questionable but allows the model to be quantified.
Part 1: Equilibrium of the Air
The balance of heat flow to and from the air (or atmosphere) has four components as shown in Figure 1. The arrow labelled S1 indicates the solar energy absorbed by the atmosphere. R indicates the infra-red radiation from the surface of the Earth to the atmosphere, M is the quantity of heat “conveyed” to the atmosphere by convection and Q1 represents heat loss from the atmosphere to space by radiation. All quantities are measured in terms of energy per unit area per unit time (W/m2).
Figure 1. Model of the energy balance of the atmosphere. The heat received by the atmosphere ( R+M+S1 ) equals the heat lost to space (Q1). In this single layer atmospheric model, the absorbing and emitting layers are one and the same.
Part 2: Thermal Equilibrium of the Ground
In the second part of his model, Arrhenius describes the heat flow equilibrium at the “ground” or surface of the Earth. There are four contributions to the surface heat flow as shown in Figure 2. S2 is the solar energy absorbed by the surface, R is the infra-red radiation emitted from the surface and transferred to the atmosphere, N is the heat conveyed to the atmosphere by convection and Q2 is the heat radiated to space from the surface. Note: Here Arrhenius uses the term N for the convective heat flow. It is equivalent to the term M used in the air equilibrium model.
Figure 2. The energy balance at the surface of the Earth. The energy received by the ground is equal to the energy lost.
3. Finding the Temperature of the Earth
Arrhenius combined these equations and, by eliminating the temperature of the atmosphere which according to Arrhenius “has no considerable interest”, he arrived at the following relationship:
ΔTg is the expected change in the temperature of the Earth for a change in atmospheric emissivity from ε1 to ε2. Arrhenius determined that the current transparency of the atmosphere was 0.31 and, therefore the emissivity/absorptivity ε1 = 0.69. The current mean temperature for the surface of the Earth can be assumed to be To = 288 K.
Figure 3. Arrhenius’ model is used to determine the mean surface temperature of the Earth as a function of atmospheric emissivity ε. For initial conditions, ε = 0.69 and the surface temperature is 288 K. An increase in atmospheric emissivity produces an increase in the surface temperature of the Earth.
Arrhenius estimated that a doubling of carbon dioxide concentration in the atmosphere would produce a change in emissivity from 0.69 to 0.78 raising the temperature of the surface by approximately 6 K. This value would be considered high by modern climate researchers; however, Arrhenius’ modelhas become the foundation of the greenhouse-global warming theory today. Arrhenius made no attempt to quantify the specific heat flow values in his model. At the time of his paper there was little quantitative data available relating to heat flow for the Earth.
4. Evaluation of Arrhenius’ Model under Present Conditions
More recently, Kiehl and Trenberth (K & T) [ 3 ] and others have quantified the heat flow values used in Arrhenius’ model. K & T’s data are summarised in Figure 4.
The reflected solar radiation, which plays no part in the energy balance described in this model, is ignored. R is the net radiative transfer from the ground to the atmosphere derived from K & T’s diagram. The majority of the heat radiated to space originates from the atmosphere (Q1 > Q2). And the majority of the heat lost from the ground is by means of convection to the atmosphere (M > R + Q2).
Figure 4. Model of the mean energy budget of the earth as determined by Kiehl and Trenberth.
Q2=(1−ε)σνT4e(5)
Substituting ε = 0.567, ν = 1.0 and Tg = 288 K we get: Q2=149.2 W/m2
Using Arrhenius value of 0.69 for the atmospheric emissivity Q2 = 120.9 W/m2.
Both values are significantly more than the 40 W/m2 determined by K & T.
The equation will not balance, something is clearly wrong.
Figure 5 illustrates the problem.
Equation (5) is based on the Stefan-Boltzmann law which is an empirical relationship which describes the amount of radiation from a hot surface passing through a vacuum to a region of space at a temperature of absolute zero. This is clearly not the case for radiation passing through the Earth’s atmosphere and as a result the amount of heat lost by radiation has been grossly overestimated.
No amount of adjusting parameters will allow this relationship to produce
sensible quantities and the required net heat flow of 40 W/m2.
This error affects the equilibrium heat flow values in Arrhenius’ model and the model is not able to produce a reasonable approximation of present day conditions as shown in Table 1. In particular, the convective heat flow takes on very different values from the two parts of the model. The values M and N in the table should be equivalent.
5. A New Energy Equilibrium Model
A modified model is proposed which will determine the change in surface temperature of the Earth caused by a change in the emissivity of the atmosphere (as would occur when greenhouse gas concentrations change). The model incorporates the following ideas:
1) The total heat radiated from the Earth (Q1 + Q2) will remain constant and equal to the total solar radiation absorbed by the Earth (S1 + S2).
2) Convective heat flow M remains constant. Convective heat flow between two regions is dependent on their temperature difference, as expressed by Newton's law of cooling. The temperature difference between the atmosphere and the ground is maintained at 8.9 K (see Equation (7a)). M = 102 W/m2 (K & T).
3) A surface temperature of 288 K and an atmospheric emissivity of 0.567 (Equation (7b)) is assumed for initial or present conditions.
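Assumption 2 can be illustrated with Newton's law of cooling. A minimal sketch (the heat-transfer coefficient h is implied by the stated values, not given in the text):

```python
M = 102.0      # convective heat flow, W/m^2 (K & T)
DELTA_T = 8.9  # fixed ground-atmosphere temperature difference, K

# Newton's law of cooling: M = h * (Tg - Ta).
# With both M and the temperature difference held fixed, the implied
# heat-transfer coefficient is constant:
h = M / DELTA_T
print(round(h, 2))  # 11.46 W m^-2 K^-1

def convective_flow(t_ground, t_atmosphere):
    """Convective heat flow with the model's fixed coefficient."""
    return h * (t_ground - t_atmosphere)

# Holding the difference at 8.9 K keeps M at 102 W/m^2 for any Tg:
print(round(convective_flow(288.0, 288.0 - DELTA_T), 1))  # 102.0
```

This is why the model can hold M constant while the surface temperature varies: only the difference Tg − Ta enters the convective term.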
Equation (9) represents the new model relating the emissivity of the atmosphere ε to the surface temperature Tg. Results from this model are shown in Table 2. The table shows the individual heat flow quantities and the temperature of the surface of the Earth that is required to maintain equilibrium:
The table shows that as the value of the atmospheric emissivity ε is increased, less heat flows from the Earth's surface to space: Q2 decreases. This is what would be expected. As well, more heat is radiated to space from the atmosphere: Q1 increases. This is also expected. The total energy radiated to space is Q1 + Q2 = 235 W/m2. A plot of the resultant surface temperature Tg versus the atmospheric emissivity ε is shown below in Figure 6.
Figure 6. Plot of the Earth’s mean surface temperature as a function of the atmospheric emissivity. This model predicts that the temperature of the Earth will decrease as the emissivity of the atmosphere increases.
6. Conclusion
Arrhenius identified the fact that the emissivity/absorptivity of the atmosphere increases with increasing greenhouse gas concentrations and that this would affect the temperature of the Earth. He understood that infra-red active gases in the atmosphere contribute both to the absorption of radiation from the Earth's surface and to the emission of radiation to space from the atmosphere. These are competing processes: one traps heat, warming the Earth; the other releases heat, cooling the Earth. He derived a relationship between the surface temperature and the emissivity of the atmosphere and deduced that an increase in emissivity led to an increase in the surface temperature of the Earth.
However, his model is unable to produce sensible results for the heat flow quantities as determined by K & T and others. In particular, his model, and all similar recent models, grossly exaggerates the quantity of radiative heat flow from the Earth's surface to space. A new energy equilibrium model has been proposed which is consistent with the measured heat flow quantities and maintains thermal equilibrium. This model predicts the changes in the heat flow quantities in response to changes in atmospheric emissivity and reveals that Arrhenius' prediction is reversed: increasing atmospheric emissivity due to increased greenhouse gas concentrations will have a net cooling effect.
It is therefore proposed by the author that any attempt to curtail emissions of CO2 will have no effect in curbing global warming.
BizNews TV interviewed Dr. John Christy last week as shown in the video above. For those who prefer to read what was said, I provide a lightly edited transcript below in italics with my bolds and added images. BN refers to questions from the interviewer and JC refers to responses from Christy.
BN: Joining me today is Dr John Christy, climate scientist at the University of Alabama in Huntsville and Alabama State climatologist since 2000. Dr Christy, thank you so much for your time. You’ve described yourself as a climate nerd and apparently you were 12 when your unwavering desire to understand weather and climate started. Why climate?
JC: Well, I think it was more like 10 years old when I was fascinated with some unusual weather events that happened in my home area of California. So that began a fascination for me, and I wanted to try to figure out why things happen the way they did. Why did one year have more rain (that's a big story in California, does it rain or not) and another year be very dry? Why were the mountains covered with snow in one April and not another? In fact I have here April 1967 that I recorded as a teenager. This has been a passion of mine forever, and as it turns out, now that I'm as old as I am, I still can't figure out why one year is wetter than the other.
BN: Well, you seem to be getting a lot closer than most people would. I think it was in 1989 when you and NASA scientist Roy Spencer pioneered a new method of measuring and monitoring temperature records via satellites, in use from that time up until now. Why did you feel you needed to develop a new method to begin with, and how did it differ in terms of the readings of established methods at the time?
JC: Well, the issue was we only had surface temperature measurements, and they are scattered over the world. They don't cover much of the world at all, actually: mainly just the land regions and scattered places on the ocean. And the measurement itself is not that robust. The stations move, the instruments changed through time, and so it's a very difficult thing to detect. In fact a small change in the area right around the station can really affect the temperature of that station.
So Roy Spencer and Dick Mcknight came up with an idea about looking at some satellite data. This is the temperature of the deep layer of the atmosphere, from the surface to about 8,000 meters. If we could see the temperature of that bulk atmospheric layer, we would have a very robust measurement, and the microwave sensors on the NOAA polar-orbiting satellites did precisely that. And so we were the first to really put those data into a simple data set that had the temperature, month by month, since about November 1978.
BN: Okay, and how do readings differ from the climate science at the time?
JC: First of all, they differed because we had a global measurement. We really did see the entire globe from the satellite, because the orbit of that satellite is polar and the Earth spins around underneath. Every day we have 14 orbits as the Earth spins around underneath, so we see the entire planet. That's one big difference.
The other one is that the actual result did not show as much warming as what the surface temperatures showed. And we're doing even more work now to demonstrate that a lot of the surface stations are spuriously affected by the growth of infrastructure around them, so there's kind of a false warming signal there. You don't get the background climate signal with surface temperature measurements; you get a bit of what's happening in the local area.
BN: Your research has to do with testing the theories posited by climate model forecasts, so you don’t actually do any modeling yourself. But what criteria do you use to test these theories?
JC: That’s a very good question, because in climate you hear all kinds of claims and theories being thrown out there. For a lot of people who don’t really understand the climate system it’s a quick and easy answer just to say: Oh humans caused that, you know it’s global warming, something like that is the answer. When in fact the climate system is very complex, so we look at these claims and Roy Spencer and I are just a few of the people around the world that actually build data sets from scratch. I mean we start with the photon counts of the satellite radiometers, or the original paper records of 19th century East Africa, for example. We do all this from scratch so that we can test the claims that people make.
Once we build the data set, we test it to make sure we have confidence in it, that it's telling us a truth about what's happening over time. And then we check the claim. So for example, we make surface temperature data sets that go back to the 19th century. Someone will say: well, this is now the hottest decade, or more records happened this decade than in the past. And we can demonstrate, in the United States especially, that that's not the case. You would need to go back to the 1930s if you want to see the real record temperatures that occurred at that time.
And for climate models we like to use the satellite data set, since it's a robust deep-layer measurement; it's measuring lots of the mass of the atmosphere, the heat content really. That's a direct value we can get out of the climate model, so we are comparing apples to apples: what the satellite produces and observes is what the climate model also generates, and we can compare them one to one.
In a paper Ross McKitrick and I wrote a couple of years ago, we found that 100 of the climate models were warming the atmosphere faster than it was actually warming. So that's not a good result if you're trying to test your theory of how the climate works with the model against what actually happens.
BN: How much do you think the deeply over-exaggerated predictions of Doom and Gloom have to do with the methodology substantiated by confirmation bias?
JC: That’s an interesting question because we’re a bit confused as well. We have been publishing these papers since 1994 that have demonstrated models warm too much relative to the actual climate, and yet we don’t see an improvement in climate models and trying to match reality with their model output. Now I think a number of modelers understand that: yes the there is a difference there and the models are just too hot. But what is the process that’s gone wrong in the models is a difficult question for these folks. Because models have hundreds of places you can turn a little knob, change a coefficient, and that will change the result. It’s not a physical thing, it’s not based on physics; it’s the model parameterizations— the little pieces of the model that try to represent an actual part of the atmosphere. For example, when do clouds form? That’s a pretty big question. How much humidity in the atmosphere is required to create a cloud? Because once the cloud forms it reflects sunlight and cools the Earth. So that’s it that’s one of the big questions.
So in testing the models we like to use the bulk atmospheric temperature; it's a very direct measurement that models produce, and so we can then say there's a problem here with climate models.
BN: To what degree did your observation on data differ from their forecasts?
Generally it’s about a factor of two. At times it’s been more, but on average the latest models (CMIP6) for the Deep layer of the atmosphere are warming about twice too fast, and that’s a real problem. I think when now we’re looking at over 40 years with which we can test these models, and they’re already that far off.
Figure 8: Warming in the tropical troposphere according to the CMIP6 models. Trends 1979–2014 (except the rightmost model, which is to 2007), for 20°N–20°S, 300–200 hPa.
So we should not use them to tell us what's going to happen in the future, since they haven't even gotten us to the right place in the last 40 years.
BN: Given that your real-world data refuted the forecasts every time for decades, why then (and I recognize that this is conjecture) are, let's say, 97 or 99% of scientists so firmly behind the climate crisis narrative?
JC: Yeah I don’t know how many are really fully behind that crisis climate narrative. I saw a recent survey where about 55 percent might have been of the opinion that the climate warming was going to be a problem. Warming itself is not a problem: I mean the Earth has been warmer in the past than it is today, so the Earth has survived that before. And I don’t think putting extra plant food in the atmosphere is going to be a real problem for us to overcome. I do think the world is going to warm some from the extra CO2, but there are a lot of benefits that come from that.
You’re you’re dealing with a question about human nature and funding and so on. I think we all know that the more dramatic the story is, especially in the political world, the more attention you will get. Therefore your work can be highlighted and that helps you with funding and attention and so on. And part of what’s going on here. Then there’s the other real stronger political narrative: that there are groups and in the world political Elite that like to have a narrative that scares people, so that they can then offer a solution. And so it’s a simple way to say: elect me to this office and I will be able to solve this problem.
Then you are facing people like us who actually produce the data, and we can report on extreme events and so on and say: well, you know, there isn't any change in these extreme events, so what's the problem you're trying to solve? And then we look at the other side of that issue and say: okay, if you actually implement this regulation or this law, it's not going to make any difference on the climate end. So you kind of lose on two ends of that story.
BN: You’re a distinguished professor of atmospheric science and also director of Earth Sciences also at Alabama in Huntsville, these are prominent positions. How have you managed to hold on to them with climate views that are so divergent from the norm?
JC: Well, the environment in the state of Alabama is different than what you have in Washington. I'm from California, way across the country, and I tell people that one of the reasons I like to live in Alabama is that in Alabama you can call a duck a duck; you can just be direct about what's going on, and you're not going to be given the evil eye or cast out. As it is now in the climate establishment, saying that all the models are warming too much and that there is no disaster arising causes great consternation, because the narrative has been built over the last 30 years that we are supposed to be in a catastrophe. To come out and say: well, here's the data, and the data show there is no catastrophe looming; we're doing fine, the world is doing fine, human life is thriving in the places it's allowed to. So what's the problem here you're trying to solve?
BN: Did you ever manage to get your findings to policy makers who have influence to do something about it?
[An important proof against the CO2 global warming claim was included in John Christy’s testimony 29 March 2017 at the House Committee on Science, Space and Technology. The text and diagram below are from that document which can be accessed here.
IPCC Assessment Reports show that the IPCC climate models performed best versus observations when they did not include extra GHGs and this result can be demonstrated with a statistical model as well.
Figure 5. Simplification of IPCC AR5 shown above in Fig. 4. The colored lines represent the range of results for the models and observations. The trends here represent trends at different levels of the tropical atmosphere from the surface up to 50,000 ft. The gray lines are the bounds for the range of observations, the blue for the range of IPCC model results without extra GHGs, and the red for IPCC model results with extra GHGs. The key point displayed is the lack of overlap between the GHG model results (red) and the observations (gray). The non-GHG model runs (blue) overlap the observations almost completely.
JC: Well, I’ve been to Congress 20 times, testified before hearings. So the information is there and available, but I can’t force Congress to make legislation that matches the real world. The Congressional world is a political world, and things happen there that are kind of out of my reach and ability to influence.
BN: According to your research, you've also said that the climate models underestimate negative feedback loops. Can you explain to me what this mechanism is, and the effect of underestimating these loops on understanding climate for what it is?
JC: That’s a very complicated issue, and I don’t understand it all for sure, but we can say just from some general results and general observation what’s going on here. One of those General observations is that when a climate model warms up the atmosphere one degree Kelvin, it sends out 1.4 watts per metersquared so the air atmosphere warms up and energy escapes to space 1.4 watts. When we use actual observations of the atmosphere, when the real atmosphere warms up one Kelvin it sends out 2.6 watts of energy. That’s almost twice as much so that tells you right there that the climate models are retaining or holding on to energy that the real world allows to escape when it warms. So that’s a negative feedback: as the atmosphere warms for a bit the real real world knows how to let that heat escape; whereas the models don’t and they retain it and that’s why they keep building up heat over time.
BN: What other variables do you look at?
JC: As a state climatologist I deal with a lot of very practical questions that people ask. They want to know: is it getting hotter, or is it getting wetter? Are rain storms getting heavier, and are the hurricanes getting worse, and so on. I actually wrote a booklet called A Practical Guide to Climate Change in Alabama, but it covers a lot of the country as well. It's free; you can download it from the first page of my website, The Alabama State Climatologist. I answer a lot of these very practical questions, and as we go down the list: droughts are not getting worse over time; heavy rainstorms are not getting worse over time, here in the Southeast in fact (Ross McKitrick and I also had a paper where we went back to 1878 and demonstrated that the trends are not significant); hurricanes are not going up at all, and in fact 2022 is going to be one of the quietest we've had in a while; tornadoes are not becoming more numerous; heat waves are not becoming worse. So one after another, the weather events that people really care about, the ones that could cause problems or catastrophe if they changed, we find are not changing; they've always been around. [Title below in red is link to Christy's booklet.]
BN: Some of the biggest critics of climate skeptics say: okay, yeah, it's not fair, one extreme weather event doesn't say much. But they argue that there are very particular trends that have been on the increase recently. Have you observed this at all?
JC: That’s exactly the kind of thing we build data sets to discover. For example there is a story, and there is some evidence for it, that in the last hundred years there’s been an increase in in heavy rain events in part of our country, not all of it just part of the country. So I built a data set that went back in fact back to the 1860s. And we looked at that very carefully, and found that when you go back far enough, there were a lot of heavy events back then. And so over the long time period of 140 years or more we don’t see an upward trend. It’s unusual in that sample of time 140 years that we don’t see a change in those kind of events. So that’s why I think it has great value to build these data sets so you can specifically answer the question and the claim that is being made
One of the worst ones was made by the New York Times when they were talking about how many record high temperatures occurred in a recent heat wave around the country. So I looked at that carefully, and they were allowing stations to be included that only had 30 years or even less than 30 years of data. Some had a hundred years, but a lot of them just had 30 years. Well, when you become very systematic, you say: I'm only going to allow stations that have a hundred years, so that every station that measured in 2022 can be compared with the entire time series. Then their story falls apart, because the 1930s and the 1950s were so hot in our country that they still hold the records for the number of high temperature events.
The scary thing for me is that as much as it completely falls apart, and there's no logic to it, it still stands firmly as what most people believe.
You have to credit those in the climate establishment and the media, or whoever is behind all this: they have been successful in scaring people about the climate, because now you find that even in grade school textbooks. Almost every news story that comes out (and this is where this establishment is very good) they make sure has some kind of line in it about climate change. They don't ever go back and talk to someone who actually builds these data sets, who would say: is that really the worst it's been? The worst was 120 years ago. They just make those claims.
Other than the fact that sea level is rising a bit, the extreme events are just not there to really cause problems now. We do have a problem of greater damages occurring because of extreme events, but mainly because we've just built so much more stuff and placed it in harm's way. Our coastlines are crowded with condominiums, entertainment parks, retirement villages, and so on. There are so many more of them that when a hurricane does come, it's going to wipe out a lot more, and so the absolute value of those damages has gone up. But the number of hurricanes, their strength and so on, the background climate, has not caused that problem. It's just that we like to build things in places that are dangerous.
We have records of sea level rise, and it's on the order of about an inch per decade, except in places where the land is sinking. You can find that on the Louisiana Gulf Coast and places like that, but otherwise it's about an inch per decade. I tell folks that an inch per decade, two and a half centimeters a decade, is not your problem. It's 10 feet in six hours from the next hurricane that's your problem. If you can withstand a rise of sea level of 10 feet in six hours, then you're probably going to be okay. But if you can't, then a hurricane can really cause problems, and so we just have more exposure to that kind of situation now than we've had before.
BN: What about the trend with sea level rise? Should we be worried about future generations having to deal with issues that might not affect us in our lifetime but will eventually threaten their lives?
JC: I think your listeners would need to understand that sea level is a dynamic variable: it goes up, it goes down. It has been over a hundred meters lower than today just in the last 25,000 years, and there was a period from about 15,000 years ago to 8,000 years ago where the sea level rose about 12 centimeters per decade for seven thousand years. That's a lot more than the two and a half centimeters a decade it's doing now, so the world has managed to deal with rising sea levels before. If we go back to the last warm period, about 130,000 years ago, the sea level then was higher than it is now by about five meters or so. So just naturally we would expect at least another five meters of sea level rise; it won't happen tomorrow, it won't happen this century. But slowly it will likely continue to rise, and so that should be placed in your thinking if you're building a dock, for say a military port, or something you want to last a long time. Put a cushion in there, a way to handle another half meter of sea level rise in the next hundred years, and you should be okay.
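The rates Christy quotes can be compared directly; a minimal sketch (unit conversion only, using his figures from above):

```python
CM_PER_INCH = 2.54

current_rate = 1.0 * CM_PER_INCH  # ~1 inch per decade, in cm per decade
postglacial_rate = 12.0           # cm per decade, ~15,000 to 8,000 years ago

print(round(current_rate, 2))                     # 2.54 cm/decade
print(round(postglacial_rate / current_rate, 1))  # 4.7x today's rate

# Sustained for 7,000 years (700 decades), the post-glacial rate adds up:
total_rise_m = postglacial_rate * 700 / 100  # cm converted to meters
print(total_rise_m)  # 84.0 m, the right order for "over a hundred meters"
```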
BN: About your temperature records: how much has the Earth warmed, let's say over the last 40 years?
JC: Yes. With this November we finished 43 years of measurements. In that time the temperature has risen half a degree Celsius. And you might want to look at other things about the world. World agricultural production has expanded tremendously. Nations are now exporting grain more than they had before, because people are pretty smart and figure out how to do things better all the time. Growing food is one thing they figured out how to do better as time passed, so the climate warming of a half degree has not caused a major catastrophe at all. Wealth has increased around the planet. Now some governments are trying to prevent you from growing your wealth, but that's a hard thing to stop: people like to have food, they like to have conveniences in their lives, and it's hard to pass laws that say you can't enjoy life the way you want to.
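A quick check of the implied trend (assuming the half degree is spread evenly over the 43-year record):

```python
warming_c = 0.5  # total warming over the satellite record, deg C
years = 43.0     # November 1978 through the record's 43rd year

trend_per_decade = warming_c / years * 10.0
print(round(trend_per_decade, 2))  # 0.12 deg C per decade
```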
BN: How much of the warming are you reliably able to say is a result of human activity?
JC: Okay. The answer is none, in the sense that you said reliably. I can't come up with an answer for that reliably. Warming from humans assumes warming that is not due to El Nino, or warming that's not due to volcanic suppression of temperatures earlier in the record, which comes out to about a tenth of a degree per decade.
Are there other factors that we can say for sure have played a role in the incremental warming of the planet over the last few decades? We've had a couple of volcanoes in the first half of that period, Eyjafjallajökull and Pinatubo, and those cooled the planet in the first half of that 40 years. So that tilted the trend up, and that's where I come up with the one-tenth per decade warming rate, which means the climate is not very sensitive to carbon dioxide or greenhouse gas warming. It's probably half, or even less, as sensitive as models tend to report.
BN: So if CO2 exposure or insertion into the atmosphere were to double, what would the results be?
JC: I actually had a little paper on that, and we're kind of expecting that maybe about 2070 or 2080 it will be double what it was back in 1850. And the warming from that will be about a degree; 1.3 C is what I calculated. The general rule I found about people is they don't mind an extra degree on their temperature. In fact, if you look at the United States, the average American experiences a much warmer temperature now than they did a hundred years ago, because the average American has moved south, to much warmer climates: California, Arizona, Texas, Alabama, Florida and so on. Because cold is not a whole lot of fun. You know, skiing, snowmobiling, ice fishing and so on, that's fine. But the average person likes it to be warm, and so that's why many people in our country have moved to warmer areas. So I don't think that 1.3 Kelvin is going to matter much; what people really care about are those extreme events and so on.
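Christy's 1.3 C per doubling can be turned into a rough warming curve. This sketch rests on two assumptions that are mine, not his: the standard logarithmic scaling of CO2 forcing, and a pre-industrial baseline of 280 ppm:

```python
import math

SENSITIVITY_C = 1.3  # deg C per CO2 doubling (Christy's calculated figure)
C0_PPM = 280.0       # assumed pre-industrial CO2 concentration, ppm

def warming(c_ppm):
    """Logarithmic approximation: each doubling adds the same warming."""
    return SENSITIVITY_C * math.log2(c_ppm / C0_PPM)

print(round(warming(2 * C0_PPM), 1))  # 1.3 C at a full doubling (560 ppm)
print(round(warming(420.0), 2))       # 0.76 C at roughly today's 420 ppm
```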
BN: What do your temperature records tell you about previous hotter temperatures?
JC: Since 1979, what we see is an upward trend in the global temperature that I think is manageable. But it goes up and down: the 1997-98 El Nino was a big event, and the 2016 El Nino was a big event. We also see the downs that come from a volcano that might go off and cool the planet. Those are bigger effects than that small trend going up. The global temperature can change by two tenths of a degree from month to month, when we're talking about a tenth per decade. Then people say: month to month we can handle, but we can't handle 20 years' worth of a small change. That just doesn't make sense, and the real-world evidence is pretty clear that humans have done extremely well as our planet has been warming a little bit, whether it's natural or not.
BN: Can you tell me about the Milankovitch cycles?
JC: Milankovitch cycles are the orbital cycles of the Earth: its orbit around the Sun, the tilt of its axis, and its distance from the Sun. It is not a perfectly circular orbit around the Sun; it's kind of an ellipse, and it changes through time. All those factors work together to put a little bit more solar energy in certain places and less in others. These cycles are likely related to the ice ages we talk about.
If you can melt the snow in Canada in the summer, then you won't have an ice age. The snow falls in the winter, and if you can't melt the snow in Canada in the summer, because the Earth is tilted away a bit in July and August, then the snow hangs around all summer long; the next winter more snow piles up, the next summer it doesn't melt, and so on the next year. You get this mechanism that adds and extends snow cover, leading to an ice age.
So the tilt of the axis and the other parameters I just mentioned can moderate how much sunlight comes in during the summertime in Canada, and it's up to 100 watts per meter squared, which is a lot of energy difference over time. That's probably the strongest theory, with a good amount of evidence that those orbital changes can cause huge changes in the climate, from ice ages to the current interglacial.
BN: There’s claims that the way that humans are living is causing daily Extinction of two to three hundred different species. Is this a natural course of Evolution?
JC: You know, 99% of the species that have ever lived are extinct, so extinction is pretty natural. Obviously humans cause some extinctions: when you destroy the environment of a small place, and that was the only place that particular species lived, then yes, humans caused that extinction. Did climate change from humans cause any extinctions? I think that jury is still out, because most species love the extra carbon dioxide. Plants do specifically, and then everything that eats plants loves that, so you might want to say the extra carbon dioxide actually helped the whole biosphere in some sense.
But I think that what humans do to the surface and to water, if it's not cleaned properly and you really poison the surface and the air, can cause some real problems for the species that are living out there. And that's why we have rules about not putting poison in the air or in the water.
BN: Does that qualify as climate change?
JC: No. To say carbon dioxide is a poison, you really have to scratch your head on that, because plants love the stuff. It invigorates the biosphere. When did all of this greenery evolve, and the corals occur and grow and develop? It was when there was two to four or five times as much CO2 in the air as there is now. Carbon dioxide invigorates the biosphere, so we're actually just putting back carbon dioxide that had been in the atmosphere earlier. And I don't think the world is going to have much problem with that in terms of its biosphere. The issue is whether the climate is going to become so bad that some things can't handle it, and I don't really see the evidence for that happening.
BN: Critics of your views on climate have argued that you undercut your credibility by making claims that exceed your data and that you’re unwilling to agree with different findings. How do you respond to that?
JC: Show me a finding and let me look at it, and if it's a valid finding, fine, I'll agree with it. But you know, you can find anything on the web these days about claims that someone might make, but you show me the evidence. Let me see what you're complaining about and we can have a discussion about that. I just had a paper published last week on snowfall in the western states of the United States that shows, for the main snowfall regions, there is no trend in snowfall. The amount of snow that's falling right now is the same as it was 120 years ago. So snow is still falling out in the western mountains of the United States; that's evidence, that's data. And so when someone claims that, oh my, snowfall is going away out in the west, I say: well, here, look at this evidence from real station data that people recorded from 1890 to now.
So I can answer that question with real information. You don't see many people like me in debates because the invitations are not offered to me. In fact I've been uninvited, you know. Someone on a particular panel would say: hey, let's get this guy to come here and speak to us, and then I receive the disinvitation because I was not going to go along with the theme of their climate-change-as-catastrophe presentation.
BN: You referred to times in the past when CO2 levels were significantly higher than they are now. Do records show any negative effects as a result of such high CO2?
JC: Well, when you say negative, that's almost a moral question: was it good or bad that the dinosaurs went extinct? I think the dinosaurs would have an opinion about that. Let me rephrase: if CO2 had to be at those levels today, would it negatively impact humanity? We see carbon dioxide has increased as humans produce energy so that their lives can be enhanced. There's a direct relationship between how much carbon or energy you're able to use (and carbon is the main source today) and your ability to thrive.
Think about it: we didn’t leave the Stone Age because we ran out of rocks. We left the Stone Age because something better came along, you know, iron, bronze and so on. In terms of energy, we didn’t leave the wood and dung age because we ran out of trees or excrement; we found a better source, and that was carbon: coal, oil and so on. And transportation: we didn’t leave the horse and buggy age because we ran out of horses. It was because Henry Ford made a vehicle that was cheap and affordable. My great-grandfather, who was in destitute poverty in Oklahoma in the 1930s, had a Model T. And another thing about Henry Ford: he didn’t go around getting the government to kill all the horses so you’d have to buy his cars. Horses were still available for the poorest people. And he didn’t make the government go out and build gas stations or drill for oil; that was done privately, at the market level.
But today we have a government that says this is what we want for the energy structure, and so we’re going to use your taxpayer money to put out all these charging stations and force you to buy electric cars, or at least subsidize them tremendously, and put up all these windmills and so on, at great expense and great environmental wreckage.
I can assure you that without energy life is brutal and short. Energy is the thing that has caused our lifespans to double, so that children no longer fear diseases that used to wipe out millions, because of the advances that energy has brought through electricity, experimentation and all the sciences we have developed. All of that is based upon access to energy.
So yes, developing countries are going to get their energy; they’re going to find the energy they need. I’m not making this as a prediction, just as an observation. Right now carbon is the cheapest, most effective and very high-density source. So we will see these countries use carbon to advance, and we should not stand in their way, because they want to live like us, who already have pretty big carbon footprints.
If you want some comfort in that, remember the carbon dioxide we’re putting back into the atmosphere is invigorating the biosphere, and it also represents people living longer and better lives. There’s just no question that as energy is made available and affordable, people live longer and better lives. I think that’s ultimately going to be the inertia that carries this issue forward, past all the preaching about carbon dioxide problems.
BN: Environmentalists would argue that they’re not against electricity and prosperity; they’re just advocating for a better, cleaner way to do it.
JC: It’s a tremendous misconception that a windmill or a solar panel can somehow give you cleaner and more reliable energy than what you have now. That’s just not true. To build a windmill, there’s tremendous environmental wreckage you have to go through in terms of all the minerals you have to yank out of the earth and process. And processing takes energy, by the way. And then building all these transmission lines. The energy is so diffuse, so weak, in wind and solar that you have to gather up huge amounts of land to put it together. Robert Bryce said it well when he called it the iron law of power density: the weaker the source of energy, the more stuff and material you need to gather it, to concentrate it and to make it useful.
You have to spend huge amounts, in dollars and environmental cost, to make a windmill or a solar panel, which by the way doesn’t last forever. So this carbon, which already has a tremendous amount of energy in a very small, dense space, means that its environmental footprint is much, much smaller than what you have with solar or wind. In fact it’s about one to a thousand or two thousand in terms of the square footage you need.
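The “one to a thousand” footprint comparison can be sketched numerically. This is a minimal illustration, not a calculation from the interview: the W/m² areal power densities below are my own round-number assumptions for each source.

```python
# Illustrative sketch of the power-density argument: land area needed to
# deliver the same average power from sources of different areal density.
# The W/m^2 figures are assumed round numbers for illustration only.

def land_area_km2(avg_power_mw: float, power_density_w_per_m2: float) -> float:
    """Land area (km^2) needed to supply avg_power_mw at a given areal power density."""
    watts = avg_power_mw * 1e6
    m2 = watts / power_density_w_per_m2
    return m2 / 1e6  # m^2 -> km^2

# Assumed areal power densities (order-of-magnitude values):
sources = {
    "gas plant": 1000.0,   # W/m^2 (site footprint)
    "solar farm": 10.0,    # W/m^2 (capacity-factor adjusted)
    "wind farm": 1.0,      # W/m^2 (turbine spacing included)
}

target_mw = 1000.0  # one large plant's worth of average power
for name, density in sources.items():
    print(f"{name:>10}: {land_area_km2(target_mw, density):8.1f} km^2")
```

With these assumed densities, the wind farm needs about a thousand times the land of the gas plant for the same average power, consistent with the ratio quoted above.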
BN: Look at windmill and solar panel farms. Not only are they ugly, but they cause tremendous environmental damage in their construction and maintenance. What are the long-term effects decades from now? If we just continue to get our energy from fossil fuels, how bad can it get?
JC: Well, start with how good it can get. People will have access to affordable energy, so they’re going to live longer and live better lives. The impact on the climate is about the only downside you can think of. Well, the sea level is going to continue to rise, since it’s been rising for several hundred years, and at a manageable rate by the way. And the atmosphere might warm some more, but certainly not to some catastrophic effect that will cause us to lose our ability to thrive.
I’m just very optimistic that people are clever and can figure out how to adapt to whatever is going to happen. The real issues I deal with as a state climatologist are the extreme events we know are going to happen that you’re not ready for. I mean, that flood that happened 50 years ago is going to come back again, and it’s going to cause some real problems if you don’t build your infrastructure and put your houses or industry where they can be safe; if you don’t build up on the coast too much, so you won’t be clobbered by a hurricane or something like that. It’s these kinds of natural extreme events to which we’re far more vulnerable right now, rather than some small and gradual change the climate system might undergo.
BN: I did read somewhere that someone has asked, and I’m sure you get it a lot, whether you receive any funding from the fossil fuel industry. Do you?
JC: No, I do not, and I made that decision way back in the early 1990s. I might make a fossil fuel company mad with some of the information I produce, but so be it. I can rest my head at night and not worry: did I accommodate some agenda somewhere? I’m just after what the observations say. Can I build the best observational data sets to answer the questions of climate that we have? That’s what I want to do.
BN: I suppose one of the biggest tragedies of taking such funding would be that it would discredit the real science and the fundamental research you’re doing. It would be a non-starter, because people would immediately dismiss it.
JC: That’s unfortunate, because the perception is that if a fossil fuel company paid someone to do some research, because they really wanted to know the answer about something, and that person was completely honest, did the work properly and provided the answer to the fossil fuel company as is, well, that answer would be tainted because it came from a fossil fuel company. Well, hello: think about what environmental advocacy groups and pressure groups do all the time. They pay tremendous sums so people can come up with an answer that gives them leverage in claiming this is a catastrophic problem. So I can at least take that perception off the table.
BN: Lastly are you aware of any ways in which geoengineering could possibly be affecting the natural balance of things? Is it being done more than we’re aware of and could it backfire?
JC: Anytime humans do something they’re going to have an impact, no question about that. So you could call it geoengineering, but inadvertently we have made some desert valleys cooler because we now irrigate crops. We have taken water that fell in one place and moved it to another. So that’s a bit of geoengineering right there. And by the way, a lot of those places feed much of the world, so you can’t say it’s bad, I suppose.
But the other question about geoengineering is: can we do something to prevent a perceived problem?
And that’s the real danger, I think, because you don’t know the consequences when you start tinkering with a very complex and dynamic system. So I would say stay away from that. Suppose someone did a big geoengineering experiment and something bad happened somewhere. Well, that country would sue the world and say: look, you made this bad thing happen to us; you are liable. And then we’re getting nowhere in terms of preventing some problem on the planet.
Two fallacies render public discussion about a climate “crisis” or “emergency” meaningless. H/T to Terry Oldberg for the comments and writings prompting me to post on this topic.
One corruption is the frequent appearance of fallacies of Equivocation in climate claims. For instance, “climate change” can mean all observed events in nature, but as defined by the IPCC it refers to change caused by human activities. Similarly, forecasts from climate models are proclaimed to be “predictions” of future disasters, but renamed “projections” in disclaimers against legal liability. And so on.
A second error in the argument is the Fallacy of Misplaced Concreteness, also known as Reification. This involves mistaking an abstraction for something tangible and real in time and space. We often see it in both spoken and written communications. It can take several forms:
♦ Confusing a word with the thing to which it refers
♦ Confusing an image with the reality it represents
♦ Confusing an idea with something observed to be happening
Examples of Equivocation and Reification from the World of Climate Alarm
“Seeing the wildfires, floods and storms, Mother Nature is not happy with us failing to recognize the challenges facing us.” – Nancy Pelosi
‘Mother Nature’ is a philosophical construct and has no feelings about people.
“This was the moment when the rise of the oceans began to slow and our planet began to heal …” – Barack Obama
The ocean and the planet do not respond to someone winning a political party nomination. Nor does a planet experience human sickness and healing.
“If something has never happened before, we are generally safe in assuming it is not going to happen in the future, but the exceptions can kill you, and climate change is one of those exceptions.” – Al Gore
The future is not knowable, and can only be a matter of speculation and opinion.
“The planet is warming because of the growing level of greenhouse gas emissions from human activity. If this trend continues, truly catastrophic consequences are likely to ensue.” – Malcolm Turnbull
Temperature is an intensive property of an object, so the temperature of “the planet” cannot be measured directly. The likelihood of catastrophic consequences is unknowable. Humans are blamed as guilty by association.
“Anybody who doesn’t see the impact of climate change is really, and I would say, myopic. They don’t see the reality. It’s so evident that we are destroying Mother Earth.” – Juan Manuel Santos
“Climate change” is an abstraction anyone can fill with subjective content. Efforts to safeguard the environment are real, successful and ignored in the rush to alarm.
“Climate change, if unchecked, is an urgent threat to health, food supplies, biodiversity, and livelihoods across the globe.” – John F. Kerry
To the abstraction “Climate Change” is added abstract “threats” and abstract means of “checking Climate Change.”
“Climate change is the most severe problem that we are facing today, more serious even than the threat of terrorism.” -David King
Instances of people killed and injured by terrorists are reported daily and are a matter of record, while problems from Climate Change are hypothetical.
Corollary: Reality is also that which doesn’t happen, no matter how much we expect it to.
Climate Models Are Built on Fallacies
A previous post, Chameleon Climate Models, described the general issue of whether a model belongs on the bookshelf (theoretically useful) or whether it passes real-world filters of relevance, thus qualifying as useful for policy considerations.
Following an interesting discussion on her blog, Dr. Judith Curry has written an important essay on the usefulness and limitations of climate models.
The paper was developed in response to a request from a group of lawyers wondering how to regard claims based upon climate model outputs. The document is entitled Climate Models and is a great, informative read for anyone. Some excerpts that struck me follow in italics, with my bolds and added images.
Climate model development has followed a pathway mostly driven by scientific curiosity and computational limitations. GCMs were originally designed as a tool to help understand how the climate system works. GCMs are used by researchers to represent aspects of climate that are extremely difficult to observe, experiment with theories in a new way by enabling hitherto infeasible calculations, understand a complex system of equations that would otherwise be impenetrable, and explore the climate system to identify unexpected outcomes. As such, GCMs are an important element of climate research.
Climate models are useful tools for conducting scientific research to understand the climate system. However, the above points support the conclusion that current GCM climate models are not fit for the purpose of attributing the causes of 20th century warming or for predicting global or regional climate change on timescales of decades to centuries, with any high level of confidence. By extension, GCMs are not fit for the purpose of justifying political policies to fundamentally alter world social, economic and energy systems.
It is this application of climate model results that fuels the vociferousness of the debate surrounding climate models.
Evolution of state-of-the-art Climate Models from the mid 70s to the mid 00s. From IPCC (2007)
The actual equations used in the GCM computer codes are only approximations of the physical processes that occur in the climate system.
While some of these approximations are highly accurate, others are unavoidably crude. This is because the real processes they represent are either poorly understood or too complex to include in the model given the constraints of the computer system. Of the processes that are most important for climate change, parameterizations related to clouds and precipitation remain the most challenging, and are the greatest source of disagreement among different GCMs.
There are literally thousands of different choices made in the construction of a climate model (e.g. resolution, complexity of the submodels, parameterizations). Each different set of choices produces a different model having different sensitivities. Further, different modeling groups have different focal interests, e.g. long paleoclimate simulations, details of ocean circulations, nuances of the interactions between aerosol particles and clouds, the carbon cycle. These different interests focus their limited computational resources on a particular aspect of simulating the climate system, at the expense of others.
Overview of the structure of a state-of-the-art climate model. See Climate Models Explained by R.G. Brown
Human-caused warming depends not only on how much CO2 is added to the atmosphere, but also on how ‘sensitive’ the climate is to the increased CO2. Climate sensitivity is defined as the global surface warming that occurs when the concentration of carbon dioxide in the atmosphere doubles. If climate sensitivity is high, then we can expect substantial warming in the coming century as emissions continue to increase. If climate sensitivity is low, then future warming will be substantially lower.
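The definition above can be made concrete with a short sketch. It relies on the standard logarithmic relationship between CO2 concentration and warming, so that equilibrium warming scales with the number of doublings; the sensitivity values used are illustrative assumptions, not figures from the excerpted essay.

```python
import math

# Minimal sketch: equilibrium climate sensitivity (ECS) translates a CO2
# concentration change into equilibrium surface warming under the standard
# logarithmic approximation. The ECS values below are assumed for illustration.

def equilibrium_warming(c_ppm: float, c0_ppm: float, ecs_per_doubling: float) -> float:
    """Equilibrium warming (deg C) for a CO2 change from c0_ppm to c_ppm.

    Warming scales with the number of doublings: ECS * log2(C / C0).
    """
    return ecs_per_doubling * math.log2(c_ppm / c0_ppm)

# A doubling from 280 to 560 ppm yields exactly the ECS, by definition.
for ecs in (1.5, 3.0, 4.5):  # assumed low / central / high sensitivities
    print(f"ECS={ecs}: 280->560 ppm gives {equilibrium_warming(560, 280, ecs):.2f} C, "
          f"280->420 ppm gives {equilibrium_warming(420, 280, ecs):.2f} C")
```

The spread across the three assumed sensitivities illustrates the point in the excerpt: the same emissions pathway implies substantially different warming depending on where the true sensitivity lies.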
In GCMs, the equilibrium climate sensitivity is an ‘emergent property’ that is not directly calibrated or tuned.
While there has been some narrowing of the range of modeled climate sensitivities over time, models still can be made to yield a wide range of sensitivities by altering model parameterizations. Model versions can be rejected or not, subject to the modelers’ own preconceptions, expectations and biases of the outcome of equilibrium climate sensitivity calculation.
Further, the discrepancy between observational and climate model-based estimates of climate sensitivity is substantial and of significant importance to policymakers. Equilibrium climate sensitivity, and the level of uncertainty in its value, is a key input into the economic models that drive cost-benefit analyses and estimates of the social cost of carbon.
Variations in climate can be caused by external forcing, such as solar variations, volcanic eruptions or changes in atmospheric composition such as an increase in CO2. Climate can also change owing to internal processes within the climate system (internal variability). The best known example of internal climate variability is El Nino/La Nina. Modes of decadal to centennial to millennial internal variability arise from the slow circulations in the oceans. As such, the ocean serves as a ‘fly wheel’ on the climate system, storing and releasing heat on long timescales and acting to stabilize the climate. As a result of the time lags and storage of heat in the ocean, the climate system is never in equilibrium.
The combination of uncertainty in the transient climate response (sensitivity) and the uncertainties in the magnitude and phasing of the major modes in natural internal variability preclude an unambiguous separation of externally forced climate variations from natural internal climate variability. If the climate sensitivity is on the low end of the range of estimates, and natural internal variability is on the strong side of the distribution of climate models, different conclusions are drawn about the relative importance of human causes to the 20th century warming.
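The ambiguity described in this excerpt can be illustrated with a toy Monte Carlo sketch. This is not a model used by Curry or the IPCC; every number in it is invented purely to show how uncertainty in the forced component alone leaves a wide range of attributable fractions, with internal variability making up the remainder.

```python
import random

# Toy illustration of attribution ambiguity: if the forced (sensitivity-driven)
# warming component is uncertain, the fraction of observed warming attributable
# to forcing spans a wide range, with internal variability supplying the rest.
# All numbers are invented for illustration.

random.seed(0)

OBSERVED = 0.6  # deg C of observed warming over some period (assumed)

def one_sample() -> float:
    # Forced warming drawn from an assumed uncertainty range (deg C).
    forced = random.uniform(0.2, 0.8)
    # Internal variability is implicitly OBSERVED - forced (either sign).
    return forced / OBSERVED  # fraction attributable to forcing

fractions = sorted(one_sample() for _ in range(10_000))
lo, hi = fractions[500], fractions[9500]  # ~5th and ~95th percentiles
print(f"Attributable fraction, 5th-95th percentile: {lo:.0%} to {hi:.0%}")
```

Under these invented assumptions, anywhere from roughly a third of the observed warming to more than all of it (with internal variability offsetting the excess) is consistent with the data, which is the ambiguity the excerpt describes.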
Figure 5.1. Comparative dynamics of the World Fuel Consumption (WFC) and Global Surface Air Temperature Anomaly (ΔT), 1861-2000. The thin dashed line represents annual ΔT, the bold line—its 13-year smoothing, and the line constructed from rectangles—WFC (in millions of tons of nominal fuel) (Klyashtorin and Lyubushin, 2003). Source: Frolov et al. 2009
Anthropogenic (human-caused) climate change is a theory in which the basic mechanism is well understood, but whose potential magnitude is highly uncertain.
What does the preceding analysis imply for IPCC’s ‘extremely likely’ attribution of anthropogenically caused warming since 1950? Climate models infer that all of the warming since 1950 can be attributed to humans. However, there have been large magnitude variations in global/hemispheric climate on timescales of 30 years, which are the same duration as the late 20th century warming. The IPCC does not have convincing explanations for previous 30 year periods in the 20th century, notably the warming 1910-1945 and the grand hiatus 1945-1975. Further, there is a secular warming trend at least since 1800 (and possibly as long as 400 years) that cannot be explained by CO2, and is only partly explained by volcanic eruptions.
CO2 relation to Temperature is Inconsistent.
Summary
There is growing evidence that climate models are running too hot and that climate sensitivity to CO2 is on the lower end of the range provided by the IPCC. Nevertheless, these lower values of climate sensitivity are not accounted for in IPCC climate model projections of temperature at the end of the 21st century or in estimates of the impact on temperatures of reducing CO2 emissions.
The climate modeling community has been focused on the response of the climate to increased human caused emissions, and the policy community accepts (either explicitly or implicitly) the results of the 21st century GCM simulations as actual predictions. Hence we don’t have a good understanding of the relative climate impacts of the above (natural factors) or their potential impacts on the evolution of the 21st century climate.
Footnote:
There is a series of posts here that apply reality filters to test climate models. The first was Temperatures According to Climate Models, where both hindcasting and forecasting were seen to be flawed.
Given the persistent headlines about climate change over the years, it’s surprising how long it took the Nobel Committee to award the Physics prize to a climate modeler, which finally occurred earlier this month.
Indeed, Syukuro Manabe has been a pioneer in the development of so-called general circulation climate models (GCMs) and more comprehensive Earth System Models (ESMs). According to the Committee, Manabe was awarded the prize “For the physical modelling of the earth’s climate, quantifying variability, and reliably predicting global warming.”
What Manabe did was to modify early global weather forecasting models, adapting them to long-term increases in human emissions of carbon dioxide that alter the atmosphere’s internal energy balance, resulting in a general warming of surface temperatures, along with a much larger warming of temperatures above the surface over the earth’s vast tropics.
Unlike some climate modelers, such as NASA’s James Hansen (who lit the bonfire of the greenhouse vanities in 1988), Manabe is hardly a publicity hound. And while politics clearly influences it (see Al Gore’s 2007 Prize), the Nobel Committee also respects primacy, as Manabe’s model was the first comprehensive GCM. He produced it at the National Oceanic and Atmospheric Administration’s Geophysical Fluid Dynamics Laboratory (GFDL) in Princeton, NJ. The seminal papers were published in 1975 and 1980.
And, after many modifications and renditions, it is also the most incorrect of all the world’s GCMs at altitude over the vast tropics of the planet.
Getting the tropical temperatures right is critical. The vast majority of the life-giving moisture that falls over the world’s productive midlatitude agrosystems originates as evaporation from the tropical oceans.
The major determinant of how much moisture is wafted into our region is the vertical distribution of tropical temperature. When the contrast is great, with cold temperatures aloft compared to the normally hot surface, that surface air is buoyant and ascends, ultimately transferring moisture to the temperate zones. When the contrast is less, the opposite occurs, and less moisture enters the atmosphere.
Every GCM or ESM predicts that several miles above the tropical surface should be a “hot spot,” where there is much more warming caused by carbon dioxide emissions than at the surface. If this is improperly forecast, then subsequent forecasts of rainfall over the world’s major agricultural regions will be unreliable.
That in turn will affect forecasts of surface temperature. Everyone knows a wet surface heats up (and cools down) slower than a dry one (see: deserts), so getting the moisture input right is critical.
Following Manabe, vast numbers of modelling centers popped up, mushrooms fertilized by public — and only public — money.
Every six years or so, the U.S. Department of Energy collects all of these models, aggregating them into what they call Coupled Model Intercomparison Projects (CMIPs). These serve as the bases for the various “scientific assessments” of climate change produced by the U.N.’s Intergovernmental Panel on Climate Change (IPCC) or the U.S. “National Assessments” of climate.
Figure 8: Warming in the tropical troposphere according to the CMIP6 models. Trends 1979–2014 (except the rightmost model, which is to 2007), for 20°N–20°S, 300–200 hPa. John Christy (2019)
In 2017, University of Alabama’s John Christy, along with Richard McNider, published a paper that, among other things, examined the 25 applicable families of CMIP-5 models, comparing their performance to what’s been observed in the three-dimensional global tropics. Take a close look at Figure 3 from the paper, in the Asia-Pacific Journal of Atmospheric Sciences, and you’ll see that the model GFDL-CM3 is so bad that it is literally off the scale of the graph. [See Climate Models: Good, Bad and Ugly]
At its worst, the GFDL model is predicting approximately five times as much warming as has been observed since the upper-atmospheric data became comprehensive in 1979. This is the most evolved version of the model that won Manabe the Nobel.
In the CMIP-5 model suite, there is one, and only one, that works. It is the model INM-CM4 from the Russian Institute for Numerical Modelling, whose lead author is Evgeny Volodin. It seems that Volodin would be much more deserving of the Nobel for, in the words of the committee, “reliably predicting global warming.”
Might this have something to do with the fact that INM-CM4 and its successor models have less predicted warming than all of the other models?
Patrick J. Michaels is a senior fellow working on energy and environment issues at the Competitive Enterprise Institute and author of “Scientocracy: The Tangled Web of Public Science and Public Policy.”