Climate Crusade Is a Dead End

This post presents the main points and exhibits from Professor de Lange’s presentation of February 26, 2025.  Most images are self-explanatory, with some excerpts in italics lightly edited from captions, and some added images as well. H/T Bud Bromley.

Prof. de Lange demonstrates that there is no credible climate crisis, and that there is much more to climate than CO2 alone. First, he addresses the discrepancy between satellite temperature measurements and results from climate models. Second, he shows that even doubling the CO2 concentration has only minor effects on temperature, while CO2 is in fact crucial to photosynthesis. Third, he shows how the significant lack of experimental data on cloud composition now hampers progress in climate science. Fourth, he demonstrates that there is no convincing correlation between CO2 and temperature on a geological time scale. Fifth, he addresses the global future energy supply, demonstrating that renewables are “unaffordables”, as are untested technologies (batteries, hydrogen), and he concludes that the future has to be based on nuclear power.

1.  Natural Science and Observations versus Models

2.  Atmospheric Physics and Greenhouse Gases

The warm surface of the earth can be viewed as an infrared radiator sending intensity out into the atmosphere, but this flow of infrared energy does not pass out uninterrupted: it is absorbed by the atmosphere, and that is where clouds turn out to be extremely important. They delay the outgoing energy into space. In climate science we balance the yellow incoming solar energy in watts per square meter against the outgoing radiation from the surface and atmosphere. Some is reflected and some is absorbed and emitted as longwave radiation.  The imbalance is shown at the bottom as ~1 W/m2, a small difference between two much larger energy flows of hundreds of W/m2. If for any reason there were a slight change in either the incoming or outgoing flows, the imbalance would change dramatically.
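The scale of this imbalance can be illustrated with a quick back-of-envelope sketch. The numbers below are rounded, commonly cited top-of-atmosphere estimates used only for illustration, not the exhibit’s exact values:

```python
# Illustrative top-of-atmosphere energy budget (W/m^2).
# Values are rounded, commonly cited estimates -- not the exhibit's exact numbers.
incoming_solar = 340.0       # average incoming shortwave at top of atmosphere
reflected_shortwave = 100.0  # reflected by clouds, aerosols, and surface (albedo)
outgoing_longwave = 239.0    # thermal infrared emitted to space

absorbed = incoming_solar - reflected_shortwave
imbalance = absorbed - outgoing_longwave
print(f"absorbed: {absorbed:.0f} W/m^2, imbalance: {imbalance:.0f} W/m^2")
# A ~1 W/m^2 residual between flows of hundreds of W/m^2: a shift of well
# under 1% in either flow would change the imbalance dramatically.
```

The point of the sketch is the sensitivity: the residual is the small difference of two large numbers, so tiny percentage changes in either flow swamp it.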

The fact that greenhouse gases play a very important role in absorbing infrared radiation in the atmosphere has been known for 150 years. We shall see that the dependence of the earth’s temperature on greenhouse gases is not linear; the effect on temperature is logarithmic. This is seen in the graph on the left side.

On the horizontal scale we see frequency, expressed in wave numbers, a common unit in physics. The continuous blue trace shows the infrared radiation that would leave the warm surface of the planet if there were no atmosphere at all. The total area under the blue trace depends on temperature to the fourth power, so it is very temperature dependent.
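The fourth-power dependence is the Stefan-Boltzmann law; a minimal sketch of the arithmetic, assuming an idealized blackbody surface:

```python
SIGMA = 5.670e-8  # Stefan-Boltzmann constant, W m^-2 K^-4

def blackbody_flux(T):
    """Total radiated flux (the area under the emission curve) for temperature T in kelvin."""
    return SIGMA * T**4

# Earth's mean surface temperature is roughly 288 K
print(f"{blackbody_flux(288):.0f} W/m^2")  # roughly 390 W/m^2
# Fourth-power sensitivity: a 1 K rise lifts the flux by about 1.4%
print(f"{blackbody_flux(289) / blackbody_flux(288) - 1:.4f}")
```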

We see the effect of atmospheric greenhouse gases represented by the black line, which is a bit lower than the blue trace. The green line shows where the black line would be if there were no CO2 in the atmosphere. The red line shows that there would be little difference from doubling CO2 from 400 ppm to 800 ppm.

The role of water vapor is terribly important.  Water is the most important greenhouse gas, but when we go to clouds, the situation becomes much more complicated than in their absence. So clouds again are the Achilles heel of climate science.  As I said, an increase in CO2 leads to a little more warming, but the increase is logarithmic, meaning less and less warming at higher CO2 levels.  Doubling CO2 leads to extra forcing of about 1 percent, or about 3 watts per square meter.  Since 1850, when temperature measurements really started, the planet’s surface has warmed by about 1°C.   That is not very much, and the effect of CO2 can only be very much smaller.
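The logarithmic relationship is often written in the simplified form ΔF = α ln(C/C0); the coefficient α = 5.35 W/m2 below is the commonly cited Myhre et al. (1998) value, used here only for illustration:

```python
import math

def co2_forcing(c_new, c_old, alpha=5.35):
    """Simplified logarithmic CO2 radiative forcing in W/m^2 (alpha from Myhre et al. 1998)."""
    return alpha * math.log(c_new / c_old)

# Each doubling adds the same increment, however high the starting level:
print(f"{co2_forcing(800, 400):.2f} W/m^2")   # 400 -> 800 ppm
print(f"{co2_forcing(1600, 800):.2f} W/m^2")  # 800 -> 1600 ppm: identical increment
# Relative to roughly 240 W/m^2 of outgoing longwave radiation, one doubling
# is a forcing change on the order of 1-2%.
```

The diminishing-returns behavior follows directly from the logarithm: going from 400 to 800 ppm buys no more forcing than going from 800 to 1600 ppm.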

3.  Scattering in Clouds

The post referenced in the exhibit is Clauser’s Case: GHG Science Wrong, Clouds the Climate Thermostat

4. Is CO2 the only and most important culprit of “disastrous” climate change, warming in particular?

5. Supplying Energy to a Growing World Population

Solar Activity Linked to Ocean Cycles

Solar energy accumulates massively in the ocean and is variably released during circulation events.

Thanks to Franklin Isaac Ormaza-González for alerting me to this paper Did Schwabe cycles 19–24 influence the ENSO events, PDO, and AMO indexes in the Pacific and Atlantic Oceans? by Ormaza-González, Espinoza-Celi and Roa-López, all from ESPOL Polytechnic University, Ecuador.  Why is this important? Because warming in the modern era is closely tied to El Niño and La Niña events (ENSO).  For example,

The exhibit shows that since 1947 GMT warmed by 0.8°C, from 13.9 to 14.7, as estimated by Hadcrut4.  This resulted from three natural warming events involving ocean cycles. The most recent rise, 2013-16, lifted temperatures by 0.2°C.  Before that, the 1997-98 El Niño produced a plateau increase of 0.4°C.  Earlier still, a rise from 1977-81 added 0.2°C to start the warming since 1947.

As shown in the synopsis below, the paper analyzes multiple oceanic oscillations during the years 1954 to 2019 in order to compare with solar cycles of sunspots 19 through 24 occurring during that time frame.  The title is stated as a question, and the conclusion provides this answer (in italics with my bolds).

Finally, did Schwabe cycles 19–24 influence the ENSO events, PDO, and AMO indexes in the Pacific and Atlantic Oceans? Yes: a wide range of correlation coefficients from 0.100 to about 0.500, statistically significant (p < 0.05), has been found between the Schwabe cycles and the ocean indices chosen here, with lag times from a few months to over 2 years. These results could be a potential source to improve predictive skills for the understanding of ENSO, PDO and AMO interannual and decadal fluctuations. Better predictive models are imperative given that El Niño or La Niña has vast impacts on lives, property, and economic activity around the globe, especially when dramatic peaks of El Niño occur. The new cycle 25 has started and a major oceanic swing could follow suit; the next El Niño would be around 2023–2024 according to historical events and the results presented here.

Given that the paper was drafted before submission in February 2022 and published in October that year, the forecast of a 2023-24 El Niño was confirmed in a remarkable way.


The cyan line represents SST anomalies in the tropics and shows the major El Niños: 2015-16, 2019-20 and 2023-24.  Note that all three events included pairs of major NH summer warming peaks. The synopsis below consists of excerpts in italics with my bolds to present the broad strokes of the analyses and findings. (Note: The paper includes detailed analyses and many references to supporting studies, and interested readers can access them by following the link there.)

Context

The surface-subsurface layers of the ocean that interact with the lower atmosphere alternately release and absorb heat energy. The work of Zhou and Tung (2010) reported the impact of the TSI on global SST over 150 years, finding signals of cooling and warming SSTs at the valley and peak of the SS cycles. Schlesinger and Ramankutty (1994) report a global cycle of 65–70 years for SST that is affected by anthropogenic greenhouse gases, sulphate aerosols and/or El Niño events, but they did not imply any external forcing such as the SS. There have been other studies on how solar radiation variability could affect temperature; recently, Cheke et al. (2021) have studied how the solar SS cycles would affect the El Niño Southern Oscillation (ENSO) indexes.

There are well known oceanic events that show periodicity with low or high frequencies: 25–30 and 3–7 years, respectively. These include the Pacific Decadal Oscillation (PDO), Atlantic Multidecadal Oscillation (AMO), and Interdecadal Pacific Oscillation (IPO), as well as El Niño or La Niña.  During El Niño events, the surface and subsurface lose energy to the atmosphere, and the opposite occurs during La Niña; these events have a periodicity of 3–7 years. The interdecadal oscillations have a series of impacts; e.g., the PDO gives rise to teleconnections between the tropics and mid-latitudes, and the effects include:

1) ocean heat content;
2) the lower and higher levels of the trophic chain, including small pelagic fisheries (tuna and sardines);
3) biogeochemical air-sea CO2 fluxes;
4) the frequency of La Niña/El Niño.

The interactions between decadal oscillations PDO/IPO and AMO may also affect ocean heat content. All these low and high frequency oceanographic events have a direct impact on local, regional, and global climate patterns, and there is growing evidence from many studies that the driving source of energy is the sun.

Thus, whatever affects the solar irradiation falling on the surface of the oceans (volcanic eruptions (Fang et al., 2020) and cloudiness, for example) would affect the gain or loss of the oceans' heat content. The cited works tried to find the physical reasons for these connections, but they remained unknown or difficult to explain.

The work reported here investigates how fluctuations of sunspots over time (1954–2019) may cross-correlate with low and high frequency oceanic events such as sea surface temperature (SST) anomalies (SSTA), the Oceanographic El Niño Index (ONI), Multivariate ENSO Index (MEI), and Southern Oscillation Index (SOI) in the central and east equatorial Pacific Ocean, as well as the PDO and the AMO in the North Pacific and Atlantic basins. The hypothesis is that even small variations of the TSI can be reflected in these tele-connected indexes.
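The lagged cross-correlation approach the paper applies can be sketched as follows. The series here are synthetic stand-ins (a sine-wave "sunspot" driver and a noisy response that trails it by 12 months), not the paper's actual data:

```python
import math, random

def pearson(x, y):
    """Pearson correlation coefficient of two equal-length sequences."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = math.sqrt(sum((a - mx) ** 2 for a in x))
    sy = math.sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy)

def lagged_correlation(driver, response, lag):
    """Correlate driver[t] with response[t + lag] (lag in months)."""
    return pearson(driver[:len(driver) - lag], response[lag:])

# Synthetic example: a 132-month "sunspot" sine wave (one 11-year cycle)
# and a response that follows it about 12 months later, plus noise.
random.seed(0)
months = range(132)
ss = [math.sin(2 * math.pi * t / 132) for t in months]
index = [math.sin(2 * math.pi * (t - 12) / 132) + random.gauss(0, 0.3) for t in months]

# Scan lags 0..36 months, as in the paper's reported lag-time window.
best = max(range(37), key=lambda lag: lagged_correlation(ss, index, lag))
print(best, round(lagged_correlation(ss, index, best), 3))
```

With noise present, the peak lag lands near (not exactly at) the true 12-month delay, which mirrors why the paper reports lag-time ranges rather than single values.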

Discussion

Fig. 1. Behaviour of monthly counts of SS, ONI, MEI, PDO and AMO. The indexes start at t = 0, 12, 24 and 36 months (panels a, b, c, and d respectively). The SS series starts at t = 0 in all four panels. The left vertical axis gives the values for the indexes; the right vertical axis gives SS counts. The end of each Schwabe cycle is marked by vertical dashed lines.

Maxima in the PDO, AMO, ONI, and MEI series were offset by 0, 12, 24 and 36 months (Fig. 1, panels a, b, c, and d respectively), with the SS series starting at t = 0 in all four panels. It has been reported that lag times for responses of some indexes to SS cycles are around 12–36 months (see fig. 1 of Hassan et al., 2016), and Fang et al. (2020) have reported that ENSO responds with a 2–3 year lag after a major volcanic eruption. From 1954 to the present, each sunspot cycle from 19 to 24 has occurred with a period of around 11 years (Hathaway, 2015), slightly less than the 11.2 years reported by Dicke (1978). The highest SS activity is seen in cycle 19 with around 250 SS/month, followed by <150, then around 200 at cycle 21, before decreasing steadily over cycles 22 to 24 to just over 100 SS/month. Cycle 24 is the lowest contemporary value of SS activity, comparable only to cycles 12–15 (around 1880–1930), and is the lowest in the last 200 years (Clette et al., 2014).

Fig. 12. Sunspot monthly count curves per cycle. Red and blue lines represent El Niño and La Niña events. Note that cycle 24 finished in December 2019 (National Weather Service, 2020).

The SSTA in the El Niño 1 + 2 region cross-correlated with SS many times, especially during descending phases of all cycles except SS 22, with cc-ρ up to 0.389 (SS 24) and main lag times from 5 to 13 months. The SS cycles (20 and 24) during the cold-phase PDO showed alternating cross-correlation, reaching a maximum of 0.389 and a negative −0.314 (p < 0.05). During the ascending phase in the El Niño 1 + 2 region (blue bars, Fig. 5a) the cc-ρ peaked at 0.393 (p < 0.05). In cycles 19 and 24 the highest cc-ρ were found, −0.460 and 0.394 (p < 0.05) respectively. These coefficients coincided with the longest (over 2 years) and most intense (<−1.5°C) La Niñas, during 1954–1955 and 2010–2012 (Fig. 12).

It should be noted that during cycle 21 two big events, El Niño (1983–1985) and La Niña (1984–1985), were registered, as also in cycles 23 and 24, with coefficients of around 0.2. The highest coefficients would mean an influence of up to 21.2% and 15.5% of the SS on the SSTAs in the El Niño 3.4 region. These results would suggest the cross-correlations are stronger in the El Niño 3.4 region due to less dispersive oceanographic-meteorological conditions than in the El Niño 1 + 2 region. These findings would also suggest that during the cold phase of PDOs (see NOAA, 2016), the cc-ρ in the El Niño 3.4 region tends to be higher, as the solar energy reaching the ocean surface increases when cloudiness decreases significantly during prolonged periods around or over the El Niño 3.4 region (Porch et al., 2006).

Solar cycle 19 is the most intense of the last 100 years; cycle 24 is the opposite (NWS, 2021). In general, the ascending phase of an SS cycle takes a shorter time than the descending phase, so the slope of the curve is steeper (Fig. 12); the increasing TSI then influences the studied indexes more clearly. It seems that during the ascending phases, El Niño events are prone to develop as TSI increases (as does UV radiation, NWS, 2021), while plunging SS phases, when the TSI tends to diminish (see Formula (1)), could lead to La Niña events, like the 2020–2022 occurrence (Ormaza-González, 2021).

Most La Niña events occur during the descending phase, or just when approaching or leaving the valley of minimum SS counts (Fig. 12), when the TSI decreases and reaches its minimum (Scafetta et al., 2019). La Niña 2020–2022 is a good example: the lowest SS counts (<2 counts/month) occurred during extended periods when reaching the valley of SS 24. The valley of SS 24 lasted close to 3 years, during which there were weeks and months without sunspots, before SS 25 started in December 2020.

The weakest sunspot cycle (SS 24) of the last 100 years (NWS, 2021) had four La Niña events: 2007–2009, 2010–2012, 2016–2017, and 2020–2022 (Fig. 12); it is the only cycle with that many La Niña events.

Conclusions

Over the studied period 1954–2019, sunspot numbers decreased from a monthly maximum of 225 (SS 21) to a minimum of around 20–25 (SS 24). SS 24 had 913 days without SS counts up to December 2019 (Burud et al., 2021), making it the weakest cycle since 1755; and SS 25 will probably be weaker than or similar to SS 24 (Ineson et al., 2014; Chowdhury et al., 2021; NASA, 2021a, NASA, 2021b). Thus, the Earth has been receiving slightly decreasing solar energy over this almost 7-decade period.

On the ocean surface the influence of sunspots could chiefly be due to UV energy fluctuation (Ineson et al., 2014), as this radiation penetrates down to 75–100 m depth in the water column (Smyth, 2011). van Loon et al. (2007) suggested that even though SS cycles produce weak changes in the Total Solar Irradiation (TSI) of about 0.07% (Gray et al., 2010), these can still produce decadal and millennial impacts on global thermohaline circulation (Bond et al., 2001; Gray et al., 2016).

The ONI Index proved to be poorly cross-correlated, with cc-ρ values <0.100, approaching −0.200 only twice. On the other hand, the MEI registered around ±0.200 through all cycles, with predominant lag times within 12 months. The SOI showed cross-correlations with SS cycles (19–21) averaging a coefficient of 0.200, with lag times ranging from 9 to 34 months. The SOI temporal behaviour has also been associated with SS, and it could enhance or affect the oceanographic indexes of the equatorial Pacific (Higginson et al., 2004). [The Multivariate ENSO Index considers not only the SST anomaly but also sea-level pressure and other variables.]

The MEI index could have been influenced from 7.3% up to 23%. The MEI correlated in all ascending and descending phases of the SS cycles. The SOI had similar cross-correlation coherence to the other oceanographic indexes during ascending and descending phases. These results would provide evidence of how SS affects the studied indexes during the ascending/descending phases of their cycles. In some cycles the impact will be stronger and in others weaker, depending on the intensity and temporal behaviour of the cycle.

Finally, did Schwabe cycles 19–24 influence the ENSO events, PDO, and AMO indexes in the Pacific and Atlantic Oceans? Yes: a wide range of correlation coefficients from 0.100 to about 0.500, statistically significant (p < 0.05), has been found between the Schwabe cycles and the ocean indices chosen here, with lag times from a few months to over 2 years. These results could be a potential source to improve predictive skills for the understanding of ENSO, PDO and AMO interannual and decadal fluctuations. Better predictive models are imperative given that El Niño or La Niña has vast impacts on lives, property, and economic activity around the globe, especially when dramatic peaks of El Niño occur. The new cycle 25 has started and a major oceanic swing could follow suit; the next El Niño would be around 2023–2024 according to historical events and the results presented here.

Sun Rules Earth Climate

On February 12, 2025, Tom Nelson conducted the above interview with solar physicist Valentina Zharkova: Grand solar minimum is underway. Below is my synopsis of lightly edited transcript excerpts from the closed captions, along with key graphics from her presentation. H/T Chiefio

The full content of the video is:

Timeline of segments:
0:00 – Introduction to Valentina
0:35 – Understanding the Solar Cycles
4:25 – Challenges In Measuring Sun
5:10 – Discovering The Background (magnetic fields)
6:00 – Analyzing Magnetic Waves
7:50 – Predicting Solar Activity
14:45 – Grand Solar Minimum
27:25 – Implications of the Grand Solar Minimum
37:55 – CO2 and Temperature Correlation
39:10 – Solar Cycles and Earth’s Temperature
42:45 – Solar Inertial Motion and Climate
48:30 – Future Climate Predictions
1:05:20 – Volcanic Activity and Climate
1:07:30 – Earth’s Magnetic Field
1:12:10 – Concluding Thoughts

Transcript Excerpts

Today we’re talking again about the Grand Solar Minimum, but I will also speak a little about solar radiation and verification of the new solar activity index we discovered against the existing one, which is derived from the average sunspot number.

Understanding the Solar Cycle and Sunspots

The solar activity cycle is about 11 years. At the start of the cycle, in the left image, the sun has southern polarity, and during the cycle this polarity slowly migrates in the opposite direction, so by the next solar minimum the polarity has changed; this happens approximately every 11 years. Basically what happens is that loops appear at the solar surface, occurring as the active regions that form coronal mass ejections, flares and different fluxes towards the Earth and other planets.

So in the past we were dealing with sunspots.  In the 19th century Wolf discovered that sunspots appear at around latitude 30° and migrate slowly towards the equator; this is the basis of the standard solar activity index using daily average sunspot numbers.

Why we love sunspots, and why we have supported them for a couple of centuries, is that sunspots are actually the roots of magnetic loops embedded in the photosphere (the surface layer of the Sun that gives off light).  We see them from outside with the naked eye, but basically they are the places where magnetic loops are embedded.

The problem with sunspots is that we see only a few of them; even at this solar maximum only a small part of the solar surface is covered with them. Whatever we use to detect them, the sunspot index is always defined by people manually: observers from different observatories agree on the number of sunspots, their configuration, etc.  So the sunspot number changes during the 11-year cycle.

Discovering The Background (magnetic fields)

So we decided to look at the background field in which these sunspots are embedded. At the top, in orange, is the background magnetic field measured at the solar observatory at Stanford. You see clearly that the leading polarity of a sunspot is always opposite to the polarity of the background magnetic field in that hemisphere.  It was not only us who detected this; others did as well, which was very encouraging. We decided we could detect solar activity with much better accuracy.

The black curve is our modulus summary curve and the red is the sunspot number, and you see that our summary of eigenvectors represents solar activity, remembering that our index represents the magnetic field of the background Sun. In 2022 we added cycle 24 and discovered that our curve still tracks the sunspot index.  At the bottom is the modulus summary curve for cycle 25, where we are now.   Here we see our prediction that the maximum will actually be in 2023-24, after which there will be a very sharp drop of activity, with two little maxima before the minimum between cycles 25 and 26.  Cycle 26 will have a very low amplitude, 70% lower than the previous two cycles.

So how does it work?   If you have two waves, the top two black waves, running with the same amplitude, and the phase difference is zero, you have constructive interference.   In cycle 26 we can see the amplitudes are going opposite, with the resulting amplitude becoming zero.   This is what we observe on the sun, and I teach my first-year physics students how such waves interact.   There is no miracle, just the basic physics of waves, and this effect is called the beat effect.
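The beat effect she describes can be sketched numerically: two equal-amplitude waves with slightly different frequencies add constructively when in phase and cancel completely half a beat period later. The frequencies below are arbitrary illustrative values:

```python
import math

def two_wave_sum(f1, f2, t):
    """Sum of two unit-amplitude waves; the envelope is 2*cos(pi*(f1 - f2)*t)."""
    return math.sin(2 * math.pi * f1 * t) + math.sin(2 * math.pi * f2 * t)

f1, f2 = 1.00, 0.90             # two slightly different frequencies (arbitrary units)
beat_period = 1 / abs(f1 - f2)  # envelope repeats every 10 time units here

# Near t = 0 the waves are in phase and their amplitudes add (constructive).
# Half a beat period later they are in antiphase and cancel (destructive).
t_cancel = beat_period / 2
print(round(abs(two_wave_sum(f1, f2, t_cancel)), 6))  # 0.0: complete cancellation
```

This is the same beat mechanism invoked for cycle 26: when the two underlying magnetic waves drift into antiphase, the summed amplitude collapses toward zero.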

Implications of the Grand Solar Minimum

Now we come back to solar irradiance and climate. First, we now know that we have entered a grand solar minimum and the temperature has started decreasing.  But the problem with the grand solar minimum is that during the previous grand solar minimum, the Maunder Minimum in the 17th century, solar irradiance was reduced by approximately 3 watts per square meter, yet the temperature during the Maunder Minimum decreased by at most about one degree.

Different investigations show slightly different variations, but mostly they all reconstruct temperatures during and after the minimum to find where the surface temperature was reduced across the globe. This is what you see for the Northern Hemisphere: very dark blue is a reduction of temperature by one degree, covering mostly all of Europe, Russia and Siberia, and also all of North America and Canada.

So basically, this is probably what we are heading towards now.  We have noticed cold flashes, drops of temperature that occurred because of a drop in the abundance of ozone, which is created by solar ultraviolet light in the stratosphere.  If the solar radiation is reduced, the ozone abundance of this layer is reduced, and that affects planetary atmospheric waves.

In the left image of the globe, the stable jet stream flows along its usual path and separates the middle latitudes from the northern latitudes. But when the ozone layer is reduced, it causes giant wiggles in the jet stream, shown in the right plot, so cold wind from the Arctic can now penetrate to southern latitudes, as shown in the picture.  It kicks the North Atlantic Oscillation, the balance between the permanent low-pressure system near Greenland and the permanent high-pressure system to the south, into a negative phase. This was reported 24 years ago and it still works now.

Some are now saying that the temperature will be increasing because the sun is becoming closer to us, but the sun is very humane: it gives us this grand solar minimum for 30 years to sort out our understanding of how the heating comes through, and then to prepare for the next stage of heating, which comes no matter what we do on Earth. If we stop using fuels and crawl back to the caves, using who knows what energy, even if all people die, still the temperature will increase; it doesn't matter what we do.

So this prediction of the anthropogenic global warming people is not working.  The temperature will be increasing no matter what we do with CO2, because the increase of temperature comes from the solar inertial motion.   So this is my conclusion: we had this global warming, and it is real; but it is not caused by humans, because humans contribute at most 6% of all CO2.  And CO2 is a very good gas, because it is mostly absorbed by the plants and not by humans.

Global warming is caused by this solar inertial motion and the gravitation of the large planets, which drag the Sun from the barycenter closer to the planets, and this causes the increase of temperature.  And the temperature, as I showed in my book, will increase by 2.5-3° by the years 2500-2600. This is the end of the story.

TN: Thank you. It sounds like we’re due for some cooling between now and 2053, but warming in general between then and 2600.  I’m curious, do you think we’re going to see temperatures freeze over at all?

Yes, I’m confident it will be freezing from 2031 to 2042 for sure.  This will be the worst period of cold air and cold temperatures, and not only temperatures: rivers and ponds will be freezing, and other dramatic things might happen.  It’s going to be a lot harder to grow wheat in Canada, for example, I would guess, during that time.  In the 17th century people heated their houses with their own fireplaces; now we have central heating.  If we don’t have electricity, even our central heating will not work, so you need to have a portable generator run on fossil fuel, or a wood stove in your house.  At that time people grew things in their gardens; now people don’t know how to grow anything, so it will be really, really difficult.

See Also:

Zharkova on Solar Forcing and Global Cooling

Good Reasons to Distrust Climatists

The most recent case of climatists’ bad behavior is the retraction of a peer-reviewed paper analyzing the properties of CO2 as an IR active gas, concluding that additional levels of atmospheric CO2 will have negligible effect on temperatures.  From the Daily Sceptic:

Another important paper taking issue with the ‘settled’ climate narrative has been cancelled following a report in the Daily Sceptic and subsequent reposts that went viral across social media. The paper discussed the atmospheric ‘saturation’ of greenhouse gases such as carbon dioxide and argued that higher levels will not cause temperatures to rise. The work was led by the widely-published Polish scientist Dr. Jan Kubicki and appeared on Elsevier’s ScienceDirect website in December 2023. The paper has been widely discussed on social media since April 2024 when the Daily Sceptic reported on the findings. Interest is growing in the saturation hypothesis not least because it provides a coherent explanation for why life and the biosphere grew and often thrived for 600 million years despite much higher atmospheric levels of greenhouse gases. Alas for control freaks, it also destroys the science backing for the Net Zero fantasy.

Below are some comments responding to a Quora question, text in italics with my bolds and added images:

What are some reasons why some people do not believe in climate change or global warming despite scientific evidence? Is there any additional information that could help us understand their perspective?

Answer from Mike Jonas,  M.A. in Mathematics, Oxford University, UK, 

Good scientists do not lie and cheat to protect their science, they are happy to discuss their evidence and their findings, and they always understand that everything needs to be replicable and verifiable.

When Climategate erupted on the scene, and the climate scientists behind the man-made global warming narrative were found to have lied and cheated, all honest scientists thought that would be the end of it. Instead, what happened was that those climate scientists closed ranks and carried on, supported by a massive amount of government (ie, the public’s) money. One of the first things they did was to deflect Climategate by saying the emails involved had been hacked so should be ignored, but some of the people involved confirmed that all of the emails really were genuine.

It has been about 15 years since Climategate, and study after study has shown virtually all of the components of the man-made global warming narrative to be incorrect, and even that none of the computer models used by the IPCC is fit for purpose.

And yet they maintained their closed ranks,
and the government money kept pouring in.

Did you know that the IPCC does not do any research? (Please do check that: on their web page About – IPCC they state “The IPCC does not conduct its own research”.) It is, as its name says, an inter-governmental organisation, and it is run by and for governments. They say lots of persuasive sciency things, but the simple fact is that they cherry-pick and corrupt the science to achieve their ends. Regrettably, almost all the scientific societies are on the gravy train too. This is part of what the highly respected physicist Professor Hal Lewis said in his resignation letter to the American Physical Society (APS):

It is of course, the global warming scam, with the (literally) trillions of dollars driving it, that has corrupted so many scientists, and has carried APS before it like a rogue wave. It is the greatest and most successful pseudoscientific fraud I have seen in my long life as a physicist. Anyone who has the faintest doubt that this is so should force himself to read the ClimateGate documents, which lay it bare.

I don’t believe that any real physicist, nay scientist, can read that stuff without revulsion. I would almost make that revulsion a definition of the word scientist.

So what has the APS, as an organization, done in the face of this challenge?
It has accepted the corruption as the norm, and gone along with it.

If you want to find out more about this “greatest and most successful pseudoscientific fraud”, the website Watts Up With That? is a good place to start (the fraudsters absolutely hate it), and it links to many other good websites. It has the full text of Hal Lewis’ resignation letter at:

Hal Lewis: My Resignation From The American Physical Society – an important moment in science history

Answer from Susannah Moyer

It’s curious that climate science is the rare scientific field where dissenting scientists, those with contrarian views, are unwelcome and even ostracized.

There are some well known climate scientists that have doubts about the role of CO2 and man made global warming as it pertains to global temperature. They have raised the issue that computer generated prediction models have been inaccurate in predicting temperature patterns because the modeling requires assumptions that have not been shown to be accurate.

Here is a contrarian view from climate scientists who have published climate research results in Nature, which is no small feat:

McNider and Christy are professors of atmospheric science at the University of Alabama in Huntsville and fellows of the American Meteorological Society. Mr. Christy was a member of the Intergovernmental Panel on Climate Change that shared the 2007 Nobel Peace Prize with former Vice President Al Gore.

It is not known by how much the Earth’s atmosphere will warm in response to this added carbon dioxide. The warming numbers most commonly advanced are created by climate computer models built almost entirely by scientists who believe in catastrophic global warming. The rate of warming forecast by these models depends on many assumptions and on engineering to replicate a complex world in tractable terms, such as how water vapor and clouds will react to the direct heat added by carbon dioxide, or the rate of heat uptake (absorption) by the oceans.

We might forgive these modelers if their forecasts had not been so consistently and spectacularly wrong. From the beginning of climate modeling in the 1980s, these forecasts have, on average, always overstated the degree to which the Earth is warming compared with what we see in the real climate.

For instance, in 1994 we published an article in the journal Nature showing that the actual global temperature trend was “one-quarter of the magnitude of climate model results.” As the nearby graph shows, the disparity between the predicted temperature increases and real-world evidence has only grown in the past 20 years.

“Consensus” science that ignores reality can have tragic consequences if cures are ignored or promising research is abandoned. The climate-change consensus is not endangering lives, but the way it imperils economic growth and warps government policy making has made the future considerably bleaker. The recent Obama administration announcement that it would not provide aid for fossil-fuel energy in developing countries, thereby consigning millions of people to energy poverty, is all too reminiscent of the Sick and Health Board denying fresh fruit to dying British sailors.

Another questioner is Dr. Koonin, who was undersecretary for science in the Energy Department during President Barack Obama’s first term and is currently director of the Center for Urban Science and Progress at New York University. His previous positions include professor of theoretical physics and provost at Caltech, as well as chief scientist of BP, where his work focused on renewable and low-carbon energy technologies.

But—here’s the catch—those questions are the hardest ones to answer. They challenge, in a fundamental way, what science can tell us about future climates.

Firstly, even though human influences could have serious consequences for the climate, they are physically small in relation to the climate system as a whole. For example, human additions to carbon dioxide in the atmosphere by the middle of the 21st century are expected to directly shift the atmosphere’s natural greenhouse effect by only 1% to 2%. Since the climate system is highly variable on its own, that smallness sets a very high bar for confidently projecting the consequences of human influences.

A second challenge to “knowing” future climate is today’s poor understanding of the oceans. The oceans, which change over decades and centuries, hold most of the climate’s heat and strongly influence the atmosphere. Unfortunately, precise, comprehensive observations of the oceans are available only for the past few decades; the reliable record is still far too short to adequately understand how the oceans will change and how that will affect climate.

A third fundamental challenge arises from feedbacks that can dramatically amplify or mute the climate’s response to human and natural influences. One important feedback, which is thought to approximately double the direct heating effect of carbon dioxide, involves water vapor, clouds and temperature.

Climate Science Is Not Settled

Another group questioning what some consider “settled science”:

  • Claude Allegre, former director of the Institute for the Study of the Earth, University of Paris;
  • J. Scott Armstrong, cofounder of the Journal of Forecasting and the International Journal of Forecasting;
  • Jan Breslow, head of the Laboratory of Biochemical Genetics and Metabolism, Rockefeller University;
  • Roger Cohen, fellow, American Physical Society;
  • Edward David, member, National Academy of Engineering and National Academy of Sciences;
  • William Happer, professor of physics, Princeton;
  • Michael Kelly, professor of technology, University of Cambridge, U.K.;
  • William Kininmonth, former head of climate research at the Australian Bureau of Meteorology;
  • Richard Lindzen, professor of atmospheric sciences, MIT;
  • James McGrath, professor of chemistry, Virginia Technical University;
  • Rodney Nichols, former president and CEO of the New York Academy of Sciences;
  • Burt Rutan, aerospace engineer, designer of Voyager and SpaceShipOne;
  • Harrison H. Schmitt, Apollo 17 astronaut and former U.S. senator;
  • Nir Shaviv, professor of astrophysics, Hebrew University, Jerusalem;
  • Henk Tennekes, former director, Royal Dutch Meteorological Service;
  • Antonio Zichichi, president of the World Federation of Scientists, Geneva.

Although the number of publicly dissenting scientists is growing, many young scientists furtively say that while they also have serious doubts about the global-warming message, they are afraid to speak up for fear of not being promoted—or worse. They have good reason to worry. In 2003, Dr. Chris de Freitas, the editor of the journal Climate Research, dared to publish a peer-reviewed article with the politically incorrect (but factually correct) conclusion that the recent warming is not unusual in the context of climate changes over the past thousand years. The international warming establishment quickly mounted a determined campaign to have Dr. de Freitas removed from his editorial job and fired from his university position. Fortunately, Dr. de Freitas was able to keep his university job.

This is not the way science is supposed to work, but we have seen it before—for example, in the frightening period when Trofim Lysenko hijacked biology in the Soviet Union. Soviet biologists who revealed that they believed in genes, which Lysenko maintained were a bourgeois fiction, were fired from their jobs. Many were sent to the gulag and some were condemned to death.

Why is there so much passion about global warming, and why has the issue become so vexing that the American Physical Society (APS), from which Dr. Giaever resigned a few months ago, refused the seemingly reasonable request by many of its members to remove the word “incontrovertible” from its description of a scientific issue?

There are several reasons, but a good place to start is the old question
“cui bono?” Or the modern update, “Follow the money.”


Happer: Cloud Radiation Matters, CO2 Not So Much (2025)

This month van Wijngaarden and Happer published a new paper Radiation Transport in Clouds.

Last year William Happer spoke on Radiation Transfer in Clouds at the EIKE conference, and the video is above.  For those preferring to read, below is a transcript from the closed captions along with some key exhibits.  I left out the most technical section in the latter part of the presentation. Text in italics with my bolds.

William Happer: Radiation Transfer in Clouds

People have been looking at clouds for a very long time in a quantitative way. This is one of the first quantitative studies, done about 1800, by John Leslie, a Scottish physicist who built this gadget. He called it an Aethrioscope, but basically it was designed to figure out how effective the sky was in causing frost. If you live in Scotland you worry about frost. It consisted of two glass bulbs with a very thin capillary attachment between them, and there was a little column of alcohol here.

The bulbs were full of air, so if one bulb got a little bit warmer it would force the alcohol up through the capillary; if it got colder it would suck the alcohol back. So he set this device out under the clear sky. He described that the sensibility of the instrument is very striking, for “the liquor incessantly falls and rises in the stem with every passing cloud. In fine weather the aethrioscope will seldom indicate a frigorific impression of less than 30 or more than 80 millesimal degrees.” He’s talking about how high this column of alcohol would go up and down. If the sky became overclouded, the reading (which refers to how much the sky cools) “may be reduced to as low as 15, or even five degrees when the congregated vapours hover over the hilly tracks.” We don’t speak English that way anymore, but I love it.

The point was that even in 1800 Leslie and his colleagues knew very well that clouds have an enormous effect on the cooling of the earth. And of course anyone who has a garden knows that if you have a clear calm night you’re likely to get Frost and lose your crops. So this was a quantitative study of that.

Now it’s important to remember that if you go out today the atmosphere is full of two types of radiation. There’s sunlight, which you can see, and then there is the thermal radiation that’s generated by greenhouse gases, by clouds and by the surface of the Earth. You can’t see thermal radiation, but you can feel it, if it’s intense enough, by its warming effect. And these curves practically don’t overlap, so we’re really dealing with two completely different types of radiation.

There’s sunlight, which scatters very nicely off not only clouds but molecules; that’s the blue sky, the Rayleigh scattering. Then there’s the thermal radiation, which actually doesn’t scatter at all on molecules, so greenhouse gases are very good at absorbing thermal radiation but they don’t scatter it. Clouds, however, do scatter thermal radiation. Plotted here is the probability of finding a photon of sunlight within a given interval on a logarithmic wavelength scale.

Since Leslie’s day two types of instruments have been developed to do what he did more precisely. One of them is called a pyranometer, and it is designed to measure sunlight coming down onto the Earth on a day like this. So you put this instrument out there and it reads the flux of sunlight coming down. It’s designed to see sunlight coming in every direction, so it doesn’t matter at which angle the sun is shining; it’s calibrated to see them all.

Let me show you a measurement by a pyranometer. This is actually a curve from a sales brochure of a company that will sell you one of these devices. It’s comparing two types of detectors, and as you can see they’re very good; you can hardly tell the difference. The point is that if you look on a clear day with no clouds, you see sunlight beginning to increase at dawn; it peaks at noon and goes back down to zero, and there’s no sunlight at night. So half of the day, over most of the Earth, there’s no sunlight in the atmosphere.

Here’s a day with clouds, just a few days later (the horizontal axis shows days of the year). You can see that every time a cloud goes by, the intensity hitting the ground goes down. With a little clear sky it goes up, then down, up and so on. On average, on this particular day you get a lot less sunlight than you did on the clear day.

But you know nature is surprising. Einstein had this wonderful quote: God is subtle but he’s not malicious. He meant that nature does all sorts of things you don’t expect, so let me show you what happens on a partly cloudy day. This is data taken near Munich. The blue curve is the measurement and the red curve is the intensity on the ground if there were no clouds. This is a partly cloudy day, and you can see there are brief periods when the sunlight is much brighter on the detector than it is on a clear day. That’s because coming through clouds you get focusing from the edges of the cloud pointing down toward your detector. That means somewhere else there’s less radiation reaching the ground. This is rather surprising to most people. I was very surprised to learn about it, but it just shows that the actual details of climate are a lot more subtle than you might think.

We know that visible light only happens during the daytime and stops at night. There’s a second type of important radiation, the thermal radiation, which is measured by a similar device, a pyrgeometer. You have a silicon window that passes infrared, which is below the band gap of silicon, so it passes through as though the window were transparent. Then there are some interference filters here to give further discrimination against sunlight. So sunlight practically doesn’t go through this at all, and they call it solar blind since it doesn’t see the Sun.

But it sees thermal radiation very clearly, with a big difference between this device and the sunlight-sensing device I showed you: most of the time this one is radiating up, not down. Out in the open air this detector normally gets colder than the body of the instrument, and so it’s carefully calibrated for you to compare the balance of downcoming radiation with the upgoing radiation. Upgoing is normally greater than downcoming.

I’ll show you some measurements of the downwelling flux; these were taken at Thule in Greenland, and the vertical axis is in watts per square meter. The first thing to notice is that the radiation continues day and night: if you look at the output of the pyrgeometer you can’t tell whether it’s day or night, because the atmosphere is just as bright at night as it is during the day. However, the big difference is clouds: on a cloudy day you get a lot more downwelling radiation than on a clear day. Here’s nearly a full day of clear weather, then another several days of clear weather. Then suddenly it gets cloudy and the radiation rises, because the bottoms of the clouds are relatively warm, at least compared to the clear sky. If you put the numbers in, this cloud bottom is around 5° Centigrade, so it was a fairly low cloud; it was summertime in Greenland. That compares to about minus 5° for the clear sky.
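A quick Stefan-Boltzmann sketch (my illustration, not from the talk) shows why the temperatures Happer quotes translate into a sizable difference in downwelling flux. The effective emission temperatures here are just the +5°C cloud-bottom and −5°C clear-sky figures from the paragraph above, treated as blackbodies for simplicity:

```python
# Rough Stefan-Boltzmann estimate of downwelling thermal flux, using the
# effective emission temperatures quoted in the talk (a blackbody
# simplification; real downwelling spectra are not blackbody curves).
SIGMA = 5.670e-8  # Stefan-Boltzmann constant, W m^-2 K^-4

def blackbody_flux(t_celsius):
    """Flux in W/m^2 from a blackbody at the given Celsius temperature."""
    t_kelvin = t_celsius + 273.15
    return SIGMA * t_kelvin ** 4

cloudy = blackbody_flux(5.0)    # low cloud base near +5 C
clear = blackbody_flux(-5.0)    # effective clear-sky temperature near -5 C

print(f"cloudy sky: {cloudy:.0f} W/m^2")
print(f"clear sky:  {clear:.0f} W/m^2")
print(f"difference: {cloudy - clear:.0f} W/m^2")
```

The gap of a few tens of W/m² is the kind of step visible in the Thule record when the cloud deck arrives.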

So there’s a lot of data out there, and there really is downwelling radiation, no question about that; you measure it routinely. And now you can do the same thing looking down from satellites. This is a picture that I downloaded from Princeton a few weeks ago to get ready for this talk, taken at 6 PM, so it was already dark in Europe. It is a picture of the Earth from a geosynchronous satellite parked over Ecuador. You are looking down on the Western Hemisphere, and this is a filtered image of the Earth in blue light at 0.47 micrometers. So it’s a nice blue color, not so different from the sky, and it’s dark where the sun has set. There’s still a fair amount of sunlight over the United States and further west.

Here, at exactly the same time and from the same satellite, is the infrared radiation coming up at 10.3 micrometers, which is right in the middle of the infrared window where there’s not much greenhouse-gas absorption; there’s a little bit from water vapor, but very little, and a trivial amount from CO2.

As you can see, you can’t tell which side is night and which side is day. So even though the sun has set over here, it is still glowing nice and bright. There’s sort of a pesky difference here, because in the visible image what you’re looking at over the Intertropical Convergence Zone is reflected sunlight. There are lots of high clouds that have been pushed up by the convection in the tropics, and so this means more visible light here. In the infrared image you’re looking at emission from the cloud tops, so this is less thermal light: white here means less light, while white there means more light, so you have to calibrate your thinking.

But the striking thing about all of this: you can see the Earth is covered with clouds; you have to look hard to find a clear spot of the earth. Roughly half of the earth maybe is clear at any given time, but most of it is covered with clouds. So if anything governs the climate it is clouds, and that’s one of the reasons I admire so much the work that Svensmark and Nir Shaviv have done. Because they’re focusing on the most important mechanism of the earth: it’s not greenhouse gases, it’s clouds. You can see that here.

Now this is a single frequency; let me show you what happens if you look down from a satellite at the whole spectrum. This is the spectrum of light coming up over the Sahara Desert measured from a satellite. Here is the infrared window; there’s the 10.3 micrometers I mentioned in the previous slide, a clear region. So radiation in this region can get up from the surface of the Sahara right to outer space.

Notice that the units on these scales are very different; over the Sahara the top unit is 200, it is 150 over the Mediterranean, and only 60 over the South Pole. But at least the Mediterranean and the Sahara are roughly similar. The three curves on the right are observations from satellites, and the three curves on the left are calculations, modeling that we’ve done. The point here is that you can hardly tell the difference between a model calculation and the observed radiation.

So it’s really straightforward to calculate radiation transfer. If someone quotes you a number in watts per square meter you should take it seriously; that’s probably a good number. If they tell you a temperature, you don’t know what to make of it, because there’s a big step in going from watts per square meter to a temperature change. All the mischief in the whole climate business is in going from watts per square meter to Centigrade or Kelvin.

Now I will say just a few words about clear sky, because that is the simplest; then we’ll get on to clouds, the topic of this talk. This is a calculation with the same codes that I showed you in the previous slide, which as you saw work very well. It’s worth spending a little time here because this is the famous Planck curve that was the birth of quantum mechanics. There is Max Planck, who figured out what the formula for that curve is and why it is that way. This is what the Earth would radiate at 15° Centigrade if there were no greenhouse gases: you would get this beautiful smooth curve, the Planck curve. If you actually look at the Earth from the satellites you get the raggedy, jaggedy black curve. We like to call that the Schwarzschild curve, because Karl Schwarzschild was the person who showed how to do that calculation. Tragically he died during World War I, a big loss to science.
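As an aside, the Planck curve for a 15°C surface is easy to compute directly. This sketch (mine, not part of the talk) scans the thermal infrared for the curve’s peak, which lands close to the 10.3-micrometer window mentioned earlier:

```python
import math

H = 6.626e-34   # Planck constant, J s
C = 2.998e8     # speed of light, m/s
KB = 1.381e-23  # Boltzmann constant, J/K

def planck_wavelength(lam_m, t_kelvin):
    """Spectral radiance B(lambda, T), W m^-2 sr^-1 m^-1 (Planck's law)."""
    return (2 * H * C**2 / lam_m**5) / (
        math.exp(H * C / (lam_m * KB * t_kelvin)) - 1)

T = 288.15  # 15 C, the surface temperature used in the talk
# Scan the thermal infrared on a 0.1-micrometer grid to find the peak.
wavelengths_um = [w / 10 for w in range(30, 500)]  # 3 to 50 micrometers
peak_um = max(wavelengths_um, key=lambda w: planck_wavelength(w * 1e-6, T))
print(f"Planck curve peaks near {peak_um:.1f} micrometers")
```

Wien’s displacement law gives the same answer analytically: 2898 μm·K / 288 K ≈ 10 μm, right in the middle of the infrared window.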

There are two colored curves to which I want to draw your attention. The green curve is what Earth would radiate to space if you took away all the CO2, so it only differs from the black curve in the CO2 band here; this is the bending band of CO2, which is the main greenhouse effect of CO2. There’s a little additional effect here, the asymmetric stretch, but it doesn’t contribute very much. Then here is a red curve, and that’s what happens if you double CO2.

So notice the huge asymmetry. Taking all 400 parts per million of CO2 away from the atmosphere causes this enormous change of 30 watts per square meter, the difference between the green 307 and the black 277. But if you double CO2 you practically don’t make any change. This is the famous saturation of CO2. At the levels we have now, doubling CO2, a 100% increase, only changes the radiation to space by 3 watts per square meter, the difference between 274 for the red curve and 277 for the curve for today. So it’s a tiny amount: for a 100% increase in CO2, a 1% decrease of radiation to space.

That allows you to estimate the feedback-free climate sensitivity in your head. I’ll talk you through it. Doubling CO2 is a 1% decrease of radiation to space. If that happens, the Earth will start to warm up, but it radiates as the fourth power of the temperature. So the temperature starts to rise, but because of that fourth power, the temperature only has to rise by one-quarter of a percent in absolute temperature. A 1% forcing in watts per square meter becomes a one-quarter percent change of temperature in Kelvin. Since the ambient temperature is about 300 Kelvin (actually a little less), a quarter of one percent of that is 0.75 Kelvin. So the feedback-free equilibrium climate sensitivity is less than 1 degree: it’s 0.75 Centigrade. It’s a number you can do in your head.
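The in-your-head estimate can be written out in a few lines (my paraphrase of the arithmetic in the talk):

```python
# Happer's back-of-the-envelope, feedback-free sensitivity estimate.
# Flux to space scales as T^4 (Stefan-Boltzmann), so a fractional flux
# change dF/F forces a fractional temperature change dT/T = (dF/F) / 4.
flux_change_fraction = 0.01   # ~1% less radiation to space for doubled CO2
ambient_kelvin = 300.0        # the round number used in the talk

delta_t = ambient_kelvin * flux_change_fraction / 4
print(f"feedback-free sensitivity: {delta_t:.2f} K per CO2 doubling")
```

The result is the 0.75 Kelvin quoted above.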

So when you hear about 3 Centigrade instead of 0.75 C, that’s a factor of four, all of which is positive feedback. Is there really that much positive feedback? Most feedbacks in nature are negative; the famous Le Chatelier principle says that if you perturb a system, it reacts in a way that dampens the perturbation, not increases it. There are a few positive-feedback systems that we’re familiar with; for example, high explosives have positive feedback. So if the earth’s climate were like other positive-feedback systems, all of which are explosive, it would have exploded a long time ago. But the climate has never done that, so the empirical observational evidence from geology is that the climate feedback, like most others, is probably negative. I leave that thought with you, and let me stress again:
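One conventional way to express that factor of four (my framing; the talk does not use this formula) is the linear feedback relation dT = dT0 / (1 - f):

```python
# If models report ~3 C while the no-feedback value is ~0.75 C, the implied
# gain is 4. In the standard linear-feedback form dT = dT0 / (1 - f),
# that corresponds to a net feedback fraction f = 0.75.
dt0 = 0.75      # no-feedback sensitivity, C
dt_model = 3.0  # commonly advanced model sensitivity, C

gain = dt_model / dt0
f = 1 - dt0 / dt_model
print(f"gain = {gain:.1f}, implied feedback fraction f = {f:.2f}")
```

A net f of 0.75 is a very strong positive feedback, which is precisely what the Le Chatelier argument above calls into question.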

This is clear skies, no clouds; if you add clouds, all this does is
suppress the effects of changes of the greenhouse gases.

So now let’s talk about clouds and the theory of clouds, since we’ve already seen clouds are very important. Here is the formidable equation of transfer, which has been around since Schwarzschild’s day. Some of the symbols here relate to the intensity; another represents scattering. If thermal radiation encounters a greenhouse gas molecule, it comes in and is immediately absorbed; there’s no scattering at all. If it hits a cloud particle it will scatter this way or that way, or maybe even backwards.
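The equation of transfer Happer refers to can be written, in one standard textbook form (my reconstruction; the slide’s notation may differ), as:

```latex
\mu \frac{dI_\nu(\tau,\mu)}{d\tau}
  = I_\nu(\tau,\mu)
  - \frac{\tilde\omega}{2}\int_{-1}^{1} p(\mu,\mu')\, I_\nu(\tau,\mu')\, d\mu'
  - (1-\tilde\omega)\, B_\nu(T)
```

Here $I_\nu$ is the intensity, $\tau$ the optical depth, $\tilde\omega$ the single-scattering albedo, $p(\mu,\mu')$ the scattering phase function, and $B_\nu(T)$ the Planck function. Setting $\tilde\omega = 0$ recovers the pure-absorption greenhouse-gas case with no scattering.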

So all of that is described by this integral: you’ve got incoming light in one direction and outgoing light in a second direction. At the same time you’ve got thermal emission, so the warm particles of the cloud are emitting radiation, creating photons which come out and increase the Earth glow. This is represented by two parameters. Even a single cloud particle has an albedo: the fraction of radiation hitting the particle that is scattered, as opposed to absorbed and converted to heat. It’s a very important parameter; for visible light and white clouds, typically 99% of the encounters are scattered, but for thermal radiation it’s much less. Water scatters thermal radiation only half as efficiently as shorter wavelengths.

The big problem is that, in spite of all the billions of dollars we have spent, these things are still not well known, and they would have been known if there hadn’t been this crazy fixation on carbon dioxide and greenhouse gases. We’ve neglected working on the areas that are really important, as opposed to the trivial effects of greenhouse gases. Attenuation in a cloud is both scattering and absorption. And of course you have to solve these equations for every different frequency of the light, because, especially for molecules, there’s a strong frequency dependence.

In summary, let me show you this photo, which was taken by Harrison Schmitt, who was a friend of mine, on the Apollo 17 moonshot. It was taken in December, and looking at it you can see that they were south of Madagascar when the photograph was taken. You can see it was winter, because the Intertropical Convergence Zone is quite a bit south of the Equator; it has moved way south of India and Saudi Arabia. By good luck they had the sun behind them, so they had the whole earth irradiated.

There’s a lot of information there, and again let me draw your attention to how much of the Earth is covered with clouds. Only the clear parts of the Earth, of the order of half, can actually be directly affected by greenhouse gases. The takeaway message is that clouds and water vapor are much more important than greenhouse gases for earth’s climate. The second point is the reason they’re much more important: doubling CO2, as I indicated in the middle of the talk, only causes a 1% difference of radiation to space. It is a very tiny effect because of saturation. People like to say that’s not so, but you can’t really argue that one; even the IPCC gets the same numbers that we do.

And you also know that covering half of the sky with clouds will decrease solar heating by 50%. So for clouds it’s one to one; for greenhouse gases it’s a hundred to one. If you really want to affect the climate, you want to do something to the clouds. If you are alarmed about the warming that has happened, you will have a very hard time making any difference with Net Zero CO2.

So one would hope that with all the money that we’ve spent trying to turn CO2 into a demon that some good science has come out of it. From my point of view this is a small part of it, this scattering theory that I think will be here a long time after the craze over greenhouse gases has gone away. I hope there will be other things too. You can point to the better instrumentation that we’ve got, satellite instrumentation as well as ground instrumentation. So that’s been a good investment of money. But the money we’ve spent on supercomputers and modeling has been completely wasted in my view.

Lacking Data, Climate Models Rely on Guesses

A recent question was posed on Quora: Say there are merely 15 variables involved in predicting global climate change. Assume climatologists have mastered each variable to a near perfect accuracy of 95%. How accurate would a climate model built on this simplified system be?  Keith Minor has a PhD in organic chemistry and a PhD in geology & paleontology from The University of Texas at Austin. He responded with the text posted below in italics with my bolds and added images.

I like the answers to this question, and Matthew stole my thunder on the climate models not being statistical models. If we take the question and its assumptions at face value, one unsolvable overriding problem, and a limit to developing an accurate climate model that is rarely ever addressed, is the sampling issue. Knowing 15 parameters to 99+% accuracy won’t solve this problem.

The modeling of the atmosphere is a boundary condition problem. No, I’m not talking about frontal boundaries. Thermodynamic systems are boundary condition problems, meaning that the evolution of a thermodynamic system is dependent not only on the conditions at t > 0 (is the system under adiabatic conditions, isothermal conditions, do these conditions change during the process, etc.?), but also on the initial conditions at t = 0 (sec, whatever). Knowing almost nothing about what even a fraction of a fraction of the molecules in the atmosphere are doing at t = 0 or at t > 0 is a huge problem to accurately predicting what the atmosphere will do in the near or far future. [See footnote at end on this issue.]

Edward Lorenz attempted to model the thermodynamic behavior of the atmosphere by using models that took into account twelve variables (instead of fifteen as posed by the questioner), and found (not surprisingly) that there was a large variability in the models. Seemingly inconsequential perturbations would lead to drastically different results, which diverged (euphemism for “got even worse”) the longer out in time the models were run (they still do). This presumably is the origin of Lorenz’s phrase “the butterfly effect”. He probably meant it to be taken more as an instructive hypothetical rather than a literal effect, as it is too often taken today. He was merely illustrating the sensitivity of the system to the values of the parameters, and not equating it to the probability of outcomes, chaos theory, etc., which is how the term has come to be known. This divergence over time is bad for climate models, which try to predict the climate decades from now. Just look at the divergence of hurricane “spaghetti” models, which operate on a multiple-week scale.
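The sensitivity Lorenz found is easy to reproduce. This minimal sketch uses his later three-variable system (Lorenz-63) rather than the twelve-variable model mentioned above, with a crude forward-Euler integrator; the parameter values are the standard textbook ones, not anything from Dr. Minor’s answer:

```python
# Lorenz-63: two trajectories differing by one part in 10^8 decorrelate
# completely within a few dozen time units, illustrating sensitive
# dependence on initial conditions.

def lorenz_step(x, y, z, dt=0.01, sigma=10.0, rho=28.0, beta=8.0 / 3.0):
    """One forward-Euler step of the Lorenz-63 equations."""
    return (x + dt * sigma * (y - x),
            y + dt * (x * (rho - z) - y),
            z + dt * (x * y - beta * z))

a = (1.0, 1.0, 1.0)
b = (1.0 + 1e-8, 1.0, 1.0)   # perturbed by one part in 10^8

max_sep = 0.0
for _ in range(3000):        # 30 time units
    a = lorenz_step(*a)
    b = lorenz_step(*b)
    sep = max(abs(p - q) for p, q in zip(a, b))
    max_sep = max(max_sep, sep)

print(f"largest separation over the run: {max_sep:.2f}")
```

The tiny perturbation grows by many orders of magnitude until the two trajectories bear no resemblance to each other, even though both remain on the same bounded attractor, which is exactly the divergence problem described above for long-range climate models.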

The sources of variability include:

♦  the inability of the models to handle water (the most important greenhouse gas in the atmosphere, not CO2) and processes related to it;
♦  e.g., models still can’t handle the formation and non-formation of clouds;
♦  the non-linearity of thermodynamic properties of matter (which seem to be an afterthought, especially in popular discussions regarding the roles that CO2 plays in the atmosphere and biosphere), and
♦  the always-present sampling problem.

While in theory it is possible to know what a statistically significant number of the air and water molecules are doing at any point in time (that would be a lot of atoms and molecules!), a statistically significant sample of air molecules is certainly not being sampled by releasing balloons twice a day from 90-some-odd weather stations in the US and territories, plus the data from commercial aircraft, plus all of the weather data from around the world. Doubling this number wouldn’t help, i.e., it wouldn’t make any difference, though there are some blind spots, such as northeast Texas, that might benefit from having a radar in the area. So you have to weigh the cost of sampling more of the atmosphere versus the 0% increase in forecasting accuracy (within experimental error) that you would get by doing so.

I’ll go out on a limb and say that the NWS (National Weather Service) is actually doing a pretty good job in its 5-day forecasts with the current data and technologies that it has (e.g., S-band radar), and the local meteorologists use their years of experience and judgment to refine the forecasts to their viewing areas. The old joke is that a meteorologist’s job is the one job where you can be wrong more than half the time and still keep your job, but everyone knows that they go to work most, if not all, days with one hand tied behind their back, and sometimes two! The forecasts are not that far off on average, and so meteorologists get my unconditional respect.

In spite of these daunting challenges, there are certainly a number of areas in weather forecasting that can be improved by increased sampling, especially on a local scale. For example, for severe weather outbreaks, the CASA project is being implemented using multiple, shorter range radars that can get multiple scan directions on nearby severe-warned cells simultaneously. This resolves the problem caused by the curvature of the Earth as well as other problems associated with detecting storm-scale features tens or hundreds of miles away from the radar. So high winds, hail, and tornadoes are weather events where increasing the local sampling density/rate might help improve both the models and forecasts.

Prof. Wurman at OU has been doing this for decades with his pioneering work with mobile radar (the so-called “DOW’s”). Let’s not leave out the other researchers who have also been doing this for decades. The strategy of collecting data on a storm from multiple directions at short distances, coupled with supercomputer capabilities, has been paying off for a number of years. As a recent example, Prof. Orf at UW Madison, with his simulation of the May 24th, 2011 El Reno, OK tornado (you’ve probably seen it on the Internet), has shed light on some of the “black box” aspects to how tornadoes form. [Video below is Leigh Orf 1.5 min segment for 2018 Blue Waters Symposium plenary session. This segment summarizes, in 90 seconds, some of the team’s accomplishments on the Blue Waters supercomputer over the past five years.]

Prof. Orf’s simulation is just that, and the resolution is ~10 m (~33 feet), but it illustrates how increased targeted sampling can be effective in at least understanding the complex thermodynamic processes occurring within a storm. Colleagues have argued that the small improvements in warning times in the last couple of decades are really due more to the heavy spotter presence these days than to case studies of severe storms. That may be true. However, in test cases of the CASA system, it picked out the subtle boundaries along which the storms fired that went unnoticed with the current network of radars. So I’m optimistic about increased targeted sampling for use in an early warning system.

These two examples bring up a related problem: too much data! As commented on by a local meteorologist at a TESSA meeting, one of the issues with CASA that will have to be resolved is how to handle/process the tremendous amounts of data that will be generated during a severe weather outbreak. This is different from a research project where you can take your data back to the “lab”. In a real-time system, such as CASA, you need to have the ability to process the volumes of data rapidly so a meteorologist can quickly make a decision and get that life-saving info to the public. This data volume issue may be less of a problem for those using the data to develop climate models.

So back to the Quora question: with regard to a cost-effective (cost-effective being the operative term) climate model or models (say an ensemble model) that would “verify” say 50 years from now, the sampling issue is ever present, and likely cost-prohibitive at the level needed to make the sampling statistically significant. And will the climatologist be around in 50 years to be “hoist with their own petard” when the climate model is proven wrong? The absence of accountability is the other problem with these long-range models into which many put so much faith.

But don’t stop using or trying to develop better climate models. Just be aware of what variables they include, how well they handle the parameters, and what their limitations are. How accurate would a climate model built on this simplified system [edit: of 15 well-defined variables (to 95% confidence level)] be? Not very!

My Comment

As Dr. Minor explains, powerful modern computers can process detailed observation data to simulate and forecast storm activity.  There are more such tools for preparing and adapting to extreme weather events which are normal in our climate system and beyond our control.  He also explains why long-range global climate models presently have major limitations for use by policymakers.

Footnote Regarding Initial Conditions Problem

What About the Double Pendulum?

Trajectories of a double pendulum

A comment by tom0mason alerted me to the science demonstrated by the double compound pendulum, that is, a second pendulum attached to the bob of the first one. It consists entirely of two simple objects functioning as pendulums, only now each is influenced by the behavior of the other.

Lo and behold, a double pendulum in motion produces chaotic behavior. Remarkably, the equations of motion are known exactly and can, in principle, predict the positions of the two bobs over time. So the movement is deterministic rather than random, yet it is chaotic: tiny differences in the starting conditions grow rapidly, so the trajectory can be determined only with considerable effort. The equations and descriptions are at Wikipedia: Double Pendulum

Long exposure of double pendulum exhibiting chaotic motion (tracked with an LED)

But here is the kicker, as described in tomomason’s comment:

If you arrive to observe the double pendulum at an arbitrary time after the motion has started from an unknown condition (unknown height, initial force, etc) you will be very taxed mathematically to predict where in space the pendulum will move to next, on a second to second basis. Indeed it would take considerable time and many iterative calculations (preferably on a super-computer) to be able to perform this feat. And all this on a very basic system of known elementary mechanics.
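The divergence tomomason describes can be demonstrated numerically. Below is a minimal sketch, assuming unit masses and arm lengths and the standard equations of motion for a frictionless double pendulum, that integrates two runs differing by one hundred-millionth of a radian and prints how far apart they end up:

```python
import numpy as np

# Double pendulum (equal masses M, equal arm lengths L_ARM) integrated with RK4.
# Demonstrates sensitive dependence on initial conditions, the hallmark of chaos.
G, L_ARM, M = 9.81, 1.0, 1.0

def deriv(s):
    """Time derivative of state s = [th1, th2, w1, w2] (standard equations of motion)."""
    th1, th2, w1, w2 = s
    d = th1 - th2
    den = 2 * M + M - M * np.cos(2 * d)
    dw1 = (-G * (2 * M + M) * np.sin(th1)
           - M * G * np.sin(th1 - 2 * th2)
           - 2 * np.sin(d) * M * (w2**2 * L_ARM + w1**2 * L_ARM * np.cos(d))
           ) / (L_ARM * den)
    dw2 = (2 * np.sin(d) * (w1**2 * L_ARM * (M + M)
           + G * (M + M) * np.cos(th1)
           + w2**2 * L_ARM * M * np.cos(d))) / (L_ARM * den)
    return np.array([w1, w2, dw1, dw2])

def rk4(s, dt, steps):
    """Classic 4th-order Runge-Kutta integration of the state."""
    for _ in range(steps):
        k1 = deriv(s)
        k2 = deriv(s + 0.5 * dt * k1)
        k3 = deriv(s + 0.5 * dt * k2)
        k4 = deriv(s + dt * k3)
        s = s + (dt / 6.0) * (k1 + 2 * k2 + 2 * k3 + k4)
    return s

# Two starts differing by one hundred-millionth of a radian in the first angle.
a = np.array([2.0, 2.0, 0.0, 0.0])
b = a + np.array([1e-8, 0.0, 0.0, 0.0])
fa = rk4(a, 1e-3, 20000)   # 20 seconds of motion
fb = rk4(b, 1e-3, 20000)
sep = np.linalg.norm(fa - fb)
print(f"initial separation: 1e-08, after 20 s: {sep:.3e}")
```

Running it shows the two trajectories separating rapidly, which is exactly why predicting the pendulum from an imperfectly known starting state is so taxing.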

Our Chaotic Climate System

 

 

2024 Natural Climate Factors: Snow

Previously I posted an explanation by Dr. Judah Cohen regarding a correlation between autumn Siberian snow cover and the following winter conditions, not only in the Arctic but extending across the Northern Hemisphere. More recently, in looking into Climate Model Upgraded: INMCM5, I noticed some of the scientists were also involved in confirming the importance of snow cover for climate forecasting. Since the poles function as the primary vents for global cooling, what happens in the Arctic in no way stays in the Arctic. This post explores data suggesting changes in snow cover drive some climate changes.

The Snow Cover Climate Factor

The diagram represents how Dr. Judah Cohen pictures the Northern Hemisphere wintertime climate system.  He leads research regarding Arctic and NH weather patterns for AER.


Dr. Cohen explains the mechanism in this diagram.

Conceptual model for how fall snow cover modifies winter circulation in both the stratosphere and the troposphere–The case for low snow cover on left; the case for extensive snow cover on right.

1. Snow cover increases rapidly in the fall across Siberia; when snow cover is above normal, diabatic cooling helps to:
2. Strengthen the Siberian high, leading to below-normal temperatures.
3. Snow-forced diabatic cooling in proximity to the high topography of Asia increases the upward flux of energy in the troposphere, which is absorbed in the stratosphere.
4. Strong convergence of WAF (Wave Activity Flux) indicates higher geopotential heights.
5. A weakened polar vortex and warming propagate down from the stratosphere into the troposphere, all the way to the surface.
6. The dynamic pathway culminates with a strong negative phase of the Arctic Oscillation at the surface.

From Eurasian Snow Cover Variability and Links with Stratosphere-Troposphere
Coupling and Their Potential Use in Seasonal to Decadal Climate Predictions by Judah Cohen.

Observations of the Snow Climate Factor

The animation at the top shows, via remote sensing, that Eurasian snow cover fluctuates significantly from year to year, taking the end of October as a key indicator.

For more than five decades the IMS snow cover images have been digitized to produce a numerical database for NH snow cover, including area extents for Eurasia. The NOAA climate data record of Northern Hemisphere snow cover extent, Version 1, is archived and distributed by NCDC’s satellite Climate Data Record Program. The CDR is forward processed operationally every month, along with figures and tables made available at Rutgers University Global Snow Lab.

This first graph shows the snow extents of interest in Dr. Cohen’s paradigm. The Autumn snow area in Siberia is represented by the annual Eurasian averages of the months of October and November (ON). The following NH Winter is shown as the average snow area for December, January and February (DJF). Thus the year designates the December of that year plus the first two months of the next year.
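The ON/DJF labeling convention just described can be sketched as a small helper; the snow-extent numbers below are hypothetical, purely to show the year bookkeeping:

```python
# Autumn index = Eurasian October+November (ON) mean of a given year.
# "Winter" labeled with year Y = mean of Dec(Y), Jan(Y+1), Feb(Y+1).
# Toy snow-extent values (million km^2) are made up for illustration.

def autumn_on(monthly, year):
    """Mean of October and November of the given year."""
    return (monthly[(year, 10)] + monthly[(year, 11)]) / 2.0

def winter_djf(monthly, year):
    """Mean of Dec(year), Jan(year+1), Feb(year+1), labeled by December's year."""
    return (monthly[(year, 12)] + monthly[(year + 1, 1)] + monthly[(year + 1, 2)]) / 3.0

toy = {(2015, 10): 12.0, (2015, 11): 18.0, (2015, 12): 24.0,
       (2016, 1): 27.0, (2016, 2): 27.0}
print(autumn_on(toy, 2015))   # ON 2015
print(winter_djf(toy, 2015))  # DJF "2015" = Dec 2015 + Jan/Feb 2016
```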

Notes: NH snow cover minimum was 1981, trending upward since.  Siberian autumn snow cover was lowest in 1989, increasing since then.  Autumn Eurasian snow cover is about 1/3 of Winter NH snow area. Note also that fluctuations are sizable and correlated.

The second graph presents annual anomalies for the two series, each calculated as the deviation from the mean of its entire time series. Strikingly, the Eurasian autumn fluctuations are on the same scale as the total NH fluctuations, and closely aligned. While NH snow cover declined for a few years prior to 2016, Eurasian snow has trended upward since. If Dr. Cohen is correct, NH snowfall will follow. The linear trend is slightly positive, suggesting that fears of children never seeing snowfall have been exaggerated. The Eurasian trend line (not shown) is almost the same.
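The anomaly calculation used for the second graph, deviation of each annual value from its own series mean, can be sketched as follows; the extents here are made-up placeholders, not the Rutgers data:

```python
import numpy as np

# Anomaly = each value minus the mean of its own full series.
def anomalies(series):
    s = np.asarray(series, dtype=float)
    return s - s.mean()

nh_winter = [45.2, 46.8, 44.9, 47.5, 46.1]      # hypothetical NH DJF extents
eurasia_autumn = [8.1, 9.7, 7.9, 10.2, 9.1]     # hypothetical Eurasian ON extents

a1, a2 = anomalies(nh_winter), anomalies(eurasia_autumn)
# By construction each anomaly series sums to zero, so two series of very
# different absolute size can be compared on a common scale.
print(np.round(a1, 2), np.round(a2, 2))
```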

Illustration by Eleanor Lutz shows Earth’s seasonal climate changes. If played in full screen, the four corners present views from top, bottom and sides. It is a visual representation of scientific datasets measuring ice and snow extents.

 

 

Global Warming Big Lie–Skip the Distractions

The notion that CO2 from human activities causes global warming has multiple flaws, many of which have been dissected and rebutted here and elsewhere.  But The Big Lie is to fundamentally misrepresent how Earth’s climate system works. Richard Lindzen explains in the above interview with Jordan Peterson.  For those who prefer reading I provide a transcript from the closed captions in italics with my bolds and added images.

JP: When you started to object to the narrative, back say in ‘92, To what narrative were you objecting and on what grounds were you objecting?

RL: You’re touching on something that took me a while to understand. You know Goebbels famously said: If you tell a big enough lie and repeat it often enough, it’ll become the truth. There’s been a lot of that in this. But there are aspects of establishing the narrative, that is, what makes something the truth, that I hadn’t appreciated.

So the narrative was: the climate is determined by a greenhouse effect, and adding CO2 to it increases warming. Moreover, besides CO2 the natural greenhouse substances (water vapor, clouds, upper-level clouds) will amplify whatever man does.

Now that immediately goes against Le Chatelier’s principle which says: If you perturb a system and it is capable internally of counteracting that, it will. And our system is so capable.

So that was a little bit odd. You began wondering, where did these feedbacks come from? Immediately people including myself started looking into the feedbacks, and seeing whether there were any negative ones, and how did it all work?

But underlying it, and this is what I learned: if you want to get a narrative established, the crucial thing is to pepper it with errors, questionable things. So that the critics will seize on those and not question the basic narrative.

The basic narrative was that climate is controlled by the greenhouse effect. In point of fact the earth’s climate system has many regions, but two distinctly different ones: the tropics, roughly minus 30 to plus 30 degrees latitude, and the extratropics, outside of plus or minus 30 degrees.

They have very different dynamics, and this is the crucial thing for the Earth by the way. And this is a technicality and much harder to convey than saying that greenhouse gases are a blanket or that 97 percent of scientists agree.

This is actually a technical issue. The Earth rotates. Now people are aware that we have day and night, but there is something called the Coriolis effect. When you’re on a rotating system it gives rise to the appearance of forces that change the winds relative to the rotation. At the pole the rotation vector is perpendicular to the surface, while at the equator it is parallel to the surface, so its vertical component is zero.

And this gives you phenomenally different dynamics. Where you don’t have a vertical component to the rotation vector, motions do what they do in the laboratory at small scales: if you have a temperature difference, it acts to wipe it out.
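Lindzen’s point about the vanishing vertical rotation component can be put in numbers via the Coriolis parameter f = 2Ω sin(latitude), a standard result; the sketch below just evaluates it at a few latitudes:

```python
import math

# Coriolis parameter f = 2 * Omega * sin(latitude): the vertical component of
# Earth's rotation vector. It vanishes at the equator and peaks at the poles.
OMEGA = 7.2921e-5  # Earth's rotation rate, rad/s

def coriolis_parameter(lat_deg):
    return 2.0 * OMEGA * math.sin(math.radians(lat_deg))

for lat in (0, 30, 45, 90):
    print(f"latitude {lat:2d} deg: f = {coriolis_parameter(lat):.3e} 1/s")
```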

Figure 11. Most sunlight is absorbed in the tropics, and some of the heat energy is carried by air currents to the polar regions to be released back into space as thermal radiation. Along with energy, angular momentum — imparted to the air from the rotating Earth’s surface near the equator — is transported to higher northern and southern latitudes, where it is reabsorbed by the Earth’s surface. The Hadley circulation near the equator is largely driven by buoyant forces on warm, solar-heated air, but for mid latitudes the “Coriolis force” due to the rotation of the earth leads to transport of energy and angular momentum through slanted “baroclinic eddies.” Among other consequences of the conservation of angular momentum are the easterly trade winds near the equator and the westerly winds at mid latitudes.

And so if you look at the tropics, the temperatures at any surface are relatively flat: they don’t vary much with latitude. On the other hand, if you go to the mid-latitudes, the extratropics, the temperature varies a lot between the tropics and the pole. We all know that temperatures are cold at high latitudes. And if you look at changes in climate in the Earth’s history, what they show is a tropics that stays relatively constant, while what changes is the temperature difference between the tropics and the pole.

During the Ice Age that difference was about 60 degrees Centigrade; today it’s about 40. About 50 million years ago, during something called the Eocene, the difference was about 20. So that’s all a function of what’s going on outside the tropics. Within the tropics the greenhouse effect is significant, but what determines the temperature difference between the tropics and the pole has very little to do with the greenhouse effect.

It is a dynamic phenomenon based on the fact that a temperature difference with latitude generates instabilities. These instabilities take the form of the cyclonic and anticyclonic patterns that you see on a weather map. You can see the tropics are very different even from a casual look at a weather map. The systems that bring us weather travel from west to east at latitudes outside the tropics; within the tropics they travel from east to west. The prevailing winds are opposite in the two regions.

Sometimes people say that changes due to the greenhouse effect are amplified at the poles. That is not true: there is no physical basis for that statement. All the tropics do is set the starting point for the temperature changes in mid-latitudes, and those are determined mainly by hydrodynamics.

Okay, that’s complicated to explain to someone, and yet it’s the basis for claims of seemingly large significance for these small numbers. They’re saying that if global mean temperature goes up one and a half degrees, it’s the end. That’s based on the change getting much bigger at high latitudes. But all one and a half degrees at the equator, in the greenhouse part of the Earth, would do is change the temperature everywhere by one and a half degrees, which for most of us is less than the temperature change between breakfast and lunch.

See Also

Arctic “Amplification” Not What You Think

About Meridional Cooling and Climate Change

CO2 Not a Threat, But Greatly Benefits

 

Beware false and misleading cartoons.

First, a plain language scientific explanation is in this article: THE BENEFITS OF CO2 – PART 2: CO2 does not cause global warming.  By Teri Ciccone.  Excerpts in italics with my bolds.


Abstract: Climate alarmists, the press/media, and politicians say we must achieve net zero CO2 by 2050 to avoid catastrophic global warming. In reality, all humanity and all life on Earth benefit from the increased atmospheric CO2 and the slight temperature increase. In Part 1, we presented scientific arguments to show how increased CO2 and warmer temperatures benefit humanity and all life on Earth. In Part 2, we discuss why a growing number of independent scientists and engineers argue that CO2 does not cause any measurable global warming. Part 3 will present how and why increased CO2 does not cause extreme climate/weather conditions. In the concluding Part 4 we present recommendations for policymakers.

Introduction: In this study, we set out facts and scientific principles consistent with established laws of physics, chemistry, and thermodynamics to show that greenhouse gases (GHGs) and the greenhouse effect (GHE) do not measurably warm the Earth. The concept that they do is at best a 100-year-old myth. The absorption and re-emission of longwave infrared radiation from the Earth’s surface does not cause any measurable warming of the Earth.

This raises the particular concern that the UN IPCC and NASA/NOAA do not address all of the sources of heat that warm the Earth, nor consider the many natural forces and cycles that cause weather and climate variations operating at the Astronomical, Endo-Earth, and Bio-Earth levels. Their well-published and promoted Earth Energy Budget needs serious revision or should be discarded.

Two main concepts describe atmospheric physics and thermodynamics, and each plays a vital role in understanding how the Sun warms the Earth and how the Earth cools. The first is the Radiative Transfer Concept (RTC) and the second is the Heat Transport Concept (HTC). Both are important and both need to be studied to fully understand our weather and climate systems. RTC requires an understanding of quantum physics, radiation, photons, absorption/emission, etc. HTC requires an understanding of the more mundane science of atmospheric thermodynamics: convection, latent heat, conduction, evaporation-condensation, air and ocean circulations, etc.
Radiation Transfer Concept (RTC). The Sun provides most of the energy needed to warm the Earth to a comfortable and sustainable level for all life. Solar radiation (photons) readily crosses the vacuum of space and arrives at Earth’s Top Of Atmosphere (TOA) at the speed of light. We see a simplified conceptual model of this radiation in Figure 1, showing the wavelength/frequency, amplitude, and direction of propagation. These electromagnetic energy rays are emitted over a broad range of vibrational frequencies, spanning 20 orders of magnitude, called the spectrum.

This radiant energy arrives at TOA at 1,366 W/m2, which works out to an average of 340 W/m2 when averaged over the spherical Earth. About 30% of the 340 W/m2 is reflected to space, leaving 240 W/m2 for the Earth. Of that, one-third is absorbed by the atmosphere and two-thirds by the surface (oceans + land). The high-energy UV radiation warms the air mostly by photodissociation and photoionization. In this process, the solar radiant electromagnetic energy is converted to kinetic energy, and then to physical heat through collisions with air molecules. At the surface, about 67% of the energy warms the surface and is then released by local physical heat processes like conduction, convection, and latent heat, and 33% by radiation. Most of this radiation exits directly to space through an “atmospheric window” at frequencies that are not absorbable by GHGs. It is estimated that only about 0.09 W of the solar energy absorbed by the surface is radiated through CO2 and the other non-water-vapor greenhouse gases (GHGs), a trivially small amount.
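The flux arithmetic in the excerpt can be checked in a few lines, using sphere geometry plus the 30% reflection figure the excerpt quotes:

```python
# A sphere intercepts sunlight over its cross-section (pi r^2) but has surface
# area 4 pi r^2, so the incoming 1,366 W/m2 is divided by 4; about 30% of the
# result is then reflected back to space.
SOLAR_CONSTANT = 1366.0   # W/m2 at top of atmosphere
ALBEDO = 0.30             # fraction reflected to space

mean_toa = SOLAR_CONSTANT / 4.0          # ~341.5 W/m2, quoted as ~340
absorbed = mean_toa * (1.0 - ALBEDO)     # ~239 W/m2, quoted as ~240
print(f"mean TOA flux: {mean_toa:.1f} W/m2, absorbed: {absorbed:.1f} W/m2")
```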
In essence, the IPCC claims that these greenhouse gases already contribute about 33°C to the average global temperature, and that this contribution will continue to increase with increased quantities of human-made CO2/GHGs. In summary, the IPCC claim is based on three false assumptions:
1) When a photon’s electromagnetic energy is absorbed by CO2 it warms the CO2 molecule with physical heat. This heat is then shared with other air molecules and the atmosphere warms. The energized CO2 molecule will then re-emit a comparable photon, and the process is repeated hundreds/thousands of times before its heat/energy finally exits into space.
2) Half of all the re-emitted photons go upward as described in 1). But half are radiated downward and are re-absorbed by the surface and warm it.
3) The Sun provides Earth with only enough energy to give the planet an average global temperature of -18°C. But measurements tell us that the average global temperature is about 15°C, therefore they claim that CO2 and the GHE provide the missing 33°C heat.
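For reference, the -18°C figure in assumption 3) is the Stefan-Boltzmann effective temperature of a body emitting the ~240 W/m2 mentioned earlier; a quick check:

```python
# Stefan-Boltzmann law: a body radiating flux F has effective temperature
# T = (F / sigma)^(1/4). With F ~ 240 W/m2 this gives about 255 K, i.e. -18 C.
SIGMA = 5.670e-8   # Stefan-Boltzmann constant, W/(m2 K^4)
FLUX = 240.0       # W/m2 absorbed/emitted on average

t_eff_k = (FLUX / SIGMA) ** 0.25
t_eff_c = t_eff_k - 273.15
print(f"effective temperature: {t_eff_k:.1f} K = {t_eff_c:.1f} C")
```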
Together these concepts allegedly warm the Earth by first converting the photon’s Electromagnetic Energy (EMI) into physical heat, and secondly, this process delays the release of the original photon energy into space allowing heat to accumulate and delaying its exit to space. The details of their false science and the errors in their explanation are provided in this article Revised Why all the fuss with CO2 and the Greenhouse effect
Heat Transport Concept (HTC). This is the second step, concerning what happens to the electromagnetic energy of these photons as they approach the surface. Some will be reflected to space; in Antarctica, for example, nearly 100% is reflected to space by the ice and snow. Some will be scattered in the atmosphere and cause no measurable warming of anything. Most of these photons are visible light and infrared and are not readily absorbed by the air. Nearly all of these photons are absorbed by the surface, the dense solid and liquid matter, and warm it. This process of photon absorption by the surface and warming is called thermalization.

In general, the atmosphere does not warm the surface because the air is normally cooler than the surface. We know this from the established atmospheric temperature lapse rates. For example, the dry-air lapse rate is 9.8°C per km of altitude, meaning the atmospheric temperature cools by 9.8°C for each kilometre of altitude up to the top of the troposphere. According to the Second Law of thermodynamics, heat can only flow from a warmer object to a colder object, never the other way around. The RTC espoused by the UN IPCC violates this Second Law, thereby falsifying point 2) above.

Nitrogen, oxygen, argon, and water vapor make up more than 99% of the atmosphere. The optical depth of the air close to the surface is very opaque. This ensures that almost 100% of resonant Longwave Infrared Radiation (LWIR) photons emitted by the surface will be absorbed by GHGs and immediately thermalized within the first several millimetres of altitude. That is, a resonant photon will be immediately absorbed by a CO2 molecule, energizing it. An energized CO2 molecule could spontaneously re-emit the absorbed photon in about half a second, as described by the UN IPCC theory.
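The lapse-rate figure quoted above translates into altitude temperatures as follows; a minimal sketch of the dry-adiabatic relation, ignoring moisture:

```python
# Dry-adiabatic lapse rate: temperature falls ~9.8 C per kilometre of altitude
# within the troposphere.
LAPSE_RATE = 9.8   # C per km (dry air)

def temp_at_altitude(surface_temp_c, altitude_km):
    return surface_temp_c - LAPSE_RATE * altitude_km

for h in (0, 1, 5, 10):
    print(f"{h:2d} km: {temp_at_altitude(15.0, h):6.1f} C")
```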
In reality, however, within a millionth to a billionth of a second that energized CO2 molecule will physically collide with a non-GHG molecule. The collision heats the non-GHG molecule and de-energizes the CO2 molecule. So the initial photon has disappeared: its electromagnetic energy has been changed to physical heat and dispersed throughout the atmosphere, adding a tiny bit of warmth to the air. The detailed process is described in Part 2, pages 5-6 of the paper linked above.
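The timescale argument in this paragraph can be made concrete. Using the excerpt's own numbers, about 0.5 s to re-emit versus roughly a nanosecond between collisions (the exact collision time here is an assumed order of magnitude), the fraction of absorbed photons re-emitted before a collision is tiny:

```python
# Compare the excerpt's two timescales: spontaneous re-emission (~0.5 s) versus
# time between molecular collisions near the surface (~1 ns assumed here).
# The chance a photon is re-emitted before a collision is roughly their ratio.
T_EMIT = 0.5      # s, spontaneous re-emission time (as stated in the excerpt)
T_COLLIDE = 1e-9  # s, assumed time between collisions (order of magnitude)

p_reemit = T_COLLIDE / T_EMIT
print(f"fraction re-emitted before a collision: ~{p_reemit:.1e}")
```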
Conclusion. This CO2 photon absorption and immediate collision-thermalization process dominates throughout the troposphere. This means that the resonant LWIR radiation energy radiating from the surface is saturated to extinction by an overabundance of CO2/GHG molecules: there are no resonant photons left in the troposphere for CO2 molecules to absorb. This is readily visible in Figure 2 as the CO2 notch (top red arrow). In the stratosphere and mesosphere, where molecules are few and far apart, we see an increase in CO2 spontaneous re-emissions, implying a reverse-thermalization, and the re-emitted photons exit to space. The UN IPCC claim that CO2/GHGs are the major cause of global warming is thus falsified by the above explanation, which is consistent with established laws of science, combined with the absence of any scientific test data.
This makes two powerful cases: First, we can safely abandon the goal and costs of Net Zero and Carbon Capture and Sequestration. Even in the extreme case that atmospheric CO2 doubles or quadruples, there is no risk of a catastrophic, runaway global warming threat. Second, the press/media, politicians, and compromised scientists will also say that we should reduce and remove CO2 from the air just to be sure. But there is a fact: about 50 times more CO2 is dissolved in the oceans than in the air. Henry’s Law tells us nature apportions how much CO2 goes into the air and how much goes into the oceans based only on the temperature of the water’s surface. During a hot sunny day, more CO2 flows from the oceans to the air; in cold winters, more CO2 flows from the air into the oceans and all water on Earth. Imagine the silliness of spending billions or trillions of dollars to remove CO2 from the air, only to have the oceans immediately replace it.
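The Henry's Law behavior described here can be sketched with a van 't Hoff temperature correction. The constants below, k_H of about 0.034 mol/(L·atm) at 25°C and a temperature coefficient of about 2400 K for CO2, are typical literature values, used here illustratively:

```python
import math

# Henry's Law solubility of CO2 in water, with a van 't Hoff correction for
# temperature. Constants are typical literature values, treated as assumptions.
KH_298 = 0.034      # mol/(L*atm) at 298.15 K (25 C)
VANT_HOFF_C = 2400  # K, temperature coefficient d(ln kH)/d(1/T) for CO2

def kh(temp_k):
    """CO2 solubility constant at temp_k via the van 't Hoff equation."""
    return KH_298 * math.exp(VANT_HOFF_C * (1.0 / temp_k - 1.0 / 298.15))

cold, warm = kh(283.15), kh(298.15)
print(f"kH at 10 C: {cold:.4f}, at 25 C: {warm:.4f} mol/(L*atm)")
# Warmer surface water holds less CO2, so warming water outgasses CO2 to the air.
```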
In summary, we find that CO2/GHG absorption of LWIR energy emitted by the surface provides a negligible amount of physical heat in the troposphere. Consequently, the GHGs and the GHE provide near zero of the missing heat that the IPCC attributes to them.

Background Paper with complete discussion

Missing Link in the GHE, Greenhouse Effect, by Thomas Shula and Markus Ott (USA and Germany), 2024. From allaboutenergy.net

IPCC Crusade Built on Science Mistakes

“Mistake” definition (American Heritage Dictionary)

Noun

  1. An error or fault resulting from defective judgment, deficient knowledge, or carelessness.
  2. A misconception or misunderstanding.

Five Major IPCC Science Mistakes

♦  Surface station records have warmed mostly from urban heat sources, not IR-active gases.

♦  Solar climate forcing varies more than IPCC admits.

♦  Experiments show more CO2 does not make air warmer.

♦  On all time scales temperature changes lead and CO2 changes follow.

♦  IPCC climate models exclude natural climate factors to blame all warming on GHGs.

Mistakes on Temperature Records and Solar Forcing

The first two misconceptions are described in a recent paper by CERES (Center for Environmental Research and Earth Sciences).  My post below provides the details.

Overview of CERES Study

Our review suggests that the IPCC reports have inadequately accounted for two major scientific concerns when they were evaluating the causes of global warming since the 1850s:

1. The global temperature estimates used in the IPCC reports are contaminated by urban warming biases.
2.  The estimates of solar activity changes since the 1850s considered by the IPCC substantially downplayed a possible large role for the Sun.

We conclude that it is not scientifically valid for the IPCC to rule out the possibility that global warming might be mostly natural.

Fatal Flaw Discredits IPCC Science

By way of John Ray comes this Spectator Australia article A basic flaw in IPCC science.  Excerpts in italics with my bolds and added images.

Detailed research is underway that threatens to undermine the foundations of the climate science promoted by the IPCC since its First Assessment Report in 1990. The research is re-examining the rural and urban temperature records in the Northern Hemisphere that are the foundation for the IPCC’s estimates of global warming since 1850. The research team has been led by Dr Willie Soon (a Malaysian solar astrophysicist associated with the Smithsonian Institution for many years) and two highly qualified Irish academics, Dr Michael Connolly and his son Dr Ronan Connolly. They have formed a climate research group, CERES-SCIENCE. Their detailed research will be a challenge for the IPCC 7th Assessment Report due to be released in 2029, as their results challenge the very foundations of IPCC science.

The climate warming trend published by the IPCC is a continually updated graph based on the temperature records of Northern Hemisphere land surface temperature stations dating from the mid 19th century. The latest IPCC 2021 report uses data for the period 1850-2018. The IPCC’s selection of Northern Hemisphere land surface temperature records is not in question and is justifiable: the Northern Hemisphere records provide the best database for this period, while the Southern Hemisphere land temperature records are not as extensive and are sparse for the 19th and early 20th centuries. It is generally agreed that urban temperature data are significantly warmer than rural data in the same region because of an urban warming bias. This bias is due to night-time re-radiation of the daytime solar energy absorbed by concrete and bitumen, which leads to higher urban night-time temperatures than in the nearby countryside. The IPCC acknowledges such a warming bias but alleges the effect is only 10 per cent and therefore does not significantly distort its published global warming trend lines.


Since 2018, Dr Soon and his partners have analysed the data from rural and urban temperature recording stations in China, the USA, the Arctic, and Ireland. The number of stations with reliable temperature records in these areas increased from very few in the mid-19th Century to around 4,000 in the 1970s before decreasing to around 2,000 by the 1990s. The rural temperature recording stations with good records peaked at 400 and are presently around 200.

Their analysis of individual stations needs to account for any variation in exposure to the Sun due to changes in location, shadowing from the construction of nearby buildings, or nearby vegetation growth. The analysis of rural temperature stations is further complicated because over time many are encroached upon by nearby cities. Consequently, the data from such stations need to be shifted at certain dates from the rural temperature database to either an intermediate database or a full urban database, making an accurate analysis of each recording station’s temperature records a time-consuming task.


This new analysis of 4,000 temperature recording stations in China, the USA, the Arctic, and Ireland shows a warming trend of 0.89ºC per century in the urban stations, 1.61 times the warming trend of 0.55ºC per century in the rural stations. This difference is far more significant than the 10 per cent divergence between urban and rural stations alleged in the IPCC reports, a divergence explained by a potential flaw in the IPCC’s methodology. The IPCC uses a technique called homogenisation that averages the rural and urban temperatures in a particular region. This method distorts the rural temperature records because over 75 per cent of the temperature records used in this homogenisation methodology are from urban stations. So a methodology that attempts to statistically identify and correct biases that may be in the raw data leads, in effect, to an urban blending of the rural dataset. The result is biased because it distorts the actual values of each rural temperature station. In contrast, Dr Soon and his coworkers avoided homogenisation, so the temperature trends they identify for each rural region are accurate: the rural data are not distorted by readings from nearby urban stations.
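The blending argument can be illustrated with a toy weighted average; this is a caricature of the article's claim, not the IPCC's actual homogenisation algorithm:

```python
# Toy illustration: if ~75% of stations in a region are urban, a simple
# station-weighted regional trend is pulled toward the urban value.
# Trend figures are the ones quoted in the article above.
URBAN_TREND = 0.89   # C per century (urban stations, per the article)
RURAL_TREND = 0.55   # C per century (rural stations, per the article)
URBAN_SHARE = 0.75   # fraction of stations that are urban (per the article)

blended = URBAN_SHARE * URBAN_TREND + (1 - URBAN_SHARE) * RURAL_TREND
print(f"blended trend: {blended:.3f} C/century vs rural-only {RURAL_TREND} C/century")
```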


The rural temperature trend measured by this new research is 0.55ºC per century, indicating the Earth has warmed about 0.9ºC since 1850. In contrast, the urban trend of 0.89ºC per century indicates a much higher warming of about 1.5ºC since 1850. Consequently, a distorted urban warming trend has been used by the IPCC to quantify the warming of the whole Earth since 1850. The exaggeration is significant because the urban temperature record database used by the IPCC represents temperatures on only 3-4 per cent of the Earth’s land surface area, an area less than 2 per cent of the Earth’s total surface. Dr Willie Soon and his research team are currently analysing the meta-history of 800 European temperature recording stations. When this is done, their research will be based on a very significant database of Northern Hemisphere rural and urban temperature records from China, the USA, the Arctic, Ireland, and Europe.

This new research has unveiled another flaw in the IPCC’s temperature narrative, as trend lines in the revised temperature datasets differ from those published by the IPCC. For example, the rural records now show a marked warming trend in the 1930s and 1940s, while there is only a slight warming trend in the IPCC dataset. The most significant difference is the existence of a marked cooling period in the rural dataset for the 1960s and 1970s that is almost absent in the IPCC’s urban dataset. This latter divergence upsets the common narrative that rising carbon dioxide levels control modern warming trends. For, if carbon dioxide levels are the driver of modern warming, how can carbon dioxide levels increasing at 1.7 parts per million per decade cause a distinct warming period in the 1930s and 1940s, while a much larger rate of increase of 10.63 parts per million per decade is associated with a distinct cooling period in the 1960s and 1970s? Consequently, the research of Willie Soon and his coworkers is discrediting not only the higher global warming trends specified in IPCC reports, but also the theory that rising carbon dioxide levels explain modern warming trends, a lynchpin of IPCC science for the last 25 years.

Willie Soon and his coworkers maintain that climate scientists need to consider other possible explanations for recent global warming. They point to the Sun, but the IPCC maintains that variations in Total Solar Irradiance (TSI) occur over eons, not over shorter periods such as the last few centuries. For that reason, the IPCC points to changes in greenhouse gases as the most obvious explanation for global warming since 1850. In contrast, Willie Soon and his coworkers maintain there can be short-term changes in solar activity and, for example, refer to a period of no sunspot activity that coincided with the Little Ice Age in the 17th century. They also point out there is still no agreed average figure for TSI despite 30 years of measurements taken by various satellites. Consequently, they contend research in this area is not settled.

The CERES-SCIENCE research project pioneered by Dr Willie Soon and the father-son Connolly team has questioned the validity of the high global warming trends for the 1850-present period that have been published by the IPCC since its first report in 1992. The research also queries the IPCC narrative that rising greenhouse gas concentrations, particularly carbon dioxide, are the primary driver of global warming since 1850. That narrative has been the foundation of IPCC climate science for the last 40 years. It will be interesting to see how the IPCC’s 7th Assessment Report in 2029 treats this new research that questions the very basis of IPCC’s climate science.

The paper is The Detection and Attribution of Northern Hemisphere Land Surface Warming (1850–2018) in Terms of Human and Natural Factors: Challenges of Inadequate Data. 

Abstract

A statistical analysis was applied to Northern Hemisphere land surface temperatures (1850–2018) to try to identify the main drivers of the observed warming since the mid-19th century. Two different temperature estimates were considered—a rural and urban blend (that matches almost exactly with most current estimates) and a rural-only estimate. The rural and urban blend indicates a long-term warming of 0.89 °C/century since 1850, while the rural-only indicates 0.55 °C/century. This contradicts a common assumption that current thermometer-based global temperature indices are relatively unaffected by urban warming biases.

Three main climatic drivers were considered, following the approaches adopted by the Intergovernmental Panel on Climate Change (IPCC)’s recent 6th Assessment Report (AR6): two natural forcings (solar and volcanic) and the composite “all anthropogenic forcings combined” time series recommended by IPCC AR6. The volcanic time series was that recommended by IPCC AR6. Two alternative solar forcing datasets were contrasted. One was the Total Solar Irradiance (TSI) time series that was recommended by IPCC AR6. The other TSI time series was apparently overlooked by IPCC AR6. It was found that altering the temperature estimate and/or the choice of solar forcing dataset resulted in very different conclusions as to the primary drivers of the observed warming.

Our analysis focused on the Northern Hemispheric land component of global surface temperatures since this is the most data-rich component. It reveals that important challenges remain for the broader detection and attribution problem of global warming: (1) urbanization bias remains a substantial problem for the global land temperature data; (2) it is still unclear which (if any) of the many TSI time series in the literature are accurate estimates of past TSI; (3) the scientific community is not yet in a position to confidently establish whether the warming since 1850 is mostly human-caused, mostly natural, or some combination. Suggestions for how these scientific challenges might be resolved are offered.
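The trend figures in the abstract (0.89 and 0.55 °C/century) are linear trends fitted to annual temperature series. As a minimal sketch, assuming a noise-free synthetic series with the rural-only trend built in (real data would of course be noisy), the slope can be recovered by ordinary least squares:

```python
# Sketch: obtaining a linear warming trend in degrees C per century
# from an annual temperature series by ordinary least squares.
# The series below is synthetic (a 0.55 C/century ramp, no noise),
# standing in for the rural-only estimate; it is not real data.

def ols_slope(x, y):
    """Least-squares slope of y on x."""
    n = len(x)
    mx = sum(x) / n
    my = sum(y) / n
    num = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
    den = sum((xi - mx) ** 2 for xi in x)
    return num / den

years = list(range(1850, 2019))                     # 1850-2018, as in the paper
temps = [0.0055 * (yr - 1850) for yr in years]      # 0.55 C/century built in

trend_per_year = ols_slope(years, temps)
print(f"{trend_per_year * 100:.2f} C/century")      # prints 0.55 C/century
```

The same fit applied to the rural-and-urban blend would, per the abstract, return roughly 0.89 °C/century, with the difference attributed by the authors to urban warming bias.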

Mistake on CO2 Warming Effect

Thomas Allmendinger is a Swiss physicist educated at Zurich ETH whose practical experience is in the fields of radiology and elementary particle physics.  His complete biography is here.

His independent research and experimental analyses of greenhouse gas (GHG) theory over the last decade led to several published studies, including the latest summation The Real Origin of Climate Change and the Feasibilities of Its Mitigation, 2023, at Atmospheric and Climate Sciences journal. The paper is a thorough and detailed discussion, from which I provide here the abstract and the excerpt describing the experiment.  Excerpts are in italics with my bolds and added images. Full post is Experimental Proof Nil Warming from GHGs.

Abstract

The present treatise represents a synopsis of six important previous contributions of the author concerning atmospheric physics and climate change. Since this issue is influenced by politics like no other, and since the greenhouse doctrine with CO2 as the culprit in climate change is predominant, the respective theory has to be outlined, revealing its flaws and inconsistencies.

But beyond that, the author’s own contributions are focused on and discussed in depth. The most eminent one concerns the discovery of the absorption of thermal radiation by gases, leading to warming, and implying a thermal radiation of gases which depends on their pressure. This delivers the final evidence that trace gases such as CO2 don’t have any influence on the behaviour of the atmosphere, and thus on climate.

But the most useful contribution concerns the method which enables determination of the solar absorption coefficient βs of coloured opaque plates. It delivers the foundations for modifying materials with respect to their capability for climate mitigation. Thereby, the main influence is due to the colouring, in particular of roofs, which should be painted, preferably light-brown (not white, for aesthetic reasons).

It must be clear that such a drive for brightening up the world would be the only chance of mitigating the climate, whereas the greenhouse doctrine related to CO2 has to be abandoned. However, a global climate model with forecasts cannot be aspired to, since the problem is too complex and since several climate zones exist.

4. Thermal Gas Absorption Measurements

If the warming-up behaviour of gases is to be determined by temperature measurements, interference from the walls of the gas vessel must be taken into account, since they exhibit a significantly higher heat capacity than the gas does, which implies a slower warming-up rate. Since solid materials absorb thermal radiation more strongly than gases do, the risk exists that the walls of the vessel are directly warmed up by the radiation and subsequently transfer the heat to the gas. Finally, even the thin glass walls of the thermometers may disturb the measurements by absorbing thermal radiation.

For these reasons, square tubes with a relatively large cross-section (20 cm) were used, which consisted of 3 cm thick Styrofoam plates and were covered at the ends by thin plastic foils. In order to measure the temperature course along the tube, mercury thermometers were mounted at three positions (bottom, middle, and top), with their tips covered with aluminum foil. The test gases were supplied from steel cylinders equipped with reducing valves. They were introduced through a connector over approximately one hour, because the tube was neither gastight nor sturdy enough for evacuation. The filling process was monitored by means of a hygrometer, since the air that had to be replaced was slightly humid. Afterwards, the tube was optimized by attaching adhesive foils and thin aluminum foils (see Figure 13). The equipment and the results are reported in [21].

Figure 13. Solar-tube, adjustable to the sun [21].

The initial measurements were made outdoors with twin tubes in the presence of solar light. One tube was filled with air, and the other with carbon dioxide. The temperature increased within a few minutes by approximately ten degrees until constant limiting temperatures were attained, simultaneously at all positions. Surprisingly, this was the case in both tubes, thus also in the tube filled with ambient air. This result alone delivered the proof that the greenhouse theory cannot be true. Moreover, it motivated investigating the phenomenon more thoroughly by means of artificial, better defined light.

Figure 14. Heat-radiation tube with IR-spot [21].

Accordingly, the subsequent experiments were made using IR-spots with wattages of 50 W, 100 W and 150 W, which are normally employed for terraria (Figure 14). In particular, the 150 W IR-spot led to a considerably higher temperature increase of the enclosed gas than was the case when sunlight was applied, since its ratio of thermal radiation was higher. Thereby, variable impacts such as the nature of the gas could be evaluated.

From the results with IR-spots on different gases (air, carbon dioxide, and the noble gases argon, neon and helium), essential knowledge could be gained. In each case, the irradiated gas warmed up until a stable limiting temperature was attained. Analogously to the case of irradiated coloured solid plates, the temperature increased until the equilibrium state was attained where the heat absorption rate was equal to the heat emission rate.

Figure 15. Time/temperature-curves for different gases [21] (150 W-spot, medium thermometer-position).

As evident from the diagram in Figure 15, the initial observation made with sunlight was confirmed: pure carbon dioxide warmed up to almost the same degree as air (whereby ambient air scarcely differed from a 4:1 mixture of nitrogen and oxygen). Moreover, noble gases absorb thermal radiation, too. As subsequently outlined, a theoretical explanation for this could be found.

Conclusion

Finally, the theoretically suggested dependency of the atmospheric thermal radiation intensity on the atmospheric pressure could be empirically verified by measurements at different altitudes, namely in Glattbrugg (430 m above sea level) and on the top of the Furka pass (2430 m above sea level), both in Switzerland, delivering a so-called atmospheric emission constant A ≈ 22 W·m−2·bar−1·K−0.5. It explained the altitude paradox of the atmospheric temperature and delivered the definitive evidence that the atmospheric behavior, and thus the climate, does not depend on trace gases such as CO2. However, the atmosphere does thermally reradiate, leading to something similar to a greenhouse effect. But this effect is solely due to the atmospheric pressure.
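Given its units, Allmendinger’s claimed constant can be read as a relation I = A · p · √T, with pressure in bar and temperature in kelvin. A minimal sketch, assuming rough hypothetical surface pressures and temperatures for the two Swiss sites (the paper’s actual measured values are not quoted here), shows the kind of intensity difference such a formula predicts:

```python
import math

# Sketch of Allmendinger's claimed relation I = A * p * sqrt(T),
# with A ~ 22 W/m^2/bar/K^0.5 as quoted above. The pressures and
# temperatures below are rough assumptions for the two sites,
# not values reported in the paper.
A = 22.0  # W m^-2 bar^-1 K^-0.5

def radiation(p_bar, t_kelvin):
    """Claimed atmospheric thermal radiation intensity in W/m^2."""
    return A * p_bar * math.sqrt(t_kelvin)

glattbrugg = radiation(0.96, 288.0)  # ~430 m a.s.l., assumed conditions
furka      = radiation(0.76, 275.0)  # ~2430 m a.s.l., assumed conditions

print(round(glattbrugg), round(furka))  # ~358 vs ~277 W/m^2
```

Under these assumed inputs the lower site radiates noticeably more, which is the pressure-dependent behaviour the author claims to have measured.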

Mistake on Warming Prior to CO2 Rising

Changes in CO2 follow changes in global temperatures on all time scales, from last month’s observations to ice core datasets spanning millennia. Since CO2 is the lagging variable, it cannot logically be the cause of temperature, the leading variable. It is folly to imagine that by reducing human emissions of CO2, we can change global temperatures, which are obviously driven by other factors.  Most recent post on this:

10/2024 Update Recent Warming Spike Drives Rise in CO2
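The lead-lag claim above is the kind of thing tested by correlating the two series at different lags. A minimal sketch, using a synthetic signal in which one series trails the other by a known delay (the delay, the sine signal, and the step size are all illustrative assumptions, not observational data), shows how the lag is identified:

```python
import math

# Sketch: identifying which of two series leads via lagged correlation.
# Synthetic data: CO2 is the same signal as temperature, delayed by
# true_lag steps. All values are illustrative assumptions.

def corr(a, b):
    """Pearson correlation of two equal-length sequences."""
    n = len(a)
    ma, mb = sum(a) / n, sum(b) / n
    cov = sum((x - ma) * (y - mb) for x, y in zip(a, b))
    sa = math.sqrt(sum((x - ma) ** 2 for x in a))
    sb = math.sqrt(sum((y - mb) ** 2 for y in b))
    return cov / (sa * sb)

n, true_lag = 200, 6
base = [math.sin(0.1 * t) for t in range(n + true_lag)]
temp = base[true_lag:]   # temperature, the leading series
co2 = base[:n]           # CO2: the same signal, true_lag steps behind

# Correlate CO2 at time t against temperature at time t - k
# for each candidate lag k; the best k recovers the true delay.
best = max(range(13), key=lambda k: corr(co2[k:], temp[:n - k]))
print(best)  # prints 6
```

A positive best-fit lag (CO2 trailing) is the pattern the post argues holds in the observational records.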

Mistake on Models Bias Against Natural Factors

Figure 1. Anthropic and natural contributions. (a) Locked scaling factors, weak Pre-Industrial Climate Anomalies (PCA). (b) Free scaling, strong PCA

In 2009, the iconic email from the Climategate leak included a comment by Phil Jones about the “trick” used by Michael Mann to “hide the decline” in his Hockey Stick graph, referring to tree-ring proxy temperatures cooling rather than warming in modern times.  Now we have an important paper demonstrating that climate models insist on man-made global warming only by hiding the incline of natural warming in pre-industrial times.  The paper is From Behavioral Climate Models and Millennial Data to AGW Reassessment by Philippe de Larminat.  H/T No Tricks Zone. Excerpts in italics with my bolds.

Abstract

Context. The so-called AGW (Anthropogenic Global Warming) is based on thousands of climate simulations indicating that human activity is virtually solely responsible for the recent global warming. The climate models used are derived from the meteorological models used for short-term predictions. They are based on the fundamental and empirical physical laws that govern the myriad of atmospheric and oceanic cells integrated by the finite element technique. Numerical approximations, empiricism and the inherent chaos in fluid circulations make these models questionable for validating the anthropogenic principle, given the accuracy required (better than one per thousand) in determining the Earth’s energy balance.

Aims and methods. The purpose is to quantify and simulate behavioral models of weak complexity, without referring to predefined parameters of the underlying physical laws, but relying exclusively on generally accepted historical and paleoclimate series.

Results. These models perform global temperature simulations that are consistent with those from the more complex physical models. However, the repartition of contributions to the present warming depends strongly on the retained temperature reconstructions, in particular the magnitudes of the Medieval Warm Period and the Little Ice Age. It also depends on the level of the solar activity series. It follows from these observations and climate reconstructions that the anthropogenic principle only holds for climate profiles assuming almost no PCA and no significant variations in solar activity. Otherwise, it reduces to a weak principle whereby global warming is not only the result of human activity, but is largely due to solar activity.  Full post is here:

Climate Models Hide the Paleo Incline
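The contrast between panels (a) and (b) of Figure 1, locked versus free scaling factors, can be sketched with a toy least-squares fit. All series below are synthetic assumptions, not de Larminat’s data: temperature is constructed so that the natural driver dominates by design, and the “locked” fit holds the natural scaling at zero.

```python
import math

# Sketch of "locked" vs "free" scaling-factor fits, in the spirit of
# Figure 1. Synthetic forcings: temperature is built as
# 0.3 * anthropogenic + 0.7 * natural, so natural variability
# dominates by construction. All numbers are assumptions.
n = 200
f_anthro = [t / n for t in range(n)]                          # smooth ramp
f_nat = [0.5 * t / n + 0.3 * math.sin(0.2 * t) for t in range(n)]
temp = [0.3 * a + 0.7 * s for a, s in zip(f_anthro, f_nat)]

def dot(u, v):
    return sum(x * y for x, y in zip(u, v))

# "Locked": natural scaling held at zero, only anthropogenic fitted.
beta_locked = dot(f_anthro, temp) / dot(f_anthro, f_anthro)

# "Free": both scalings fitted jointly (2x2 normal equations).
a11, a12 = dot(f_anthro, f_anthro), dot(f_anthro, f_nat)
a22 = dot(f_nat, f_nat)
b1, b2 = dot(f_anthro, temp), dot(f_nat, temp)
det = a11 * a22 - a12 * a12
beta_a = (a22 * b1 - a12 * b2) / det
beta_n = (a11 * b2 - a12 * b1) / det

# The locked fit inflates the anthropogenic factor well above its
# true value of 0.3; the free fit recovers 0.3 and 0.7.
print(round(beta_locked, 2), round(beta_a, 2), round(beta_n, 2))
```

The point of the toy example: when the natural factor is suppressed, its contribution is silently folded into the anthropogenic scaling, which is the mechanism the paper argues produces the strong-AGW result.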