Good Reasons to Distrust Climatists

The most recent case of climatists’ bad behavior is the retraction of a peer-reviewed paper analyzing the properties of CO2 as an IR-active gas, concluding that additional atmospheric CO2 will have a negligible effect on temperatures.  From the Daily Sceptic:

Another important paper taking issue with the ‘settled’ climate narrative has been cancelled following a report in the Daily Sceptic and subsequent reposts that went viral across social media. The paper discussed the atmospheric ‘saturation’ of greenhouse gases such as carbon dioxide and argued that higher levels will not cause temperatures to rise. The work was led by the widely-published Polish scientist Dr. Jan Kubicki and appeared on Elsevier’s ScienceDirect website in December 2023. The paper has been widely discussed on social media since April 2024 when the Daily Sceptic reported on the findings. Interest is growing in the saturation hypothesis not least because it provides a coherent explanation for why life and the biosphere grew and often thrived for 600 million years despite much higher atmospheric levels of greenhouse gases. Alas for control freaks, it also destroys the science backing for the Net Zero fantasy.

Below are some comments responding to a Quora question, text in italics with my bolds and added images:

What are some reasons why some people do not believe in climate change or global warming despite scientific evidence? Is there any additional information that could help us understand their perspective?

Answer from Mike Jonas, M.A. in Mathematics, Oxford University, UK

Good scientists do not lie and cheat to protect their science, they are happy to discuss their evidence and their findings, and they always understand that everything needs to be replicable and verifiable.

When Climategate erupted on the scene, and the climate scientists behind the man-made global warming narrative were found to have lied and cheated, all honest scientists thought that would be the end of it. Instead, what happened was that those climate scientists closed ranks and carried on, supported by a massive amount of government (i.e., the public’s) money. One of the first things they did was to deflect Climategate by saying the emails involved had been hacked so should be ignored, but some of the people involved confirmed that all of the emails really were genuine.

It has been about 15 years since Climategate, and study after study has shown virtually all of the components of the man-made global warming narrative to be incorrect; even the computer models used by the IPCC have been shown to be unfit for purpose,

And yet they maintained their closed ranks,
and the government money kept pouring in.

Did you know that the IPCC does not do any research? (Please do check that: on their web page About – IPCC they state “The IPCC does not conduct its own research”.) It is, as its name says, an inter-governmental organisation, and it is run by and for governments. They say lots of persuasive sciency things, but the simple fact is that they cherry-pick and corrupt the science to achieve their ends. Regrettably, almost all the scientific societies are on the gravy train too. This is part of what the highly respected physicist Professor Hal Lewis said in his resignation letter to the American Physical Society (APS):

It is of course, the global warming scam, with the (literally) trillions of dollars driving it, that has corrupted so many scientists, and has carried APS before it like a rogue wave. It is the greatest and most successful pseudoscientific fraud I have seen in my long life as a physicist. Anyone who has the faintest doubt that this is so should force himself to read the ClimateGate documents, which lay it bare.

I don’t believe that any real physicist, nay scientist, can read that stuff without revulsion. I would almost make that revulsion a definition of the word scientist.

So what has the APS, as an organization, done in the face of this challenge?
It has accepted the corruption as the norm, and gone along with it.

If you want to find out more about this “greatest and most successful pseudoscientific fraud”, the website Watts Up With That? is a good place to start (the fraudsters absolutely hate it), and it links to many other good websites. It has the full text of Hal Lewis’ resignation letter at:

Hal Lewis: My Resignation From The American Physical Society – an important moment in science history

Answer from Susannah Moyer

It’s curious that climate science is the rare scientific field where dissenting scientists, those with contrarian views, are unwelcome and even ostracized.

There are some well-known climate scientists who have doubts about the role of CO2 and man-made global warming as it pertains to global temperature. They have raised the issue that computer-generated prediction models have been inaccurate in predicting temperature patterns, because the modeling requires assumptions that have not been shown to be accurate.

Here is a contrarian view from climate scientists who have published climate research results in Nature, which is no small feat:

McNider and Christy are professors of atmospheric science at the University of Alabama in Huntsville and fellows of the American Meteorological Society. Mr. Christy was a member of the Intergovernmental Panel on Climate Change that shared the 2007 Nobel Peace Prize with former Vice President Al Gore.

By how much the Earth’s atmosphere will warm in response to this added carbon dioxide is not a known fact. The warming numbers most commonly advanced are created by climate computer models built almost entirely by scientists who believe in catastrophic global warming. The rate of warming forecast by these models depends on many assumptions and on engineering to replicate a complex world in tractable terms, such as how water vapor and clouds will react to the direct heat added by carbon dioxide, or the rate of heat uptake, or absorption, by the oceans.

We might forgive these modelers if their forecasts had not been so consistently and spectacularly wrong. From the beginning of climate modeling in the 1980s, these forecasts have, on average, always overstated the degree to which the Earth is warming compared with what we see in the real climate.

For instance, in 1994 we published an article in the journal Nature showing that the actual global temperature trend was “one-quarter of the magnitude of climate model results.” As the nearby graph shows, the disparity between the predicted temperature increases and real-world evidence has only grown in the past 20 years.

“Consensus” science that ignores reality can have tragic consequences if cures are ignored or promising research is abandoned. The climate-change consensus is not endangering lives, but the way it imperils economic growth and warps government policy making has made the future considerably bleaker. The recent Obama administration announcement that it would not provide aid for fossil-fuel energy in developing countries, thereby consigning millions of people to energy poverty, is all too reminiscent of the Sick and Hurt Board denying fresh fruit to dying British sailors.

Another questioner, Dr. Koonin, was undersecretary for science in the Energy Department during President Barack Obama’s first term and is currently director of the Center for Urban Science and Progress at New York University. His previous positions include professor of theoretical physics and provost at Caltech, as well as chief scientist of BP, where his work focused on renewable and low-carbon energy technologies.

But—here’s the catch—those questions are the hardest ones to answer. They challenge, in a fundamental way, what science can tell us about future climates.

Firstly, even though human influences could have serious consequences for the climate, they are physically small in relation to the climate system as a whole. For example, human additions to carbon dioxide in the atmosphere by the middle of the 21st century are expected to directly shift the atmosphere’s natural greenhouse effect by only 1% to 2%. Since the climate system is highly variable on its own, that smallness sets a very high bar for confidently projecting the consequences of human influences.

A second challenge to “knowing” future climate is today’s poor understanding of the oceans. The oceans, which change over decades and centuries, hold most of the climate’s heat and strongly influence the atmosphere. Unfortunately, precise, comprehensive observations of the oceans are available only for the past few decades; the reliable record is still far too short to adequately understand how the oceans will change and how that will affect climate.

A third fundamental challenge arises from feedbacks that can dramatically amplify or mute the climate’s response to human and natural influences. One important feedback, which is thought to approximately double the direct heating effect of carbon dioxide, involves water vapor, clouds and temperature.
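The “approximately double” amplification described above corresponds to the standard linear feedback relation (my gloss on the excerpt, not part of the essay itself):

```latex
\Delta T = \frac{\Delta T_0}{1 - f}, \qquad f \approx 0.5 \;\Rightarrow\; \Delta T \approx 2\,\Delta T_0
```

Here \(\Delta T_0\) is the no-feedback response and \(f\) the feedback fraction; a water-vapor feedback near one-half doubles the direct warming.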

Climate Science Is Not Settled

Another group questioning what some consider “settled science”:

  • Claude Allegre, former director of the Institute for the Study of the Earth, University of Paris;
  • J. Scott Armstrong, cofounder of the Journal of Forecasting and the International Journal of Forecasting;
  • Jan Breslow, head of the Laboratory of Biochemical Genetics and Metabolism, Rockefeller University;
  • Roger Cohen, fellow, American Physical Society;
  • Edward David, member, National Academy of Engineering and National Academy of Sciences;
  • William Happer, professor of physics, Princeton;
  • Michael Kelly, professor of technology, University of Cambridge, U.K.;
  • William Kininmonth, former head of climate research at the Australian Bureau of Meteorology;
  • Richard Lindzen, professor of atmospheric sciences, MIT;
  • James McGrath, professor of chemistry, Virginia Technical University;
  • Rodney Nichols, former president and CEO of the New York Academy of Sciences;
  • Burt Rutan, aerospace engineer, designer of Voyager and SpaceShipOne;
  • Harrison H. Schmitt, Apollo 17 astronaut and former U.S. senator;
  • Nir Shaviv, professor of astrophysics, Hebrew University, Jerusalem;
  • Henk Tennekes, former director, Royal Dutch Meteorological Service;
  • Antonio Zichichi, president of the World Federation of Scientists, Geneva.

Although the number of publicly dissenting scientists is growing, many young scientists furtively say that while they also have serious doubts about the global-warming message, they are afraid to speak up for fear of not being promoted—or worse. They have good reason to worry. In 2003, Dr. Chris de Freitas, the editor of the journal Climate Research, dared to publish a peer-reviewed article with the politically incorrect (but factually correct) conclusion that the recent warming is not unusual in the context of climate changes over the past thousand years. The international warming establishment quickly mounted a determined campaign to have Dr. de Freitas removed from his editorial job and fired from his university position. Fortunately, Dr. de Freitas was able to keep his university job.

This is not the way science is supposed to work, but we have seen it before—for example, in the frightening period when Trofim Lysenko hijacked biology in the Soviet Union. Soviet biologists who revealed that they believed in genes, which Lysenko maintained were a bourgeois fiction, were fired from their jobs. Many were sent to the gulag and some were condemned to death.

Why is there so much passion about global warming, and why has the issue become so vexing that the American Physical Society (APS), from which Dr. Giaever resigned a few months ago, refused the seemingly reasonable request by many of its members to remove the word “incontrovertible” from its description of a scientific issue?

There are several reasons, but a good place to start is the old question
“cui bono?” Or the modern update, “Follow the money.”


Happer: Cloud Radiation Matters, CO2 Not So Much (2025)

This month van Wijngaarden and Happer published a new paper Radiation Transport in Clouds.

Last year William Happer spoke on Radiation Transfer in Clouds at the EIKE conference, and the video is above.  For those preferring to read, below is a transcript from the closed captions along with some key exhibits.  I left out the most technical section in the latter part of the presentation. Text in italics with my bolds.

William Happer: Radiation Transfer in Clouds

People have been looking at clouds for a very long time in a quantitative way. This is one of the first quantitative studies, done about 1800 by John Leslie, a Scottish physicist who built this gadget. He called it an Aethrioscope, but basically it was designed to figure out how effective the sky was in causing frost. If you live in Scotland you worry about frost. It consisted of two glass bulbs with a very thin capillary attachment between them, and there was a little column of alcohol here.

The bulbs were full of air, so if one bulb got a little bit warmer it would force the alcohol up through the capillary; if it got colder it would suck the alcohol back. So he set this device out under the clear sky, and he described that “the sensibility of the instrument is very striking, for the liquor incessantly falls and rises in the stem with every passing cloud. In fine weather the aethrioscope will seldom indicate a frigorific impression of less than 30 or more than 80 millesimal degrees.” He’s talking about how high this column of alcohol would go up and down. If the sky became overclouded, “it may be reduced to as low as 15” (referring to how much the sky cools) “or even five degrees when the congregated vapours hover over the hilly tracks.” We don’t speak English that way anymore, but I love it.

The point is that even in 1800 Leslie and his colleagues knew very well that clouds have an enormous effect on the cooling of the Earth. And of course anyone who has a garden knows that on a clear, calm night you’re likely to get frost and lose your crops. So this was a quantitative study of that.

Now it’s important to remember that if you go out today, the atmosphere is full of two types of radiation. There’s sunlight, which you can see, and then there is the thermal radiation generated by greenhouse gases, by clouds and by the surface of the Earth. You can’t see thermal radiation, but you can feel it, if it’s intense enough, by its warming effect. And these curves practically don’t overlap, so we’re really dealing with two completely different types of radiation.

There’s sunlight, which scatters very nicely not only off clouds but off molecules; that’s the blue sky, the Rayleigh scattering. Then there’s the thermal radiation, which actually doesn’t scatter at all on molecules: greenhouse gases are very good at absorbing thermal radiation, but they don’t scatter it. Clouds, however, do scatter thermal radiation. Plotted here is the probability of finding a photon of sunlight within a given interval on a logarithmic wavelength scale.

Since Leslie’s day, two types of instruments have been developed to do what he did more precisely. One of them is called a pyranometer, and it is designed to measure sunlight coming down onto the Earth on a day like this. You put this instrument out there and it reads the flux of sunlight coming down. It’s designed to see sunlight coming from every direction, so it doesn’t matter at which angle the sun is shining; it’s calibrated to see them all.

Let me show you a measurement by a pyranometer. This is actually a curve from the sales brochure of a company that will sell you one of these devices. It compares two types of detectors, and as you can see they’re very good; you can hardly tell the difference. The point is that if you look on a clear day with no clouds, you see sunlight beginning to increase at dawn; it peaks at noon and goes down to zero, and there’s no sunlight at night. So for half of the day, over most of the Earth, there’s no sunlight in the atmosphere.
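The shape of that clear-day record can be mimicked by a toy model (my own illustration; the 1000 W/m² noon value and the 6-to-18 daylight window are assumed round numbers, not taken from the brochure):

```python
import math

S0 = 1000.0  # assumed clear-sky flux at local noon, W/m^2

def clear_sky_flux(hour, sunrise=6.0, sunset=18.0):
    """Toy clear-sky diurnal curve: zero at night, sinusoidal peak at noon."""
    if hour < sunrise or hour > sunset:
        return 0.0
    return S0 * math.sin(math.pi * (hour - sunrise) / (sunset - sunrise))

print(clear_sky_flux(12.0))  # 1000.0 at local noon
print(clear_sky_flux(2.0))   # 0.0 at night
```

A cloudy day would simply multiply this curve by a fluctuating transmission factor, giving the jagged record described next.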

Here’s a day with clouds, just a few days later (the horizontal axis shows days of the year). You can see that every time a cloud goes by, the intensity hitting the ground drops; with a little clear sky it goes back up, then down, up and so on. On average, this particular day gets a lot less sunlight than the clear day did.

But you know, nature is surprising. Einstein had this wonderful quote: God is subtle, but he is not malicious. He meant that nature does all sorts of things you don’t expect, so let me show you what happens on a partly cloudy day. This is data taken near Munich. The blue curve is the measurement, and the red curve is the intensity on the ground if there were no clouds. On this partly cloudy day you can see there are brief periods when the sunlight is much brighter on the detector than it is on a clear day. That’s because, coming through clouds, you get focusing from the edges of the cloud pointing down toward your detector; it means somewhere else less radiation is reaching the ground. This is rather surprising to most people. I was very surprised to learn about it, but it just shows that the actual details of climate are a lot more subtle than you might think.

We know that visible light only happens during the daytime and stops at night. But there is a second important type of radiation, the thermal radiation, which is measured by a similar device, the pyrgeometer. It has a silicon window that passes infrared, which is below the band gap of silicon, so the infrared goes through as though the window were transparent. Then there are some interference filters to give further discrimination against sunlight. Sunlight practically doesn’t go through it at all, so they call it solar blind: it doesn’t see the Sun.

But it sees thermal radiation very clearly, with one big difference from the sunlight-sensing device I showed you: most of the time this one is radiating up, not down. Out in the open air this detector normally gets colder than the body of the instrument, and so it is carefully calibrated to compare the balance of downwelling radiation with upwelling radiation. Upwelling is normally greater than downwelling.

I’ll show you some measurements of the downwelling flux; these were taken at Thule in Greenland, with watts per square meter on the vertical axis. The first thing to notice is that the radiation continues day and night: looking at the output of the pyrgeometer, you can’t tell whether it’s day or night, because the atmosphere is just as bright at night as during the day. The big difference is clouds: on a cloudy day you get a lot more downwelling radiation than on a clear day. Here’s nearly a full day of clear weather, and several more clear days; then suddenly it gets cloudy and the radiation rises, because the bottoms of the clouds are relatively warm, at least compared to the clear sky. If you put the numbers in, this cloud bottom is around 5° Centigrade, so it was a fairly low cloud (it was summertime in Greenland), and this compares to about minus 5° for the clear sky.
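As a back-of-envelope check on those numbers (my own sketch, not from the talk), treat the cloud base and the effective clear sky as blackbodies at the quoted temperatures:

```python
# Downwelling flux difference between a +5 C cloud base and a -5 C clear sky,
# treating both as blackbodies (a crude but illustrative assumption).
SIGMA = 5.670e-8  # Stefan-Boltzmann constant, W m^-2 K^-4

def blackbody_flux(t_celsius):
    """Blackbody flux at the given Celsius temperature, in W/m^2."""
    t_kelvin = t_celsius + 273.15
    return SIGMA * t_kelvin ** 4

cloudy = blackbody_flux(5.0)
clear = blackbody_flux(-5.0)
print(round(cloudy), round(clear), round(cloudy - clear))  # 339 293 46
```

A jump of a few tens of watts per square meter when clouds move in is just the size of step the pyrgeometer record shows.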

So there’s a lot of data out there, and there really is downwelling radiation; there’s no question about that, you measure it routinely. And now you can do the same thing looking down from satellites. This is a picture that I downloaded from Princeton a few weeks ago to get ready for this talk; it was taken at 6 PM, so it was already dark in Europe. It is a picture of the Earth from a geosynchronous satellite parked over Ecuador. You are looking down on the Western Hemisphere, and this is a filtered image of the Earth in blue light at 0.47 micrometers. So it’s a nice blue color, not so different from the sky, and it’s dark where the sun has set. There’s still a fair amount of sunlight over the United States and farther west.

Here is exactly the same time, from the same satellite, showing the infrared radiation coming up at 10.3 micrometers, which is right in the middle of the infrared window where there’s not much greenhouse-gas absorption; there’s a little from water vapor, but very little, and a trivial amount from CO2.

As you can see, you can’t tell which side is night and which side is day: even though the sun has set over here, it is still glowing nice and bright. There is one pesky difference. In the visible image you are looking at reflected sunlight over the Intertropical Convergence Zone, where lots of high clouds have been pushed up by convection in the tropics, so white means more visible light. In the infrared image you are looking at emission from the cold cloud tops, so white means less thermal light. White here means less light, white there means more light, so you have to calibrate your thinking.

But the striking thing about all of this is that the Earth is covered with clouds; you have to look hard to find a clear spot. Roughly half of the Earth, maybe, is clear at any given time, but most of it is covered with clouds. So if anything governs the climate it is clouds, and that’s one of the reasons I so admire the work that Svensmark and Nir Shaviv have done: they’re focusing on the most important mechanism for the Earth, which is not greenhouse gases but clouds. You can see that here.

Now, that was a single frequency; let me show you what happens if you look down from a satellite at the full spectrum. This is the spectrum of light coming up over the Sahara Desert, measured from a satellite. Here is the infrared window, including the 10.3 microns I mentioned in the previous slide; it’s a clear region, so radiation there can get from the surface of the Sahara right up to outer space.

Notice that the units on these scales are very different: over the Sahara the top unit is 200, over the Mediterranean 150, and over the South Pole only 60. At least the Mediterranean and the Sahara are roughly similar. The three curves on the right are observations from satellites, and the three curves on the left are model calculations that we’ve done. The point is that you can hardly tell the difference between a model calculation and the observed radiation.

So it’s really straightforward to calculate radiation transfer. If someone quotes you a number in watts per square meter, you should take it seriously; that’s probably a good number. If they tell you a temperature, you don’t know what to make of it, because there is a big step in going from watts per square meter to a temperature change. All the mischief in the whole climate business is in going from watts per square meter to Centigrade or Kelvin.

Now I will say just a few words about the clear sky, because that is the simplest case; then we’ll get on to clouds, the topic of this talk. This is a calculation with the same codes I showed you in the previous slide, which as you saw work very well. It’s worth spending a little time here, because this is the famous Planck curve that was the birth of quantum mechanics. It was Max Planck who figured out the formula for that curve and why it is that way. This is what the Earth would radiate at 15° Centigrade if there were no greenhouse gases: you would get this beautiful smooth curve, the Planck curve. If you actually look at the Earth from satellites, you get the ragged, jagged black curve. We like to call that the Schwarzschild curve, because Karl Schwarzschild was the person who showed how to do that calculation. Tragically he died during World War I, a big loss to science.
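The Planck curve itself is a one-line formula. Here is a minimal sketch (my own code, not the radiation-transfer codes mentioned in the talk) evaluating the spectral radiance for a 15° C surface near the edge of the CO2 band:

```python
import math

H = 6.626e-34   # Planck constant, J s
C = 2.998e8     # speed of light, m/s
KB = 1.381e-23  # Boltzmann constant, J/K

def planck_radiance(wavenumber_cm, t_kelvin):
    """Planck spectral radiance in W m^-2 sr^-1 per cm^-1."""
    nu = wavenumber_cm * 100.0  # wavenumber in m^-1
    radiance_per_m = 2 * H * C**2 * nu**3 / (math.exp(H * C * nu / (KB * t_kelvin)) - 1)
    return radiance_per_m * 100.0  # convert back to per cm^-1

# Surface at 15 C (288.15 K), near the 600 cm^-1 edge of the CO2 bending band
print(planck_radiance(600.0, 288.15))  # roughly 0.135
```

A value of roughly 0.13 W m⁻² sr⁻¹ per cm⁻¹ is the right order of magnitude for the satellite spectra shown in the talk.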

There are two colored curves to which I want to draw your attention. The green curve is what the Earth would radiate to space if you took away all the CO2; it differs from the black curve only in the CO2 band, the bending band of CO2, which is the main greenhouse effect of CO2. There’s a little additional effect from the asymmetric stretch band, but it doesn’t contribute very much. Then there is the red curve, which is what happens if you double CO2.

So notice the huge asymmetry. Taking all 400 parts per million of CO2 away from the atmosphere causes an enormous change of 30 watts per square meter: the difference between the green curve’s 307 and the black curve’s 277. But if you double CO2 you make practically no change. This is the famous saturation of CO2. At the levels we have now, doubling CO2, a 100% increase, only changes the radiation to space by 3 watts per square meter: the difference between 274 for the red curve and 277 for today’s curve. So it’s a tiny amount: for a 100% increase in CO2, a 1% decrease of radiation to space.

That allows you to estimate the feedback-free climate sensitivity in your head. I’ll talk you through it. Doubling CO2 is a 1% decrease of radiation to space. If that happens, the Earth will start to warm up. But it radiates as the fourth power of the temperature, so a 1% deficit in flux is erased by only a one-quarter percent rise in absolute temperature. Since the ambient temperature is about 300 Kelvin (actually a little less), a quarter of a percent of that is 0.75 Kelvin. So the feedback-free equilibrium climate sensitivity is less than one degree; it’s 0.75 Centigrade. It’s a number you can do in your head.
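That arithmetic fits in a few lines (my own restatement of the linearized fourth-power argument):

```python
# Outgoing flux scales as T^4, so a fractional flux deficit dF/F is
# restored by a temperature rise dT/T = (dF/F) / 4.
T_AMBIENT = 300.0        # "about 300 Kelvin," as in the talk
FORCING_FRACTION = 0.01  # doubled CO2: ~1% less radiation to space

delta_t = T_AMBIENT * FORCING_FRACTION / 4.0
print(delta_t)  # 0.75
```

Using a slightly lower ambient temperature, as the speaker hints, would bring the estimate down a little further.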

So when you hear about 3° Centigrade instead of 0.75° C, that’s a factor of four, all of which is positive feedback. Is there really that much positive feedback? Most feedbacks in nature are negative; the famous Le Chatelier principle says that if you perturb a system, it reacts in a way that dampens the perturbation, not increases it. There are a few positive-feedback systems we’re familiar with; high explosives, for example, have positive feedback. So if the Earth’s climate were like other positive-feedback systems, which are highly explosive, it would have exploded a long time ago. But the climate has never done that, so the empirical observational evidence from geology is that the climate is like most other feedback systems: the feedback is probably negative. I leave that thought with you, and let me stress again:

This is clear skies, no clouds; if you add clouds, all they do is
suppress the effects of changes in the greenhouse gases.

So now let’s talk about clouds and the theory of clouds, since we’ve already seen that clouds are very important. Here is the formidable equation of transfer, which has been around since Schwarzschild’s day. Some of the symbols here relate to the intensity; another term represents scattering. If thermal radiation hits a greenhouse gas molecule, it comes in and is immediately absorbed; there’s no scattering at all. But if it hits a cloud particle, it will scatter this way or that way, maybe even backwards.

All of that is described by this integral: you’ve got incoming light in one direction and outgoing light in a second direction. At the same time you’ve got thermal emission: the warm particles of the cloud are emitting radiation, creating photons which come out and add to the Earth’s glow. This is represented by the parameters. Even a single cloud particle has an albedo, the fraction of radiation hitting the particle that is scattered, as opposed to absorbed and converted to heat. It’s a very important parameter. For visible light and white clouds, typically 99% of the encounters are scattered; but for thermal radiation it’s much less. Water scatters thermal radiation only about half as efficiently as it does shorter wavelengths.
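The equation being described is, in standard textbook form (my transcription of a Schwarzschild-type transfer equation with scattering and thermal emission, not the slide itself):

```latex
\mu \frac{dI_\nu(\tau,\mu)}{d\tau}
  = I_\nu(\tau,\mu)
  - \frac{\tilde{\omega}_\nu}{2} \int_{-1}^{1} p(\mu,\mu')\, I_\nu(\tau,\mu')\, d\mu'
  - \left(1 - \tilde{\omega}_\nu\right) B_\nu(T)
```

Here \(I_\nu\) is the intensity, \(\tau\) the optical depth, \(\mu\) the direction cosine, \(p\) the scattering phase function, \(\tilde{\omega}_\nu\) the single-scattering albedo, and \(B_\nu(T)\) the Planck function; the integral is the scattering term and the last term is the thermal emission the speaker describes.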

The big problem is that, in spite of all the billions of dollars we have spent, these things are not well known; they should be, and would have been, known if there hadn’t been this crazy fixation on carbon dioxide and greenhouse gases. We have neglected working on the areas that are really important, as opposed to the trivial effects of greenhouse gases. Attenuation in a cloud is both scattering and absorption, and of course you have to solve these equations for every different frequency of the light, because, especially for molecules, there is a strong frequency dependence.

In summary, let me show you this photo, which was taken by Harrison Schmitt, a friend of mine, on the Apollo 17 mission. It was taken in December, and looking at it you can see that they were south of Madagascar when the photograph was taken. You can tell it was winter in the north because the Intertropical Convergence Zone is quite a bit south of the Equator; it has moved way south of India and Saudi Arabia. By good luck they had the sun behind them, so they had the whole Earth irradiated.

There’s a lot of information there, and again let me draw your attention to how much of the Earth is covered with clouds. Only part of the Earth, of the order of half, can actually be directly affected by greenhouse gases. The takeaway message is that clouds and water vapor are much more important than greenhouse gases for the Earth’s climate. The second point is the reason they’re much more important: doubling CO2, as I indicated in the middle of the talk, only causes a 1% difference in radiation to space. It is a very tiny effect, because of saturation. People like to say that’s not so, but you can’t really argue that one; even the IPCC gets the same numbers that we do.

And you also know that covering half of the sky with clouds will decrease solar heating by 50%. So for clouds it’s one to one; for greenhouse gases it’s a hundred to one. If you really want to affect the climate, you want to do something to the clouds. If you are alarmed about the warming that has happened, you will have a very hard time making any difference with Net Zero and CO2.

So one would hope that, with all the money we’ve spent trying to turn CO2 into a demon, some good science has come out of it. From my point of view, this scattering theory is a small part of it, and I think it will be here long after the craze over greenhouse gases has gone away. I hope there will be other things too. You can point to the better instrumentation we’ve got, satellite instrumentation as well as ground instrumentation; that has been a good investment of money. But the money we’ve spent on supercomputers and modeling has been completely wasted, in my view.

Lacking Data, Climate Models Rely on Guesses

A recent question was posed on Quora: Say there are merely 15 variables involved in predicting global climate change. Assume climatologists have mastered each variable to a near perfect accuracy of 95%. How accurate would a climate model built on this simplified system be?  Keith Minor has a PhD in Organic Chemistry and a PhD in Geology & Paleontology from The University of Texas at Austin. He responded with the text posted below in italics with my bolds and added images.

I like the answers to this question, and Matthew stole my thunder on the climate models not being statistical models. If we take the question and its assumptions at face value, one unsolvable overriding problem, and a limit to developing an accurate climate model that is rarely ever addressed, is the sampling issue. Knowing 15 parameters even to 99+% accuracy won’t solve this problem.

The modeling of the atmosphere is a boundary condition problem. No, I’m not talking about frontal boundaries. Thermodynamic systems are boundary condition problems, meaning that the evolution of a thermodynamic system depends not only on the conditions at t > 0 (is the system under adiabatic conditions, isothermal conditions, do these conditions change during the process, etc.?), but also on the initial conditions at t = 0 (sec, whatever). Knowing almost nothing about what even a fraction of a fraction of the molecules in the atmosphere are doing at t = 0 or at t > 0 is a huge obstacle to accurately predicting what the atmosphere will do in the near or far future. [See footnote at end on this issue.]

Edward Lorenz attempted to model the thermodynamic behavior of the atmosphere by using models that took into account twelve variables (instead of fifteen as posed by the questioner), and found (not surprisingly) that there was a large variability in the models. Seemingly inconsequential perturbations would lead to drastically different results, which diverged (euphemism for “got even worse”) the longer out in time the models were run (they still do). This presumably is the origin of Lorenz’s phrase “the butterfly effect”. He probably meant it to be taken more as an instructive hypothetical rather than a literal effect, as it is too often taken today. He was merely illustrating the sensitivity of the system to the values of the parameters, and not equating it to the probability of outcomes, chaos theory, etc., which is how the term has come to be known. This divergence over time is bad for climate models, which try to predict the climate decades from now. Just look at the divergence of hurricane “spaghetti” models, which operate on a multiple-week scale.
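Lorenz’s finding is easy to reproduce. The sketch below uses the simpler three-variable Lorenz-63 system rather than his twelve-variable weather model; the parameters are his classic choices, and the perturbation is one part in a billion.

```python
# Sensitivity to initial conditions in the three-variable Lorenz-63 system,
# with the classic parameters (sigma=10, rho=28, beta=8/3) and simple
# forward-Euler time stepping. Illustrative only.
def lorenz_step(s, dt=0.005, sigma=10.0, rho=28.0, beta=8.0 / 3.0):
    x, y, z = s
    return (x + dt * sigma * (y - x),
            y + dt * (x * (rho - z) - y),
            z + dt * (x * y - beta * z))

a = (1.0, 1.0, 1.0)
b = (1.0 + 1e-9, 1.0, 1.0)       # perturb one variable by one part in a billion

early_sep = max_sep = 0.0
for step in range(1, 10001):     # 50 model time units
    a, b = lorenz_step(a), lorenz_step(b)
    sep = abs(a[0] - b[0])
    if step == 100:              # early on, the two runs are indistinguishable
        early_sep = sep
    max_sep = max(max_sep, sep)

print(early_sep, max_sep)        # tiny at first; later, attractor-sized
```

The two runs track each other closely at first, then diverge to completely different states, exactly the behavior Lorenz described.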

The sources of variability include:

♦  the inability of the models to handle water (the most important greenhouse gas in the atmosphere, not CO2) and processes related to it;
♦  e.g., models still can’t handle the formation and non-formation of clouds;
♦  the non-linearity of thermodynamic properties of matter (which seem to be an afterthought, especially in popular discussions regarding the roles that CO2 plays in the atmosphere and biosphere), and
♦  the always-present sampling problem.

While in theory it is possible to know what a statistically significant number of the air and water molecules are doing at any point in time (that would be a lot of atoms and molecules!), a statistically significant sample of air molecules is certainly not being collected by releasing balloons twice a day from 90-some-odd weather stations in the US and territories, plus the data from commercial aircraft, plus all of the weather data from around the world. Doubling this number wouldn’t help, i.e., it wouldn’t make any difference, though there are some blind spots, such as northeast Texas, that might benefit from having a radar in the area. So you have to weigh the cost of sampling more of the atmosphere against the 0% increase in forecasting accuracy (within experimental error) that you would get by doing so.

I’ll go out on a limb and say that the NWS (National Weather Service) is actually doing a pretty good job in their 5-day forecasts with the current data and technologies that they have (e.g., S-band radar), and the local meteorologists use their years of experience and judgment to refine the forecasts for their viewing areas. The old joke is that a meteorologist’s job is the one job where you can be wrong more than half the time and still keep your job, but everyone knows that they go to work most, if not all, days with one hand tied behind their back, and sometimes two! The forecasts are not that far off on average, and so meteorologists get my unconditional respect.

In spite of these daunting challenges, there are certainly a number of areas in weather forecasting that can be improved by increased sampling, especially on a local scale. For example, for severe weather outbreaks, the CASA project is being implemented using multiple, shorter range radars that can get multiple scan directions on nearby severe-warned cells simultaneously. This resolves the problem caused by the curvature of the Earth as well as other problems associated with detecting storm-scale features tens or hundreds of miles away from the radar. So high winds, hail, and tornadoes are weather events where increasing the local sampling density/rate might help improve both the models and forecasts.

Prof. Wurman at OU has been doing this for decades with his pioneering work with mobile radar (the so-called “DOW’s”). Let’s not leave out the other researchers who have also been doing this for decades. The strategy of collecting data on a storm from multiple directions at short distances, coupled with supercomputer capabilities, has been paying off for a number of years. As a recent example, Prof. Orf at UW Madison, with his simulation of the May 24th, 2011 El Reno, OK tornado (you’ve probably seen it on the Internet), has shed light on some of the “black box” aspects to how tornadoes form. [Video below is Leigh Orf 1.5 min segment for 2018 Blue Waters Symposium plenary session. This segment summarizes, in 90 seconds, some of the team’s accomplishments on the Blue Waters supercomputer over the past five years.]

Prof. Orf’s simulation is just that, a simulation, with a resolution of around 10 m (~33 feet), but it illustrates how increased targeted sampling can be effective in at least understanding the complex thermodynamic processes occurring within a storm. Colleagues have argued that the small improvements in warning times in the last couple of decades are really due more to the heavy spotter presence these days rather than case studies of severe storms. That may be true. However, in test cases of the CASA system, it picked out the subtle boundaries along which the storms fired that went unnoticed with the current network of radars. So I’m optimistic about increased targeted sampling for use in an early warning system.

These two examples bring up a related problem: too much data! As commented on by a local meteorologist at a TESSA meeting, one of the issues with CASA that will have to be resolved is how to handle/process the tremendous amounts of data that will be generated during a severe weather outbreak. This is different from a research project where you can take your data back to the “lab”. In a real-time system, such as CASA, you need to have the ability to process the volumes of data rapidly so a meteorologist can quickly make a decision and get that life-saving info to the public. This data volume issue may be less of a problem for those using the data to develop climate models.

So back to the Quora question: with regard to a cost-effective (cost-effective being the operative term) climate model or models (say an ensemble model) that would “verify” say 50 years from now, the sampling issue is ever present, and likely cost-prohibitive at the level needed to make the sampling statistically significant. And will the climatologist be around in 50 years to be “hoisted with their own petard” when the climate model is proven to be wrong? The absence of accountability is the other problem with these long-range models into which many put so much faith.

But don’t stop using or trying to develop better climate models. Just be aware of what variables they include, how well they handle the parameters, and what their limitations are. How accurate would a climate model built on this simplified system [edit: of 15 well-defined variables (to 95% confidence level)] be? Not very!

My Comment

As Dr. Minor explains, powerful modern computers can process detailed observation data to simulate and forecast storm activity.  There are more such tools for preparing and adapting to extreme weather events which are normal in our climate system and beyond our control.  He also explains why long-range global climate models presently have major limitations for use by policymakers.

Footnote Regarding Initial Conditions Problem

What About the Double Pendulum?

Trajectories of a double pendulum

A comment by tom0mason alerted me to the science demonstrated by the double compound pendulum, that is, a second pendulum attached to the ball of the first one. It consists entirely of two simple objects functioning as pendulums, only now each is influenced by the behavior of the other.

Lo and behold, you observe that a double pendulum in motion produces chaotic behavior. Remarkably, exact equations of motion have been developed that can and do predict the positions of the two balls over time, so in fact the movements are deterministic rather than random, but only with considerable effort can they be computed, and the slightest error in the starting conditions soon ruins the prediction. The equations and descriptions are at Wikipedia: Double Pendulum.

Long exposure of double pendulum exhibiting chaotic motion (tracked with an LED)

But here is the kicker, as described in tom0mason’s comment:

If you arrive to observe the double pendulum at an arbitrary time after the motion has started from an unknown condition (unknown height, initial force, etc) you will be very taxed mathematically to predict where in space the pendulum will move to next, on a second to second basis. Indeed it would take considerable time and many iterative calculations (preferably on a super-computer) to be able to perform this feat. And all this on a very basic system of known elementary mechanics.
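tom0mason’s point can be made concrete. The sketch below integrates the standard equal-mass, equal-length double pendulum equations (the form given in the Wikipedia article cited above) with a Runge-Kutta stepper, and shows that a difference of one hundred-millionth of a radian in one starting angle destroys the prediction within seconds. All parameter values are illustrative.

```python
import math

G = 9.81  # gravity; rods of length 1 m, bobs of mass 1 kg (illustrative)

def derivs(s):
    """Angular velocities and accelerations for the equal-mass,
    equal-length double pendulum (standard textbook equations)."""
    t1, t2, w1, w2 = s
    d = t1 - t2
    den = 3.0 - math.cos(2.0 * d)
    a1 = (-3.0 * G * math.sin(t1) - G * math.sin(t1 - 2.0 * t2)
          - 2.0 * math.sin(d) * (w2 * w2 + w1 * w1 * math.cos(d))) / den
    a2 = (2.0 * math.sin(d) * (2.0 * w1 * w1 + 2.0 * G * math.cos(t1)
          + w2 * w2 * math.cos(d))) / den
    return (w1, w2, a1, a2)

def rk4(s, dt):
    # Classic 4th-order Runge-Kutta step
    k1 = derivs(s)
    k2 = derivs(tuple(x + 0.5 * dt * k for x, k in zip(s, k1)))
    k3 = derivs(tuple(x + 0.5 * dt * k for x, k in zip(s, k2)))
    k4 = derivs(tuple(x + dt * k for x, k in zip(s, k3)))
    return tuple(x + dt / 6.0 * (a + 2 * b + 2 * c + e)
                 for x, a, b, c, e in zip(s, k1, k2, k3, k4))

# Two pendulums released from rest at the same high angle, except for a
# hundred-millionth of a radian difference in the second arm.
p = (2.0, 2.0, 0.0, 0.0)
q = (2.0, 2.0 + 1e-8, 0.0, 0.0)

early_sep = max_sep = 0.0
for step in range(1, 20001):          # 20 seconds at dt = 0.001
    p, q = rk4(p, 0.001), rk4(q, 0.001)
    sep = abs(p[1] - q[1])
    if step == 500:                   # half a second in: still tiny
        early_sep = sep
    max_sep = max(max_sep, sep)

print(early_sep, max_sep)
```

For the first fraction of a second the two runs are effectively identical; well before the 20 seconds are up they bear no resemblance to each other, which is exactly the predicament of the observer arriving mid-swing.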

Our Chaotic Climate System

 

 

2024 Natural Climate Factors: Snow

Previously I posted an explanation by Dr. Judah Cohen regarding a correlation between autumn Siberian snow cover and the following winter conditions, not only in the Arctic but extending across the Northern Hemisphere. More recently, in looking into Climate Model Upgraded: INMCM5, I noticed some of the scientists were also involved in confirming the importance of snow cover for climate forecasting. Since the poles function as the primary vents for global cooling, what happens in the Arctic in no way stays in the Arctic. This post explores data suggesting changes in snow cover drive some climate changes.

The Snow Cover Climate Factor

The diagram represents how Dr. Judah Cohen pictures the Northern Hemisphere wintertime climate system.  He leads research regarding Arctic and NH weather patterns for AER.

cohen-schematic2

Dr. Cohen explains the mechanism in this diagram.

Conceptual model for how fall snow cover modifies winter circulation in both the stratosphere and the troposphere–The case for low snow cover on left; the case for extensive snow cover on right.

1. Snow cover increases rapidly in the fall across Siberia; when snow cover is above normal, diabatic cooling helps to:
2. Strengthen the Siberian high and lead to below-normal temperatures.
3. Snow-forced diabatic cooling in proximity to the high topography of Asia increases the upward flux of energy in the troposphere, which is absorbed in the stratosphere.
4. Strong convergence of WAF (Wave Activity Flux) indicates higher geopotential heights.
5. The polar vortex weakens, and the anomalies propagate down from the stratosphere into the troposphere, all the way to the surface.
6. The dynamic pathway culminates with a strong negative phase of the Arctic Oscillation at the surface.

From Eurasian Snow Cover Variability and Links with Stratosphere-Troposphere
Coupling and Their Potential Use in Seasonal to Decadal Climate Predictions by Judah Cohen.

Observations of the Snow Climate Factor

The animation at the top shows, from remote sensing, that Eurasian snow cover fluctuates significantly from year to year, with the end of October taken as a key indicator.

For more than five decades the IMS snow cover images have been digitized to produce a numerical database for NH snow cover, including area extents for Eurasia. The NOAA climate data record of Northern Hemisphere snow cover extent, Version 1, is archived and distributed by NCDC’s satellite Climate Data Record Program. The CDR is forward processed operationally every month, along with figures and tables made available at Rutgers University Global Snow Lab.

This first graph shows the snow extents of interest in Dr. Cohen’s paradigm. The Autumn snow area in Siberia is represented by the annual Eurasian averages of the months of October and November (ON). The following NH Winter is shown as the average snow area for December, January and February (DJF). Thus the year designates the December of that year plus the first two months of the next year.

Notes: The NH snow cover minimum was in 1981, trending upward since. Siberian autumn snow cover was lowest in 1989, increasing since then. Autumn Eurasian snow cover is about one-third of the winter NH snow area. Note also that the fluctuations are sizable and correlated.

The second graph presents annual anomalies for the two series, each calculated as the deviation from the mean of its entire time series. Strikingly, the Eurasian autumn fluctuations are on the same scale as the total NH fluctuations, and closely aligned with them. While NH snow cover declined for a few years prior to 2016, Eurasian snow has trended upward since then. If Dr. Cohen is correct, NH snowfall will follow. The linear trend is slightly positive, suggesting that fears of children never seeing snowfall have been exaggerated. The Eurasian trend line (not shown) is almost the same.
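For readers who want to reproduce this kind of chart, the anomaly and trend calculations are straightforward. The sketch below uses hypothetical snow-extent values, not the actual Rutgers Snow Lab data.

```python
# Anomalies as deviation from the series mean, plus an ordinary
# least-squares trend. The extent values are made up for illustration
# (million km^2), not real observations.
years = list(range(2010, 2020))
extent = [21.8, 22.4, 23.1, 22.0, 22.9, 23.4, 22.2, 23.0, 23.6, 23.2]

mean = sum(extent) / len(extent)
anomalies = [round(x - mean, 2) for x in extent]  # deviation from series mean

# Least-squares slope gives the linear trend per year.
n = len(years)
xbar = sum(years) / n
slope = (sum((x - xbar) * (y - mean) for x, y in zip(years, extent))
         / sum((x - xbar) ** 2 for x in years))

print(anomalies, round(slope, 3))
```

By construction the anomalies sum to (nearly) zero, so the plot shows the ups and downs around the long-term average, and the slope gives the trend line drawn through them.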

Illustration by Eleanor Lutz shows Earth’s seasonal climate changes. If played in full screen, the four corners present views from top, bottom and sides. It is a visual representation of scientific datasets measuring ice and snow extents.

 

 

Global Warming Big Lie–Skip the Distractions

The notion that CO2 from human activities causes global warming has multiple flaws, many of which have been dissected and rebutted here and elsewhere.  But The Big Lie is to fundamentally misrepresent how Earth’s climate system works. Richard Lindzen explains in the above interview with Jordan Peterson.  For those who prefer reading I provide a transcript from the closed captions in italics with my bolds and added images.

JP: When you started to object to the narrative, back say in ‘92, To what narrative were you objecting and on what grounds were you objecting?

RL: You’re touching on something that took me a while to understand. You know Goebbels famously said: If you tell a big enough lie and repeat it often enough, it’ll become the truth. There’s been a lot of that in this. But there are aspects of establishing the narrative, that is, of what makes something the truth, that I hadn’t appreciated.

So the narrative was: the climate is determined by a greenhouse effect, and adding CO2 to it increases warming. And moreover, besides CO2, the natural greenhouse substances (water vapor, clouds, upper-level clouds) will amplify whatever man does.

Now that immediately goes against Le Chatelier’s principle which says: If you perturb a system and it is capable internally of counteracting that, it will. And our system is so capable.

So that was a little bit odd. You began wondering, where did these feedbacks come from? Immediately people including myself started looking into the feedbacks, and seeing whether there were any negative ones, and how did it all work?

But underlying it, and this is what I learned: if you want to get a narrative established, the crucial thing is to pepper it with errors, questionable things. So that the critics will seize on those and not question the basic narrative.

The basic narrative was that climate is controlled by the greenhouse effect. In point of fact the Earth’s climate system has many regions, but two distinctly different ones: the tropics, roughly 30 degrees south to 30 degrees north latitude, and the extratropics outside that band.

They have very different dynamics, and this, by the way, is the crucial thing for the Earth. It is a technicality, and much harder to convey than saying that greenhouse gases are a blanket or that 97 percent of scientists agree.

This is actually a technical issue. The Earth rotates. Now people are aware that we have day and night, but there is also something called the Coriolis effect. When you’re on a rotating system, it gives rise to apparent forces that change the winds relative to the rotation. At the pole the rotation vector is perpendicular to the surface, while at the equator it is parallel to the surface, so its vertical component is zero.

And this gives you phenomenally different dynamics. Where you don’t have a vertical component to the rotation vector, motions do what they do in the laboratory at small scales: if you have a temperature difference, it acts to wipe it out.

Figure 11. Most sunlight is absorbed in the tropics, and some of the heat energy is carried by air currents to the polar regions to be released back into space as thermal radiation. Along with energy, angular momentum — imparted to the air from the rotating Earth’s surface near the equator — is transported to higher northern and southern latitudes, where it is reabsorbed by the Earth’s surface. The Hadley circulation near the equator is largely driven by buoyant forces on warm, solar-heated air, but for mid latitudes the “Coriolis force” due to the rotation of the earth leads to transport of energy and angular momentum through slanted “baroclinic eddies.” Among other consequences of the conservation of angular momentum are the easterly trade winds near the equator and the westerly winds at mid latitudes.

And so if you look at the tropics, the temperatures at any surface are relatively flat: they don’t vary much with latitude. On the other hand, in the mid-latitudes, the extratropics, the temperature varies a lot between the tropics and the pole. We all know that temperatures are cold at high latitudes. And if you look at changes in climate in the Earth’s history, what they show is a tropics that stays relatively constant; what changes is the temperature difference between the tropics and the pole.

During the Ice Age that difference was about 60 degrees Centigrade; today it’s about 40. Fifty million years ago, during something called the Eocene, the difference was about 20. So that’s all a function of what’s going on outside the tropics. Within the tropics the greenhouse effect is significant, but what determines the temperature change between the tropics and the pole has very little to do with the greenhouse effect.

It is a dynamic phenomenon based on the fact that a temperature difference with latitude generates instabilities. These instabilities take the form of the cyclonic and anticyclonic patterns that you see on the weather map. You can see the tropics are very different from even a casual look at a weather map.
The systems that bring us weather travel from west to east at latitudes outside the tropics. Within the tropics they travel from east to west. The prevailing winds are opposite in the two sections.

Sometimes people say that changes due to the greenhouse effect are amplified at the poles. That is not true: there is no physical basis for that statement. The tropics merely set the starting point from which the mid-latitude temperature changes proceed, and those changes are determined mainly by hydrodynamics.

Okay, that’s complicated to explain to someone, and yet it’s the basis for those claims of seemingly large significance of these small numbers. You know, they’re saying if global mean temperature goes up one and a half degrees it’s the end. That’s based on the change getting much bigger at high latitudes. But all one and a half degrees at the equator, in the greenhouse part of the Earth, would do is change the temperature everywhere by one and a half degrees, which for most of us is less than the temperature change between breakfast and lunch.

See Also

Arctic “Amplification” Not What You Think

About Meridional Cooling and Climate Change

CO2 Not a Threat, But Greatly Benefits

 

Beware false and misleading cartoons.

First, a plain language scientific explanation is in this article: THE BENEFITS OF CO2 – PART 2: CO2 does not cause global warming.  By Teri Ciccone.  Excerpts in italics with my bolds.

THE BENEFITS OF CO2 – PART 2: CO2 does not cause global warming

Abstract: Climate alarmists, the press/media, and politicians say we must achieve net zero CO2 by 2050 to avoid catastrophic global warming. In reality, all humanity and all life on Earth benefit from the increased atmospheric CO2 and the slight temperature increase. In Part 1, we presented scientific arguments to show how increased CO2 and warmer temperatures benefit humanity and all life on Earth. In Part 2, we discuss why a growing number of independent scientists and engineers argue that CO2 does not cause any measurable global warming. Part 3 will present how and why increased CO2 does not cause extreme climate/weather conditions. In the concluding Part 4, we present recommendations for policymakers.

Introduction: In this study, we set out facts and scientific principles consistent with established laws of physics, chemistry, and thermodynamics to show that greenhouse gases (GHGs) and the greenhouse effect (GHE) do not measurably warm the Earth. The concept that they do is at best a 100-year-old myth. The absorption and re-emission of longwave infrared radiation from the Earth’s surface does not cause any measurable warming of the Earth. This study also raises the particular concern that the UN IPCC and NASA/NOAA do not address all of the sources of heat that warm the Earth, nor consider the many natural forces and cycles causing weather and climate variations that operate at the Astronomical, Endo-Earth, and Bio-Earth levels. Their well-published and promoted Earth Energy Budget needs serious revision or should be discarded.

Two main concepts describe atmospheric physics and thermodynamics, and each plays a vital role in understanding how the Sun warms the Earth and how the Earth cools. The first is the Radiative Transfer Concept (RTC) and the second is the Heat Transport Concept (HTC). Both are important and both need to be studied to fully understand our weather and climate systems. RTC requires an understanding of quantum physics, radiation, photons, absorption/emission, etc. HTC requires an understanding of the more mundane science of atmospheric thermodynamics: convection, latent heat, conduction, evaporation-condensation, air and ocean circulations, etc.
Radiation Transfer Concept (RTC). The Sun provides most of the energy needed to warm the Earth to a comfortable and sustainable level for all life. Solar radiation (photons) readily crosses the vacuum of space and arrives at Earth’s Top Of Atmosphere (TOA) at the speed of light. We see a simplified conceptual model of this radiation in Figure 1, showing the wavelength/frequency, amplitude, and direction of propagation. These electromagnetic energy rays are emitted over a broad set of vibrational frequencies, spanning 20 orders of magnitude, called the spectrum. This radiant energy arrives at TOA at 1,366 W/m2, which comes to 340 W/m2 when averaged over the spherical world. About 30% of the 340 W/m2 is reflected to space, leaving 240 W/m2 for the Earth. Of that, one-third is absorbed by the atmosphere and two-thirds by the surface (oceans plus lands). The high-energy UV radiation warms the air mostly by photodissociation and photoionization. In this process, the solar radiant electromagnetic energy is converted to kinetic energy, and then to physical heat through collisions with air molecules. At the surface, about 67% warms and then cools by local physical heat processes like conduction, convection, and latent heat, and 33% by radiation. Most of this radiation exits directly to space through an “atmospheric window” at frequencies that are not absorbable by GHGs. It’s estimated that only about 0.09 W of the solar energy absorbed by the surface is radiated through CO2 and the other non-water-vapor greenhouse gases (GHGs), a trivially small amount.
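The averaging arithmetic in this excerpt is easy to check. A few lines of Python, using the 1,366 W/m2 and 30% figures quoted above, reproduce the spread-over-the-sphere numbers:

```python
# Incoming solar flux is measured on a disk facing the Sun, but spread
# over the whole rotating sphere, whose area is 4x the disk's area.
S = 1366.0        # solar flux at top of atmosphere, W/m^2 (as quoted)
albedo = 0.30     # fraction reflected back to space (as quoted)

avg_toa = S / 4.0                 # average over the sphere
absorbed = avg_toa * (1 - albedo) # what remains for the Earth

print(avg_toa, absorbed)          # roughly 340 and 240 W/m^2, as in the text
```

The division by four (sphere surface area over shadow-disk area) is where the 1,366 becomes roughly 340, and removing the 30% albedo gives the roughly 240 W/m2 figure.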
In essence, the IPCC claims that these greenhouse gases already contribute about 33 degrees C to the average global temperature, and that this contribution will continue to increase with increased quantities of human-made CO2/GHGs. In summary, the IPCC claim is based on three false assumptions:
1) When a photon’s electromagnetic energy is absorbed by CO2 it warms the CO2 molecule with physical heat. This heat is then shared with other air molecules and the atmosphere warms. The energized CO2 molecule will then re-emit a comparable photon, and the process is repeated hundreds/thousands of times before its heat/energy finally exits into space.
2) Half of all the re-emitted photons go upward as described in 1). But half are radiated downward and are re-absorbed by the surface and warm it.
3) The Sun provides Earth with only enough energy to give the planet an average global temperature of -18°C. But measurements tell us that the average global temperature is about 15°C, therefore they claim that CO2 and the GHE provide the missing 33°C heat.
Together these concepts allegedly warm the Earth by first converting the photon’s Electromagnetic Energy (EMI) into physical heat; secondly, this process delays the release of the original photon energy into space, allowing heat to accumulate. The details of their false science and the errors in their explanation are provided in the article Revised Why all the fuss with CO2 and the Greenhouse effect
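The -18°C figure in assumption 3 comes from a standard blackbody energy-balance calculation. As a sketch, using the solar flux and albedo quoted above and the standard value of the Stefan-Boltzmann constant:

```python
# Effective (blackbody) emission temperature of a planet balancing
# absorbed sunlight: T = (S * (1 - albedo) / (4 * sigma)) ** 0.25
S = 1366.0         # W/m^2, as quoted in the excerpt
albedo = 0.30
sigma = 5.670e-8   # Stefan-Boltzmann constant, W/m^2/K^4

T = (S * (1 - albedo) / (4 * sigma)) ** 0.25
print(T - 273.15)  # roughly -18 C, the figure in assumption 3
```

The roughly 255 K (about -18°C) result is the benchmark against which the observed ~15°C average is compared, yielding the 33°C attributed to the greenhouse effect in the IPCC account that the authors dispute.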
Heat Transport Concept (HTC) is the second step toward understanding what happens to the EMI energy of these photons as they approach the surface. Some will be reflected to space; in Antarctica, for example, nearly 100% is reflected to space by the ice and snow. Some will be scattered in the atmosphere and cause no measurable warming of anything. Most of these photons are visible light and infrared and are not readily absorbed by the air. Nearly all of these photons are absorbed by the surface, the dense solid and liquid matter, and warm it. This process of photon absorption by the surface and warming is called thermalization, and here the Heat Transport Concept takes over. In general, the atmosphere does not warm the surface because the air is normally cooler than the surface. We know this from the established atmospheric temperature lapse rates. For example, the dry air lapse rate is 9.8°C per km of altitude, meaning the atmospheric temperature cools by 9.8°C for each kilometre of altitude up to the top of the troposphere. According to the Second Law of thermodynamics, heat can only flow from a warmer object to a colder object, never the other way around. The RTC espoused by the UN IPCC violates this Second Law, thereby falsifying point 2) above. Nitrogen, oxygen, argon, and water vapor make up more than 99% of the atmosphere. The air close to the surface has a large optical depth; it is very opaque. This ensures that almost 100% of resonant Longwave Infrared Radiation (LWIR) photons emitted by the surface will be absorbed by GHGs and immediately thermalized within the first several millimetres of altitude. This means that a resonant photon will be immediately absorbed by a CO2 molecule, which becomes energized. An energized CO2 molecule could spontaneously re-emit the absorbed photon in about half a second, as described by the UN IPCC theory.
In reality, however, within a millionth/billionth of a second that energised CO2 molecule will physically collide with a non-GHG molecule. The impact of the collision will heat the non-GHG molecule and de-energize the CO2 molecule. So that initial photon has disappeared, and its electromagnetic energy is changed to physical heat and is dispersed throughout the atmosphere adding a tiny bit of warmth to the air. The detailed process is described in Part 2, pages 5-6 of the paper linked above. 
Conclusion. This CO2 photon absorption and immediate collision-thermalization process dominates throughout the troposphere. This means that the CO2 resonant LWIR radiation energy radiating from the surface is saturated to extinction by an overabundance of CO2/GHG molecules: there are no resonant photons left in the troposphere for the CO2 molecule to absorb. This is readily visible in Figure 2 in the presence of the CO2 notch, the top red arrow. In the stratosphere and mesosphere, where the molecules are few and far apart, we see an increase in CO2 spontaneous re-emissions, implying a reverse-thermalization whereby the re-emitted photons exit to space. The UN IPCC claim that CO2/GHGs are the major cause of global warming is thus falsified by the above explanation, which is consistent with established laws of science, combined with the absence of any scientific test data.
This makes two powerful cases. First, we can safely abandon the goal and costs of Net Zero and Carbon Capture and Sequestration: even in the extreme case that atmospheric CO2 doubles or quadruples, there is no risk of a catastrophic, runaway global warming threat. Second, the press/media, politicians, and compromised scientists will also say that we should reduce and remove CO2 from the air just to be sure. But consider the fact that there is about 50 times more CO2 dissolved in the oceans than in the air. Henry’s Law tells us nature apportions how much CO2 goes into the air and how much goes into the oceans based only on the temperature of the water’s surface. During a hot sunny day, more CO2 flows from the oceans to the air, and in cold winters more CO2 flows from the air into the oceans and all water on Earth. Imagine the silliness of spending billions or trillions of dollars to remove CO2 from the air only to have the oceans immediately replace it.
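To illustrate the temperature dependence Henry’s Law describes, here is a minimal sketch using the standard van ’t Hoff form for CO2 solubility in water. The constants (about 3.4e-2 mol per litre-atmosphere at 25°C, with a 2400 K temperature coefficient) are rounded values from standard compilations and are used here for illustration only.

```python
import math

# Henry's-law solubility of CO2 in water, van 't Hoff temperature form.
# Constants are rounded literature values, for illustration only.
def kH(T):
    """Henry's constant in mol/(L*atm) at absolute temperature T (K)."""
    return 3.4e-2 * math.exp(2400.0 * (1.0 / T - 1.0 / 298.15))

cold = kH(278.15)   #  5 C water
warm = kH(298.15)   # 25 C water

print(cold / warm)  # colder water holds substantially more CO2
```

Colder water holds roughly 1.8 times as much CO2 as water 20°C warmer, which is the mechanism behind the seasonal ocean-air flows described in the excerpt.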
In summary, we find that CO2/GHG absorption of LWIR energy emitted by the surface provides a negligible amount of physical heat in the troposphere. Consequently, the GHG/GHE provides near zero of the missing heat that the IPCC attributes to the GHGs and the GHE.

Background Paper with complete discussion

Missing Link in the GHE, Greenhouse Effect, by Thomas Shula – Markus Ott,  USA – Germany
2024.  From allaboutenergy.net

IPCC Crusade Built on Science Mistakes

“Mistake” definition (American Heritage Dictionary)

Noun

  1. An error or fault resulting from defective judgment, deficient knowledge, or carelessness.
  2. A misconception or misunderstanding.

Five Major IPCC Science Mistakes

♦  Surface station records have warmed mostly from urban heat sources, not IR-active gases.

♦  Solar climate forcing varies more than IPCC admits.

♦  Experiments show more CO2 does not make air warmer.

♦  On all time scales temperature changes lead and CO2 changes follow.

♦  IPCC climate models exclude natural climate factors to blame all warming on GHGs.

Mistakes on Temperature Records and Solar Forcing

The first two misconceptions are described in a recent paper by CERES (Center for Environmental Research and Earth Sciences).  My post below provides the details.

Overview of CERES Study

Our review suggests that the IPCC reports have inadequately accounted for two major scientific concerns when they were evaluating the causes of global warming since the 1850s:

1. The global temperature estimates used in the IPCC reports are contaminated by urban warming biases.
2.  The estimates of solar activity changes since the 1850s considered by the IPCC substantially downplayed a possible large role for the Sun.

We conclude that it is not scientifically valid for the IPCC to rule out the possibility that global warming might be mostly natural.

Fatal Flaw Discredits IPCC Science

By way of John Ray comes this Spectator Australia article A basic flaw in IPCC science.  Excerpts in italics with my bolds and added images.

Detailed research is underway that threatens to undermine the foundations of the climate science promoted by the IPCC since its First Assessment Report in 1990. The research is re-examining the rural and urban temperature records in the Northern Hemisphere that are the foundation for the IPCC’s estimates of global warming since 1850. The research team has been led by Dr Willie Soon (a Malaysian solar astrophysicist associated with the Smithsonian Institution for many years) and two highly qualified Irish academics, Dr Michael Connolly and his son Dr Ronan Connolly. They have formed a climate research group, CERES-SCIENCE. Their detailed work will be a challenge for the IPCC 7th Assessment Report, due to be released in 2029, as their results challenge the very foundations of IPCC science.

The climate warming trend published by the IPCC is a continually updated graph based on the temperature records of Northern Hemisphere land surface temperature stations dating from the mid-19th Century. The latest IPCC 2021 report uses data for the period 1850-2018. The IPCC’s selection of Northern Hemisphere land surface temperature records is not in question and is justifiable: the Northern Hemisphere records provide the best database for this period, while the Southern Hemisphere land temperature records are not as extensive and are sparse for the 19th and early 20th Century. It is generally agreed that urban temperature data is significantly warmer than rural data in the same region because of an urban warming bias. This bias is due to night-time re-radiation of the daytime solar energy absorbed by concrete and bitumen, which leads to higher urban night-time temperatures than in, say, the nearby countryside. The IPCC acknowledges such a warming bias but alleges the increased effect is only 10 per cent and therefore does not significantly distort its published global warming trend lines.


Since 2018, Dr Soon and his partners have analysed the data from rural and urban temperature recording stations in China, the USA, the Arctic, and Ireland. The number of stations with reliable temperature records in these areas increased from very few in the mid-19th Century to around 4,000 in the 1970s before decreasing to around 2,000 by the 1990s. The rural temperature recording stations with good records peaked at 400 and are presently around 200.

Their analysis of individual stations needs to account for any variation in their exposure to the Sun due to changes in their location, or shadowing due to the construction of nearby buildings, or nearby vegetation growth. The analysis of rural temperature stations is further complicated as, over time, many are encroached upon by nearby cities. Consequently, the data from such stations needs to be shifted at certain dates from the rural temperature database to either an intermediate database or to a full urban database. As a result, an accurate analysis of the temperature records of each recording station is a time-consuming task.


This new analysis of 4,000 temperature recording stations in China, the USA, the Arctic, and Ireland shows a warming trend of 0.89ºC per century in the urban stations, 1.61 times higher than the warming trend of 0.55ºC per century in the rural stations. This difference is far more significant than the 10 per cent divergence between urban and rural stations alleged in the IPCC reports; a divergence explained by a potential flaw in the IPCC’s methodology. The IPCC uses a technique called homogenisation that averages the rural and urban temperatures in a particular region. This method distorts the rural temperature records because over 75 per cent of the temperature records used in this homogenisation methodology are from urban stations. So a methodology that attempts to statistically identify and correct biases that may be in the raw data in effect leads to an urban blending of the rural dataset. The result is biased, as it downgrades the actual values of each rural temperature station. In contrast, Dr Soon and his coworkers avoided homogenisation, so the temperature trends they identify for each rural region are accurate: the rural data are not distorted by the readings from nearby urban stations.
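As a back-of-envelope check on the blending argument, here is a toy calculation (my own sketch, not from the paper, using the trend figures and the 75 per cent urban station share quoted in the article; note that 0.89/0.55 rounds to 1.62, which the article truncates to 1.61):

```python
# Toy illustration of how an urban-dominated average pulls a
# "homogenised" regional trend toward the urban value.
# Trend figures and the 75% urban share are those quoted in the article.
urban_trend = 0.89   # deg C per century, urban stations
rural_trend = 0.55   # deg C per century, rural stations
urban_share = 0.75   # fraction of blended records that are urban

blended = urban_share * urban_trend + (1 - urban_share) * rural_trend
print(f"Urban/rural ratio: {urban_trend / rural_trend:.2f}")   # 1.62
print(f"Blended trend: {blended:.3f} C/century")               # 0.805
```

The blended figure lands much closer to the urban trend than to the rural one, which is the distortion the article describes.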


The rural temperature trend measured by this new research is 0.55ºC per century, indicating the Earth has warmed 0.9ºC since 1850. In contrast, the urban temperature trend is 0.89ºC per century, indicating a much higher warming of 1.5ºC since 1850. Consequently, a distorted urban warming trend has been used by the IPCC to quantify the warming of the whole of the Earth since 1850. The exaggeration is significant, as the urban temperature record database used by the IPCC represents the temperatures on only 3-4 per cent of the Earth’s land surface area; an area less than 2 per cent of the Earth’s total surface area. Dr Willie Soon and his research team are currently analysing the meta-history of 800 European temperature recording stations. When this is done, their research will be based on a very significant database of Northern Hemisphere rural and urban temperature records from China, the USA, the Arctic, Ireland, and Europe.

This new research has unveiled another flaw in the IPCC‘s temperature narrative, as trend lines in the revised temperature datasets differ from those published by the IPCC. For example, the rural records now show a marked warming trend in the 1930s and 1940s, while there is only a slight warming trend in the IPCC dataset. The most significant difference is the existence of a marked cooling period in the rural dataset for the 1960s and 1970s that is almost absent in the IPCC’s urban dataset. This latter divergence upsets the common narrative that rising carbon dioxide levels control modern warming trends. For, if carbon dioxide levels are the driver of modern warming, how can a higher rate of increase in carbon dioxide levels exist within a cooling period in the 1960s and 1970s, while a lower rate of increase coincides with an earlier warming interval in the 1930s and 1940s? Or, in other words, how can carbon dioxide levels increasing at 1.7 parts per million per decade cause a distinct warming period in the 1930s and 1940s, while a larger rate of increase of 10.63 parts per million per decade is associated with a distinct cooling period in the 1960s and 1970s? Consequently, the research of Willie Soon and his coworkers is discrediting not only the higher global warming trends specified in IPCC Reports, but also the theory that rising carbon dioxide levels explain modern warming trends; a lynchpin of IPCC science for the last 25 years.

Willie Soon and his coworkers maintain that climate scientists need to consider other possible explanations for recent global warming. They point to the Sun, but the IPCC maintains that variations in Total Solar Irradiance (TSI) occur over eons, not over shorter periods such as the last few centuries. For that reason, the IPCC points to changes in greenhouse gases as the most obvious explanation for global warming since 1850. In contrast, Willie Soon and his coworkers maintain there can be short-term changes in solar activity and, for example, refer to a period of no sunspot activity that coincided with the Little Ice Age in the 17th Century. They also point out there is still no agreed average figure for TSI despite 30 years of measurements taken by various satellites. Consequently, they contend research in this area is not settled.

The CERES-SCIENCE research project pioneered by Dr Willie Soon and the father-son Connolly team has questioned the validity of the high global warming trends for the 1850-present period that have been published by the IPCC since its first report in 1992. The research also queries the IPCC narrative that rising greenhouse gas concentrations, particularly carbon dioxide, are the primary driver of global warming since 1850. That narrative has been the foundation of IPCC climate science for the last 40 years. It will be interesting to see how the IPCC’s 7th Assessment Report in 2029 treats this new research that questions the very basis of IPCC’s climate science.

The paper is The Detection and Attribution of Northern Hemisphere Land Surface Warming (1850–2018) in Terms of Human and Natural Factors: Challenges of Inadequate Data. 

Abstract

A statistical analysis was applied to Northern Hemisphere land surface temperatures (1850–2018) to try to identify the main drivers of the observed warming since the mid-19th century. Two different temperature estimates were considered—a rural and urban blend (that matches almost exactly with most current estimates) and a rural-only estimate. The rural and urban blend indicates a long-term warming of 0.89 °C/century since 1850, while the rural-only indicates 0.55 °C/century. This contradicts a common assumption that current thermometer-based global temperature indices are relatively unaffected by urban warming biases.

Three main climatic drivers were considered, following the approaches adopted by the Intergovernmental Panel on Climate Change (IPCC)’s recent 6th Assessment Report (AR6): two natural forcings (solar and volcanic) and the composite “all anthropogenic forcings combined” time series recommended by IPCC AR6. The volcanic time series was that recommended by IPCC AR6. Two alternative solar forcing datasets were contrasted. One was the Total Solar Irradiance (TSI) time series that was recommended by IPCC AR6. The other TSI time series was apparently overlooked by IPCC AR6. It was found that altering the temperature estimate and/or the choice of solar forcing dataset resulted in very different conclusions as to the primary drivers of the observed warming.

Our analysis focused on the Northern Hemispheric land component of global surface temperatures since this is the most data-rich component. It reveals that important challenges remain for the broader detection and attribution problem of global warming: (1) urbanization bias remains a substantial problem for the global land temperature data; (2) it is still unclear which (if any) of the many TSI time series in the literature are accurate estimates of past TSI; (3) the scientific community is not yet in a position to confidently establish whether the warming since 1850 is mostly human-caused, mostly natural, or some combination. Suggestions for how these scientific challenges might be resolved are offered.

Mistake on CO2 Warming Effect

Thomas Allmendinger is a Swiss physicist educated at Zurich ETH whose practical experience is in the fields of radiology and elementary particle physics.  His complete biography is here.

His independent research and experimental analyses of greenhouse gas (GHG) theory over the last decade led to several published studies, including the latest summation The Real Origin of Climate Change and the Feasibilities of Its Mitigation, 2023, in the journal Atmospheric and Climate Sciences. The paper is a thorough and detailed discussion, of which I provide here the abstract and the excerpt describing the experiment.  Excerpts are in italics with my bolds and added images. Full post is Experimental Proof Nil Warming from GHGs.

Abstract

The present treatise represents a synopsis of six important previous contributions of the author concerning atmospheric physics and climate change. Since this issue is influenced by politics like no other, and since the greenhouse doctrine with CO2 as the culprit in climate change is predominant, the respective theory has to be outlined, revealing its flaws and inconsistencies.

But beyond that, the author’s own contributions are brought into focus and discussed in depth. The most eminent one concerns the discovery of the absorption of thermal radiation by gases, leading to warming-up, and implying a thermal radiation of gases which depends on their pressure. This delivers the final evidence that trace gases such as CO2 don’t have any influence on the behaviour of the atmosphere, and thus on climate.

But the most useful contribution concerns the method which enables determination of the solar absorption coefficient βs of coloured opaque plates. It delivers the foundations for modifying materials with respect to their capability for climate mitigation. Thereby, the main influence is due to the colouring, in particular of roofs, which should be painted, preferably light-brown (not white, for aesthetic reasons).

It must be clear that such a drive for brightening up the world would be the only chance of mitigating the climate, whereas the greenhouse doctrine related to CO2 has to be abandoned. However, a global climate model with forecasts cannot be aspired to, since this problem is too complex and since several climate zones exist.

4. Thermal Gas Absorption Measurements

If the warming-up behaviour of gases is to be determined by temperature measurements, interference by the walls of the gas vessel must be considered, since they exhibit a significantly higher heat capacity than the gas does, which implies a slower warming-up rate. Since solid materials absorb thermal radiation more strongly than gases do, the risk exists that the walls of the vessel are directly warmed up by the radiation and subsequently transfer the heat to the gas. Finally, even the thin glass walls of the thermometers may disturb the measurements by absorbing thermal radiation.

For these reasons, square tubes with a relatively large profile (20 cm) were used, which consisted of 3 cm thick Styrofoam plates and were covered at the ends by thin plastic foils. In order to measure the temperature course along the tube, mercury thermometers were mounted at three positions (bottom, middle, and top), with their tips covered with aluminium foil. The test gases were supplied from steel cylinders equipped with reducing valves. They were introduced via a connector over approximately one hour, because the tube was not gastight and not sturdy enough to be evacuated. The filling process was monitored by means of a hygrometer, since the air being replaced was slightly humid. Afterwards, the tube was optimized by attaching adhesive foils and thin aluminium foils (see Figure 13). The equipment and the results are reported in [21].

Figure 13. Solar-tube, adjustable to the sun [21].

The initial measurements were made outdoors with twin tubes in the presence of sunlight. One tube was filled with air, and the other with carbon dioxide. The temperature increased within a few minutes by approximately ten degrees until constant limiting temperatures were attained, simultaneously at all positions. Surprisingly, this was the case in both tubes, thus also in the tube filled with ambient air. This result alone delivered the proof that the greenhouse theory cannot be true. Moreover, it gave rise to investigating the phenomenon more thoroughly by means of artificial, better-defined light.

Figure 14. Heat-radiation tube with IR-spot [21].

Accordingly, the subsequent experiments were made using IR spots with wattages of 50 W, 100 W and 150 W, which are normally employed for terraria (Figure 14). The 150 W IR spot in particular led to a considerably higher temperature increase of the enclosed gas than was the case when sunlight was applied, since its ratio of thermal radiation was higher. Thereby, variable influences such as the nature of the gas could be evaluated.

Due to the results with IR spots on different gases (air, carbon dioxide, and the noble gases argon, neon and helium), essential knowledge could be gained. In each case, the irradiated gas warmed up until a stable limiting temperature was attained. Analogously to the case of irradiated coloured solid plates, the temperature increased until the equilibrium state was attained, where the heat absorption rate was equal to the heat emission rate.

Figure 15. Time/temperature-curves for different gases [21] (150 W-spot, medium thermometer-position).

As evident from the diagram in Figure 15, the initial observation made with sunlight was confirmed: pure carbon dioxide warmed up almost to the same degree as air did (whereby ambient air only scarcely differed from a 4:1 mixture of nitrogen and oxygen). Moreover, noble gases absorb thermal radiation, too. As subsequently outlined, a theoretical explanation could be found for this.

Conclusion

Finally, the theoretically suggested dependency of the atmospheric thermal radiation intensity on the atmospheric pressure could be empirically verified by measurements at different altitudes, namely in Glattbrugg (430 m above sea level) and on top of the Furka pass (2,430 m above sea level), both in Switzerland, delivering a so-called atmospheric emission constant A ≈ 22 W·m⁻²·bar⁻¹·K⁻⁰·⁵. It explained the altitude paradox of the atmospheric temperature and delivered the definitive evidence that the atmospheric behaviour, and thus the climate, does not depend on trace gases such as CO2. However, the atmosphere does thermally reradiate, leading to something similar to a greenhouse effect. But this effect is solely due to the atmospheric pressure.
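Read from its units (W·m⁻²·bar⁻¹·K⁻⁰·⁵), the claimed emission constant implies a relation of the form I = A·p·√T. A minimal sketch under that reading follows; the site pressures and temperatures are my own illustrative guesses, not the paper's measurements:

```python
import math

A = 22.0  # claimed atmospheric emission constant, W/m^2/bar/K^0.5

def thermal_radiation(pressure_bar, temp_k):
    """Radiation intensity implied by the constant's units: I = A * p * sqrt(T)."""
    return A * pressure_bar * math.sqrt(temp_k)

# Illustrative conditions (assumed, not measured) near the two sites:
glattbrugg = thermal_radiation(0.96, 288.0)  # ~430 m above sea level
furka      = thermal_radiation(0.76, 275.0)  # ~2430 m above sea level
print(f"Glattbrugg: {glattbrugg:.0f} W/m^2, Furka pass: {furka:.0f} W/m^2")
```

Under this relation the lower-altitude site radiates more strongly, which is the pressure dependence the paper claims to have measured.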

Mistake on Warming Prior to CO2 Rising

Changes in CO2 follow changes in global temperatures on all time scales, from last month’s observations to ice core datasets spanning millennia. Since CO2 is the lagging variable, it cannot logically be the cause of temperature, the leading variable. It is folly to imagine that by reducing human emissions of CO2, we can change global temperatures, which are obviously driven by other factors.  Most recent post on this:

10/2024 Update Recent Warming Spike Drives Rise in CO2

Mistake on Models Bias Against Natural Factors

Figure 1. Anthropic and natural contributions. (a) Locked scaling factors, weak Pre Industrial Climate Anomalies (PCA). (b) Free scaling, strong PCA

In 2009, the iconic email from the Climategate leak included a comment by Phil Jones about the “trick” used by Michael Mann to “hide the decline” in his Hockey Stick graph, referring to tree proxy temperatures cooling rather than warming in modern times.  Now we have an important paper demonstrating that climate models insist on man-made global warming only by hiding the incline of natural warming in Pre-Industrial times.  The paper is From Behavioral Climate Models and Millennial Data to AGW Reassessment by Philippe de Larminat.  H/T No Tricks Zone. Excerpts in italics with my bolds.

Abstract

Context. The so-called AGW (Anthropogenic Global Warming) is based on thousands of climate simulations indicating that human activity is virtually solely responsible for the recent global warming. The climate models used are derived from the meteorological models used for short-term predictions. They are based on the fundamental and empirical physical laws that govern the myriad of atmospheric and oceanic cells integrated by the finite-element technique. Numerical approximations, empiricism and the inherent chaos in fluid circulations make these models questionable for validating the anthropogenic principle, given the accuracy required (better than one per thousand) in determining the Earth’s energy balance.

Aims and methods. The purpose is to quantify and simulate behavioral models of weak complexity, without referring to predefined parameters of the underlying physical laws, but relying exclusively on generally accepted historical and paleoclimate series.

Results. These models perform global temperature simulations that are consistent with those from the more complex physical models. However, the repartition of contributions to the present warming depends strongly on the retained temperature reconstructions, in particular the magnitudes of the Medieval Warm Period and the Little Ice Age. It also depends on the level of the solar activity series. It follows from these observations and climate reconstructions that the anthropogenic principle only holds for climate profiles assuming almost no PCA and no significant variations in solar activity. Otherwise, it reduces to a weak principle where global warming is not only the result of human activity, but is largely due to solar activity.  Full post is here:

Climate Models Hide the Paleo Incline

Methane False Alarm, Microbes Are to Blame

Home fireplace burning Nat Gas, which is 75% methane (CH4).

Jo Nova explains at her blog Mysterious record methane surge since 2020 was not fossil fuels but “90% due to microbes”.  Excerpts in italics with my bolds and added images.

Nobody checked the carbon-13 ratios!

Wouldn’t you know it — 150 nations signed the Global Methane Pledge without even bothering to check if the methane was man-made.

Methane — the second most hated Greenhouse gas — spiked to record historic levels in the last few years, over 1,900 parts per billion.  In 2019, even the WEF scientists admitted they couldn’t explain the baffling rise, and then in 2020, the world of methane went into the twilight zone.  We shut down the modern world due to the pandemic, and methane levels rose even faster.

It seems many have been blaming fossil fuels for the global
surge in emissions, but forgot to check the C13 isotopes.

Somehow we spend millions on breathalysing cows, measuring their burps, and feeding them seaweed, but didn’t think to do the basic chemistry. How could that be, you might wonder… 158 nations agreed to cut methane emissions by 30% by 2030, but none of them audited the science even though very strange things were happening. (The point was obviously the “pledge”, the junkets, the captive industries and subsidies, anything but the science).

Methane from fossil fuels has a higher carbon-13 ratio, but even though fossil fuel use was rising, the carbon-13 levels of atmospheric methane were rolling down a hill. Indeed, this new study shows they have been falling for 17 years.

It’s not like this snuck up on us… Any inquiring mind should have seen this coming a decade ago. The lab has been recording C13 in methane since 1998 and gets air samples from 22 sites around the world every week or two.

From the press release:

Microbes in environment drove methane emissions more than fossil fuels between 2020 and 2022, analysis finds

They found that between 2020 and 2022, the drastic increase in atmospheric methane was driven almost entirely by microbial sources. Since 2007, scientists have observed microbes playing a significant role in methane emissions, but their contribution has surged to over 90% starting in 2020.

“Some prior studies have suggested that human activities, especially fossil fuels, were the primary source of methane growth in recent years,” said Xin (Lindsay) Lan…

“These studies failed to look at the isotope profile of methane.”

They go on to mention that in a warmer world, bacteria have a higher metabolism, which means they are happier and work faster. Thus, like CO2, if the world warms for any reason at all, methane will rise — and there is nothing we can do about it.

The one last straw they could clutch is that maybe the microbes were “man-made”: It remains unclear whether the increased microbial emissions came from natural sources like wetlands or human-driven sources, such as landfills and agriculture. The team plans to delve deeper to identify the exact source of methane.

As if somehow there was a surge in landfill, rice paddies
or cows in the last few years that no one had noticed.

This is a pretty big deal: methane has supposedly caused about 30% of our current temperature rise (say the broken climate models), yet 90% of that recent rise was microbes. It’s yet another slice of the climate we aren’t controlling, but we’re still designing burgers with mealworms and bacon from fungus, in the hope of reducing methane emissions and controlling the weather.  Then it turns out every swamp and square meter of soil is working against us.

Methane concentrations in the air have almost tripled since the 1700s, but that was the Little Ice Age.  It’s easy to believe that as the world warmed up, the planet’s wetlands and soil microbes have just been returning to normal business for the last 300 years.

We skeptics told the experts long ago it was mostly not man-made. Tom Quirk showed that methane rises and falls in time with El Ninos, and is thus largely a natural phenomenon. Willie Soon also pointed out that one of Saturn’s moons has more methane than all the oil and gas deposits on Earth, but has no dinosaurs, cows or leaky wells.

REFERENCE

Michel, Sylvia Englund, et al (2024) Rapid shift in methane carbon isotopes suggests microbial emissions drove record high atmospheric methane growth in 2020–2022, Proceedings of the National Academy of Sciences. DOI: 10.1073/pnas.2411212121

Give Daisy a Break!

Global Warming Abates in Autumn 2024

Hot, Hot, Hot.  You will have noticed that the term “climate change” is now synonymous with “summer”.  Since the northern hemisphere is where most of the world’s land, people and media are located, two typical summer months and a hot European August have been depicted as the fires of hell awaiting any and all who benefit from fossil fuels. If you were wondering what the media would do, apart from obsessing over the many small storms this year, you are getting the answer.

Fortunately, Autumn is on the way and already bringing cooler evenings in Montreal where I live. Once again open windows provide fresh air for sleeping, while mornings are showing condensation, and frost sometimes. This year’s period of “climate change” is winding down.  Unless of course, we get some hurricanes the next two months.  Below is a repost of seasonal changes in temperature and climate for those who may have been misled by the media reports of a forever hotter future.


Autumnal Climate Change

Seeing a lot more of this lately, along with hearing the geese  honking. And in the next week or two we expect that trees around here will lose their leaves. It definitely is climate change of the seasonal variety.

Interestingly, the science on this is settled: it is all due to the reduction of solar energy because of the shorter length of day (LOD). The trees drop their leaves and go dormant because of less sunlight, not because of lower temperatures. The latter is an effect, not the cause.
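The day-length driver can be computed directly from the standard sunrise equation; here is a short sketch (the simple cosine declination approximation is my own choice and is only good to within several minutes):

```python
import math

def day_length_hours(latitude_deg, day_of_year):
    """Approximate daylight hours via the standard sunrise equation,
    using a simple cosine approximation for solar declination."""
    decl = -23.44 * math.cos(math.radians(360.0 / 365.0 * (day_of_year + 10)))
    lat, dec = math.radians(latitude_deg), math.radians(decl)
    cos_h = -math.tan(lat) * math.tan(dec)
    cos_h = max(-1.0, min(1.0, cos_h))  # clamp for polar day / polar night
    return 2.0 * math.degrees(math.acos(cos_h)) / 15.0

# Montreal (~45.5 N): summer vs winter solstice
print(f"Jun 21: {day_length_hours(45.5, 172):.1f} h")  # roughly 15.5
print(f"Dec 21: {day_length_hours(45.5, 355):.1f} h")  # roughly 8.5
```

A seven-hour swing in daily sunlight at Montreal's latitude, and larger swings farther north, is the seasonal "climate change" described above.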

Of course, the farther north you go, the more remarkable the seasonal climate change. St. Petersburg, Russia has their balmy “White Nights” in June when twilight is as dark as it gets, followed by the cold, dark winter and a chance to see the Northern Lights.

And as we have been monitoring, the Arctic ice has been melting from sunlight in recent months, but is already building again in the twilight, to reach its maximum in March under the cover of darkness.

We can also expect in January and February for another migration of millions of Canadians (nicknamed “snowbirds”) to fly south in search of a summer-like climate to renew their memories and hopes. As was said to me by one man in Saskatchewan (part of the Canadian wheat breadbasket region): “Around here we have Triple-A farmers: April to August, and then Arizona.” Here’s what he was talking about: Quartzsite Arizona annually hosts 1.5M visitors, mostly between November and March.

Of course, this is just North America. Similar migrations occur in Europe, and in the Southern Hemisphere the climates are changing in the opposite direction, with spring currently underway. Since it is so obviously the sun causing this seasonal change, the question arises: does the sunlight vary on longer than annual timescales?

The Solar-Climate Debate

And therein lies a great, enduring controversy between those (like the IPCC) who dismiss the sun as a driver of multi-decadal climate change, and those who see a connection between solar cycles and Earth’s climate history. One side can be accused of ignoring the sun because of a prior commitment to CO2 as the climate “control knob”.

The other side is repeatedly denounced as “cyclomaniacs” in search of curve-fitting patterns to prove one or another thesis. It is also argued that a claim of 60-year cycles cannot be validated with only 150 years or so of reliable data. That point has weight, but it is usually made by those on the CO2 bandwagon, despite temperature and CO2 trends correlating for only three decades during the last century.

One scientist in this field is Nicola Scafetta, who presents the basic concept this way:

“The theory is very simple in words. The solar system is characterized by a set of specific gravitational oscillations due to the fact that the planets are moving around the sun. Everything in the solar system tends to synchronize to these frequencies beginning with the sun itself. The oscillating sun then causes equivalent cycles in the climate system. Also the moon acts on the climate system with its own harmonics. In conclusion we have a climate system that is mostly made of a set of complex cycles that mirror astronomical cycles. Consequently it is possible to use these harmonics to both approximately hindcast and forecast the harmonic component of the climate, at least on a global scale. This theory is supported by strong empirical evidences using the available solar and climatic data.”

He goes on to say:

“The global surface temperature record appears to be made of natural specific oscillations with a likely solar/astronomical origin plus a noncyclical anthropogenic contribution during the last decades. Indeed, because the boundary condition of the climate system is regulated also by astronomical harmonic forcings, the astronomical frequencies need to be part of the climate signal in the same way the tidal oscillations are regulated by soli-lunar harmonics.”

He has concluded that “at least 60% of the warming of the Earth observed since 1970 appears to be induced by natural cycles which are present in the solar system.” For the near future he predicts a stabilization of global temperature and cooling until 2030-2040. Note that several El Nino spikes have temporarily taken the GMT (Global Mean Temperature) anomaly outside the forecasted bounds.
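A Scafetta-style decomposition can be sketched as an ordinary least-squares fit of fixed-period sinusoids plus a linear anthropogenic term. This is a toy example on synthetic data, not the actual GMT record; the 60- and 20-year periods are ones commonly cited in this literature:

```python
import numpy as np

rng = np.random.default_rng(0)
years = np.arange(1880, 2020)

# Synthetic "temperature" series: a 60-year cycle plus trend plus noise
t = (0.1 * np.sin(2 * np.pi * years / 60.0)
     + 0.006 * (years - 1880)
     + rng.normal(0, 0.03, years.size))

# Design matrix: intercept, linear trend, and sin/cos at fixed periods
periods = [60.0, 20.0]
cols = [np.ones(years.size), years - 1880.0]
for p in periods:
    cols += [np.sin(2 * np.pi * years / p), np.cos(2 * np.pi * years / p)]
X = np.column_stack(cols)

coef, *_ = np.linalg.lstsq(X, t, rcond=None)
amp60 = np.hypot(coef[2], coef[3])  # recovered 60-year cycle amplitude
print(f"Recovered 60-yr amplitude: {amp60:.3f} (true 0.1)")
print(f"Recovered trend: {coef[1]:.4f} C/yr (true 0.006)")
```

The fit cleanly separates the harmonic component from the linear residual on clean synthetic data; the real dispute is whether the actual temperature record contains such fixed astronomical periods at all.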

For more see Scafetta vs. IPCC: Dueling Climate Theories

A more recent Scafetta publication is Reconstruction of the Interannual to Millennial Scale Patterns of the Global Surface Temperature in the journal Atmosphere.  It provides this exhibit comparing his semi-empirical forecast to HadCRUT4.

A Deeper, but Accessible Presentation of Solar-Climate Theory

I have found this presentation by Ian Wilson to be persuasive while honestly considering all of the complexities involved.

The author raises the question: What if there is a third factor that not only drives the variations in solar activity that we see on the Sun but also drives the changes that we see in climate here on the Earth?

The linked article is quite readable by a general audience, and comes to a similar conclusion as Scafetta above: There is a connection, but it is not simple cause and effect. And yes, length of day (LOD) is a factor beyond the annual cycle.

Click to access IanwilsonForum2008.pdf

It is fair to say that we are still at the theorizing stage of understanding a solar connection to earth’s climate. And at this stage, investigators look for correlations in the data and propose theories (explanations) for what mechanisms are at work. Interestingly, despite the lack of interest from the IPCC, solar and climate variability is a very active research field these days.

For example, Svensmark now has a Cosmoclimatology theory supported by empirical studies, described in more detail in the red link.

A summary of recent studies is provided at NoTricksZone: Since 2014, 400 Scientific Papers Affirm A Strong Sun-Climate Link

Ian Wilson has much more to say at his blog: http://astroclimateconnection.blogspot.com.au/

Once again, it appears that the world is more complicated than a simple cause and effect model suggests.

 

Fluctuations in observed global temperatures can be explained by a combination of oceanic and solar cycles.  See engineering analysis from first principles Quantifying Natural Climate Change.

For everything there is a season, a time for every purpose under heaven.

What has been will be again, what has been done will be done again;
there is nothing new under the sun.
(Ecclesiastes 3:1 and 1:9)


Methane Madness Strikes Again

The latest comes from Australia by way of John Ray at his blog Methane cuts on track for 2030 emissions goal.  Excerpts in italics with my bolds and added images.

Australia’s methane emissions have decreased over the past two decades, according to a new report by a leading global carbon research group.

While the world’s methane emissions grew by 20 per cent, meaning two thirds of methane in the atmosphere is from human activity, Australasia and Europe emitted lower levels of the gas.

It puts Australia in relatively good stead, compared to 150 other signatories, to meet its non-binding commitments to the Global Methane Pledge, which aims to cut methane emissions by 30 per cent by the end of the decade.

The findings were revealed in the fourth global methane budget, published by the Global Carbon Project, with contributions from 66 research institutions around the world, including the CSIRO.

According to the report, agriculture contributed 40 per cent of global methane emissions from human activities, followed by the fossil fuel sector (34 per cent), solid waste and wastewater (19 per cent), and biomass and biofuel burning (7 per cent).

Pep Canadell, CSIRO executive director for the Global Carbon Project, said government policies and a smaller national sheep flock were the primary reasons for the lower methane emissions in Australasia.

“We have seen higher growth rates for methane over the past three years, from 2020 to 2022, with a record high in 2021. This increase means methane concentrations in the atmosphere are 2.6 times higher than pre-industrial (1750) levels,” Dr Canadell said.

The primary source of methane emissions in the agriculture sector is from the breakdown of plant matter in the stomachs of sheep and cattle.

It has led to controversial calls from some circles for less red meat consumption, outraging the livestock industry, which has lowered its net greenhouse gas emissions by 78 per cent since 2005 and is funding research into methane reduction.

Last week, the government agency advising Anthony Albanese on climate change suggested Australians could eat less red meat to help reduce emissions. And the government’s official dietary guidelines will be amended to incorporate the impact of certain foods on climate change.

There is ongoing disagreement among scientists and policymakers about whether there should be a distinction between biogenic methane emitted by livestock, which already exists in a balanced cycle in plants and soil and the atmosphere, and methane emitted from sources stored deep underground for millennia.

“The frustration is that methane, despite its source, gets lumped into one bag,” Cattle Australia vice-president Adam Coffey said. “Enteric methane from livestock is categorically different to methane from coal-seam gas or mining-related fossil fuels that has been dug up from where it’s been stored for millennia and is new to the atmosphere.

“Why are we ignoring what modern climate science is telling us, which is these emissions are inherently different?” Mr Coffey said the methane budget report showed the intense focus on the domestic industry’s environmental credentials was overhyped.

“I think it’s based mainly on ideology and activism,” Mr Coffey said.

This concern about methane is nonsense. Water vapour blocks all the frequencies that methane does, so the presence of methane adds nothing.

Technical Background

Methane alarm is one of the moles continually popping up in the media Climate Whack-A-Mole game. An antidote to methane madness is now available to those inquiring minds who want to know reality without the hype.

Methane and Climate is a paper by W. A. van Wijngaarden (Department of Physics and Astronomy, York University, Canada) and W. Happer (Department of Physics, Princeton University, USA) published at CO2 Coalition November 22, 2019. Below is a summary of the more detailed publication. Excerpts in italics with my bolds.

Overview

Atmospheric methane (CH4) contributes to the radiative forcing of Earth’s atmosphere. Radiative forcing is the difference between the net upward thermal radiation from the Earth through a transparent atmosphere and the radiation through an otherwise identical atmosphere with greenhouse gases. Radiative forcing, normally specified in units of W m−2, depends on latitude, longitude and altitude, but it is often quoted for a representative temperate latitude and for the altitude of the tropopause, or for the top of the atmosphere.

For current concentrations of greenhouse gases, the radiative forcing at the tropopause, per added CH4 molecule, is about 30 times larger than the forcing per added carbon-dioxide (CO2) molecule. This is due to the heavy saturation of the absorption band of the abundant greenhouse gas, CO2. But the rate of increase of CO2 molecules, about 2.3 ppm/year (ppm = part per million by mole), is about 300 times larger than the rate of increase of CH4 molecules, which has been around 0.0076 ppm/year since the year 2008.

So the contribution of methane to the annual increase in forcing is one tenth (30/300) that of carbon dioxide. The net forcing increase from CH4 and CO2 increases is about 0.05 W m−2 year−1. Other things being equal, this will cause a temperature increase of about 0.012 C year−1. Proposals to place harsh restrictions on methane emissions because of warming fears are not justified by facts.
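The ratio arithmetic above can be checked directly. A minimal sketch, using only the figures quoted in the text (the 30x per-molecule factor and the two growth rates):

```python
# Back-of-envelope check of the forcing ratios quoted in the text.
# All input figures are taken from the passage; nothing here is measured.

per_molecule_ratio = 30.0  # CH4 forcing per added molecule relative to CO2
rate_co2 = 2.3             # ppm/year increase of CO2
rate_ch4 = 0.0076          # ppm/year increase of CH4

rate_ratio = rate_co2 / rate_ch4                 # ~300: CO2 added ~300x faster
ch4_share = per_molecule_ratio / rate_ratio      # CH4 share of CO2's annual forcing rise

print(f"CO2 is rising {rate_ratio:.0f}x faster than CH4")
print(f"CH4 contributes about {ch4_share:.2f} of CO2's annual forcing increase")

# The text's 0.012 C/year from 0.05 W/m^2/year implies a sensitivity of
# about 0.24 C per W/m^2 under its "other things being equal" assumption.
print(f"Implied sensitivity: {0.012 / 0.05:.2f} C per W/m^2")
```

The share comes out near 0.10, consistent with the “one tenth (30/300)” figure in the passage.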

The paper is focused on the greenhouse effects of atmospheric methane, since there have recently been proposals to put harsh restrictions on any human activities that release methane. The basic radiation-transfer physics outlined in this paper gives no support to the idea that greenhouse gases like methane, CH4, carbon dioxide, CO2 or nitrous oxide, N2O are contributing to a climate crisis. Given the huge benefits of more CO2 to agriculture, to forestry, and to primary photosynthetic productivity in general, more CO2 is almost certainly benefitting the world. And radiative effects of CH4 and N2O, another greenhouse gas produced by human activities, are so small that they are irrelevant to climate.

Transmission of shortwave solar irradiation and long wavelength radiation from the Earth’s surface through atmosphere, as permitted by Rohde [2]. Note absorption wavelengths of CH4 and N2O are already covered by H2O and CO2.

Radiative Properties of Earth Atmosphere

On the left of Fig. 2 we have indicated the three most important atmospheric layers for radiative heat transfer. The lowest atmospheric layer is the troposphere, where parcels of air, warmed by contact with the solar-heated surface, float upward, much like hot-air balloons. As they expand into the surrounding air, the parcels do work at the expense of internal thermal energy. This causes the parcels to cool with increasing altitude, since heat flow in or out of parcels is usually slow compared to the velocities of ascent or descent.

Figure 2: Left. A standard atmospheric temperature profile [9], T = T(z). The surface temperature is T(0) = 288.7 K. Right. Standard concentrations [10], C^{i} = N^{i}/N, for greenhouse molecules versus altitude z. The total number density of atmospheric molecules is N. At sea level the concentrations are 7750 ppm of H2O, 1.8 ppm of CH4 and 0.32 ppm of N2O. The O3 concentration peaks at 7.8 ppm at an altitude of 35 km, and the CO2 concentration was approximated by 400 ppm at all altitudes. The data is based on experimental observations.

If the parcels consisted of dry air, the cooling rate would be 9.8 C km−1, the dry adiabatic lapse rate [12]. But rising air has usually picked up water vapor from the land or ocean. The condensation of water vapor to droplets of liquid or to ice crystallites in clouds releases so much latent heat that the lapse rates are less than 9.8 C km−1 in the lower troposphere. A representative lapse rate for mid latitudes is dT/dz = 6.5 K km−1, as shown in Fig. 2.

The tropospheric lapse rate is familiar to vacationers who leave hot areas near sea level for cool vacation homes at higher altitudes in the mountains. On average, the temperature lapse rates are small enough to keep the troposphere buoyantly stable [13]. Tropospheric air parcels that are displaced in altitude will oscillate up and down around their original position with periods of a few minutes. However, at any given time, large regions of the troposphere (particularly in the tropics) are unstable to moist convection because of exceptionally large temperature lapse rates.
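The lapse rates quoted above translate directly into temperature at altitude under a simple linear model. A minimal sketch, using the 288.7 K surface temperature and 6.5 K/km mid-latitude rate from Fig. 2 (the 2 km example altitude is an arbitrary illustration, not from the text):

```python
# Linear lapse-rate model of tropospheric temperature, T(z) = T0 - L*z,
# using the surface temperature and mid-latitude lapse rate quoted above.

T0 = 288.7        # K, surface temperature from Fig. 2
LAPSE_DRY = 9.8   # K/km, dry adiabatic lapse rate
LAPSE_MID = 6.5   # K/km, representative mid-latitude (moist) lapse rate

def temperature(z_km: float, lapse: float = LAPSE_MID) -> float:
    """Temperature (K) at altitude z_km under a constant lapse rate."""
    return T0 - lapse * z_km

# A vacation home 2 km above sea level is about 13 K cooler than sea level.
print(f"T at 2 km: {temperature(2.0):.1f} K ({temperature(2.0) - T0:+.1f} K)")
```

This linear profile is only valid within the troposphere, where the constant-lapse-rate approximation of Fig. 2 applies.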

The vertical radiation flux Z, which is discussed below, can change rapidly in the troposphere and stratosphere. There can be a further small change of Z in the mesosphere. Changes in Z above the mesopause are small enough to be neglected, so we will often refer to the mesopause as “the top of the atmosphere” (TOA) with respect to radiation transfer. As shown in Fig. 2, the most abundant greenhouse gas at the surface is water vapor, H2O. However, the concentration of water vapor drops by a factor of a thousand or more between the surface and the tropopause. This is because of condensation of water vapor into clouds and eventual removal by precipitation. Carbon dioxide, CO2, the most abundant greenhouse gas after water vapor, is also the most uniformly mixed because of its chemical stability. Methane, the main topic of this discussion, is much less abundant than CO2, and it has somewhat higher concentrations in the troposphere than in the stratosphere, where it is oxidized by OH radicals and ozone, O3. The oxidation of methane [8] is the main source of the stratospheric water vapor shown in Fig. 2.

Future Forcings of CH4 and CO2

Methane levels in Earth’s atmosphere are slowly increasing. If the current rate of increase, about 0.007 ppm/year for the past decade or so, were to continue unchanged it would take about 270 years to double the current concentration of 1.8 ppm. But, as one can see from Fig. 7, methane levels have stopped increasing for years at a time, so it is hard to be confident about future concentrations. Methane concentrations may never double, but if they do, WH [1] show that this would only increase the forcing by 0.8 W m−2. This is a tiny fraction of representative total forcings at midlatitudes of about 140 W m−2 at the tropopause and 120 W m−2 at the top of the atmosphere.
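The ~270-year doubling figure follows from simple division, assuming a constant linear rate of increase; with the rounded rate quoted here the division gives roughly 257 years, in line with the paper's "about 270 years". A quick check using only the numbers in the passage:

```python
# Time to add another 1.8 ppm of CH4 at a constant linear growth rate,
# using the concentration and rate quoted in the text.

c_now = 1.8    # ppm, current CH4 concentration
rate = 0.007   # ppm/year, recent rate of increase

years_to_double = c_now / rate  # linear extrapolation, not compounding
print(f"~{years_to_double:.0f} years to double at the current rate")

# Even after a doubling, the cited forcing increase (0.8 W/m^2) is a small
# fraction of the ~140 W/m^2 total tropopause forcing quoted in the text.
fraction = 0.8 / 140
print(f"A CH4 doubling would add ~{fraction:.1%} to total tropopause forcing")
```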

Figure 9: Projected mid-latitude forcing increments at the tropopause from continued increases of CO2 and CH4 at the rates of Fig. 7 and Fig. 8 for the next 50 years. The projected forcings are very small, especially for methane, compared to the current tropospheric forcing of 137 W m−2.

The per-molecule forcings P^{i} of (13) and (14) have been used with the column density N̂ of (12) and the concentration increase rates dC̄^{i}/dt, noted in Fig. 7 and Fig. 8, to evaluate the future forcing (15), which is plotted in Fig. 9. Even after 50 years, the forcing increments from increased concentrations of methane (∆F = 0.23 W m−2), or the roughly ten times larger forcing from increased carbon dioxide (∆F = 2.2 W m−2), are very small compared to the total forcing, F = 137 W m−2, shown in Fig. 3. The reason that the per-molecule forcing of methane is some 30 times larger than that of carbon dioxide for current concentrations is “saturation” of the absorption bands. The current density of CO2 molecules is some 200 times greater than that of CH4 molecules, so the absorption bands of CO2 are much more saturated than those of CH4. In the dilute “optically thin” limit, WH [1] show that the tropospheric forcing power per molecule is P^{i} = 0.51 × 10−22 W for CH4, and P^{i} = 2.73 × 10−22 W for CO2. Each CO2 molecule in the dilute limit causes about 5 times more forcing increase than an additional molecule of CH4, which is only a “super greenhouse gas” because there is so little of it in the atmosphere, compared to CO2.
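The “roughly ten times larger” CO2 increment is consistent with the other figures in the passage: per added molecule, CH4 forces ~30x more, but CO2 molecules are being added ~300x faster, so the CO2 increment should be about 300/30 = 10x larger. A sketch checking this against the 50-year projections (all numbers from the text):

```python
# Consistency check on the projected 50-year forcing increments from Fig. 9.
# All input values are taken directly from the text.

dF_ch4 = 0.23          # W/m^2, projected CH4 forcing increment over 50 years
dF_co2 = 2.2           # W/m^2, projected CO2 forcing increment over 50 years
total_forcing = 137.0  # W/m^2, current tropospheric forcing (Fig. 3)

ratio = dF_co2 / dF_ch4
print(f"CO2 increment is {ratio:.1f}x the CH4 increment (expected ~300/30 = 10)")

combined = (dF_co2 + dF_ch4) / total_forcing
print(f"Combined 50-year increment is {combined:.1%} of total forcing")
```

The ratio comes out near 9.6, and the combined increment is under 2% of the total forcing, matching the passage's "very small" characterization.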

Methane Summary

Natural gas is 75% methane (CH4), which burns cleanly to carbon dioxide and water. Methane is eagerly sought after as fuel for electric power plants because of its ease of transport and because it produces the least carbon dioxide for the most power. Cars can also be powered by compressed natural gas (CNG) for short distances.

In many countries CNG has been widely distributed as the main home heating fuel. As a consequence, methane has in the past leaked to the atmosphere in large quantities; such leaks are now firmly controlled. Grazing animals also produce methane in their complicated stomachs, and methane escapes from rice paddies and peat bogs such as the Siberian permafrost.

It is thought that methane is a very potent greenhouse gas because it absorbs some infrared wavelengths 7 times more effectively than CO2 molecule for molecule, and even 20 times by weight. As we have seen previously, this also means that within a distance of metres its effect has saturated, and further transmission of heat occurs by convection and conduction rather than by radiation.
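The “saturated within metres” claim is at bottom a statement about optical depth. As an illustration of the mechanism only, Beer-Lambert attenuation with a hypothetical line-centre cross-section shows how quickly a strong absorption line extinguishes radiation; the σ value below is an assumed order of magnitude chosen for illustration, not a figure from the text:

```python
import math

# Beer-Lambert attenuation I/I0 = exp(-sigma * n * L) for a single strong
# absorption line. The cross-section sigma is a HYPOTHETICAL order-of-
# magnitude value used for illustration; it is not taken from the text.

n_air = 2.5e25         # molecules/m^3 of air at sea level (standard value)
c_ch4 = 1.8e-6         # CH4 mole fraction (1.8 ppm, from the text)
n_ch4 = c_ch4 * n_air  # CH4 number density, ~4.5e19 molecules/m^3

sigma = 1e-20          # m^2, assumed line-centre cross-section (illustrative)

def transmitted(path_m: float) -> float:
    """Fraction of line-centre radiation surviving a path of path_m metres."""
    return math.exp(-sigma * n_ch4 * path_m)

for L in (1, 5, 10):
    print(f"{L:>2} m path: {transmitted(L):.3f} transmitted")
```

Under this assumed cross-section, most line-centre radiation is absorbed within ~10 m. Off line centre the cross-section is orders of magnitude smaller, so the band as a whole saturates over much longer paths; the sketch shows the mechanism, not the quoted distances.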

Note that when H2O is present in the lower troposphere, there are few photons left for CH4 to absorb:

Even if the IPCC radiative greenhouse theory were true, methane occurs only in minute quantities in air: 1.8 ppm versus 390 ppm for CO2. By weight, CH4 is only 5.24 Gt versus 3140 Gt of CO2 (on this assumption). If it truly were twenty times more potent, it would amount to an equivalent of 105 Gt CO2, or one thirtieth that of CO2. A doubling in methane would thus have no noticeable effect on world temperature.
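The passage's arithmetic, taken at face value (the masses and the 20x "potency" factor are the quote's own assumptions, not independent data):

```python
# Reproducing the quote's own arithmetic on CH4 vs CO2 by mass.
# The Gt figures and the 20x potency factor are the quote's assumptions.

mass_ch4 = 5.24    # Gt of CH4 in the atmosphere (per the quote)
mass_co2 = 3140.0  # Gt of CO2 in the atmosphere (per the quote)
potency = 20       # assumed warming effect per tonne, relative to CO2

co2_equivalent = mass_ch4 * potency   # ~105 Gt CO2-equivalent
fraction = mass_co2 / co2_equivalent  # CO2 burden is ~30x larger

print(f"CH4 ~= {co2_equivalent:.0f} Gt CO2-equivalent, "
      f"about 1/{fraction:.0f} of the CO2 burden")
```

This reproduces the quoted "105 Gt" and "one thirtieth" figures; the following paragraph then argues the 20x mass-based factor itself is the wrong comparison.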

However, the factor of 20 is entirely misleading, because absorption is proportional to the number of molecules (i.e., to volume), so the factor of 7 (7.3) is correct and 20 is wrong. With this in mind, the perceived threat from methane becomes even smaller.

Further still, methane rose from 1.6 ppm to 1.8 ppm in the 30 years from 1980 to 2010. Assuming that it has not stopped rising, this amounts to a doubling in 2-3 centuries. In other words, methane can never have any measurable effect on temperature, even if the IPCC radiative greenhouse theory were right.

Because only a small fraction in the rise of methane in air can be attributed to farm animals, it is ludicrous to worry about this aspect or to try to farm with smaller emissions of methane, or to tax it or to trade credits.

The fact that methane in air has been leveling off in the past two decades, even though we do not know why, implies that it plays absolutely no role as a greenhouse gas. (From Sea Friends, here.)

More information at The Methane Misconceptions by Dr. Wilson Flood (UK) here.

Climatists Aim Forks at Our Food Supply