Ruinous Folly of CO2 Pricing

Dr. Lars Schernikau is an energy economist explaining why CO2 pricing (also falsely called “carbon pricing”) is a terrible idea fit only for discarding.  His blog article is The Dilemma of Pricing CO2.  Excerpts below in italics with my bolds and added images.

1. Understanding CO₂ pricing
2. Economic and environmental impact
3. Global economic impact
4. Alternative solutions
5. Conclusion
6. Additional comments and notes
7. References

Introduction

As an energy economist I am confronted daily with questions about the “energy transition” away from conventional fuels. As we know, the discussion about the “energy transition” stems from concerns about climatic changes.

The source of climatic changes is a widely discussed issue, with numerous policies and measures proposed to mitigate its impact. One such measure is the current and future pricing of carbon dioxide (CO₂) emissions. The logic followed is that if human CO₂ emissions are reduced, future global temperatures will be measurably lower, extreme weather events will be reduced, and sea-levels will rise less or stop rising altogether.

Although intended to reduce greenhouse gases, this approach has sparked considerable debate. In this blog post I discuss the controversial topic of CO₂ pricing, examining its economic and environmental ramifications.

However, this article is not about the causes of climatic changes, nor is it about the negative or positive effects of a warming planet and higher atmospheric CO₂ concentrations. It is also not about the scientifically undisputed fact that we don’t know how much warming CO₂ causes (a list of recent academic research on CO₂’s climate sensitivity can be found at the end of this blog).

Nor do I unpack the undisputed and IPCC-confirmed fact that each additional ton of CO₂ in the atmosphere has less warming effect than the previous ton, as the climate sensitivity of CO₂ is a logarithmic function, irrespective of us not knowing what that climate sensitivity is. I also don’t discuss the NASA satellite-confirmed greening of the world over the past decades, partially driven by higher atmospheric CO₂ concentrations (see sources including Chen et al. 2024).
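The logarithmic relationship mentioned above can be illustrated with the standard simplified forcing approximation ΔF ≈ 5.35 · ln(C/C₀) W/m². This is a minimal sketch of that textbook formula, not the author's own calculation, and it says nothing about the disputed sensitivity; it only shows the diminishing marginal effect of each additional increment of CO₂:

```python
import math

def co2_forcing(c_ppm, c0_ppm):
    """Simplified radiative forcing change (W/m^2) for a CO2 change
    from c0_ppm to c_ppm, using the common logarithmic
    approximation dF = 5.35 * ln(C/C0)."""
    return 5.35 * math.log(c_ppm / c0_ppm)

# Each additional 10 ppm adds less forcing than the previous 10 ppm:
for start in (280, 400, 600):
    print(f"{start} -> {start + 10} ppm: "
          f"{co2_forcing(start + 10, start):.4f} W/m^2")
```

A doubling from any starting concentration yields the same ΔF (≈ 3.7 W/m² under this approximation), which is exactly the logarithmic behaviour the paragraph describes.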

Instead, this blog post is about the environmental and economic “sense”, or lack thereof, of pricing CO₂ emissions as currently practiced in most OECD countries and increasingly seen in developing nations. It is about the “none-sense” of measuring practically all human activity with a “CO₂ footprint”, often mistakenly called “carbon footprint”, and having nearly every organization set claims for current or future “Net-Zero” (Figure 1).

1. Understanding CO₂ Pricing

CO₂ pricing aims to internalize the external costs of CO₂ emissions, thereby encouraging businesses and individuals to reduce their “carbon footprint”. The concept is straightforward: by assigning a cost to CO₂ emissions, it becomes financially advantageous to emit less CO₂.

However, this simplistic view overlooks
significant complexities and unintended consequences.
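To make the pricing mechanism concrete, here is a toy illustration of how a combustion-stage CO₂ price flows into generation costs. The emission intensities and the 50-per-tonne price are assumed round numbers for illustration only, not figures from the article; note that a combustion-only price assigns zero to solar, which is precisely the distortion discussed later:

```python
def added_cost_per_mwh(emission_t_per_mwh, price_per_t):
    """Extra generation cost (currency/MWh) imposed by a CO2 price
    applied at the point of combustion only."""
    return emission_t_per_mwh * price_per_t

# Illustrative combustion-stage intensities (t CO2/MWh) -- assumed values:
plants = {"coal": 0.95, "gas (CCGT)": 0.40, "solar (combustion stage)": 0.0}
for name, intensity in plants.items():
    print(name, added_cost_per_mwh(intensity, 50.0), "per MWh")
```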

Our entire existence is based on drawing from nature (“renewable” or not), so the “Net-Zero” discussion ignores a fundamental requirement for our survival. I agree that it should be our aim to reduce the environmental footprint as much as possible but only if our lives, health, and wealth don’t deteriorate as a result.

Now, I am sure some readers and many “activists” may disagree, which I respect but, at a global level, find unrealistic. However, I would assume that most agree that no one’s life ought to be harmed or shortened for the sake of reducing environmental impact. Otherwise, there is little room for a conversation.

BloombergNEF’s “New Energy Outlook” from May 2024 should perhaps be called “CO₂ Outlook”, as there is little to be found about energy and its economics; it is rather all about CO₂ emissions and the so-called “Net-Zero” (Figure 1), in line with the media, government, and educational focus primarily on carbon dioxide emissions.

2. Economic and Environmental Impacts

One of the primary criticisms of CO₂ pricing is that it addresses only one environmental externality while ignoring others. This narrow focus can lead to economic distortions, as it fails to account for the full spectrum of environmental and social impacts. For instance, while CO₂ pricing might reduce emissions, it can also drive up energy costs, disproportionately affecting lower-income populations and hindering economic development in less developed countries.

It is by now undisputed amongst energy economists that large-scale “Net-Zero” intermittent and unpredictable wind and solar power generation increases the total or “full” cost of electricity, primarily because of low energy density, intermittency, inherent net energy and raw material inefficiency, mounting integration costs for power grids, and the need to drastically overbuild both the installation system and the backup/storage system.

CO₂ pricing can also result in environmental trade-offs. For example, the shift towards “renewable” energy sources like wind and solar, incentivized by CO₂ pricing, has its own set of environmental impacts, including land use, resource extraction, energy footprint, and energy storage challenges.

When BloombergNEF (Figure 1) displays
how clean power and electrification
will directly reduce CO₂ emissions to zero,
then they are clearly mistaken.

My native country Germany provides a notable example of the complexities involved in transitioning to “renewable” energy. The country has invested heavily in wind and solar power, leading to the highest electricity costs among larger nations. Germany’s installed wind and solar capacity is now twice the total peak power demand. This variable “renewable” wind and solar power capacity now produces about a third of the country’s electricity and contributes about 6% to Germany’s primary energy supply (Figure 2).

Sources: Schernikau based on Fraunhofer, Agora, AG Energiebilanzen. See also www.unpopular-truth.com/graphs.

3. Global Economic Implications

Higher energy costs, obviously and undisputedly, hurt less affluent people and stifle the development of poorer nations (Figure 3). Thus, a move to more expensive wind and solar energy has “human externalities”. The less fortunate will be “starved of” energy they can no longer afford, leading to a literal reduction in life expectancy.

Source: Eschenbach 2017; Figure 38 in Book “The Unpopular Truth… about Electricity and the Future of Energy”

CO₂ pricing typically focuses only on emissions during operation,
neglecting significant environmental and economic costs
incurred during other stages or by the entire system.

For instance, the production of solar panels involves substantial energy and raw material inputs. Today there is not one single solar panel that is produced without coal. Similarly, the manufacturing and transportation processes of wind turbines and electric vehicles are energy-intensive and environmentally impactful. These stages are rarely accounted for in CO₂ pricing schemes, leading to a distorted view of their true environmental footprint. Also not accounted for are:

a) the required overbuild,
b) short and long duration energy storage,
c) backup facilities, or
d) larger network integration and transmission infrastructure.

Source: Schernikau, adapted from Figure 39 in Book “The Unpopular Truth… about Electricity and the Future of Energy“

Figure 4 illustrates how virtually all CO₂ pricing or taxation happens only at the stage of “operation” or combustion. How else could a “Net-Zero” label be assigned to a solar panel produced from coal and minerals extracted in Africa with diesel-run equipment, transported to China on a vessel powered by fuel-oil, and processed with heat and electricity from coal- or gas-fired power partially using forced labour? All this energy-intensive activity, and not a single kilogram of CO₂ is taxed (see my recent article on this subject here). The same applies to wind turbines, hydro power, biofuel, or electric vehicles.

It turns out, CO₂ tax is basically just a means to redistribute wealth, with the collecting agency (government) deciding where the funds go. Yes, a CO₂ tax does incentivize industry to reduce CO₂ emissions at their taxed operations only, but this comes at a cost to economies, the environment, and often people… Any economist will confirm that pricing one externality but not others leads to economic distortions and, many would say, worse environmental impacts.

4. Alternative Approaches

Distortion, in this case, is just another word for unintended consequences for the environment, our economies, and the people. Pricing CO₂ only during combustion while failing to price methane, raw materials and recycling, inefficiency, embodied energy, energy shortages, land requirements, or the greening effect of CO₂… will cause undesirable outcomes. The world will be worse off economically and environmentally.

Protest if you must, but let me offer a simple example. The leaders of the Western world seem to have united around abandoning coal immediately, because it is the highest CO₂ emitter during combustion (UN 2019). Instead, demanding reliable and affordable energy, Bangladesh, Pakistan, Germany, and so many more nations have embraced liquefied natural gas (LNG) as a “bridge” fuel to replace coal. This “switch” is taking place despite questions about LNG’s impact on the environment, including the “climate”. This policy, supported by almost all large consultancies, indirectly caused blackouts affecting over 150 million people in Bangladesh in October 2022 (Reuters and Bloomberg).

So, the world is embarking on an expensive venture
to replace as much coal as possible with
more expensive liquefied natural gas (LNG).

On top of that, wind and solar are given preference. For example, the IEA recently confirmed that 2024 marks the first year in which investments in solar outstrip the combined investments in all other power generation technologies. As a result, energy costs go up, dependencies increase, lights go off, and, as per the UN’s IPCC, the “climate gets worse.”

Now imagine what would happen if we truly took into account all environmental and human impacts, both negative and positive, along the entire value chain of energy production, transportation, processing, generation, consumption, and disposal… we would all be surprised! You would look at fossil fuels, and certainly nuclear, through different eyes. Instead we should simply incentivize resource and energy efficiency, which would truly make a positive difference!

From Schernikau et al. 2022.

5. Conclusion

No matter what your view on climate change is, pricing CO₂ is harmful… why?
Answer: … because pricing one externality but not others leads to economic and environmental distortions…causing human suffering.

That is why, even considering the entire value chain, I do not support any CO₂ pricing. That is why I fight for environmental and economic justice so we can, by avoiding energy starvation and resulting poverty, make a truly positive difference not only for ourselves but also for future generations to come. We need INvestment in, not DIvestment from, 80% of our energy supply to rationalize our energy systems and to allow people and the planet to flourish.

I strongly support increasing adaptation efforts, which have already been successful in drastically reducing the death rate and GDP-adjusted financial damage from natural disasters during the past 100 years (OurWorldInData, Pielke 2022, Economist).

Experimental Proof of Nil Warming from GHGs

Thomas Allmendinger is a Swiss physicist educated at ETH Zurich whose practical experience is in the fields of radiology and elementary particle physics.  His complete biography is here.

His independent research and experimental analyses of greenhouse gas (GHG) theory over the last decade led to several published studies, including the latest summation, The Real Origin of Climate Change and the Feasibilities of Its Mitigation (2023), in the journal Atmospheric and Climate Sciences. The paper is a thorough and detailed discussion, of which I provide here a synopsis of his methods, findings, and conclusions. Excerpts are in italics with my bolds and added images.

Abstract

The present treatise represents a synopsis of six important previous contributions of the author concerning atmospheric physics and climate change. Since this issue is influenced by politics like no other, and since the greenhouse doctrine with CO2 as the culprit in climate change is predominant, the respective theory has to be outlined, revealing its flaws and inconsistencies.

But beyond that, the author’s own contributions are focused on and discussed in depth. The most eminent one concerns the discovery of the absorption of thermal radiation by gases, leading to warming-up, and implying a thermal radiation of gases which depends on their pressure. This delivers the final evidence that trace gases such as CO2 don’t have any influence on the behaviour of the atmosphere, and thus on climate.

But the most useful contribution concerns the method which enables one to determine the solar absorption coefficient βs of coloured opaque plates. It delivers the foundations for modifying materials with respect to their capability of climate mitigation. Thereby, the main influence is due to the colouring, in particular of roofs, which should be painted, preferably light-brown (not white, for aesthetic reasons).

It must be clear that such a drive for brightening up the world would be the only chance of mitigating the climate, whereas the greenhouse doctrine related to CO2 has to be abandoned. However, a global climate model with forecasts cannot be achieved, since this problem is too complex and since several climate zones exist.

Background

The alleged proof for the correctness of this theory was delivered 25 years later by an article in the Scientific American of the year 1982 [4]. Therein, the measurements of C.D. Keeling were reported which had been made at two remote locations, namely at the South Pole and in Hawaii, and according to which a continuous rise of the atmospheric CO2-concentration from 316 to 336 ppm had been detected between the years 1958 and 1978 (cf. Figure 1), suggesting coherence between the CO2 concentration and the average global temperature.

But apart from the fact that these CO2 concentrations are quite minor (400 ppm = 0.04%), and that a constant proportion between the atmospheric CO2 concentration and the average global temperature could not be asserted over a longer period, it should be borne in mind that this conclusion rested on correlation, not causation, since solely a temporal coincidence existed. Rather, other influences which happened simultaneously could have been effective, in particular the increasing urbanisation, influencing the structure and the coloration of large parts of the Earth’s surface.

However, this contingency was, and still is, categorically excluded. Only two possibilities are considered as explanations of the climate change: either the anthropogenic influence due to CO2 production, or a natural one which cannot be influenced. A third influence, the one suggested here, namely that of colours, is a priori excluded, even though nobody denies the influence of colouring on the surface temperature of the Earth and the existence of urban heat islands, and although an increase in winds and storms cannot be explained by the greenhouse theory.

However, already well in advance, institutions were founded which aimed at mitigating climate change through political measures. Thereby, climate change was equated with industrial CO2 production, although physical evidence for such a relation was not given. It was just a matter of belief. In this regard, in 1992 the UNFCCC (United Nations Framework Convention on Climate Change) was founded, supported by the IPCC (Intergovernmental Panel on Climate Change). Subsequently, side by side with the UN, numerous so-called COPs (Conferences of the Parties) were held: the first one in 1995 in Berlin, the most popular one in 1997 in Kyoto, and the most important one in 2015 in Paris, leading to a climate convention which was signed by representatives of 195 nations. Thereby, numerous documents were compiled, altogether more than 40,000. But actually these documents didn’t fulfil the standards of scientific publications, since they were not peer reviewed.

Subsequently, intensive research activities emerged, accompanied by a flood of publications and culminating in several textbooks. Several climate models were presented with different scenarios and diverging long-term forecasts. Thereby, the fact was disregarded that indeed no global climate exists but solely a plurality of climates, or rather of micro-climates and at best of climate zones, and that the Latin word “clima” (as well as the English word “clime”) means “region”. Moreover, an average global temperature is not really defined, and thus not measurable, because the temperature differences are immense, for instance with regard to geographic latitude, altitude, the distinct conditions over sea and over land, and not least between the seasons and between day and night. Moreover, the term “climate” implicates rain and snow as well as winds and storms, which, in the long term, are not foreseeable. In particular, it should be realized that atmospheric processes are energetically determined, to which the temperature contributes only a part.

2. The Historical Inducement for the Greenhouse Theory and Its Flaws

The scientific literature about the greenhouse theory is so extensive that it is difficult to find a clearly outlined and consistent description. Nevertheless, the publications of James E. Hansen [5] and of V. Ramanathan et al. [6] may be considered authoritative. Moreover, the textbooks [7] [8] and [9] are worth mentioning. Therein it is assumed that the Earth’s surface, which is heated up by solar irradiation, emits thermal radiation into the atmosphere, warming it up due to heat absorption by “greenhouse gases” such as CO2 and CH4. Thereby, counter-radiation occurs, which induces a so-called radiative transfer. This aspect gave rise to numerous theories (e.g. [10] [11] [12]). But the co-existence of theories is in contrast to the scientific principle that for each phenomenon solely one explanation or theory is admissible.

Already simple considerations may lead one to question this theory. For instance: given the present CO2 concentration of approx. 400 ppm (parts per million) = 0.04%, one should wonder how the temperature of the atmosphere can depend on such an extremely low gas amount, and why this component can be the predominant or even the sole cause of the atmospheric temperature. This would actually mean that the temperature would be situated near the absolute zero of −273˚C if the air contained no CO2 or other greenhouse gases.

Indeed, no special physical knowledge is needed in order to realize that this theory cannot be correct. However, the fact that it has settled in the public mind, becoming an important political issue, requires a more detailed investigation of the measuring methods and their results which delivered the foundations of this theory, and of why misinterpretations arose. To that end, two points have to be particularly considered: The first concerns the photometrical measurements on gases in the electromagnetic range of thermal radiation, which Tyndall initially carried out in the 1860s [13] and which were later expanded to IR-measurements evaluated by Plass [14]. The second concerns the application of the Stefan/Boltzmann law to the Earth-atmosphere system, first made by Arrhenius in 1896 [2] and more or less adopted by modern atmospheric physics. Both approaches are deficient and would call the greenhouse theory into question without requiring the author’s own approaches.

2.1 The Photometric and IR-Measurement Methods for CO2

By varying the wavelength and measuring the respective absorption, the spectrum of a substance can be evaluated. This IR-spectroscopic method is widely used to characterize organic chemical substances and chemical bonds, usually in solution. But even there this method is not suited for quantitative measurements, i.e. the absorption of the IR-active substance is not proportional to its concentration as the Beer-Lambert law predicts. This is probably even less the case in the gaseous phase and, all the more, at the high pressures which were applied in order to imitate the large distances in the atmosphere in the range of several (up to 10) kilometres. Thereby it is disregarded that the pressure of the atmosphere depends on the altitude above sea level, which prohibits the assumption of a linear progression.
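For reference, the idealized Beer-Lambert law discussed here predicts exponential attenuation, I/I₀ = e^(−ε·c·l). The sketch below simply evaluates that textbook formula with made-up unit coefficients; the text's point is precisely that real gases (and high pressures) deviate from this ideal:

```python
import math

def transmitted_fraction(epsilon, conc, path_len):
    """Idealized Beer-Lambert transmission I/I0 = exp(-epsilon * c * l).
    epsilon: absorption coefficient, conc: concentration,
    path_len: optical path length (consistent units assumed)."""
    return math.exp(-epsilon * conc * path_len)

# In the idealized law, doubling concentration or doubling path length
# has the same effect (hypothetical unit values shown):
print(transmitted_fraction(1.0, 0.5, 2.0))   # exp(-1)
print(transmitted_fraction(1.0, 1.0, 1.0))   # exp(-1)
```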

Moreover, it is disregarded that at IR-spectrographs the effective radiation intensity is not known, and that in the atmosphere a gas mixture exists where the CO2 amounts solely to a little extent, whereas for the spectroscopic measurements pure CO2 was used. Nevertheless, in the text books for atmospheric physics the Beer-Lambert law is frequently mentioned, however without delivering concrete numerical results about the absorbed radiation.

In both cases, solely the absorption degree of the radiation was determined, i.e. the decrease of the radiation intensity due to its passage through a gas, but never its heating-up, that is, its temperature increase. Instead, it was assumed that a gas is necessarily warmed up when it absorbs thermal radiation. According to this assumption, pure air, or rather a 4:1 mixture of nitrogen and oxygen, is expected not to be warmed up when it is thermally irradiated, since it is IR-spectroscopically inactive, in contrast to pure CO2.

However, no physical formula exists which would allow one to calculate such an effect, and no respective empirical evidence has been given so far. Rather, the measurements recently performed by the author delivered converse, surprising results.

2.2. The Impact of Solar Radiation on the Earth’s Surface and Its Reflection

Besides, a further error is implicated in the usual greenhouse theory. It results from the fact that the atmosphere is only partly warmed up by direct solar radiation. In addition, it is warmed up indirectly, namely via the Earth’s surface, which is warmed up by solar irradiation and which transmits the absorbed heat to the atmosphere either by thermal conduction or by thermal radiation. Moreover, air convection contributes a considerable part. This process is called Anthropogenic Heat Flux (AHF). It has recently been discussed by Lindgren [16]. However, herewith a more fundamental view is outlined.

The thermal radiation corresponds to the radiative emission of a so-called “black body”. Such a body is defined as one which entirely absorbs electromagnetic radiation in the range from IR to UV light. Likewise, it emits electromagnetic radiation all the more as its temperature grows. Its radiative behaviour is formulated by the law of Stefan and Boltzmann… According to this law, the radiation wattage Φ of a black body is proportional to the fourth power of its absolute temperature. Usually, this wattage is related to the area, exhibiting the dimension W/m2.

This formula does not allow making a statement about the wavelength or the frequency of the emitted light. This is only possible by means of Max Planck’s formula, which was published in 1900. According to it, the higher the temperature, the higher the frequencies of the emitted light tend to be. At low temperatures, only heat is emitted, i.e. IR-radiation. At higher temperatures the body begins to glow: first in red, and later in white, a mixture of different colours. Finally, UV-radiation emerges. The emission spectrum of the sun is in quite good accordance with Planck’s emission spectrum for approx. 6000 K.
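The two laws referenced here are easy to evaluate numerically: Stefan-Boltzmann gives the radiated power per unit area σT⁴, and Wien's displacement law (which follows from Planck's formula) gives the peak emission wavelength λ_max = b/T. A minimal check, consistent with the ~6000 K solar spectrum mentioned above (the 288 K value is the commonly quoted mean surface temperature, added here for contrast):

```python
SIGMA = 5.670374419e-8   # Stefan-Boltzmann constant, W m^-2 K^-4
WIEN_B = 2.897771955e-3  # Wien displacement constant, m K

def radiant_exitance(t_kelvin):
    """Black-body radiated power per unit area (W/m^2), Phi = sigma * T^4."""
    return SIGMA * t_kelvin ** 4

def peak_wavelength_um(t_kelvin):
    """Wavelength of peak black-body emission, in micrometres (Wien)."""
    return WIEN_B / t_kelvin * 1e6

for t in (6000.0, 288.0):   # solar photosphere vs. typical surface temperature
    print(f"T={t:.0f} K: {radiant_exitance(t):.3e} W/m^2, "
          f"peak at {peak_wavelength_um(t):.2f} um")
```

The 6000 K body peaks near 0.48 µm (visible light), the 288 K body near 10 µm (long-wave IR), which is the short-wave/long-wave distinction used throughout the rest of the article.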

Black CO₂ absorption lines are not to scale.

This model can be applied to the Earth’s surface, considering it as a coloured opaque body: On one side, with respect to its thermal emission, it behaves like a black body fulfilling the Stefan/Boltzmann law. On the other side, it absorbs only a part βs of the incident solar light, converting it into heat, whereas the complementary part is reflected. However, the intensity of the incident solar light on the Earth’s surface, Φsurface, is not identical to its extra-terrestrial intensity beyond the atmosphere, but depends on the sea level, since the atmosphere absorbs a part of the sunlight. Remarkably, the atmosphere behaves like a black body too, but solely with respect to emission: On one side, it radiates inwards to the Earth’s surface, and on the other side, it radiates outwards in the direction of the rest of the atmosphere.

However, this method implies three considerable snags:
•  Firstly, TEarth means the constant limiting temperature of the Earth’s surface, which would be attained if the sun had constantly shone onto the same parcel with the same intensity. But this is never the case, except for thin plates which are thermally insulated at the bottom and at the sides, since the position of the sun changes permanently.

•  Secondly, this formula does not allow making a statement about the rate of the warming-up process, which also depends on the heat capacity of the involved plate. This is solely possible using the author’s approach (see Chapter 3). Nevertheless, it is often attempted (e.g. in [35]), not least within radiative transfer approaches.

•  Thirdly, it is in principle impossible to determine the absolute values of the solar reflection coefficient αs with an albedometer or a similar apparatus, because the intensity of the incident solar light is independent of the distance to the surface, whereas the intensity of the reflected light depends on it. Thus, the values obtained depend on the distance from the Earth’s surface at which the apparatus is positioned. So they are not unambiguous but only relative.

In the modern approach of Hansen et al. [5], the Earth is apprehended as a coherent black body, disregarding its segmentation into a solid and a gaseous part, and thus disregarding the contact area between the Earth’s surface and the atmosphere where the reflection of the sunlight takes place. As a consequence, in Equation (4b) the expression with Tair disappears, whereas a total Earth temperature appears which is not definable and not determinable. This approach has been widely adopted in the textbooks, even though it is wrong (see also [15]).

Altogether, the fact was neglected that the proportionality of the radiation intensity to the fourth power of the absolute temperature is solely valid if a constant equilibrium is attained. In contrast, the subsequently described method enables the direct detection of the colour-dependent solar absorption coefficient βs = 1 − αs using well-defined plates. Furthermore, the time/temperature courses are mathematically modelled up to the limiting temperatures. Finally, relative field measurements are possible based on these results.

3. The Measurement of Solar Absorption-Coefficients with Coloured Plates

Within the lab-like method described here and published in [20], not the reflected but the absorbed solar radiation was determined, namely by measuring the temperature courses of coloured square plates (10 × 10 × 2 cm) when sunlight of known intensity fell vertically onto them. The temperatures of the plates were determined with mercury thermometers, while the intensity of the sunlight was measured with an electronic “Solarmeter” (KIMO SL 100). The plates were embedded in Styrofoam and covered with a thin transparent foil acting as an outer window in order to minimize erratic cooling by atmospheric turbulence (Figure 5). Their heat capacities were taken from literature values. The colours as well as the plate material were varied. Aluminium was used as a reference material, being favourable due to its high heat capacity, which entails a low heating rate and a homogeneous heat distribution. For comparison, additional measurements were made with wooden plates, bricks, and natural stones. To enable a permanently optimal orientation towards the sun, six plate modules were positioned on an adjustable panel (Figure 6).

The evaluation of the curves in Figure 7 yielded the colour-specific solar absorption coefficients βs rendered in Figure 9. They were independent of the plate material. Remarkably, the value for green was relatively high.
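The principle behind such an evaluation can be sketched from a simple energy balance: at the very start of irradiation, before losses matter, the absorbed power equals the heat taken up, βs·Φ·A ≈ m·c·(dT/dt)|₀, so βs follows from the initial slope of the warming curve. The slope and material values below are hypothetical, chosen only to match the plate geometry and the 1040 W/m² irradiance quoted in the text (this is an illustrative reconstruction, not the author's published evaluation procedure):

```python
def solar_absorption_coefficient(slope_k_per_s, mass_kg, c_j_per_kg_k,
                                 irradiance_w_m2, area_m2):
    """Estimate beta_s from the initial warming slope of an insulated
    plate, using beta_s * Phi * A = m * c * dT/dt (losses neglected)."""
    return mass_kg * c_j_per_kg_k * slope_k_per_s / (irradiance_w_m2 * area_m2)

# Hypothetical example: 10 x 10 x 2 cm aluminium plate
# (density ~2700 kg/m^3 -> ~0.54 kg, c ~900 J/(kg K)) at 1040 W/m^2,
# with an assumed initial slope of 0.01 K/s:
mass = 2700 * 0.10 * 0.10 * 0.02          # kg
beta = solar_absorption_coefficient(0.01, mass, 900.0, 1040.0, 0.01)
print(f"beta_s ~ {beta:.2f}")
```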

Figure 7. Warming-up of aluminium plates at 1040 W/m2 [20].

If the sunlight irradiation, and thus the warming-up process, were continued, constant limiting temperatures would finally be attained. However, when 20 mm thick aluminium plates are used, the time needed for this would be too long, exceeding the constantly available sunshine period during a day. Instead, separate cooling-down experiments were made, allowing a mathematical modelling of the whole process, including the determination of the limiting temperatures.

Figure 10. Cooling-down of different materials (in brackets: ambient temperature) [20]. al = aluminium 20 mm; st = stone 20.5 mm; br = brick 14.5 mm; wo = wood 17.5 mm.

These limiting temperature values are in good accordance with the empirical values reported in [24] and with the Stefan/Boltzmann values. As is obvious from the respective diagrams in Figure 11 and Figure 12, the limiting temperatures are independent of the plate materials, whereas the heating rates strongly depend on them.  In principle, it is also possible to model combined heating-up and cooling-down processes [20]. However, this presumes constant environmental conditions, which normally do not exist.
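Cooling-down modelling of this kind is, in essence, Newtonian relaxation toward the ambient temperature, T(t) = T_amb + (T₀ − T_amb)·e^(−t/τ), where the time constant τ depends on the heat capacity and the losses. The parameters below are assumed round numbers, not the author's fitted values:

```python
import math

def newton_cooling(t_s, t0_c, t_amb_c, tau_s):
    """Temperature (deg C) after t_s seconds for a body cooling from
    t0_c toward ambient t_amb_c with time constant tau_s (Newton's law)."""
    return t_amb_c + (t0_c - t_amb_c) * math.exp(-t_s / tau_s)

# Assumed values: plate at 60 C cooling toward 20 C ambient, tau = 15 min.
for minutes in (0, 15, 30, 60):
    print(f"t={minutes:3d} min: {newton_cooling(minutes * 60, 60, 20, 900):.1f} C")
```

Fitting τ from such a measured cooling curve, and reusing it in the heating balance, is one standard way to extrapolate a truncated warming curve to its limiting temperature.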

4. Thermal Gas Absorption Measurements

If the warming-up behaviour of gases is to be determined by temperature measurements, interference by the walls of the gas vessel must be considered, since they exhibit a significantly higher heat capacity than the gas does, which implicates a slower warming-up rate. Since solid materials absorb thermal radiation more strongly than gases do, the risk exists that the walls of the vessel are directly warmed up by the radiation and that they subsequently transfer the heat to the gas. And finally, even the thin glass walls of the thermometers may disturb the measurements by absorbing thermal radiation.

For these reasons, square tubes with a relatively large profile (20 cm) were used, which consisted of 3 cm thick Styrofoam plates and which were covered at the ends with thin plastic foils. In order to measure the temperature course along the tube, mercury thermometers were mounted at three positions (bottom, middle, and top), with their tips covered with aluminium foils. The test gases were supplied from steel cylinders equipped with reducing valves. They were introduced through a connector over approx. one hour, because the tube was not gastight and not robust enough for evacuation. The filling process was monitored by means of a hygrometer, since the air which had to be replaced was slightly humid. Afterwards, the tube was optimized by attaching adhesive foils and thin aluminium foils (see Figure 13). The equipment and the results are reported in [21].

Figure 13. Solar-tube, adjustable to the sun [21].

The initial measurements were made outdoors with twin tubes in the presence of solar light. One tube was filled with air, and the other with carbon dioxide. Thereby, the temperature increased within a few minutes by approx. ten degrees until constant limiting temperatures were attained, namely simultaneously at all positions. Surprisingly, this was the case in both tubes, thus also in the tube filled with ambient air. This result alone delivered the proof that the greenhouse theory cannot be true. Moreover, it gave rise to investigating the phenomenon more thoroughly by means of artificial, better-defined light.

Figure 14. Heat-radiation tube with IR-spot [21].

Accordingly, the subsequent experiments used IR-spots with wattages of 50 W, 100 W and 150 W, of the kind normally employed for terraria (Figure 14). The 150 W IR-spot in particular led to a considerably higher temperature increase of the enclosed gas than sunlight did, since its proportion of thermal radiation was higher. This allowed variable factors, such as the nature of the gas, to be evaluated.

The results with IR-spots on different gases (air, carbon dioxide, and the noble gases argon, neon and helium) yielded essential insights. In each case, the irradiated gas warmed up until a stable limiting temperature was attained. Analogously to the case of irradiated coloured solid plates, the temperature increased until the equilibrium state was reached where the heat absorption rate equalled the heat emission rate.

Figure 15. Time/temperature-curves for different gases [21] (150 W-spot, medium thermometer-position).

As evident from the diagram in Figure 15, the initial observation made with sunlight was confirmed: pure carbon dioxide warmed up to almost the same degree as air did (ambient air scarcely differed from a 4:1 mixture of nitrogen and oxygen). Moreover, noble gases absorb thermal radiation, too. As subsequently outlined, a theoretical explanation for this could be found.

Interpretation of the Results

Comparison of the results obtained with the IR-spots against those obtained with solar radiation corroborated the conclusion that comparatively short-wave IR radiation was involved (namely between 0.9 and 1.9 μm). However, subsequent measurements with a hotplate (<90˚C) placed at the bottom of the heat-radiation tube ([15], Figure 16) showed that long-wave thermal radiation (as expected from bodies at lower temperatures, such as the Earth surface) also induces a temperature increase in air and in carbon dioxide, cf. Figure 17.

Thus, the absorption effect in gases discovered here proceeds over a relatively wide wavelength range, in contrast to IR-spectroscopic measurements, where only narrow absorption bands appear. This effect is not exceptional: it occurs in all gases, including noble gases, and leads to a significant temperature increase even though it is spectroscopically not detectable. This temperature increase is superimposed on any temperature increase due to specific IR absorption, since the intensity share of the latter is very small.

This may be explained as follows: in each case, an oscillation of particles induced by thermal radiation plays a part. But whereas in specific IR absorption the nuclei inside the molecules oscillate along the chemical bond (which must be polar), in the case at hand the electron shell inside the atoms, or rather the electron orbit, oscillates and thereby carries oscillation energy. Evidently, this oscillation energy can be converted into kinetic translational energy of the entire atoms, which correlates with the gas temperature, and vice versa.

5. The Altitude-Paradox of the Atmospheric Temperature

The statement that it is colder in the mountains than in the lowlands is trivial. Not trivial is the attempt to explain this phenomenon, since the reason is not readily evident. The usual explanation is that rising air cools as it expands due to the decreasing air pressure. However, this cannot hold for plateaus far away from hillsides that engender ascending air streams. The phenomenon appears virtually paradoxical in view of the fact that the intensity of solar irradiation is much greater in the mountains than in the lowlands, particularly its UV component. The intensity decrease is due to the scattering and absorption of sunlight within the atmosphere, not only in the IR range but across the whole remaining spectrum. If such scattering, known as Rayleigh scattering, did not occur, the sky would not be blue but black.

However, the direct absorption of sunlight is not the only factor determining the temperature of the atmosphere. Its warming via the Earth surface, which is itself warmed by absorbed solar irradiation, is even more important. This heat transfer occurs partly by heat conduction and air convection, and partly by thermal radiation. But an additional factor has to be considered: the thermal radiation of the atmosphere itself. It runs on the one hand towards the Earth (as counter-radiation), and on the other hand towards space. The situation thus becomes quite complicated, all the more so because a formal treatment based on the Stefan-Boltzmann relation would require equilibrated limiting temperature conditions. In particular, that relation does not reveal any influence of the atmospheric pressure, which evidently plays a considerable part.

In order to study the dependency on the atmospheric pressure, it would be desirable to vary solely the pressure while keeping the other terms constant; however, varying the altitude of the measuring station above sea level also varies the intensity of the sunlight and the ambient atmospheric temperature. The measurements reported here were made at two locations in Switzerland: at Glattbrugg (close to Zürich), 430 m above sea level, and at the top of the Furka pass, 2430 m above sea level. Using the barometric height formula, the respective atmospheric pressures were approximately 0.948 and 0.748 bar. At each location, two measurements were made in the same span of time.
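The two pressure figures can be reproduced with the isothermal barometric height formula. This is a sketch only: the excerpt does not state which variant of the formula or which sea-level pressure and temperature were used, so the values assumed below (1.0 bar at sea level, 288 K) are my assumptions; they happen to land close to the quoted 0.948 and 0.748 bar.

```python
import math

def pressure_bar(h_m, p0_bar=1.0, T_K=288.0):
    """Isothermal barometric height formula: p = p0 * exp(-M*g*h / (R*T))."""
    M = 0.02896  # molar mass of dry air, kg/mol
    g = 9.81     # gravitational acceleration, m/s^2
    R = 8.314    # universal gas constant, J/(mol*K)
    return p0_bar * math.exp(-M * g * h_m / (R * T_K))

p_glattbrugg = pressure_bar(430)   # ~0.95 bar, close to the quoted 0.948 bar
p_furka = pressure_bar(2430)       # ~0.75 bar, close to the quoted 0.748 bar
```

Slightly different choices of sea-level pressure or mean temperature shift the results by a few thousandths of a bar, which is within the precision of the quoted figures.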

Figure 18. Comparison of the temperature courses during two measurements [24] (continuous lines: Glattbrugg; dotted lines: Furka).

Figure 18 renders the data of one measurement pair. Evidently, the limiting temperatures were not ideally attained within 90 minutes. Moreover, the evaluation of the data did not provide strictly invariant values for A. But this is reasonable given that the sunlight intensity was not entirely constant during that period, and that its spectrum depends on the altitude above sea level. Nevertheless, an approximate value of 22 W·m−2·bar−1·K−0.5 could be found for the atmospheric emission constant A.

These findings indeed confirm that a greenhouse effect of sorts occurs, since the atmosphere thermally radiates back to the Earth surface. But this radiation has nothing to do with trace gases such as CO2. It depends rather on the atmospheric pressure, which diminishes at higher altitudes.
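The claimed pressure dependence can be illustrated numerically. The functional form I = A · p · √T used below is my inference from the units of the constant A (W·m−2·bar−1·K−0.5) and from the dependencies on pressure and the square root of absolute temperature named in the conclusions; the excerpt itself does not spell out the formula, and the 288 K temperature is purely illustrative.

```python
import math

A = 22.0  # atmospheric emission constant, W/(m^2 * bar * K^0.5), from the text

def back_radiation(p_bar, T_K):
    """Atmospheric thermal radiation intensity under the assumed form I = A * p * sqrt(T)."""
    return A * p_bar * math.sqrt(T_K)

# Illustrative values at the two Swiss stations (pressures from the text, T assumed)
I_low = back_radiation(0.948, 288.0)   # Glattbrugg, ~354 W/m^2
I_high = back_radiation(0.748, 288.0)  # Furka pass, ~279 W/m^2
```

On this assumed form, the lower pressure at altitude alone reduces the back-radiation by roughly a fifth, which is the mechanism the author offers for the altitude paradox.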

If the oxygen content of the air were considerably reduced, a general reduction of the atmospheric pressure and, as a consequence, of the temperature would follow. This may be an explanation for the occurrence of glacial periods. However, other explanations are possible, in particular a temporary decrease in solar activity.

Overall it can be stated that climate change cannot be explained by ominous greenhouse gases such as CO2, but mainly by artificial alterations of the Earth surface, particularly in urban areas through darkening and through enlargement of the surface (so-called roughness). These urban alterations are due not least to the enormous global population growth, but also to the character of modern buildings, which tend to grow ever higher and employ alternative materials such as concrete and glass. Consequently, remedial measures should focus on these alterations, first drawing on the previous work and then applying the method presented here.

7. Conclusions

The work of the author summarized here concerns atmospheric physics with respect to climate change, comprising three specific and interrelated points based on several previous publications: the first consists of a critical discussion and refutation of the customary greenhouse theory; the second outlines the method for measuring the thermal-radiative behaviour of gases; and the third describes a lab-like method for characterizing the solar-reflective behaviour of solid opaque bodies, in particular for determining the colour-specific solar absorption coefficients.

As to the first point, three main flaws were revealed:

•  Firstly, the insufficiency of photometric methods for determining the heating-up of gases in the presence of thermal radiation;
•  Secondly, the lack of a causal relationship between the CO2 concentration in the atmosphere and the average global temperature: the empirical simultaneous increase of both proves at most a correlation, not a causal relationship; and
•  Thirdly, the inadmissible application of the Stefan-Boltzmann law to the entire Earth (including the atmosphere) versus space, instead of applying it to the boundary between the Earth surface and the atmosphere.

As to the second point, the discovery has to be taken into account that every gas warms up when it is thermally irradiated, even noble gases, attaining a limiting temperature at which the absorbed radiation is in equilibrium with the emitted radiation. In particular, pure CO2 behaves similarly to pure air. Applying kinetic gas theory, a dependency of the emission intensity on the pressure, on the square root of the absolute temperature, and on the particle size could be found and theoretically explained by oscillation of the electron shell.

As to the third point, not only was a lab-like measuring method for the colour-dependent solar absorption coefficient βs developed, but also a mathematical model of the time/temperature course when coloured opaque plates are irradiated by sunlight. The (colour-dependent) warming-up and the (colour-independent) cooling-down are detected separately. Likewise, a limiting temperature occurs at which the intensity of the absorbed solar light is equal to the intensity of the emitted thermal radiation. In the absence of wind convection, the so-called heat transfer coefficient B is invariant. Its value was empirically evaluated, amounting to approximately 9 W·m−2·K−1.
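The limiting-temperature balance for the irradiated plates can be sketched as a linear heat balance, βs · I_solar = B · (T_lim − T_ambient). This linear form, and the example numbers (βs = 0.5, 700 W/m² of sunlight, 20 °C ambient), are my assumptions for illustration; only the value B ≈ 9 W·m−2·K−1 comes from the text.

```python
B = 9.0  # heat transfer coefficient, W/(m^2 * K), from the text (no wind convection)

def limiting_temperature(beta_s, I_solar, T_ambient):
    """Plate temperature at which absorbed solar intensity equals emitted heat,
    assuming the linear balance beta_s * I_solar = B * (T_lim - T_ambient)."""
    return T_ambient + beta_s * I_solar / B

# Illustrative: a mid-grey plate (beta_s = 0.5) in 700 W/m^2 sunlight at 20 C ambient
T_lim = limiting_temperature(0.5, 700.0, 20.0)  # ~58.9 C
```

The colour dependence enters only through βs: a darker plate (larger βs) reaches a proportionally higher limiting temperature, which is the basis of the albedo-brightening recommendation that follows.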

Finally, the theoretically suggested dependency of the atmospheric thermal radiation intensity on the atmospheric pressure could be empirically verified by measurements at different altitudes, namely in Glattbrugg (430 m above sea level) and at the top of the Furka pass (2430 m above sea level), both in Switzerland, delivering a so-called atmospheric emission constant A ≈ 22 W·m−2·bar−1·K−0.5. This explained the altitude paradox of the atmospheric temperature and delivered the definitive evidence that the atmospheric behaviour, and thus the climate, does not depend on trace gases such as CO2. The atmosphere does indeed thermally reradiate, leading to something similar to a greenhouse effect, but this effect is due solely to the atmospheric pressure.

Therefore, and also considering the results of Seim and Olsen [23], the customary greenhouse doctrine casting CO2 as the culprit in climate change has to be abandoned and replaced by the concept recommended here of improving the albedo by brightening parts of the Earth surface, particularly in cities, lest fatal consequences be hazarded.

Figure 24. Up-winds induced by an urban heat island.

Polar Bears, Dead Coral and Other Climate Fictions

Bjorn Lomborg calls out climate alarmist nonsense in his WSJ article Polar Bears, Dead Coral and Other Climate Fictions.  Excerpts in italics with my bolds and added images.

Activists’ tales of doom never pan out,
but they leave us poorly informed and feed bad policy.

Whatever happened to polar bears? They used to be all climate campaigners could talk about, but now they’re essentially absent from headlines. Over the past 20 years, climate activists have elevated various stories of climate catastrophe, then quietly dropped them without apology when the opposing evidence becomes overwhelming. The only constant is the scare tactics.

Protesters used to dress up as polar bears. Al Gore’s 2006 film, “An Inconvenient Truth,” depicted a sad cartoon polar bear floating away to its death. The Washington Post warned in 2004 that the species could face extinction, and the World Wildlife Fund’s chief scientist claimed some polar bear populations would be unable to reproduce by 2012.

Then in the 2010s, campaigners stopped talking about them.

After years of misrepresentation, it finally became impossible to ignore the mountain of evidence showing that the global polar-bear population has increased substantially. Whatever negative effect climate change had was swamped by the reduction in hunting of polar bears. The population has risen from around 12,000 in the 1960s to about 26,000.

The same thing has happened with activists’ outcry about Australia’s Great Barrier Reef. For years, they shouted that the reef was being killed off by rising sea temperatures. After a hurricane extensively damaged the reef in 2009, official Australian estimates of the percent of reef covered in coral reached a record low in 2012. The media overflowed with stories about the great reef catastrophe, and scientists predicted the coral cover would be reduced by another half by 2022. The Guardian even published an obituary in 2014.

The percentage of coral cover in the northern and central Great Barrier Reef has increased.(Supplied: Australian Institute of Marine Science)

The latest official statistics show a completely different picture. For the past three years the Great Barrier Reef has had more coral cover than at any point since records began in 1986, with 2024 setting a new record. This good news gets a fraction of the coverage that the panicked predictions did.

More recently, green campaigners were warning that small Pacific islands would drown as sea levels rose. In 2019 United Nations Secretary-General António Guterres flew all the way to Tuvalu, in the South Pacific, for a Time magazine cover shot. Wearing a suit, he stood up to his thighs in the water behind the headline “Our Sinking Planet.” The accompanying article warned the island—and others like it—would be struck “off the map entirely” by rising sea levels.

Hundreds of Pacific Islands are growing, not shrinking. No habitable island got smaller.

About a month ago, the New York Times finally shared what it called “surprising” climate news: almost all atoll islands are stable or increasing in size. In fact, scientific literature has documented this for more than a decade. While rising sea levels do erode land, additional sand from old coral is washed up on low-lying shores. Extensive studies have long shown this accretion is stronger than climate-caused erosion, meaning the land area of Tuvalu and many other small islands is increasing.

Today, killer heat waves are the new climate horror story. In July President Biden claimed “extreme heat is the No. 1 weather-related killer in the United States.”

He is wrong by a factor of 25. While extreme heat kills nearly 6,000 Americans each year, cold kills 152,000, of which 12,000 die from extreme cold. Even including deaths from moderate heat, the toll comes to less than 10,000. Despite rising temperatures, age-standardized extreme-heat deaths have actually declined in the U.S. by almost 10% a decade and globally by even more, largely because the world is growing more prosperous. That allows more people to afford air-conditioners and other technology that protects them from the heat.
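The "factor of 25" follows directly from the article's own figures, which can be checked in one line:

```python
heat_deaths = 6_000    # annual U.S. deaths from extreme heat, per the article
cold_deaths = 152_000  # annual U.S. deaths from cold, per the article

factor = cold_deaths / heat_deaths  # ~25.3, i.e. "wrong by a factor of 25"
```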

My Mind is Made Up, Don’t Confuse Me with the Facts. H/T Bjorn Lomborg, WUWT

The petrified tone of heat-wave coverage twists policy illogically. Whether from heat or cold, the most sensible way to save people from temperature-related deaths would be to ensure access to cheap, reliable electricity. That way, it wouldn’t be only the rich who could afford to keep safe from blistering or frigid weather. Unfortunately, …

Activists do the world a massive disservice by refusing to acknowledge
facts that challenge their intensely doom-ridden worldview.

There is ample evidence that man-made emissions cause changes in climate, and climate economics generally finds that the costs of these effects outweigh the benefits. But the net result is nowhere near catastrophic. The costs of all the extreme policies campaigners push for are much worse. All told, politicians across the world are now spending more than $2 trillion annually—far more than the estimated cost from climate change that these policies prevent each year.

Yes, those are Trillions of US$ they are projecting to spend.

Scare tactics leave everyone—especially young people—distressed and despondent. Fear leads to poor policy choices that further frustrate the public. And the ever-changing narrative of disasters erodes public trust.

Telling half-truths while piously pretending to “follow the science” benefits activists with their fundraising, generates clicks for media outlets, and helps climate-concerned politicians rally their bases. But it leaves all of us poorly informed and worse off.

Mr. Lomborg is president of the Copenhagen Consensus, a visiting fellow at Stanford University’s Hoover Institution and author of “Best Things First: The 12 Most Efficient Solutions for the World’s Poorest and our Global SDG Promises.”

See Also:

You Won’t Survive “Sustainability” Agenda 2024

Propaganda Machine is Failing, Beware the Jackboots

Mattias Desmet previously alerted us to the appearance of mass formation supporting totalitarian Covid controls.  Now he writes a warning concerning popular rejection of the elite propaganda machine. His Brownstone article is The Propaganda Model Has Limits.  Excerpts in italics with my bolds and added images.  H/T Tyler Durden.

Normally, I let my pen rest during the summer months, but for some things, you set aside your habits. What has been happening in the context of the US presidential elections over the past few weeks is, to say the least, remarkable. We are witnessing a social system that – to use a term from complex dynamic systems theory – is heading toward a catastrophe.

And the essence of the tipping point we are approaching is this:
the propaganda model is beginning to fail.

It started a few weeks ago like this: Trump, the presidential candidate who must not win, is up against Biden, the presidential candidate who must win. After the first debate, it was immediately clear: Trump will win against Biden. The big problem: Biden and Jill are about the only ones who don’t realize this.

The media then turned against Biden. That, in itself, is a revolution. They had praised President Biden to the skies for four years, turning a blind eye to the fact that the man either seemed hardly aware of what he was saying or was giving speeches that could only be described as having the characteristics of a fascist’s discourse.

In any case, the media’s love for Biden was suddenly over when it became clear that he could not possibly win the election, not even with a little help from the media. If you want to know how that ‘little help’ worked in 2020, look at one of the most important interviews of the past year, where Mike Benz – former director of the cyber portfolio of the US government – explains to Tucker Carlson in detail how information flows on the internet were manipulated during the 2020 elections (and the Covid crisis).

After the first debate, the media realized that even they could not help Biden win the election. They changed their approach. Biden was quickly stripped of his saintly status. The Veil of Appearances was pulled away, and he suddenly stood naked and vulnerable in the eye of the mainstream – a man in the autumn of his life, mentally confused, addicted to power, and arrogant. Some journalists even started attributing traits of the Great Narcissistic Monster Trump to him.

Then things took another turn, a turn so predictable that one is astonished that it actually happened. An overaged teenager calmly climbed onto a roof with a sniper rifle, under the watchful eyes of the security services, and nearly shot Trump in the head. The security services, which initially did not respond for minutes when people tried to draw attention to the overaged teenager with an assault rifle, suddenly reacted decisively: they shot the overaged teenager dead seconds after the assassination attempt.

What happened there? There are many reasons to have reservations about Trump, but one thing we cannot help but say: if Trump becomes president, the war in Ukraine will be over. Anyone who does not attribute any weight to that should subject themselves to a conscience examination. And no, Trump will not have to give half of Europe to Putin for that. My cautious estimate, for what it’s worth: It will suffice for NATO to stop and partially reverse its eastward expansion, for Russia to retain access to the Black Sea via Crimea (something everyone with historical awareness knows that denying would mean the death blow to Russia as a great power and thus a direct declaration of war), and for the population of the Russian-speaking part of Ukraine to choose in a referendum whether to belong to Russia or Ukraine.

So, I repeat my point: with Trump, the provocation of Russia stops, and the war in Ukraine ends. Presidents who threaten to end wars are sometimes shot at by lone gunmen. And those lone gunmen are, in turn, shot dead. And the archives about that remarkable act of lone gunmen sometimes remain sealed for a remarkably long time, much longer than they usually do.

The news reaction was predictable. We are used to the media. Some journalists even suggested that Trump had been shot with a paintball, others thought the most accurate way to report was that someone ‘wounded Trump on the ear.’

In any case, after the assassination attempt, the situation became even more dire for the mainstream: the presidential candidate who must not win is now even more popular, and his victory in a race with Biden is almost inevitable.

Then the next chapter begins. Biden suddenly changes his mind: he has come to his senses and drops out of the race. He announces this – of all things – in a letter with a signature that, even for his shaky condition, looked quite clumsy. Then he stayed out of the public eye for a few days. We are curious about what exactly happened there.

But the media are compliant again. Biden has now been sanctified again. Just like Kamala Harris, of course. They are already mentioning polls showing she will beat Trump. With a little help from the media, of course. Curious how this will continue, but I would be surprised if the rest of the campaign will be a walk in the park. Trump is not safe after the first attempt, that’s for sure. And to Kamala Harris, I say this: when totalitarian systems go into a chaotic phase, they become monsters that devour their own children.

Case in Point 1: Orbán: The West Sees Immigration As A “Way Of Getting Rid Of The Ethnic Homogeneity That Is The Basis Of The Nation-State”

 

From Zerohedge:  In Hungarian Prime Minister Viktor Orbán’s speech at Tusványos in Romania, he focuses on the intractable differences developing between the East and West of Europe, with immigration one of the key divisions. He not only rejects the Western view on immigration, but sees it as an agenda with a very specific ideology behind it, which is designed to erode the nation-state entirely.

“But Westerners, quite differently, believe that nation-states no longer exist. They therefore deny that there is a common culture and a public morality based on (the nation-state). There is no public morality, if you watched the Olympic opening yesterday, you saw it. So, they also think differently about migration. They believe that migration is not a threat or a problem, but in fact a way of getting rid of the ethnic homogeneity that is the basis of a nation. This is the essence of the progressive liberal international concept. That is why the absurdity does not occur to them, or they do not see it as absurd,” he said.

This dramatic ideological cleavage is not a “secret,” according to Orbán. He said that the documents and policy papers coming out of the EU show that the “clear aim is to transcend the nation.”

“But the point is that the powers, the sovereignty, should be transferred from the nation-states to Brussels. This is the logic behind all major measures. In their minds, the nation is a historical, or transitional, formation of the 18th and 19th centuries — as it came, so it may go. They are already in a post-national state in the Western half of Europe. It’s not just a politically different situation, but what I’m trying to talk about here is that it’s a new mental space.

And finally, the last element of reality is that this post-national situation that we see in the West has a serious, I would say dramatic, political consequence that is shaking democracy. Societies are increasingly resistant to migration, gender, war and globalism. And this creates the political problem of elites and the people, elitism and populism. This is a dominant phenomenon in Western politics today…This means that the elites condemn the people for drifting to the right. The feelings and ideas of the people are labeled xenophobia, homophobia and nationalism. The people, meanwhile, in response, accuse the elite of not caring about what is important to them, but of sinking into some kind of mindless globalism.

Case in Point 2: UK Prime Minister Calls For Crackdown On Angry Brits – Violent Migrants Get A Pass

 

From Zerohedge: Beyond the obvious Cloward-Piven agenda in play throughout most of Europe and the US, open border policies accomplish much more than simply erasing western culture with third-world migrants. The introduction of violent peoples from violent countries and ideologies is a perfect way to generate public hostility and get people to react in anger. When a government refuses to represent the interests of actual citizens under attack by foreign elements, the only avenue left to that populace is self-defense.

Establishment elites understand very well that their malicious activities are going to generate a vengeful response. Their first measure is to shame the public with accusations of “extremism” when the public fights back. When that doesn’t work, the next measure is to use popular riots as an excuse to impose authoritarian controls “in the name of safety.”

UK police and political authorities specifically avoid keeping a record of the migrant status of most perpetrators, making it difficult to directly prove who is responsible for the crime spike.  Correlation is not necessarily causation, but there were no other dramatic changes to UK society during that time period that would account for the crime spike.  The only thing that changed was the type of migrants the UK government was accepting and the amount of people they were allowing in.

Remove the migrants and watch crime numbers plummet; it’s that simple.
The UK public knows it and those in power choose to ignore it.
This has led to predictable conflict.

Keep in mind, leftist protests and riots have been ongoing in the UK for the past year in the name of Gaza, among other causes.  The law enforcement response to these situations has been minimal.  It is clear that there is a grotesque double standard when it comes to conservative or nationalist protests – If you aren’t on Team Progressive, then you aren’t allowed a redress of grievances.  The left is allowed to riot, conservatives are not even allowed to take to the streets.  This is what happened after January 6th in the US and it’s happening right now in the UK.

If the goal is a dystopian nightmare state, then the UK is well on its way.  The public has made clear that they will no longer be abused.  The question is who will blink first: the populace under threat of Orwellian surveillance and lockdowns, or the establishment fearing that the civil unrest they so hoped to trigger might grow out of their control.


Arctic Ice Slight Deficit July 31, 2024

 

The graph above shows July daily ice extents for 2024 compared to 18 year averages, and some years of note.

The black line shows that on average Arctic ice extent during July declines by 2.7M km2, down to 6.9M km2 by day 213.  2024 tracked a little higher than the 18-year average early in July, then slipped into deficit in the last 10 days.  SII was close to MASIE early in July, then diverged mid-month, running up to 666k km2 lower and ending July with ~300k km2 less extent than MASIE.  2023 was higher than average, while 2007 ended ~540k km2 in deficit to average.  2020 ice ended nearly 1 Wadham (1M km2) in deficit.

Why is this important?  All the claims of global climate emergency depend on dangerously higher temperatures, lower sea ice, and rising sea levels.  The lack of additional warming prior to 2023 El Nino is documented in a post UAH June 2024: Oceans Lead Cool Down.

The lack of acceleration in sea levels along coastlines has been discussed also.  See Observed vs. Imagined Sea Levels 2023 Update.

Also, a longer term perspective is informative:

The table below shows the distribution of sea ice on day 213 across the Arctic regions, on average, this year and in 2007. At this point in the year, the Bering and Okhotsk seas are open water and thus dropped from the table.

Region                               2024 (day 213)  Day 213 Ave.  2024-Ave.  2007 (day 213)  2024-2007
 (0) Northern_Hemisphere             6634637         6882380       -247743    6344860         289777
 (1) Beaufort_Sea                    717847          791600        -73754     760576          -42729
 (2) Chukchi_Sea                     702720          534093        168628     382350          320370
 (3) East_Siberian_Sea               760894          740772        20122      445385          315509
 (4) Laptev_Sea                      223615          368247        -144631    314382          -90767
 (5) Kara_Sea                        136159          163171        -27012     239232          -103073
 (6) Barents_Sea                     454             31406         -30952     23703           -23249
 (7) Greenland_Sea                   237168          294526        -57358     324737          -87570
 (8) Baffin_Bay_Gulf_of_St._Lawrence 170245          145062        25183      94179           76066
 (9) Canadian_Archipelago            454695          540106        -85411     510063          -55368
 (10) Hudson_Bay                     129434          133138        -3704      93655           35780
 (11) Central_Arctic                 3098283         3138628       -40345     3154837         -56554

The overall deficit to average is 248k km2 (4%).  The major deficits are in Laptev, Beaufort and CAA (Canadian Archipelago), while Chukchi is the only region with a large surplus.
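As a quick check, the regional rows of the table can be summed. Note that the regional sums fall slightly short of the Northern Hemisphere row because minor regions outside the listed twelve are omitted from the table.

```python
# Regional sea-ice extents (km^2) on day 213: (2024, 18-year average), from the table
regions = {
    "Beaufort_Sea": (717847, 791600),
    "Chukchi_Sea": (702720, 534093),
    "East_Siberian_Sea": (760894, 740772),
    "Laptev_Sea": (223615, 368247),
    "Kara_Sea": (136159, 163171),
    "Barents_Sea": (454, 31406),
    "Greenland_Sea": (237168, 294526),
    "Baffin_Bay_Gulf_of_St._Lawrence": (170245, 145062),
    "Canadian_Archipelago": (454695, 540106),
    "Hudson_Bay": (129434, 133138),
    "Central_Arctic": (3098283, 3138628),
}

total_2024 = sum(v[0] for v in regions.values())
total_ave = sum(v[1] for v in regions.values())
deficit = total_ave - total_2024   # ~249k km^2, matching the ~248k deficit quoted
pct = 100 * deficit / total_ave    # ~3.6%, rounded to 4% in the text
```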


Illustration by Eleanor Lutz shows Earth’s seasonal climate changes. If played in full screen, the four corners present views from top, bottom and sides. It is a visual representation of scientific datasets measuring ice and snow extents.

There is no charge for content on this site, nor for subscribers to receive email notifications of postings.

 

Roots of Climate Change Distortions

Roger Pielke Jr. explains at his blog Why Climate Misinformation Persists.  Excerpts in italics with my bolds and added images. H/T John Ray

Noble Lies, Conventional Wisdom, and Luxury Beliefs

In 2001, I participated in a roundtable discussion hosted at the headquarters of the National Academy of Sciences (NAS) with a group of U.S. Senators, the Secretary of Treasury, and about a half-dozen other researchers. The event was organized by Idaho Senator Larry Craig (R-ID) following the release of a short NAS report on climate to help the then-new administration of George W. Bush get up to speed on climate change.

At the time I was a 32 year-old fresh-faced researcher about to leave the National Center for Atmospheric Research for a faculty position across town at the University of Colorado. I had never testified before Congress or really had any high-level policy engagement.

When the roundtable was announced, I experienced something completely new in my professional career — several of my much more senior colleagues contacted me to lobby me to downplay or even to misrepresent my research on the roles of climate and society in the economic impacts of extreme weather. I had become fairly well known in the atmospheric sciences research community back then for our work showing that increasing U.S. hurricane damage could be explained entirely by more people and more wealth.

One colleague explained to me that my research, even though scientifically accurate, might distract from efforts to advocate for emissions reductions:

“I think we have a professional (or moral?) obligation to be very careful what we say and how we say it when the stakes are so high.”

At the time, I wrote that the message I heard was that the "ends justify the means or, in other words, doing the 'right thing' for the wrong reasons is OK" — even if that meant downplaying or even misrepresenting my own research.

I have thought about that experience over the past few weeks as I have received many comments on the first four installments of the THB series Climate Fueled Extreme Weather (Part 1, Part 2, Part 3, Part 4). One of the most common questions I’ve received asks why it is that the scientific assessments of the Intergovernmental Panel on Climate Change (IPCC) are so different from what is reported in the media, proclaimed in policy, and promoted by the most public-facing climate experts. And, why can’t that gap be closed?

Over the past 23 years, I have wondered a lot myself about this question — not just how misinformation arises in policy discourse (we know a lot about that), but why it is that the expert climate community has been unable or unwilling to correct rampant misinformation about extreme weather, with some even promoting that misinformation.

Obviously, I don’t have good answers, but I will propose three inter-related explanations that help me to make sense of these dynamics — the noble lie, conventional wisdom, and luxury beliefs.

The Noble Lie

The most important explanation is that many in the climate community — like my senior colleague back in 2001 — appear to believe that achieving emissions reductions is so very important that its attainment trumps scientific integrity. The ends justify the means. They also believe that by hyping extreme weather, they will make emissions reductions more likely (I disagree, but that is a subject for another post).

I explained this as a “fear factor” in The Climate Fix:

Typically, the battle over climate-change science focused on convincing (or, rather, defeating) those skeptical has meant advocacy focused on increasing alarm. As one Australian academic put it at a conference at Oxford University in the fall of 2009: “The situation is so serious that, although people are afraid, they are not fearful enough given the science. Personally I cannot see any alternative to ramping up the fear factor.” Similarly, when asked how to motivate action on climate change, economist Thomas Schelling replied, “It’s a tough sell. And probably you have to find ways to exaggerate the threat. . . [P]art of me sympathizes with the case for disingenuousness. . . I sometimes wish that we could have, over the next five or ten years, a lot of horrid things happening—you know, like tornadoes in the Midwest and so forth—that would get people very concerned about climate change. But I don’t think that’s going to happen.” From the opening ceremony of the Copenhagen climate negotiations to Al Gore’s documentary, An Inconvenient Truth, to public comments from leading climate scientists, to the echo-chambers of the blogosphere, fear and alarm have been central to advocacy for action on climate change.

It’s just a short path from climate fueled extreme weather
to the noble lie at the heart of fear-based campaigns.

Conventional Wisdom

The phrase was popularized by John Kenneth Galbraith in his 1958 book, The Affluent Society, where he explained:

It will be convenient to have a name for the ideas which are esteemed at any time for their acceptability, and it should be a term that emphasizes this predictability. I shall refer to these ideas henceforth as the conventional wisdom.

The key point here is “acceptability” regardless of an idea’s conformance with truth. Conventional wisdom is that which everyone knows to be true whether actually true or not. Examples of beliefs that at one time or another were/are conventional wisdom include — Covid-19 did not come from a lab, Sudafed helps with hay fever, spinach is high in iron, and climate change is fueling extreme weather.

As the noble lie of climate fueled extreme weather has taken hold as conventional wisdom, few have been willing to offer correctives — though there are important exceptions out in plain sight, like the IPCC Working Group 1 and the NOAA GFDL Hurricanes and Global Warming page.

Actively challenging conventional wisdom has professional and social costs. For instance, those who suggested that Covid-19 may have had a research-related origin were labeled conspiracy theorists and racists, those who suggested that President Biden was too old to serve another term were called Trump enablers, and those who accurately represent the science of climate and extreme weather are tarred as climate deniers and worse.

A few years ago I wrote an op-ed for a U.S. national newspaper and included a line about how hurricane landfalls had not increased in the U.S. in frequency or in their intensity since at least 1900. The editor removed the sentence and told me that while the statement was correct, most readers wouldn’t believe it and that would compromise the whole piece.

When noble lies become conventional wisdom,
they become harder to correct.

Luxury Beliefs

Yascha Mounk offers a useful definition:

Luxury beliefs are ideas professed by people who would be much less likely to hold them if they were not insulated from, and had therefore failed seriously to consider, their negative effects.

After all, what is the real harm if people believe in climate fueled extreme weather? If people believe that the extreme weather outside their window or on their feeds make aggressive climate policies more compelling, then that’s a good thing, right?

Surely there can only be positive outcomes that result from promoting misunderstandings of climate and extreme weather at odds with evidence and scientific assessments?

Well, no.

Last year I attended a forum at Lloyd’s of London. The event was held under the Chatham House rule, but as the Financial Times later reported, the company made a surprising admission contrary to the conventional wisdom (emphasis added):

Lloyd’s of London has warned insurers that the full impact of climate change has yet to translate into claims data despite annual natural catastrophe losses borne by the sector topping $100bn.

Insurance prices are surging as companies look to repair their margins after years of significant losses from severe weather to insured properties, exacerbated by inflation in rebuild costs. A warming planet has been identified by insurance experts and campaigners alike as a key factor.

But at a private event last month, one executive at the corporation that oversees the market told underwriters that it has not yet seen clear evidence that a warming climate is a major driver in claims costs.

It turns out that misunderstandings of the science of climate and extreme weather actually lead to unnecessary costs for people whose jobs involve making decisions about climate and extreme weather. Those unnecessary costs often trickle down to ordinary people. Kudos to Lloyd’s for calling things straight, even if only in a private forum — that is, of course, their professional responsibility.

Conclusion

In a classic paper in 2012, the late Steve Rayner explained that some forms of social ignorance are created and maintained on purpose:

To make sense of the complexity of the world so that they can act, individuals and institutions need to develop simplified, self-consistent versions of that world. The process of doing so means that much of what is known about the world needs to be excluded from those versions, and in particular that knowledge which is in tension or outright contradiction with those versions must be expunged. This is ‘uncomfortable knowledge’.

Climate fueled extreme weather is perhaps the canonical example
of dysfunctional socially constructed ignorance.  It may be
uncorrectable — a falsehood that persists permanently.

Here at THB, I am an optimist that science and policy are self-correcting, even if that takes a while. That means that the series on climate fueled extreme weather will keep going — Have a great weekend!

 

 

Power Density Physics Trump Energy Politics

A plethora of insane energy policy proposals are touted by clueless politicians, including the apparent Democrat candidate for US President.  So all talking heads need reminding of some basics of immutable energy physics.  This post is in service of restoring understanding of fundamentals that cannot be waved away.

The Key to Energy IQ

This brief video provides a key concept in order to think rationally about calls to change society’s energy platform.  Below is a transcript from the closed captions along with some of the video images and others added.

We know what the future of American energy will look like. Solar panels, drawing limitless energy from the sun. Wind turbines harnessing the bounty of nature to power our homes and businesses.  A nation effortlessly meeting all of its energy needs with minimal impact on the environment. We have the motivation, we have the technology. There’s only one problem: the physics.

The history of America is, in many ways, the history of energy. The steam power that revolutionized travel and the shipping of goods. The coal that fueled the railroads and the industrial revolution. The petroleum that helped birth the age of the automobile. And now, if we only have the will, a new era of renewable energy.

Except … it’s a little more complicated than that. It’s not really a matter of will, at least not primarily. There are powerful scientific and economic constraints on where we get our power from. An energy source has to be reliable; you have to know that the lights will go on when you flip the switch. An energy source needs to be affordable–because when energy is expensive…everything else gets more expensive too. And, if you want something to be society’s dominant energy source, it needs to be scalable, able to provide enough power for a whole nation.

Those are all incredibly important considerations, which is one of the reasons it’s so weird that one of the most important concepts we have for judging them … is a thing that most people have never heard of. Ladies and gentlemen, welcome to the exciting world of…power density.

Look, no one said scientists were gonna be great at branding. Put simply, power density is just how much stuff it takes to get your energy; how much land or other physical resources. And we measure it by how many watts you can get per square meter, or liter, or kilogram – which, if you’re like us…probably means nothing to you.

So let’s put this in tangible terms. Just about the worst energy source America has by the standards of power density is biofuels, things like corn-based ethanol. Biofuels provide less than 3% of America’s energy needs–and yet, because of the amount of corn that has to be grown to produce them … they require more land than every other energy source in the country combined. Lots of resources going in, not much energy coming out–which means they’re never going to be a serious fuel source.
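As a rough sketch of the bookkeeping behind power density: it is just average delivered watts divided by the land area used. The plant sizes, land areas, and capacity factors below are my own illustrative assumptions, not figures from the video.

```python
# Power density: average electrical watts delivered per square meter of land.
# The numbers fed in below are illustrative assumptions only.

def power_density_w_per_m2(nameplate_mw: float, capacity_factor: float,
                           land_km2: float) -> float:
    """Average delivered power (W) divided by land area (m^2)."""
    avg_watts = nameplate_mw * 1e6 * capacity_factor
    area_m2 = land_km2 * 1e6
    return avg_watts / area_m2

# Hypothetical 100 MW wind farm spread over 50 km^2 at a 30% capacity factor:
wind = power_density_w_per_m2(100, 0.30, 50)
# Hypothetical 1,000 MW gas plant on 1 km^2 at a 50% capacity factor:
gas = power_density_w_per_m2(1000, 0.50, 1)

print(f"wind ~ {wind:.1f} W/m^2, gas ~ {gas:.0f} W/m^2")
```

Even with generous assumptions for the wind farm, the diffuse source ends up orders of magnitude below the dense one, which is the whole point of the metric.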

Now, that’s an extreme example, but once you start to see the world in these terms, you start to realize why our choice of energy sources isn’t arbitrary. Coal, for example, is still America’s second largest source of electricity, despite the fact that it’s the dirtiest and most carbon-intensive way to produce it. Why do we still use so much of it? Well, because it’s significantly more affordable…in part because it’s way less resource-intensive.

An energy source like offshore wind, for example, is so dependent on materials like copper and zinc that it would require six times as many mineral resources to produce the same amount of power as coal. And by the way, getting all those minerals out of the ground…itself requires lots and lots of energy.

Now, the good news is that America has actually been cutting way down on its use of coal in recent years, thanks largely to technological breakthroughs that brought us cheap natural gas as a replacement. And because natural gas emits way less carbon than coal, that reduced our carbon emissions from electricity generation by more than 30%.

In fact, the government reports that switching over to natural gas did more than twice as much to cut carbon emissions as renewables did in recent years. Why did natural gas progress so much faster than renewables? It wasn’t an accident.

Energy is a little like money: You’ve gotta spend it to make it. To get usable natural gas, for example, you’ve first gotta drill a well, process and transport the gas, build a power plant, and generate the electricity. But the question is how much energy are you getting back for your investment? With natural gas, you get about 30 times as much power out of the system as you put into creating it.  By contrast, with something like solar power, you only get about 3 1/2 times as much power back.
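The "spend it to make it" ratios from the transcript (about 30:1 for natural gas, about 3.5:1 for solar) can be expressed as net surplus energy per unit invested:

```python
# Net energy returned per unit of energy invested, using the ratios quoted
# in the transcript (natural gas ~30:1, solar ~3.5:1).

def net_energy(invested: float, eroi: float) -> float:
    """Energy left for society after paying back the energy invested."""
    return invested * eroi - invested

gas_net = net_energy(1.0, 30.0)    # 29 units of surplus per unit invested
solar_net = net_energy(1.0, 3.5)   # 2.5 units of surplus per unit invested

print(f"gas surplus / solar surplus = {gas_net / solar_net:.1f}")  # 11.6
```

The gap in the surplus (29 vs. 2.5) is larger than the gap in the headline ratios, because the energy invested has to be paid back before any surplus exists.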

Replacing the now-closed Indian Point nuclear power plant would require covering all of Albany County, NY with windmills.

Hard to fuel an entire country that way. And everywhere you look, you see similarly eye-popping numbers. To replace the energy produced by just one oil well in the Permian Basin of Texas–and there are thousands of those–you’d need to build 10 windmills, each about 330 feet high. To meet just 10% of the country’s electricity needs, you’d have to build a wind farm the size of the state of New Hampshire. To get the same amount of power produced by one typical nuclear reactor, you’d need over three million solar panels. None of which means, by the way, that we shouldn’t be using renewables as a part of our energy future.

But it does mean that the dream of using only renewables is going to remain a dream,
at least given the constraints of current technology. We simply don’t know how
to do it while still providing the amount of energy that everyday life requires.

No energy source is ever going to painlessly solve all our problems. It’s always a compromise – which is why it’s so important for us to focus on the best outcomes that are achievable, because otherwise, New Hampshire’s gonna look like this.

Addendum from Michael J. Kelly

Energy return on investment (EROI)

The debate over decarbonization has focused on technical feasibility and economics. There is one emerging measure that ties closely back to the engineering and thermodynamics of energy production. The energy return on (energy) investment is a measure of the useful energy produced by a particular power plant divided by the energy needed to build, operate, maintain, and decommission the plant. The concept owes its origin to animal ecology: a cheetah must get more energy from consuming its prey than it expends catching it, otherwise it will die. If the animal is to breed and nurture the next generation, the ratio of energy obtained to energy expended has to be higher still, depending on the details of energy expenditure on these other activities. Weißbach et al. have analysed the EROI for a number of forms of energy production, and their principal conclusion is that nuclear, hydro-, and gas- and coal-fired power stations have an EROI much greater than wind, solar photovoltaic (PV), concentrated solar power in a desert, or cultivated biomass: see Fig. 2.

In human terms, with an EROI of 1, we can mine fuel and look at it—we have no energy left over. To get a society that can feed itself and provide a basic educational system, the EROI of our base-load fuel needs to exceed 5, and for a society with international travel and high culture we need an EROI greater than 10. The new renewable energies do not reach this last level when the extra energy costs of overcoming intermittency are added in. In energy terms, the current generation of renewable energy technologies alone will not enable a civilized modern society to continue!
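Kelly's thresholds can be sketched as a simple classifier. The sample EROI values fed to it are my own illustrative assumptions, not Weißbach et al.'s exact figures:

```python
# Kelly's thresholds from the passage above: EROI of 1 leaves no surplus,
# above ~5 supports a basic society, above ~10 a high-culture one.

def society_level(eroi: float) -> str:
    if eroi <= 1:
        return "no surplus energy"
    if eroi <= 5:
        return "below basic-society threshold"
    if eroi <= 10:
        return "basic society only"
    return "supports a modern civilized society"

# Illustrative EROI values (assumptions for demonstration):
for source, eroi in [("nuclear", 75), ("gas", 28), ("buffered solar PV", 4)]:
    print(source, "->", society_level(eroi))
```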

On Energy Transitions

Postscript

Fantasies of Clever Climate Policies

Chris Kenny writes at The Australian Facts at a premium in blustery climate debate. Excerpts in italics from text provided by John Ray at his blog, Greenie Watch.  My bolds and added images.

Collective Idiocy From Intellectual Vanity

We think we are so clever. The conceit of contemporary humankind is often unbearable.  Yet this modern self-regard has generated a collective idiocy, an inane confusion between feelings and facts, and an inability to distinguish between noble aims and hard reality.

This preference for virtue signalling over practical action can be explained only by intellectual vanity, a smugness that over-estimates humankind’s ability to shape the world it inhabits.

As a result we have a tendency to believe we are masters of the universe, that we can control the climate and regulate natural disasters. Too lazy or spoiled to weigh facts and think things through, we are more susceptible than ever to mass delusion.

We have seen this tendency play out in deeply worrying ways, such as the irrational belief in the communal benefits of Covid vaccination despite the distinct lack of scientific evidence. Too many people just wanted to believe the vaccine had this thing beaten.

Still, there is no area of public debate where rational thought is more readily cast aside than in the climate and energy debate. This is where alarmists demand that people “follow the science” while they deploy rhetoric, scare campaigns and policies that turn reality and science on their heads.

This nonsense is so widespread and amplified by so many authoritative figures that we have become inured to it. Teachers and children break from school to draw attention to what the UN calls a “climate emergency” as the world lives through its most populous and prosperous period in history, when people are shielded from the ill-effects of weather events better than they ever have been previously.

Politicians tell us in the same breath that producing clean energy is the most urgent and important task for the planet and reject nuclear energy, the only reliable form of emissions-free energy. The activists argue that reducing emissions is so imperative it is worth lowering living standards, alienating farmland, scarring forests and destroying industries, but it is not worth the challenge of boiling water to create energy-generating steam by using the tried and tested technology of nuclear fission.

Our acceptance of idiocy, unchecked and unchallenged, struck me in one interview this week given by teal MP Zali Steggall. In many ways it was an unexceptional interview; there are politicians and activists saying this sort of thing every day somewhere, usually unchallenged.

Steggall was preoccupied with Australia’s emissions reduction targets. “If we are going to be aligned to a science-based target and keep temperatures as close to 1.5 degrees as we can, we must have a minimum reduction of 75 per cent by 2035 as an interim target,” she said.

Steggall then patronised her audience by comparing meeting emissions targets to paying down a mortgage. The claim about controlling global temperatures is hard to take seriously, but to be fair it is merely aping the lines of the UN, which argues the increase in global average temperatures can be held to 1.5 degrees with emissions reductions of that size – globally.

We could talk all day about the imprecise nature of these calculations, the contested scientific debate about the role of other natural variabilities in climate, and the presumption that humankind, through policy imposed by a supranational authority, can control global climate as if with a thermostat. The simplistic relaying of this agenda as central to Australian policy decisions was not the worst aspect of Steggall’s presentation.

“The Coalition has no policy, so let’s be really clear, they are taking Australia out of the Paris Agreement if they fail to nominate an improvement with a 2035 target,” Steggall lectured, disingenuously.

This was Steggall promulgating the central lie of the national climate debate: that Australia’s emissions reduction policies can alter the climate. It is a fallacy embraced and advocated by Labor, the Greens and the teals, and one the Coalition is loath to challenge for fear of being tagged into a “climate denialism” argument.

It is arrant nonsense to suggest our policies can have any discernible effect on the climate or “climate risk”. Any politician suggesting so, directly or by implication, is part of a contemporary, fake-news-driven dumbing down of the public square, and injecting an urgency into our policy considerations that is hurting citizens already with high electricity prices, diminished reliability and a damaged economy.

Steggall went on to claim we were feeling the consequences of global warming already. “And for people wondering ‘How does that affect me?’, just look at your insurance premiums, our insurance premiums around Australia are going through the roof,” she extrapolated, claiming insurance costs were keeping people out of home ownership. “This is not a problem for the future,” Steggall stressed, “it is a problem for now.”

It is a problem all right – it is unmitigated garbage masquerading as a policy debate. Taking it to its logical conclusion, Steggall claims if Australia reduced its emissions further we would lower the risk of natural disasters, leading to lower insurance premiums and improved housing affordability – it is surprising that world peace did not get a mention.

Mind you, these activists do like to talk about global warming as a security issue. They will say anything that heightens fears, escalates the problem and supports their push for more radical deindustrialisation.

Our national contribution to global emissions
is now just over 1 per cent and shrinking.

Australia’s annual emissions total less than 400 megatonnes while China’s are rising by more than that total each year and are now at 10,700Mt or about 30 times Australia’s. While our emissions reduce, global emissions are increasing. We could shut down our country, eliminating our emissions completely, and China’s increase would replace ours in less than a year.

So, whatever we are doing, it is not changing and cannot change the global climate. Our national chief scientist, Alan Finkel, clearly admitted this point in 2018, even though he was embarrassed by its implications in the political debate. Yet the pretence continues.

And before critics suggest I am arguing for inaction, I am not. But clearly, the logical and sensible baseline for our policy consideration should be a recognition that our national actions cannot change the weather. Therefore we should carefully consider adaptation to measured and verified climate change, while we involve ourselves as a responsible nation in global negotiations and action.

Obviously, we should not be leading that action but acting cautiously to protect our own interests and prosperity.

It is madness for us to undermine our cheap energy advantage to embark on a renewables-plus-storage experiment that no other country has dared to even try, when we know it cannot shift the global climate one iota. It is all pain for no gain.

Yet that is what this nation has done. So my question today is what has happened to our media, academia, political class and wider population so that it allows this debate and policy response to occur in a manner that is so divorced from reality?

Are we so complacent and overindulged that we accept post-rational debate to address our post-material concerns? Even when it is delivering material hardship to so many Australians and jeopardising our long-term economic security?

Should public debate accept absurd baseline propositions such as the idea that our energy transition sacrifice will improve the weather and reduce natural disasters, simply because they are being argued by major political groupings or the UN? Or should we not try to impose a dose of reality and stick to the facts?

This feebleness of our public debate has telling ramifications – there is no way this country could have embarked on the risky, expensive and doomed renewables-plus-storage experiment if policies and prognostications had been subject to proper scrutiny and debate.

Our media is now so polarised that the climate activists of Labor, the Greens and the teals are able to ensure their nonsensical advocacy is never challenged, and the green-left media, led by the publicly funded ABC, leads the charge in spreading misinformation.

Clearly, we are not as clever as we think. Our children need us to wise up.

Nine July Days Break Wind Power Bubble

Parker Gallant reports at his blog  Nine July Days Clearly Demonstrate Industrial Wind Turbines Intermittent Uselessness.  Excerpts in italics with my bolds and added image. H/T John Ray

The chart below uses IESO data for nine (9) July days and clearly demonstrates the vagaries of those IWT (Industrial Wind Turbines) which on their highest generation day operated at 39.7% of their capacity and on their lowest at 2.3%!  As the chart also notes, our natural gas plants were available to ramp up or down to ensure we had a stable supply of energy but rest assured IESO would have been busy either selling or buying power from our neighbours to ensure the system didn’t crash. [Independent Electricity System Operator for Ontario, Canada]

The only good news coming out of the review was that IESO did not curtail any wind generation, as demand was atypical of Ontario’s summer days, with much higher demand than on winter days.

Days Gone By:         

Shortly after the McGuinty-led Ontario Liberal Party had directed IESO to contract IWT as a generation source, their Annual Planning Outlook would suggest/guess those IWT would generate an average of 15% of their capacity during our warmer months (summer) and 45% of their capacity during our colder months (winter). For the full year they projected an average generation of 30% of capacity, and presumably that assumption was based on average annual Ontario winds!

The contracts for those IWT pay the owners $135/MWh, so over the nine days contained in the chart below the 125,275 MWh generated revenue for the owners of $16,912,125, even though they generated an average of only 11.8% of their capacity.  They are paid despite missing the suggested target IESO used because, with the exception of nuclear power, they rank ahead of most of Ontario’s other generation capacity due to the “first-to-the-grid” rights contained in their contracts, at the expense of us ratepayers/taxpayers!
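As a sanity check on those nine-day figures (125,275 MWh at $135/MWh, measured against the 4,900 MW of contracted IWT capacity cited later in this post):

```python
# Verify the nine-day revenue and capacity-factor figures quoted above.

generated_mwh = 125_275   # IWT output over the nine July days
price_per_mwh = 135       # contracted rate, $/MWh
capacity_mw = 4_900       # contracted IWT capacity
days = 9

revenue = generated_mwh * price_per_mwh
capacity_factor = generated_mwh / (capacity_mw * 24 * days)

print(f"revenue = ${revenue:,}")                    # $16,912,125
print(f"capacity factor = {capacity_factor:.1%}")   # 11.8%
```

Both stated numbers check out against the raw MWh figure.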

Should one bother to do the math on the annual costs, based on the 15% summer and 45% winter factors IESO previously used, annual generation from those IWT would be about 3.9 TWh in the summer and 11.7 TWh in the winter, with an annual cost of just over $2.1 billion for serving up frequently unneeded generation which is either sold off at a loss or curtailed!
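That annual-cost arithmetic can be verified directly from the stated seasonal TWh figures and the $135/MWh contract rate:

```python
# Annual IWT cost from the seasonal generation figures quoted above.

summer_twh = 3.9
winter_twh = 11.7
price_per_mwh = 135

annual_cost = (summer_twh + winter_twh) * 1e6 * price_per_mwh  # TWh -> MWh

print(f"${annual_cost / 1e9:.2f} billion per year")  # $2.11 billion
```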

Replacing Natural Gas Plants with BESS:

Anyone who has followed the perceived solution of ridding the electricity grid of fossil fuels such as natural gas will recognize that ENGO [Environmental Non-Governmental Organizations] have convinced politicians that battery energy storage systems (BESS) are the solution!  Well, are they, and how much would Ontario have needed over those nine charted July days? One good example is July 9th and 10th, so combining the energy generated by natural gas from the chart over those two days is the place to start. To replace that generation of 221,989 MWh with BESS units the math is simple, as those BESS units are reputed to deliver four (4) times their rated capacity. Dividing the MWh generated by Ontario’s natural gas generators over those two days by four therefore means we would need approximately 55,500 MW of BESS to replace what those natural gas plants generated.  That 55,500 MW of BESS is over 27 times what IESO has already contracted for and would add huge costs to electricity generation in the province, driving up costs for all ratepaying classes. The 2,034 MW of BESS IESO has already contracted is estimated to cost ratepayers $341 million annually, meaning 55,500 MW of BESS on the grid would add over $9 billion annually to our costs to hopefully avoid blackouts!
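The BESS sizing above can be checked in a few lines; the four-hour delivery assumption and all input figures come from the paragraph itself:

```python
# Replacing two days of natural-gas generation (July 9-10) with BESS,
# assuming each MW of BESS delivers 4x its rated capacity, as stated above.

gas_two_day_mwh = 221_989      # natural-gas generation over July 9-10
storage_multiple = 4           # MWh delivered per MW of rated BESS
contracted_bess_mw = 2_034     # BESS IESO has already contracted
contracted_cost = 341e6        # estimated annual cost of that contracted BESS, $

needed_bess_mw = gas_two_day_mwh / storage_multiple
scale = needed_bess_mw / contracted_bess_mw
annual_cost = scale * contracted_cost

print(f"{needed_bess_mw:,.0f} MW of BESS needed")        # ~55,497 MW
print(f"{scale:.1f}x the contracted fleet")              # ~27.3x
print(f"${annual_cost / 1e9:.1f} billion per year")      # ~$9.3 billion
```

The "over 27 times" and "over $9 billion" claims follow directly from the stated inputs.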

The other interesting question is how those 55,500 MW would be able to recharge to be ready for future high-demand days, perhaps driven by EV recharging or heating and cooling pumps operating.  The wind would have to be blowing strong and the sun would need to be shining but, as we know, both are frequently missing, so blackouts seem to be the theme proposed by those ENGO and our out-of-touch politicians and bureaucrats!

Just one simple example as to where we seem to be headed
based on the insane push to reach that “net-zero” emissions target!

IESO Ontario Electrical Energy Output by Source in 2023

Extreme Examples of Missing IWT generation:

What the chart doesn’t contain, or highlight, is how those 4,900 MW of IWT capacity are undoubtedly consuming more power than they are generating on many occasions; the IESO data for those nine days contained some clear examples, of which fewer than a dozen are highlighted here!

To wit:

  • July 5th at Hour 11 they managed to deliver only 47 MWh!
  • July 7th at Hours 8, 9, and 10 they respectively generated 17 MWh, 3 MWh and 18 MWh! 
  • July 9th at Hour 9 they delivered 52 MWh!
  • July 12th at Hours 8, 9, 10 and 11 they respectively generated 33 MWh, 13 MWh, 13 MWh and 35 MWh. 
  • July 13th at Hours 9 and 10 they managed to generate 19 MWh and 39 MWh respectively! 
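Expressed as a share of the 4,900 MW fleet, those worst hours are vanishingly small. A quick sketch using the hourly outputs listed above:

```python
# Hourly capacity factors for the worst hours listed above,
# against the 4,900 MW of contracted IWT capacity.

capacity_mw = 4_900
worst_hours_mwh = [47, 17, 3, 18, 52, 33, 13, 13, 35, 19, 39]

for mwh in worst_hours_mwh:
    print(f"{mwh:>3} MWh -> {mwh / capacity_mw:.3%} of capacity")
```

The 3 MWh hour on July 7th works out to roughly 0.06% of the fleet's rated capacity.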

Conclusion:

Why politicians and bureaucrats around the world have been gobsmacked by those peddling the reputed concept of IWT generating cheap, reliable electricity is mind-blowing, as the chart, coupled with the facts, clearly shows for just nine days and looking only at Ontario!

Much like the first electric car, invented in 1839 by a Scottish inventor named Robert Davidson, the first electricity generated by a wind turbine came from another Scottish inventor, Sir James Blyth, who did exactly that in 1887. Neither of those old “inventions” garnered much global acceptance until ENGO figures like Michael Mann and Greta arrived on the scene pontificating about “global warming” being caused by mankind’s use of fossil fuels!

As recent events have demonstrated both EV and IWT are not the panacea to save the world from either “global warming” or “climate change” even though both have “risen from the dead” due to the “net-zero” push by ENGO.

The time has come for our politicians to wake up and recognize they are supporting more-than-century-old technology aimed at ridding the world of CO₂ emissions.  They fail to see that without CO₂, mankind will be set back to a time when we had trouble surviving!

Stop the push and stop using ratepayer and taxpayer dollars for the fiction created by those pushing the “net-zero” initiative. That initiative is actually generating more CO₂, such as from the 250 tons of concrete used for just one 2 MW IWT installation!   Reality Bites!

Happer: Cloud Radiation Matters, CO2 Not So Much

Earlier this month William Happer spoke on Radiation Transfer in Clouds at the EIKE conference, and the video is above.  For those preferring to read, below is a transcript from the closed captions along with some key exhibits.  I left out the most technical section in the latter part of the presentation. Text in italics with my bolds.

William Happer: Radiation Transfer in Clouds

People have been looking at clouds for a very long time in a quantitative way. This is one of the first quantitative studies, done about 1800. And this is John Leslie, a Scottish physicist who built this gadget. He called it an Aethrioscope, but basically it was designed to figure out how effective the sky was in causing frost. If you live in Scotland you worry about frost. So it consisted of two glass bulbs with a very thin capillary attachment between them. And there was a little column of alcohol here.

The bulbs were full of air, so if one bulb got a little bit warmer it would force the alcohol through the capillary, and if it got colder it would suck the alcohol back. So he set this device out under the clear sky. He described that “the sensibility of the instrument is very striking, for the liquor incessantly falls and rises in the stem with every passing cloud. In fine weather the aethrioscope will seldom indicate a frigorific impression of less than 30 or more than 80 millesimal degrees.” He’s talking about how high this column of alcohol would go up and down. If the sky became overclouded, “it may be reduced to as low as 15” — referring to how much the sky cools — “or even five degrees when the congregated vapours hover over the hilly tracks.” We don’t speak English that way anymore, but I love it.

The point was that even in 1800 Leslie and his colleagues knew very well that clouds have an enormous effect on the cooling of the earth. And of course anyone who has a garden knows that if you have a clear calm night you’re likely to get Frost and lose your crops. So this was a quantitative study of that.

Now it’s important to remember that if you go out today the atmosphere is full of two types of radiation. There’s sunlight, which you can see, and then there is the thermal radiation that’s generated by greenhouse gases, by clouds and by the surface of the Earth. You can’t see thermal radiation, but you can feel it, if it’s intense enough, by its warming effect. And these curves practically don’t overlap, so we’re really dealing with two completely different types of radiation.
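A quick back-of-envelope sketch (my addition, not from Happer’s slides) shows why the two curves sit so far apart: Wien’s displacement law puts the peak of a blackbody spectrum at a wavelength of about 2898 µm·K divided by the temperature, so the Sun and the Earth radiate in almost disjoint wavelength bands.

```python
# Wien's displacement law: peak wavelength = b / T, with b ~ 2898 um*K.
# Illustrative only -- the temperatures below are the usual textbook values.
WIEN_B_UM_K = 2898.0

def peak_wavelength_um(temperature_k: float) -> float:
    """Wavelength (in micrometers) of peak blackbody emission at T kelvin."""
    return WIEN_B_UM_K / temperature_k

print(f"Sun   (~5772 K): peak near {peak_wavelength_um(5772.0):.2f} um (visible)")
print(f"Earth (~288 K):  peak near {peak_wavelength_um(288.0):.1f} um (thermal infrared)")
```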

There’s sunlight, which scatters very nicely off of not only clouds but molecules; that’s the blue sky, the Rayleigh scattering. Then there’s the thermal radiation, which actually doesn’t scatter at all on molecules, so greenhouse gases are very good at absorbing thermal radiation but they don’t scatter it. But clouds do scatter thermal radiation. Plotted here is the probability of finding a photon of sunlight within a given interval of the logarithmic wavelength scale.

Since Leslie’s day two types of instruments have been developed to do what he did more precisely. One of them is called a pyranometer, and this is designed to measure sunlight coming down onto the Earth on a day like this. So you put this instrument out there and it would read the flux of sunlight coming down. It’s designed to see sunlight coming in from every direction, so it doesn’t matter at which angle the sun is shining; it’s calibrated to see them all.

Let me show you a measurement by a pyranometer. This is actually a curve from a sales brochure of a company that will sell you one of these devices. It’s comparing two types of detectors, and as you can see they’re very good; you can hardly tell the difference. The point is that if you look on a clear day with no clouds, you see sunlight beginning to increase at dawn; it peaks at noon and goes down to zero, and there’s no sunlight at night. So half of the day, over most of the Earth, there’s no sunlight in the atmosphere.

Here’s a day with clouds, just a few days later, shown by days of the year going across. You can see that every time a cloud goes by, the intensity hitting the ground goes down. With a little clear sky it goes up, then down, up, and so on. On average, on this particular day you get a lot less sunlight than you did on the clear day.

But you know, nature is surprising. Einstein had this wonderful quote: God is subtle, but he is not malicious. He meant that nature does all sorts of things you don’t expect, so let me show you what happens on a partly cloudy day. This is data taken near Munich. The blue curve is the measurement and the red curve is the intensity on the ground if there were no clouds. This is a partly cloudy day, and you can see there are brief periods when the sunlight is much brighter on the detector than it would be on a clear day. That’s because coming through clouds you get focusing from the edges of the cloud pointing down toward your detector. That means somewhere else there’s less radiation reaching the ground. This is rather surprising to most people — I was very surprised to learn about it — but it just shows that the actual details of climate are a lot more subtle than you might think.

We know that visible light only happens during the daytime and stops at night. There’s a second important type of radiation, the thermal radiation, which is measured by a similar device. You have a silicon window that passes infrared, which is below the band gap of silicon, so it passes through as though the window were transparent. Then there are some interference filters here to give you further discrimination against sunlight. So sunlight practically doesn’t go through this at all; they call it solar blind, since it doesn’t see the Sun.

But it sees thermal radiation very clearly, with a big difference between this device and the sunlight-sensing device I showed you: most of the time this one is radiating up, not down. Out in the open air this detector normally gets colder than the body of the instrument. So it’s carefully calibrated for you to compare the balance of downcoming radiation with the upgoing radiation. Upgoing is normally greater than downcoming.

I’ll show you some measurements of the downwelling flux; these are actually from Thule in Greenland, and these are watts per square meter on the vertical axis. The first thing to notice is that the radiation continues day and night; if you look at the output of the pyrgeometer you can’t tell whether it’s day or night, because the atmosphere is just as bright at night as it is during the day. However, the big difference is clouds: on a cloudy day you get a lot more downwelling radiation than you do on a clear day. Here’s nearly a full day of clear weather, and there are another several days of clear weather. Then suddenly it gets cloudy. Radiation rises because the bottoms of the clouds are relatively warm, at least compared to the clear sky. I think if you put the numbers in, this cloud bottom is around 5° Centigrade, so it was a fairly low cloud; it was summertime in Greenland. This compares to about minus 5° for the clear sky.

So there’s a lot of data out there, and there really is downwelling radiation; no question about that, you measure it routinely. And now you can do the same thing looking down from satellites. This is a picture that I downloaded a few weeks ago, to get ready for this talk, from Princeton, and it was 6 PM at Princeton, so it was already dark in Europe. This is a picture of the Earth from a geosynchronous satellite that’s parked over Ecuador. You are looking down on the Western Hemisphere, and this is a filtered image of the Earth in blue light at 0.47 micrometers. So it’s a nice blue color, not so different from the sky, and it’s dark where the sun has set. There’s still a fair amount of sunlight over the United States and further west.

Here, at exactly the same time and from the same satellite, is the infrared radiation coming up at 10.3 micrometers, which is right in the middle of the infrared window where there’s not much greenhouse gas absorption; there’s a little bit from water vapor, but very little, and a trivial amount from CO2.

As you can see, you can’t tell which side is night and which side is day. So even though the sun has set over here, it is still glowing nice and bright. There’s sort of a pesky difference here, because in the visible image what you’re looking at over the Intertropical Convergence Zone is reflected sunlight. There are lots of high clouds that have been pushed up by the convection in the tropics, so that means more visible light here. In the infrared image you’re looking at emission from the cloud tops, so that is less thermal light. White here means less light, while white there means more light, so you have to calibrate your thinking.

But the striking thing about all of this: you can see the Earth is covered with clouds; you have to look hard to find a clear spot of the earth. Roughly half of the earth, maybe, is clear at any given time, but most of it is covered with clouds. So if anything governs the climate it is clouds, and that’s one of the reasons I so admire the work that Svensmark and Nir Shaviv have done. Because they’re focusing on the most important mechanism of the earth’s climate: it’s not greenhouse gases, it’s clouds. You can see that here.

Now this is a single frequency; let me show you what happens if you look down from a satellite and look at the spectrum. This is the spectrum of light coming up over the Sahara Desert, measured from a satellite. Here is the infrared window; there’s the 10.3 micrometers I mentioned in the previous slide; it’s a clear region. So radiation in this region can get up from the surface of the Sahara right out to space.

Notice that the units on these scales are very different: over the Sahara the top unit is 200, it is 150 over the Mediterranean, and it’s only 60 over the South Pole. But at least the Mediterranean and the Sahara are roughly similar. The three curves on the right are observations from satellites, and the three curves on the left are model calculations that we have done. The point here is that you can hardly tell the difference between a model calculation and the observed radiation.

So it’s really straightforward to calculate radiation transfer. If someone quotes you a number in watts per square meter, you should take it seriously; that’s probably a good number. If they tell you a temperature, you don’t know what to make of it, because there’s a big step in going from watts per square meter to a temperature change. All the mischief in the whole climate business is in going from watts per square meter to Centigrade or Kelvin.

Now I will say just a few words about clear sky, because that is the simplest case; then we’ll get on to clouds, the topic of this talk. This is a calculation with the same codes that I showed you in the previous slide, which as you saw work very well. It’s worth spending a little time here, because this is the famous Planck curve that was the birth of quantum mechanics. There is Max Planck, who figured out what the formula for that curve is and why it is that way. This is what the Earth would radiate at 15° Centigrade if there were no greenhouse gases: you would get this beautiful smooth curve, the Planck curve. If you actually look at the Earth from the satellites you get a raggedy, jaggedy black curve. We like to call that the Schwarzschild curve, because Karl Schwarzschild was the person who showed how to do that calculation. Tragically he died during World War I, a big loss to science.
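The smooth Planck curve Happer describes can be sketched numerically. Below is a minimal calculation (my addition, using the standard Planck law rather than Happer’s own codes) of the spectral radiance of a blackbody at 15 °C (288 K); the radiance peaks near 10 µm, right in the infrared window discussed above.

```python
import math

# Physical constants (SI)
H = 6.626e-34   # Planck constant, J*s
C = 2.998e8     # speed of light, m/s
KB = 1.381e-23  # Boltzmann constant, J/K

def planck_radiance(wavelength_m: float, temperature_k: float) -> float:
    """Planck spectral radiance B(lambda, T) in W / (m^2 * sr * m)."""
    a = 2.0 * H * C**2 / wavelength_m**5
    x = H * C / (wavelength_m * KB * temperature_k)
    return a / (math.exp(x) - 1.0)

# Sample the curve for a 288 K (15 C) blackbody across the thermal infrared
for wl_um in (5.0, 10.0, 15.0, 20.0):
    b = planck_radiance(wl_um * 1e-6, 288.0)
    print(f"{wl_um:5.1f} um : {b:.3e} W/(m^2 sr m)")
```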

There are two colored curves to which I want to draw your attention. The green curve is what Earth would radiate to space if you took away all the CO2, so it only differs from the black curve in the CO2 band here; this is the bending band of CO2, which is the main greenhouse effect of CO2. There’s a little additional effect here, the asymmetric stretch, but it doesn’t contribute very much. Then here is a red curve, and that’s what happens if you double CO2.

So notice the huge asymmetry. Taking all 400 parts per million of CO2 away from the atmosphere causes this enormous change of 30 watts per square meter, the difference between this green 307 and the black 277. But if you double CO2 you practically don’t make any change. This is the famous saturation of CO2. At the levels we have now, doubling CO2 — a 100% increase — only changes the radiation to space by 3 watts per square meter, the difference between 274 for the red curve and 277 for the curve for today. So it’s a tiny amount: for a 100% increase in CO2, a 1% decrease of radiation to space.
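The saturation Happer describes is commonly approximated by a logarithmic forcing law. The sketch below (my addition, not Happer’s calculation) uses the widely cited Myhre et al. (1998) fit, ΔF ≈ 5.35 ln(C/C₀) W/m², which gives roughly the same few watts per square meter per doubling, and shows how each equal increment of CO2 adds less forcing than the one before.

```python
import math

ALPHA = 5.35  # W/m^2 -- coefficient from the Myhre et al. (1998) fit

def co2_forcing(c_ppm: float, c0_ppm: float = 400.0) -> float:
    """Approximate change in radiation balance (W/m^2) going from c0 to c ppm."""
    return ALPHA * math.log(c_ppm / c0_ppm)

# Per doubling: ~3.7 W/m^2, the same order as the 3 W/m^2 in the talk
print(f"400 -> 800 ppm: {co2_forcing(800.0):.2f} W/m^2")

# Saturation: equal 100 ppm steps contribute less and less forcing
for c in (500.0, 600.0, 700.0, 800.0):
    step = co2_forcing(c) - co2_forcing(c - 100.0)
    print(f"{c - 100.0:.0f} -> {c:.0f} ppm adds {step:.2f} W/m^2")
```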

That allows you to estimate the feedback-free climate sensitivity in your head. I’ll talk you through it. Doubling CO2 is a 1% decrease of radiation to space. If that happens, the Earth will start to warm up. But it radiates as the fourth power of the temperature, so with a fourth power the temperature only has to rise by one-quarter of a percent in absolute temperature. So a 1% forcing in watts per square meter means a one-quarter percent rise of temperature in Kelvin. Since the ambient temperature is about 300 Kelvin (actually a little less), a quarter of a percent of that is 0.75 Kelvin. So the feedback-free equilibrium climate sensitivity is less than 1 degree; it’s 0.75 Centigrade. It’s a number you can do in your head.
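Happer’s in-your-head estimate can be checked against the slide’s own numbers (my arithmetic; the 277 and 3 W/m² figures come from his black and red curves, and 288 K is the usual mean surface temperature, his “about 300 K, actually a little less”).

```python
# Feedback-free sensitivity from F ~ T^4:
#   dF/F = 4 * dT/T   =>   dT = (T/4) * (dF/F)
T = 288.0    # ambient temperature, K
F = 277.0    # radiation to space today, W/m^2 (the black curve)
dF = 3.0     # change on doubling CO2, W/m^2 (black minus red curve)

fractional_forcing = dF / F                  # roughly 1%
dT = (T / 4.0) * fractional_forcing          # roughly 0.75-0.8 K

print(f"Fractional forcing: {fractional_forcing:.2%}")
print(f"Feedback-free sensitivity: {dT:.2f} K")
```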

So when you hear about 3 Centigrade instead of 0.75 C, that’s a factor of four, all of which is positive feedback. But is there really that much positive feedback? Most feedbacks in nature are negative; there is the famous Le Chatelier principle, which says that if you perturb a system, it reacts in a way that dampens the perturbation, not increases it. There are a few positive feedback systems we’re familiar with; for example, high explosives have positive feedback. If the earth’s climate were like other positive feedback systems, which are all highly explosive, it would have exploded a long time ago. But the climate has never done that, so the empirical observational evidence from geology is that the climate is like almost any other feedback system: the feedback is probably negative. I leave that thought with you, and let me stress again:

This is clear skies, no clouds; if you add clouds, all they do
is suppress the effects of changes of the greenhouse gases.

So now let’s talk about clouds and the theory of clouds, since we’ve already seen that clouds are very important. Here is the formidable equation of transfer, which has been around since Schwarzschild’s day. Some of the symbols here relate to the intensity; another represents scattering. If thermal radiation hits a greenhouse gas, it comes in and is immediately absorbed; there’s no scattering at all. But if it hits a cloud particle it will scatter this way or that way, or maybe even backwards.

All of that is described by this integral: you’ve got incoming light in one direction and outgoing light in a second direction. At the same time you’ve got thermal emission: the warm particles of the cloud are emitting radiation, creating photons which come out and increase the Earth glow. This is represented by two parameters. Even a single cloud particle has an albedo; this is the fraction of radiation hitting the particle that is scattered, as opposed to absorbed and converted to heat. It’s a very important parameter. For visible light and white clouds, typically 99% of the encounters are scattered, but for thermal radiation it’s much less: water scatters thermal radiation only about half as efficiently as shorter wavelengths.
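For reference, the “formidable equation of transfer” is usually written in the standard plane-parallel form going back to Schwarzschild and Chandrasekhar (this is the textbook form, not necessarily Happer’s exact slide):

```latex
\mu \,\frac{dI_\nu(\tau,\mu,\varphi)}{d\tau}
= I_\nu(\tau,\mu,\varphi)
- \frac{\tilde\omega}{4\pi}\int_0^{2\pi}\!\!\int_{-1}^{1}
  P(\mu,\varphi;\mu',\varphi')\, I_\nu(\tau,\mu',\varphi')\, d\mu'\, d\varphi'
- \bigl(1-\tilde\omega\bigr)\, B_\nu(T)
```

Here $I_\nu$ is the intensity, $\tilde\omega$ is the single-scattering albedo Happer mentions (the fraction scattered rather than absorbed), $P$ is the scattering phase function, and $B_\nu(T)$ is the Planck thermal source. Setting $\tilde\omega = 0$ recovers the pure-absorption greenhouse-gas case, where the scattering integral drops out.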

The big problem is that, in spite of all the billions of dollars we have spent, these things are still not well known — though they should be known, and would have been known if there hadn’t been this crazy fixation on carbon dioxide and greenhouse gases. We have neglected working on the areas that are really important, as opposed to the trivial effects of greenhouse gases. Attenuation in a cloud is both scattering and absorption, and of course you have to solve these equations for every different frequency of the light, because, especially for molecules, there is a strong frequency dependence.

In summary, let me show you this photo, which was taken by Harrison Schmitt, a friend of mine, on one of the moonshots. It was taken in December, and looking at it you can see that they were south of Madagascar when the photograph was taken. You can see it was winter because the Intertropical Convergence Zone is quite a bit south of the Equator; it has moved way south of India and Saudi Arabia. By good luck they had the sun behind them, so they had the whole earth illuminated.

There’s a lot of information there, and again let me draw your attention to how much of the Earth is covered with clouds. Only a small part of the Earth — of the order of half — can actually be directly affected by greenhouse gases. The takeaway message is that clouds and water vapor are much more important than greenhouse gases for earth’s climate. The second point is the reason they’re much more important: doubling CO2, as I indicated in the middle of the talk, only causes a 1% difference of radiation to space. It is a very tiny effect, because of saturation. People like to say that’s not so, but you can’t really argue that one; even the IPCC gets the same numbers that we do.

And you also know that covering half of the sky with clouds will decrease solar heating by 50%. So for clouds it’s one to one; for greenhouse gases it’s 100 to one. If you really want to affect the climate, you want to do something to the clouds. You will have a very hard time making any difference with CO2 Net Zero, if you are alarmed about the warmings that have happened.

So one would hope that with all the money we’ve spent trying to turn CO2 into a demon, some good science has come out of it. From my point of view this scattering theory is a small part of it, and I think it will still be here long after the craze over greenhouse gases has gone away. I hope there will be other things too. You can point to the better instrumentation that we’ve got, satellite instrumentation as well as ground instrumentation; that’s been a good investment of money. But the money we’ve spent on supercomputers and modeling has been completely wasted, in my view.