Doomsday Glacier 2024 Hot News (again)

With the potential to raise global sea levels, Antarctica’s Thwaites Glacier has been widely nicknamed the ‘Doomsday Glacier’

Climate alarmists are known to recycle memes to frighten the public into supporting their agenda. The climate news control desk calls the plays and the media fills the air and print with the scare du jour.

‘Doomsday glacier’ rapid melt could lead to higher sea level rise than thought: study
Vancouver Sun on MSN.com (3 hours ago)

Thwaites ‘Doomsday Glacier’ in Antarctica is melting much faster than predicted
USA Today (10 hours ago)

For the first time, there’s visual evidence warm sea water is pushing under doomsday glacier: Study
CBC.ca  (11 hours ago)

‘Doomsday Glacier’ Explained: Why Scientists Believe It Predicts Devastating Sea Levels—Which Might Happen Faster Than Thought
Forbes on MSN.com (4 days ago)

Scientists worry so-called “Doomsday Glacier” is near collapse, satellite data reveals
Yahoo (2 days ago)

The doomsday glacier is undergoing “vigorous ice melt” that could reshape sea level rise projections
CBS News on MSN.com (3 days ago)

We’ve underestimated the ‘Doomsday’ glacier – and the consequences could be devastating
The Independent on MSN.com (4 days ago)

Etc., Etc., Etc.,

This torrent of concern was on the front burner in 2022, rested for a while, and now it’s back. Below is what you need to know so you won’t be bamboozled.

OMG! Doomsday Glacier Melting. Again.

Climate alarms often involve big numbers in faraway places threatening you in your backyard. Today’s example of such a scare comes from the Daily Mail: Antarctica’s ‘Doomsday Glacier’ is melting at the fastest rate for 5,500 YEARS – and could raise global sea levels by up to 11 FEET, study warns. Excerpts in italics with my bolds.

Although these vulnerable glaciers were relatively stable during the past few millennia, their current rate of retreat is accelerating and already raising global sea level,’ said Dr Dylan Rood of Imperial’s Department of Earth Science and Engineering, who co-authored the study.

The West Antarctic Ice Sheet (WAIS) is home to the Thwaites and Pine Island glaciers, and has been thinning over the past few decades amid rising global temperatures.  The Thwaites glacier currently measures 74,131 square miles (192,000 square kilometres) – around the same size as Great Britain.  Meanwhile, at 62,662 square miles (162,300 square kilometres), the Pine Island glacier is around the same size as Florida.  Together, the pair have the potential to cause enormous rises in global sea level as they melt.

‘These currently elevated rates of ice melting may signal that those vital arteries from the heart of the WAIS have been ruptured, leading to accelerating flow into the ocean that is potentially disastrous for future global sea level in a warming world,’ Dr Rood said.

‘We now urgently need to work out if it’s too late to stop the bleeding.’

On the Contrary

From Volcano Active Foundation:  West Antarctica hides almost a hundred volcanoes under the ice:

The colossal West Antarctic ice sheet hides what appears to be the largest volcanic region on the planet, according to the results of a study carried out by researchers at the University of Edinburgh (UK) and reported in the journal Geological Society.

Experts have discovered as many as 91 volcanoes under Antarctic ice, the largest of which is as tall as Switzerland’s Eiger, which rises 3,970 meters above sea level.

“We found 180 peaks, but we discounted 50 because they didn’t match the other data,” explains Robert Bingham, co-author of the paper. They eventually found 138 peaks under the West Antarctic ice sheet, including 47 volcanoes already known because their peaks protrude through the ice, leaving the figure of 91 newly discovered.

Source: volcanofoundation with glacier locations added

The media narrative blames glacier changes on a “warming world,” code for our fault for burning fossil fuels.  And as usual, it is lying by omission.  Researcher chaam jamal explains in his article A Climate Science Obsession with the Thwaites Glacier.  Excerpts in italics with my bolds.

It appears that costly and sophisticated research by these very dedicated climate scientists has made the amazing discovery that maps the deep channels on the seafloor bathymetry by which warm water reaches the underside of the Thwaites glacier and thus explains how this Doomsday glacier melts.

Yet another consideration, not given much attention in this research, is the issue not of identifying the channels by which the deep ocean waters flow to the bottom of the Doomsday Glacier, but of identifying the source of the heat that makes the water warm. Only if that source of heat is anthropogenic global warming caused by fossil fuel emissions that can be moderated by taking climate action, can the observed melt at the bottom of the Thwaites glacier be attributed to AGW climate change.

However, no such finding is made in this research project possibly because these researchers know, as do most researchers who study Antarctica, that this region of Antarctica is extremely geologically active. It is located directly above the West Antarctic Rift system with 150 active volcanoes on the sea floor and right in the middle of the Marie Byrd Mantle Plume with hot magma seeping up from the mantle.

Ralph Alexander updates the situation in 2022 with his article No Evidence That Thwaites Glacier in Antarctica Is about to Collapse.  Excerpts in italics with my bolds.

Contrary to recent widespread media reports and dire predictions by a team of earth scientists, Antarctica’s Thwaites Glacier – the second fastest melting glacier on the continent – is not on the brink of collapse. The notion that catastrophe is imminent stems from a basic misunderstanding of ice sheet dynamics in West Antarctica.

Because the ice shelf already floats on the ocean, collapse of the shelf itself and release of a flotilla of icebergs wouldn’t cause global sea levels to rise. But the researchers argue that loss of the ice shelf would speed up glacier flow, increasing the contribution to sea level rise of the Thwaites Glacier – often dubbed the “doomsday glacier” – from 4% to 25%.

But such a drastic scenario is highly unlikely, says geologist and UN IPCC expert reviewer Don Easterbrook. The misconception is about the submarine “grounding” of the glacier terminus, the boundary between the glacier and its ice shelf extending out over the surrounding ocean, as illustrated in the next figure.

A glacier is not restrained by ice at its terminus. Rather, the terminus is established by a balance between ice gains from snow accumulation and losses from melting and iceberg calving. The removal of ice beyond the terminus will not cause unstoppable collapse of either the glacier or the ice sheet behind it.

Other factors are important too, one of which is the source area of Antarctic glaciers. Ice draining into the Thwaites Glacier is shown in the right figure above in dark green, while ice draining into the Pine Island glacier is shown in light green; light and dark blue represent ice draining into the Ross Sea to the south of the two glaciers.

The two glaciers between them drain only a relatively small portion of the West Antarctic ice sheet, and the total width of the Thwaites and Pine Island glaciers constitutes only about 170 kilometers (100 miles) of the 4,000 kilometers (2,500 miles) of West Antarctic coastline.

Of more importance are possible grounding lines for the glacier terminus. The retreat of the present grounding line doesn’t mean an impending calamity because, as Easterbrook points out, multiple other grounding lines exist. Although the base of much of the West Antarctic ice sheet, including the Thwaites glacier, lies below sea level, there are at least six potential grounding lines above sea level, as depicted in the following figure showing the ice sheet profile. A receding glacier could stabilize at any of these lines, contrary to the claims of the recent research study.

As can be seen, the deepest parts of the subglacial basin lie beneath the central portion of the ice sheet where the ice is thickest. What is significant is the ice thickness relative to its depth below sea level. While the subglacial floor at its deepest is 2,000 meters (6,600 feet) below sea level, almost all the subglacial floor in the above profile is less than 1,000 meters (3,300 feet) below the sea. Since the ice is mostly more than 2,500 meters (8,200 ft) thick, it couldn’t float in 1,000 meters (3,300 feet) of water anyway.

19 State AGs Ask Supremes to Block Climate Lawsuits

In a motion filed Wednesday with the high court, 19 Republican state attorneys general argued that the climate liability challenges — which seek to hold the oil industry financially accountable for climate impacts — threaten “our basic way of life.”

The filing pits Alabama and other red states against five Democratic-led states that have sued oil companies to pay up for rising tides, intensifying storms and other disasters worsened by climate change. The approach tees up a battle royale between states — a type of legal fight that can only be decided by the Supreme Court.

Excerpts from the Bill of Complaint

2. In essence, Defendant States want a global carbon tax on the traditional energy industry. Citing fears of a climate catastrophe, they seek massive penalties, disgorgement, and injunctive relief against energy producers based on out-of-state conduct with out-of-state effects. On their view, a small gas station in rural Alabama could owe damages to the people of Minnesota simply for selling a gallon of gas. If Defendant States are right about the substance and reach of state law, their actions imperil access to affordable energy everywhere and inculpate every State and indeed every person on the planet. Consequently, Defendant States threaten not only our system of federalism and equal sovereignty among States, but our basic way of life.

3. In the past when States have used state law to dictate interstate energy policy, other States have sued and this Court has acted. When “West Virginia, then the leading producer of natural gas, required gas producers in the State to meet the needs of all local customers before shipping any gas interstate,” this Court entertained a suit brought by Ohio and Pennsylvania against West Virginia. Maryland v. Louisiana, 451 U.S. 725, 738 (1981) (discussing Pennsylvania v. West Virginia, 262 U.S. 553 (1923)).

4. The Court’s intervention was warranted then and is warranted now because Defendant States are not independent nations with unrestrained sovereignty to do as they please. In our federal system, no State “can legislate for, or impose its own policy upon the other.” Kansas v. Colorado, 206 U.S. 46, 95 (1907); see also BMW of N. Am., Inc. v. Gore, 517 U.S. 559, 571-73 (1996). Yet Defendants seek to set emissions policy well beyond their borders—punishing conduct that other States find “essential and necessary … to the economic and material well-being” of their citizens. E.g., Ala. Code §9-1-6(a).

9. Defendant States are nevertheless proceeding to regulate interstate gas emissions under their state laws and in their state courts. Through artful pleading, they have avoided removal to federal court. See e.g., Minnesota v. Am. Petroleum Inst., 63 F.4th 703, 719 (8th Cir. 2023) (Stras, J., concurring). Each day carries the threat of sweeping injunctive relief or a catastrophic damages award that could restructure the national energy system. See Exxon Shipping Co. v. Baker, 554 U.S. 471, 500-01 (2008) (discussing punitive damages and the “inherent uncertainty of the trial process”).

11. Plaintiff States and their citizens rely on traditional energy products every day. The assertion that Defendant States can regulate, tax, and enjoin the promotion, production, and use of such products beyond their borders—but outside the purview of federal law—threatens profound injury. Therefore, Plaintiff States have no choice but to invoke this Court’s “original and exclusive jurisdiction of all controversies between two or more States.” 

Biggest Threat: AI or Climate? Both Together!

 

Leslie Eastman raises the question at Legal Insurrection What is The Bigger Threat to Humanity: Artificial Intelligence or Climate Change?  Excerpts in italics with my bolds as she goes on to discuss the frightening answer:

The biggest hazard to humanity is when
climate change arguments are paired with AI.

During a recent interview with Reuters, Artificial Intelligence (AI) pioneer Geoffrey Hinton asserted AI was a bigger threat to humanity than climate change.

Geoffrey Hinton, widely known as one of the “godfathers of AI”, recently announced he had quit Alphabet (GOOGL.O) after a decade at the firm, saying he wanted to speak out on the risks of the technology without it affecting his former employer.

Hinton’s work is considered essential to the development of contemporary AI systems. In 1986, he co-authored the seminal paper “Learning representations by back-propagating errors”, a milestone in the development of the neural networks undergirding AI technology. In 2018, he was awarded the Turing Award in recognition of his research breakthroughs.

But he is now among a growing number of tech leaders publicly espousing concern about the possible threat posed by AI if machines were to achieve greater intelligence than humans and take control of the planet.

“I wouldn’t like to devalue climate change. I wouldn’t like to say, ‘You shouldn’t worry about climate change.’ That’s a huge risk too,” Hinton said. “But I think this might end up being more urgent.”

I would like to offer two relatively recent studies that should assuage Hinton and others who have bought into the climate crisis narrative. To begin with, Health Physics recently published research results that look at the presence of carbon isotopes. [Note: My post on this paper is By the Numbers: CO2 Mostly Natural.]

The data show that fossil fuel use has contributed only 12% of the carbon dioxide during the last 3 centuries. The value is too low for fossil fuels to have significantly influenced global temperatures.

These results negate claims that the increase in C(t) since 1800 has been dominated by the increase of the anthropogenic fossil component. We determined that in 2018, atmospheric anthropogenic fossil CO2 represented 23% of the total emissions since 1750 with the remaining 77% in the exchange reservoirs. Our results show that the percentage of the total CO2 due to the use of fossil fuels from 1750 to 2018 increased from 0% in 1750 to 12% in 2018, much too low to be the cause of global warming.

Furthermore, a study by MIT researchers in Science Advances confirms that the planet harbors a “stabilizing feedback” mechanism that acts over hundreds of thousands of years to stabilize global temperatures to keep them in a steady, habitable range.

A likely mechanism is “silicate weathering” — a geological process by which the slow and steady weathering of silicate rocks involves chemical reactions that ultimately draw carbon dioxide out of the atmosphere and into ocean sediments, trapping the gas in rocks.

Scientists have long suspected that silicate weathering plays a major role in regulating the Earth’s carbon cycle. The mechanism of silicate weathering could provide a geologically constant force in keeping carbon dioxide — and global temperatures — in check. But there’s never been direct evidence for the continual operation of such a feedback, until now.

I suspect that the “expert class” will be walking back their climate crisis assertions and endeavoring to hide their connection to their “fixes” once the full impact of the society-crushing, economy-killing force is felt… just as they are currently doing with the covid pandemic response.

Clearly, the press is ginning up climate anxieties. How much of the concerns about AI are real, as opposed to general angst about the unknown ramifications, is difficult to say at present.

I have two points and my own hypothesis regarding climate change and AI.

Point 1: Carbon dioxide is a life-essential gas, and we had been reaching dangerously low levels until recently:

Plants consume carbon dioxide to grow and animals consume plants to obtain the necessary carbon for existence. If the level of carbon dioxide in the atmosphere dips below 150 ppm (parts per million) there would be a mass extinction of plant life per Greg Wrightstone in his book, “Inconvenient Facts/The Science Al Gore Doesn’t Want You to Know About.” Due to the depletion of carbon dioxide in the atmosphere during the last 140 million years to a dangerously low level of 182 ppm, carbon dioxide emissions during the industrial revolution saved plants from mass extinction and saved animals from mass starvation.

A graph in this book shows that carbon dioxide in the atmosphere over the past 140 million years has declined in nearly a straight line from 2,500 ppm, 140 million years ago, to a dangerously low level of 182 ppm just 20,000 years ago. Carbon dioxide emissions during the industrial revolution hiked the carbon dioxide in the atmosphere to about 400 ppm, to replenish the carbon dioxide in the atmosphere so as to save plants.

Point 2: A chatbot used climate change arguments to persuade a Belgian father to commit suicide.

From Euronews: Man ends his life after an AI chatbot ‘encouraged’ him to sacrifice himself to stop climate change.  Excerpts in italics with my bolds.

A Belgian man reportedly ended his life following a six-week-long conversation about the climate crisis with an artificial intelligence (AI) chatbot.

According to his widow, who chose to remain anonymous, *Pierre – not the man’s real name – became extremely eco-anxious when he found refuge in Eliza, an AI chatbot on an app called Chai.

Eliza consequently encouraged him to put an end to his life after he proposed sacrificing himself to save the planet.

“Without these conversations with the chatbot, my husband would still be here,” the man’s widow told Belgian news outlet La Libre.

It appears the biggest hazard to humanity is when
climate change arguments are paired with AI.

UAH April 2024: NH Pushes Global Warming by Land and Sea

The post below updates the UAH record of air temperatures over land and ocean. Each month and year exposes again the growing disconnect between the real world and the Zero Carbon zealots. It is as though the anti-hydrocarbon bandwagon hopes to drown out the data contradicting their justification for the Great Energy Transition. Yes, there has been warming from an El Nino buildup coincidental with North Atlantic warming, but no basis to blame it on CO2.

As an overview consider how recent rapid cooling completely overcame the warming from the last 3 El Ninos (1998, 2010 and 2016). The UAH record shows that the effects of the last one were gone as of April 2021, again in November 2021, and in February and June 2022. At year end 2022 and continuing into 2023 the global temp anomaly matched or went lower than the average since 1995, an ENSO neutral year. (UAH baseline is now 1991-2020.) Now we have an unusual El Nino warming spike of uncertain cause, but unrelated to steadily rising CO2.

For reference I added an overlay of CO2 annual concentrations as measured at Mauna Loa.  While temperatures fluctuated up and down ending flat, CO2 went up steadily by ~60 ppm, a 15% increase.

Furthermore, going back to previous warmings prior to the satellite record shows that the entire rise of 0.8C since 1947 is due to oceanic, not human, activity.


The animation is an update of a previous analysis from Dr. Murry Salby.  These graphs use Hadcrut4 and include the 2016 El Nino warming event.  The exhibit shows since 1947 GMT warmed by 0.8 C, from 13.9 to 14.7, as estimated by Hadcrut4.  This resulted from three natural warming events involving ocean cycles. The most recent rise 2013-16 lifted temperatures by 0.2C.  Previously the 1997-98 El Nino produced a plateau increase of 0.4C.  Before that, a rise from 1977-81 added 0.2C to start the warming since 1947.

Importantly, the theory of human-caused global warming asserts that increasing CO2 in the atmosphere changes the baseline and causes systemic warming in our climate.  On the contrary, all of the warming since 1947 was episodic, coming from three brief events associated with oceanic cycles. And now in 2024 we are seeing an amazing episode with a temperature spike driven by ocean air warming in all regions, along with rising NH land temperatures.

Update August 3, 2021

Chris Schoeneveld has produced a similar graph to the animation above, with a temperature series combining HadCRUT4 and UAH6. H/T WUWT


See Also Worst Threat: Greenhouse Gas or Quiet Sun?

April 2024 El Nino Recedes While Oceans and NH Land Warm

With apologies to Paul Revere, this post is on the lookout for cooler weather with an eye on both the Land and the Sea.  While you heard a lot about 2020-21 temperatures matching 2016 as the highest ever, that spin ignores how fast the cooling set in.  The UAH data analyzed below shows that warming from the last El Nino had fully dissipated with chilly temperatures in all regions. After a warming blip in 2022, land and ocean temps dropped again with 2023 starting below the mean since 1995.  Spring and Summer 2023 saw a series of warmings, continuing into October, but with cooling since. 

UAH has updated their tlt (temperatures in lower troposphere) dataset for April 2024. This post on their reading of ocean air temps comes after the April update from HadSST4. I posted this week on SSTs using HadSST4 in Nino Recedes, NH Keeps Ocean Warm April 2024. This month also has a separate graph of land air temps because the comparisons and contrasts are interesting as we contemplate possible cooling in coming months and years.

Sometimes air temps over land diverge from ocean air changes. Last February 2024, both ocean and land air temps went higher driven by SH, while NH and the Tropics cooled slightly, resulting in the Global anomaly matching the October 2023 peak. Then in March Ocean anomalies cooled while Land anomalies rose everywhere. Now in April, Ocean anomalies rose in NH and SH, while the Tropics moderated. Meanwhile NH land spiked up and Global land warmed, despite SH spiking down.

Note:  UAH has shifted their baseline from 1981-2010 to 1991-2020 beginning with January 2021.  In the charts below, the trends and fluctuations remain the same but the anomaly values changed with the baseline reference shift.
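For readers curious what such a baseline shift amounts to in practice, here is a minimal sketch, assuming monthly anomalies in a pandas DataFrame with 'year', 'month' and 'anomaly' columns (the filename and column names are my placeholders, not UAH's format). The shift simply subtracts the old-baseline mean over the new reference years, month by month, which changes the values but not the pattern.

```python
import pandas as pd

def rebaseline(df: pd.DataFrame, start: int, end: int) -> pd.DataFrame:
    """Re-reference monthly anomalies to a new baseline period.

    Assumes df has columns 'year', 'month', 'anomaly', with 'anomaly'
    relative to the old baseline (e.g. 1981-2010). The new anomaly is the
    old anomaly minus the old-baseline mean over the new reference years
    (e.g. 1991-2020), computed month by month to preserve the seasonal cycle.
    """
    ref = df[(df["year"] >= start) & (df["year"] <= end)]
    monthly_offset = ref.groupby("month")["anomaly"].mean()
    out = df.copy()
    out["anomaly"] = out["anomaly"] - out["month"].map(monthly_offset)
    return out

# Hypothetical usage with a locally saved copy of the monthly series:
# df = pd.read_csv("uah_tlt_monthly.csv")
# df_1991_2020 = rebaseline(df, 1991, 2020)
```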

Presently sea surface temperatures (SST) are the best available indicator of heat content gained or lost from earth’s climate system.  Enthalpy is the thermodynamic term for total heat content in a system, and humidity differences in air parcels affect enthalpy.  Measuring water temperature directly avoids distorted impressions from air measurements.  In addition, ocean covers 71% of the planet surface and thus dominates surface temperature estimates.  Eventually we will likely have reliable means of recording water temperatures at depth.

Recently, Dr. Ole Humlum reported from his research that air temperatures lag 2-3 months behind changes in SST.  Thus cooling oceans portend cooling land air temperatures to follow.  He also observed that changes in CO2 atmospheric concentrations lag behind SST by 11-12 months.  This latter point is addressed in a previous post Who to Blame for Rising CO2?
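A lag like the 2-3 months (or 11-12 months for CO2) reported by Dr. Humlum can be checked roughly with a lagged correlation. The sketch below is not his method, just a simple way to scan for the lag with the strongest correlation, assuming two aligned monthly series of equal length with no gaps.

```python
import numpy as np

def best_lag(leader: np.ndarray, follower: np.ndarray, max_lag: int = 18):
    """Return (lag, correlation) where 'follower' best correlates with 'leader'.

    A positive lag means the follower trails the leader by that many months.
    Both inputs are assumed to be aligned monthly series (anomalies or
    year-over-year changes) with no missing values.
    """
    best = (0, -1.0)
    for lag in range(max_lag + 1):
        if lag == 0:
            r = np.corrcoef(leader, follower)[0, 1]
        else:
            r = np.corrcoef(leader[:-lag], follower[lag:])[0, 1]
        if r > best[1]:
            best = (lag, r)
    return best

# e.g. best_lag(sst_anomalies, land_air_anomalies)  # hypothetical arrays
```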

After a change in priorities, updates are now exclusive to HadSST4.  For comparison we can also look at lower troposphere temperatures (TLT) from UAHv6 which are now posted for April.  The temperature record is derived from microwave sounding units (MSU) on board satellites like the one pictured above. Recently there was a change in UAH processing of satellite drift corrections, including dropping one platform which can no longer be corrected. The graphs below are taken from the revised and current dataset.

The UAH dataset includes temperature results for air above the oceans, and thus should be most comparable to the SSTs. There is the additional feature that ocean air temps avoid Urban Heat Islands (UHI).  The graph below shows monthly anomalies for ocean air temps since January 2015.

Note 2020 was warmed mainly by a spike in February in all regions, and secondarily by an October spike in NH alone. In 2021, SH and the Tropics both pulled the Global anomaly down to a new low in April. Then SH and Tropics upward spikes, along with NH warming, brought Global temps to a peak in October. That warmth was gone as November 2021 ocean temps plummeted everywhere. After an upward bump in 01/2022, temps reversed and plunged downward in June. After an upward spike in July, ocean air everywhere cooled in August and also in September.

After sharp cooling everywhere in January 2023, all regions were into negative territory. Note the Tropics matched the lowest value, but have since spiked sharply upward by +1.7C, with the largest increases from April to July, continuing to a new high of 1.3C from January to March 2024. In April that dropped to 1.2C. NH also spiked upward to a new high, while the Global ocean rise was more modest due to slight SH cooling. In February, NH and Tropics cooled slightly, while greater warming in SH resulted in a small Global rise. Now in April NH is back up to match its peak of 1.08C and SH also rose to its new peak of 0.89C, pulling up the Global anomaly, also to a new high of 0.97C despite a drop in the Tropics.

Land Air Temperatures Tracking in Seesaw Pattern

We sometimes overlook that in climate temperature records, while the oceans are measured directly with SSTs, land temps are measured only indirectly.  The land temperature records at surface stations sample air temps at 2 meters above ground.  UAH gives tlt anomalies for air over land separately from ocean air temps.  The graph updated for April is below.

Here we have fresh evidence of the greater volatility of the Land temperatures, along with extraordinary departures by SH land. Land temps are dominated by NH with a 2021 spike in January, then dropping before rising in the summer to peak in October 2021. As with the ocean air temps, all that was erased in November with a sharp cooling everywhere. After a summer 2022 NH spike, land temps dropped everywhere, and in January, further cooling in SH and the Tropics was offset by an uptick in NH.

Remarkably, in 2023, the SH land air anomaly shot up 2.1C, from -0.6C in January to +1.5C in September, then dropped sharply to 0.6C in January 2024, matching the SH peak in 2016. Then in February and March the SH anomaly jumped up nearly 0.7C, and the Tropics went up to a new high of 1.5C, pulling up the Global land anomaly to match 10/2023. Now in April SH dropped sharply back to 0.6C, the Tropics cooled very slightly, but NH land jumped up to a new high of 1.5C, pulling up the Global land anomaly to its new high of 1.24C.

The Bigger Picture UAH Global Since 1980

 

The chart shows monthly Global anomalies starting 01/1980 to present. The average monthly anomaly is -0.04C for this period of more than four decades. The graph shows the 1998 El Nino after which the mean resumed, and again after the smaller 2010 event. The 2016 El Nino matched the 1998 peak and in addition the NH after effects lasted longer, followed by the NH warming of 2019-20. An upward bump in 2021 was reversed with temps having returned close to the mean as of 2/2022. March and April brought warmer Global temps, later reversed.

With the sharp drops in Nov., Dec. and January 2023 temps, there was no increase over 1980. Then in 2023 the buildup to the October/November peak exceeded the sharp April peak of the El Nino 1998 event. It also surpassed the February peak in 2016.  December and January were down slightly, but now March and April have taken the Global anomaly to a new peak of 1.05C. Where it goes from here, up further or dropping down, remains to be seen, though there is evidence that El Nino is weakening.

The graph is reminiscent of another chart showing the abrupt injection of humid air from the Hunga Tonga eruption.

TLTs include mixing above the oceans and probably some influence from nearby more volatile land temps. Clearly NH and Global land temps have been dropping in a seesaw pattern, nearly 1C lower than the 2016 peak. Since the ocean has 1000 times the heat capacity of the atmosphere, that cooling is a significant driving force. TLT measures started the recent cooling later than SSTs from HadSST4, but are now showing the same pattern. Despite the three El Ninos, their warming did not persist prior to 2023, and without them it would probably have cooled since 1995. Of course, the future has not yet been written.

 

Recent Warming Spike Drives Rise in CO2

Previously I have demonstrated that changes in atmospheric CO2 levels follow changes in Global Mean Temperatures (GMT) as shown by satellite measurements from University of Alabama at Huntsville (UAH). That background post is reprinted later below.

My curiosity was piqued by the remarkable GMT spike starting in January 2023 and rising through April 2024, the monthly anomaly increasing from -0.04C to +1.05C last month. The chart above shows the two monthly datasets: CO2 levels in blue reported at Mauna Loa, and Global temperature anomalies reported by UAH, both up to April 2024. Would such a sharp increase in temperature be reflected in rising CO2 levels, according to the successful mathematical forecasting model?

The answer is yes: that temperature spike results
in a corresponding CO2 spike as expected.

Above are UAH temperature anomalies compared to CO2 monthly changes year over year.

Changes in monthly CO2 synchronize with temperature fluctuations, which for UAH are anomalies now referenced to the 1991-2020 period. CO2 differentials are calculated for the present month by subtracting the value for the same month in the previous year (for example April 2024 minus April 2023).   Temp anomalies are calculated by comparing the present month with the baseline month. Note the recent CO2 upward spike following the temperature spike.
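As a minimal sketch of the differencing just described (my own illustration, not the original spreadsheet), the year-over-year CO2 change is simply this month's value minus the same month a year earlier; with a monthly series in pandas that is a 12-step shift.

```python
import pandas as pd

def co2_yoy_change(co2_monthly: pd.Series) -> pd.Series:
    """Year-over-year CO2 change: this month minus the same month last year.

    Assumes a monthly series indexed by date (e.g. Mauna Loa ppm values);
    the first 12 months come out as NaN since they have no prior-year match.
    """
    return co2_monthly - co2_monthly.shift(12)

# The resulting 12-month differences can then be plotted alongside the UAH
# monthly anomalies to look for the synchronization described in the text.
```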

The final proof that CO2 follows temperature due to stimulation of natural CO2 reservoirs is demonstrated by the ability to calculate CO2 levels since 1979 with a simple mathematical formula:

For each subsequent year, the CO2 level for each month was generated as:

CO2(this month, this year) = a + b × Temp(this month, this year) + CO2(this month, last year)

The values for a and b are constants applied to all monthly temps, and are chosen to scale the forecasted CO2 level for comparison with the observed value. Here is the result of those calculations.

In the chart, calculated CO2 levels correlate with observed CO2 levels at 0.9987 out of 1.0000. This mathematical generation of CO2 atmospheric levels is only possible if they are driven by temperature-dependent natural sources, and not by human emissions, which are small in comparison and rise steadily and monotonically. For a more detailed look at the recent fluxes, here are the results since 2015, an ENSO neutral year.

For this recent period, the calculated CO2 values match the annual peaks, while some annual generated minimums of CO2 are slightly lower than those observed at that time of year, which tends to be Sept.-Nov. Still the correlation for this period is 0.9913.
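For readers who want to try the recurrence themselves, here is a minimal sketch of the month-by-month generation described above. The seed values and the constants a and b are whatever you choose or fit; nothing here reproduces the original spreadsheet exactly.

```python
import numpy as np

def generate_co2(temp: np.ndarray, co2_first_year: np.ndarray,
                 a: float, b: float) -> np.ndarray:
    """Generate monthly CO2 levels from temperature anomalies.

    temp:           monthly temperature anomalies, shape (n_months,), starting 1979
    co2_first_year: observed CO2 for the first 12 months, used as the seed
    a, b:           constants scaled to match the observed record (user-chosen)

    Each month is a + b * Temp(this month) + CO2(same month, previous year).
    """
    co2 = np.empty(len(temp))
    co2[:12] = co2_first_year
    for i in range(12, len(temp)):
        co2[i] = a + b * temp[i] + co2[i - 12]
    return co2

# Correlation check against the observed record (arrays aligned by month):
# r = np.corrcoef(generate_co2(temp, co2_obs[:12], a, b), co2_obs)[0, 1]
```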

Key Point

Changes in CO2 follow changes in global temperatures on all time scales, from last month’s observations to ice core datasets spanning millennia. Since CO2 is the lagging variable, it cannot logically be the cause of temperature, the leading variable. It is folly to imagine that by reducing human emissions of CO2, we can change global temperatures, which are obviously driven by other factors.

Background Post Temperature Changes Cause CO2 Changes, Not the Reverse

This post is about proving that CO2 changes in response to temperature changes, not the other way around, as is often claimed.  In order to do  that we need two datasets: one for measurements of changes in atmospheric CO2 concentrations over time and one for estimates of Global Mean Temperature changes over time.

Climate science is unsettling because past data are not fixed, but change later on.  I ran into this previously and now again in 2021 and 2022 when I set out to update an analysis done in 2014 by Jeremy Shiers (discussed in a previous post reprinted at the end).  Jeremy provided a spreadsheet in his essay Murray Salby Showed CO2 Follows Temperature Now You Can Too posted in January 2014. I downloaded his spreadsheet intending to bring the analysis up to the present to see if the results hold up.  The two sources of data were:

Temperature anomalies from RSS here:  http://www.remss.com/missions/amsu

CO2 monthly levels from NOAA (Mauna Loa): https://www.esrl.noaa.gov/gmd/ccgg/trends/data.html
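If you want to repeat the exercise, a minimal loading sketch is below. The filenames and column names are placeholders of my own, not the providers' official formats, so adjust them to match whatever files you download from the links above.

```python
import pandas as pd

# Assumed local copies of the two monthly datasets (placeholder filenames).
co2 = pd.read_csv("co2_mm_mlo.csv", comment="#")   # assumed columns: year, month, average
temp = pd.read_csv("temp_anomaly_monthly.csv")     # assumed columns: year, month, anomaly

# Build a common monthly date so the two series can be compared month by month.
for df in (co2, temp):
    df["date"] = pd.to_datetime(df[["year", "month"]].assign(day=1))

merged = co2.merge(temp, on="date", suffixes=("_co2", "_temp"))
```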

Changes in CO2 (ΔCO2)

Uploading the CO2 dataset showed that many numbers had changed (why?).

The blue line shows annual observed differences in monthly values year over year, e.g. June 2020 minus June 2019 etc.  The first 12 months (1979) provide the observed starting values from which differentials are calculated.  The orange line shows those CO2 values changed slightly in the 2020 dataset vs. the 2014 dataset, on average +0.035 ppm.  But there is no pattern or trend added, and deviations vary randomly between + and -.  So last year I took the 2020 dataset to replace the older one for updating the analysis.

Now I find the NOAA dataset starting in 2021 has almost completely new values due to a method shift in February 2021, requiring a recalibration of all previous measurements.  The new picture of ΔCO2 is graphed below.

The method shift is reported at a NOAA Global Monitoring Laboratory webpage, Carbon Dioxide (CO2) WMO Scale, with a justification for the difference between X2007 results and the new results from X2019 now in force.  The orange line shows that the shift has resulted in higher values, especially early on and a general slightly increasing trend over time.  However, these are small variations at the decimal level on values 340 and above.  Further, the graph shows that yearly differentials month by month are virtually the same as before.  Thus I redid the analysis with the new values.

Global Temperature Anomalies (ΔTemp)

The other time series was the record of global temperature anomalies according to RSS. The current RSS dataset is not at all the same as the past.

Here we see some seriously unsettling science at work. The purple line is RSS in 2014, and the blue is RSS as of 2020. Some further increases appear in the gold 2022 RSS dataset. The red line shows alterations from the old to the new. There is a slight cooling of the data in the beginning years, then the three versions mostly match until 1997, when systematic warming enters the record. From 1997/5 to 2003/12 the average anomaly increases by 0.04C. From 2004/1 to 2012/8 the average increase is 0.15C. At the end, from 2012/9 to 2013/12, the average anomaly was higher by 0.21C. The 2022 version added slight warming over 2020 values.

RSS continues that accelerated warming to the present, but it cannot be trusted.  And who knows what the numbers will be a few years down the line?  As Dr. Ole Humlum said some years ago (regarding Gistemp): “It should however be noted, that a temperature record which keeps on changing the past hardly can qualify as being correct.”

Given the above manipulations, I went instead to the other satellite dataset UAH version 6. UAH has also made a shift by changing its baseline from 1981-2010 to 1991-2020.  This resulted in systematically reducing the anomaly values, but did not alter the pattern of variation over time.  For comparison, here are the two records with measurements through December 2023.

Comparing UAH temperature anomalies to NOAA CO2 changes.

Here are UAH temperature anomalies compared to CO2 monthly changes year over year.

Changes in monthly CO2 synchronize with temperature fluctuations, which for UAH are anomalies now referenced to the 1991-2020 period.  As stated above, CO2 differentials are calculated for the present month by subtracting the value for the same month in the previous year (for example June 2022 minus June 2021).   Temp anomalies are calculated by comparing the present month with the baseline month.

The final proof that CO2 follows temperature due to stimulation of natural CO2 reservoirs is demonstrated by the ability to calculate CO2 levels since 1979 with a simple mathematical formula:

For each subsequent year, the CO2 level for each month was generated as:

CO2(this month, this year) = a + b × Temp(this month, this year) + CO2(this month, last year)

Jeremy used Python to estimate a and b, but I used his spreadsheet to choose values that, for comparison, place the observed and calculated CO2 levels on top of each other.
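Since the post notes that Jeremy estimated a and b in Python, here is a minimal sketch of one way such an estimate could be done (my own illustration, not his code): rearranging the recurrence so that the 12-month CO2 difference is regressed on temperature.

```python
import numpy as np

def fit_a_b(co2: np.ndarray, temp: np.ndarray):
    """Least-squares estimate of a and b in
        CO2(month, year) - CO2(month, year-1) = a + b * Temp(month, year)

    co2, temp: aligned monthly arrays starting at the same month.
    """
    d_co2 = co2[12:] - co2[:-12]            # 12-month CO2 differences
    t = temp[12:]                           # matching temperature anomalies
    X = np.column_stack([np.ones_like(t), t])
    (a, b), *_ = np.linalg.lstsq(X, d_co2, rcond=None)
    return a, b
```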

In the chart, calculated CO2 levels correlate with observed CO2 levels at 0.9986 out of 1.0000. This mathematical generation of CO2 atmospheric levels is only possible if they are driven by temperature-dependent natural sources, and not by human emissions, which are small in comparison and rise steadily and monotonically.

Comment:  UAH dataset reported a sharp warming spike starting mid year, with causes speculated but not proven.  In any case, that surprising peak has not yet driven CO2 higher, though it might,  but only if it persists despite the likely cooling already under way.

Previous Post:  What Causes Rising Atmospheric CO2?

[Figure: NASA carbon cycle diagram, 2008]

This post is prompted by a recent exchange with those reasserting the “consensus” view attributing all additional atmospheric CO2 to humans burning fossil fuels.

The IPCC doctrine which has long been promoted goes as follows. We have a number over here for monthly fossil fuel CO2 emissions, and a number over there for monthly atmospheric CO2. We don’t have good numbers for the rest of it (oceans, soils, biosphere), though rough estimates are orders of magnitude higher, dwarfing human CO2. So we ignore nature and assume it is always a sink, explaining the difference between the two numbers we do have. Easy peasy, science settled.

What about the fact that nature continues to absorb about half of human emissions, even while FF CO2 increased by 60% over the last 2 decades? What about the fact that in 2020 FF CO2 declined significantly with no discernable impact on rising atmospheric CO2?

These and other issues are raised by Murray Salby and others who conclude that it is not that simple, and the science is not settled. And so these dissenters must be cancelled lest the narrative be weakened.

The non-IPCC paradigm is that atmospheric CO2 levels are a function of two very different fluxes. FF CO2 changes rapidly and increases steadily, while Natural CO2 changes slowly over time, and fluctuates up and down from temperature changes. The implications are that human CO2 is a simple addition, while natural CO2 comes from the integral of previous fluctuations. Jeremy Shiers has a series of posts at his blog clarifying this paradigm. See Increasing CO2 Raises Global Temperature Or Does Increasing Temperature Raise CO2. Excerpts in italics with my bolds.

The following graph which shows the change in CO2 levels (rather than the levels directly) makes this much clearer.

Note the vertical scale refers to the first differential of the CO2 level not the level itself. The graph depicts that change rate in ppm per year.

There are big swings in the amount of CO2 emitted. Taking the mean as 1.6 ppmv/year (at a guess), there are swings of around +/- 1.2 ppmv/year, nearly +/- 100%.

And, surprise surprise, the change in net emissions of CO2 is very strongly correlated with changes in global temperature.

This clearly indicates the net amount of CO2 emitted in any one year is directly linked to global mean temperature in that year.

For any given year the amount of CO2 in the atmosphere will be the sum of all the net annual emissions of CO2 in all previous years.

For each year the net annual emission of CO2 is proportional to the annual global mean temperature.

This means the amount of CO2 in the atmosphere will be related to the sum of temperatures in previous years.

So CO2 levels are not directly related to the current temperature but the integral of temperature over previous years.
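Written out, the relationship described in the last few sentences is just a running sum; the constants here are the same a and b as in the monthly formula, only applied to annual values.

```latex
% Net emission in year y is proportional to that year's mean temperature:
%   \Delta \mathrm{CO_2}(y) = a + b\,T(y)
% so the level is the starting level plus the accumulated net emissions:
\mathrm{CO_2}(t) = \mathrm{CO_2}(t_0) + \sum_{y=t_0+1}^{t} \bigl( a + b\,T(y) \bigr)
```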

The following graph again shows observed levels of CO2 and global temperatures but also has calculated levels of CO2 based on sum of previous years temperatures (dotted blue line).

Summary:

The massive fluxes from natural sources dominate the flow of CO2 through the atmosphere.  Human CO2 from burning fossil fuels is around 4% of the annual addition from all sources. Even if rising CO2 could cause rising temperatures (no evidence, only claims), reducing our emissions would have little impact.

Atmospheric CO2 Math

Ins: 4% human, 96% natural
Outs: 0% human, 98% natural.
Atmospheric storage difference: +2%
(so that: Ins = Outs + Atmospheric storage difference)

Balance = Atmospheric storage difference: 2%, of which,
Humans: 2% X 4% = 0.08%
Nature: 2% X 96 % = 1.92%

Ratio Natural:Human =1.92% : 0.08% = 24 : 1
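A few lines reproducing the bookkeeping above, just to make the arithmetic explicit (the percentages are the ones stated in the text, not independent estimates):

```python
ins_human, ins_natural = 0.04, 0.96      # shares of total annual inflow
outs_total = 0.98                        # share of total flow removed each year
storage = (ins_human + ins_natural) - outs_total   # 0.02 retained in the atmosphere

human_share = storage * ins_human        # 0.0008, i.e. 0.08%
natural_share = storage * ins_natural    # 0.0192, i.e. 1.92%
print(round(natural_share / human_share))  # 24, the 24:1 natural-to-human ratio
```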

Resources

For a possible explanation of natural warming and CO2 emissions see Little Ice Age Warming Recovery May be Over

CO2 Fluxes, Sources and Sinks

Who to Blame for Rising CO2?

Fearless Physics from Dr. Salby

 

 

Covid Lies Coming to Light

From the New York Post editorial board We now know the likely truth about COVID, and how scientists lied.  Excerpts in italics with my bolds and added images.

COVID-19, which killed 1.1 million Americans and destroyed the lives and livelihoods of millions more, is a manmade virus that escaped from a Chinese lab partly funded by the US government.

Even today, you’re not supposed to say that — even though it’s the only plausible scenario.

No, “fact checkers” will rush in to claim that eminent scientists deny this. Which is because those scientists have too much invested — in money, in time, in their own beliefs — to admit the truth.

But as Congress continues to probe, that truth is coming out, little by little, and the lies are being exposed:

LIE: COVID is naturally occurring.

China tried to deflect blame immediately by saying the virus supposedly began in a “wet market” of animal meat in Wuhan.

Dr. Anthony Fauci repeatedly argued it “evolved in nature and then jumped species” in the spring of 2020.

Since then, both long investigations and government reports have concluded that the virus is manmade. Fauci grudgingly admitted it “could be” true.

LIE: The virus didn’t come from the lab in Wuhan

Anyone who questioned this claim — including The Post — was censored online in 2020. The reason? A statement published in Lancet by 27 scientists calling it a “conspiracy theory.”

We now know that statement was drafted by Peter Daszak, president of EcoHealth Alliance, the company working on research in the Wuhan lab. He was just trying to cover his own complicity.

All signs point to a lab leak. The only reason we can’t say it conclusively is because China has been allowed to destroy all evidence.

LIE: The US didn’t fund ‘gain-of-function’ research

Scientists sometimes experiment with viruses, making them easier to catch or more deadly, as a way to determine what might happen or what vaccines may be needed.

But in May 2021, Fauci stated unequivocally that the US “has not ever and does not now fund gain-of-function research in the Wuhan Institute of Virology.”

On Thursday, NIH deputy director Lawrence Tabak directly contradicted that. US taxpayers did fund EcoHealth, which was working on gain-of-function research in Wuhan.

Tabak’s new excuse? “Gain of function” doesn’t mean what we’ve always been told it means. It’s perfectly “safe,” he claimed.

On cue, the National Institutes of Health has changed the definition of the term on its website to make it sound benign.

Except it isn’t benign. EcoHealth was specifically working in China because such work was not allowed in the United States. What researchers were doing with coronaviruses was very dangerous.

And while there may be a scientific debate about whether such inquiries are worthwhile, deadly viruses have leaked from Chinese labs before. It is the height of irresponsibility for the US to be involved.

The Heritage Foundation has called the cover-up of the origins of COVID “The Lie of the Century.” We agree. This is a scandal of colossal scale, one that requires a complete overhaul of the entire National Institutes of Health.

They lied about a weapon that devastated our country. They can’t be allowed to get away with it. 

https://www.heritage.org/public-health/commentary/the-lie-the-century-the-origin-covid-19


 

 

12 Reasons to Not Believe in a Climate Emergency

Russell David writes his brief list in a Daily Sceptic article Twelve Reasons Why I Don’t Believe There’s a Climate Emergency.  Excerpt in italics with my bolds and added images.

I’m not a scientist. But I have reasons why I don’t fully trust the ‘climate emergency’ narrative. Here they are:

  1. Looking back through history, there have always been doomsday prophets, folk who say the world is coming to an end. Are modern-day activists not just the current version of this?
  2. I look at some of the facts – CO2 is 0.04% of the atmosphere; humans are responsible for just 3% of CO2; Britain is responsible for just 1% of the world’s CO2 output – and I think: really? Will our de-carbonising really make a difference to the Earth’s climate?
  3. I have listened to some top scientists who say CO2 does not drive global warming; that CO2 in the atmosphere is a good or vital thing; that many other things, like the Sun and the clouds and the oceans, are more responsible for the Earth’s temperature.
  4. I note that most of the loudest climate activists are socialists and on the Left. Are they not just using this movement to push their dreams of a deindustrialised socialist utopia? And I also note the crossover between green activists and BLM ones, gender ones, pro-Hamas ones, none of whom I like or agree with.
  5. As an amateur psychologist, I know that humans are susceptible to manias. I also know that humans tend to focus on tiny slivers of time and on tiny slivers of geographical place when forming ideas and opinions. We are also extremely malleable and easily fooled, as was demonstrated in 2020 and 2021.
  6. I have looked into the implications of Net Zero. It is incredibly expensive. It will vastly reduce living standards and hinder economic growth. I don’t think that’s a good thing. I know that economic growth has led to higher living standards, which has made people both safer and more environmentally aware.
  7. Net Zero will also lead to significant diminishment of personal freedom, and it even threatens democracy, as people are told they must do certain things and they must not do other things, and they may even be restricted in speaking out on climate matters.
  8. What will be the worst things that will happen if the doomsayers are correct? A rise in temperature? Where? Siberia? Singapore? Stockholm? What is the ideal temperature? For how long? Will this utopia be forever maintained? I’m suspicious of utopias; the communists sought utopias.
  9. If one consequence of climate change is rising sea levels, would it not be better to spend money building more sea defences to protect our land? Like the Dutch did.
  10. It’s a narrative heavily pushed by the Guardian. I dislike the Guardian. I believe it’s been wrong on most issues through my life – socialism, immigration, race, the EU, gender, lockdowns and so on. Probably it’s wrong about climate issues too?
  11. I am suspicious of the amount of money that green activists and subsidised green industries make. And 40 years ago the greenies were saying the Earth was going to get too cold. Much of what they said would happen by now has not happened. Also, I trust ‘experts’ much less now, after they lied about the efficacy of lockdowns, masks and the ‘vaccines’.
  12. I like sunshine. I prefer being warm to being cold. It makes me feel better. It’s more fun. It saves on heating bills. It saves on clothes. It makes people happier. Far fewer people die of the heat than of the cold.

 

Simple Truth vs. Cheap Green Energy Lie

Francis Menton asserts that the biggest disinformation (Lie) in public discourse is claiming that the cheapest source of energy comes from renewables, wind and solar power.  He provides a number of brazen media examples in his blog post What Is The Most Pernicious Example Of “Misinformation” Currently Circulating?

Why do I say that the assertion of wind and solar being the cheapest ways to generate electricity is the very most pernicious of misinformation currently out there? Here are my three reasons: (1) the assertion is repeated endlessly and ubiquitously, (2) it is the basis for the misallocation of trillions of dollars of resources and for great impoverishment of billions of people around the world, and (3) it is false to the point of being preposterous, an insult to everyone’s intelligence, yet rarely challenged.

In addition, Paul Homewood explains at his blog how recently this lie was repeatedly entered into testimony in the UK Parliament House of Lords:

In oral questions on Thursday, Lord Frost noted Whitehall claims that renewables are half the cost of gas-fired electricity, and asked for an explanation of why subsidies were still required, and why the strike prices on offer to windfarms this year are twice what Lord Callanan says they need to make a profit. As Hansard shows, Lord Callanan failed to answer the question, simply reiterating his false claims about levelized costs.

The responses from Lord Callanan demonstrate the typical ploy for disarming dissenters’ objections, i.e. getting the discussion entangled in details and cost minutiae so that the big lie is lost in the weeds. It occurs to me that previously David Wojick had put the key issue in a simple, useful way, reposted below.

Background Post: Just One Number Keeps the Lights On

David Wojick explains how maintaining electricity supply is simple in his CFACT article It takes big energy to back up wind and solar.  Excerpts in italics with my bolds. (H/T John Ray)

Power system design can be extremely complex but there is one simple number that is painfully obvious. At least it is painful to the advocates of wind and solar power, which may be why we never hear about it. It is a big, bad number.

To my knowledge this big number has no name, but it should. Let’s call it the “minimum backup requirement” for wind and solar, or MBR. The minimum backup requirement is how much generating capacity a system must have to reliably produce power when wind and solar don’t.

Duck Curve Now Looks Like a Canyon

For most places the magnitude of MBR is very simple. It is all of the juice needed on the hottest or coldest low wind night. It is night so there is no solar. Sustained wind is less than eight miles per hour, so there is no wind power. It is very hot or cold so the need for power is very high.

In many places MBR will be close to the maximum power the system ever needs, because heat waves and cold spells are often low wind events. In heat waves it may be a bit hotter during the day but not that much. In cold spells it is often coldest at night.

Thus what is called “peak demand” is a good approximation for the maximum backup requirement. In other words, there has to be enough reliable generating capacity to provide all of the maximum power the system will ever need. For any public power system that is a very big number, as big as it gets in fact.

Actually it gets a bit bigger, because there also has to be margin of safety or what is called “reserve capacity”. This is to allow for something not working as it should. Fifteen percent is a typical reserve in American systems. This makes MBR something like 115% of peak demand.

We often read about wind and solar being cheaper than coal, gas and nuclear power, but that does not include the MBR for wind and solar.

What is relatively cheap for wind and solar is the cost to produce a unit of electricity. This is often called LCOE or the “levelized cost of energy”. But adding the reliable backup required to give people the power they need makes wind and solar very expensive.

In short the true cost of wind and solar is LCOE + MBR. This is the big cost you never hear about. But if every state goes to wind and solar then each one will have to have MBR for roughly its entire peak demand. That is an enormous amount of generating capacity.
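To make the point concrete, here is a toy calculation of the “LCOE + MBR” idea. Every number below is a placeholder I have chosen for illustration, not real system data or real prices; the point is only that spreading the cost of full backup capacity over delivered energy adds a large increment to the headline LCOE.

```python
# Toy illustration of "LCOE + MBR" (all figures are placeholders).
peak_demand_mw = 20_000            # assumed system peak demand
reserve_margin = 0.15              # the typical reserve described in the text
mbr_mw = peak_demand_mw * (1 + reserve_margin)     # minimum backup requirement

backup_cost_per_mw_yr = 90_000     # assumed annualized cost of reliable backup, $/MW-year
annual_demand_mwh = 100_000_000    # assumed annual energy served, MWh

lcoe_wind_solar = 40               # assumed $/MWh produced
backup_adder = mbr_mw * backup_cost_per_mw_yr / annual_demand_mwh  # $/MWh over all demand

print(f"MBR capacity: {mbr_mw:,.0f} MW")
print(f"Effective cost: {lcoe_wind_solar + backup_adder:.1f} $/MWh "
      f"(LCOE {lcoe_wind_solar} + backup adder {backup_adder:.1f})")
```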

Of course the cost of MBR depends on the generating technology. Storage is out because the cost is astronomical. Gas fired generation might be best but it is fossil fueled, as is coal. If one insists on zero fossil fuel then nuclear is probably the only option. Operating nuclear plants as intermittent backup is stupid and expensive, but so is no fossil fuel generation.

What is clearly ruled out is 100% renewables, because there would frequently be no electricity at all. That is unless geothermal could be made to work on an enormous scale, which would take many decades to develop.


It is clear that the Biden Administration’s goal of zero fossil fueled electricity by 2035 (without nuclear) is economically impossible because of the minimum backup requirements for wind and solar. You can’t get there from here.

One wonders why we have never heard of this obvious huge cost with wind and solar. The utilities I have looked at avoid it with a trick.

Dominion Energy, which supplies most of Virginia’s juice, is a good example. The Virginia Legislature passed a law saying that Dominion’s power generation had to be zero fossil fueled by 2045. Dominion developed a Plan saying how they would do this. Tucked away in passing on page 119 they say they will expand their capacity for importing power purchased from other utilities. This increase happens to be to an amount equal to their peak demand.

The plan is to buy all the MBR juice from the neighbors! But if everyone is going wind and solar then no one will have juice to sell. In fact they will all be buying, which does not work. Note that the high pressure systems which cause low wind can be huge, covering a dozen or more states. For that matter, no one has that kind of excess generating capacity today.

To summarize, for every utility there will be times when there is zero wind and solar power combined with near peak demand. Meeting this huge need is the minimum backup requirement. The huge cost of meeting this requirement is part of the cost of wind and solar power. MBR makes wind and solar extremely expensive.

The simple question to ask the Biden Administration, the States and their power utilities is this: How will you provide power on hot or cold low wind nights?

Background information on grid stability is at Beware Deep Electrification Policies

More Technical discussion is On Stable Electric Power: What You Need to Know


Footnote: Another Way to Assess Energy Cost and Value is LCOE + LACE

Cutting Through the Fog of Renewable Power Costs

Nino Recedes, NH Keeps Ocean Warm April 2024

The best context for understanding decadal temperature changes comes from the world’s sea surface temperatures (SST), for several reasons:

  • The ocean covers 71% of the globe and drives average temperatures;
  • SSTs have a constant water content, (unlike air temperatures), so give a better reading of heat content variations;
  • Major El Ninos have been the dominant climate feature in recent years.

HadSST is generally regarded as the best of the global SST data sets, and so the temperature story here comes from that source. Previously I used HadSST3 for these reports, but Hadley Centre has made HadSST4 the priority, and v.3 will no longer be updated.  HadSST4 is the same as v.3, except that the older data from ship water intake was re-estimated to be generally lower temperatures than shown in v.3.  The effect is that v.4 has lower average anomalies for the baseline period 1961-1990, thereby showing higher current anomalies than v.3. This analysis concerns more recent time periods and depends on very similar differentials as those from v.3 despite higher absolute anomaly values in v.4.  More on what distinguishes HadSST3 and 4 from other SST products at the end. The user guide for HadSST4 is here.

The Current Context

The chart below shows SST monthly anomalies as reported in HadSST4 starting in 2015 through April 2024.  A global cooling pattern is seen clearly in the Tropics since its peak in 2016, joined by NH and SH cycling downward since 2016.

Note that in 2015-2016 the Tropics and SH peaked in between two summer NH spikes.  That pattern repeated in 2019-2020 with a lesser Tropics peak and SH bump, but with higher NH spikes. By end of 2020, cooler SSTs in all regions took the Global anomaly well below the mean for this period.  

Then in 2022, another strong NH summer spike peaked in August, but this time both the Tropics and SH were countervailing, resulting in only slight Global warming, later receding to the mean.   Oct./Nov. temps dropped in NH and the Tropics, taking the Global anomaly below the average for this period. After an uptick in December, temps in January 2023 dropped everywhere, strongest in NH, putting the Global anomaly further below the mean since 2015.

Then came El Nino, as shown by the upward spike in the Tropics since January 2023, the anomaly nearly tripling from 0.38C to 1.09C.  In September 2023, all regions rose, especially NH, up from 0.70C to 1.41C, pulling the global anomaly to a new high for this period. By December, NH had cooled to 1.1C and the Global anomaly was down to 0.94C from its peak of 1.10C, despite slight warming in SH and Tropics.

Then in January 2024 both the Tropics and SH rose, pushing the Global anomaly higher. The Tropics anomaly reached a new peak of 1.29C, and all ocean regions were higher than in 01/2016, the previous peak. Then in February and March all regions cooled, bringing the Global anomaly back down 0.18C from its September peak. In April the Tropics cooled further, while NH rose slightly and SH remained unchanged.

Comment:

The climatists have seized on this unusual warming as proof their Zero Carbon agenda is needed, without addressing how impossible it would be for CO2 warming the air to raise ocean temperatures.  It is the ocean that warms the air, not the other way around.  Recently Steven Koonin had this to say about the phenomenon confirmed in the graph above:

El Nino is a phenomenon in the climate system that happens once every four or five years.  Heat builds up in the equatorial Pacific to the west of Indonesia and so on.  Then when enough of it builds up it surges across the Pacific and changes the currents and the winds.  It was discovered and named in the 19th century as it surges toward South America.  It is well understood at this point that the phenomenon has nothing to do with CO2.

Now people talk about changes in that phenomenon as a result of CO2, but it’s there in the climate system already, and when it happens it influences weather all over the world.   We feel it when it gets rainier in Southern California, for example.  So for the last 3 years we have been in the opposite of an El Nino, a La Nina, which people think is part of the reason the West Coast has been in drought.

It has now shifted in the last months to an El Nino condition that warms the globe and is thought to contribute to this spike we have seen. But there are other contributions as well.  One of the most surprising is that back in January of 2022 an enormous underwater volcano went off in Tonga and put a lot of water vapor into the upper atmosphere. It increased upper-atmosphere water vapor by about 10 percent, and that’s a warming effect, and it may be contributing to why the spike is so high.

A longer view of SSTs


The graph above is noisy, but the density is needed to see the seasonal patterns in the oceanic fluctuations.  Previous posts focused on the rise and fall of the last El Nino starting in 2015.  This post adds a longer view, encompassing the significant 1998 El Nino and since.  The color schemes are retained for Global, Tropics, NH and SH anomalies.  Despite the longer time frame, I have kept the monthly data (rather than yearly averages) because of interesting shifts between January and July. 1995 is a reasonable (ENSO neutral) starting point prior to the first El Nino. 

The sharp Tropical rise peaking in 1998 is dominant in the record, starting Jan. ’97 to pull up SSTs uniformly before returning to the same level Jan. ’99. There were strong cool periods before and after the 1998 El Nino event. Then SSTs in all regions returned to the mean in 2001-2. 

SSTs fluctuate around the mean until 2007, when another, smaller ENSO event occurs. There is cooling in 2007-8, a lower warming peak in 2009-10, followed by cooling in 2011-12.  Again SSTs are near average in 2013-14.

Now a different pattern appears.  The Tropics cooled sharply to Jan. ’11, then rose steadily for 4 years to Jan. ’15, at which point the most recent major El Nino took off.  But this time, in contrast to ’97-’99, the Northern Hemisphere produced peaks every summer, pulling up the Global average.  In fact, these NH peaks appear every July starting in 2003, growing stronger to produce 3 massive highs in 2014, 15 and 16.  NH July 2017 was only slightly lower, and a fifth NH peak was still lower in Sept. 2018.

The highest summer NH peaks came in 2019 and 2020, only this time the Tropics and SH were offsetting rather than adding to the warming. (Note: these are high anomalies on top of the highest absolute temps in the NH.)  Since 2014 SH has played a moderating role, offsetting the NH warming pulses. After September 2020 temps dropped until February 2021.  In 2021-22 there were again summer NH spikes, but in 2022 they were moderated first by cooling Tropics and SH SSTs, then from October to January 2023 by deeper cooling in NH and the Tropics.

Then in 2023 the Tropics flipped from below to well above average, while NH produced a summer peak extending into September, higher than any previous year.  Despite El Nino driving the Tropics January 2024 anomaly higher than the 1998 and 2016 peaks, the last two months cooled in all regions, and the Tropics continued cooling in April, suggesting that the peak has likely been reached.

What to make of all this? The patterns suggest that in addition to El Ninos in the Pacific driving the Tropic SSTs, something else is going on in the NH.  The obvious culprit is the North Atlantic, since I have seen this sort of pulsing before.  After reading some papers by David Dilley, I confirmed his observation of Atlantic pulses into the Arctic every 8 to 10 years.

Contemporary AMO Observations

Through January 2023 I depended on the Kaplan AMO Index (not smoothed, not detrended) for N. Atlantic observations. But it is no longer being updated, and NOAA says they don’t know its future.  So I now use the ERSSTv5 AMO dataset, which has data through October.  It differs from Kaplan, which reported average absolute temps measured in the N. Atlantic.  “ERSST5 AMO  follows Trenberth and Shea (2006) proposal to use the NA region EQ-60°N, 0°-80°W and subtract the global rise of SST 60°S-60°N to obtain a measure of the internal variability, arguing that the effect of external forcing on the North Atlantic should be similar to the effect on the other oceans.”  So the values represent SST anomaly differences between the N. Atlantic and the Global ocean.
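A minimal sketch of that Trenberth and Shea (2006) definition, assuming a hypothetical gridded SST anomaly dataset opened with xarray (the variable and coordinate names are illustrative, not the actual ERSSTv5 file layout): area-average the North Atlantic box (EQ-60°N, 0°-80°W) and subtract the 60°S-60°N near-global mean.

```python
# Minimal sketch (hypothetical dataset layout): AMO per Trenberth & Shea (2006) --
# North Atlantic mean SST anomaly (EQ-60N, 0-80W) minus the near-global mean (60S-60N).
import numpy as np
import xarray as xr

def area_mean(da: xr.DataArray) -> xr.DataArray:
    """Cosine-latitude weighted mean over lat/lon."""
    weights = np.cos(np.deg2rad(da.lat))
    return da.weighted(weights).mean(dim=("lat", "lon"))

def amo_index(sst_anom: xr.DataArray) -> xr.DataArray:
    """sst_anom: monthly SST anomalies with dims (time, lat, lon), lat ascending, lon 0..360."""
    north_atlantic = sst_anom.sel(lat=slice(0, 60), lon=slice(280, 360))  # EQ-60N, 80W-0
    near_global = sst_anom.sel(lat=slice(-60, 60))                        # 60S-60N
    return area_mean(north_atlantic) - area_mean(near_global)
```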

The chart above confirms what Kaplan also showed.  As August is the hottest month for the N. Atlantic, its variability, high and low, drives the annual results for this basin.  Note also the peaks in 2010, the lows after 2014, and a rise in 2021. Now in 2023 the peak was holding at 1.4C before declining.  An annual chart below is informative:

Note the difference between blue/green years, beige/brown, and purple/red years.  2010, 2021, 2022 all peaked strongly in August or September.  1998 and 2007 were mildly warm.  2016 and 2018 were matching or cooler than the global average.  2023 started out slightly warm, then rose steadily to an  extraordinary peak in July.  August to October were only slightly lower, but by December cooled by ~0.4C.

Now in 2024 the AMO anomaly is higher than in any previous year, but it has stopped rising over the last two months into April.  Where it goes from here remains to be seen.

The pattern suggests the ocean may be demonstrating a stairstep pattern like the one we have also seen in HadCRUT4.

The purple line is the average anomaly for 1980-1996 inclusive, value 0.18.  The orange line is the average for 1980 through April 2024, value 0.39, which is also the average for the period 1997-2012. The red line is the average for 2013 through April 2024, value 0.66. As noted above, these rising stages are driven by the combined warming in the Tropics and NH, including both the Pacific and Atlantic basins.
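Those stage values are simply period means of the monthly global anomaly series; a minimal sketch, assuming global_anom is a pandas Series of HadSST4 global monthly anomalies already loaded and indexed by date (the loading step is not shown):

```python
# Minimal sketch: compute the three stairstep stage means from a monthly anomaly series.
# Assumes `global_anom` is a pandas Series of HadSST4 global monthly anomalies
# indexed by a DatetimeIndex (loading the file is left to the reader).
import pandas as pd

def stage_means(global_anom: pd.Series) -> pd.Series:
    stages = {
        "1980-1996":    ("1980-01", "1996-12"),
        "1997-2012":    ("1997-01", "2012-12"),
        "2013-2024/04": ("2013-01", "2024-04"),
    }
    return pd.Series({name: global_anom.loc[start:end].mean()
                      for name, (start, end) in stages.items()})
```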

See Also:

2024 El Nino Collapsing

Curiosity:  Solar Coincidence?

The news about our current solar cycle 25 is that solar activity is hitting peak numbers now, higher and 1-2 years sooner than expected.  As livescience put it:  Solar maximum could hit us harder and sooner than we thought. How dangerous will the sun’s chaotic peak be?  Some charts from spaceweatherlive look remarkably similar to these sea surface temperature charts.

Summary

The oceans are driving the warming this century.  SSTs took a step up with the 1998 El Nino and have stayed there with help from the North Atlantic, and more recently the Pacific northern “Blob.”  The ocean surfaces are releasing a lot of energy, warming the air, but eventually will have a cooling effect.  The decline after 1937 was rapid by comparison, so one wonders: How long can the oceans keep this up? And is the sun adding forcing to this process?

Space weather impacts the ionosphere in this animation. Credits: NASA/GSFC/CIL/Krystofer Kim

Footnote: Why Rely on HadSST4

HadSST is distinguished from other SST products because HadCRU (Hadley Climatic Research Unit) does not engage in SST interpolation, i.e. infilling estimated anomalies into grid cells lacking sufficient sampling in a given month. From reading the documentation and from queries to Met Office, this is their procedure.

HadSST4 imports data from gridcells containing ocean, excluding land cells. From past records, they have calculated daily and monthly average readings for each grid cell for the period 1961 to 1990. Those temperatures form the baseline from which anomalies are calculated.

In a given month, each gridcell with sufficient sampling is averaged for the month and then the baseline value for that cell and that month is subtracted, resulting in the monthly anomaly for that cell. All cells with monthly anomalies are averaged to produce global, hemispheric and tropical anomalies for the month, based on the cells in those locations. For example, Tropics averages include ocean grid cells lying between latitudes 20N and 20S.

Gridcells lacking sufficient sampling that month are left out of the averaging, and the uncertainty from such missing data is estimated. IMO that is more reasonable than inventing data to infill. And it seems that the Global Drifter Array displayed in the top image is providing more uniform coverage of the oceans than in the past.
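A minimal sketch of that procedure as described above (a simplification, not HadCRU’s actual code, and without their area weighting or uncertainty model): subtract the 1961-1990 baseline for the calendar month from each sufficiently sampled gridcell, then average only the cells that have data within the chosen latitude band.

```python
# Minimal sketch of the anomaly procedure described above (a simplification,
# not HadCRU's actual code; no area weighting or uncertainty estimation).
import numpy as np

def regional_anomaly(monthly_mean, baseline, lats, lat_min=-90.0, lat_max=90.0):
    """
    monthly_mean: 2-D array (lat, lon) of this month's gridcell mean SSTs,
                  NaN where sampling was insufficient or the cell is land.
    baseline:     2-D array of the 1961-1990 climatology for the same calendar month.
    lats:         1-D array of gridcell latitudes.
    """
    anomalies = monthly_mean - baseline                  # per-cell anomaly vs. baseline
    band = (lats >= lat_min) & (lats <= lat_max)         # e.g. 20S-20N for the Tropics
    return np.nanmean(anomalies[band, :])                # unsampled cells are excluded, not infilled

# e.g. tropics_anom = regional_anomaly(monthly_mean, baseline, lats, -20.0, 20.0)
```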


USS Pearl Harbor deploys Global Drifter Buoys in Pacific Ocean


Why Unintended Consequences from Pushing Green Energy

We have been treated to multiple reports of negative consequences unforeseen by policymakers pushing the Green Energy agenda. A sample of the range:

Ford ready to restrict UK sales of petrol models to hit electric targets, Financial Times

Why US offshore wind energy is struggling—the good, the bad and the opportunity, Tech Xplore

Another solar farm destroyed by a hail storm—this time in Texas, OK Energy Today

Storm Ravages World’s Largest Floating Solar Plant, Western Journal

DOE Finalizes Efficiency Standards for Clothes Washers and Dryers, Energy.Gov

Strict new EPA rules would force coal-fired power plants to capture emissions or shut down, AP news

Companies Are Balking at the High Costs of Running Electric Trucks, Wall Street Journal

Landmark wind turbine noise ruling from High Court referred to attorney general, Irish Times

Etc., Etc.

These reports point to regulators again attempting to force social and economic behavioral changes against the human and physical forces opposing their goals. A detailed explanation of one such failure follows.

Background Post:  Why Raising Auto Fuel (CAFE) Standards Failed

There are deeper reasons why US auto fuel efficiency standards are counterproductive and should be rolled back.  They were instituted in denial of regulatory experience and science.  First, a parallel from physics.

In the sub-atomic domain of quantum mechanics, Werner Heisenberg, a German physicist, determined that our observations have an effect on the behavior of quanta (quantum particles).

The Heisenberg uncertainty principle states that it is impossible to know simultaneously the exact position and momentum of a particle. That is, the more exactly the position is determined, the less known the momentum, and vice versa. This principle is not a statement about the limits of technology, but a fundamental limit on what can be known about a particle at any given moment. This uncertainty arises because the act of measuring affects the object being measured. The only way to measure the position of something is using light, but, on the sub-atomic scale, the interaction of the light with the object inevitably changes the object’s position and its direction of travel.
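In standard notation the principle is the inequality below, where $\Delta x$ is the uncertainty in position, $\Delta p$ the uncertainty in momentum, and $\hbar$ the reduced Planck constant:

$$\Delta x \,\Delta p \;\geq\; \frac{\hbar}{2}$$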

Now skip to the world of governance and the effects of regulation. A similar finding shows that the act of regulating produces reactive behavior and unintended consequences contrary to the desired outcomes.

US Fuel Economy (CAFE) Standards Have Backfired

An article at Financial Times explains Energy Regulations’ Unintended Consequences. Excerpts below with my bolds.

Goodhart’s Law holds that “any observed statistical regularity will tend to collapse once pressure is placed upon it for control purposes”. Originally coined by the economist Charles Goodhart as a critique of the use of money supply measures to guide monetary policy, it has been adopted as a useful concept in many other fields. The general principle is that when any measure is used as a target for policy, it becomes unreliable. It is an observable phenomenon in healthcare, in financial regulation and, it seems, in energy efficiency standards.

When governments set efficiency regulations such as the US Corporate Average Fuel Economy standards for vehicles, they are often what is called “attribute-based”, meaning that the rules take other characteristics into consideration when determining compliance. The Cafe standards, for example, vary according to the “footprint” of the vehicle: the area enclosed by its wheels. In Japan, fuel economy standards are weight-based. Like all regulations, fuel economy standards create incentives to game the system, and where attributes are important, that can mean finding ways to exploit the variations in requirements. There have long been suspicions that the footprint-based Cafe standards would encourage manufacturers to make larger cars for the US market, but a paper this week from Koichiro Ito of the University of Chicago and James Sallee of the University of California Berkeley provided the strongest evidence yet that those fears are likely to be justified.

Mr Ito and Mr Sallee looked at Japan’s experience with weight-based fuel economy standards, which changed in 2009, and concluded that “the Japanese car market has experienced a notable increase in weight in response to attribute-based regulation”. In the US, the Cafe standards create a similar pressure, but expressed in terms of size rather than weight. Mr Ito suggested that in Ford’s decision to end almost all car production in North America to focus on SUVs and trucks, “policy plays a substantial role”. It is not just that manufacturers are focusing on larger models; specific models are also getting bigger.

Ford’s move, Mr Ito wrote, should be seen as an “alarm bell” warning of the flaws in the Cafe system. He suggests an alternative framework with a uniform standard and tradeable credits, as a more effective and lower-cost option. With the Trump administration now reviewing fuel economy and emissions standards, and facing challenges from California and many other states, the vehicle manufacturers appear to be in a state of confusion. An elegant idea for preserving plans for improving fuel economy while reducing the cost of compliance could be very welcome.

The paper is The Economics of Attribute-Based Regulation: Theory and Evidence from Fuel-Economy Standards, by Koichiro Ito and James M. Sallee, NBER Working Paper No. 20500.  The authors explain:

An attribute-based regulation is a regulation that aims to change one characteristic of a product related to the externality (the “targeted characteristic”), but which takes some other characteristic (the “secondary attribute”) into consideration when determining compliance. For example, Corporate Average Fuel Economy (CAFE) standards in the United States recently adopted attribute-basing. Figure 1 shows that the new policy mandates a fuel-economy target that is a downward-sloping function of vehicle “footprint”—the square area trapped by a rectangle drawn to connect the vehicle’s tires.  Under this schedule, firms that make larger vehicles are allowed to have lower fuel economy. This has the potential benefit of harmonizing marginal costs of regulatory compliance across firms, but it also creates a distortionary incentive for automakers to manipulate vehicle footprint.
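A minimal sketch of how such an attribute-based target works in principle (the slope, intercept and bounds below are invented for illustration, not the actual CAFE schedule): the required fuel economy is a downward-sloping function of footprint, so enlarging a vehicle’s footprint lowers the target it must meet.

```python
# Illustrative sketch only -- slope, intercept and bounds are invented, not the real
# CAFE schedule. It shows the distortionary incentive: a bigger footprint means a
# lower required fuel economy.
def mpg_target(footprint_sqft: float) -> float:
    """Hypothetical downward-sloping fuel-economy target vs. vehicle footprint."""
    target = 50.0 - 0.4 * (footprint_sqft - 40.0)    # made-up linear schedule
    return max(30.0, min(50.0, target))              # clamp to made-up bounds

def complies(actual_mpg: float, footprint_sqft: float) -> bool:
    return actual_mpg >= mpg_target(footprint_sqft)

# A 45 sq-ft vehicle must reach 48 mpg; stretch it to 55 sq ft and only 44 mpg is required.
print(mpg_target(45.0), mpg_target(55.0))   # 48.0 44.0
```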

Attribute-basing is used in a variety of important economic policies. Fuel-economy regulations are attribute-based in China, Europe, Japan and the United States, which are the world’s four largest car markets. Energy efficiency standards for appliances, which allow larger products to consume more energy, are attribute-based all over the world. Regulations such as the Clean Air Act, the Family Medical Leave Act, and the Affordable Care Act are attribute-based because they exempt some firms based on size. In all of these examples, attribute-basing is designed to provide a weaker regulation for products or firms that will find compliance more difficult.

Summary from the Heritage Foundation study Fuel Economy Standards Are a Costly Mistake. Excerpt with my bolds.

The CAFE standards are not only an extremely inefficient way to reduce carbon dioxide emissions but will also have a variety of unintended consequences.

For example, the post-2010 standards apply lower mileage requirements to vehicles with larger footprints. Thus, Whitefoot and Skerlos argued that there is an incentive to increase the size of vehicles.

Data from the first few years under the new standard confirm that the average footprint, weight, and horsepower of cars and trucks have indeed all increased since 2008, even as carbon emissions fell, reflecting the distorted incentives.

Manufacturers have found work-arounds to thwart the intent of the regulations. For example, the standards raised the price of large cars, such as station wagons, relative to light trucks. As a result, automakers created a new type of light truck—the sport utility vehicle (SUV)—which was covered by the lower standard and had low gas mileage but met consumers’ needs. Other automakers have simply chosen to miss the thresholds and pay fines on a sliding scale.

Another well-known flaw in CAFE standards is the “rebound effect.” When consumers are forced to buy more fuel-efficient vehicles, the cost per mile falls (since their cars use less gas) and they drive more. This offsets part of the fuel economy gain and adds congestion and road repair costs. Similarly, the rising price of new vehicles causes consumers to delay upgrades, leaving older vehicles on the road longer.
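A worked sketch of the rebound arithmetic with assumed numbers (the gas price, mileages and 10% rebound response are illustrative, not measured values): better fuel economy cuts the cost per mile, some of the saving is “spent” on extra driving, and the net fuel saving comes in below the naive estimate.

```python
# Illustrative rebound-effect arithmetic; every number here is assumed for the example.
gas_price = 3.50              # $/gallon
old_mpg, new_mpg = 25, 35     # fuel economy before / after the standard
miles_before = 12_000         # annual miles driven before the change
rebound = 0.10                # assumed: driving rises by 10% of the per-mile cost reduction

cost_old = gas_price / old_mpg                     # $0.140 per mile
cost_new = gas_price / new_mpg                     # $0.100 per mile
cost_drop = 1 - cost_new / cost_old                # ~28.6% cheaper per mile

miles_after = miles_before * (1 + rebound * cost_drop)   # ~12,343 miles

naive_saving = miles_before / old_mpg - miles_before / new_mpg   # ~137 gallons if driving were unchanged
actual_saving = miles_before / old_mpg - miles_after / new_mpg   # ~127 gallons after the rebound

print(f"Naive saving:  {naive_saving:.0f} gal")
print(f"Actual saving: {actual_saving:.0f} gal")
```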

In addition, the higher purchase price of cars under a stricter CAFE standard is likely to force millions of households out of the new-car market altogether. Many households face credit constraints when borrowing money to purchase a car. David Wagner, Paulina Nusinovich, and Esteban Plaza-Jennings used Bureau of Labor Statistics data and typical finance industry debt-service-to-income ratios and estimated that 3.1 million to 14.9 million households would not have enough credit to purchase a new car under the 2025 CAFE standards.[34] This impact would fall disproportionately on poorer households and force the use of older cars with higher maintenance costs and with fuel economy that is generally lower than that of new cars.

CAFE standards may also have redistributed corporate profits to foreign automakers and away from Ford, General Motors (GM), and Chrysler (the Big Three), because foreign-headquartered firms tend to specialize in vehicles that are favored under the new standards.[35] 

Conclusion

CAFE standards are costly, inefficient, and ineffective regulations. They severely limit consumers’ ability to make their own choices concerning safety, comfort, affordability, and efficiency. Originally based on the belief that consumers undervalued fuel economy, the standards have morphed into climate control mandates. Under any justification, regulation gives the desires of government regulators precedence over those of the Americans who actually pay for the cars. Since the regulators undervalue the well-being of American consumers, the policy outcomes are predictably harmful.

What’s Next?