Elites’ Energy Fantasies Abound

Stephen Barrett reports on ruminations from technically challenged overlords in his Spectator Australia article by way of John Ray’s blog The chattering climate class and their war on coal. Excerpts in italics with my bolds and added images.

The Climate Chattering Class is now telling each other:
No Such Thing as Baseload Electrical Power.

Electricity is slippery stuff, in that it can be difficult to properly grasp what it is or how to quantify it. We can blame the school system: teachers who were taught social politics at university must somehow teach mathematics and physics.

There is a reason for everything in the world and that reason usually comes down to physics until politics gets mixed in. This is a problem. In politics, the same big lie can be repeated many times, as loudly as possible, until people accept it as truth or give up trying to argue the toss.


Readers will be familiar with the nameplate rating on wind farms and solar plants. It lists the rated output under ideal circumstances, measured in watts. If a heater is rated at 1,000W, we all understand it is telling us the power drawn at one instant in time. Consumption over time is a different thing and is measured in watt-hours. Reversing this, we can read a generator’s nameplate watts as the size of the generator, and its watt-hours as how much energy it actually provides.
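The watts-versus-watt-hours distinction can be made concrete with a few lines of Python (a minimal sketch with illustrative numbers, not data from any particular plant):

```python
# Power (watts) is an instantaneous rate; energy (watt-hours) is that rate
# sustained over time: energy_Wh = power_W * hours.

def energy_wh(power_w: float, hours: float) -> float:
    """Energy delivered (or consumed) in watt-hours."""
    return power_w * hours

# A 1,000 W heater running for 3 hours consumes 3,000 Wh = 3 kWh.
print(energy_wh(1000, 3))  # 3000.0

# A 1,000 W (nameplate) generator producing at only 25% of its rating
# over a day delivers far less energy than the nameplate suggests.
print(energy_wh(1000 * 0.25, 24))  # 6000.0 Wh, versus 24,000 Wh at full output
```

The point of the sketch is simply that a nameplate watt figure says nothing about delivered energy until you multiply by hours of actual production.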

An interesting idea making its way around the energy conversation
at present is that there is no such thing as baseload energy.

The lie is perpetrated by the political system which is, at present, intending to destroy the concept (and existence) of baseload energy. Baseload is created by heavy generators that operate all day, every day, and are typically cheap. The disadvantage of this structure is that baseload plants usually take time to reach full production. Then, they need to run for extended periods of time to be economically viable. Coal and nuclear are the only two types feasible for most of the Australian market.

Gas and diesel plants can provide electricity but they are expensive when operated in this way. Peaking power is where gas comes to the fore. It can be fired up quickly and make electricity rapidly. This is ideal for peaks when people come home from their day and want heating or cooling and to cook. Gas can cover this surge very happily. Diesel is lovely stuff and great in remote locations where there is no access to the grid or if the grid fails. It might not be pretty, but it delivers when needed.

In the whole clean grid argument, those words should be enshrined…
‘When it is needed.’

Coal, nuclear, gas, and diesel will deliver when needed. Reliability has been ignored by the chattering classes who have created the current disaster of high prices and brownouts that continue to destroy various industries.

Perhaps that is the whole point of ‘renewable’ energy. I put that in quotes because the best figures I can find are that they return only seven-tenths of the energy used to build them.

Every wind tower is a monument to coal-fired power’s ability to carry inefficient freeloaders. Freeloaders, because renewable technologies can never produce energy when it is needed, only when the weather allows. Solar and wind dump themselves on the energy market, making it impossible for reliable supplies to remain economic. If they had to obey the same bidding rules, they would never survive.

Let’s compare the costs of wind, solar, and nuclear. To do this we can look at the Shepherds Flat Wind Farm, Topaz Solar Farm, and Barakah Nuclear Power Plant.

We can skittle the first anti-nuclear claim, that the plants take too long to build. Barakah was completed within eight years, and the global average for modern nuclear plant construction is between seven and eight years. Sadly, Australia has a less than helpful public service that thrives on inefficiency and might drag out this timeline.

The nameplate ratings on these plants are 845MW for Shepherds Flat, 550MW for Topaz, and 5,600MW for Barakah. Nuclear can appear expensive if you compare build cost against the nameplate rating, but not markedly. Shepherds Flat cost $2 billion, Topaz $2.5 billion, and Barakah $24.4 billion. Comparing build cost to nameplate rating, Shepherds Flat Wind comes to about $2.4 million/MW, Topaz Solar $4.5 million/MW, and Barakah Nuclear $4.4 million/MW.
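As an arithmetic check, the build-cost-per-nameplate-megawatt ratios follow directly from the costs and ratings quoted above (a quick sketch, nothing more):

```python
# Build cost divided by nameplate capacity, in $ million per nameplate MW.
plants = {
    "Shepherds Flat Wind": (2.0e9, 845),    # (build cost USD, nameplate MW)
    "Topaz Solar":         (2.5e9, 550),
    "Barakah Nuclear":     (24.4e9, 5600),
}

def millions_per_mw(cost_usd: float, nameplate_mw: float) -> float:
    """Build cost per nameplate MW, in millions of dollars."""
    return cost_usd / nameplate_mw / 1e6

for name, (cost, mw) in plants.items():
    print(f"{name}: ${millions_per_mw(cost, mw):.1f}M per nameplate MW")
# Shepherds Flat ~ $2.4M/MW, Topaz ~ $4.5M/MW, Barakah ~ $4.4M/MW
```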

Looking at nameplate capacity per dollar, nuclear is slightly better than solar, and wind looks cheapest of the three. But the issue already demonstrated is not size so much as provision. A nameplate value is an instantaneous rating; it says nothing about how many hours that output is actually delivered. This means the watt-hours are crucial.

This is where wind and solar fail massively. The watts produced are not as important as the watt-hours provided to the market. Assuming a generous 25-year lifespan for Shepherds Flat, 30 years for Topaz, and a mean-spirited 60 years for Barakah (when it is likely to still be running 100 years after it started), I calculated the build price per GWh over the life of each project. That is, build price divided by annual GWh times lifespan.

Shepherds Flat came out at $40,000 per GWh, Topaz $75,000, and Barakah $9,300.
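These lifetime figures can be reproduced from the formula above, given assumed capacity factors. The article does not state the annual GWh it used; capacity factors of roughly 27% for the wind farm, 23% for the solar farm, and 89% for the nuclear plant (my assumptions, in line with typical values for each technology) recover the quoted numbers:

```python
HOURS_PER_YEAR = 8760

def cost_per_gwh(build_cost_usd: float, nameplate_mw: float,
                 capacity_factor: float, lifespan_yr: float) -> float:
    """Build price divided by lifetime output (annual GWh x lifespan)."""
    annual_gwh = nameplate_mw * HOURS_PER_YEAR * capacity_factor / 1000
    return build_cost_usd / (annual_gwh * lifespan_yr)

# Capacity factors below are assumptions chosen to match the article's results.
print(round(cost_per_gwh(2.0e9, 845, 0.27, 25)))    # ~40,000 (Shepherds Flat)
print(round(cost_per_gwh(2.5e9, 550, 0.23, 30)))    # ~75,200 (Topaz)
print(round(cost_per_gwh(24.4e9, 5600, 0.89, 60)))  # ~9,300  (Barakah)
```

Note how the result is driven less by build cost per MW than by capacity factor and lifespan, which is exactly the author’s point about provision versus size.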

On this measure, nuclear is significantly cheaper, and the price of firming wind and solar is not even added to their totals. To have power on those hot, still days of summer when the wind doesn’t blow, or on cold winter nights when the sun is not shining, you will need either nuclear or coal to provide the electricity.

We can discuss batteries some other time, but the new super battery has been coming for about as long as perpetual motion and flying cars. Lithium-ion batteries are old technology, developed to a point of maturity where there is little left to squeeze out of them. And without mountains we are not going to get enough pumped hydro, quite apart from how economically bad that model is.

If I magically had the power, I would build more coal-fired stations, only because nuclear will need time to be made legal and that timeline cannot be predicted. Nuclear, however, beats wind and solar to bits as far as cost, output, and reliability are concerned.

18 yr. Plateau September Arctic Ice 2024

September daily extents are now fully reported and the 2024 September monthly results can be compared with those of the previous 17 years. MASIE showed 2024 at 4.6M km2, and SII was slightly lower, reaching 4.4M for the month. Analysis below shows that the 2024 annual minimum month was 75k km2 lower than the 18-year average, and was 256k km2 greater than 2007. The 18-yr. trendlines are virtually flat, matching the average of 4.6M km2 for the period since 2007.

In August, 4.27M km2 was the median estimate (range 4.11 to 4.54) for the September monthly average extent from the SIPN (Sea Ice Prediction Network), which uses reports from SII (Sea Ice Index), the NASA team’s satellite product from passive microwave sensors. The actual SII ice extent average was ~100k km2 higher than estimated.

The graph below shows September comparisons through day 274 (Sept. 30).

The graph has some unusual things to note. The typical September is shown by the black line, which reaches the daily minimum on day 260 and ends the month ~100k km2 higher than it began. 2024 was unusual: it matched the average for the first half of the month, appeared to hit its minimum on day 255, then went above average on day 260. Surprisingly, over the last two weeks MASIE showed a steady loss of ice extent down to a minimum on day 272, recovering over the final two days. Instead of a slight gain, MASIE showed September 2024 losing ~400k km2 by month end.

2007 also started matching average, but went down steeply losing more than half a wadham below average on day 261, then recovering slightly but ending ~800k km2 below average on day 274.

The orange line shows SII starting much lower than MASIE, then the dataset is missing six days mid month. Finally, ice extents rise above MASIE by the end, reaching ~200k km2 above the SII month beginning. I calculated the monthly average for SII from the data provided, though the official number may differ once it is reported.

The table shows ice extents in the regions for 2024, 18 year averages and 2007 for day 274. Averages refer to 2006 through 2023 inclusive.

Region | Day 274 2024 | Day 274 Average | 2024-Ave. | Day 274 2007 | 2024-2007
 (0) Northern_Hemisphere 4446117 5013499 -567382 4196873 249243
 (1) Beaufort_Sea 280737 564271 -283534 503548 -222811
 (2) Chukchi_Sea 350261 214302 135958 1065 349196
 (3) East_Siberian_Sea 320409 307838 12571 311 320098
 (4) Laptev_Sea 207220 169745 37475 237446 -30226
 (5) Kara_Sea 339 39310 -38971 15857 -15518
 (6) Barents_Sea 0 15734 -15734 4851 -4851
 (7) Greenland_Sea 163963 250200 -86237 353210 -189247
 (8) Baffin_Bay_Gulf_of_St._Lawrence 38141 57714 -19573 51770 -13629
 (9) Canadian_Archipelago 112998 388756 -275757 302221 -189223
 (10) Hudson_Bay 0 3062 -3062 1936 -1936
 (11) Central_Arctic 2968286 3001441 -33155 2723382.15 244904

The major deficits are in the Beaufort Sea, Greenland Sea, and Canadian Archipelago. Only Chukchi shows an offsetting surplus. Overall, the NH ice extent is 567k km2, or 11%, below average.
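The 11% figure follows directly from the table’s Northern Hemisphere row:

```python
# Northern Hemisphere ice extent at day 274, from the table above (km2).
nh_2024 = 4446117   # MASIE 2024
nh_avg  = 5013499   # 18-year day-274 average

deficit = nh_avg - nh_2024
pct_below = 100 * deficit / nh_avg
print(deficit, f"{pct_below:.1f}%")  # 567382, 11.3% below average
```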

Summary

Earlier observations showed that Arctic ice extents were low in the 1940s, grew thereafter up to a peak in 1977, before declining.  That decline was gentle until 1996 which started a decade of multi-year ice loss through the Fram Strait.  There was also a major earthquake under the north pole in that period.  In any case, the effects and the decline ceased in 2007, 30 years after the previous peak.  Now we have a plateau in ice extents, which could be the precursor of a growing phase of the quasi-60 year Arctic ice oscillation.

Background 

A commenter previously asked, where do they get their data? The answer is primarily from NIC’s Interactive Multisensor Snow and Ice Mapping System (IMS). From the documentation, the multiple sources feeding IMS are:

Platform(s) AQUA, DMSP, DMSP 5D-3/F17, GOES-10, GOES-11, GOES-13, GOES-9, METEOSAT, MSG, MTSAT-1R, MTSAT-2, NOAA-14, NOAA-15, NOAA-16, NOAA-17, NOAA-18, NOAA-N, RADARSAT-2, SUOMI-NPP, TERRA

Sensor(s): AMSU-A, ATMS, AVHRR, GOES I-M IMAGER, MODIS, MTSAT 1R Imager, MTSAT 2 Imager, MVIRI, SAR, SEVIRI, SSM/I, SSMIS, VIIRS

Summary: IMS Daily Northern Hemisphere Snow and Ice Analysis

The National Oceanic and Atmospheric Administration / National Environmental Satellite, Data, and Information Service (NOAA/NESDIS) has an extensive history of monitoring snow and ice coverage. Accurate monitoring of global snow/ice cover is a key component in the study of climate and global change as well as daily weather forecasting.

The Polar and Geostationary Operational Environmental Satellite programs (POES/GOES) operated by NESDIS provide invaluable visible and infrared spectral data in support of these efforts. Clear-sky imagery from both the POES and the GOES sensors show snow/ice boundaries very well; however, the visible and infrared techniques may suffer from persistent cloud cover near the snowline, making observations difficult (Ramsay, 1995). The microwave products (DMSP and AMSR-E) are unobstructed by clouds and thus can be used as another observational platform in most regions. Synthetic Aperture Radar (SAR) imagery also provides all-weather, near daily capacities to discriminate sea and lake ice. With several other derived snow/ice products of varying accuracy, such as those from NCEP and the NWS NOHRSC, it is highly desirable for analysts to be able to interactively compare and contrast the products so that a more accurate composite map can be produced.

The Satellite Analysis Branch (SAB) of NESDIS first began generating Northern Hemisphere Weekly Snow and Ice Cover analysis charts derived from the visible satellite imagery in November, 1966. The spatial and temporal resolutions of the analysis (190 km and 7 days, respectively) remained unchanged for the product’s 33-year lifespan.

As a result of increasing customer needs and expectations, it was decided that an efficient, interactive workstation application should be constructed which would enable SAB to produce snow/ice analyses at a higher resolution and on a daily basis (~25 km / 1024 x 1024 grid and once per day) using a consolidated array of new as well as existing satellite and surface imagery products. The Daily Northern Hemisphere Snow and Ice Cover chart has been produced since February, 1997 by SAB meteorologists on the IMS.

Another large resolution improvement began in early 2004, when improved technology allowed the SAB to begin creation of a daily ~4 km (6144×6144) grid. At this time, both the ~4 km and ~24 km products are available from NSIDC with a slight delay. Near real-time gridded data is available in ASCII format by request.

In March 2008, the product was migrated from SAB to the National Ice Center (NIC) of NESDIS. The production system and methodology were preserved during the migration. Improved access to DMSP, SAR, and modeled data sources was expected as a short-term benefit of the migration, with longer-term plans for twice-daily production, GRIB2 output format, a Southern Hemisphere analysis, and an expanded suite of integrated snow and ice variables on the horizon.

http://www.natice.noaa.gov/ims/ims_1.html

Footnote

Some people unhappy with the higher amounts of ice extent shown by MASIE continue to claim that Sea Ice Index is the only dataset that can be used. This is false in fact and in logic. Why should anyone accept that the highest quality picture of ice day to day has no shelf life, and that one year’s charts cannot be compared with another’s?

MASIE is rigorous, reliable, serves as calibration for satellite products, and continues the long and honorable tradition of naval ice charting using modern technologies. More on this at my post Support MASIE Arctic Ice Dataset.

 

Investors Beware Green Equipment Companies

Steve Goreham explains in his Heartland article Why Are Renewable Equipment Companies Such Poor Investments? Excerpts in italics with my bolds and added images.

Headlines promote renewable energy equipment companies as part of efforts to transition to Net Zero carbon dioxide emissions by 2050. Wind and solar system providers, electric vehicle manufacturers, green hydrogen producers, and other green equipment firms form a growing share of world industry. But renewable equipment firms suffer poor market returns, so investors should beware.

The Renewable Energy Industrial Index (RENIXX) is a global stock index of the 30 largest renewable energy industrial companies in the world by stock market capitalization. Current RENIXX companies include Enphase Energy, First Solar, Orsted, Plug Power, Tesla, and Vestas.

IWR of Germany established the RENIXX on May 1, 2006, with an initial value of 1,000 points. This month, the RENIXX stood at 1,013 points, essentially zero value growth over the last 18 years. In comparison, the S&P 500 Index more than quadrupled over the same period. The RENIXX is down three years in a row from 2021, losing about half its value.
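In annualized terms, the gap is stark. A quick compound-growth comparison (using the index values above; the S&P multiple of 4x is taken from the article’s “more than quadrupled,” so treat it as approximate):

```python
def cagr(start: float, end: float, years: float) -> float:
    """Compound annual growth rate."""
    return (end / start) ** (1 / years) - 1

renixx = cagr(1000, 1013, 18)   # 1,000 -> 1,013 points over 18 years
sp500  = cagr(1.0, 4.0, 18)     # "more than quadrupled" over the same span

print(f"RENIXX: {renixx:.2%} per year")   # roughly 0.07% per year
print(f"S&P 500: {sp500:.2%} per year")   # roughly 8% per year
```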

Wind turbine manufacturers faced serious financial challenges over the last three years, even with rising sales. Rising costs, high interest rates, and project delays continue to impact the profitability of wind projects and equipment suppliers. The stock of Denmark-based Vestas Wind Systems, the world’s largest supplier, rose only 7% over the last 16 years, and its stock price has fallen 58% from a high in 2021. Vestas struggled to make a profit in 2022 and 2023 and suspended dividends to shareholders.

Other major wind suppliers have also been poor investments for shareholders. The stock of Siemens Gamesa, the number two turbine maker, is down 65% since a peak in 2021. Gamesa reported a loss of €4.4 billion in 2023 and received a €7.5 billion bailout from the German government that same year. Other top wind suppliers suffered major stock price declines since 2021, including Goldwind of China (down 77%) and Nordex of Germany (-36%).

Some 80% of the world’s solar panels are manufactured in China and the top six suppliers reside in China. The solar panel industry is beset by overcapacity and severe competition. Stock prices of the top seven suppliers have all declined by more than 50% since 2021. The stock of U.S. firm First Solar has risen since 2021 but remains below its all-time high price reached in 2008.

Tesla, founded in 2003, remained the only pure-play, publicly traded EV stock until 2018. By the end of 2021, Tesla’s value had soared to over $1 trillion, a market value greater than that of Toyota, Volkswagen, Mercedes-Benz, General Motors, Ford, BMW, and Honda combined. But Tesla is the exception.

In most cases, electric vehicle (EV) companies have been very poor investments. Between 2020 and 2024, 31 EV companies went public on U.S. stock exchanges. Only one of the 31, the Chinese firm Li Auto, saw its price rise after the initial public offering (IPO). The other thirty saw their stock prices fall, most of them precipitously.

EV company price declines from the IPO price include Fisker (-99%), Nikola (-94%), NIO (-50%), Lucid Group (-75%), and Rivian (-88%). Six others of the 31 companies went bankrupt. Tesla and Chinese firms BYD and Li Auto are the only EV firms profitable today.

ChargePoint is the world’s largest dedicated EV charger company (behind EV manufacturer Tesla), with over 25,000 charging stations in the U.S. and Canada. ChargePoint went public in 2021 by merging with Switchback Energy Acquisition Corporation, valued at $2.4 billion. The firm’s value today is about $585 million, down 76% since 2021.  For fiscal year 2024, ChargePoint lost $458 million on revenue of $507 million.

It’s not clear that any charging company can make money. High-speed, 50-kilowatt EV chargers cost about five times as much as traditional gasoline pumps. Around 80% of EV charging is done at home, reducing the demand for public charging. ChargePoint, EVgo, Wallbox, Allego, and Blink Charging are all valued today at small fractions of their original IPO price. No EV charger firm is profitable, even after continuing to receive large government subsidies.

Plug Power is a leading supplier of hydrogen energy systems, including fuel cells for hydrogen vehicles and electrolyzers to produce green hydrogen fuel. Founded in 1997, the company went public in October 1999 at a split-adjusted price of about $160 per share.

But during its 27-year history, Plug Power has never turned a profit. According to financial reports, the firm lost $1.45 billion in 2024, up from a loss of $43.8 million in 2018. Its current stock price is under two dollars per share.

Traditional established firms are finding that renewable equipment can be poor business. In 2023, Ford lost $4.7 billion on sales of 116,000 electric vehicles, or over $40,000 per vehicle. General Electric’s wind turbine business lost $1.1 billion in 2023.

The U.S. federal government provided subsidies to renewable equipment companies of between $7 billion and $16 billion per year between 2010 and 2022. But the Cato Institute estimates that because of the passage of the Inflation Reduction Act in 2022, subsidies will skyrocket to about $80 billion in fiscal year 2025.


Without the fear of human-caused climate change and
a rising level of government subsidies and mandates,
many of these green companies would not exist.

It’s doubtful that carbon dioxide pipelines, heavy electric trucks, offshore wind systems, green hydrogen fuel equipment, and EV charging stations would be viable businesses in unsubsidized capital markets.

During this last year, leading financial firms pulled back on their climate change pledges. Bank of America, JP Morgan, State Street, and Pimco withdrew from Climate Action 100+, which seeks to force companies and investment funds to address climate issues and adopt environmental, social, and governance (ESG) policies.

But it’s difficult to invest in renewable equipment companies
when they are losing money.

 

Wind Power Pollution and Hypocrisy in New England

Emmett Hare reports in City Journal Wind Power Debacle in New England.  Excerpts in italics with my bolds and added images.

A fractured turbine’s blade in Nantucket is causing
ongoing problems and frustrating local residents.

In mid-July, a blade from an offshore wind turbine operating 15 miles southwest of Nantucket fractured. A large amount of fiberglass, foam, and plastic debris fell into the ocean and began washing up on the island’s shores. The incident led to the closure of several beaches and a suspension of operations and construction for the massive Vineyard Wind project, a joint venture of Avangrid and foreign-owned Copenhagen Infrastructure Partners that has installed and operated ten of 62 planned turbines in the country’s largest wind farm.

At local meetings, Nantucket residents expressed concerns about officials’ handling of the turbine breakage and the environmental hazards of enormous fiberglass blades tumbling into the sea. In the past, they have also cited the project’s impact on marine wildlife and its visual impact on the town’s scenic beaches. A CNN report describing this “unusual and rare” event noted that the Coast Guard had retrieved a 300-foot piece of the shattered blade from local waters. The outlet reported that a spokesperson for GE Vernova, the wind-blade manufacturer, “couldn’t provide officials with the precise number of times something similar has happened at other wind farms around the world.”

Environmental groups, realizing the potential political implications of the fractured blade, downplayed the episode. The National Wildlife Federation (NWF), which avidly supports offshore wind farms, insisted that the damage was minor. “Compared to other energy disasters in the ocean like oil spills, this incident is fairly contained and easily cleaned up to prioritize the safety of marine life,” said Amber Hewett, senior director of offshore wind energy for the NWF. The Sierra Club emphasized that “the failure of a single turbine blade does not adversely impact the emergence of offshore wind as a critical solution for reducing dependence on fossil fuels and addressing the climate crisis.”

Whether the incident is “contained” remains in question. Debris from the broken turbine has been reported beyond Nantucket—in Martha’s Vineyard, Cape Cod, Rhode Island, and off the coast of Montauk, Long Island. The debris is breaking up into smaller pieces resembling shattered glass, with yet-unknown effects on Nantucket’s marine habitat. Vineyard Wind cautioned that “[m]embers of the public should avoid handling debris” and promised to “bag, track, and transport all debris to proper storage as soon as possible.” It remains to be seen whether simple avoidance will suffice, especially given the possibility of debris entering the human food chain through area fish.

The Massachusetts Clean Energy Center (MassCEC) Wind Technology Testing Center in Boston has taken delivery of a 107-meter wind turbine blade designed for GE Renewable Energy’s Haliade-X offshore wind turbine.

While this event may be “unusual and rare” in an absolute sense, many wind farms have seen broken turbines, fires, and sea-floor damage. And Nantucket’s situation is particularly dire, given that Vineyard Wind’s turbines are by far the largest ever constructed in the United States: the blade that fragmented on July 13 was over 350 feet long and weighed 57 tons.

Even when functioning as intended, wind farms can negatively affect the surrounding environment. Wildlife advocates have claimed that sonic and subsonic vibrations from the construction and operation of turbines disrupt the navigational senses of marine mammals like whales and dolphins and can cause beachings. Turbines are also responsible for the deaths of countless birds. Clammers and fishermen are wary of working in areas close to wind farms, out of concern for equipment snags on buried power lines and risks to their vessels of navigating between the turbines in bad weather.

French Fishermen Join U.S. Fishermen in Fighting Offshore Wind – IER

The Nantucket residents questioning the safety of wind turbines generally support alternative energy. Indeed, in an FAQ post on the town government’s webpage, officials made the point that allowing wind projects to avoid scrutiny might allow traditional fossil fuel producers to evade similar oversight: “If [the Bureau of Ocean Energy Management] guts the provisions of these longstanding federal laws protecting culturally and environmentally significant places to facilitate expedient green energy projects, fossil fuel developers will exploit the bad precedent to undercut regulation of harmful projects for decades to come.”

Nonetheless, the Nantucket residents have seen themselves branded as tools of the fossil-fuel industry by well-financed lobbyists and promoters of richly subsidized wind power. They have also been subject to physical attacks. At a city council meeting in Newport, Rhode Island, a field director for Climate Jobs Rhode Island, David Booth, was charged with simple assault and disorderly conduct after accosting a speaker and seizing a bag of turbine fragments that she had brought for her testimony. Booth allegedly appeared prominently in a photo on the campaign website of Rhode Island senator Sheldon Whitehouse, which was subsequently removed without comment.

Debris in the water from Vineyard turbine blade

The wind-power industry has seen some of its planned projects cancelled in recent years due to swelling production costs and local opposition to the environmental and aesthetic impact of the colossal windmills. A report published by Brown University’s Climate and Development Lab in early 2024 suggested that much of the opposition to offshore wind was rooted in “misinformation,” “[c]onspiracy theories,” and cherry-picked information supplied by “right-wing think tanks.” It might prove beyond the powers of an academic paper to convince the residents of New England and coastal states that the fiberglass and foam washing up on their beaches is nothing more than a conservative talking point.

See Also:

The Short Lives of Wind Turbines

NH Pulls Oceans Warmer August 2024

The best context for understanding decadal temperature changes comes from the world’s sea surface temperatures (SST), for several reasons:

  • The ocean covers 71% of the globe and drives average temperatures;
  • SSTs have a constant water content (unlike air temperatures), so give a better reading of heat content variations;
  • Major El Ninos have been the dominant climate feature in recent years.

HadSST is generally regarded as the best of the global SST data sets, and so the temperature story here comes from that source. Previously I used HadSST3 for these reports, but the Hadley Centre has made HadSST4 the priority, and v.3 will no longer be updated. HadSST4 is the same as v.3, except that the older data from ship water intake was re-estimated, generally to lower temperatures than shown in v.3. The effect is that v.4 has lower average anomalies for the baseline period 1961-1990, thereby showing higher current anomalies than v.3. This analysis concerns more recent time periods and depends on differentials very similar to those from v.3, despite the higher absolute anomaly values in v.4. More on what distinguishes HadSST3 and 4 from other SST products at the end. The user guide for HadSST4 is here.
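The v.3-versus-v.4 point, that a lower baseline raises every anomaly by the same amount while leaving month-to-month differentials untouched, can be illustrated with toy numbers (these are invented values, not real HadSST data):

```python
# Toy monthly SSTs (deg C) for one region; values are illustrative only.
observed = [18.50, 18.62, 18.71]

baseline_v3 = 18.20   # hypothetical 1961-1990 climatology, v.3-style
baseline_v4 = 18.10   # re-estimated lower baseline, v.4-style

anoms_v3 = [round(t - baseline_v3, 2) for t in observed]
anoms_v4 = [round(t - baseline_v4, 2) for t in observed]

print(anoms_v3)  # [0.3, 0.42, 0.51]
print(anoms_v4)  # [0.4, 0.52, 0.61] -- uniformly 0.1C higher

# Month-to-month differentials are identical under either baseline:
diffs_v3 = [round(b - a, 2) for a, b in zip(anoms_v3, anoms_v3[1:])]
diffs_v4 = [round(b - a, 2) for a, b in zip(anoms_v4, anoms_v4[1:])]
print(diffs_v3 == diffs_v4)  # True
```

This is why an analysis built on recent differentials carries over from v.3 to v.4 even though the absolute anomaly values are higher.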

The Current Context

The chart below shows SST monthly anomalies as reported in HadSST4 from 2015 through August 2024. A global cooling pattern is seen clearly in the Tropics since its peak in 2016, joined by NH and SH cycling downward, followed by rising temperatures in 2023 and 2024.

Note that in 2015-2016 the Tropics and SH peaked in between two summer NH spikes.  That pattern repeated in 2019-2020 with a lesser Tropics peak and SH bump, but with higher NH spikes. By end of 2020, cooler SSTs in all regions took the Global anomaly well below the mean for this period.  

Then in 2022, another strong NH summer spike peaked in August, but this time both the Tropics and SH were countervailing, resulting in only slight Global warming, later receding to the mean. Oct./Nov. temps dropped in NH and the Tropics, taking the Global anomaly below the average for this period. After an uptick in December, temps in January 2023 dropped everywhere, strongest in NH, with the Global anomaly falling further below the mean since 2015.

Then came El Nino, as shown by the upward spike in the Tropics since January 2023, the anomaly nearly tripling from 0.38C to 1.09C. In September 2023, all regions rose, especially NH, up from 0.70C to 1.41C, pulling the global anomaly to a new high for this period. By December, NH had cooled to 1.1C and the Global anomaly was down to 0.94C from its peak of 1.10C, despite slight warming in SH and the Tropics.

In January 2024 both Tropics and SH rose, resulting in Global Anomaly going higher. Since then Tropics have cooled from a  peak of 1.29C down to 0.84C.  SH also dropped down from 0.89C to 0.65C. NH lost ~0.4C as of March 2024, but has risen 0.2C over April to June. Despite that upward NH bump, the Global SST anomaly cooled further. 

In July there was a warming uptick in all regions, and now in August a new high in NH, along with the Tropics, bringing the global anomaly up almost to match August 2023. We now have three distinct warmings: the El Nino driven peak in 2015-16, the lesser peak in 2019-20, and now a stronger warming event in 2023-24.

Comment:

The climatists have seized on this unusual warming as proof their Zero Carbon agenda is needed, without addressing how implausible it is that CO2 warming the air could raise ocean temperatures. It is the ocean that warms the air, not the other way around. Recently Steven Koonin had this to say about the phenomenon confirmed in the graph above:

El Nino is a phenomenon in the climate system that happens once every four or five years. Heat builds up in the equatorial Pacific to the west of Indonesia and so on. Then when enough of it builds up, it surges across the Pacific and changes the currents and the winds. It was discovered and named in the 19th century as it surged toward South America. It is well understood at this point that the phenomenon has nothing to do with CO2.

Now people talk about changes in that phenomenon as a result of CO2, but it’s there in the climate system already, and when it happens it influences weather all over the world. We feel it when it gets rainier in Southern California, for example. So for the last three years we have been in the opposite of an El Nino, a La Nina, part of the reason people think the West Coast has been in drought.

It has now shifted in the last months to an El Nino condition that warms the globe and is thought to contribute to this Spike we have seen. But there are other contributions as well.  One of the most surprising ones is that back in January of 2022 an enormous underwater volcano went off in Tonga and it put up a lot of water vapor into the upper atmosphere. It increased the upper atmosphere of water vapor by about 10 percent, and that’s a warming effect, and it may be that is contributing to why the spike is so high.

A longer view of SSTs

The graph above is noisy, but the density is needed to see the seasonal patterns in the oceanic fluctuations.  Previous posts focused on the rise and fall of the last El Nino starting in 2015.  This post adds a longer view, encompassing the significant 1998 El Nino and since.  The color schemes are retained for Global, Tropics, NH and SH anomalies.  Despite the longer time frame, I have kept the monthly data (rather than yearly averages) because of interesting shifts between January and July. 1995 is a reasonable (ENSO neutral) starting point prior to the first El Nino. 

The sharp Tropical rise peaking in 1998 is dominant in the record, starting Jan. ’97 to pull up SSTs uniformly before returning to the same level Jan. ’99. There were strong cool periods before and after the 1998 El Nino event. Then SSTs in all regions returned to the mean in 2001-2. 

SSTs fluctuate around the mean until 2007, when another, smaller ENSO event occurs. There is cooling in 2007-8,  a lower peak warming in 2009-10, followed by cooling in 2011-12.  Again SSTs are average in 2013-14.

Now a different pattern appears.  The Tropics cooled sharply to Jan. '11, then rose steadily for four years to Jan. '15, at which point the most recent major El Nino took off.  But this time, in contrast to '97-'99, the Northern Hemisphere produced peaks every summer, pulling up the Global average.  In fact, these NH peaks appear every July starting in 2003, growing stronger to produce three massive highs in 2014, '15 and '16.  NH July 2017 was only slightly lower, and a fifth NH peak, still lower, came in Sept. 2018.

The highest summer NH peaks came in 2019 and 2020, only this time the Tropics and SH were offsetting rather than adding to the warming. (Note: these are high anomalies on top of the highest absolute temps in the NH.)  Since 2014 the SH has played a moderating role, offsetting the NH warming pulses. After September 2020 temps dropped until February 2021.  In 2021-22 there were again summer NH spikes, moderated in 2022 first by cooling Tropics and SH SSTs, then from October 2022 to January 2023 by deeper cooling in the NH and Tropics.  

Then in 2023 the Tropics flipped from below to well above average, while the NH produced a summer peak, extending into September, higher than in any previous year.  Despite El Nino driving the Tropics' January 2024 anomaly higher than the 1998 and 2016 peaks, the following months cooled in all regions, and the Tropics continued cooling in April, May and June along with a dropping SH, suggesting that the peak might have been reached, though in July and August NH warming has again pulled the global anomaly higher.

What to make of all this? The patterns suggest that in addition to El Ninos in the Pacific driving the Tropic SSTs, something else is going on in the NH.  The obvious culprit is the North Atlantic, since I have seen this sort of pulsing before.  After reading some papers by David Dilley, I confirmed his observation of Atlantic pulses into the Arctic every 8 to 10 years.

Contemporary AMO Observations

Through January 2023 I depended on the Kaplan AMO Index (not smoothed, not detrended) for N. Atlantic observations. But it is no longer being updated, and NOAA says they don't know its future.  So I find that the ERSSTv5 AMO dataset has current data.  It differs from Kaplan, which reported average absolute temps measured in the N. Atlantic.  “ERSST5 AMO  follows Trenberth and Shea (2006) proposal to use the NA region EQ-60°N, 0°-80°W and subtract the global rise of SST 60°S-60°N to obtain a measure of the internal variability, arguing that the effect of external forcing on the North Atlantic should be similar to the effect on the other oceans.”  So the values represent SST anomaly differences between the N. Atlantic and the Global ocean.
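
The Trenberth and Shea measure described above is simple arithmetic: subtract the global-ocean anomaly from the North Atlantic anomaly for each month. A minimal sketch in Python, using made-up monthly values (not real data):

```python
# AMO per Trenberth & Shea (2006): N. Atlantic SST anomaly (EQ-60N, 0-80W)
# minus the global-ocean SST anomaly (60S-60N).
# The monthly values below are illustrative, not actual ERSSTv5 data.

def amo_index(na_sst, global_sst):
    """Internal-variability AMO: N. Atlantic minus global SST anomaly, per month."""
    return [round(na - g, 2) for na, g in zip(na_sst, global_sst)]

north_atlantic = [0.9, 1.1, 1.4]   # hypothetical monthly anomalies, degC
global_ocean   = [0.4, 0.5, 0.6]   # hypothetical global-ocean anomalies, degC

print(amo_index(north_atlantic, global_ocean))  # [0.5, 0.6, 0.8]
```

With real ERSSTv5 gridded data the two input series would themselves be area-weighted regional averages, but the subtraction step is the same.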

The chart above confirms what Kaplan also showed.  As August is the hottest month for the N. Atlantic, its variability, high and low, drives the annual results for this basin.  Note also the peaks in 2010, the lows after 2014, and a rise in 2021. Then in 2023 the peak held at 1.4C before declining.  An annual chart below is informative:

Note the difference between blue/green years, beige/brown years, and purple/red years.  2010, 2021 and 2022 all peaked strongly in August or September.  1998 and 2007 were mildly warm.  2016 and 2018 matched or were cooler than the global average.  2023 started out slightly warm, then rose steadily to an extraordinary peak in July.  August to October were only slightly lower, but by December it had cooled by ~0.4C.

Now in 2024 the AMO anomaly started higher than any previous year, then leveled off for two months declining slightly into April.  Remarkably, May shows an upward leap putting this on a higher track than 2023, and rising slightly higher in June.  In July and August 2024 the anomaly declined and is now lower than the peak reached in 2023.

The pattern suggests the ocean may be demonstrating a stairstep pattern like that we have also seen in HadCRUT4. 

The purple line is the average anomaly for 1980-1996 inclusive, value 0.18.  The orange line is the average for 1980 through April 2024, value 0.39, which is also the average for the period 1997-2012. The red line is the average for 2013 through April 2024, value 0.66. As noted above, these rising stages are driven by the combined warming in the Tropics and NH, including both the Pacific and Atlantic basins.
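
The stairstep levels are just period averages of the monthly anomaly series. A sketch with a hypothetical two-step series (the step year and values are illustrative, not the actual HadCRUT or AMO data):

```python
# Stairstep levels as period averages of a monthly anomaly series.
# `monthly` maps (year, month) -> anomaly in degC; values here are made up.

def period_mean(monthly, start_year, end_year):
    """Average anomaly over [start_year, end_year] inclusive."""
    vals = [a for (y, m), a in monthly.items() if start_year <= y <= end_year]
    return sum(vals) / len(vals)

# Hypothetical two-step series: ~0.2C through 1996, ~0.4C afterwards.
series = {(y, m): (0.2 if y <= 1996 else 0.4)
          for y in range(1980, 2000) for m in range(1, 13)}

print(round(period_mean(series, 1980, 1996), 2))  # 0.2  (the lower stair)
print(round(period_mean(series, 1997, 1999), 2))  # 0.4  (the upper stair)
```

Each "stair" in the chart corresponds to one such period average drawn as a horizontal line over its date range.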

See Also:

2024 El Nino Collapsing

Curiosity:  Solar Coincidence?

The news about our current solar cycle 25 is that solar activity is hitting peak numbers now, 1-2 years sooner and higher than expected.  As livescience put it:  Solar maximum could hit us harder and sooner than we thought. How dangerous will the sun's chaotic peak be?  Some charts from spaceweatherlive look similar to these sea surface temperature charts.

Summary

The oceans are driving the warming this century.  SSTs took a step up with the 1998 El Nino and have stayed there with help from the North Atlantic, and more recently the Pacific northern “Blob.”  The ocean surfaces are releasing a lot of energy, warming the air, but eventually will have a cooling effect.  The decline after 1937 was rapid by comparison, so one wonders: How long can the oceans keep this up? And is the sun adding forcing to this process?

Space weather impacts the ionosphere in this animation. Credits: NASA/GSFC/CIL/Krystofer Kim

Footnote: Why Rely on HadSST4

HadSST is distinguished from other SST products because HadCRU (Hadley Climatic Research Unit) does not engage in SST interpolation, i.e., infilling estimated anomalies into grid cells lacking sufficient sampling in a given month. From reading the documentation and from queries to the Met Office, this is their procedure.

HadSST4 imports data from gridcells containing ocean, excluding land cells. From past records, they have calculated daily and monthly average readings for each grid cell for the period 1961 to 1990. Those temperatures form the baseline from which anomalies are calculated.

In a given month, each gridcell with sufficient sampling is averaged for the month and then the baseline value for that cell and that month is subtracted, resulting in the monthly anomaly for that cell. All cells with monthly anomalies are averaged to produce global, hemispheric and tropical anomalies for the month, based on the cells in those locations. For example, Tropics averages include ocean grid cells lying between latitudes 20N and 20S.

Gridcells lacking sufficient sampling that month are left out of the averaging, and the uncertainty from such missing data is estimated. IMO that is more reasonable than inventing data to infill. And it seems that the Global Drifter Array displayed in the top image is providing more uniform coverage of the oceans than in the past.
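
The procedure described in the last three paragraphs can be sketched in a few lines: each gridcell's monthly mean is compared against its own 1961-1990 baseline for that calendar month, under-sampled cells are dropped rather than infilled, and the surviving cell anomalies are averaged. All numbers and the sampling threshold below are illustrative, not actual HadSST4 values:

```python
# Sketch of the HadSST-style anomaly procedure described above.
# Under-sampled cells are excluded from the average, not infilled.

MIN_OBS = 5  # hypothetical sampling threshold per cell per month

def cell_anomaly(obs, baseline):
    """Monthly anomaly for one gridcell, or None if sampling is insufficient.
    `baseline` is that cell's 1961-1990 average for the same calendar month."""
    if len(obs) < MIN_OBS:
        return None
    return sum(obs) / len(obs) - baseline

def regional_anomaly(cells):
    """Average the anomalies of all cells that passed the sampling check."""
    anoms = [a for a in cells if a is not None]
    return sum(anoms) / len(anoms) if anoms else None

cells = [
    cell_anomaly([18.2, 18.4, 18.3, 18.5, 18.1], baseline=18.0),  # ~+0.3
    cell_anomaly([25.0, 25.2], baseline=24.8),                    # too few obs
    cell_anomaly([10.1, 10.0, 10.2, 10.1, 10.1], baseline=10.2),  # ~-0.1
]
print(round(regional_anomaly(cells), 2))  # 0.1
```

The same averaging is simply restricted to cells within 20N-20S for the Tropics, or to one hemisphere's cells for NH and SH figures. (Real HadSST4 also area-weights cells and estimates the uncertainty contributed by the missing ones.)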

USS Pearl Harbor deploys Global Drifter Buoys in Pacific Ocean


Fatal Flaw Discredits IPCC Science

By way of John Ray comes this Spectator Australia article A basic flaw in IPCC science.  Excerpts in italics with my bolds and added images.

Detailed research is underway that threatens to undermine the foundations of the climate science promoted by the IPCC since its First Assessment Report in 1992. The research is re-examining the rural and urban temperature records in the Northern Hemisphere that are the foundation for the IPCC's estimates of global warming since 1850. The research team has been led by Dr Willie Soon (a Malaysian solar astrophysicist associated with the Smithsonian Institution for many years) and two highly qualified Irish academics, Dr Michael Connolly and his son Dr Ronan Connolly. They have formed a climate research group, CERES-SCIENCE. Their detailed research will be a challenge for the IPCC's 7th Assessment Report, due to be released in 2029, as their results challenge the very foundations of IPCC science.

The climate warming trend published by the IPCC is a continually updated graph based on the temperature records of Northern Hemisphere land surface temperature stations dating from the mid-19th Century. The latest IPCC 2021 report uses data for the period 1850-2018. The IPCC's selection of Northern Hemisphere land surface temperature records is not in question and is justifiable: the Northern Hemisphere records provide the best database for this period, while the Southern Hemisphere land temperature records are not as extensive and are sparse for the 19th and early 20th Century. It is generally agreed that urban temperature data is significantly warmer than rural data in the same region because of an urban warming bias. This bias is due to night-time re-radiation of the daytime solar energy absorbed by concrete and bitumen, which leads to higher urban night-time temperatures than in, say, the nearby countryside. The IPCC acknowledges such a warming bias but alleges the effect is only 10 per cent and therefore does not significantly distort its published global warming trend lines.


Since 2018, Dr Soon and his partners have analysed the data from rural and urban temperature recording stations in China, the USA, the Arctic, and Ireland. The number of stations with reliable temperature records in these areas increased from very few in the mid-19th Century to around 4,000 in the 1970s before decreasing to around 2,000 by the 1990s. The rural temperature recording stations with good records peaked at 400 and are presently around 200.

Their analysis of individual stations needs to account for any variation in their exposure to the Sun due to changes in location, shadowing from the construction of nearby buildings, or nearby vegetation growth. The analysis of rural temperature stations is further complicated because over time many are encroached upon by growing cities. Consequently, the data from such stations needs to be shifted at certain dates from the rural temperature database to either an intermediate database or a full urban database. An accurate analysis of the temperature records of each recording station is therefore a time-consuming task.


This new analysis of 4,000 temperature recording stations in China, the USA, the Arctic, and Ireland shows a warming trend of 0.89ºC per century in the urban stations, 1.61 times the 0.55ºC per century trend in the rural stations. This difference is far more significant than the 10 per cent divergence between urban and rural stations alleged in the IPCC reports; a divergence explained by a potential flaw in the IPCC's methodology. The IPCC uses a technique called homogenisation that averages the rural and urban temperatures in a particular region. This method distorts the rural temperature records because over 75 per cent of the temperature records used in the homogenisation are from urban stations. So a methodology that attempts to statistically identify and correct biases in the raw data in effect blends urban readings into the rural dataset. The result is biased because it distorts the actual values of each rural temperature station. In contrast, Dr Soon and his coworkers avoided homogenisation, so the temperature trends they identify for each rural region are accurate: the rural data are not distorted by readings from nearby urban stations.
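
The urban-blending effect claimed here reduces to a weighted average: with roughly 75 per cent of stations urban, a blended trend sits much closer to the urban value than the rural one. The trends are the article's figures; the simple weighting below is a sketch of the claimed effect, not the IPCC's actual homogenisation algorithm:

```python
# Weighted-average sketch of the urban-blending effect described above.
# Trends are in degC per century, from the article; weighting is illustrative.

def blended_trend(urban, rural, urban_fraction):
    """Trend of a dataset blending urban and rural stations by share."""
    return urban_fraction * urban + (1 - urban_fraction) * rural

urban, rural = 0.89, 0.55
print(round(blended_trend(urban, rural, 0.75), 3))  # 0.805
```

A blend of 0.805ºC per century is far nearer the urban 0.89 than the rural 0.55, which is the distortion the article alleges.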


The rural temperature trend measured by this new research is 0.55ºC per century and indicates the Earth has warmed 0.9ºC since 1850. In contrast, the urban temperature trend measured by this new research is 0.89ºC per century and indicates a much higher warming of 1.5ºC since 1850. Consequently, a distorted urban warming trend has been used by the IPCC to quantify the warming of the whole of the Earth since 1850. The exaggeration is significant, as the urban temperature record database used by the IPCC only represents the temperatures on 3-4 per cent of the Earth's land surface area; an area less than 2 per cent of the Earth's total surface area. Dr Willie Soon and his research team are currently analysing the meta-history of 800 European temperature recording stations. When this is done, their research will be based on a very significant database of Northern Hemisphere rural and urban temperature records from China, the USA, the Arctic, Ireland, and Europe.
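
The article's cumulative figures follow directly from its per-century trends: warming since 1850 is roughly the trend multiplied by the elapsed centuries (1850-2018 is about 1.68 centuries):

```python
# Cumulative warming implied by a linear trend, using the article's figures.

def warming_since(trend_per_century, start=1850, end=2018):
    """Total warming in degC implied by a trend in degC/century."""
    return trend_per_century * (end - start) / 100

print(round(warming_since(0.55), 1))  # 0.9 degC -- the rural trend
print(round(warming_since(0.89), 1))  # 1.5 degC -- the urban-blend trend
```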

This new research has unveiled another flaw in the IPCC's temperature narrative, as the trend lines in its revised temperature datasets differ from those published by the IPCC. For example, the rural records now show a marked warming trend in the 1930s and 1940s, while there is only a slight warming trend in the IPCC dataset. The most significant difference is a marked cooling period in the rural dataset for the 1960s and 1970s that is almost absent in the IPCC's urban dataset. This latter divergence upsets the common narrative that rising carbon dioxide levels control modern warming trends. For, if carbon dioxide levels are the driver of modern warming, how can a higher rate of increase in carbon dioxide coincide with a cooling period in the 1960s and 1970s, while a lower rate of increase coincides with an earlier warming interval in the 1930s and 1940s? In other words, how can carbon dioxide levels increasing at 1.7 parts per million per decade cause a distinct warming period in the 1930s and 1940s, while a larger rate of increase of 10.63 parts per million per decade is associated with a distinct cooling period in the 1960s and 1970s? Consequently, the research of Willie Soon and his coworkers is discrediting not only the higher global warming trends specified in IPCC Reports, but also the theory that rising carbon dioxide levels explain modern warming trends; a lynchpin of IPCC science for the last 25 years.

Willie Soon and his coworkers maintain that climate scientists need to consider other possible explanations for recent global warming. They point to the Sun, but the IPCC maintains that variations in Total Solar Irradiance (TSI) occur over eons, not over shorter periods such as the last few centuries. For that reason, the IPCC points to changes in greenhouse gases as the most obvious explanation for global warming since 1850. In contrast, Willie Soon and his coworkers maintain there can be short-term changes in solar activity and refer, for example, to a period of no sunspot activity that coincided with the Little Ice Age in the 17th Century. They also point out there is still no agreed average figure for TSI despite 30 years of measurements taken by various satellites. Consequently, they contend research in this area is not settled.

The CERES-SCIENCE research project pioneered by Dr Willie Soon and the father-son Connolly team has questioned the validity of the high global warming trends for the 1850-present period that have been published by the IPCC since its first report in 1992. The research also queries the IPCC narrative that rising greenhouse gas concentrations, particularly carbon dioxide, are the primary driver of global warming since 1850. That narrative has been the foundation of IPCC climate science for the last 40 years. It will be interesting to see how the IPCC’s 7th Assessment Report in 2029 treats this new research that questions the very basis of IPCC’s climate science.

The paper is The Detection and Attribution of Northern Hemisphere Land Surface Warming (1850–2018) in Terms of Human and Natural Factors: Challenges of Inadequate Data. 

Abstract

A statistical analysis was applied to Northern Hemisphere land surface temperatures (1850–2018) to try to identify the main drivers of the observed warming since the mid-19th century. Two different temperature estimates were considered—a rural and urban blend (that matches almost exactly with most current estimates) and a rural-only estimate. The rural and urban blend indicates a long-term warming of 0.89 °C/century since 1850, while the rural-only indicates 0.55 °C/century. This contradicts a common assumption that current thermometer-based global temperature indices are relatively unaffected by urban warming biases.

Three main climatic drivers were considered, following the approaches adopted by the Intergovernmental Panel on Climate Change (IPCC)’s recent 6th Assessment Report (AR6): two natural forcings (solar and volcanic) and the composite “all anthropogenic forcings combined” time series recommended by IPCC AR6. The volcanic time series was that recommended by IPCC AR6. Two alternative solar forcing datasets were contrasted. One was the Total Solar Irradiance (TSI) time series that was recommended by IPCC AR6. The other TSI time series was apparently overlooked by IPCC AR6. It was found that altering the temperature estimate and/or the choice of solar forcing dataset resulted in very different conclusions as to the primary drivers of the observed warming.

Our analysis focused on the Northern Hemispheric land component of global surface temperatures since this is the most data-rich component. It reveals that important challenges remain for the broader detection and attribution problem of global warming: (1) urbanization bias remains a substantial problem for the global land temperature data; (2) it is still unclear which (if any) of the many TSI time series in the literature are accurate estimates of past TSI; (3) the scientific community is not yet in a position to confidently establish whether the warming since 1850 is mostly human-caused, mostly natural, or some combination. Suggestions for how these scientific challenges might be resolved are offered.

SCOTUS Must Stop Climate Extortion Lawfare

Jon Decker explains what's at stake in the case awaiting US Supreme Court consideration.  His Real Clear Markets article is The Supreme Court Must Stop Climate Extortion Schemes.  Excerpts in italics with my bolds and added images.

Chevron recently announced that it is moving out of California after almost a century and a half in the state. No wonder, since Sacramento sued the company, along with a number of other oil companies, for allegedly deceptive practices.

In their war on energy, progressive politicians increasingly turn to lawfare as another scheme to extract funds from productive citizens and dramatically reshape the economy. In the coming weeks, we’ll learn if the Supreme Court will stand up to them.

The critical case is Sunoco v. Honolulu, currently pending before SCOTUS. Honolulu is suing several oil companies, alleging their fossil fuel production caused significant damage to the city through rising sea levels and other climate-related infrastructure issues.

Crucially, these companies are being sued not for specific instances of purported environmental harm, but under “public nuisance” and “consumer fraud” laws for an alleged “multi-decadal campaign of deception” about the nature of their products. That matters because it lets progressive cities and states get around long-standing legal doctrine that kept climate cases in federal courts, where defendants are somewhat likelier to get a fair hearing.

Knowing that federal environmental law presents a more difficult path for litigation, activists instead went “venue shopping” with their climate agenda to deep blue states, knowing that a multi-jurisdictional assault would be unworkable, and potentially fatal, for American energy companies. The progressive politicians who support this approach not only see a boost for their careers and fundraising goals, but a potentially massive source of new government revenue.

Honolulu's suit, like a spate of others driven by ambitious progressive state AGs and prosecutors, models its assault on energy companies on the tobacco wars of the 90s. And they are unsurprisingly seeking an eye-watering settlement similar to what the tobacco industry was forced to pay (dollars that still flow to the government to this day).

But you don’t have to be a fossil fuel corporate booster or a climate change skeptic to recognize that these paydays will come at the expense of ordinary consumers and taxpayers, and of the economy as a whole.

Fossil fuels are ubiquitous in every sector, from agriculture and clothing to steel production, electricity, heating, and transportation. If this lawfare succeeds then every single one of these products and services—anything that takes energy as an input—becomes more expensive.

There’s also a kind of incoherence to the idea that a small handful of companies are alone to blame for perceived climate ills. What about the thousands of companies worldwide that are involved in the exploration, production, and distribution of fossil fuels – or the millions of companies that use fossil fuels to make their own goods and perform their services? That’s not even to mention the fact that the same few energy companies now being sued also play a critical role in the U.S. leading the world in CO2 emission reductions, through greater adoption of natural gas. How do the companies spearheading the natural gas renaissance figure into the supposed “deception” at work here?

The stakes here remind us that judges—and therefore elections—matter. Hawaii Supreme Court Chief Justice Mark Recktenwald, who ruled in favor of Honolulu and thereby triggered SCOTUS review, has been associated with the far-left Environmental Law Institute’s (ELI) Climate Judiciary Project, a group that “trains” and “educates” thousands of judges and government lawyers across the country to deliver legal outcomes favored by progressives.

The Supreme Court may be the last line of defense against
fringe activists dictating energy policy for the rest of us.

But of course, SCOTUS itself is in the crosshairs of left-wing judicial scheming, with top Democrats, including presidential nominee Kamala Harris, threatening to pack the Supreme Court if elected. No doubt with judges in the Mark Recktenwald mode.

If Honolulu succeeds in its case, and progressives in their larger campaign of lawfare, Americans can say Aloha to higher prices.

Postscript on Arguing Deception in These Cases

As noted above, the rationale for filing these cases in state rather than federal courts depends on claiming consumer fraud, i.e., that oil companies deceived the public while knowing their energy products were damaging.  A good rebuttal against the “Exxon Knew” fiction is provided (with my bolds) by Randal Utech at Master Resource:

To say that Exxon knew the truth back in the early 80s is a laughable fallacy. Effectively they built a primitive model that is characteristically similar to the erroneous modern climate models of today.

Fundamentally, their work is based on the poorly understood climate sensitivity (ECS) derived from radiative-convective models and GCMs. To their credit, they actually acknowledged the high degree of uncertainty in these estimations. Today, even Hausfather (2022 vs. 2019) is beginning to recognize that the climate sensitivity (ECS) is too high. CMIP6 is still running even hotter than CMIP5, using an ECS of 3 to 5°C rather than ~1.2°C as highlighted in Nic Lewis's 2022 study.

CMIP6 should have been better because it incorporated solar particle forcing (Matthes et al.), and as the models incorporate more elements of natural forcing (an active area of research, as we still do not have a predictive theory for climate), the effect is to highlight more underlying problems with the models.

However, Exxon's investigators fell into the same trap as today's climate modelers: they build models to history-match temperatures, and then, because they can create a model that appears to history-match, they assume it is telling them something. Truth? Anyone can create a model to do this, but that would never mean the model is correct. While today's models are much more complex, they are based on a complex set of non-linear equations, and the understanding of the various sources of nonlinearity is poor. This opens up wide degrees of uncertainty and wide opportunity for tuning. Furthermore, natural forcing is under-characterized and deemed inconsequential.

The contrived sense of accomplishment in history matching is spurious correlation for an infinitesimally small period of time. Using Exxon’s internal analysis of CO2 climate forcing is little more than a propaganda tool. Current climate models, much more sophisticated, face the same problem of unknown, false causality.

See Background Post:

19 State AGs Ask Supremes to Block Climate Lawsuits

Energy Revolution Not In The Cards

Kite & Key explains in above video Why the Odds Are Stacked Against Net Zero.  For those preferring to read I provide a text from the captions, though the video is entertaining along with great images, some of which are included with the text in italics with my bolds.

Overview

Are we at the beginning of the end of fossil fuels? That’s the theory advanced by an international coalition of politicians who aim to get us to net zero carbon emissions by the year 2050. Just one problem: Research from the experts in their own governments suggests it’s a nearly impossible task. Enthusiasts for net zero often say we’re on the cusp of an “energy revolution.”

And that theory has a big problem: Energy revolutions don’t happen — at least not in the way that politicians often describe. While it’s true that technological and economic factors sometimes change the energy mix — countries that get wealthier become less dependent on wood, for example — the broader trend in the history of the world’s energy consumption can be defined by three words: more, more, more.

In a power-hungry world, we keep adding new energy sources. But there’s rarely any subtraction. And, with global energy demand expected to increase by about 35% by 2050, it’s nearly impossible that we can get all the power we need from carbon-free sources. For instance, meeting the net zero goals would require the construction of over 9,000 nuclear plants by 2050. The number currently being built around the world? 59.

So, what will the future of energy really look like? Our video explores.

Transcription

It doesn’t happen that often. But every once in a while, a single generation witnesses a technological breakthrough that will change the world forever.
The printing press.
The beginning of human flight.
And, for our generation, an inevitable full scale revolution in clean energy…
…that’s running a little behind schedule…
…Ok, way behind schedule.

“The beginning of the end of the fossil fuel era.” That’s how the United Nations referred to the outcome of a 2023 climate change summit held in…the United Arab Emirates. Which is sort of like having the Prohibition Conference in Vegas. Nevertheless, delegates from throughout the world left the gathering having pledged that the world would transition away from fossil fuels and get the world to net zero carbon emissions by the year 2050.

Now, the rationale for this is clear enough. Leaders from around the globe are worried that without a shift over to carbon-free energy sources like wind, solar, hydro, and nuclear the world will face significant problems as a result of climate change.

But, regardless of why they’re doing this, the more important question is whether they can do it. Because here’s the thing about energy revolutions: they don’t happen. At least not in the way that the UN is imagining. To understand why, it’s worth looking at the history of the world’s energy consumption – which looks like this.

Go back a couple of centuries and the world basically ran on “traditional biomass”– -which is a fancy way of saying … wood. We burned a lot of wood and also … dung. Then in the mid 19th century, coal came into the picture in a big way. By the 20th century, we’re using tons of oil. And natural gas is a big factor too, especially as we cross into the 21st century, and fracking makes it both abundant and more affordable. As the years went by, we added low-carbon sources of energy as well, like nuclear, hydro, wind, and solar–though overall, they’re still a pretty small part of the picture.

Now, there are two important things to note about this chart. First, the history of the world’s energy consumption can be defined in three words: more, more, more. Which kind of makes sense. After all, pretty much everything that defines modern life involves a lot of energy. Between 1950 and 2022, for example, the population of the U.S. a little more than doubled. But in that same time period, our electricity use got 14 times larger.

And second, because of that “more and more, more” trend, the only things we’ve ever had that look like energy “revolutions” have been about adding new sources into the mix, not getting rid of existing ones as net zero goals propose.

Now, to be clear, that doesn't mean that nothing ever changes. In wealthier nations, the rise of cheaper natural gas has led to less coal usage, especially in the U.S. And poorer countries usually abandon traditional biomass as they get wealthier, because no advanced nation powers itself by burning wood. We use it for much more sophisticated purposes…like doing psychedelics in the Nevada desert.

But using a little less coal or wood amounts to relatively modest change, and, importantly, such changes are driven by cold, hard economic facts. By contrast, what the net zero goals entail is replacing all of this … with this … in just about 25 years. Based on little more than the fact that politicians just want it to happen.

To understand just how tall a task this is, it’s worth looking at what it would require to make it a reality. It’s estimated that meeting net zero goals would require deploying 2000 new wind turbines…
…every day … for the next 25 years. To give you some context for that, the U.S. builds about 3000 new wind turbines…
…a year.

Alternately, you could open one new nuclear plant every day for the next 25 years. For the record, that’s over 9,000 of them. And, also for the record, as of 2023, the number that were actually being built across the entire world was … 59.  And here in the U.S. anyway, it generally takes over a decade to build them.
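
The transcript's figures are easy to sanity-check: one plant per day for 25 years is indeed "over 9,000", and the daily turbine requirement dwarfs the current U.S. build rate:

```python
# Arithmetic behind the transcript's net zero figures.

# One new nuclear plant per day, for 25 years:
plants = 365 * 25
print(plants)  # 9125, i.e., "over 9,000"

# 2,000 turbines per day versus the ~3,000 per year the U.S. currently builds:
turbines_needed_per_year = 2000 * 365
us_built_per_year = 3000
print(round(turbines_needed_per_year / us_built_per_year))  # ~243 times the current U.S. pace
```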

And those are some of the reasons why what politicians promise about net zero and what the experts in their own governments say…don’t exactly match up. The government’s U.S. Energy Information Administration, or EIA, projects that by the year 2050, far from seeing a revolution in energy, America will be a little less reliant on coal, a little more so on renewables…and the rest of the picture looks pretty much the same as today.

And in fact, this is true for the entire world. The EIA ran seven different scenarios for what the world’s energy consumption could look like in 2050, and while all of them showed a significant increase in renewables … they also all showed a world that continued to get most of its energy from things like coal, oil, and natural gas. Not exactly “the beginning of the end of the fossil fuel era.”

The reason for all of this: We simply can’t take enormous quantities of energy offline in a world where it’s predicted that we’re going to need almost 35% more of it by the year 2050. For one thing, there are a lot of poor countries around the world who are going to need dramatically more energy to bring themselves up to even a fraction of our standards of living.

And for another, the technologies of the future require vast amounts of power. By the year 2030, it's estimated that computer usage around the world will take up as much as five times more of the world's electricity production than it did in 2020. The digital cloud we all use to store data already uses twice as much electricity as the entire nation of Japan. And with new energy-hungry technologies like AI on the way, things are only gonna move further in that direction.

Which means the real future of energy is probably: everything. Nuclear, natural gas, wind, and solar, oil, hydropower, coal. We’re going to need all of it. Probably not much wood though.

Depravity of Net Zero Agenda

From Daily Sceptic The Real ‘Climate Change Deniers’ Are Those Who Deny the Climate Changed Before We Started Burning Fossil Fuels, Says Geologist.  Excerpts in italics with my bolds and added images.

“We need to be resilient”

As a geologist, Wielicki undoubtedly has a better-than-average understanding of how our planet has evolved in the first place, and how its climate has been in a constant state of flux. Today’s climate science, however, links climate change primarily to the increasing amount of CO2 in the atmosphere, especially its anthropogenic component. Scientists who doubt or dispute this are labelled climate deniers. Wielicki points out that we know very well from Earth’s relatively recent history that major climate changes, such as the Medieval Warm Period (ca. 950-1250) or the subsequent Little Ice Age (ca. 14th to mid-19th century, precise timing depending on the location), occurred without any significant change in the proportion of CO2 in the atmosphere.

“If there’s anything that I argue, it’s that we need to be resilient. We should stop pretending that if we changed or lowered our emissions the climate would stop changing. That’s the true denial of climate right there,” Wielicki says. “What we need to accept is that regardless of the CO2 in the atmosphere, we are going to have climate change, and those shifts could occur over timescales of decades or centuries, and we should be prepared.”

And being prepared means we need access to cheap, reliable energy.

But the world is moving in the opposite direction under today’s political leadership. One of the main objectives is to fight CO2 emissions, chiefly by phasing out fossil fuels. According to Wielicki, however, the planners have not quite thought everything through. First of all, wind and solar power are unreliable substitutes because they can only produce when the conditions are right, i.e., when the wind blows and the sun shines. In addition, they need constant taxpayer support: precisely when they can produce and sell energy to the grid, the market price of electricity is low, because there is a lot of it at that particular time. So in order for investors to build up these capacities, they need price guarantees from governments or taxpayer subsidies. And on top of that, you still need to build additional controllable capacity to ensure that electricity is always available.

Wielicki also says that we need to understand that fossil fuels are not just liquids that we put in our cars at the petrol station but are essential to many aspects of our lives. “About four billion people on the planet are being fed off of agricultural crops that are being fertilised with synthetic fertilizers that are being created from fossil fuels. So you can’t just look at one side of a picture,” Wielicki explains, adding that the increases in atmospheric CO2 levels have actually also increased yields.

In addition, Wielicki says, it is worth considering that we would need to replace many of today’s fossil-fuel-based materials in everyday use, such as plastics, lubricants, oils, and chemicals, with new ones if we really want to phase out fossil fuels. “We have to ask what are the benefits that fossil fuels have given the society? And then let’s weigh that against the possible detrimental effects that these climate models argue will happen, but haven’t happened in the observable data yet,” he says.

The rise of the new green colonialism

Programmed into this whole Western orientation towards CO2 reduction, Wielicki says, is hypocrisy on several levels. For a start, it’s worth recognising that by reducing CO2 emissions in Europe or North America, we have effectively decided that we will not produce the goods we need here, but will produce them elsewhere in the world. “We pat ourselves on the back and say: look, we’ve lowered our CO2 emissions by this much! But all we’ve done is essentially offshored that industry to China and India. They do it dirtier. They have no regard for things like environmental policy. And so the global CO2 is going up faster than ever,” Wielicki notes.

While the big Asian countries are ramping up the use of coal to satisfy their energy appetite, many African countries don’t have a similar option. According to Wielicki, this is directly linked to the UN’s policy of not wanting these countries to increase their use of fossil fuels. This means, for example, that farm work that is done elsewhere by tractor still has to be done by many Africans with their hands. A large proportion of Africans also have little or no access to electricity. Food is cooked indoors on open fires, burning dung and wood.

The resulting smoke leads to respiratory illnesses and many people die as a result. All this could be easily avoided, according to Wielicki, by giving them access to propane bottles and gas cookers. “It might make them breathe easier at night. It might make their health better. But it’s going to increase the atmospheric CO2, and that is something we can’t have. These poor people must suffer and live in poverty because we need to save the planet. It’s so hypocritical,” Wielicki says.

What’s more, according to Wielicki, our hypocrisy lies in the fact that at the same time, we want to mine the minerals we need for our own energy transition, such as cobalt, in that very same Africa. “We’re switching to very mineral and energy intensive technologies like solar panels and electric vehicles. And we’re taking all of these raw materials from Africa,” he says. “I think this is going to be, probably, the legacy of this green revolution. I call it the new green colonialism. It’s unfortunately going to keep hundreds of millions of poor people in developing nations in poverty for decades longer than they ever needed to be,” Wielicki adds.

Again Wuhan Lab Coverup for Covid Virus Origin. Don’t Buy it!

Today we have a coordinated global release of a study claiming to disprove that the Covid-19 virus came from the Wuhan Institute of Virology (WIV).  An example is the article from the UK’s so-called Independent ‘Beyond reasonable doubt’ Covid pandemic originated at Wuhan market stall.  Excerpts in italics.

US and French researchers traced coronavirus to one stall in
Wuhan, China after analysing genetic samples

It is beyond “reasonable doubt” that the Covid-19 pandemic originated in a Chinese animal market, a new scientific study has found.

Researchers from the United States and France discovered one stall in Wuhan, China, was a hotspot for coronavirus after analysing hundreds of genetic samples collected by Chinese authorities in 2020.

The analysis, published in Cell, compiled a list of animals, including raccoon dogs, masked palm civets, hoary bamboo rats and Malayan porcupines, that were likely to have passed the virus to humans.

“It’s far beyond reasonable doubt that this is how it happened,” Professor Michael Worobey at the University of Arizona told the BBC, noting that other theories involve some “really quite fanciful absurd scenarios”.

Professor Kristian Andersen from Scripps Research in the US, added: “We find a very consistent story in terms of this pointing to the market as being the very likely origin of this particular pandemic.”

Color me skeptical.  Both Worobey and Andersen were carrying Fauci’s water in 2020-21 when he was lying to Congress about funding gain-of-function research at WIV.  Cui bono from this paper?  China, which destroyed all the evidence in 2020, yet now it supposedly appears in 2024.  And Fauci, who enlisted these and other researchers to publish fake studies in various journals to cancel the lab-leak explanation for the Covid pandemic.

For a description of the malfeasance perpetrated by these charlatans, see this exposé by Vanity Fair in 2022 “This Shouldn’t Happen”: Inside the Virus-Hunting Nonprofit at the Center of the Lab-Leak Controversy.

For a level-headed reaction to this latest coverup, here is a comment on the published Worobey et al. paper by Alexander Chervonsky, Professor of Pathology at the University of Chicago.  In italics with my bolds.