Wind Energy: Risky Business

The short video above summarizes the multiple engineering challenges involved in relying on wind and/or solar power.  Real Engineering produced The Problem with Wind Energy with excellent graphics.  For those who prefer reading, I made a transcript from the closed captions along with some key exhibits.

The Problem with Wind Energy

This is a map of the world’s wind resources. With it we can see why the middle plains of America have by far the highest concentration of wind turbines in the country. More wind means more power.

However, one small island off the mainland of Europe maxes out the average wind speed chart. Ireland is a wind energy paradise. During one powerful storm, wind energy powered the entire country for three hours, and it is not uncommon for wind to provide the majority of the country’s power on any given day. This natural resource has the potential to transform Ireland’s future.

But increasing wind energy on an energy grid comes with a lot of logistical problems, which are all the more difficult for a small, isolated island power grid. Mismanaged wind turbines can easily destabilize a power grid. From power storage to grid frequency stabilization, wind energy is a difficult resource to build a stable grid upon.

To understand why, we need to take these engineering marvels apart and see how they work.

Hidden within the turbine nacelle is a wonder of engineering. We cannot generate useful electricity with the low-speed, high-torque rotation of these massive turbine rotors. They rotate about 10 to 20 times a minute, while the generator needs a shaft spinning around 1,800 times per minute to work effectively. So a gearbox is needed between the rotor shaft and the generator shaft.

The gearboxes are designed in stages. Planetary gears are directly attached to the blades to convert the extremely high torque into faster rotation; this stage increases rotational speed by a factor of four. Planetary gears are used for high-torque conversion because they have more contact points, allowing the load to be shared among more gear teeth.

Moving deeper into the gearbox, a second stage of helical gears multiplies the rotational speed by six, and a third stage multiplies it again by four to achieve the 1,500 to 1,800 revolutions per minute needed for the generator.
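The three stage ratios described above simply multiply together. A quick sketch (stage ratios from the transcript; rotor speeds from the 10-20 rpm range quoted earlier) shows how a rotor near the upper end of that range reaches generator speed:

```python
# Overall gearbox ratio from the three stages described in the transcript:
# planetary stage (4x), helical second stage (6x), third stage (4x).
stage_ratios = [4, 6, 4]

overall = 1
for r in stage_ratios:
    overall *= r  # gear ratios in series multiply

rotor_rpm_low, rotor_rpm_high = 15, 20   # rotor speeds near the top of the 10-20 rpm range
print(overall)                    # 96:1 overall ratio
print(rotor_rpm_low * overall)    # 1440 rpm at the generator shaft
print(rotor_rpm_high * overall)   # 1920 rpm at the generator shaft
```

So a 96:1 gearbox brackets the 1,500-1,800 rpm target quoted above for typical rotor speeds.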

These heavy 15-tonne gearboxes have been a major source of frustration for power companies. Although they are designed for a 20-year lifespan, most don’t last more than seven years without extensive maintenance. This is not a problem exclusive to gearboxes in wind turbines, but changing a gearbox in your car is different from having a team climb over 50 meters up to replace a multi-million dollar gearbox. Extreme gusts of wind, salty conditions and difficult-to-access offshore turbines increase maintenance costs even more. The maintenance cost of wind turbines can reach almost 20% of the levelized cost of energy.

In the grand scheme of things, wind is still incredibly cheap. However, we don’t know the precise mechanisms causing these gearbox failures. We do know that the wear shows up as small cracks that form on the bearings, called white etching cracks after the pale material that surrounds the damaged areas. This problem only gets worse as turbines get bigger and more powerful, requiring even more gear stages to convert the incredibly high torque developed by the large-diameter rotors.

One way of avoiding all of these maintenance costs is to skip the gearbox and connect the blades directly to the generator. But a different kind of generator is needed, because the output frequency of the generator must match the grid frequency. The slower revolutions have to be compensated for with a very large-diameter generator that has many more magnetic poles, meaning a single revolution of the generator passes through more alternating magnetic fields, which increases the output frequency.

The largest wind turbine ever made, the Haliade-X, uses a direct drive system. You can see the large-diameter generator positioned directly behind the blades here. This rotor disc is 10 m wide with 200 poles and weighs 250 tons. But this comes with its own set of issues. Permanent magnets require neodymium and dysprosium, and China controls 90% of the supply of these rare earth metals. Unfortunately, trade negotiations and embargoes lead to fluctuating material costs that add extra risk and complexity to direct drive wind turbines. Ireland is testing these new wind turbines here in the Galway Wind Park. The blades were so large that the road passing underneath the Lough Atalia rail bridge, which I used to walk home from school every day, had to be lowered to facilitate the transport of the blades from the nearby docks. It takes years to assess the benefit of new energy technologies like this, but as wind turbines get bigger and more expensive, direct drive systems become more attractive.

The next challenge is getting the electricity created inside these generators to match the grid frequency. Because the speed of the wind constantly changes, the frequency of the current created by a permanent magnet generator varies with the speed of the shaft. If we wanted the generator to output the US standard 60 Hz, we could design a rotor with four poles (two north and two south) rotating 1,800 times per minute, resulting in 60 cycles per second. This has to be exact; mismatched frequencies lead to chaos on the grid, bringing the whole system down.
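The pole-count relation behind both the 4-pole example above and the 200-pole Haliade-X generator is the standard synchronous machine formula, f = poles × rpm / 120. A minimal sketch:

```python
def generator_frequency_hz(poles: int, rpm: float) -> float:
    """Electrical output frequency of a synchronous generator.

    Each pole pair produces one full AC cycle per shaft revolution,
    so f = (poles / 2) * (rpm / 60) = poles * rpm / 120.
    """
    return poles * rpm / 120

# The transcript's example: 4 poles (two north, two south) at 1,800 rpm.
print(generator_frequency_hz(4, 1800))   # 60.0 Hz, the US standard

# A many-pole direct drive machine (200 poles, as described for the
# Haliade-X) reaches the same 60 Hz at only 36 rpm at the shaft.
print(generator_frequency_hz(200, 36))   # 60.0 Hz
```

This is why direct drive generators need so many poles: the slow shaft is compensated by many more magnetic field reversals per revolution.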

Managing grid frequency is a 24/7 job. In the UK, grid operators had to watch a popular TV show themselves so they could bring pumped hydro stations online, because a huge portion of the population turned on kettles to make tea during the ad breaks. This increased the load on the grid, and without a matching increase in supply, the frequency would have dropped. The grid is very sensitive to these shifts; even a 1 Hz change can bring a lot of destruction.

During the 2021 freeze in Texas, the grid fell incredibly close to 59 Hz. It was teetering on the edge of a full-scale blackout that could have lasted for months. Many people blamed idle wind turbines alone for the crisis, but they were only partly to blame, as the natural gas stations also failed. Meanwhile, the Texas grid refuses to connect to the wider North American grid to avoid federal regulation, which makes Texas, rather like Ireland, an isolated power grid with a large percentage of wind energy.

The problem with wind energy is that it is incapable of raising the grid frequency if it drops. Wind turbines are nonsynchronous, and increasing the percentage of wind energy on the grid requires additional infrastructure to keep it stable. To understand what nonsynchronous means, we need to dive into the engineering of wind turbines once again. The first electric wind turbines connected to the grid were designed to spin the generator shaft at exactly 1,800 rpm. The prevailing winds dictated the size and shape of the blades: the aim was to have the tips of the blades move at around seven times the speed of the prevailing wind, and the tips were designed to stall if the wind speed picked up. This passive control kept the blades rotating at a constant speed.
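The tip-speed ratio of seven quoted above fixes the blade geometry once the prevailing wind and the rotor speed are chosen. A rough sketch (the 10 m/s wind speed is an assumption for illustration; the ratio and the 15 rpm figure come from the transcript's typical values):

```python
import math

# Assumed prevailing wind and the transcript's tip-speed ratio of 7
# imply a target blade-tip speed of 70 m/s.
wind_speed = 10.0        # m/s, assumed for illustration
tip_speed_ratio = 7.0    # from the transcript
tip_speed = wind_speed * tip_speed_ratio   # 70 m/s at the blade tips

# At a fixed 15 rpm, the blade radius needed to reach that tip speed:
rotor_rpm = 15
omega = rotor_rpm * 2 * math.pi / 60       # shaft speed in rad/s
radius = tip_speed / omega
print(round(radius, 1))   # ~44.6 m blade radius, i.e. a rotor ~89 m across
```

So the fixed-speed design effectively dictates the rotor diameter for a given wind regime, which is why the prevailing winds "dictated the size and shape of the blades."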

While this allowed the wind turbines to be connected straight to the grid, the constant rotational speed induced large forces on the blades. Gusts of wind would increase torque rapidly, a recipe for fatigue failure in the drivetrain. So, to extract more power, variable-speed wind turbines were introduced. Instead of fixed blades that depended on a stall mechanism for control, the blades were attached to the hub with massive bearings that allow them to change their angle of attack. This provided an active method of speed control, but now another problem emerged.

The rotor operated at different speeds, so the frequency coming from the generator was variable, and a wind turbine like this cannot be connected directly to the grid. Its power has to be passed through a two-stage converter: first a rectifier converts the varying AC to DC; then an inverter converts the DC back to AC at the correct frequency. This is done with electronic switches that rapidly turn on and off to create the oscillating wave.
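The two-stage conversion above can be caricatured in a few lines. This is a deliberately idealized toy model (the 43.7 Hz shaft frequency is an arbitrary assumed value, and the inverter is shown as a bare square-wave switcher; real designs PWM it into a much cleaner sine):

```python
import math

# Toy model of the AC-DC-AC stage: the generator's frequency wanders
# with the wind; the rectifier strips the frequency information; the
# inverter re-synthesizes a fixed grid-frequency wave by switching.

def generator_voltage(t: float, freq_hz: float) -> float:
    return math.sin(2 * math.pi * freq_hz * t)

def rectified(t: float, freq_hz: float) -> float:
    # Idealised full-wave rectifier: DC-side voltage is never negative.
    return abs(generator_voltage(t, freq_hz))

def inverter_output(t: float, grid_hz: float = 60.0) -> float:
    # Idealised inverter: electronic switches produce a square wave at
    # grid frequency, regardless of what the shaft is doing.
    return 1.0 if math.sin(2 * math.pi * grid_hz * t) >= 0 else -1.0

# Whatever the shaft does (here an assumed off-grid 43.7 Hz), the DC
# link stays one polarity and the output stays locked to 60 Hz.
samples = [rectified(t / 1000, 43.7) for t in range(100)]
print(min(samples) >= 0)      # True: rectifier output never goes negative
print(inverter_output(0.0))   # 1.0 at the start of the 60 Hz cycle
```

The key point for the next paragraph is that nothing about the shaft's speed survives this conversion: the grid only ever sees the inverter.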

We lose some power in this process, but the larger issue for the grid as a whole is that it removes the benefit of the wind turbine’s inertia. Slowing something heavy like a train is difficult because it has a lot of inertia. Power grids have inertia too. Huge rotating steam turbines connected directly to the grid are like those trains: they can’t be slowed down easily. So a grid with lots of large turbines, like nuclear and coal power turbines, can handle a large load suddenly appearing without a sudden drop in grid frequency. This smooths out sudden increases in demand and gives grid operators more time to bring on new power sources.

Wind turbines of course have inertia; they are large rotating masses. But those converters mean their masses aren’t connected directly to the grid, and so their inertia can’t help stabilize it. Solar panels suffer from the same problem, though they couldn’t add inertia anyway as they don’t move.

This is an issue for renewables that can become a critical vulnerability when politicians push to increase the percentage of renewables on a grid without considering the impact on grid stability. Additional infrastructure is needed to manage this problem, especially as older energy sources that do provide inertia, like coal power plants, begin to shut down.

Ireland had a creative solution to this problem. In 2023 the world’s largest flywheel, a 120-ton steel shaft rotating 3,000 times per minute, was installed at the site of a former coal power plant that already had all the infrastructure needed to connect to the grid. The flywheel takes about 20 minutes to get up to speed using grid power, and it is kept rotating constantly inside a vacuum to minimize power lost to friction. When needed, it can instantly provide power at the exact 50 Hz required by the grid. This flywheel provides inertia to keep the grid stable, but it’s estimated that Ireland will need five more of them to reach its climate goals with increasing amounts of wind energy.
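A rough sense of scale for such a flywheel comes from E = ½Iω². The mass (120 tons) and speed (3,000 rpm) are from the text; the radius is NOT given, so the 1 m solid-cylinder figure below is purely an assumption for illustration:

```python
import math

# Kinetic energy of a flywheel like the one described above.
mass_kg = 120_000                   # 120 tons, from the text
rpm = 3_000                         # from the text
radius_m = 1.0                      # NOT given in the text; assumed

omega = rpm * 2 * math.pi / 60      # ~314 rad/s
inertia = 0.5 * mass_kg * radius_m**2   # solid cylinder: I = 1/2 m r^2
energy_j = 0.5 * inertia * omega**2

print(round(energy_j / 1e9, 2))     # ~2.96 GJ stored under these assumptions
print(round(energy_j / 3.6e9, 2))   # ~0.82 MWh equivalent
```

Under these assumptions the stored energy is on the order of a single megawatt-hour: ample for seconds-to-minutes frequency support, but trivially small as bulk storage.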

But flywheels aren’t designed for long-term energy storage; they are purely for grid frequency regulation. Ireland’s next problem is more difficult to overcome. It is an isolated island with few interconnections to other energy grids. Trading energy is one of the best ways to stabilize a grid; larger grids are inherently more stable. Ideally Ireland could sell wind energy to France when winds are high and buy nuclear energy when they are low. Instead, right now Ireland needs redundancy in its grid, with enough natural gas power available to ramp up when wind energy is forecast to drop.

Currently Ireland has two interconnections with Great Britain but none to mainland Europe. That is hopefully about to change with the 700-megawatt interconnection currently planned with France. With Ireland’s average demand at 4,000 megawatts, this link could provide 17.5% of the country’s power needs when wind is low, or carry excess wind to France when it is high. This would allow Ireland to remove some of that redundancy from its grid, while making it worthwhile to invest in more wind power, since the excess would then have somewhere to go.

The final piece of the puzzle is developing long-term energy storage infrastructure. Ireland now has 1 gigawatt-hour of energy storage, but this isn’t anywhere close to the amount needed. Ireland’s government has plans to develop a hydrogen fuel economy for longer-term storage and energy export. In the national hydrogen plan it set out a pathway to become Europe’s main producer of green hydrogen, both for home use and for export. With Ireland’s abundance of fresh water (thanks to our absolutely miserable weather), its prime location along world shipping routes and its status as a hub for the third-largest airline in the world, Ireland is very well positioned to develop a hydrogen economy.

These transport methods aren’t easily decarbonized and will need some form of renewably sourced synthetic fuel, for which hydrogen will be needed, whether as hydrogen itself, ammonia or synthetic hydrocarbons. Synthetic hydrocarbons can be created using hydrogen and carbon dioxide captured from the air. Ireland’s winning combination of cheap renewable energy, abundant fresh water and a strategically advantageous location positions it well for this future renewable energy economy. Ireland plans to begin by generating hydrogen through electrolysis using wind energy that has been curtailed due to oversupply, which is essentially free energy.
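A back-of-envelope sense of what curtailed wind buys in hydrogen terms. All figures here are assumptions for illustration (practical electrolysers need roughly 50-55 kWh of electricity per kg of hydrogen; hydrogen's lower heating value is about 33.3 kWh/kg); none come from the plan itself:

```python
# Rough sizing of hydrogen from curtailed ("free") wind energy.
kwh_per_kg_h2 = 52.0     # assumed electrolyser specific energy, kWh/kg
curtailed_mwh = 1_000    # assumed: 1 GWh of otherwise-wasted wind

h2_kg = curtailed_mwh * 1_000 / kwh_per_kg_h2
h2_energy_mwh = h2_kg * 33.3 / 1_000   # chemical energy at ~33.3 kWh/kg LHV

print(round(h2_kg))          # ~19231 kg of hydrogen
print(round(h2_energy_mwh))  # ~640 MWh of chemical energy from 1,000 MWh in
```

Even with "free" curtailed electricity, roughly a third of the input energy is lost in the electrolysis step under these assumptions, which is why the economics hinge on the power being otherwise wasted.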

As the market matures, phase two of the plan is to finally begin tapping Ireland’s vast offshore wind potential exclusively for hydrogen production, with the lofty goal of 39 terawatt-hours of production by 2050 for use in energy storage, transportation fuel and industrial heating. Ireland is legally bound by EU law to achieve net zero emissions by 2050, but even without these lofty expectations it is in Ireland’s best interest to develop these technologies. Ireland has some of the most expensive electricity in Europe due to its reliance on fossil fuel imports, which rose drastically in price due to the war in Ukraine. Making this transition won’t be easy and there are many challenges to overcome, but Ireland has the potential not only to become more energy secure but also to develop its economy massively. Wind is a valuable resource by itself, but combined with its abundance of fresh water, Ireland could become one of the most energy-rich countries in the world.

Comment

That’s a surprisingly upbeat finish boosting Irish prospects to be an energy powerhouse, considering all of the technical, logistical and economic issues highlighted along the way.  Engineers know better than anyone how complexity often results in fragility and unreliability in practice. Methinks they are going to use up every last bit of Irish luck to pull this off. Of course the saddest part is that the whole transition is unnecessary, since more CO2 and warmth has been a boon for the planet and humankind.

See Also:

Replace Carbon Fuels with Hydrogen? Absurd, Exorbitant and Pointless

Big Batteries? Big Problems!

Battery Mad-Hattery

Viv Forbes’ article on this subject is at Canadian Free Press under the title First Aid for Flicker Power.  Excerpts in italics with my bolds and added images.

Wind and solar energy have a fatal flaw – intermittency

Big batteries bring big problems

Solar generators won’t run on moon-beams – they fade out as the sun goes down and stop whenever clouds block the sun. This happens at least once every day. But then at mid-day on most days, millions of solar panels pour so much electricity into the grid that the price plummets and no one makes any money.

Can your solar project weather a hailstorm?

Our green energy bureaucrats have the solution to green power failures – “Big Batteries”

Turbine generators are also intermittent – they stop whenever there is too little, or too much wind. In a wide flat land like Australia, wind droughts may affect huge areas for days at a time. This often happens when a mass of cold air moves over Australia, winds drop and power demand rises in the cold weather. All of this makes our power grid more variable, more fragile and more volatile. What do we do if we have a cloudy windless week?

More big batteries storing renewable energy to be built around Australia. The batteries will come online by 2025, with sites in Queensland, Victoria, New South Wales and South Australia.

Our green energy bureaucrats have the solution to green power failures – “Big Batteries”.

But big batteries bring more big problems – they have to be re-charged by the same intermittent green generators needed to keep the lights on, the trains running and the batteries charged in all those electric cars, trucks and dozers. And if anyone has been silly enough to build some power-hungry green hydrogen generators, they too will need more generation capacity and more battery backups. How long do we allow them to keep throwing our dollars into this green whirlpool?

Collecting dilute intermittent wind and solar energy from all over a big continent like Australia and moving it to coastal cities and factories brings another “green” energy nightmare – an expensive and intrusive spider-web of power-lines that are detested by landowners, degrade the environment, cause bushfires and are susceptible to damage from lightning, cyclones and sabotage.

They call them solar “farms” and wind “parks” – but they are neither farms nor parks – they are monstrous and messy wind and solar power plants. And these very expensive “green” assets sit idle, generating nothing, for most of most days.

In late July 2021, a fire broke out at the Victorian Big Battery in Moorabool, which was undergoing testing when the incident began. Image: CFA

Big batteries sitting in cities have proved a big fire risk and no one wants them next door. So our green “engineers” have another solution to these problems caused by their earlier “solutions” – “Mobile Batteries” (this is a worry – no one knows where they are – maybe they will be disguised as Mr Whippy ice cream vans?)

Near elimination of air pollution from diesel-electric freight trains by 2025 is now possible by retrofitting them with battery tender cars. BeyondImages/iStock

Train entrepreneurs want to build “batteries on tracks” – a train loaded with batteries, which parks beside a wind/solar energy factory until the batteries are full. Then the battery train trundles off to the nearest city to unload its electricity, preferably at a profit. They can also play the arbitrage market – buy top-up power around midday and sell into peak prices at breakfast and dinner times when the unreliable twins usually produce nothing useful. This will have the added advantage of sending coal and gas generators broke sooner by depressing peak prices. Once coal and gas are decimated, then the battery trains can make a real killing.
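The arbitrage play described above is easy to put in numbers. Every figure below is an assumption for illustration (battery-train capacity, prices and round-trip efficiency are not from the article):

```python
# Back-of-envelope battery arbitrage: buy at the midday glut, sell at
# the evening peak, lose some energy in between.
capacity_mwh = 100       # assumed battery-train capacity
buy_price = 20           # $/MWh at the midday solar glut (assumed)
sell_price = 150         # $/MWh at the dinner peak (assumed)
round_trip_eff = 0.85    # typical lithium-ion round-trip efficiency

energy_sold = capacity_mwh * round_trip_eff        # 85 MWh delivered
profit = energy_sold * sell_price - capacity_mwh * buy_price
print(profit)            # $10,750 per charge-discharge cycle, under these assumptions
```

The margin lives entirely in the midday-to-peak price spread, which is the article's point: the strategy profits precisely from, and then deepens, the volatility the unreliable generators create.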

But battery trains may be the perfect answer to supplying those energy-hungry AI data centres. Let’s start a pilot project and park a battery train beside the National AI Centre near CSIRO in Canberra.

“Big Batteries on Boats”

Lithium-ion batteries ‘keeping the fire alive’ on burning cargo ship carrying luxury cars 2022

A more ambitious idea is the BBB Plan – “Big Batteries on Boats”. It would work like this:

The Australian government places an order with China to build a fleet of electric boats (sail-assisted of course) that are filled with batteries (and lots of fire extinguishers). The batteries are charged with cheap coal-fired electricity at ports in China. They then sail to ports in Australia where the electricity is un-loaded into the grid whenever prices are high or blackouts loom.

Australian mines can profit from the iron ore used to make the boats, the rare minerals used to build the batteries and any Australian coal used by the Chinese power plants to charge the batteries.

This solution allows Australian politicians to go to world conferences boasting that Australia’s electricity is “Net Zero”, and more tourists can be enticed to visit our endangered industrial relics – coal mining and steam generator museums.

Of course there is another danger in the BBB solution – some entrepreneurs may load their boats with nuclear generators plus enough fuel on board for several decades of operation. Or they may even site a small nuclear reactor beside a closed coal power station and make use of all the ready-to-go power lines already in place.

Concerns over how transmission lines are ‘impacting’ prime land. Sky News Australia

This sort of dangerous thinking could well demolish another Queensland green dream – “CopperString” – a $5 billion speculation to build 840 km of new transmission line from Townsville to Mt Isa. We are not sure which way the power is expected to flow. They will probably not get there before the great copper mine at Mt Isa closes.

Why not just send a small nuke-on-a-train to Mt Isa?

Viv Forbes, Chairman, The Carbon Sense Coalition, has spent his life working in exploration, mining, farming, infrastructure, financial analysis and political commentary. He has worked for government departments, private companies and now works as a private contractor and farmer.

Viv has also been a guest writer for the Asian Wall Street Journal, Business Queensland and mining newspapers. He was awarded the “Australian Adam Smith Award for Services to the Free Society” in 1988, and has written widely on political, technical and economic subjects.

Green Baloney, Hype and Fairy Tales in Australia

Viv Forbes writes at Spectator Australia Battery baloney, hydrogen hype, and green fairy tales in Australia.  Excerpts in italics with my bolds and added images.  H/T John Ray at his blog Greenie Watch.

How low Australia has fallen… Our once-great BHP now has a ‘Vice President for Sustainability and Climate Change’, the number of Australian students choosing physics at high school is collapsing, and our government opposes nuclear energy while pretending we can build and operate nuclear submarines.

Our Green politicians want: ‘No Coal, No Gas, No Nuclear!’ while Our ABC, Our CSIRO, and Our Australian Energy Market Operator (AEMO) are telling us that wind and solar energy (plus a bit of standby gas, heaps of batteries, and new power lines) can power our homes, industries and the mass electrification of our vehicle fleet. This sounds like Australia’s very own great leap backwards.

There are two troublesome Green Energy Unions: the Solar Workers down tools every night and cloudy day, and the Turbine Crews stop work if winds are too weak or too strong. And wind droughts can last for days. The reliable Coal and Gas Crews spend sunny days playing cards, but are expected to keep their turbines revving up and down to keep stable power in the lines.

From Duck to Canyon Curve

Magical things are also expected from more rooftop solar. But panel-power has four huge problems:

♦  Zero solar energy is generated to meet peak demand at breakfast and dinner times.

♦  Piddling solar power is produced from many poorly oriented roof panels or from the weak sunshine anywhere south of Sydney.

♦  If too much solar energy pours into the network (say at noon on a quiet sunny Sunday), the grid becomes unstable. Our green engineers have the solution – be ready to charge people for unwanted power they export to the grid, or just use ‘smart meters’ to turn them off.

♦  More rooftop solar means less income and more instability for power utilities so they have to raise electricity charges. This cost falls heaviest on those with no solar panels, or no homes.

Magical things are also expected from batteries.

When I was a kid on a dairy farm in Queensland, I saw our kerosene lamps and beeswax candles replaced by electric lights. We had 16 X 2 volt batteries on the verandah and a big thumping diesel generator in the dairy.

It was a huge relief, years later, when power poles bringing reliable electricity marched up the lane to our house. All those batteries disappeared with the introduction of 24/7 coal power.

Batteries are never a net generator of power – they store energy generated elsewhere, incurring losses on charging and discharging.

There has to be sufficient generating capacity to meet current demand while also recharging those batteries. What provides electricity to power homes, lifts, hospitals, and trains and to recharge all those vehicle batteries after sundown on a still winter night? (Hint: Call the reliable coal/gas/nuclear crews.)

The same remorseless equations apply to all the pumped hydro schemes being dreamed up – every one is a net consumer of power once losses are covered and the water is pumped back up the hill.
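The "remorseless equations" amount to one line: to deliver E you must first generate E divided by the round-trip efficiency. The efficiencies below are typical published ranges, used here as assumptions:

```python
# Every storage scheme is a net consumer: delivered energy is always
# less than the energy used to charge it.
def generation_needed_mwh(delivered_mwh: float, round_trip_eff: float) -> float:
    """Energy that must be generated to get delivered_mwh back out."""
    return delivered_mwh / round_trip_eff

# To deliver 100 MWh from storage:
print(round(generation_needed_mwh(100, 0.75)))  # pumped hydro (~75%): 133 MWh in
print(round(generation_needed_mwh(100, 0.85)))  # li-ion battery (~85%): 118 MWh in
```

So every megawatt-hour drawn from storage silently demands extra generating capacity somewhere else on the grid, on top of current demand.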

Yet AEMO hopes we will install 16 times our current capacity of batteries and pumped hydro by 2050 – it sounds like the backyard steel plans of Chairman Mao or the Soviet Gosplan that constipated initiative in the USSR for 70 years. Who needs several Snowy 2 fiascos running simultaneously?

Mother Nature has created the perfect solar battery which holds the energy of sunlight for millions of years. When it releases that energy for enterprising humans, it returns CO2 for plants to the atmosphere from whence it came. It is called ‘Coal’.

‘Hydrogen’ gets a lot of hype, but it is an elusive and dangerous gas that is rarely found naturally. To use solar energy to generate hydrogen and to then use that hydrogen as a power source is just another silly scheme to waste water and solar energy. It always takes more energy to produce hydrogen than it gives back. Let green billionaires, not taxpayers, spend their money on this merry-go-round.
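The claim that hydrogen "always takes more energy to produce than it gives back" follows directly from chaining conversion efficiencies. The figures below are typical assumed values for illustration, not measurements from the article:

```python
# The hydrogen "merry-go-round" in numbers: electricity -> hydrogen ->
# electricity, with a loss at each conversion.
electrolysis_eff = 0.70   # electricity to hydrogen (assumed, typical)
fuel_cell_eff = 0.60      # hydrogen back to electricity (assumed, typical)

round_trip = electrolysis_eff * fuel_cell_eff
print(round(round_trip, 2))       # 0.42: ~58% of the energy is lost

# Energy that must be generated to recover 1 MWh from stored hydrogen:
print(round(1 / round_trip, 2))   # ~2.38 MWh in for every 1 MWh out
```

Efficiencies in series multiply, so even optimistic numbers at each step leave well under half the original energy, which is the article's objection in quantitative form.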

Who is counting the energy and capital consumed, and the emissions generated, to manufacture, transport, and install a continent being covered by ugly solar panels, bird slicers, high voltage power lines, access roads, and hydro schemes? Now they want to invade our shallow seas. Who is going to clean up this mess in a few years’ time?

As Jo Nova says:

‘No one wants industrial plants in their backyard, but when we have to build 10,000 km of high voltage towers, 40 million solar panels, and 2,500 bird-killing turbines – it’s in everyone’s backyard.’

With all of this planned and managed by the same people who gave us Pink Batts, Snowy 2 hydro, and the NBN/NDIS fiascoes, what could possibly go wrong?

Another big problem is emerging – country people don’t want power lines across their paddocks, whining wind turbines on their hills, and glittering solar panels smothering their flats. And seaside dwellers don’t want to hear or see wind turbines off their beaches. Even whales are confused.

The solution is obvious – build all wind and solar facilities in electorates that vote Green, Teal, and Labor. Those good citizens can then listen to the turbines turning in the night breezes and look out their windows to see shiny solar panels on every roof. This will make them feel good that they are preventing man-made global warming. Those electorates who oppose this silly green agenda should get their electricity from local coal, gas or nuclear plants.

What about the Net Zero targets?

At the same time as Australia struggles to generate enough reliable power for today, governments keep welcoming more migrants, more tourists, more foreign students and planning yet more stadiums, games, and circuses. None of this is compatible with their demand for Net Zero emissions.

Unlike Europe, the Americas, and Asia, Australia has no extension cords to neighbours with reliable power from nuclear, hydro, coal, or gas – we are on our own.

Australia has abundant resources of coal and uranium – we mine and export these energy minerals, but Mr Bowen, our Minister for Blackouts, says we may not use our own coal and uranium to generate future electricity here. Someone needs to tell him that no country in the world relies solely on wind, solar, and pumped hydro. Germany tried but soon found it needed French nuclear, Scandinavian hydro and imported gas; at least 20 coal-fired German power plants are being resurrected or extended past their closing dates to ensure Germans have enough energy to get through the winter.

Australia is the only G20 country in which nuclear power is illegal (maybe no one has told green regulators that we have had a nuclear reactor at Lucas Heights in Sydney since 1958). Australia is prepared to lock navy personnel beside nuclear power plants in our new nuclear-powered submarines but our politicians forbid nuclear power stations in our wide open countryside.

More CO2 in the atmosphere brings great benefits to life on Earth. If man adds to it, the oceans dissolve a swag of it, and what stays in the atmosphere is gratefully welcomed by all plant life.

In 2023, Australia added just 0.025 ppm to the 420 ppm in today’s atmosphere. Most of this probably dissolved in the oceans. If we in Australia turned everything off tomorrow, the climate wouldn’t notice, but our plant life would, especially those growing near power stations burning coal or gas and spreading plant food.

Climate has always changed, and a warm climate has never been a problem on Earth. It is cold that kills. Especially during blackouts.

Scientists Say: Net Zero Wins Nearly Zero Results

Chris Morrison explains at his Daily Sceptic article Net Zero Will Prevent Almost Zero Warming, Say Three Top Atmospheric Scientists.  Excerpts in italics with my bolds and added images.

Recent calculations by the distinguished atmospheric scientists Richard Lindzen, William Happer and William van Wijngaarden suggest that if the entire world eliminated net carbon dioxide emissions by 2050 it would avert warming of an almost unmeasurable 0.07°C. Even assuming the climate modelled feedbacks and temperature opinions of the politicised Intergovernmental Panel on Climate Change (IPCC), the rise would be only 0.28°C. Year Zero would have been achieved along with the destruction of economic and social life for eight billion people on Planet Earth. “It would be hard to find a better example of a policy of all pain and no gain,” note the scientists. [Paper is Net Zero Averted Temperature Increase  by Lindzen, Happer and van Wijngaarden.]

In the U.K., the current General Election is almost certain to be won by a party that is committed to outright warfare on hydrocarbons. The Labour party will attempt to ‘decarbonise’ the electricity grid by the end of the decade without any realistic instant backup for unreliable wind and solar except oil and gas. Britain is sitting on huge reserves of hydrocarbons but new exploration is to be banned. It is hard to think of a more ruinous energy policy, but the Conservative governing party is little better. Led by the hapless May, a woman over-promoted since her time running the education committee on Merton Council, through to Buffo Boris and Washed-Out Rishi, its leaders have drunk the eco Kool-Aid fed to them by the likes of Roger Hallam, Extinction Rebellion and the Swedish Doom Goblin. Adding to the mix in the new Parliament will be a likely 200 new ‘Labour’ recruits with university degrees in buggerallology and CVs full of parasitical non-jobs in the public sector.

With hardly any science knowledge between them, they even believe that they can spend billions of other people’s money to capture CO2 – perfectly good plant fertiliser – and bury it in the ground. As a privileged, largely middle class group, they have net zero understanding of how a modern industrial society works, feeds itself and creates the wealth that pays their unnecessary wages. All will be vying to save the planet and stop a temperature rise that is barely a rounding error on any long-term view.

They plan to cull the farting cows, sow wild flowers where food once grew, take away efficient gas boilers and internal combustion cars, and stop granny visiting her grandchildren in the United States.

On a wider front, banning hydrocarbons will remove almost everything from a modern society including many medicines, building materials, fertilisers, plastics and cleaning products. It might be shorter and easier to list essential items where hydrocarbons are absent than produce one where they are present. Anyone who dissents from their absurd views is said to be in league with fossil fuel interests, a risible suggestion given that they themselves are dependent on hydrocarbon producers to sustain their enviable lifestyles.

Unlike politicians the world over who rant about fire and brimstone, Messrs Lindzen, Happer and van Wijngaarden pay close attention to actual climate observations and analyses of the data. Since it is impossible to determine how much of the gentle warming of the last two centuries is natural or caused by higher levels of CO2, they assume a ‘climate sensitivity’ – the rise in temperature when CO2 doubles in the atmosphere – of 0.8°C. This is about a quarter of IPCC estimates, which lack any proof. Understandably the IPCC does not make a big issue of this missing proof at the heart of the so-called 97% anthropogenic ‘consensus’.

The 0.8°C estimate is based on the idea that greenhouse gases like CO2 ‘saturate’ at certain levels, so that their incremental warming effect diminishes logarithmically. This idea has the advantage of explaining climate records stretching back 600 million years, since CO2 levels have been up to 10-15 times higher in the past than the extremely low levels observed today. There is little if any long-term causal link between temperature and CO2 over time. In the immediate past record there is evidence that CO2 rises after natural increases in temperature, as the gas is released from warmer oceans.

The argument that the Earth has a ‘boiling’ problem caused by the small CO2 contribution humans make by using hydrocarbons is ‘settled’ only by an invented political crisis; it is backed by no reliable observational data. Most of the fear-mongering is little more than a circular exercise using computer models: improbable opinions fed in, and improbable opinions fed out.

The three scientists use a simple formula based on base-two logarithms to assess the CO2 influence on the atmosphere, grounded in decades of laboratory experiments and atmospheric data collection. They demonstrate how trivial the effect on global temperature will be if humanity stops using hydrocarbons. After years wasted listening to Greta Thunberg, the message is starting to penetrate the political arena. In the United States, the Net Zero project is dead in the water if Trump wins the Presidential election. In Europe, the ruling political elites, both national and supranational, are retreating on their Net Zero commitments. Reality is starting to dawn, and alternative political groupings are emerging to challenge the comfortable insanity of Net Zero virtue signalling. In New Zealand, the nightmare of the Ardern years is being expunged with a rollback of Net Zero policies ahead of possible electricity blackouts.
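For those who prefer numbers, the base-two logarithm approach mentioned above can be sketched in a few lines. The functional form ΔT = S × log2(C/C0) and the 400 ppm baseline are my illustrative assumptions; only the 0.8°C-per-doubling sensitivity figure comes from the text.

```python
import math

def warming_from_co2(c_ppm, c0_ppm=400.0, sensitivity=0.8):
    """Warming (deg C) relative to a baseline concentration, assuming a
    logarithmic response: each doubling of CO2 adds `sensitivity` degrees.
    The 0.8 C per doubling is the estimate cited above; the 400 ppm
    baseline is an illustrative assumption."""
    return sensitivity * math.log2(c_ppm / c0_ppm)

# A full doubling (400 -> 800 ppm) yields the whole sensitivity:
print(warming_from_co2(800))            # 0.8
# A 50% increase yields much less, reflecting the logarithmic falloff:
print(round(warming_from_co2(600), 2))  # 0.47
```

Under this form, going from 400 to 800 ppm adds the same increment as going from 800 to 1600 ppm, which is why the marginal effect of further emissions keeps shrinking.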

Only in Britain it seems are citizens prepared to elect a Government obsessed with self-inflicted poverty and deindustrialisation. The only major political grouping committed to scrapping Net Zero is the Nigel Farage-led Reform party and although it could beat the ruling Conservatives into second place in the popular vote, it is unlikely to secure many Parliamentary seats under the U.K.’s first-past-the-post electoral system. Only a few years ago the Labour leader Sir Keir Starmer, who thinks some women have penises, and his imbecilic Deputy Leader Angela Rayner, were bending the knee to an organisation that wanted to cut funding for the police and fling open the borders. The new British Parliament will have plenty of people who still support Net Zero and assorted woke woo woo, and the great tragedy is that they will still be found across most of the represented political parties.

See Also 

Delusions of Davos and Dubai

 

2024 Update: Fossil Fuels ≠ Global Warming


Previous posts addressed the claim that fossil fuels are driving global warming. This post updates that analysis with the latest (2023) numbers from Energy Institute and compares World Fossil Fuel Consumption (WFFC) with three estimates of Global Mean Temperature (GMT). More on both these variables below. Note: Previously these same statistics were hosted by BP.

WFFC

2023 statistics are now available from the Energy Institute for international consumption of Primary Energy sources in its Statistical Review of World Energy.

The reporting categories are:
Oil
Natural Gas
Coal
Nuclear
Hydro
Renewables (other than hydro)

Note: Last year the Energy Institute began using Exajoules in place of MToe (Million Tonnes of oil equivalent). It is logical to use an energy metric which is independent of the fuel source. OTOH renewable advocates have no doubt pressured EI to stop using oil as the baseline, since their dream is a world without fossil fuel energy.

From the BP conversion table, 1 exajoule (EJ) = 1 quintillion joules (1 x 10^18). Oil products vary from 41.6 to 49.4 gigajoules (10^9 joules) per tonne.  Comparing this annual report with previous years shows that global Primary Energy (PE) in MToe is roughly 24 times the same amount in Exajoules.  The conversion factor at the macro level varies from year to year depending on the fuel mix. The graphs below use the new metric.
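As a sketch of the arithmetic, assuming the standard conversion of 1 tonne of oil equivalent = 41.868 GJ (the conventional midpoint; the post quotes a 41.6-49.4 GJ range depending on the oil product):

```python
# Converting between the Energy Institute's new exajoule metric and
# the old Mtoe metric, using the conventional 1 toe = 41.868 GJ.
GJ_PER_TOE = 41.868
GJ_PER_EJ = 1e9                      # 1 EJ = 10^18 J = 10^9 GJ
MTOE_PER_EJ = GJ_PER_EJ / (GJ_PER_TOE * 1e6)

def ej_to_mtoe(ej):
    return ej * MTOE_PER_EJ

def mtoe_to_ej(mtoe):
    return mtoe / MTOE_PER_EJ

print(round(MTOE_PER_EJ, 1))   # 23.9 -- the "roughly 24 times" factor
print(round(ej_to_mtoe(505)))  # 2023 WFFC of 505 EJ in Mtoe
```

The macro-level factor drifts from year to year because the real fuel mix is not all oil, but the 24:1 rule of thumb falls straight out of the toe definition.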

This analysis combines the first three, Oil, Gas, and Coal for total fossil fuel consumption world wide (WFFC).  The chart below shows the patterns for WFFC compared to world consumption of Primary Energy from 1965 through 2023.

The graph shows that global Primary Energy (PE) consumption from all sources has grown continuously over nearly six decades. Since 1965, oil, gas and coal (FF, sometimes termed “Thermal”) have averaged 88% of PE consumed, ranging from 93% in 1965 to 82% in 2023.  Note that in 2020, PE dropped 21 EJ (4%) below 2019 consumption, then increased 31 EJ in 2021.  WFFC for 2020 dropped 24 EJ (5%), then in 2021 gained back 26 EJ to slightly exceed its 2019 level. Over the 1965-2023 period the net changes, all increases, were:

Oil 203%
Gas 536%
Coal 182%
WFFC 246%
PE 297%
Global Mean Temperatures

Everyone acknowledges that GMT is a fiction since temperature is an intrinsic property of objects, and varies dramatically over time and over the surface of the earth. No place on earth determines “average” temperature for the globe. Yet for the purpose of detecting change in temperature, major climate data sets estimate GMT and report anomalies from it.

The UAH record consists of satellite-era global temperature estimates for the lower troposphere, a layer of air from 0 to 4 km above the surface. HadSST estimates sea surface temperatures from oceans covering 71% of the planet. HadCRUT combines HadSST estimates with records from land stations whose elevations range up to 6 km above sea level.

Both GISS LOTI (land and ocean) and HadCRUT4 (land and ocean) use 14.0 Celsius as the climate normal, so I will add that number back into the anomalies. This is done without claiming any validity other than to achieve a reasonable measure of the magnitude of the observed fluctuations. [Note: HadCRUT4 was discontinued after 2021 in favor of HadCRUT5.]

No doubt global sea surface temperatures are typically higher than 14C, more like 17 or 18C, and of course warmer in the tropics and colder at higher latitudes. Likewise, the lapse rate in the atmosphere means that air temperatures both from satellites and elevated land stations will range colder than 14C. Still, that climate normal is a generally accepted indicator of GMT.

Correlations of GMT and WFFC

The next graph compares WFFC to GMT estimates over the decades from 1965 to 2023 from HadCRUT4, which includes HadSST4.

Since 1965 the increase in fossil fuel consumption is dramatic and monotonic, steadily increasing by 246% from 146 to 505 exajoules.  Meanwhile the GMT record from HadCRUT shows multiple ups and downs with an accumulated rise of 0.8C over 58 years, 6% of the starting value.
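The percentage figures above are simple endpoint comparisons; a minimal check, using the 146 and 505 EJ values quoted in the text:

```python
def pct_increase(start, end):
    """Percent increase from a starting to an ending value."""
    return (end / start - 1) * 100

# WFFC grew from 146 EJ (1965) to 505 EJ (2023):
print(round(pct_increase(146, 505)))   # 246
```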

The graph below compares WFFC to GMT estimates from UAH6 and HadSST4 for the satellite era from 1980 to 2023, a period of 44 years.

In the satellite era WFFC has increased at a compounded rate of 1.5% per year, for a total increase of 97% since 1979. At the same time, SST warming amounted to 0.76C, or 5% of the starting value.  UAH warming was 0.85C, or 6% up from 1979.  The temperature compounded rate of change is 0.1% per year, an order of magnitude less than WFFC.  Even more obvious is the 1998 El Nino peak and flat GMT since.
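The compounded rates quoted above can be reproduced with the standard CAGR formula. The endpoints come from the text; treating WFFC's 97% rise as a growth factor of 1.97 is my shorthand.

```python
def cagr(start, end, years):
    """Compound annual growth rate between two endpoint values."""
    return (end / start) ** (1 / years) - 1

# WFFC: +97% over the 44 years since 1979 -> about 1.5% per year
print(round(cagr(1.0, 1.97, 44) * 100, 2))    # 1.55

# Temperature: +0.85 C on a ~14 C baseline over the same span
print(round(cagr(14.0, 14.85, 44) * 100, 2))  # 0.13
```

An order of magnitude separates the two rates, which is the comparison being made.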

Summary

The climate alarmist/activist claim is straightforward: burning fossil fuels makes measured temperatures warmer. The Paris Accord further asserts that by reducing human use of fossil fuels, further warming can be prevented.  Those claims do not bear up under scrutiny.

It is enough for simple minds to see that two time series are both rising and to think that one must be causing the other. But both scientific and legal methods assert causation only when the two variables are both strongly and consistently aligned. The above shows a weak and inconsistent linkage between WFFC and GMT.

Going further back in history shows even weaker correlation between fossil fuels consumption and global temperature estimates:

wfc-vs-sat

Figure 5.1. Comparative dynamics of the World Fuel Consumption (WFC) and Global Surface Air Temperature Anomaly (ΔT), 1861-2000. The thin dashed line represents annual ΔT, the bold line—its 13-year smoothing, and the line constructed from rectangles—WFC (in millions of tons of nominal fuel) (Klyashtorin and Lyubushin, 2003). Source: Frolov et al. 2009

In legal terms, as long as there is another equally or more likely explanation for the set of facts, the claimed causation is unproven. The more likely explanation is that global temperatures vary due to oceanic and solar cycles. The proof is clearly and thoroughly set forward in the post Quantifying Natural Climate Change.

Footnote: CO2 Concentrations Compared to WFFC

Contrary to claims that rising atmospheric CO2 consists of fossil fuel emissions, consider the Mauna Loa CO2 observations in recent years.

 

Despite the drop in 2020 WFFC, atmospheric CO2 continued to rise steadily, demonstrating that natural sources and sinks drive the amount of CO2 in the air.

See also: Nature Erases Pulses of Human CO2 Emissions

Temps Cause CO2 Changes, Not the Reverse

Stress Testing California’s Grid Batteries

Lots of PR is coming out of the Golden State regarding great strides in building the battery capacity required by the green dream of 100% carbon-free electrical power.

From Business Insider: 

Batteries briefly became the biggest source of power in California twice in the past week.

The first time — Tuesday last week around 8:10 p.m. PT, according to GridStatus.io — batteries reached a record peak output of 6,177 megawatts. For about two hours, that made electricity generated earlier and stored in batteries the single largest source of power in the Golden State, eclipsing real-time production from natural gas, nuclear, renewable sources like wind and solar, and all other sources of energy.

It happened again on Sunday evening, this time for a few hours around 7:10 p.m. PT, per data from GridStatus.io. In that instance, which broke Tuesday’s record, batteries reached a peak output of 6,458 megawatts.

Battery storage has become a key part of the push to produce more electricity using renewable sources. By connecting huge, rechargeable batteries to power grids, power utilities can store energy generated during the day by solar panels and wind turbines.

Augmentation at the Vistra Moss Landing Energy Storage Facility in California has been completed, with the world’s biggest battery energy storage system (BESS) now at 400MW / 1,600MWh. The batteries are housed in repurposed gas turbine halls. Image: Vistra Energy.

Note the BESS ratings for power (MW) and energy output (MWh).  In this case, Moss Landing has a maximum power of 400MW and a duration of 4 hours, or 1600MWh.  Such a factor of 4 seems typical for large scale BESS in California.  It also means that for a single peak hour demand, Moss Landing can only supply 400MW for that hour.  If more energy is needed, it will have to come from somewhere else.
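The relationship between the two ratings is just energy divided by power; a minimal sketch using the Moss Landing figures above:

```python
def duration_hours(energy_mwh, power_mw):
    """Hours a battery can sustain its full rated power output."""
    return energy_mwh / power_mw

# Moss Landing: 1,600 MWh of energy behind 400 MW of power
print(duration_hours(1600, 400))   # 4.0
```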

Then in April we have the news from Gov. Newsom’s office: California Achieves Major Clean Energy Victory: 10,000 Megawatts of Battery Storage.

Let’s Apply Some Context to These Cheerful Reports

The California Energy Commission produced its electricity forecast end of 2022:

Note the graph projects hourly electricity demand, which peaks during hour 19.  Demand that hour approaches and then exceeds 50,000 MW, i.e. 50k MWh over the hour.

CalMatters raises concerns about state policy to phase out ICE vehicles in favor of EVs.

Again, in 2022 peak demand draws about 50k MW from the grid, with less than 1% going to charging EVs.  That share is projected to grow 10 times higher in 13 years.

Summary

The excitement is about batteries supplying 6,500 MW for a couple of hours when peak demand is 50,000 MW.  The glorious achievement is building battery capacity up to 10,000 MW.  It doesn’t add up.
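Putting the numbers from the post side by side makes the gap explicit (the 4-hour duration is the typical factor noted earlier; the rest are the figures quoted above):

```python
peak_demand_mw = 50_000        # CEC hourly peak (hour 19)
record_output_mw = 6_500       # record battery output, rounded
installed_battery_mw = 10_000  # capacity milestone announced in April

# Share of the evening peak the record battery output covered:
print(f"{record_output_mw / peak_demand_mw:.0%}")   # 13%

# Even 10,000 MW of 4-hour batteries store only 40,000 MWh --
# less than a single hour of peak-level demand:
storage_mwh = installed_battery_mw * 4
print(storage_mwh / peak_demand_mw)                 # 0.8
```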


Biden’s Dangerous NatGas Game

Tristan Abbey exposes the feds’ war on NatGas in his Real Energy article Joe Biden’s Dangerous Natural Gas Game.  Excerpts in italics with my bolds and added images.

If the devil is in the details, bureaucracy is hell on earth. Though terrain familiar to the Biden administration, Republicans must prepare to navigate it.

Witness the debacle over liquefied natural gas exports, wherein the White House, by “pausing” most new approvals, has catapulted the energy security of key U.S. allies straight into the buzzsaw of its climate ambitions. (The category of exports that will continue to be authorized is tiny.) The Department of Energy claims that a multifactor impact study due in early 2025 is required to determine whether and how the moratorium will be lifted.

For the 58 year period, the net changes were: Oil 194%, Gas 525%, Coal 178%, WFFC 239%, Primary Energy 287%  Source: Energy Institute stats 2022

Under a certain conception of executive power, it should be simple enough for a second-term Trump administration to end this national embarrassment by pressing “resume” on the authorization process. But as analysts at the Center for Strategic and International Studies have suggested, merely setting aside the study could provide a basis, however tenuous, for future litigation. In the modern administrative state, it is easier to open than shut the procedural door to delays.

Previous administrations have already published macroeconomic impact studies on the question of LNG exports from the U.S. The Obama administration paused its authorizations until its first study was released in December 2012, for example—curious timing, considering the election the previous month and the study’s actual completion in July of that year. Virtually every scenario in every study, including additional analysis in 2015 and 2018, has found net benefits to accrue.

It’s possible reopening the Obama playbook was the Biden team’s plan all along. After all, Secretary Granholm didn’t commission a new study in 2021, or in 2022, or in 2023. By waiting so long, the DOE can now claim that the cumulative volume of its authorizations is approaching the upper limit of the range that the 2018 study examined. Under the duplicity theory, approvals resume under a second Biden term as soon as the study is released and the election fades away.

But maybe the administration doesn’t even have a plan. It could be sheer incompetence. Gas exports offend the sensibilities of the Democratic base, but Appalachian swing states reap the economic rewards and European allies are desperate to detach themselves from Russian energy. Political operators will try in vain to triangulate even if it is impossible. We can imagine them now, hunched over the asphalt between the West Wing and the Eisenhower building, desperately chalking angles with a compass and ruler.

Appliances are just the thin end of the wedge against NatGas.

More ominously, Energy Secretary Granholm may be laying the groundwork for a Kafkaesque application process designed to punish an industry this administration has only ever pretended to tolerate. The fact that DOE’s approving authority is now housed in the Office of Resource Sustainability is suggestive, as is the Fiscal Year 2025 budget request to triple programmatic funding for export authorizations, primarily in the form of “anticipated studies and environmental reviews.”

In any event, undoing what the Biden team has done will take careful work by a putative second-term Trump administration. Putting the matter to rest on a more permanent basis will require legislative action, chiefly amending the Natural Gas Act signed into law by President Franklin Roosevelt in 1938. In the meantime, “death by study” works both ways.

 

The Bigger Picture from Master Resource

The Fossil-fuel Era: Still Young

“Oil, gas, and coal are ascending despite determined government efforts to reverse energy progress. With criteria air pollutants on the wane and carbon dioxide (CO2) benefits laboratory-proven, the increasing sustainability of fossil fuels is evident.”

Each year brings record production of the three fossil fuels: oil, natural gas, and coal. Peak demand is not in sight–nor should it be in a world of rising population, the aspiring poor, and new ways to employ inanimate energy to improve living. But what about future supply to meet growing demand?

In most nations of the world, free-market energy plenty is held back by government intervention.

Government ownership and operation of fossil fuels and related infrastructure impedes supply and demand. But fossil fuel plenty is very hard to hold back, and enough is produced to reasonably meet demand. Such is true in the United States despite two hundred impediments from the Biden Administration. “The U.S. now has 227 years of oil supply, 130 years of natural gas supply, and 485 years of coal supply,” the study below reports.

Canadian oil soldiers on despite the anti-energy policies of Prime Minister Justin Trudeau.

The Institute for Energy Research (IER) has just released an update to its 2011 study, 2024 North American Energy Inventory. As more oil, gas, and coal is produced, more is discovered to be produced – the amazing (but not biblical) story of resource expansion from free-market resourceship.

The fossil fuel era is very young in human history, having eclipsed the renewable energy era just several centuries ago. IER’s recent inventory study confirms what even a quasi-free market can do. Resourceship forever!

Simple Truth vs. Cheap Green Energy Lie

Francis Menton asserts that the biggest disinformation (Lie) in public discourse is claiming that the cheapest source of energy comes from renewables, wind and solar power.  He provides a number of brazen media examples in his blog post What Is The Most Pernicious Example Of “Misinformation” Currently Circulating?

Why do I say that the assertion of wind and solar being the cheapest ways to generate electricity is the very most pernicious of misinformation currently out there? Here are my three reasons: (1) the assertion is repeated endlessly and ubiquitously, (2) it is the basis for the misallocation of trillions of dollars of resources and for great impoverishment of billions of people around the world, and (3) it is false to the point of being preposterous, an insult to everyone’s intelligence, yet rarely challenged.

In addition, Paul Homewood explains at his blog how recently this lie was repeatedly entered into testimony in the UK Parliament House of Lords:

In oral questions on Thursday, Lord Frost noted Whitehall claims that renewables are half the cost of gas-fired electricity, and asked for an explanation of why subsidies were still required, and why the strike prices on offer to windfarms this year are twice what Lord Callanan says they need to make a profit. As Hansard shows, Lord Callanan failed to answer the question, simply reiterating his false claims about levelized costs.

The responses from Lord Callanan demonstrate the typical ploy for disarming dissenters’ objections, i.e. getting the discussion entangled in details and cost minutiae so that the big lie is lost in the weeds.  It occurs to me that David Wojick previously put the key issue in a simple, useful way, reposted below.

Background Post: Just One Number Keeps the Lights On

David Wojick explains how maintaining electricity supply is simple in his CFACT article It takes big energy to back up wind and solar.  Excerpts in italics with my bolds. (H/T John Ray)

Power system design can be extremely complex but there is one simple number that is painfully obvious. At least it is painful to the advocates of wind and solar power, which may be why we never hear about it. It is a big, bad number.

To my knowledge this big number has no name, but it should. Let’s call it the “minimum backup requirement” for wind and solar, or MBR. The minimum backup requirement is how much generating capacity a system must have to reliably produce power when wind and solar don’t.

Duck Curve Now Looks Like a Canyon

For most places the magnitude of MBR is very simple. It is all of the juice needed on the hottest or coldest low wind night. It is night so there is no solar. Sustained wind is less than eight miles per hour, so there is no wind power. It is very hot or cold so the need for power is very high.

In many places MBR will be close to the maximum power the system ever needs, because heat waves and cold spells are often low wind events. In heat waves it may be a bit hotter during the day but not that much. In cold spells it is often coldest at night.

Thus what is called “peak demand” is a good approximation for the minimum backup requirement. In other words, there has to be enough reliable generating capacity to provide all of the maximum power the system will ever need. For any public power system that is a very big number, as big as it gets in fact.

Actually it gets a bit bigger, because there also has to be a margin of safety, or what is called “reserve capacity”. This allows for something not working as it should. Fifteen percent is a typical reserve in American systems. This makes MBR something like 115% of peak demand.
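Wojick's MBR arithmetic is simple enough to write down; the 15% reserve figure is the typical value cited above:

```python
def minimum_backup_mw(peak_demand_mw, reserve_margin=0.15):
    """Reliable capacity needed when wind and solar deliver nothing:
    all of peak demand plus a reserve margin."""
    return peak_demand_mw * (1 + reserve_margin)

# A system with a 50,000 MW peak needs about 57,500 MW of backup:
print(round(minimum_backup_mw(50_000)))   # 57500
```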

We often read about wind and solar being cheaper than coal, gas and nuclear power, but that does not include the MBR for wind and solar.

What is relatively cheap for wind and solar is the cost to produce a unit of electricity. This is often called LCOE or the “levelized cost of energy”. But adding the reliable backup required to give people the power they need makes wind and solar very expensive.

In short the true cost of wind and solar is LCOE + MBR. This is the big cost you never hear about. But if every state goes to wind and solar then each one will have to have MBR for roughly its entire peak demand. That is an enormous amount of generating capacity.

Of course the cost of MBR depends on the generating technology. Storage is out because the cost is astronomical. Gas fired generation might be best but it is fossil fueled, as is coal. If one insists on zero fossil fuel then nuclear is probably the only option. Operating nuclear plants as intermittent backup is stupid and expensive, but so is no fossil fuel generation.

What is clearly ruled out is 100% renewables, because there would frequently be no electricity at all. That is unless geothermal could be made to work on an enormous scale, which would take many decades to develop.


It is clear that the Biden Administration’s goal of zero fossil fueled electricity by 2035 (without nuclear) is economically impossible because of the minimum backup requirements for wind and solar. You can’t get there from here.

One wonders why we have never heard of this obvious huge cost with wind and solar. The utilities I have looked at avoid it with a trick.

Dominion Energy, which supplies most of Virginia’s juice, is a good example. The Virginia Legislature passed a law saying that Dominion’s power generation had to be zero fossil fueled by 2045. Dominion developed a Plan saying how they would do this. Tucked away in passing on page 119 they say they will expand their capacity for importing power purchased from other utilities. This increase happens to be to an amount equal to their peak demand.

The plan is to buy all the MBR juice from the neighbors! But if everyone is going wind and solar then no one will have juice to sell. In fact they will all be buying, which does not work. Note that the high pressure systems which cause low wind can be huge, covering a dozen or more states. For that matter, no one has that kind of excess generating capacity today.

To summarize, for every utility there will be times when there is zero wind and solar power combined with near peak demand. Meeting this huge need is the minimum backup requirement. The huge cost of meeting this requirement is part of the cost of wind and solar power. MBR makes wind and solar extremely expensive.

The simple question to ask the Biden Administration, the States and their power utilities is this: How will you provide power on hot or cold low wind nights?

Background information on grid stability is at Beware Deep Electrification Policies

More Technical discussion is On Stable Electric Power: What You Need to Know


Footnote: Another Way to Assess Energy Cost and Value is LCOE + LACE

Cutting Through the Fog of Renewable Power Costs

Why Unintended Consequences from Pushing Green Energy

We have been treated to multiple reports of negative consequences unforeseen by policymakers pushing the Green Energy agenda. A sample of the range:

Ford ready to restrict UK sales of petrol models to hit electric targets, Financial Times

Why US offshore wind energy is struggling—the good, the bad and the opportunity, Tech Xplore

Another solar farm destroyed by a hail storm—this time in Texas, OK Energy Today

Storm Ravages World’s Largest Floating Solar Plant, Western Journal

DOE Finalizes Efficiency Standards for Clothes Washers and Dryers, Energy.Gov

Strict new EPA rules would force coal-fired power plants to capture emissions or shut down, AP news

Companies Are Balking at the High Costs of Running Electric Trucks, Wall Street Journal

Landmark wind turbine noise ruling from High Court referred to attorney general, Irish Times

Etc., Etc.

These reports point to regulators again attempting to force social and economic behavioral changes against the human and physical forces opposing their goals. A detailed explanation of one such failure follows.

Background Post:  Why Raising Auto Fuel (CAFE) Standards Failed

There are deeper reasons why US auto fuel efficiency standards are counterproductive and should be rolled back.  They were instituted in denial of regulatory experience and science.  First, a parallel from physics.

In the sub-atomic domain of quantum mechanics, Werner Heisenberg, a German physicist, determined that our observations have an effect on the behavior of quanta (quantum particles).

The Heisenberg uncertainty principle states that it is impossible to know simultaneously the exact position and momentum of a particle. That is, the more exactly the position is determined, the less known the momentum, and vice versa. This principle is not a statement about the limits of technology, but a fundamental limit on what can be known about a particle at any given moment. This uncertainty arises because the act of measuring affects the object being measured. The only way to measure the position of something is using light, but, on the sub-atomic scale, the interaction of the light with the object inevitably changes the object’s position and its direction of travel.

Now skip to the world of governance and the effects of regulation. A similar finding shows that the act of regulating produces reactive behavior and unintended consequences contrary to the desired outcomes.

US Fuel Economy (CAFE) Standards Have Backfired

An article at Financial Times, Energy Regulations Unintended Consequences, explains.  Excerpts below with my bolds.

Goodhart’s Law holds that “any observed statistical regularity will tend to collapse once pressure is placed upon it for control purposes”. Originally coined by the economist Charles Goodhart as a critique of the use of money supply measures to guide monetary policy, it has been adopted as a useful concept in many other fields. The general principle is that when any measure is used as a target for policy, it becomes unreliable. It is an observable phenomenon in healthcare, in financial regulation and, it seems, in energy efficiency standards.

When governments set efficiency regulations such as the US Corporate Average Fuel Economy standards for vehicles, they are often what is called “attribute-based”, meaning that the rules take other characteristics into consideration when determining compliance. The Cafe standards, for example, vary according to the “footprint” of the vehicle: the area enclosed by its wheels. In Japan, fuel economy standards are weight-based. Like all regulations, fuel economy standards create incentives to game the system, and where attributes are important, that can mean finding ways to exploit the variations in requirements.

There have long been suspicions that the footprint-based Cafe standards would encourage manufacturers to make larger cars for the US market, but a paper this week from Koichiro Ito of the University of Chicago and James Sallee of the University of California Berkeley provided the strongest evidence yet that those fears are likely to be justified.

Mr Ito and Mr Sallee looked at Japan’s experience with weight-based fuel economy standards, which changed in 2009, and concluded that “the Japanese car market has experienced a notable increase in weight in response to attribute-based regulation”. In the US, the Cafe standards create a similar pressure, but expressed in terms of size rather than weight. Mr Ito suggested that in Ford’s decision to end almost all car production in North America to focus on SUVs and trucks, “policy plays a substantial role”. It is not just that manufacturers are focusing on larger models; specific models are also getting bigger.

Ford’s move, Mr Ito wrote, should be seen as an “alarm bell” warning of the flaws in the Cafe system. He suggests an alternative framework with a uniform standard and tradeable credits, as a more effective and lower-cost option. With the Trump administration now reviewing fuel economy and emissions standards, and facing challenges from California and many other states, the vehicle manufacturers appear to be in a state of confusion. An elegant idea for preserving plans for improving fuel economy while reducing the cost of compliance could be very welcome.

The paper is The Economics of Attribute-Based Regulation: Theory and Evidence from Fuel-Economy Standards Koichiro Ito, James M. Sallee NBER Working Paper No. 20500.  The authors explain:

An attribute-based regulation is a regulation that aims to change one characteristic of a product related to the externality (the “targeted characteristic”), but which takes some other characteristic (the “secondary attribute”) into consideration when determining compliance. For example, Corporate Average Fuel Economy (CAFE) standards in the United States recently adopted attribute-basing. Figure 1 shows that the new policy mandates a fuel-economy target that is a downward-sloping function of vehicle “footprint”—the square area trapped by a rectangle drawn to connect the vehicle’s tires.  Under this schedule, firms that make larger vehicles are allowed to have lower fuel economy. This has the potential benefit of harmonizing marginal costs of regulatory compliance across firms, but it also creates a distortionary incentive for automakers to manipulate vehicle footprint.
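A toy version of such a schedule makes the distortion concrete. The linear slope, intercept, and floor below are purely illustrative numbers of mine, not the actual CAFE coefficients:

```python
def mpg_target(footprint_sqft, intercept=55.0, slope=0.25, floor=25.0):
    """Illustrative footprint-based standard: bigger footprint,
    laxer fuel-economy target, down to a floor."""
    return max(intercept - slope * footprint_sqft, floor)

compact = mpg_target(40)   # compact-car footprint -> 45.0 mpg target
big_suv = mpg_target(75)   # large-SUV footprint -> 36.25 mpg target

# Enlarging the footprint relaxes the target -- the distortionary
# incentive the authors describe:
print(compact, big_suv)
```

Under this toy schedule every extra square foot of footprint buys a 0.25 mpg easier target, so a manufacturer near a compliance boundary gains by making the vehicle larger.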

Attribute-basing is used in a variety of important economic policies. Fuel-economy regulations are attribute-based in China, Europe, Japan and the United States, which are the world’s four largest car markets. Energy efficiency standards for appliances, which allow larger products to consume more energy, are attribute-based all over the world. Regulations such as the Clean Air Act, the Family Medical Leave Act, and the Affordable Care Act are attribute-based because they exempt some firms based on size. In all of these examples, attribute-basing is designed to provide a weaker regulation for products or firms that will find compliance more difficult.

Summary from the Heritage Foundation study Fuel Economy Standards Are a Costly Mistake.  Excerpt with my bolds.

The CAFE standards are not only an extremely inefficient way to reduce carbon dioxide emission but will also have a variety of unintended consequences.

For example, the post-2010 standards apply lower mileage requirements to vehicles with larger footprints. Thus, Whitefoot and Skerlos argued that there is an incentive to increase the size of vehicles.

Data from the first few years under the new standard confirm that the average footprint, weight, and horsepower of cars and trucks have indeed all increased since 2008, even as carbon emissions fell, reflecting the distorted incentives.

Manufacturers have found work-arounds to thwart the intent of the regulations. For example, the standards raised the price of large cars, such as station wagons, relative to light trucks. As a result, automakers created a new type of light truck—the sport utility vehicle (SUV)—which was covered by the lower standard and had low gas mileage but met consumers’ needs. Other automakers have simply chosen to miss the thresholds and pay fines on a sliding scale.

Another well-known flaw in CAFE standards is the “rebound effect.” When consumers are forced to buy more fuel-efficient vehicles, the cost per mile falls (since their cars use less gas) and they drive more. This offsets part of the fuel economy gain and adds congestion and road repair costs. Similarly, the rising price of new vehicles causes consumers to delay upgrades, leaving older vehicles on the road longer.
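The rebound arithmetic above can be made concrete. The elasticity used below (-0.1, meaning a 10% drop in cost per mile induces a 1% rise in driving) is an illustrative assumption, not a figure from the study.

```python
def gallons_with_rebound(base_mpg, new_mpg, base_miles, elasticity=-0.1):
    """Fuel used after the rebound effect: cheaper miles induce more driving.
    elasticity = % change in miles per 1% change in cost per mile (assumed)."""
    # Fractional change in cost per mile when fuel economy improves
    cost_change = (1.0 / new_mpg - 1.0 / base_mpg) / (1.0 / base_mpg)
    new_miles = base_miles * (1.0 + elasticity * cost_change)
    return new_miles / new_mpg

# Upgrading from 25 to 35 mpg on 12,000 miles/year:
naive = 12000 / 35                               # gallons if driving habits held fixed
actual = gallons_with_rebound(25, 35, 12000)     # gallons once extra driving kicks in
```

The rebound claws back part of the naive saving: `actual` exceeds `naive`, though it remains well below the original consumption at 25 mpg.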

In addition, the higher purchase price of cars under a stricter CAFE standard is likely to force millions of households out of the new-car market altogether. Many households face credit constraints when borrowing money to purchase a car. David Wagner, Paulina Nusinovich, and Esteban Plaza-Jennings used Bureau of Labor Statistics data and typical finance industry debt-service-to-income ratios and estimated that 3.1 million to 14.9 million households would not have enough credit to purchase a new car under the 2025 CAFE standards.[34] This impact would fall disproportionately on poorer households and force the use of older cars with higher maintenance costs and with fuel economy that is generally lower than that of new cars.

CAFE standards may also have redistributed corporate profits to foreign automakers and away from Ford, General Motors (GM), and Chrysler (the Big Three), because foreign-headquartered firms tend to specialize in vehicles that are favored under the new standards.[35] 

Conclusion

CAFE standards are costly, inefficient, and ineffective regulations. They severely limit consumers’ ability to make their own choices concerning safety, comfort, affordability, and efficiency. Originally based on the belief that consumers undervalued fuel economy, the standards have morphed into climate control mandates. Under any justification, regulation gives the desires of government regulators precedence over those of the Americans who actually pay for the cars. Since the regulators undervalue the well-being of American consumers, the policy outcomes are predictably harmful.

What’s Next?

Wind Power for Beginners

H/T maxyhoge

Robert Bryce explains the basics in his Substack post Build It, And The Wind Won’t Come.  Excerpts in italics with my bolds and added images.

Weather-dependent generation sources are…weather dependent:
Last year, despite adding 6.2 GW of new capacity,
U.S. wind production dropped by 2.1%.

Three years ago, in the wake of Winter Storm Uri, the alt-energy lobby and their many allies in the media made sure not to blame wind energy for the Texas blackouts. The American Clean Power Association (2021 revenue: $32.1 million) declared frozen wind turbines “did not cause the Texas power outages” because they were “not the primary cause of the blackouts. Most of the power that went offline was powered by gas or coal.”

Damaged wind turbines at the Punta Lima wind project, Naguabo, Puerto Rico, 2018. Photo: Wikipedia.

NPR parroted that line, claiming, “Blaming wind and solar is a political move.” The Texas Tribune said it was wrong to blame alt-energy after Winter Storm Uri because “wind power was expected to make up only a fraction of what the state had planned for during the winter.” The outlet also quoted one academic who said that natural gas was “failing in the most spectacular fashion right now.” Texas Tribune went on to explain, “Only 7% of ERCOT’s forecasted winter capacity, or 6 gigawatts, was expected to come from various wind power sources across the state.”

In other words, there was no reason to expect Texas’s 33 GW of installed wind capacity to deliver because, you know, no one expected wind energy to produce much power. Expectations? Mr. October? Playoff Jamal? Who needs them?

But what happens when you build massive amounts of
wind energy capacity and it doesn’t deliver —
not for a day or a week, but for six months, or even an entire year?

That question is germane because, on Wednesday, the Energy Information Administration published a report showing that U.S. wind energy production declined by 2.1% last year. Even more shocking: that decline occurred even though the wind sector added 6.2 GW of new capacity!

A hat tip to fellow Substack writer Roger Pielke Jr., who pithily noted on Twitter yesterday, “Imagine if the U.S. built 6.2 GW new capacity in nuclear power plants and after starting them up, overall U.S. electricity generation went down. That’d be a problem, right?”

Um, yes. It would. And the EIA made that point in its usual dry language. “Generation from wind turbines decreased for the first time since the mid-1990s in 2023 despite the addition of 6.2 GW of new wind capacity last year,” the agency reported. The EIA also explained that the capacity factor for America’s wind energy fleet, also known as the average utilization rate, “fell to an eight-year low of 33.5%.” That compares with the all-time-high capacity factor of 35.9% in 2022. The report continued, “Lower wind speeds than normal affected wind generation in 2023, especially during the first half of the year when wind generation dropped by 14% compared with the same period in 2022.”
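Capacity factor is simply actual generation divided by what the fleet would produce running flat-out all year. The generation and capacity figures below are rounded approximations of the 2023 U.S. wind fleet, used only to show the arithmetic.

```python
HOURS_PER_YEAR = 8760

def capacity_factor(generation_mwh, capacity_mw):
    """Average utilization: actual output over maximum possible output."""
    return generation_mwh / (capacity_mw * HOURS_PER_YEAR)

# Rounded figures: ~425 TWh of wind generation from ~145 GW of capacity
cf = capacity_factor(425e6, 145e3)   # lands near the EIA's reported 33.5%
```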

Read that again. For half of last year, wind generation was down by a whopping 14% due to lower wind speeds. Imagine if that wind drought continued for an entire year. That’s certainly possible. Recall that last summer, the North American Electric Reliability Corporation warned that U.S. generation capacity “is increasingly characterized as one that is sensitive to extreme, widespread, and long duration temperatures as well as wind and solar droughts.”

According to Bloomberg New Energy Finance, corporate investment in wind energy between 2004 and 2022 totaled some $278 billion. In addition, according to data from the Treasury Department, the U.S. government spent more than $30 billion on the production tax credit over that same period. Thus, over the last two decades, the U.S. has spent more than $300 billion building 150 GW of wind capacity that has gobbled up massive amounts of land, garnered enormous (and bitter) opposition from rural Americans, and hasn’t gotten more efficient over time.

Wednesday’s EIA report is a stark reminder that all of that generation capacity is subject to the vagaries of the wind. Imagine if the U.S. had spent that same $300 billion on a weather-resilient form of generation, like, say, nuclear power. That’s relevant because Unit 4 at Plant Vogtle in Georgia came online on Monday. With that same $300 billion, the U.S. could have built 20, 30, or maybe even 40 GW of new nuclear reactors with a 92% capacity factor that wouldn’t rely on the whims of the wind. In addition, those dozens of reactors would have required a tiny fraction of the land now covered by thousands of viewshed-destroying, bat-and-bird-killing wind turbines.
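The capacity-factor gap Bryce invokes can be put in per-gigawatt terms. This is a back-of-envelope sketch of annual energy output only, not a full cost or reliability comparison.

```python
def annual_twh_per_gw(capacity_factor):
    """Annual energy from 1 GW of capacity at a given capacity factor."""
    return 1.0 * capacity_factor * 8760 / 1000.0  # GWh per year -> TWh

nuclear = annual_twh_per_gw(0.92)    # ~8.1 TWh per GW-year at 92% CF
wind = annual_twh_per_gw(0.335)      # ~2.9 TWh per GW-year at 33.5% CF
ratio = nuclear / wind               # each nuclear GW delivers ~2.7x a wind GW
```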

If climate change means we will face more extreme weather in the years ahead — hotter, colder, and/or more severe temperatures for extended periods — it’s Total Bonkers Crazytown™ to make our electric grid dependent on the weather. But by lavishing staggering amounts of money on wind and solar energy, and in many cases, mandating wind and solar, that’s precisely what we are doing.