A Rational Climate Policy

Recently in a post called Silence of Conservative Lambs I wrote:

The 1991 blockbuster movie revolved around meek, silent victims preyed upon by malevolent believers in their warped, twisted view of the world. A comparison can be drawn to how today’s conservative thinkers and politicians respond to advocates of the pernicious global warming/climate change ideology. Instead of challenging and pushing back against CO2 hysteria, and speaking out with a rational climate perspective, Republicans in the US, and Conservatives in Canada and elsewhere, are meek and silent lambs in the face of this energy slaughter. Worse, when they do speak it is usually to pander and appease, offering proposals for things like carbon taxes or other non-remedies for a non-problem, essentially ceding the case to leftists.

So, to be more constructive, let’s consider what political leaders should propose regarding climate, energy and the environment. IMO these should be the pillars:

♦  Climate change is real, but not an emergency.

♦  We must use our time to adapt to future climate extremes.

♦  We must transition to a diversified energy platform.

♦  We must safeguard our air and water from industrial pollutants.

 

For those not familiar, Climate Intelligence (CLINTEL) is an independent foundation that operates in the fields of climate change and climate policy. CLINTEL was founded in 2019 by emeritus professor of geophysics Guus Berkhout and science journalist Marcel Crok. Their 1,000+ members are signatories of the declaration There is No Climate Emergency:

A global network of 900 scientists and professionals has prepared this urgent message. Climate science should be less political, while climate policies should be more scientific. Scientists should openly address uncertainties and exaggerations in their predictions of global warming, while politicians should dispassionately count the real costs as well as the imagined benefits of their policy measures.

One example of a national energy and environment strategy is provided by Clintel for The Netherlands.  The document is Clintel’s Integrated Energy Vision.  Excerpts in italics with my bolds.

Preamble

We all agree in CLINTEL that:
– There is no climate emergency. We have ample time to improve our climate models (for a better understanding of the factors that regulate the climate) and to search for better adaptation technologies.

– The influence of CO2 on global warming is overestimated and its influence on greening is underestimated (even worse, it is often ignored). Nobody knows what the optimum value of atmospheric CO2 concentration is, but from a geological point of view we may conclude that we live in a time of historically low concentrations. Again, there is no climate emergency.

– There is an energy emergency. Decarbonisation policies – in terms of the current energy transition – are most destructive. They do much more harm than good. These energy policies must be terminated immediately.

– The new generation (III and IV) nuclear power plants ought to get all our attention. These plants promise low-priced, reliable, safe and clean energy. In combination with natural gas, nuclear energy is a ‘No Regret Solution’. Wind and solar energy are at most niche technologies. Their contribution is and will stay marginal.

With respect to the energy transition, CLINTEL emphasises that there is no such thing as a globally uniform energy system. Every country needs a tailor-made energy system depending on its geography, mineral resources, development phase, industrial specialization, population density, etc. For instance, in The Netherlands – a very densely populated country severely divided on the CO2 issue – it looks like the new generation of nuclear power plants may function as a breakthrough in the political process:

Part I shows that current Dutch energy policy – with its ambition to reduce CO₂ emissions by as much as 49% by 2030 – is based on panic and will lead to immense additional costs and a drastically deteriorated living environment. Below, we will propose an inspiring long-term energy vision that fits our (and many other countries’) needs, is based on scientific facts, and is aimed at a prosperous future for everyone. A positive vision that replaces the gloom and doom predictions of the climate models. A vision with a hopeful perspective for the future.

A Guiding Vision for the Future

It is well known that high-risk, capital-intensive decisions should be based on a policy that is as insensitive as possible to the way the future will unfold. We have called it a No Regret Policy. It represents a long-term policy, implemented in small steps and continuously adapted to what is happening in reality. CLINTEL has drawn up a No Regret Energy Policy, especially aimed at the Dutch energy transition.

The proposed NRE policy is insensitive to the impact that CO₂ might or might not have on climate change (dominant or marginal). In addition, it is insensitive to what role the future electricity grid will play and to what the best mobility energy option will be. An extra bonus of the NRE policy is that the Netherlands’ energy supply will become less dependent on Russian natural gas and Middle Eastern oil.

CLINTEL’s proposal consists of three main elements:

1. Introduction of nuclear energy
If we base ourselves on the most up-to-date insights into energy supply, and we look at our four objectives as well as at our ‘no regret demands’, then nuclear energy is the only choice that meets these needs:

• No CO₂ emissions (mandatory requirement in the climate policy in force) as well as excellent controlled waste treatment (pollution requirement)
• High safety level (safety requirement)
• Demand-driven, reliable and affordable (prosperity requirement)
• High energy density (environmental requirement)

Regarding the last entry, compare a medium-sized 500 MW nuclear power plant with a wind farm of medium-sized 4 MW turbines delivering the same full-load output. The reactor requires a site of approximately 1 km²; the wind farm, approximately 300 km². In addition, a nuclear power plant delivers guaranteed power for at least 60 years with low operational costs. Wind turbines, on the other hand, deliver unreliable power with high operational costs for a maximum of 25 years. Solar panels perform no better; moreover, the corresponding inverter (converting direct current to alternating current) only lasts about 10 years.
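The land-footprint comparison above can be checked with rough arithmetic. The capacity factors and per-turbine land requirement below are illustrative assumptions of mine, not figures from the CLINTEL document; only the 500 MW reactor size, 4 MW turbine size, and ~1 km² site come from the text.

```python
# Rough land-footprint sketch: nuclear plant vs. wind farm delivering
# the same average power. Capacity factors and turbine spacing are
# illustrative assumptions, not figures from the CLINTEL document.

NUCLEAR_MW = 500          # reactor nameplate capacity (from the text)
NUCLEAR_CF = 0.90         # assumed nuclear capacity factor
NUCLEAR_AREA_KM2 = 1.0    # site footprint cited in the text

TURBINE_MW = 4            # nameplate capacity per turbine (from the text)
WIND_CF = 0.30            # assumed onshore wind capacity factor
AREA_PER_TURBINE_KM2 = 0.75  # assumed land per turbine (spacing, wake losses)

avg_power_mw = NUCLEAR_MW * NUCLEAR_CF                # average MW to match
turbines_needed = avg_power_mw / (TURBINE_MW * WIND_CF)
wind_area_km2 = turbines_needed * AREA_PER_TURBINE_KM2

print(f"Average output to match: {avg_power_mw:.0f} MW")
print(f"Turbines needed: {turbines_needed:.0f}")
print(f"Wind farm area: {wind_area_km2:.0f} km^2 vs {NUCLEAR_AREA_KM2} km^2 nuclear")
```

Under these assumptions the wind farm needs roughly 375 turbines on about 280 km², which is in the same range as the ~300 km² figure quoted above; different capacity-factor or spacing assumptions would move the result considerably.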

2. Transforming green electrons into green molecules

Transport and storage of quantities of electrical energy much larger than today’s is technically difficult and economically unattractive. Every physicist will say: don’t do it! The real alternative is that, with a large supply of cheap and reliable electrical energy, we can afford to transform this energy into any desired clean molecular energy carrier, in the form of synthetic gas and synthetic oil.

There are attractive candidates with an appropriate energy density, such as methanol (CH3OH), ammonia (NH3) and hydrogen (H2), or a combination. These truly green energy carriers can be safely and affordably stored and transported using the existing infrastructure (bear in mind that 100% H2 is very aggressive and highly flammable, so there is still a lot of work to be done before this energy carrier can be implemented safely at large scale).

Oil companies should not be tempted by substantial public subsidies to participate in solar fields and wind farms. Instead, they should concentrate on the production, transport and distribution of green molecules (green gas, green oil) – that is, do what they are good at. Plans to store surplus CO₂ underground may turn out to be a silly activity. Oil companies, be critical before starting such an activity at large scale.

3. Hybrid applications

With the supply of truly clean electricity and truly clean energy carriers, optimal choices can be made without large and expensive grid reinforcements and polluting battery packs. Examples:

• Clean high-efficiency boilers (green gas)
• Clean road traffic (green petrol, green diesel)
• Clean aviation (green kerosene)
• Clean industrial production (green gas)
• Clean desalination of seawater (green potable water)

Interestingly, for each application there is also a hybrid solution (fossil-fuel molecules combined with green molecules, and/or green molecules combined with green electrons). There are also great opportunities here to meet the ever-growing need for potable water. After all, continuing to pump up groundwater is bad for the soil (causing desiccation and subsidence). This can be done much better if we link our energy policy to our drinking-water policy.

NRE policy excludes the burning of biomass (‘the most stupid policy of all time’) and includes sun and wind as niches only. Batteries are only used for low-power applications, as in the information sector. Natural gas and natural oil remain primarily raw materials for industry. Saying goodbye to ‘natural’ gas is utterly silly. Any CO₂ tax is even more silly.

Nuclear energy is proposed as the only truly sustainable solution. To start, nuclear power will have to take over the energy and heat supply from existing power plants that have nearly reached the end of their technical and/or economic lifespan. Next come the energy applications proposed by CLINTEL as part of this vision. Present nuclear technology works with enriched uranium. Breeder reactors running on uranium and thorium will in the long run take over the role of these traditional nuclear reactors. Hopefully, nuclear fusion will follow. The Netherlands, together with other countries, will have to participate in research and development efforts, thus acknowledging the importance of a 100% clean, reliable and affordable global energy supply for the foreseeable future.

Footnote:  US Republicans Get Behind a Six-Point Plan

ClearPath Action

♦  Leverage American Innovation

Innovation and job creation are just part of who we are. And thanks to innovation, America has reduced its emissions by more than any other country in the last 20 years. We did this through new American technology, research at the Department of Energy, and strong bipartisan support.

We need to double down and get more American innovations to market.

♦  Modernize Permitting

We need to build cleaner, faster. Clean energy and grid modernization present tremendous economic opportunities, but burdensome and outdated regulations mean that new projects take five years on average to come online.

We have to move faster by enacting common sense reforms to the permitting process.

♦  Bring American Industry Back

American manufacturing is the cleanest in the world with the highest environmental standards. Unfortunately, countries like China and Russia don’t have the same standards.

We can restore American manufacturing leadership in industries like steel and concrete by strengthening our own supply chains and eliminating dependence on countries that don’t meet our environmental standards.

♦  Unleash American Resource Independence

A new industrial revolution is going to require an enormous amount of resources like lithium, copper, cobalt, graphite, and nickel. Currently, we are too dependent on countries like China to supply our needs.

This dependence increases emissions and handicaps American businesses. We have to make it easier to safely supply manufacturers with American-made materials and employ American workers.

♦  Make Our Communities More Resilient

As conservatives, we plan ahead. When it comes to natural disasters, an ounce of prevention is worth a pound of cure. One dollar invested now equals six dollars after the disaster.

We can take common-sense measures and make sound investments that make our communities and farms more resistant to natural disasters like floods, fires and droughts.

♦  Use Natural Solutions

Crop production depends on access to healthy soil, adequate water supplies and predictable weather conditions, all of which are more difficult to manage as the climate changes.

Natural climate solutions – planting trees and farming practices that improve soil health – have a major impact on reducing carbon emissions while making forests and farms more resilient to floods and fires. They are also profitable.

2000 Mules is the Smoking Gun

Now we know why they avert their eyes from watching the film 2000 Mules.  And why the panel investigating the January 6 protest didn’t dare to screen the documentary.  Because it shows beyond reasonable doubt that election theft activities were coordinated and replicated in multiple states, proving a national criminal enterprise rigged the 2020 US Presidential election.  The smoking gun is there for all to see, and even more, it is only part of the pattern of corruption.

Charlie Johnston explains in his American Thinker article D’Souza’s Mules Left Tracks.  Excerpts in italics with my bolds.

Many conservative commentators have noted that Dinesh D’Souza’s documentary, 2000 Mules, offers compelling evidence of large-scale vote fraud. It offers more than this, though. It provides compelling evidence of a massive, centrally coordinated conspiracy to commit vote fraud. Examining several states with different voter laws while focusing on just one form of fraud, the movie found that the method of fraud was executed identically in each of these states.

That is prima facie evidence of central organization and management.

From the moment counting was stopped in the dead of night in five Democrat-run swing states on election night, Democrats and the media have treated anyone who questioned election integrity in 2020 like a mob boss treats anyone who threatens to testify against him: shut up, or we will cancel you.

Democrats and the media routinely smear anyone who questions the election results as a conspiracy theorist.  They routinely pronounce any evidence that emerges as “debunked.”  For the record, “debunked” does not mean “inconvenient to the leftist narrative.”  It means “thoroughly investigated and proven to be false.”  Almost none of the evidence has been debunked; very little has been officially examined.  Leftists treat actual evidence the way a vampire treats a crucifix.  There is no reasoned discourse, just a lot of hissing and snarling.

From well before he took office, Donald Trump faced an ongoing administrative coup attempt. First was the long-running Russian collusion hoax, mounted by Hillary Clinton and the Democratic National Committee and abetted by the FBI and intelligence agencies. Federal employees who were, theoretically, subordinate to Trump gleefully worked to undermine his administration. Two baseless impeachments were mounted against him by Democrats who know nothing other than shrieking partisanship anymore.

The slow-moving coup finally succeeded on the evening of November 3, 2020, when those five states quit counting ballots to give Democrats time to “fortify” the election. The last real hurdle to thwarting election integrity came on December 11, 2020, when the Supreme Court ruled that Texas and 18 other states lacked standing to complain of massive fraud. How states that conduct honest elections lack standing to complain of states that don’t in an election that affects them all is beyond my understanding. It looked like unconditional institutional surrender to massive fraud to me. All hail the barbarians!

D’Souza’s documentary examined only the slice of fraud that involved
organized physical ballot-stuffing.

It did not touch on compromised voting machines and systems or unconstitutional, administrative election law changes. If the single slice that 2000 Mules so effectively biopsied is filled with the cancer of fraud, it is willful ignorance to believe that everything else was clean.

If the election of 2020 had been fundamentally clean, Democrats and the media should have been the loudest advocates for a thorough and bipartisan investigation of the election to put widespread doubts to rest and own the conservatives. (By bipartisan, I do not mean like the J6 committee, where the Democrats unilaterally appointed all members, including a couple of Republican chumps for show.) Instead, the left hisses and snarls at every piece of evidence brought forth, no matter how compelling. A guilty man tries to suppress every bit of evidence at his trial, never knowing which piece will seal his conviction, while an innocent man tries to get every piece into evidence he can, never knowing which piece will exonerate him. To assess credibility on this, look who is trying to suppress evidence and who is trying to get evidence into the public record.

At this stage, it is hard to credit Democratic and media intransigence to anything innocent. If they are not just stupid, they have become co-conspirators in the only actual insurrection America has seen over the last six years. Understand, this coup was not primarily aimed at Trump and conservative Republicans; it is a coup against the very idea of self-government. Alas, many Republicans may disagree with elements of Democratic methods but agree with them that a self-serving elite class should rule the citizen-serfs they think constitute the American people.

The relentless smears, the constant howls, and the shrieking rage of the leftists are not because they are so offended that the right would challenge them. It is because the mud of massive deception is being washed away to reveal the rock of stark fraud the left mounted to steal an American presidential election. That is genuine insurrection. Confession, repentance, and forfeiture of all offices of public honor or trust by the conspirators could begin to establish American honor and liberty anew. That, of course, will never happen. Power is the left’s only god, and pursuit of it by any means its only liturgy.

Republicans will win by unprecedented margins in November. If they hold the left to account for its depredations against the American system of law and systemic attack on the Bill of Rights, we can begin to crawl out of this hole of despotism. If, instead, the Republicans largely choose to let bygones be bygones, as they have done with the Russian collusion conspirators, there is little hope that America can long survive as anything the founders would recognize. Renewal will come. Americans will not forever submit to be ruled by any class of people — and certainly not to this degenerate class of aspiring despots.

However it comes, D’Souza’s documentary is the seminal moment the tide washed away enough mud that, despite their shrieks and howls, the left can no longer hide the ugly truth of what it did.

Massive election fraud in 2020 is a conspiracy, but it is no longer merely a theory.

OPEC runs out of spare capacity, makes bullish case for oil

Mohammed Barkindo, the secretary general of OPEC, has warned that “OPEC is running out of capacity,” and that “with the exception of two or three members, all are maxed out.”

Eric Nuttall explains at Financial Post OPEC running out of spare capacity confirms our multi-year bull case for oil.  Excerpts in italics with my bolds.

Oil companies are going to be pumping high returns to investors
for much longer than people realize

Imagine life without insurance. The constant worry of an unexpected accident, such as your house burning down or car getting stolen, wreaking financial havoc without the economic certainty that everything would be OK in the end. This is where the world is heading in the next several months.

The Organization of the Petroleum Exporting Countries’ (OPEC’s) spare capacity, the oil market equivalent of insurance, has since the 1960s been available to avoid severe price spikes by smoothing out periodic supply disruptions caused by geopolitical events.

Now, after too many years of insufficient investment – as the needs of social spending and sovereign revenue dwarfed those of investing in incremental capacity during a multi-year period of low oil prices – OPEC’s spare capacity is set to be exhausted. This imminent reality will be a watershed event and has enormous implications for the oil market that investors must urgently appreciate.

We have for more than a year argued the world was hurtling into an energy crisis of epic proportions that would result in a multi-year bull market for oil.

Our bullish thesis had four basic tenets:

♦  persistent demand growth for at least the next 10 years;
♦  the end of shale hyper-growth in the United States, defined as shale production growth rates that no longer exceed global demand growth;
♦  stagnant production growth from the global super-majors resulting from eight years of insufficient investment and, finally,
♦  the exhaustion of OPEC’s spare capacity.

The hardest of these four core assumptions to prove by far was the last one. U.S. shale growth rates could be forecast by talking with oil executives and modelling corporate cash flows. One could easily see that spending by the super-majors had peaked in 2014, falling to half of those levels today, while also being burdened by increasing pressures to decarbonize, so we could predict and model stagnant growth for years to come. And demand growth was boosted in the short term by the emergence from global lockdown, and is supported over the medium-to-long term by the realities that limit alternatives from reaching enough critical mass to meaningfully displace oil in the next several decades.

OPEC’s spare capacity, however, was the tricky one. Monthly data released by several different sources can vary wildly. Given the strategic importance of oil revenue to many Gulf States, hard data on productive capacity has at times been viewed as state secrets and either difficult to get or taken with some skepticism. How then can we be so confident that OPEC’s spare capacity is nearing exhaustion? Because they just told us so.

Last week, the Royal Bank of Canada hosted a spectacular energy conference in New York with the highlight being a keynote speech by Mohammed Barkindo, the secretary general of OPEC. That same night, I had the good fortune to have dinner with him, which to an energy enthusiast was the equivalent of a tech investor getting to hang out with Elon Musk. I found him to be a warm, insightful, soft-spoken and, surprisingly, straight-talking gentleman.

In his keynote speech, Barkindo warned that “OPEC is running out of capacity,” and that “with the exception of two or three members, all are maxed out.” Further, “the world needs to come to terms with this brutal fact” and that it is a “global challenge.”

Why is this so incredibly important? Well, what would happen if the U.S. Federal Reserve ran out of hard currency? It would simply print more, with fresh bills sent to banks via armoured car the next day.

For oil producers, the cycle time to produce more oil is measured not in days, but in years.

With short-cycle U.S. shale set to grow at a fraction of historical rates, the world is now almost entirely dependent on long-cycle production, yet the global super-majors are entrenched in a multi-year period of stagnation due to too many years of underspending, and now OPEC, out of incremental capacity, is constrained by the very same challenge.

With oil inventories already at multi-year lows, demand back to pre-COVID-19 levels and structural challenges to supply growth, we believe oil prices will have to act as a demand-destroying mechanism, rising to a high enough level that kills discretionary demand, thereby balancing the market, while also staying there long enough to give the super-majors the confidence needed to start adequately spending again.

Given industry cycle times of four to six years, we believe that oil companies are set up to deliver egregiously high returns to investors for much longer than people realize, leading to a rerating from valuation levels that still imply the end of oil is nigh.

Eric Nuttall is a partner and senior portfolio manager with Ninepoint Partners LP.

 

USA Today Outed for Fictional Fact Checking

Paul Joseph Watson writes at Summit News Top ‘Fact Checker’ USA Today Forced to Delete Articles Over Fabricated Sources.  Excerpts in italics with my bolds.  H/T Tyler Durden

USA Today, which is used as a ‘fact checker’ by social media platforms, was forced to delete 23 articles from its website after an investigation found one of its reporters had fabricated sources.

Well, this is awkward

The news outlet has an entire section of its website dedicated to ‘fact checking’ and is used by Facebook to ‘fact check’ stories published by other outlets, downranking them in algorithms in a form of soft censorship.

However, it appears as though USA Today should have devoted more resources to fact checking itself before publishing articles by its own staff.

“USA Today’s breaking news reporter Gabriela Miranda fabricated sources and misappropriated quotes for stories, the news outlet confirmed on Thursday. The outlet conducted an internal audit after receiving an “external correction request” on one of its published stories,” reports Breitbart.

The 23 articles which were removed for not meeting the paper’s “editorial standards” included pieces on the Texas abortion ban, anti-vaxxer content and Russia’s invasion of Ukraine.

Miranda, who has now resigned from her position, “took steps to deceive investigators by producing false evidence of her news gathering, including recordings of interviews,” according to the New York Times.

“After receiving an external correction request, USA TODAY audited the reporting work of Gabriela Miranda. The audit revealed that some individuals quoted were not affiliated with the organizations claimed and appeared to be fabricated. The existence of other individuals quoted could not be independently verified. In addition, some stories included quotes that should have been credited to others.”

As we previously highlighted, USA Today was also forced to hastily delete a series of tweets which critics said were tantamount to the normalization of pedophilia after the newspaper cited “science” to assert that pedophilia was “determined in the womb.”

The newspaper was also lambasted by critics after it ‘fact checked’ as “true” claims that an official Trump 2020 t-shirt features a ‘Nazi symbol’.

In February last year, the news outlet published an op-ed which denounced Tom Brady for refusing to walk back his previous support for Donald Trump and for being “white.”

The newspaper also had to fire their ‘race and inclusion’ editor Hemal Jhaveri after she falsely blamed the Boulder supermarket shooting on white people.

In summary, USA Today has a severe bias problem and shouldn’t be used as a non-partisan ‘fact checker’.

 

CDC is about Control, Not Disease Control

Marty Makary explains at Newsweek Why America Doesn’t Trust the CDC.  Excerpts in italics with my bolds.

People don’t trust the CDC. Here’s one example illustrating why. Two weeks ago, with no outcomes data on COVID-19 booster shots for 5-to-11-year-olds, the Centers for Disease Control and Prevention (CDC) vigorously recommended the booster for all 24 million American children in that age group. The CDC cited a small Pfizer study of 140 children that showed boosters elevated their antibody levels—an outcome known to be transitory.

When that study concluded, a Pfizer spokesperson said it did not determine the efficacy of the booster in the 5-to-11-year-olds. But that didn’t matter to the CDC.

Seemingly hoping for a different answer, the agency put the matter before its own kangaroo court of curated experts, the Advisory Committee on Immunization Practices (ACIP).  I listened to the meeting, and couldn’t believe what I heard. At times, the committee members sounded like a group of marketing executives. Dr. Beth Bell of the University of Washington said “what we really need to do is to be as consistent and clear and simple as possible,” pointing out that the committee needed “a consistent recommendation which is simple.”

Other committee members similarly emphasized the importance of a universal booster message that applies to all age groups. Dr. David Kimberlin, editor of the American Academy of Pediatrics Red Book, speaking on his own behalf, said “Americans are yearning for, are crying out for a simpler way for looking at this pandemic.” He suggested that not recommending boosters for young children would create confusion that “could also bleed over to 12-to-17-year-olds, and even the adult population.”

The committee also debated how hard to push the booster recommendation, discussing whether the CDC should say that 5-to-11-year-olds “may” get a booster versus “should” get it.

Exhibiting classic medical paternalism, committee member Dr. Oliver Brooks of the Watts Healthcare Corporation said “I think may is confusing and may sow doubt,” adding “if we say should more people will get boosted versus may, then we may have more data that helps us really define where we’re going.”

Dr. Brooks was essentially suggesting that boosting in this age group would be a clinical trial conducted without informed consent.

That doesn’t sound like following the science to me.

ACIP’s medical establishment representatives were on hand for the meeting. They included members of the trade association Pharmaceutical Research and Manufacturers of America and the American Medical Association (AMA). Dr. Sandra Fryhofer, an internist representing the AMA, summarized the tone of the many legacy stakeholders present with a passionate plea: “I urge the committee to support a ‘should’ recommendation for this third dose.”

The committee promptly approved the booster for young children by an 11-1 vote, with one obstetrician abstaining because he missed some of the discussion.

The one dissenting vote came from Dr. Keipp Talbot of Vanderbilt University, who courageously said vaccines, while extremely effective, “are not without their potential side effects.” She questioned the sustainability of vaccinating the population every six months. Many experts agree with her, but they don’t have a platform to speak. In fact, nearly 40 percent of rural parents say their pediatricians do not recommend the primary vaccine series for children. Those pediatricians were not represented on the committee.

The CDC has a history of appointing like-minded loyalists to its committees.

Last year, it dismissed a member of its vaccine safety group, Harvard professor of medicine Dr. Martin Kulldorff, for dissenting from its decision to pause the J&J vaccine. A year ago, Joe Biden appointed party devotees to his COVID-19 task force. Reaching a consensus is easier that way.

The Food and Drug Administration’s (FDA) vaccine advisory committee, composed of the nation’s top vaccine experts, has made public statements similar to Dr. Talbot’s. But the committee was not involved in approving boosters for children. The FDA actually bypassed it days prior—the third time over the last year that the FDA made sweeping and controversial authorizations without convening its vaccine experts.

Most remarkably, it didn’t seem to matter to the CDC that 75.2 percent of children under age 11 already have natural immunity, according to a CDC study that concluded in February. Natural immunity is certainly much more prevalent today, given the ubiquity of the Omicron variant since February. CDC data from New York and California demonstrated that natural immunity was 2.8 times more effective in preventing hospitalization and 3.3 to 4.7 times more effective in preventing COVID infection compared to vaccination during the Delta wave. These findings are consistent with dozens of other clinical studies.

Yet natural immunity has consistently and inexplicably been dismissed by the medical establishment.

When the CDC voted, director Dr. Rochelle Walensky declared that the booster dose is safe for kids ages 5-11. Yes, the complication rate is very low, and we think it’s safe, but how can anyone know from only a short-term follow-up of 140 children? The more appropriate assessment is that we believe it’s safe but we can’t be sure yet from the data we have so far. Unfortunately, the strength of the CDC recommendation to boost all children 5 and up will trigger some schools and summer camps to blindly mandate a third dose for healthy children who don’t need it.

Instead of pushing boosters on healthy children who are already immune, public health officials should focus on recommending the primary COVID vaccine series to high-risk children who don’t have any immunity.

Public health officials are expected to recommend COVID vaccines for children under 5 as soon as June 21st, despite the fact that the vast majority of children already have natural immunity. In a recent Kaiser survey, only 18 percent of parents said they were eager to vaccinate their child in that age group.

If the CDC is curious as to why people aren’t listening to its recommendations, it should consider how it bypassed experts to put the matter before a kangaroo court of like-minded loyalists. The Biden administration should insist that we return to the standard process of putting all major vaccine decisions before a vote of the FDA’s leading vaccine experts.

The Biden administration promised to listen to the scientists. But the truth is, it only seems to listen to the ones who say what it wants to hear.

Marty Makary M.D., M.P.H. (@MartyMakary) is a professor at the Johns Hopkins School of Medicine and author of the New York Times bestselling book The Price We Pay: What Broke American Health Care and How To Fix It.

Fear Dihydrogen Monoxide, Not CO2

Overview from Wikipedia (H/T Raymond)

Dihydrogen monoxide:

  • is also known as hydroxyl acid, and is the major component of acid rain.
  • contributes to the “greenhouse effect”.
  • may cause severe burns.
  • contributes to the erosion of our natural landscape.
  • accelerates corrosion and rusting of many metals.
  • may cause electrical failures and decreased effectiveness of automobile brakes.
  • has been found in excised tumors of terminal cancer patients.

Despite the danger, dihydrogen monoxide is often used:

  • as an industrial solvent and coolant.
  • in nuclear power plants.
  • in the production of styrofoam.
  • as a fire retardant.
  • in many forms of cruel animal research.
  • in the distribution of pesticides. Even after washing, produce remains contaminated by this chemical.
  • as an additive in certain “junk-foods” and other food products.
Material Safety Data Sheet from ChemSafe

Even worse is the contribution of DHMO to the climate crisis.

Conclusion: Stop obsessing over Carbon Dioxide, and Ban Dihydrogen Monoxide Now!

Mid June Arctic Ice Returns to Mean

The Arctic ice melting season was delayed this year, as shown by the end-of-May (day 151) surplus of 600k km2 over the 16-year average.  Since then both MASIE and SII show a steep decline in Arctic ice extents, now matching the average for June 15 (day 166).  The reports show that Barents alone lost 320k km2, Laptev is down 200k km2, Baffin Bay lost 165k km2, and the Chukchi, Kara and Greenland seas each lost around 100k km2.

For the month of June, Hudson Bay will take the stage.  Above average early in June, it lost 100k km2 over the last six days.  Being a shallow basin, it will likely lose much of its 1M km2 in a few weeks.

Why is this important?  All the claims of global climate emergency depend on dangerously higher temperatures, lower sea ice, and rising sea levels.  The lack of additional warming is documented in the post Adios, Global Warming.

The lack of acceleration in sea levels along coastlines has also been discussed.  See USCS Warnings of Coastal Floodings.

Also, a longer term perspective is informative:

[Figure: post-glacial sea level]
The table below shows the distribution of Sea Ice across the Arctic Regions, on average, this year and 2020.

Region                                2022 Day 166   Day 166 Avg    2022-Avg   2020 Day 166   2022-2020
(0)  Northern_Hemisphere                  10788609      10854645      -66036       10425585      363024
(1)  Beaufort_Sea                          1054571        964886       89685        1005355       49216
(2)  Chukchi_Sea                            799723        796983        2740         775535       24188
(3)  East_Siberian_Sea                     1059777       1050162        9615        1013223       46554
(4)  Laptev_Sea                             686049        773271      -87221         782244      -96194
(5)  Kara_Sea                               712542        715202       -2659         513253      199289
(6)  Barents_Sea                             79046        206557     -127511         164943      -85896
(7)  Greenland_Sea                          539319        566915      -27596         578130      -38812
(8)  Baffin_Bay_Gulf_of_St._Lawrence        799919        706060       93859         592090      207829
(9)  Canadian_Archipelago                   838798        795875       42923         792582       46215
(10) Hudson_Bay                             957895        986396      -28501         937993       19902
(11) Central_Arctic                        3216668       3220647       -3979        3231087      -14419

The main deficits are in Laptev and Barents Seas, mostly offset by surpluses in Beaufort, Baffin and Canadian Archipelago.


Nature Erases Pulses of Human CO2 Emissions

Those committed to blaming humans for rising atmospheric CO2 sometimes admit that emitted CO2 (from any source) only stays in the air about 5 years (20% removed each year), being absorbed into natural sinks.  But they then save their belief by theorizing that human emissions are “pulses” of additional CO2 which persist even when particular molecules are removed, resulting in higher CO2 concentrations.  The analogy would be a traffic jam on the freeway which persists long after the blockage is removed.
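The residence-time arithmetic above can be sketched in a few lines.  This is only a toy illustration of the “20% removed each year” figure as simple first-order decay of a one-time pulse, not a model of either side’s full carbon-cycle argument:

```python
# Toy illustration of the residence-time arithmetic: a one-time pulse of CO2
# subject to first-order removal at 20% per year.

def remaining_fraction(years, removal_rate=0.20):
    """Fraction of a one-time pulse left after `years` of removal."""
    return (1 - removal_rate) ** years

# Mean residence time for a first-order process at rate k is 1/k.
mean_residence = 1 / 0.20  # 5 years, matching the "about 5 years" figure

print(mean_residence)                    # 5.0
print(round(remaining_fraction(5), 3))   # 0.328 of the pulse left after 5 years
print(round(remaining_fraction(20), 3))  # 0.012 -- essentially gone in 20 years
```

On this arithmetic a pulse is nearly gone within two decades, which is why the debate turns on whether repeated emissions accumulate rather than on the fate of any single pulse.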

A recent study by Bud Bromley puts a fork in this theory.  His paper is A conservative calculation of specific impulse for CO2.  The title links to his text, which goes through the math in detail.  Excerpts are in italics here with my bolds.

In the 2 years following the June 15, 1991 eruption of the Pinatubo volcano, the natural environment removed more CO2 than the entire increase in CO2 concentration due to all sources, human and natural, during the entire measured daily record of the Global Monitoring Laboratory of NOAA/Scripps Oceanographic Institute (MLO) May 17, 1974 to June 15, 1991.

Then, in the 2 years after that, all of that CO2 was replaced, plus an additional increment of CO2.

The Pinatubo Phase I Study (Bromley & Tamarkin, 2022) calculated the mass of net CO2 removed from the atmosphere based on measurements taken by MLO and from those measurements then calculated the first and second time derivatives (i.e., slope and acceleration) of CO2 concentration. We then demonstrated a novel use of the Specific Impulse calculation, a standard physical calculation used daily in life and death decisions. There are no theories, estimates or computer models involved in these calculations.

The following calculation is a more conservative demonstration which makes it obvious that human CO2 is not increasing global CO2 concentration.

The average slope of the CO2 concentration in the pre-Pinatubo period in MLO data was 1.463 ppm/year based on the method described in Bromley and Tamarkin (2022). Slope is the rate of change of the CO2 concentration. The rate of change and slope of a CO2 concentration with respect to time elapsed are identical to the commonly known terms velocity and speed.

June 15, 1991 was the start of the major Pinatubo volcanic eruption and April 22, 1993 was the date of maximum deceleration in net global average atmospheric CO2 concentration after Pinatubo in the daily measurement record of MLO.

The impulse calculation tells us whether a car has enough braking force to stop before hitting the wall, or enough force to take the rocket into orbit before it runs out of fuel, or, as in the analogy in the Phase Pinatubo report (Bromley & Tamarkin, 2022), enough force to accelerate the loaded 747 to liftoff velocity before reaching the end of the runway, or enough force to overcome addition of human CO2 to air.

MLO began reporting daily CO2 data on May 17, 1974. On that day, MLO reported 333.38 ppm. On June 15, 1991, MLO reported 358 ppm. 358 minus 333 = 25 ppm increase in CO2. This increase includes all CO2 in the atmosphere from all sources, human and natural. There is no residual human fraction.

25 ppm * 7.76 GtCO2 per ppm = 194 GtCO2 increase in CO2

For this comparison, attribute to humans that entire increase in MLO CO2 since the daily record began. This amount was measured by MLO and we know this amount exceeds the actual human CO2 component.

11.35 GtCO2 per year divided by 365 days per year = 0.031 Gt “human” CO2 added per day. Assume that human emissions did not slow following Pinatubo, even though total CO2 was decelerating precipitously.

Hypothetically, on April 22, 1993, 677 days later, final velocity v of “human” CO2 was the same 0.031 per day. But to be more conservative, let v = 0.041 GtCO2 per day, that is, “human” CO2 is growing faster even though total CO2 is declining sharply.

Jh = 2.17 Newton seconds is the specific impulse for our hypothetical “human” CO2 emissions.

Comparison:

♦  2.17 Newton seconds for hypothetical “human” CO2 emissions
♦  -55.5 Newton seconds for natural CO2 removal from atmosphere

In this conservative calculation, based entirely on measurements (not theory, not models, and not estimates), Earth’s environment demonstrated the capacity to absorb more than 25 times the not-to-exceed amount of human CO2 emissions at that time.

The data and graphs produced by MLO also show a reduction in slope of CO2 concentration following the June 1991 eruption of Pinatubo, and also show the more rapid recovery of total CO2 concentration that began about 2 years after the 1991 eruption. This graph is the annual rate of change of total atmospheric CO2 concentration. This graph is not human CO2.

During the global cooling event in the 2 years following the Pinatubo eruption, CO2 concentration decelerated rapidly. Following that 2 year period, in the next 2 years CO2 accelerated more rapidly than it had declined, reaching an average CO2 slope which exceeded MLO-measured slope for the period prior to the June 1991 Pinatubo eruption. The maximum force of the environment to both absorb and emit CO2 could be much larger than the 25 times human emission and could occur much faster.

We do not know the maximum force or specific impulse. But it is very safe to infer from this result that human CO2 emissions are not an environmental crisis.

Theoretical discussion and conclusion

These are the experiment results. Theory must explain these results, not the other way around.

Bromley and Tamarkin (2022) suggested a theory of how this very large amount of CO2 could be absorbed so rapidly into the environment, mostly the ocean surface. This experimental result is consistent with Henry’s Law, the Law of Mass Action and Le Chatelier’s principle. In a forthcoming addendum to Bromley and Tamarkin (2022), two additional laws, Fick’s Law and Graham’s Law, are suggested additions to our theory explaining this experimental result.

There are several inorganic chemical sources in the sea surface thin layer which produce CO2 through a series of linked reactions. Based on theories asserted more than 60 years ago, inorganic and organic chemical sources and sinks are believed to be too small and/or too slow to explain the slope of net global average CO2 concentration. Our results strongly suggest that the net CO2 absorption and net emission events that followed the Pinatubo eruption are response and recovery to a perturbation to the natural trend. There is no suggestion in our results or in our theory that long-term warming of SST causes the slope of net global average CO2 concentration. We have not looked at temperatures or correlation statistics between temperature and CO2 concentration because they are co-dependent variables, and the simultaneity bias cannot be removed with acceptable certainty. References to 25 degrees C in Bromley and Tamarkin (2022) are only in theoretical discussion and not involved in any way in our data analysis or calculations. References to 25 degrees C are merely standard ambient temperature, part of SATP, agreed by standards organizations.

When CO2 slope and acceleration declined post-Pinatubo, why was there a recovery to previous slope, plus an additional offset? The decline and the recovery were certainly not due to humans or the biosphere. As we have shown, CO2 from humans and biosphere combined is over an order of magnitude less than the CO2 absorbed by the environment and then re-emitted. That alone should end fears of CO2-caused climate crisis. Where did the CO2 go so rapidly and where did the CO2 in the recovery come from? Our data suggests that in future research we will find a series of other events, other volcanoes, El Ninos and La Ninas, etc. that have similarly disrupted the equilibrium, followed by a response and recovery from the environment.

Footnote:

Tom Segalstad produced this graph on the speed of ocean-CO2 fluxes:

Background:  CO2 Fluxes, Sources and Sinks


How to FLICC Off Climate Alarms

John Ridgway has provided an excellent framework for skeptics to examine and respond to claims from believers in global warming/climate change.  His essay at Climate Scepticism is Deconstructing Scepticism: The True FLICC.  Excerpts in italics with my bolds and added comments.

Overview

I have modified slightly the FLICC components to serve as a list of actions making up a skeptical approach to an alarmist claim.  IOW this is a checklist for applying critical intelligence to alarmist discourse in the public arena. The Summary can be stated thusly:

♦  Follow the Data
Find and follow the data and facts to where they lead

♦  Look for full risk profile
Look for a complete assessment of risks and costs from proposed policies

♦  Interrogate causal claims
Inquire into claimed cause-effect relationships

♦  Compile contrary explanations
Construct an organized view of contradictory evidence to the theory

♦  Confront cultural bias
Challenge attempts to promote consensus story with flimsy coincidence

A Case In Point

John Ridgway illustrates how this method works in a comment:

No sooner had I pressed the publish button than the BBC came out with the perfect example of what I have been writing about:  Climate change: Rising sea levels threaten 200,000 England properties

It tells of a group of experts theorizing that 200,000 coastal properties are soon to be lost due to climate change. Indeed, it “is already happening” as far as Happisburgh on the Norfolk coast is concerned. Coastal erosion is indeed a problem there.

But did the experts take into account that the data shows no acceleration of erosion over the last 2000 years? No.

Have they acknowledged the fact that erosion on the East coast is a legacy of glaciation? No.

[For the US example of this claim, see my post Sea Level Scare Machine]

The FLICC Framework

Below is Ridgway’s text regarding this thought process, followed by a synopsis of his discussion of the five elements. Text is in italics with my bolds.

As part of the anthropogenic climate change debate, and when discussing the proposed plans for transition to Net Zero, efforts have been made to analyse the thinking that underpins the typical sceptic’s position. These analyses have universally presupposed that such scepticism stubbornly persists in the face of overwhelming evidence, as reflected in the widespread use of the term ‘denier’. Consequently, they are based upon taxonomies of flawed reasoning and methods of deception and misinformation.1 

However, by taking such a prejudicial approach, the analyses have invariably failed to acknowledge the ideological, philosophical and psychological bases for sceptical thinking. The following taxonomy redresses that failing and, as a result, offers a more pertinent analysis that avoids the worst excesses of opinionated philippic. The taxonomy identifies a basic set of ideologies and attitudes that feature prominently in the typical climate change sceptic’s case. For my taxonomy I have chosen the acronym FLICC:2

  • Follow data but distrust judgement and speculation

     i.e. value empirical evidence over theory and conjecture.

  • Look for the full risk profile

      i.e. when considering the management of risks and uncertainties, demand that those associated with mitigating and preventative measures are also taken into account.

  • Interrogate causal arguments

      i.e. demand that both necessity and sufficiency form the basis of a causal analysis.

  • Contrariness

      i.e. distrust consensus as an indicator of epistemological value.

  • Cultural awareness

       i.e. never underestimate the extent to which a society can fabricate a truth for its own purposes.

All of the above have a long and legitimate history outside the field of climate science. The suggestion that they are not being applied in good faith by climate change sceptics falls beyond the remit of taxonomical analysis and strays into the territory of propaganda and ad hominem.

The five ideologies and attitudes of climate change scepticism introduced above are now discussed in greater detail.

Following the data

Above all else, the sceptical approach is characterized by a reluctance to draw conclusions from a given body of evidence. When it comes to evidence supporting the idea of a ‘climate crisis’, such reluctance is judged by many to be pathological and indicative of motivated reasoning. Cognitive scientists use the term ‘conservative belief revision’ to refer to an undue reluctance to update beliefs in accordance with a new body of evidence. More precisely, when the individual retains the view that events have a random pattern, thereby downplaying the possibility of a causative factor, the term used is ‘slothful induction’. Either way, the presupposition is that the individual is committing a logical fallacy resulting from cognitive bias.

However, far from being a pathology of thinking, such reluctance has its legitimate foundations in Pyrrhonian philosophy and, when properly understood, it can be seen as an important thinking strategy.3 Conservative belief revision and slothful induction can indeed lead to false conclusions but, more importantly, the error most commonly encountered when making decisions under uncertainty (and the one with the greatest potential for damage) is to downplay unknown and possibly random factors and instead construct a narrative that overstates and prejudges causation. This tendency is central to the human condition and it lies at the heart of our failure to foresee the unexpected – this is the truly important cognitive bias that the sceptic seeks to avoid.

The empirical sceptic is cognisant of evidence and allows the formulation of theories but treats them with considerable caution due to the many ways in which such theories often entail unwarranted presupposition.

The drivers behind this problem are the propensity of the human mind to seek patterns, to construct narratives that hide complexities, to over-emphasise the causative role played by human agents and to under-emphasise the role played by external and possibly random factors. Ultimately, it is a problem regarding the comprehension of uncertainty — we comprehend in a manner that has served us well in evolutionary terms but has left us vulnerable to unprecedented, high consequence events.

It is often said that a true sceptic is one who is prepared to accept the prevailing theory once the evidence is ‘overwhelming’. The climate change sceptic’s reluctance to do so is taken as an indication that he or she is not a true sceptic. However, we see here that true scepticism lies in the willingness to challenge the idea that the evidence is overwhelming – it only seems overwhelming to those who fail to recognise the ‘theorizing disease’ and lack the resolve to resist it. Secondly, there cannot be a climate change sceptic alive who is not painfully aware of the humiliation handed out to those who resist the theorizing.

In practice, the theorizing and the narratives that trouble the empirical sceptic take many forms. It can be seen in:

♦  over-dependence upon mathematical models for which the tuning owes more to art than science.

♦  readiness to treat the output of such models as data resulting from experiment, rather than the hypotheses they are.

♦  lack of regard for ontological uncertainty (i.e. the unknown unknowns which, due to their very nature, the models do not address).

♦  emergence of story-telling as a primary weapon in the armoury of extreme weather event attribution.

♦  willingness to commit trillions of pounds to courses of action that are predicated upon Representative Concentration Pathways and economic models that are the ‘theorizing disease’ writ large.

♦  contributions of the myriad of activists who seek to portray the issues in a narrative form laden with social justice and other ethical considerations.

♦  imaginative but simplistic portrayals of climate change sceptics and their motives; portrayals that are drawing categorical conclusions that cannot possibly be justified given the ‘evidence’ offered. And;

♦  any narrative that turns out to be unfounded when one follows the data.

Climate change may have its basis in science and data, but this basis has long since been overtaken by a plethora of theorizing and causal narrative that sometimes appears to have taken on a life of its own. Is this what settled science is supposed to look like?

Looking for the full risk profile

Almost as fundamental as the sceptic’s resistance to theorizing and narrative is his or her appreciation that the management of anthropogenic warming (particularly the transition to Net Zero) is an undertaking beset with risk and uncertainty. This concern reflects a fundamental principle of risk management: proposed actions to tackle a risk are often in themselves problematic and so a full risk analysis is not complete until it can be confirmed that the net risk will decrease following the actions proposed.7

Firstly, the narrative of existential risk is rejected on the grounds of empirical scepticism (the evidence for an existential threat is not overwhelming, it is underwhelming).

Secondly, even if the narrative is accepted, it has not been reliably demonstrated that the proposal for Net Zero transition is free from existential or extreme risks.

Indeed, given the dominant role played by the ‘theorizing disease’ and how it lies behind our inability to envisage the unprecedented high consequence event, there is every reason to believe that the proposals for Net Zero transition should be equally subject to the precautionary principle. The fact that they are not is indicative of a double standard being applied. The argument seems to run as follows: There is no uncertainty regarding the physical risk posed by climate change, but if there were it would only add to the imperative for action. There is also no uncertainty regarding the transition risk, but if there were it could be ignored because one can only apply the precautionary principle once!

This is precisely the sort of inconsistency one encounters when uncertainties are rationalised away in order to support the favoured narrative.

The upshot of this double standard is that the activists appear to be proceeding with two very different risk management frameworks depending upon whether physical or transition risk is being considered. As a result, risks associated with renewable energy security, the environmental damage associated with proposals to reduce carbon emissions and the potentially catastrophic effects of the inevitable global economic shock are all played down or explained away.

Looking for the full risk profile is a basic of risk management practice. The fact that it is seen as a ploy used only by those wishing to oppose the management of anthropogenic climate change is both odd and worrying. It is indeed important to the sceptic, but it should be important to everyone.

Interrogating causal arguments

For many years we have been told that anthropogenic climate change will make bad things happen. These dire predictions were supposed to galvanize the world into action but that didn’t happen, no doubt partly due to the extent to which such predictions repeatedly failed to come true (as, for example, with the predictions of the disappearance of Arctic sea ice). . . . This is one good reason for the empirical sceptic to distrust the narrative,8 but an even better one lies in the very concept of causation.

A major purpose of narrative is to reduce complexity so that the ‘truth’ can shine through. This is particularly the case with causal narratives. We all want executive summaries and sound bites such as ‘Y happened because of X’. But very few of us are interested in examining exactly what we mean by such statements – very few except, of course, for the empirical sceptics. In a messy world in which many factors may be at play, the more pertinent questions are:

♦  To what extent was X necessary for Y to happen?
♦  To what extent was X sufficient for Y to happen?

The vast majority of the extreme weather event attribution narrative is focused upon the first question and very little attention is paid to the second; at least not in the many press bulletins issued. Basically, we are told that the event was virtually impossible without climate change, but very little is said regarding whether climate change on its own was enough.

This problem of oversimplification is even more worrying once one starts to examine consequential damages whilst failing to take into account man-made failings such as those that exacerbate the impacts of floods and forest fires.9 The oversimplification of causal narrative is not restricted to weather-related events, of course. Climate change, we are told, is wreaking havoc with the flora and fauna and many species are dying out as a result. However, when such claims are examined more closely,10 it is invariably the case that climate change has been lumped in with a number of other factors that are destroying habitat.

When climate change sceptics point this out they are, of course, accused of cherry-picking. The truth, however, is that their insistence that the extended causal narrative of necessity and sufficiency should be respected is nothing more than the consequence of following the data and looking for the full risk profile.

Contrariness

The climate change debate is all about making decisions under uncertainty, so it is little surprise that gaining consensus is seen as centrally important. Uncertainty is reduced when the evidence is overwhelming and it is tempting to believe that the high level of consensus amongst climate scientists surely points towards there being overwhelming evidence. If one accepts this logic then the sceptic’s refusal to accept the consensus is just another manifestation of his or her denial.

Except, of course, an empirical sceptic would not accept this logic. Consensus does not result from a simple examination of concordant evidence, it is instead the fruit of the tendentious theorizing and simplifying narrative that the empirical sceptic intuitively distrusts. As explained above, there are a number of drivers that cause such theories and narratives to entail unwarranted presupposition, and it is naïve to believe that scientists are immune to such drivers.

However, the fact remains that consensus on beliefs is neither a sufficient nor a necessary condition for presuming that these beliefs constitute shared knowledge. It is only when a consensus on beliefs is uncoerced, uniquely heterogeneous and large, that a shared knowledge provides the best explanation of a given consensus.11 The notion that a scientific consensus can be trusted because scientists are permanently seeking to challenge accepted views is simplistic at best.

It is actually far from obvious that in climate science the conditions have been met for consensus to be a reliable indicator of shared knowledge.

Contrariness simply comes with the territory of being an empirical sceptic. The evidence of consensus is there to be seen, but the amount of theorizing and narrative required for its genesis, together with the social dimension to consensus generation, are enough for the empirical sceptic to treat the whole matter of consensus with a great deal of caution.

Cultural awareness

There has been a great deal said already regarding the culture wars surrounding issues such as the threat posed by anthropogenic climate change. Most of the concerns are directed at the sceptic, who for reasons never properly explained is deemed to be the instigator of the conflict. However, it is the sceptic who chooses to point out that the value-laden arguments offered by climate activists are best understood as part of a wider cultural movement in which rationality is subordinate to in-group versus out-group dynamics.

Psychological, ethical and spiritual needs lie at the heart of the development of culture and so the adoption of the climate change phenomenon in service of these needs has to be seen as essentially a cultural power play. The dangers of uncritically accepting the fruits of theorizing and narrative are only the beginning of the empirical sceptic’s concerns. Beyond that is the concern that the direction the debate is taking is not even a matter of empiricism – data analysis has little to offer when so much depends upon whether the phenomenon is subsequently to be described as warming or heating. It is for this reason that much of the sceptic’s attention is directed towards the manner in which the science features in our culture rather than the science itself. Such are our psychological, ethical and spiritual needs, that we must not underestimate the extent to which ostensibly scientific output can be moulded in their service.

Conclusions

Taxonomies of thinking should not be treated too seriously. Whilst I hope that I have offered here a welcome antidote to the diatribe that often masquerades as a scholarly appraisal of climate change scepticism, it remains the case that the form that scepticism takes will be unique to the individual. I could not hope to cover all aspects of climate change scepticism in the limited space available to me, but it remains my belief that there are unifying principles that can be identified.

Central to these is the concept of the empirical sceptic and the need to understand that there are sound reasons to treat theorizing and simplifying narratives with extreme caution. The empirical sceptic resists the temptation to theorize, preferring instead to keep an open mind on the interpretation of the evidence. This is far from being self-serving denialism; it is instead a self-denying servitude to the data.

That said, I cannot believe that there would be any activist who, upon reading this account, would see a reason to modify their opinions regarding the bad faith and irrationality that lies behind scepticism. This, unfortunately, is only to be expected given that such opinions are themselves the result of theorizing and simplifying narrative.

Footnote:

While the above focuses on climate alarmism, there are many other social and political initiatives that are theory-driven, suffering from inadequate attention to analysis by empirical sceptics.  One has only to note corporate and governmental programs based on Critical Race or Gender theories.  In addition, COVID policies in advanced nations ignored the required full risk profiling, as well as overturning decades of epidemiological knowledge in favor of models and experimental gene therapies proposed by Big Pharma.


June 2022 Heat Records Silly Season Again

Photo illustration by Slate. Photos by Thinkstock.

A glance at the news aggregator shows the silly season is in full swing.  A partial listing of headlines recently proclaiming the hottest whatever:

  • Temperatures hit 43C in Spain’s hottest spring heatwave in decades The Independent
  • How to sleep during a heatwave, according to experts The Independent
  • Climate crisis focus of NASA chief’s visit The University of Edinburgh
  • Video: US hit by floods, mudslides, wildfires resembling ‘an erupting volcano’ and a record heatwave in two days Sky News
  • Rising beaches suggest Antarctic glaciers are melting faster than ever New Atlas
  • Dangerous heat grips US through midweek as wildfires explode in West The Independent
  • In hottest city on Earth, mothers bear brunt of climate change Yahoo! UK & Ireland
  • ‘Earthworms on steroids’ are spreading like wild in Connecticut The Independent
  • The Guardian view on an Indian summer: human-made heatwaves are getting hotter The Guardian
  • UK weather: Britain could bask in warmest June day ever with 35C on Friday Mail Online
  • Climate Change Causes Melting Permafrost in Alaska Nature World News
  • Spain in grip of heatwave with temperatures forecast to hit 44C The Guardian

Time for some Clear Thinking about Heat Records (Previous Post)

Here is an analysis using critical intelligence to interpret media reports about temperature records this summer. Daniel Engber writes in Slate Crazy From the Heat

The subtitle is Climate change is real. Record-high temperatures everywhere are fake.  As we shall see from the excerpts below, the first sentence is a statement of faith, since as Engber demonstrates, the notion does not follow from the temperature evidence. Excerpts in italics with my bolds.

It’s been really, really hot this summer. How hot? Last Friday, the Washington Post put out a series of maps and charts to illustrate the “record-crushing heat.” All-time temperature highs have been measured in “scores of locations on every continent north of the equator,” the article said, while the lower 48 states endured the hottest-ever stretch of temperatures from May until July.

These were not the only records to be set in 2018. Historic heat waves have been crashing all around the world, with records getting shattered in Japan, broken on the eastern coast of Canada, smashed in California, and rewritten in the Upper Midwest. A city in Algeria suffered through the highest high temperature ever recorded in Africa. A village in Oman set a new world record for the highest-ever low temperature. At the end of July, the New York Times ran a feature on how this year’s “record heat wreaked havoc on four continents.” USA Today reported that more than 1,900 heat records had been tied or beaten in just the last few days of May.

While the odds that any given record will be broken may be very, very small, the total number of potential records is mind-blowingly enormous.

There were lots of other records, too, lots and lots and lots—but I think it’s best for me to stop right here. In fact, I think it’s best for all of us to stop reporting on these misleading, imbecilic stats. “Record-setting heat,” as it’s presented in news reports, isn’t really scientific, and it’s almost always insignificant. And yet, every summer seems to bring a flood of new superlatives that pump us full of dread about the changing climate. We’d all be better off without this phony grandiosity, which makes it seem like every hot and humid August is unparalleled in human history. It’s not. Reports that tell us otherwise should be banished from the news.

It’s true the Earth is warming overall, and the record-breaking heat that matters most—the kind we’d be crazy to ignore—is measured on a global scale. The average temperature across the surface of the planet in 2017 was 58.51 degrees, one-and-a-half degrees above the mean for the 20th century. These records matter: 17 of the 18 hottest years on planet Earth have occurred since 2001, and the four hottest-ever years were 2014, 2015, 2016, and 2017. It also matters that this changing climate will result in huge numbers of heat-related deaths. Please pay attention to these terrifying and important facts. Please ignore every other story about record-breaking heat.

You’ll often hear that these two phenomena are related, that local heat records reflect—and therefore illustrate—the global trend. Writing in Slate this past July, Irineo Cabreros explained that climate change does indeed increase the odds of extreme events, making record-breaking heat more likely. News reports often make this point, linking probabilities of rare events to the broader warming pattern. “Scientists say there’s little doubt that the ratcheting up of global greenhouse gases makes heat waves more frequent and more intense,” noted the Times in its piece on record temperatures in Algeria, Hong Kong, Pakistan, and Norway.

Yet this lesson is subtler than it seems. The rash of “record-crushing heat” reports suggests we’re living through a spreading plague of new extremes—that the rate at which we’re reaching highest highs and highest lows is speeding up. When the Post reports that heat records have been set “at scores of locations on every continent,” it makes us think this is unexpected. It suggests that as the Earth gets ever warmer, and the weather less predictable, such records will be broken far more often than they ever have before.

But that’s just not the case. In 2009, climatologist Gerald Meehl and several colleagues published an analysis of records drawn from roughly 2,000 weather stations in the U.S. between 1950 and 2006. There were tens of millions of data points in all—temperature highs and lows from every station, taken every day for more than a half-century. Meehl searched these numbers for the record-setting values—i.e., the days on which a given weather station saw its highest-ever high or lowest-ever low up until that point. When he plotted these by year, they fell along a downward-curving line. Around 50,000 new heat records were being set every year during the 1960s; then that number dropped to roughly 20,000 in the 1980s, and to 15,000 by the turn of the millennium.

From Meehl et al 2009.

This shouldn’t be surprising. As a rule, weather records will be set less frequently as time goes by. The first measurement of temperature that’s ever taken at a given weather station will be its highest (and lowest) of all time, by definition. There’s a good chance that the same station’s reading on Day 2 will be a record, too, since it only needs to beat the temperature recorded on Day 1. But as the weeks and months go by, this record-setting contest gets increasingly competitive: Each new daily temperature must now outdo every single one that came before. If the weather were completely random, we might peg the chances of a record being set at any time as 1/n, where n is the number of days recorded to that point. In other words, one week into your record-keeping, you’d have a 1 in 7 chance of landing on an all-time high. On the 100th day, your odds would have dropped to 1 percent. After 56 years, your chances would be very, very slim.
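The 1/n rule described above is easy to verify with a quick simulation. This sketch is my own illustration (function names are invented, not from the article): it draws i.i.d. random "temperatures" and counts all-time highs. Since day k sets a record with probability 1/k, the expected record count after n days is the harmonic number H_n = 1 + 1/2 + … + 1/n.

```python
import random

def count_records(n_days, rng):
    """Count how many days set a new all-time high in one simulated series."""
    best = float("-inf")
    records = 0
    for _ in range(n_days):
        temp = rng.random()  # i.i.d. draw: weather with no warming trend
        if temp > best:
            best, records = temp, records + 1
    return records

def mean_records(n_days, trials=5000, seed=42):
    """Average record count over many simulated stations."""
    rng = random.Random(seed)
    return sum(count_records(n_days, rng) for _ in range(trials)) / trials

def harmonic(n):
    """H_n = 1 + 1/2 + ... + 1/n, the expected record count after n days."""
    return sum(1.0 / k for k in range(1, n + 1))
```

After 100 days the expected number of all-time highs is only H_100 ≈ 5.2, and the chance that any particular later day sets one keeps shrinking as 1/n—exactly the increasingly competitive contest Engber describes.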

The weather isn’t random, though; we know it’s warming overall, from one decade to the next. That’s what Meehl et al. were looking at: They figured that a changing climate would tweak those probabilities, goosing the rate of record-breaking highs and tamping down the rate of record-breaking lows. This wouldn’t change the fundamental fact that records get broken much less often as the years go by. (Even though the world is warming, you’d still expect fewer heat records to be set in 2000 than in 1965.) Still, one might guess that climate change would affect the rate, so that more heat records would be set than we’d otherwise expect.

That’s not what Meehl found. Between 1950 and 2006, the rate of record-breaking heat seemed unaffected by large-scale changes to the climate: The number of new records set every year went down from one decade to the next, at a rate that matched up pretty well with what you’d see if the odds were always 1/n. The study did find something more important, though: Record-breaking lows were showing up much less often than expected. From one decade to the next, fewer records of any kind were being set, but the ratio of record lows to record highs was getting smaller over time. By the 2000s, it had fallen to about 0.5, meaning that the U.S. was seeing half as many record-breaking lows as record-breaking highs. (Meehl has since extended this analysis using data going back to 1930 and up through 2015. The results came out the same.)
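Meehl's low-to-high ratio can likewise be illustrated with a toy model. This is my own sketch with assumed parameters, not Meehl's actual method: add a small linear warming trend to Gaussian noise, then count record highs versus record lows across many simulated stations.

```python
import random

def record_counts(n_days, trend_per_day, rng):
    """Count record highs and record lows in one series with a linear trend."""
    hi, lo = float("-inf"), float("inf")
    highs = lows = 0
    for day in range(n_days):
        temp = rng.gauss(0.0, 1.0) + trend_per_day * day  # noise + drift
        if temp > hi:
            hi, highs = temp, highs + 1
        if temp < lo:
            lo, lows = temp, lows + 1
    return highs, lows

def low_high_ratio(n_days, trend_per_day, trials=500, seed=1):
    """Ratio of total record lows to total record highs across many trials."""
    rng = random.Random(seed)
    total_highs = total_lows = 0
    for _ in range(trials):
        h, l = record_counts(n_days, trend_per_day, rng)
        total_highs += h
        total_lows += l
    return total_lows / total_highs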

What does all this mean? On one hand, it’s very good evidence that climate change has tweaked the odds for record-breaking weather, at least when it comes to record lows. (Other studies have come to the same conclusion.) On the other hand, it tells us that in the U.S., at least, we’re not hitting record highs more often than we were before, and that the rate isn’t higher than what you’d expect if there weren’t any global warming. In fact, just the opposite is true: As one might expect, heat records are getting broken less often over time, and it’s likely there will be fewer during the 2010s than at any point since people started keeping track.

This may be hard to fathom, given how much coverage has been devoted to the latest bouts of record-setting heat. These extreme events are more unusual, in absolute terms, than they’ve ever been before, yet they’re always in the news. How could that be happening?

While the odds that any given record will be broken may be very, very small, the total number of potential records that could be broken—and then reported in the newspaper—is mind-blowingly enormous. To get a sense of how big this number really is, consider that the National Oceanic and Atmospheric Administration keeps a database of daily records from every U.S. weather station with at least 30 years of data, and that its website lets you search for how many all-time records have been set in any given stretch of time. For instance, the database indicates that during the seven-day period ending on Aug. 17—the date when the Washington Post published its series of “record-crushing heat” infographics—154 heat records were broken.

That may sound like a lot—154 record-high temperatures in the span of just one week. But the NOAA website also indicates how many potential records could have been achieved during that time: 18,953. In actuality, less than one percent of these were broken. You can also pull data on daily maximum temperatures for an entire month: I tried that with August 2017, and then again for months of August at 10-year intervals going back to the 1950s. Each time the query returned at least about 130,000 potential records, of which one or two thousand seemed to be getting broken every year. (There was no apparent trend toward more records being broken over time.)

Now let’s say there are 130,000 high-temperature records to be broken every month in the U.S. That’s only half the pool of heat-related records, since the database also lets you search for all-time highest low temperatures. You can also check whether any given highest high or highest low happens to be a record for the entire month in that location, or whether it’s a record when compared across all the weather stations everywhere on that particular day.

Add all of these together and the pool of potential heat records tracked by NOAA appears to number in the millions annually, of which tens of thousands may be broken. Even this vastly underestimates the number of potential records available for media concern. As they’re reported in the news, all-time weather records aren’t limited to just the highest highs or highest lows for a given day in one location. Take, for example, the first heat record mentioned in this column, reported in the Post: The U.S. has just endured the hottest May, June, and July of all time. The existence of that record presupposes many others: What about the hottest April, May and June, or the hottest March, April, and May? What about all the other ways that one might subdivide the calendar?

Geography provides another endless well of flexibility. Remember that the all-time record for the hottest May, June, and July applied only to the lower 48 states. Might a different set of records have been broken if we’d considered Hawaii and Alaska? And what about the records spanning smaller portions of the country, like the Midwest, or the Upper Midwest, or just the state of Minnesota, or just the Twin Cities? And what about the all-time records overseas, describing unprecedented heat in other countries or on other continents?

Even if we did limit ourselves to weather records from a single place measured over a common timescale, it would still be possible to parse out record-breaking heat in a thousand different ways. News reports give separate records, as we’ve seen, for the highest daily high and the highest daily low, but they also tell us when we’ve hit the highest average temperature over several days or several weeks or several months. The Post describes a recent record-breaking streak of days in San Diego with highs of at least 83 degrees. (You’ll find stories touting streaks of daily highs above almost any arbitrary threshold: 90 degrees, 77 degrees, 60 degrees, et cetera.) Records also needn’t focus on the temperature at all: There’s been lots of news in recent weeks about the fact that the U.K. has just endured its driest-ever early summer.

“Record-breaking” summer weather, then, can apply to pretty much any geographical location, over pretty much any span of time. It doesn’t even have to be a record—there’s an endless stream of stories on “near-record heat” in one place or another, or the “fifth-hottest” whatever to happen in wherever, or the fact that it’s been “one of the hottest” yadda-yaddas that yadda-yadda has ever seen. In the most perverse, insane extension of this genre, news outlets sometimes even highlight when a given record isn’t being set.
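The combinatorial point—that the pool of reportable "records" is enormous—can be made concrete with back-of-envelope arithmetic. Every figure below is my own assumed round number for illustration, not an actual NOAA count.

```python
# All figures are assumed round numbers for illustration only.
stations = 2000        # order of magnitude: US stations with 30+ years of data
days_per_year = 365
measures = 4           # highest high, highest low, lowest high, lowest low

# Daily all-time records at the station level:
daily_pool = stations * days_per_year * measures      # 2,920,000 per year

# Contiguous multi-month windows in a calendar year (lengths 1..12):
windows = 12 * 13 // 2                                # 78 spans

# Hypothetical regional aggregates (states, regions, national, etc.):
regions = 60
windowed_pool = regions * windows * measures          # 18,720 more

total_pool = daily_pool + windowed_pool
```

Even before adding thresholds, streaks, and "near-records," a station network of this size supports millions of distinct record categories per year—so thousands of broken "all-time records" every season is roughly what chance alone predicts.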

Loose reports of “record-breaking heat” only serve to puff up muggy weather and make it seem important. (The sham inflations of the wind chill factor do the same for winter months.) So don’t be fooled or flattered by this record-setting hype. Your summer misery is nothing special.

Summary

This article helps people not to confuse weather events with climate.  My disappointment is with the phrase “Climate Change is Real,” since it is subject to misdirection.  Engber uses that phrase to refer to rising average world temperatures, without explaining that such estimates are computer-processed reconstructions, since the earth has no single “average temperature.”  More importantly, the undefined “climate change” is a blank slate to which any number of meanings can be attached.

  • Some take it to mean: It is real that rising CO2 concentrations cause rising global temperatures.  Yet that is not supported by temperature records.
  • Others think it means: It is real that using fossil fuels causes global warming.  This too lacks persuasive evidence.

Since 1965 fossil fuel consumption has grown dramatically and almost monotonically (2020 being an exception), rising 217% from 146 to 463 exajoules. Meanwhile the HadCRUT GMT record shows multiple ups and downs, with an accumulated rise of 0.9C over 55 years, 7% of the starting value.

Others know that Global Mean Temperature is a slippery calculation subject to the selection of stations.

Graph showing the correlation between Global Mean Temperature (Average T) and the number of stations included in the global database. Source: Ross McKitrick, U of Guelph

Global warming estimates combine results from adjusted records.
Conclusion

The pattern of high and low records discussed above is consistent with natural variability rather than rising CO2 or fossil fuel consumption. Those of us not alarmed about the reported warming understand that “climate change” is something nature does all the time, and that the future is likely to include periods both cooler and warmer than now.

Background Reading:

The Climate Story (Illustrated)

2021 Update: Fossil Fuels ≠ Global Warming

Man Made Warming from Adjusting Data

What is Global Temperature? Is it warming or cooling?
