USA Today Outed for Fictional Fact Checking

Paul Joseph Watson writes at Summit News: Top ‘Fact Checker’ USA Today Forced to Delete Articles Over Fabricated Sources. Excerpts in italics with my bolds. H/T Tyler Durden

USA Today, which is used as a ‘fact checker’ by social media platforms, was forced to delete 23 articles from its website after an investigation found one of its reporters had fabricated sources.

Well, this is awkward

The news outlet has an entire section of its website dedicated to ‘fact checking’ and is used by Facebook to ‘fact check’ stories published by other outlets, downranking them in algorithms in a form of soft censorship.

However, it appears as though USA Today should have devoted more resources to fact checking itself before publishing articles by its own staff.

“USA Today’s breaking news reporter Gabriela Miranda fabricated sources and misappropriated quotes for stories, the news outlet confirmed on Thursday. The outlet conducted an internal audit after receiving an “external correction request” on one of its published stories,” reports Breitbart.

The 23 articles which were removed for not meeting the paper’s “editorial standards” included pieces on the Texas abortion ban, anti-vaxxer content and Russia’s invasion of Ukraine.

Miranda, who has now resigned from her position, “took steps to deceive investigators by producing false evidence of her news gathering, including recordings of interviews,” according to the New York Times.

“After receiving an external correction request, USA TODAY audited the reporting work of Gabriela Miranda. The audit revealed that some individuals quoted were not affiliated with the organizations claimed and appeared to be fabricated. The existence of other individuals quoted could not be independently verified. In addition, some stories included quotes that should have been credited to others.”

As we previously highlighted, USA Today was also forced to hastily delete a series of tweets which critics said were tantamount to the normalization of pedophilia after the newspaper cited “science” to assert that pedophilia was “determined in the womb.”

The newspaper was also lambasted by critics after it ‘fact checked’ as “true” claims that an official Trump 2020 t-shirt features a ‘Nazi symbol’.

In February last year, the news outlet published an op-ed which denounced Tom Brady for refusing to walk back his previous support for Donald Trump and for being “white.”

The newspaper also had to fire their ‘race and inclusion’ editor Hemal Jhaveri after she falsely blamed the Boulder supermarket shooting on white people.

In summary, USA Today has a severe bias problem and shouldn’t be used as a non-partisan ‘fact checker’.


CDC is about Control, Not Disease Control

Marty Makary explains at Newsweek: Why America Doesn’t Trust the CDC. Excerpts in italics with my bolds.

People don’t trust the CDC. Here’s one example illustrating why. Two weeks ago, with no outcomes data on COVID-19 booster shots for 5-to-11-year-olds, the Centers for Disease Control (CDC) vigorously recommended the booster for all 24 million American children in that age group. The CDC cited a small Pfizer study of 140 children that showed boosters elevated their antibody levels—an outcome known to be transitory.

When that study concluded, a Pfizer spokesperson said it did not determine the efficacy of the booster in the 5-to-11-year-olds. But that didn’t matter to the CDC.

Seemingly hoping for a different answer, the agency put the matter before its own kangaroo court of curated experts, the Advisory Committee on Immunization Practices (ACIP).  I listened to the meeting, and couldn’t believe what I heard. At times, the committee members sounded like a group of marketing executives. Dr. Beth Bell of the University of Washington said “what we really need to do is to be as consistent and clear and simple as possible,” pointing out that the committee needed “a consistent recommendation which is simple.”

Other committee members similarly emphasized the importance of a universal booster message that applies to all age groups. Dr. David Kimberlin, editor of the American Academy of Pediatrics Red Book, speaking on his own behalf, said “Americans are yearning for, are crying out for a simpler way for looking at this pandemic.” He suggested that not recommending boosters for young children would create confusion that “could also bleed over to 12-to-17-year-olds, and even the adult population.”

The committee also debated how hard to push the booster recommendation, discussing whether the CDC should say that 5-to-11-year-olds “may” get a booster versus “should” get it.

Exhibiting classic medical paternalism, committee member Dr. Oliver Brooks of the Watts Healthcare Corporation said “I think may is confusing and may sow doubt,” adding “if we say should more people will get boosted versus may, then we may have more data that helps us really define where we’re going.”

Dr. Brooks was essentially suggesting that boosting in this age group would be a clinical trial conducted without informed consent.

That doesn’t sound like following the science to me.

ACIP’s medical establishment representatives were on hand for the meeting. They included members of the trade association Pharmaceutical Research and Manufacturers of America and the American Medical Association (AMA). Dr. Sandra Fryhofer, an internist representing the AMA, summarized the tone of the many legacy stakeholders present with a passionate plea: “I urge the committee to support a ‘should’ recommendation for this third dose.”

The committee promptly approved the booster for young children by an 11-1 vote, with one obstetrician abstaining because he missed some of the discussion.

The one dissenting vote came from Dr. Keipp Talbot of Vanderbilt University, who courageously said vaccines, while extremely effective, “are not without their potential side effects.” She questioned the sustainability of vaccinating the population every six months. Many experts agree with her, but they don’t have a platform to speak. In fact, nearly 40 percent of rural parents say their pediatricians do not recommend the primary vaccine series for children. Those pediatricians were not represented on the committee.

The CDC has a history of appointing like-minded loyalists to its committees.

Last year, it dismissed a member of its vaccine safety group, Harvard professor of medicine Dr. Martin Kulldorff, for dissenting from its decision to pause the J&J vaccine. A year ago, Joe Biden appointed party devotees to his COVID-19 task force. Reaching a consensus is easier that way.

The Food and Drug Administration’s (FDA) vaccine advisory committee, composed of the nation’s top vaccine experts, has made public statements similar to Dr. Talbot’s. But the committee was not involved in approving boosters for children. The FDA actually bypassed it days prior—the third time over the last year that the FDA made sweeping and controversial authorizations without convening its vaccine experts.

Most remarkably, it didn’t seem to matter to the CDC that 75.2 percent of children under age 11 already have natural immunity, according to a CDC study that concluded in February. Natural immunity is certainly much more prevalent today, given the ubiquity of the Omicron variant since February. CDC data from New York and California demonstrated that natural immunity was 2.8 times more effective in preventing hospitalization and 3.3 to 4.7 times more effective in preventing COVID infection compared to vaccination during the Delta wave. These findings are consistent with dozens of other clinical studies.

Yet natural immunity has consistently and inexplicably been dismissed by the medical establishment.

When the CDC voted, director Dr. Rochelle Walensky declared that the booster dose is safe for kids ages 5-11. Yes, the complication rate is very low, and we think it’s safe, but how can anyone know from only a short-term follow-up of 140 children? The more appropriate assessment is that we believe it’s safe but we can’t be sure yet from the data we have so far. Unfortunately, the strength of the CDC recommendation to boost all children 5 and up will trigger some schools and summer camps to blindly mandate a third dose for healthy children who don’t need it.

Instead of pushing boosters on healthy children who are already immune, public health officials should focus on recommending the primary COVID vaccine series to high-risk children who don’t have any immunity.

Public health officials are expected to recommend COVID vaccines for children under 5 as soon as June 21st, despite the fact that the vast majority of children already have natural immunity. In a recent Kaiser survey, only 18 percent of parents said they were eager to vaccinate their child in that age group.

If the CDC is curious as to why people aren’t listening to its recommendations, it should consider how it bypassed experts to put the matter before a kangaroo court of like-minded loyalists. The Biden administration should insist that we return to the standard process of putting all major vaccine decisions before a vote of the FDA’s leading vaccine experts.

The Biden administration promised to listen to the scientists. But the truth is, it only seems to listen to the ones who say what it wants to hear.

Marty Makary M.D., M.P.H. (@MartyMakary) is a professor at the Johns Hopkins School of Medicine and author of The New York Times Bestselling Book, The Price We Pay: What Broke American Health Care and How To Fix It.

Fear Dihydrogen Monoxide, Not CO2

Overview from Wikipedia (H/T Raymond)

Dihydrogen monoxide:

  • is also known as hydroxyl acid, and is the major component of acid rain.
  • contributes to the “greenhouse effect”.
  • may cause severe burns.
  • contributes to the erosion of our natural landscape.
  • accelerates corrosion and rusting of many metals.
  • may cause electrical failures and decreased effectiveness of automobile brakes.
  • has been found in excised tumors of terminal cancer patients

Despite the danger, dihydrogen monoxide is often used:

  • as an industrial solvent and coolant.
  • in nuclear power plants.
  • in the production of styrofoam.
  • as a fire retardant.
  • in many forms of cruel animal research.
  • in the distribution of pesticides. Even after washing, produce remains contaminated by this chemical.
  • as an additive in certain “junk-foods” and other food products.

Material Safety Data Sheet from ChemSafe

Even worse is the contribution of DHMO to the climate crisis

Conclusion: Stop obsessing over Carbon Dioxide, and Ban Dihydrogen Monoxide Now!

Mid June Arctic Ice Returns to Mean

The Arctic ice melting season was delayed this year as shown by the end of May (day 151) surplus of 600k km2 over the 16-yr average.  Since then both MASIE and SII show a steep decline in Arctic ice extents, now matching the average for June 15 (day 166).  The reports show that Barents alone lost 320k km2, Laptev down 200k km2, Baffin Bay lost 165k km2, Chukchi, Kara, Greenland seas all lost around 100k km2 each.

For the month of June, Hudson Bay will take the stage. Above average early in the month, it lost 100k km2 over the last six days. Being a shallow basin, it will likely lose much of its 1M km2 of ice in a few weeks.

Why is this important?  All the claims of global climate emergency depend on dangerously higher temperatures, lower sea ice, and rising sea levels. The lack of additional warming is documented in the post Adios, Global Warming.

The lack of acceleration in sea levels along coastlines has also been discussed. See USCS Warnings of Coastal Floodings.

Also, a longer term perspective is informative:

[Figure: post-glacial sea level rise]

The table below shows the distribution of Sea Ice across the Arctic Regions, on average, this year and 2020.

Region | 2022 Day 166 | Day 166 Average | 2022-Ave. | 2020 Day 166 | 2022-2020
(0) Northern_Hemisphere | 10788609 | 10854645 | -66036 | 10425585 | 363024
(1) Beaufort_Sea | 1054571 | 964886 | 89685 | 1005355 | 49216
(2) Chukchi_Sea | 799723 | 796983 | 2740 | 775535 | 24188
(3) East_Siberian_Sea | 1059777 | 1050162 | 9615 | 1013223 | 46554
(4) Laptev_Sea | 686049 | 773271 | -87221 | 782244 | -96194
(5) Kara_Sea | 712542 | 715202 | -2659 | 513253 | 199289
(6) Barents_Sea | 79046 | 206557 | -127511 | 164943 | -85896
(7) Greenland_Sea | 539319 | 566915 | -27596 | 578130 | -38812
(8) Baffin_Bay_Gulf_of_St._Lawrence | 799919 | 706060 | 93859 | 592090 | 207829
(9) Canadian_Archipelago | 838798 | 795875 | 42923 | 792582 | 46215
(10) Hudson_Bay | 957895 | 986396 | -28501 | 937993 | 19902
(11) Central_Arctic | 3216668 | 3220647 | -3979 | 3231087 | -14419

The main deficits are in Laptev and Barents Seas, mostly offset by surpluses in Beaufort, Baffin and Canadian Archipelago.


Nature Erases Pulses of Human CO2 Emissions

Those committed to blaming humans for rising atmospheric CO2 sometimes admit that emitted CO2 (from any source) only stays in the air about 5 years (20% removed each year), being absorbed into natural sinks. But they then save their belief by theorizing that human emissions are “pulses” of additional CO2 which persist even when particular molecules are removed, resulting in higher CO2 concentrations. The analogy would be a traffic jam on the freeway which persists long after the blockage is removed.
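As a toy illustration (mine, not from any cited study), a 20%-per-year removal rate means a one-time pulse decays geometrically: roughly a third of the original molecules remain after 5 years and only about 1% after 20, which is why the "persistent pulse" theory is needed to keep total concentration elevated:

```python
# Toy sketch: a one-time CO2 pulse with 20% of the remainder
# removed each year decays geometrically (factor 0.80 per year).
remaining = 1.0  # fraction of the pulse still airborne
for year in range(1, 21):
    remaining *= 0.80  # 20% absorbed into natural sinks
    if year in (5, 10, 20):
        print(f"year {year}: {remaining:.1%} of pulse remains")
```

This simple model says nothing about whether total concentration stays elevated; that is the point in dispute between the two views quoted here.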

A recent study by Bud Bromley puts the fork in this theory.  His paper is A conservative calculation of specific impulse for CO2.  The title links to his text which goes through the math in detail.  Excerpts are in italics here with my bolds.

In the 2 years following the June 15, 1991 eruption of the Pinatubo volcano, the natural environment removed more CO2 than the entire increase in CO2 concentration due to all sources, human and natural, during the entire measured daily record of the Global Monitoring Laboratory of NOAA/Scripps Oceanographic Institute (MLO) May 17, 1974 to June 15, 1991.

Then, in the 2 years after that, that CO2 was replaced plus an additional increment of CO2.

The Pinatubo Phase I Study (Bromley & Tamarkin, 2022) calculated the mass of net CO2 removed from the atmosphere based on measurements taken by MLO and from those measurements then calculated the first and second time derivatives (i.e., slope and acceleration) of CO2 concentration. We then demonstrated a novel use of the Specific Impulse calculation, a standard physical calculation used daily in life and death decisions. There are no theories, estimates or computer models involved in these calculations.

The following calculation is a more conservative demonstration which makes it obvious that human CO2 is not increasing global CO2 concentration.

The average slope of the CO2 concentration in the pre-Pinatubo period in MLO data was 1.463 ppm/year based on the method described in Bromley and Tamarkin (2022). Slope is the rate of change of the CO2 concentration. The rate of change and slope of a CO2 concentration with respect to time elapsed are identical to the commonly known terms velocity and speed.

June 15, 1991 was the start of the major Pinatubo volcanic eruption and April 22, 1993 was the date of maximum deceleration in net global average atmospheric CO2 concentration after Pinatubo in the daily measurement record of MLO.

The impulse calculation tells us whether a car has enough braking force to stop before hitting the wall, or enough force to take the rocket into orbit before it runs out of fuel, or, as in the analogy in the Pinatubo Phase I report (Bromley & Tamarkin, 2022), enough force to accelerate the loaded 747 to liftoff velocity before reaching the end of the runway, or enough force to overcome addition of human CO2 to air.

MLO began reporting daily CO2 data on May 17, 1974. On that day, MLO reported 333.38 ppm. On June 15, 1991, MLO reported 358 ppm. 358 minus 333 = 25 ppm increase in CO2. This increase includes all CO2 in the atmosphere from all sources, human and natural. There is no residual human fraction.

25 ppm * 7.76 GtCO2 per ppm = 194 GtCO2 increase in CO2

For this comparison, attribute to humans that entire increase in MLO CO2 since the daily record began. This amount was measured by MLO and we know this amount exceeds the actual human CO2 component.

11.35 GtCO2 per year divided by 365 days per year = 0.031 Gt “human” CO2 added per day. Assume that human emissions did not slow following Pinatubo, even though total CO2 was decelerating precipitously.

Hypothetically, on April 22, 1993, 677 days later, final velocity v of “human” CO2 was the same 0.031 per day. But to be more conservative, let v = 0.041 GtCO2 per day, that is, “human” CO2 is growing faster even though total CO2 is declining sharply.

Jh = 2.17 Newton seconds is the specific impulse for our hypothetical “human” CO2 emissions.

Comparison:

♦  2.17 Newton seconds for hypothetical “human” CO2 emissions
♦  -55.5 Newton seconds for natural CO2 removal from atmosphere

In this conservative calculation, based entirely on measurements (not theory, not models, and not estimates), Earth’s environment demonstrated the capacity to absorb more than 25 times the not-to-exceed amount of human CO2 emissions at that time.
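The back-of-envelope numbers quoted above can be reproduced in a few lines. The 7.76 GtCO2-per-ppm conversion and the 11.35 GtCO2/yr emissions figure are taken from the excerpt; the final impulse values (2.17 and -55.5 Newton seconds) come from the paper's own specific impulse calculation and are not recomputed here:

```python
# Reproduce the quoted arithmetic from Bromley's conservative calculation.
GT_PER_PPM = 7.76  # GtCO2 per ppm of atmospheric CO2 (figure used in the excerpt)

co2_start = 333.38    # ppm at MLO, May 17, 1974
co2_eruption = 358.0  # ppm at MLO, June 15, 1991

rise_ppm = round(co2_eruption - co2_start)  # ~25 ppm, all sources combined
rise_gt = rise_ppm * GT_PER_PPM             # ~194 GtCO2

human_rate = 11.35 / 365  # GtCO2 per day, if humans emit 11.35 GtCO2/yr

print(f"total rise: {rise_ppm} ppm = {rise_gt:.0f} GtCO2")
print(f"daily 'human' addition: {human_rate:.3f} GtCO2/day")
```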

The data and graphs produced by MLO also show a reduction in slope of CO2 concentration following the June 1991 eruption of Pinatubo, and also show the more rapid recovery of total CO2 concentration that began about 2 years after the 1991 eruption. This graph is the annual rate of change of total atmospheric CO2 concentration. This graph is not human CO2.

During the global cooling event in the 2 years following the Pinatubo eruption, CO2 concentration decelerated rapidly. Following that 2 year period, in the next 2 years CO2 accelerated more rapidly than it had declined, reaching an average CO2 slope which exceeded MLO-measured slope for the period prior to the June 1991 Pinatubo eruption. The maximum force of the environment to both absorb and emit CO2 could be much larger than the 25 times human emission and could occur much faster.

We do not know the maximum force or specific impulse. But it is very safe to infer from this result that human CO2 emissions are not an environmental crisis.

Theoretical discussion and conclusion

These are the experiment results. Theory must explain these results, not the other way around.

Bromley and Tamarkin (2022) suggested a theory of how this very large amount of CO2 could be absorbed so rapidly into the environment, mostly ocean surface. This experimental result is consistent with Henry’s Law, the Law of Mass Action and Le Chatelier’s principle. In a forthcoming addendum to Bromley and Tamarkin (2022), two additional laws, Fick’s Law and Graham’s Law, are suggested additions to our theory explaining this experimental result.

There are several inorganic chemical sources in the sea surface thin layer which produce CO2 through a series of linked reactions. Based on theories asserted more than 60 years ago, inorganic and organic chemical sources and sinks are believed to be too small and/or too slow to explain the slope of net global average CO2 concentration. Our results strongly suggest that the net CO2 absorption and net emission events that followed the Pinatubo eruption are response and recovery to a perturbation to the natural trend. There is no suggestion in our results or in our theory that long-term warming of SST causes the slope of net global average CO2 concentration. We have not looked at temperatures or correlation statistics between temperature and CO2 concentration because they are co-dependent variables, and the simultaneity bias cannot be removed with acceptable certainty. References to 25 degrees C in Bromley and Tamarkin (2022) are only in theoretical discussion and not involved in any way in our data analysis or calculations. References to 25 degrees C are merely standard ambient temperature, part of SATP, agreed by standards organizations.

When CO2 slope and acceleration declined post-Pinatubo, why was there a recovery to the previous slope, plus an additional offset? The decline and the recovery were certainly not due to humans or the biosphere. As we have shown, CO2 from humans and the biosphere combined is over an order of magnitude less than the CO2 absorbed by the environment and then re-emitted. That alone should end fears of a CO2-caused climate crisis. Where did the CO2 go so rapidly, and where did the CO2 in the recovery come from? Our data suggest that in future research we will find a series of other events (other volcanoes, El Ninos and La Ninas, etc.) that have similarly disrupted the equilibrium, followed by a response and recovery from the environment.

Footnote:

Tom Segalstad produced this graph on the speed of ocean-CO2 fluxes:

Background:  CO2 Fluxes, Sources and Sinks


How to FLICC Off Climate Alarms

John Ridgway has provided an excellent framework for skeptics to examine and respond to claims from believers in global warming/climate change.  His essay at Climate Scepticism is Deconstructing Scepticism: The True FLICC.  Excerpts in italics with my bolds and added comments.

Overview

I have slightly modified the FLICC components to serve as a list of actions making up a skeptical approach to an alarmist claim. IOW, this is a checklist for applying critical intelligence to alarmist discourse in the public arena. The Summary can be stated thusly:

♦  Follow the Data
Find and follow the data and facts to where they lead

♦  Look for full risk profile
Look for a complete assessment of risks and costs from proposed policies

♦  Interrogate causal claims
Inquire into claimed cause-effect relationships

♦  Compile contrary explanations
Construct an organized view of contradictory evidence to the theory

♦  Confront cultural bias
Challenge attempts to promote consensus story with flimsy coincidence

A Case In Point

John Ridgway illustrates how this method works in a comment:

No sooner had I pressed the publish button than the BBC came out with the perfect example of what I have been writing about:  Climate change: Rising sea levels threaten 200,000 England properties

It tells of a group of experts theorizing that 200,000 coastal properties are soon to be lost due to climate change. Indeed, it “is already happening” as far as Happisburgh on the Norfolk coast is concerned. Coastal erosion is indeed a problem there.

But did the experts take into account that the data shows no acceleration of erosion over the last 2000 years? No.

Have they acknowledged the fact that erosion on the East coast is a legacy of glaciation? No.

[For the US example of this claim, see my post Sea Level Scare Machine]

The FLICC Framework

Below is Ridgway’s text regarding this thought process, followed by a synopsis of his discussion of the five elements. Text is in italics with my bolds.

As part of the anthropogenic climate change debate, and when discussing the proposed plans for transition to Net Zero, efforts have been made to analyse the thinking that underpins the typical sceptic’s position. These analyses have universally presupposed that such scepticism stubbornly persists in the face of overwhelming evidence, as reflected in the widespread use of the term ‘denier’. Consequently, they are based upon taxonomies of flawed reasoning and methods of deception and misinformation.1 

However, by taking such a prejudicial approach, the analyses have invariably failed to acknowledge the ideological, philosophical and psychological bases for sceptical thinking. The following taxonomy redresses that failing and, as a result, offers a more pertinent analysis that avoids the worst excesses of opinionated philippic. The taxonomy identifies a basic set of ideologies and attitudes that feature prominently in the typical climate change sceptic’s case. For my taxonomy I have chosen the acronym FLICC:2

  • Follow data but distrust judgement and speculation

     i.e. value empirical evidence over theory and conjecture.

  • Look for the full risk profile

      i.e. when considering the management of risks and uncertainties, demand that those associated with mitigating and preventative measures are also taken into account.

  • Interrogate causal arguments

      i.e. demand that both necessity and sufficiency form the basis of a causal analysis.

  • Contrariness

      i.e. distrust consensus as an indicator of epistemological value.

  • Cultural awareness

       i.e. never underestimate the extent to which a society can fabricate a truth for its own purposes.

All of the above have a long and legitimate history outside the field of climate science. The suggestion that they are not being applied in good faith by climate change sceptics falls beyond the remit of taxonomical analysis and strays into the territory of propaganda and ad hominem.

The five ideologies and attitudes of climate change scepticism introduced above are now discussed in greater detail.

Following the data

Above all else, the sceptical approach is characterized by a reluctance to draw conclusions from a given body of evidence. When it comes to evidence supporting the idea of a ‘climate crisis’, such reluctance is judged by many to be pathological and indicative of motivated reasoning. Cognitive scientists use the term ‘conservative belief revision’ to refer to an undue reluctance to update beliefs in accordance with a new body of evidence. More precisely, when the individual retains the view that events have a random pattern, thereby downplaying the possibility of a causative factor, the term used is ‘slothful induction’. Either way, the presupposition is that the individual is committing a logical fallacy resulting from cognitive bias.

However, far from being a pathology of thinking, such reluctance has its legitimate foundations in Pyrrhonian philosophy and, when properly understood, it can be seen as an important thinking strategy.3 Conservative belief revision and slothful induction can indeed lead to false conclusions but, more importantly, the error most commonly encountered when making decisions under uncertainty (and the one with the greatest potential for damage) is to downplay unknown and possibly random factors and instead construct a narrative that overstates and prejudges causation. This tendency is central to the human condition and it lies at the heart of our failure to foresee the unexpected – this is the truly important cognitive bias that the sceptic seeks to avoid.

The empirical sceptic is cognisant of evidence and allows the formulation of theories but treats them with considerable caution due to the many ways in which such theories often entail unwarranted presupposition.

The drivers behind this problem are the propensity of the human mind to seek patterns, to construct narratives that hide complexities, to over-emphasise the causative role played by human agents and to under-emphasise the role played by external and possibly random factors. Ultimately, it is a problem regarding the comprehension of uncertainty — we comprehend in a manner that has served us well in evolutionary terms but has left us vulnerable to unprecedented, high consequence events.

It is often said that a true sceptic is one who is prepared to accept the prevailing theory once the evidence is ‘overwhelming’. The climate change sceptic’s reluctance to do so is taken as an indication that he or she is not a true sceptic. However, we see here that true scepticism lies in the willingness to challenge the idea that the evidence is overwhelming – it only seems overwhelming to those who fail to recognise the ‘theorizing disease’ and lack the resolve to resist it. Moreover, there cannot be a climate change sceptic alive who is not painfully aware of the humiliation handed out to those who resist the theorizing.

In practice, the theorizing and the narratives that trouble the empirical sceptic take many forms. It can be seen in:

♦  over-dependence upon mathematical models for which the tuning owes more to art than science.

♦  readiness to treat the output of such models as data resulting from experiment, rather than the hypotheses they are.

♦  lack of regard for ontological uncertainty (i.e. the unknown unknowns which, due to their very nature, the models do not address).

♦  emergence of story-telling as a primary weapon in the armoury of extreme weather event attribution.

♦  willingness to commit trillions of pounds to courses of action that are predicated upon Representative Concentration Pathways and economic models that are the ‘theorizing disease’ writ large.

♦  contributions of the myriad of activists who seek to portray the issues in a narrative form laden with social justice and other ethical considerations.

♦  imaginative but simplistic portrayals of climate change sceptics and their motives; portrayals that are drawing categorical conclusions that cannot possibly be justified given the ‘evidence’ offered. And;

♦  any narrative that turns out to be unfounded when one follows the data.

Climate change may have its basis in science and data, but this basis has long since been overtaken by a plethora of theorizing and causal narrative that sometimes appears to have taken on a life of its own. Is this what settled science is supposed to look like?

Looking for the full risk profile

Almost as fundamental as the sceptic’s resistance to theorizing and narrative is his or her appreciation that the management of anthropogenic warming (particularly the transition to Net Zero) is an undertaking beset with risk and uncertainty. This concern reflects a fundamental principle of risk management: proposed actions to tackle a risk are often in themselves problematic and so a full risk analysis is not complete until it can be confirmed that the net risk will decrease following the actions proposed.7

Firstly, the narrative of existential risk is rejected on the grounds of empirical scepticism (the evidence for an existential threat is not overwhelming, it is underwhelming).

Secondly, even if the narrative is accepted, it has not been reliably demonstrated that the proposal for Net Zero transition is free from existential or extreme risks.

Indeed, given the dominant role played by the ‘theorizing disease’ and how it lies behind our inability to envisage the unprecedented high consequence event, there is every reason to believe that the proposals for Net Zero transition should be equally subject to the precautionary principle. The fact that they are not is indicative of a double standard being applied. The argument seems to run as follows: There is no uncertainty regarding the physical risk posed by climate change, but if there were it would only add to the imperative for action. There is also no uncertainty regarding the transition risk, but if there were it could be ignored because one can only apply the precautionary principle once!

This is precisely the sort of inconsistency one encounters when uncertainties are rationalised away in order to support the favoured narrative.

The upshot of this double standard is that the activists appear to be proceeding with two very different risk management frameworks depending upon whether physical or transition risk is being considered. As a result, risks associated with renewable energy security, the environmental damage associated with proposals to reduce carbon emissions and the potentially catastrophic effects of the inevitable global economic shock are all played down or explained away.

Looking for the full risk profile is a basic of risk management practice. The fact that it is seen as a ploy used only by those wishing to oppose the management of anthropogenic climate change is both odd and worrying. It is indeed important to the sceptic, but it should be important to everyone.

Interrogating causal arguments

For many years we have been told that anthropogenic climate change will make bad things happen. These dire predictions were supposed to galvanize the world into action but that didn’t happen, no doubt partly due to the extent to which such predictions repeatedly failed to come true (as, for example, with the predictions of the disappearance of Arctic sea ice). . . . This is one good reason for the empirical sceptic to distrust the narrative,8 but an even better one lies in the very concept of causation.

A major purpose of narrative is to reduce complexity so that the ‘truth’ can shine through. This is particularly the case with causal narratives. We all want executive summaries and sound bites such as ‘Y happened because of X’. But very few of us are interested in examining exactly what we mean by such statements – very few except, of course, for the empirical sceptics. In a messy world in which many factors may be at play, the more pertinent questions are:

♦  To what extent was X necessary for Y to happen?
♦  To what extent was X sufficient for Y to happen?

The vast majority of the extreme weather event attribution narrative is focused upon the first question and very little attention is paid to the second; at least not in the many press bulletins issued. Basically, we are told that the event was virtually impossible without climate change, but very little is said regarding whether climate change on its own was enough.

This problem of oversimplification is even more worrying once one starts to examine consequential damages whilst failing to take into account man-made failings such as those that exacerbate the impacts of floods and forest fires.9   The oversimplification of causal narrative is not restricted to weather-related events, of course. Climate change, we are told, is wreaking havoc with the flora and fauna and many species are dying out as a result. However, when such claims are examined more closely,10 it is invariably the case that climate change has been lumped in with a number of other factors that are destroying habitat.

When climate change sceptics point this out they are, of course, accused of cherry-picking. The truth, however, is that their insistence that the extended causal narrative of necessity and sufficiency should be respected is nothing more than the consequence of following the data and looking for the full risk profile.

Contrariness

The climate change debate is all about making decisions under uncertainty, so it is little surprise that gaining consensus is seen as centrally important. Uncertainty is reduced when the evidence is overwhelming and it is tempting to believe that the high level of consensus amongst climate scientists surely points towards there being overwhelming evidence. If one accepts this logic then the sceptic’s refusal to accept the consensus is just another manifestation of his or her denial.

Except, of course, an empirical sceptic would not accept this logic. Consensus does not result from a simple examination of concordant evidence, it is instead the fruit of the tendentious theorizing and simplifying narrative that the empirical sceptic intuitively distrusts. As explained above, there are a number of drivers that cause such theories and narratives to entail unwarranted presupposition, and it is naïve to believe that scientists are immune to such drivers.

However, the fact remains that consensus on beliefs is neither a sufficient nor a necessary condition for presuming that these beliefs constitute shared knowledge. It is only when a consensus on beliefs is uncoerced, uniquely heterogeneous and large, that a shared knowledge provides the best explanation of a given consensus.11 The notion that a scientific consensus can be trusted because scientists are permanently seeking to challenge accepted views is simplistic at best.

It is actually far from obvious that in climate science the conditions have been met for consensus to be a reliable indicator of shared knowledge.

Contrariness simply comes with the territory of being an empirical sceptic. The evidence of consensus is there to be seen, but the amount of theorizing and narrative required for its genesis, together with the social dimension to consensus generation, are enough for the empirical sceptic to treat the whole matter of consensus with a great deal of caution.

Cultural awareness

There has been a great deal said already regarding the culture wars surrounding issues such as the threat posed by anthropogenic climate change. Most of the concerns are directed at the sceptic, who, for reasons never properly explained, is deemed to be the instigator of the conflict. However, it is the sceptic who chooses to point out that the value-laden arguments offered by climate activists are best understood as part of a wider cultural movement in which rationality is subordinate to in-group versus out-group dynamics.

Psychological, ethical and spiritual needs lie at the heart of the development of culture and so the adoption of the climate change phenomenon in service of these needs has to be seen as essentially a cultural power play. The dangers of uncritically accepting the fruits of theorizing and narrative are only the beginning of the empirical sceptic’s concerns. Beyond that is the concern that the direction the debate is taking is not even a matter of empiricism – data analysis has little to offer when so much depends upon whether the phenomenon is subsequently to be described as warming or heating. It is for this reason that much of the sceptic’s attention is directed towards the manner in which the science features in our culture rather than the science itself. Such are our psychological, ethical and spiritual needs, that we must not underestimate the extent to which ostensibly scientific output can be moulded in their service.

Conclusions

Taxonomies of thinking should not be treated too seriously. Whilst I hope that I have offered here a welcome antidote to the diatribe that often masquerades as a scholarly appraisal of climate change scepticism, it remains the case that the form that scepticism takes will be unique to the individual. I could not hope to cover all aspects of climate change scepticism in the limited space available to me, but it remains my belief that there are unifying principles that can be identified.

Central to these is the concept of the empirical sceptic and the need to understand that there are sound reasons to treat theorizing and simplifying narratives with extreme caution. The empirical sceptic resists the temptation to theorize, preferring instead to keep an open mind on the interpretation of the evidence. This is far from being self-serving denialism; it is instead a self-denying servitude to the data.

That said, I cannot believe that there would be any activist who, upon reading this account, would see a reason to modify their opinions regarding the bad faith and irrationality that lies behind scepticism. This, unfortunately, is only to be expected given that such opinions are themselves the result of theorizing and simplifying narrative.

Footnote:

While the above focuses on climate alarmism, many other social and political initiatives are theory-driven and suffer from inadequate scrutiny by empirical sceptics.  One has only to note corporate and governmental programs based on Critical Race or Gender theories.  In addition, COVID policies in advanced nations ignored the required full risk profiling and overturned decades of epidemiological knowledge in favor of models and experimental gene therapies proposed by Big Pharma.


June 2022 Heat Records Silly Season Again

Photo illustration by Slate. Photos by Thinkstock.

A glance at the news aggregator shows the silly season is in full swing.  A partial listing of recent headlines proclaiming the hottest whatever:

  • Temperatures hit 43C in Spain’s hottest spring heatwave in decades The Independent
  • How to sleep during a heatwave, according to experts The Independent
  • Climate crisis focus of NASA chief’s visit The University of Edinburgh
  • Video: US hit by floods, mudslides, wildfires resembling ‘an erupting volcano’ and a record heatwave in two days Sky News
  • Rising beaches suggest Antarctic glaciers are melting faster than ever New Atlas
  • Dangerous heat grips US through midweek as wildfires explode in West The Independent
  • In hottest city on Earth, mothers bear brunt of climate change Yahoo! UK & Ireland
  • ‘Earthworms on steroids’ are spreading like wild in Connecticut The Independent
  • The Guardian view on an Indian summer: human-made heatwaves are getting hotter The Guardian
  • UK weather: Britain could bask in warmest June day ever with 35C on Friday Mail Online
  • Climate Change Causes Melting Permafrost in Alaska Nature World News
  • Spain in grip of heatwave with temperatures forecast to hit 44C The Guardian

Time for some Clear Thinking about Heat Records (Previous Post)

Here is an analysis using critical intelligence to interpret media reports about temperature records this summer. Daniel Engber writes in Slate Crazy From the Heat

The subtitle is Climate change is real. Record-high temperatures everywhere are fake.  As we shall see from the excerpts below, the first sentence is a statement of faith, since, as Engber demonstrates, the notion does not follow from the temperature evidence.  Excerpts in italics with my bolds.

It’s been really, really hot this summer. How hot? Last Friday, the Washington Post put out a series of maps and charts to illustrate the “record-crushing heat.” All-time temperature highs have been measured in “scores of locations on every continent north of the equator,” the article said, while the lower 48 states endured the hottest-ever stretch of temperatures from May until July.

These were not the only records to be set in 2018. Historic heat waves have been crashing all around the world, with records getting shattered in Japan, broken on the eastern coast of Canada, smashed in California, and rewritten in the Upper Midwest. A city in Algeria suffered through the highest high temperature ever recorded in Africa. A village in Oman set a new world record for the highest-ever low temperature. At the end of July, the New York Times ran a feature on how this year’s “record heat wreaked havoc on four continents.” USA Today reported that more than 1,900 heat records had been tied or beaten in just the last few days of May.

While the odds that any given record will be broken may be very, very small, the total number of potential records is mind-blowingly enormous.

There were lots of other records, too, lots and lots and lots—but I think it’s best for me to stop right here. In fact, I think it’s best for all of us to stop reporting on these misleading, imbecilic stats. “Record-setting heat,” as it’s presented in news reports, isn’t really scientific, and it’s almost always insignificant. And yet, every summer seems to bring a flood of new superlatives that pump us full of dread about the changing climate. We’d all be better off without this phony grandiosity, which makes it seem like every hot and humid August is unparalleled in human history. It’s not. Reports that tell us otherwise should be banished from the news.

It’s true the Earth is warming overall, and the record-breaking heat that matters most—the kind we’d be crazy to ignore—is measured on a global scale. The average temperature across the surface of the planet in 2017 was 58.51 degrees, one-and-a-half degrees above the mean for the 20th century. These records matter: 17 of the 18 hottest years on planet Earth have occurred since 2001, and the four hottest-ever years were 2014, 2015, 2016, and 2017. It also matters that this changing climate will result in huge numbers of heat-related deaths. Please pay attention to these terrifying and important facts. Please ignore every other story about record-breaking heat.

You’ll often hear that these two phenomena are related, that local heat records reflect—and therefore illustrate—the global trend. Writing in Slate this past July, Irineo Cabreros explained that climate change does indeed increase the odds of extreme events, making record-breaking heat more likely. News reports often make this point, linking probabilities of rare events to the broader warming pattern. “Scientists say there’s little doubt that the ratcheting up of global greenhouse gases makes heat waves more frequent and more intense,” noted the Times in its piece on record temperatures in Algeria, Hong Kong, Pakistan, and Norway.

Yet this lesson is subtler than it seems. The rash of “record-crushing heat” reports suggest we’re living through a spreading plague of new extremes—that the rate at which we’re reaching highest highs and highest lows is speeding up. When the Post reports that heat records have been set “at scores of locations on every continent,” it makes us think this is unexpected. It suggests that as the Earth gets ever warmer, and the weather less predictable, such records will be broken far more often than they ever have before.

But that’s just not the case. In 2009, climatologist Gerald Meehl and several colleagues published an analysis of records drawn from roughly 2,000 weather stations in the U.S. between 1950 and 2006. There were tens of millions of data points in all—temperature highs and lows from every station, taken every day for more than a half-century. Meehl searched these numbers for the record-setting values—i.e., the days on which a given weather station saw its highest-ever high or lowest-ever low up until that point. When he plotted these by year, they fell along a downward-curving line. Around 50,000 new heat records were being set every year during the 1960s; then that number dropped to roughly 20,000 in the 1980s, and to 15,000 by the turn of the millennium.

From Meehl et al 2009.

This shouldn’t be surprising. As a rule, weather records will be set less frequently as time goes by. The first measurement of temperature that’s ever taken at a given weather station will be its highest (and lowest) of all time, by definition. There’s a good chance that the same station’s reading on Day 2 will be a record, too, since it only needs to beat the temperature recorded on Day 1. But as the weeks and months go by, this record-setting contest gets increasingly competitive: Each new daily temperature must now outdo every single one that came before. If the weather were completely random, we might peg the chances of a record being set at any time as 1/n, where n is the number of days recorded to that point. In other words, one week into your record-keeping, you’d have a 1 in 7 chance of landing on an all-time high. On the 100th day, your odds would have dropped to 1 percent. After 56 years, your chances would be very, very slim.
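Engber’s 1/n reasoning can be checked with a short simulation (my own sketch, not anything from the article or from Meehl): with purely random weather, the fraction of simulated “stations” setting a new all-time high on day t should fall off as roughly 1/t.

```python
import random

def record_counts(n_series=2000, n_days=365, seed=42):
    """For many simulated 'stations' with i.i.d. daily readings,
    count how often day t sets a new all-time high."""
    rng = random.Random(seed)
    records_at = [0] * n_days
    for _ in range(n_series):
        best = float("-inf")
        for t in range(n_days):
            x = rng.random()
            if x > best:          # new all-time high at this station
                best = x
                records_at[t] += 1
    return records_at

counts = record_counts()
# Day 1 is always a record; day t is a record roughly 1/t of the time.
for t in (1, 7, 100, 365):
    print(f"day {t}: observed rate {counts[t - 1] / 2000:.3f}, 1/n predicts {1 / t:.3f}")
```

With 2,000 series the day-7 rate should land near 1/7 and the day-100 rate near one percent, matching the back-of-envelope odds in the paragraph above.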

The weather isn’t random, though; we know it’s warming overall, from one decade to the next. That’s what Meehl et al. were looking at: They figured that a changing climate would tweak those probabilities, goosing the rate of record-breaking highs and tamping down the rate of record-breaking lows. This wouldn’t change the fundamental fact that records get broken much less often as the years go by. (Even though the world is warming, you’d still expect fewer heat records to be set in 2000 than in 1965.) Still, one might guess that climate change would affect the rate, so that more heat records would be set than we’d otherwise expect.

That’s not what Meehl found. Between 1950 and 2006, the rate of record-breaking heat seemed unaffected by large-scale changes to the climate: The number of new records set every year went down from one decade to the next, at a rate that matched up pretty well with what you’d see if the odds were always 1/n. The study did find something more important, though: Record-breaking lows were showing up much less often than expected. From one decade to the next, fewer records of any kind were being set, but the ratio of record lows to record highs was getting smaller over time. By the 2000s, it had fallen to about 0.5, meaning that the U.S. was seeing half as many record-breaking lows as record-breaking highs. (Meehl has since extended this analysis using data going back to 1930 and up through 2015. The results came out the same.)
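Meehl’s asymmetry can likewise be illustrated with a toy model (my own construction, not the study’s method): add a small warming drift to otherwise random annual values and the ratio of record lows to record highs drops below one, even though both kinds of record still become rarer as the years accumulate.

```python
import random

def records_with_trend(trend, n_series=1000, n_years=56, seed=0):
    """Count record highs and record lows across many simulated
    stations whose annual values drift upward by `trend` (in units
    of the noise standard deviation) per year."""
    rng = random.Random(seed)
    highs = lows = 0
    for _ in range(n_series):
        hi = lo = rng.gauss(0, 1)   # year 1 sets both records; not counted
        for t in range(1, n_years):
            x = trend * t + rng.gauss(0, 1)
            if x > hi:
                hi = x
                highs += 1
            if x < lo:
                lo = x
                lows += 1
    return highs, lows

h0, l0 = records_with_trend(0.0)   # stationary climate: lows roughly equal highs
h1, l1 = records_with_trend(0.02)  # modest warming trend: lows suppressed
print(f"no trend: lows/highs = {l0 / h0:.2f}")
print(f"warming:  lows/highs = {l1 / h1:.2f}")
```

The trend value of 0.02 noise-units per year is an arbitrary illustration, not a fitted climate parameter; the qualitative point is that warming shows up far more clearly in the low-to-high ratio than in the raw count of record highs.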

What does all this mean? On one hand, it’s very good evidence that climate change has tweaked the odds for record-breaking weather, at least when it comes to record lows. (Other studies have come to the same conclusion.) On the other hand, it tells us that in the U.S., at least, we’re not hitting record highs more often than we were before, and that the rate isn’t higher than what you’d expect if there weren’t any global warming. In fact, just the opposite is true: As one might expect, heat records are getting broken less often over time, and it’s likely there will be fewer during the 2010s than at any point since people started keeping track.

This may be hard to fathom, given how much coverage has been devoted to the latest bouts of record-setting heat. These extreme events are more unusual, in absolute terms, than they’ve ever been before, yet they’re always in the news. How could that be happening?

While the odds that any given record will be broken may be very, very small, the total number of potential records that could be broken—and then reported in the newspaper—is mind-blowingly enormous. To get a sense of how big this number really is, consider that the National Oceanic and Atmospheric Administration keeps a database of daily records from every U.S. weather station with at least 30 years of data, and that its website lets you search for how many all-time records have been set in any given stretch of time. For instance, the database indicates that during the seven-day period ending on Aug. 17—the date when the Washington Post published its series of “record-crushing heat” infographics—154 heat records were broken.

That may sound like a lot—154 record-high temperatures in the span of just one week. But the NOAA website also indicates how many potential records could have been achieved during that time: 18,953. In actuality, less than one percent of these were broken. You can also pull data on daily maximum temperatures for an entire month: I tried that with August 2017, and then again for months of August at 10-year intervals going back to the 1950s. Each time the query returned at least about 130,000 potential records, of which one or two thousand seemed to be getting broken every year. (There was no apparent trend toward more records being broken over time.)
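A back-of-envelope calculation (my own, assuming a stationary climate and independent station-days) shows why a huge pool guarantees a steady trickle of “records”: a new reading beats all y prior readings for that calendar day with probability about 1/(y + 1).

```python
def expected_records(n_potential, years_of_record):
    """Expected number of new daily records from a pool of
    `n_potential` station-day series, each with `years_of_record`
    prior years, under a stationary climate (probability ~1/(y + 1)
    that the newest reading beats all previous ones)."""
    return n_potential / (years_of_record + 1)

# NOAA's pool for the week in question held 18,953 potential records.
for years in (30, 50, 100):
    n = expected_records(18_953, years)
    print(f"{years}-year histories: ~{n:.0f} records expected by chance")
```

Even with century-long station histories, chance alone would yield on the order of a couple of hundred “record” headlines from that one week’s pool, which is the article’s point about the enormous number of potential records.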

Now let’s say there are 130,000 high-temperature records to be broken every month in the U.S. That’s only half the pool of heat-related records, since the database also lets you search for all-time highest low temperatures. You can also check whether any given highest high or highest low happens to be a record for the entire month in that location, or whether it’s a record when compared across all the weather stations everywhere on that particular day.

Add all of these together and the pool of potential heat records tracked by NOAA appears to number in the millions annually, of which tens of thousands may be broken. Even this vastly underestimates the number of potential records available for media concern. As they’re reported in the news, all-time weather records aren’t limited to just the highest highs or highest lows for a given day in one location. Take, for example, the first heat record mentioned in this column, reported in the Post: The U.S. has just endured the hottest May, June, and July of all time. The existence of that record presupposes many others: What about the hottest April, May and June, or the hottest March, April, and May? What about all the other ways that one might subdivide the calendar?

Geography provides another endless well of flexibility. Remember that the all-time record for the hottest May, June, and July applied only to the lower 48 states. Might a different set of records have been broken if we’d considered Hawaii and Alaska? And what about the records spanning smaller portions of the country, like the Midwest, or the Upper Midwest, or just the state of Minnesota, or just the Twin Cities? And what about the all-time records overseas, describing unprecedented heat in other countries or on other continents?

Even if we did limit ourselves to weather records from a single place measured over a common timescale, it would still be possible to parse out record-breaking heat in a thousand different ways. News reports give separate records, as we’ve seen, for the highest daily high and the highest daily low, but they also tell us when we’ve hit the highest average temperature over several days or several weeks or several months. The Post describes a recent record-breaking streak of days in San Diego with highs of at least 83 degrees. (You’ll find stories touting streaks of daily highs above almost any arbitrary threshold: 90 degrees, 77 degrees, 60 degrees, et cetera.) Records also needn’t focus on the temperature at all: There’s been lots of news in recent weeks about the fact that the U.K. has just endured its driest-ever early summer.

“Record-breaking” summer weather, then, can apply to pretty much any geographical location, over pretty much any span of time. It doesn’t even have to be a record—there’s an endless stream of stories on “near-record heat” in one place or another, or the “fifth-hottest” whatever to happen in wherever, or the fact that it’s been “one of the hottest” yadda-yaddas that yadda-yadda has ever seen. In the most perverse, insane extension of this genre, news outlets sometimes even highlight when a given record isn’t being set.

Loose reports of “record-breaking heat” only serve to puff up muggy weather and make it seem important. (The sham inflations of the wind chill factor do the same for winter months.) So don’t be fooled or flattered by this record-setting hype. Your summer misery is nothing special.

Summary

This article helps people not to confuse weather events with climate.  My disappointment is with the phrase, “Climate Change is Real,” since it is subject to misdirection.  Engber uses that phrase referring to rising average world temperatures, without explaining that such estimates are computer processed reconstructions since the earth has no “average temperature.”  More importantly the undefined “climate change” is a blank slate to which a number of meanings can be attached.

♦  Some take it to mean: It is real that rising CO2 concentrations cause rising global warming.  Yet that is not supported by temperature records.
♦  Others think it means: It is real that using fossil fuels causes global warming.  This too lacks persuasive evidence.

Since 1965 the increase in fossil fuel consumption is dramatic and monotonic (with 2020 an exception), steadily increasing by 218% from 146 to 463 exajoules. Meanwhile the GMT record from Hadcrut shows multiple ups and downs with an accumulated rise of 0.9C over 55 years, 7% of the starting value.

Others know that Global Mean Temperature is a slippery calculation subject to the selection of stations.

Graph showing the correlation between Global Mean Temperature (Average T) and the number of stations included in the global database. Source: Ross McKitrick, U of Guelph

Global warming estimates combine results from adjusted records.

Conclusion

The pattern of high and low records discussed above is consistent with natural variability rather than rising CO2 or fossil fuel consumption. Those of us not alarmed about the reported warming understand that “climate change” is something nature does all the time, and that the future is likely to include periods both cooler and warmer than now.

Background Reading:

The Climate Story (Illustrated)

2021 Update: Fossil Fuels ≠ Global Warming

Man Made Warming from Adjusting Data

What is Global Temperature? Is it warming or cooling?

NOAA US temp 2019 2021

Our Childish Leaders

Donald J. Boudreaux writes at AIER Beware the Allure of Simple ‘Solutions’.  H/T Brownstone Institute.  Excerpts in italics with my bolds and added images.

The attitudes and opinions of today’s so-called “elite” – those public-opinion formers whom Deirdre McCloskey calls “the clerisy” – are childish. Most journalists and writers working for most premier media and entertainment companies, along with most professors and public intellectuals, think, talk, and write about society with the insight of kindergartners.

This sad truth is masked by the one feature that does distinguish the clerisy from young children: verbal virtuosity. Yet beneath the fine words, beautiful phrases, arresting metaphors, and affected allusions lies a notable immaturity of thought.

Every social and economic problem is believed to have a solution,
and that solution is almost always superficial.

Unlike children, adults understand that living life well begins with accepting the inescapability of trade-offs. Contrary to what you might have heard, you cannot “have it all.” You cannot have more of this thing unless you’re willing to have less of that other thing. And what’s true for you as an individual is true for any group of individuals. We Americans cannot have our government artificially raise the cost of producing and using carbon fuels unless we are willing to pay higher prices at the pump and, thus, have less income to spend on acquiring other goods and services. We cannot use money creation to ease the pain today of COVID lockdowns without enduring the greater pain tomorrow of inflation.

While children stomp their little feet in protest when confronted with the need to make trade-offs, the necessity of trade-offs is accepted as a matter of course by adults.

No less importantly, adults, unlike children, are not beguiled by the superficial.

Pay close attention to how the clerisy (who are mostly, although not exclusively, Progressives) propose to ‘solve’ almost any problem, real or imaginary. You’ll discover that the proposed ‘solution’ is superficial; it’s rooted in the naïve assumption that social reality beyond what is immediately observable either doesn’t exist or is unaffected by attempts to rearrange surface phenomena. In the clerisy’s view, the only reality that matters is the reality that is easily seen and seemingly easily manipulated with coercion.

The clerisy’s proposed ‘solutions,’ therefore, involve simply rearranging,
or attempting to rearrange, surface phenomena.

♦  Do some people use guns to murder other people? Yes, sadly. The clerisy’s superficial ‘solution’ to this real problem is to outlaw guns.

♦  Do some people have substantially higher net financial worths than other people? Yes. The clerisy’s juvenile ‘solution’ to this fake problem is to heavily tax the rich and transfer the proceeds to the less rich.

♦  Are some workers paid wages that are too low to support a family in modern America? Yes. The clerisy’s simplistic ‘solution’ to this fake problem – “fake” because most workers earning such low wages are not heads of households – is to have government prohibit the payment of wages below some stipulated minimum.

♦  Do some people suffer substantial property damage, or even loss of life, because of hurricanes, droughts, and other bouts of severe weather? Yes. The clerisy’s lazy ‘solution’ to this real problem focuses on changing the weather by reducing the emissions of an element, carbon, that is now (too simplistically) believed to heavily determine the weather.

♦  Do prices of many ‘essential’ goods and services rise significantly in the immediate aftermath of natural disasters? Yes. The clerisy’s counterproductive ‘solution’ to this fake problem, “counterproductive” and “fake” because these high prices accurately reflect and signal underlying economic realities, is to prohibit the charging and payment of these high prices.

♦  When inflationary pressures build up because of excessive monetary growth, are these pressures vented in the form of rising prices? Yes indeed. The clerisy’s infantile ‘solution’ to the very real problem of inflation is to blame it on greed while raising taxes on profits.

♦  Is the SARS-CoV-2 virus contagious and potentially dangerous to humans? Yes. The clerisy’s simple-minded ‘solution’ to this real problem is to forcibly prevent people from mingling with each other.

♦  Do many Americans still not receive K-12 schooling of minimum acceptable quality? Yes. The clerisy’s lazy ‘solution’ to this real problem is to give pay raises to teachers and spend more money on school administrators.

♦  Do some American workers lose jobs when American consumers buy more imports? Yes. The clerisy’s ‘solution’ is to obstruct consumers’ ability to buy imports.

♦  Are some people bigoted and beset with irrational dislike or fear of blacks, gays, lesbians, and bisexuals? Yes. The clerisy’s ‘solution’ to this real problem is to outlaw “hate” and to compel bigoted persons to behave as if they aren’t bigoted.

♦  Do many persons who are eligible to vote in political elections refrain from voting? Yes. The ‘solution’ favored by at least some of the clerisy to this fake problem – “fake” because in a free society each person has a right to refrain from participating in politics – is to make voting mandatory.

The above list of simplistic and superficial ‘solutions’ to problems real and imaginary
can easily be expanded.

The clerisy, mistaking words for realities, assumes that success at verbally describing realities more to their liking proves that these imagined realities can be made real by merely rearranging the relevant surface phenomena. Members of the clerisy ignore unintended consequences. And they overlook the fact that many of the social and economic realities that they abhor are the result, not of villainy or of correctible imperfections, but of complex trade-offs made by countless individuals.

Social engineering appears doable only to those persons who, seeing only a relatively few surface phenomena, are blind to the astonishing complexity that is ever-churning beneath the surface to create those surface phenomena. To such persons, social reality appears as it does to a child: simple and easily manipulated to achieve whatever are the desires that motivate the manipulators.

The clerisy’s ranks are filled overwhelmingly with simple-minded people who mistake their felicity with words and their good intentions for serious thinking. They convey to each other, and to the unsuspecting public, the appearance of being deep thinkers while seldom thinking with more sophistication and nuance than is on display daily in every classroom of kindergartners.

Progressive Means Experts Rule

Ryan McMaken explains at Mises Why Progressives Love Government “Experts”.  Excerpts in italics with my bolds and added images.  H/T zerohedge

In twenty-first-century America, ordinary people are at the mercy of well-paid, unelected government experts who wield vast power.

That is, we live in the age of the technocrats: people who claim to have special wisdom that entitles them to control, manipulate, and manage society’s institutions using the coercive power of the state.

We’re told these people are “nonpolitical” and will use their impressive scientific knowledge to plan the economy, public health, public safety, or whatever goal the regime has decided the technocrats will be tasked with bringing about.

These people include central bankers, Supreme Court justices, “public health” bureaucrats, and Pentagon generals.

The narrative is that these people are not there to represent the public or bow to political pressure. They’re just there to do “the right thing” as dictated by economic theory, biological sciences, legal theory, or the study of military tactics.  We’re also told that in order to allow these people to act as the purely well-meaning apolitical geniuses they are, we must give them their independence and not question their methods or conclusions.

We were exposed to this routine yet again last week as President Joe Biden announced he will “respect the Fed’s independence” and allow the central bankers to set monetary policy without any bothersome interference from the representatives of the taxpayers who pay all the bills and who primarily pay the price when central bankers make things worse. (Biden, of course, didn’t mention that central bankers have been spectacularly wrong about the inflation threat in recent years, with inflation rates hitting forty-year highs, economic growth going negative, and consumer credit piling up as families struggle to cope with the cost of living.)

Conveniently, Biden’s deferral to the Fed allows him to blame it later when economic conditions get even worse. Nonetheless, his placing the economy in the hands of alleged experts will no doubt appear laudable to many. This is because the public has long been taught by public schools and media outlets that government experts should have the leeway to exercise vast power in the name of “fixing” whatever problems society faces.

The Expert Class as a Tool for State Building

The success of this idea represents a great victory for progressive ideology. Progressives have long been committed to creating a special expert class as a means of building state power. In the United States, for example, the cult of expertise really began to take hold in the late nineteenth and early twentieth centuries, and it led directly to support for more government intervention in the private sector. As Maureen Flanagan notes in “Progressives and Progressivism in an Era of Reform,”

Social science expertise gave political Progressives a theoretical foundation for cautious proposals to create a more activist state…. Professional social scientists composed a tight circle of men who created a space between academia and government from which to advocate for reform. They addressed each other, trained their students to follow their ideas, and rarely spoke to the larger public.

These men founded new organizations—such as the American Economics Association—to promote this new class of experts and their plans for a more centrally planned society. Ultimately, the nature of the expert class was revolutionary. The new social scientists thought they knew better than the patricians, religious leaders, local representatives, and market actors who had long shaped local institutions. Instead,

Progressives were modernizers with a structural-instrumentalist agenda. They rejected reliance on older values and cultural norms to order society and sought to create a modern reordered society with political and economic institutions run by men qualified to apply fiscal expertise, businesslike efficiency, and modern scientific expertise to solve problems and save democracy. The emerging academic disciplines in the social sciences of economics, political economy and political science, and pragmatic education supplied the theoretical bases for this middle-class expert Progressivism.

In the Progressive view, business leaders and machine politicians lacked a rational and broad view of the needs of society. In contrast, the government experts would approach society’s problems as scientists. Johnson felt this model already somewhat existed in the Department of War, where Johnson imagined the secretary of war was “quite free from political pressure and [relied] on the counsel of the engineers.” Johnson imagined that these science-minded bureaucrats could bring a “really economic and scientific application” of policy.

“Disinterested” Central Planners

Johnson was part of a wave of experts and intellectuals attempting to develop “a new realm of state expertise” that favored apolitical technocrats who would plan the nation’s infrastructure and industry. Many historians have recognized that these efforts were fundamentally “state-building activities … [and that] their emergence marked and symbolized a watershed in which an often-undemocratic new politics of administration and interest groups displaced the nineteenth century’s partisan, locally oriented public life.”

In short, these efforts sowed the seeds for the idealized technocracy we have today: unresponsive to the public and imbued with vast coercive power that continually displaces private discretion and private prerogatives.

Indeed, the Progressive devotion to expertise followed “the core pattern of Progressive politics,” which is “the redirection of decision making upward within bureaucracies.” Thus, in contrast to the populist political institutions of an earlier time, decision-making in the Progressive Era became more white-collar, more middle class—as opposed to the working-class party workers—and more hierarchical within bureaucracies directly controlled by the state’s executive agencies.

Who Should Rule?

In many ways, then, this aspect of Progressive ideology turned the political agenda of laissez-faire classical liberalism on its head. Liberals of the Jeffersonian and Jacksonian variety had sought to increase outside political influence in the policy-making process through elections and the appointment of party activists loyal to elected representatives. This was because liberals feared that an insulated class of government experts would function more in its own interests than those of the taxpayers.

The Progressives, however, imagined they could create a disinterested nonpolitical class of experts devoted only to objective science.

The fundamental question, then, became who should rule: insulated experts or nonexpert representatives with closer ties to the taxpayers.  We can see today that the Progressives largely succeeded in granting far greater power to today’s technocratic class of experts. The technocrats are praised for their allegedly scientific focus, and we are told to respect their independence.

If the goal was ever to protect public checks on state power, however, this was always an unworkable ideal. By creating a special class of expert bureaucrats with decades-long careers within the regime itself, we are simply creating a new class of officials able to wield state power with little accountability. Anyone with a sufficiently critical view of state power could see the danger in this. Interestingly, it was anarcho-communist Mikhail Bakunin who recognized the impossibility of solving the problem of state power by putting scientific experts in charge. Such a move only represented a transfer of power from one group to another. Bakunin warned:

The State has always been the patrimony of some privileged class or other; a priestly class, an aristocratic class, a bourgeois class, and finally a bureaucratic class.

It is not necessary, of course, to have full-blown socialism to create this “new class.” The modern state with its mixed economy in most cases already has all the bureaucratic infrastructure necessary to make this a reality. As long as we defer to this ruling class of “scientists and scholars,” the Progressives have won.

Background Why Technocrats Deliver Catastrophes

Finland’s Self-imposed Climate Lockdown

You’d think that politicians would have learned to forgo climate virtue-signaling after seeing the lawfare tactics they invite.  And yet, Finland bravely goes where smarter angels fear to tread.  As the Helsinki Times reports New Climate Change Act into force in July.  Excerpts in italics with my bolds.

The Climate Change Act lays the foundation for national work on climate change in Finland. The reformed Act sets emission reductions targets for 2030, 2040 and 2050. Now the target of a carbon-neutral Finland by 2035 has for the first time been laid down by law.

The Government submitted the bill for approval on 9 June. The President of the Republic is to approve the Act on 10 June and it will enter into force on 1 July 2022.

“The new Climate Change Act is vital for Finland. The Climate Change Act ensures that ambitious climate work will continue across government terms. The Act shows the world how we can build a carbon-neutral welfare state by 2035. It is also a strong signal for companies that in Finland clean solutions are well worth investing in,” says Minister of the Environment and Climate Change Maria Ohisalo.

Minister of the Environment and Climate Change Maria Ohisalo at a press event in Helsinki. LEHTIKUVA

The Act lays down provisions on the climate change policy plans. The scope of the Act will be extended to also cover emissions from the land use sector, i.e. land use, forestry and agriculture, and it will for the first time include the objective to strengthen carbon sinks.

“Including land use in the Climate Change Act is a significant improvement. We have a lot of opportunities to reduce emissions and strengthen carbon sinks in the land use sector – in forests, construction and agriculture,” Minister Ohisalo says.

The previous Climate Change Act entered into force in 2015, and it set an emission reduction target only for 2050. The new Climate Change Act will include emission reduction targets for 2030 and 2040 that are based on the recommendations of the Finnish Climate Change Panel, and the target for 2050 will be updated.

The emission reduction targets are -60% by 2030, -80% by 2040 and at least -90% but aiming at -95% by 2050, compared to the levels in 1990.

Finns have lost any room to maneuver, or to walk back ill-advised policies should the future be cooler rather than the warming of which they are so certain.  The lawyers will be all over them to prevent any escape.  To use another metaphor, they are lobsters who put themselves into the pots; there will be no getting out or going free.
See Also Dutch Judges Dictate Energy Policy

See Also Climate Tyranny By Way of Criminal Law