How to FLICC Off Climate Alarms

John Ridgway has provided an excellent framework for skeptics to examine and respond to claims from believers in global warming/climate change.  His essay at Climate Scepticism is Deconstructing Scepticism: The True FLICC.  Excerpts in italics with my bolds and added comments.

Overview

I have modified the FLICC components slightly to serve as a list of actions making up a skeptical approach to an alarmist claim.  In other words, this is a checklist for applying critical intelligence to alarmist discourse in the public arena. The summary can be stated as follows:

♦  Follow the Data
Find and follow the data and facts to where they lead

♦  Look for full risk profile
Look for a complete assessment of risks and costs from proposed policies

♦  Interrogate causal claims
Inquire into claimed cause-effect relationships

♦  Compile contrary explanations
Construct an organized view of contradictory evidence to the theory

♦  Confront cultural bias
Challenge attempts to promote consensus story with flimsy coincidence

A Case In Point

John Ridgway illustrates how this method works in a comment:

No sooner had I pressed the publish button than the BBC came out with the perfect example of what I have been writing about:  Climate change: Rising sea levels threaten 200,000 England properties

It tells of a group of experts theorizing that 200,000 coastal properties are soon to be lost due to climate change. Indeed, it “is already happening” as far as Happisburgh on the Norfolk coast is concerned. Coastal erosion is indeed a problem there.

But did the experts take into account that the data shows no acceleration of erosion over the last 2000 years? No.

Have they acknowledged the fact that erosion on the East coast is a legacy of glaciation? No.

[For the US example of this claim, see my post Sea Level Scare Machine]

The FLICC Framework

Below is Ridgway’s text regarding this thought process, followed by a synopsis of his discussion of the five elements. Text is in italics with my bolds.

As part of the anthropogenic climate change debate, and when discussing the proposed plans for transition to Net Zero, efforts have been made to analyse the thinking that underpins the typical sceptic’s position. These analyses have universally presupposed that such scepticism stubbornly persists in the face of overwhelming evidence, as reflected in the widespread use of the term ‘denier’. Consequently, they are based upon taxonomies of flawed reasoning and methods of deception and misinformation.1 

However, by taking such a prejudicial approach, the analyses have invariably failed to acknowledge the ideological, philosophical and psychological bases for sceptical thinking. The following taxonomy redresses that failing and, as a result, offers a more pertinent analysis that avoids the worst excesses of opinionated philippic. The taxonomy identifies a basic set of ideologies and attitudes that feature prominently in the typical climate change sceptic’s case. For my taxonomy I have chosen the acronym FLICC:2

  • Follow data but distrust judgement and speculation

     i.e. value empirical evidence over theory and conjecture.

  • Look for the full risk profile

      i.e. when considering the management of risks and uncertainties, demand that those associated with mitigating and preventative measures are also taken into account.

  • Interrogate causal arguments

      i.e. demand that both necessity and sufficiency form the basis of a causal analysis.

  • Contrariness

      i.e. distrust consensus as an indicator of epistemological value.

  • Cultural awareness

       i.e. never underestimate the extent to which a society can fabricate a truth for its own purposes.

All of the above have a long and legitimate history outside the field of climate science. The suggestion that they are not being applied in good faith by climate change sceptics falls beyond the remit of taxonomical analysis and strays into the territory of propaganda and ad hominem.

The five ideologies and attitudes of climate change scepticism introduced above are now discussed in greater detail.

Following the data

Above all else, the sceptical approach is characterized by a reluctance to draw conclusions from a given body of evidence. When it comes to evidence supporting the idea of a ‘climate crisis’, such reluctance is judged by many to be pathological and indicative of motivated reasoning. Cognitive scientists use the term ‘conservative belief revision’ to refer to an undue reluctance to update beliefs in accordance with a new body of evidence. More precisely, when the individual retains the view that events have a random pattern, thereby downplaying the possibility of a causative factor, the term used is ‘slothful induction’. Either way, the presupposition is that the individual is committing a logical fallacy resulting from cognitive bias.

However, far from being a pathology of thinking, such reluctance has its legitimate foundations in Pyrrhonian philosophy and, when properly understood, it can be seen as an important thinking strategy.3 Conservative belief revision and slothful induction can indeed lead to false conclusions but, more importantly, the error most commonly encountered when making decisions under uncertainty (and the one with the greatest potential for damage) is to downplay unknown and possibly random factors and instead construct a narrative that overstates and prejudges causation. This tendency is central to the human condition and it lies at the heart of our failure to foresee the unexpected – this is the truly important cognitive bias that the sceptic seeks to avoid.

The empirical sceptic is cognisant of evidence and allows the formulation of theories but treats them with considerable caution due to the many ways in which such theories often entail unwarranted presupposition.

The drivers behind this problem are the propensity of the human mind to seek patterns, to construct narratives that hide complexities, to over-emphasise the causative role played by human agents and to under-emphasise the role played by external and possibly random factors. Ultimately, it is a problem regarding the comprehension of uncertainty — we comprehend in a manner that has served us well in evolutionary terms but has left us vulnerable to unprecedented, high consequence events.

It is often said that a true sceptic is one who is prepared to accept the prevailing theory once the evidence is ‘overwhelming’. The climate change sceptic’s reluctance to do so is taken as an indication that he or she is not a true sceptic. However, we see here that true scepticism lies in the willingness to challenge the idea that the evidence is overwhelming – it only seems overwhelming to those who fail to recognise the ‘theorizing disease’ and lack the resolve to resist it. Secondly, there cannot be a climate change sceptic alive who is not painfully aware of the humiliation handed out to those who resist the theorizing.

In practice, the theorizing and the narratives that trouble the empirical sceptic take many forms. It can be seen in:

♦  over-dependence upon mathematical models for which the tuning owes more to art than science.

♦  readiness to treat the output of such models as data resulting from experiment, rather than the hypotheses they are.

♦  lack of regard for ontological uncertainty (i.e. the unknown unknowns which, due to their very nature, the models do not address).

♦  emergence of story-telling as a primary weapon in the armoury of extreme weather event attribution.

♦  willingness to commit trillions of pounds to courses of action that are predicated upon Representative Concentration Pathways and economic models that are the ‘theorizing disease’ writ large.

♦  contributions of the myriad of activists who seek to portray the issues in a narrative form laden with social justice and other ethical considerations.

♦  imaginative but simplistic portrayals of climate change sceptics and their motives; portrayals that draw categorical conclusions that cannot possibly be justified given the ‘evidence’ offered; and

♦  any narrative that turns out to be unfounded when one follows the data.

Climate change may have its basis in science and data, but this basis has long since been overtaken by a plethora of theorizing and causal narrative that sometimes appears to have taken on a life of its own. Is this what settled science is supposed to look like?

Looking for the full risk profile

Almost as fundamental as the sceptic’s resistance to theorizing and narrative is his or her appreciation that the management of anthropogenic warming (particularly the transition to Net Zero) is an undertaking beset with risk and uncertainty. This concern reflects a fundamental principle of risk management: proposed actions to tackle a risk are often in themselves problematic and so a full risk analysis is not complete until it can be confirmed that the net risk will decrease following the actions proposed.7

Firstly, the narrative of existential risk is rejected on the grounds of empirical scepticism (the evidence for an existential threat is not overwhelming, it is underwhelming).

Secondly, even if the narrative is accepted, it has not been reliably demonstrated that the proposal for Net Zero transition is free from existential or extreme risks.

Indeed, given the dominant role played by the ‘theorizing disease’ and how it lies behind our inability to envisage the unprecedented high consequence event, there is every reason to believe that the proposals for Net Zero transition should be equally subject to the precautionary principle. The fact that they are not is indicative of a double standard being applied. The argument seems to run as follows: There is no uncertainty regarding the physical risk posed by climate change, but if there were it would only add to the imperative for action. There is also no uncertainty regarding the transition risk, but if there were it could be ignored because one can only apply the precautionary principle once!

This is precisely the sort of inconsistency one encounters when uncertainties are rationalised away in order to support the favoured narrative.

The upshot of this double standard is that the activists appear to be proceeding with two very different risk management frameworks depending upon whether physical or transition risk is being considered. As a result, risks associated with renewable energy security, the environmental damage associated with proposals to reduce carbon emissions and the potentially catastrophic effects of the inevitable global economic shock are all played down or explained away.

Looking for the full risk profile is a basic of risk management practice. The fact that it is seen as a ploy used only by those wishing to oppose the management of anthropogenic climate change is both odd and worrying. It is indeed important to the sceptic, but it should be important to everyone.

Interrogating causal arguments

For many years we have been told that anthropogenic climate change will make bad things happen. These dire predictions were supposed to galvanize the world into action but that didn’t happen, no doubt partly due to the extent to which such predictions repeatedly failed to come true (as, for example, with the predictions of the disappearance of Arctic sea ice). . . This is one good reason for the empirical sceptic to distrust the narrative,8 but an even better one lies in the very concept of causation.

A major purpose of narrative is to reduce complexity so that the ‘truth’ can shine through. This is particularly the case with causal narratives. We all want executive summaries and sound bites such as ‘Y happened because of X’. But very few of us are interested in examining exactly what we mean by such statements – very few except, of course, for the empirical sceptics. In a messy world in which many factors may be at play, the more pertinent questions are:

♦  To what extent was X necessary for Y to happen?
♦  To what extent was X sufficient for Y to happen?

The vast majority of the extreme weather event attribution narrative is focused upon the first question and very little attention is paid to the second; at least not in the many press bulletins issued. Basically, we are told that the event was virtually impossible without climate change, but very little is said regarding whether climate change on its own was enough.

This problem of oversimplification is even more worrying once one starts to examine consequential damages whilst failing to take into account man-made failings such as those that exacerbate the impacts of floods and forest fires.9  The oversimplification of causal narrative is not restricted to weather-related events, of course. Climate change, we are told, is wreaking havoc with the flora and fauna and many species are dying out as a result. However, when such claims are examined more closely,10 it is invariably the case that climate change has been lumped in with a number of other factors that are destroying habitat.

When climate change sceptics point this out they are, of course, accused of cherry-picking. The truth, however, is that their insistence that the extended causal narrative of necessity and sufficiency should be respected is nothing more than the consequence of following the data and looking for the full risk profile.

Contrariness

The climate change debate is all about making decisions under uncertainty, so it is little surprise that gaining consensus is seen as centrally important. Uncertainty is reduced when the evidence is overwhelming and it is tempting to believe that the high level of consensus amongst climate scientists surely points towards there being overwhelming evidence. If one accepts this logic then the sceptic’s refusal to accept the consensus is just another manifestation of his or her denial.

Except, of course, an empirical sceptic would not accept this logic. Consensus does not result from a simple examination of concordant evidence, it is instead the fruit of the tendentious theorizing and simplifying narrative that the empirical sceptic intuitively distrusts. As explained above, there are a number of drivers that cause such theories and narratives to entail unwarranted presupposition, and it is naïve to believe that scientists are immune to such drivers.

However, the fact remains that consensus on beliefs is neither a sufficient nor a necessary condition for presuming that these beliefs constitute shared knowledge. It is only when a consensus on beliefs is uncoerced, uniquely heterogeneous and large, that a shared knowledge provides the best explanation of a given consensus.11 The notion that a scientific consensus can be trusted because scientists are permanently seeking to challenge accepted views is simplistic at best.

It is actually far from obvious that in climate science the conditions have been met for consensus to be a reliable indicator of shared knowledge.

Contrariness simply comes with the territory of being an empirical sceptic. The evidence of consensus is there to be seen, but the amount of theorizing and narrative required for its genesis, together with the social dimension to consensus generation, are enough for the empirical sceptic to treat the whole matter of consensus with a great deal of caution.

Cultural awareness

There has been a great deal said already regarding the culture wars surrounding issues such as the threat posed by anthropogenic climate change. Most of the concerns are directed at the sceptic, who for reasons never properly explained is deemed to be the instigator of the conflict. However, it is the sceptic who chooses to point out that the value-laden arguments offered by climate activists are best understood as part of a wider cultural movement in which rationality is subordinate to in-group versus out-group dynamics.

Psychological, ethical and spiritual needs lie at the heart of the development of culture and so the adoption of the climate change phenomenon in service of these needs has to be seen as essentially a cultural power play. The dangers of uncritically accepting the fruits of theorizing and narrative are only the beginning of the empirical sceptic’s concerns. Beyond that is the concern that the direction the debate is taking is not even a matter of empiricism – data analysis has little to offer when so much depends upon whether the phenomenon is subsequently to be described as warming or heating. It is for this reason that much of the sceptic’s attention is directed towards the manner in which the science features in our culture rather than the science itself. Such are our psychological, ethical and spiritual needs, that we must not underestimate the extent to which ostensibly scientific output can be moulded in their service.

Conclusions

Taxonomies of thinking should not be treated too seriously. Whilst I hope that I have offered here a welcome antidote to the diatribe that often masquerades as a scholarly appraisal of climate change scepticism, it remains the case that the form that scepticism takes will be unique to the individual. I could not hope to cover all aspects of climate change scepticism in the limited space available to me, but it remains my belief that there are unifying principles that can be identified.

Central to these is the concept of the empirical sceptic and the need to understand that there are sound reasons to treat theorizing and simplifying narratives with extreme caution. The empirical sceptic resists the temptation to theorize, preferring instead to keep an open mind on the interpretation of the evidence. This is far from being self-serving denialism; it is instead a self-denying servitude to the data.

That said, I cannot believe that there would be any activist who, upon reading this account, would see a reason to modify their opinions regarding the bad faith and irrationality that lies behind scepticism. This, unfortunately, is only to be expected given that such opinions are themselves the result of theorizing and simplifying narrative.

Footnote:

While the above focuses on climate alarmism, there are many other social and political initiatives that are theory-driven, suffering from inadequate attention to analysis by empirical sceptics.  One has only to note corporate and governmental programs based on Critical Race or Gender theories.  In addition, COVID policies in advanced nations ignored the required full risk profiling, as well as overturning decades of epidemiological knowledge in favor of models and experimental gene therapies proposed by Big Pharma.


June 2022 Heat Records Silly Season Again


A glance at the news aggregator shows the silly season is in full swing.  Below is a partial listing of recent headlines proclaiming the hottest whatever.

  • Temperatures hit 43C in Spain’s hottest spring heatwave in decades The Independent
  • How to sleep during a heatwave, according to experts The Independent
  • Climate crisis focus of NASA chief’s visit The University of Edinburgh
  • Video: US hit by floods, mudslides, wildfires resembling ‘an erupting volcano’ and a record heatwave in two days Sky News
  • Rising beaches suggest Antarctic glaciers are melting faster than ever New Atlas
  • Dangerous heat grips US through midweek as wildfires explode in West The Independent
  • In hottest city on Earth, mothers bear brunt of climate change Yahoo! UK & Ireland
  • ‘Earthworms on steroids’ are spreading like wild in Connecticut The Independent
  • The Guardian view on an Indian summer: human-made heatwaves are getting hotter The Guardian
  • UK weather: Britain could bask in warmest June day ever with 35C on Friday Mail Online
  • Climate Change Causes Melting Permafrost in Alaska Nature World News
  • Spain in grip of heatwave with temperatures forecast to hit 44C The Guardian

Time for some Clear Thinking about Heat Records (Previous Post)

Here is an analysis using critical intelligence to interpret media reports about temperature records this summer. Daniel Engber writes in Slate Crazy From the Heat

The subtitle is Climate change is real. Record-high temperatures everywhere are fake.  As we shall see from the excerpts below, the first sentence is a statement of faith, since, as Engber demonstrates, the notion does not follow from the temperature evidence. Excerpts in italics with my bolds.

It’s been really, really hot this summer. How hot? Last Friday, the Washington Post put out a series of maps and charts to illustrate the “record-crushing heat.” All-time temperature highs have been measured in “scores of locations on every continent north of the equator,” the article said, while the lower 48 states endured the hottest-ever stretch of temperatures from May until July.

These were not the only records to be set in 2018. Historic heat waves have been crashing all around the world, with records getting shattered in Japan, broken on the eastern coast of Canada, smashed in California, and rewritten in the Upper Midwest. A city in Algeria suffered through the highest high temperature ever recorded in Africa. A village in Oman set a new world record for the highest-ever low temperature. At the end of July, the New York Times ran a feature on how this year’s “record heat wreaked havoc on four continents.” USA Today reported that more than 1,900 heat records had been tied or beaten in just the last few days of May.

While the odds that any given record will be broken may be very, very small, the total number of potential records is mind-blowingly enormous.

There were lots of other records, too, lots and lots and lots—but I think it’s best for me to stop right here. In fact, I think it’s best for all of us to stop reporting on these misleading, imbecilic stats. “Record-setting heat,” as it’s presented in news reports, isn’t really scientific, and it’s almost always insignificant. And yet, every summer seems to bring a flood of new superlatives that pump us full of dread about the changing climate. We’d all be better off without this phony grandiosity, which makes it seem like every hot and humid August is unparalleled in human history. It’s not. Reports that tell us otherwise should be banished from the news.

It’s true the Earth is warming overall, and the record-breaking heat that matters most—the kind we’d be crazy to ignore—is measured on a global scale. The average temperature across the surface of the planet in 2017 was 58.51 degrees, one-and-a-half degrees above the mean for the 20th century. These records matter: 17 of the 18 hottest years on planet Earth have occurred since 2001, and the four hottest-ever years were 2014, 2015, 2016, and 2017. It also matters that this changing climate will result in huge numbers of heat-related deaths. Please pay attention to these terrifying and important facts. Please ignore every other story about record-breaking heat.

You’ll often hear that these two phenomena are related, that local heat records reflect—and therefore illustrate—the global trend. Writing in Slate this past July, Irineo Cabreros explained that climate change does indeed increase the odds of extreme events, making record-breaking heat more likely. News reports often make this point, linking probabilities of rare events to the broader warming pattern. “Scientists say there’s little doubt that the ratcheting up of global greenhouse gases makes heat waves more frequent and more intense,” noted the Times in its piece on record temperatures in Algeria, Hong Kong, Pakistan, and Norway.

Yet this lesson is subtler than it seems. The rash of “record-crushing heat” reports suggest we’re living through a spreading plague of new extremes—that the rate at which we’re reaching highest highs and highest lows is speeding up. When the Post reports that heat records have been set “at scores of locations on every continent,” it makes us think this is unexpected. It suggests that as the Earth gets ever warmer, and the weather less predictable, such records will be broken far more often than they ever have before.

But that’s just not the case. In 2009, climatologist Gerald Meehl and several colleagues published an analysis of records drawn from roughly 2,000 weather stations in the U.S. between 1950 and 2006. There were tens of millions of data points in all—temperature highs and lows from every station, taken every day for more than a half-century. Meehl searched these numbers for the record-setting values—i.e., the days on which a given weather station saw its highest-ever high or lowest-ever low up until that point. When he plotted these by year, they fell along a downward-curving line. Around 50,000 new heat records were being set every year during the 1960s; then that number dropped to roughly 20,000 in the 1980s, and to 15,000 by the turn of the millennium.

From Meehl et al 2009.

This shouldn’t be surprising. As a rule, weather records will be set less frequently as time goes by. The first measurement of temperature that’s ever taken at a given weather station will be its highest (and lowest) of all time, by definition. There’s a good chance that the same station’s reading on Day 2 will be a record, too, since it only needs to beat the temperature recorded on Day 1. But as the weeks and months go by, this record-setting contest gets increasingly competitive: Each new daily temperature must now outdo every single one that came before. If the weather were completely random, we might peg the chances of a record being set at any time as 1/n, where n is the number of days recorded to that point. In other words, one week into your record-keeping, you’d have a 1 in 7 chance of landing on an all-time high. On the 100th day, your odds would have dropped to 1 percent. After 56 years, your chances would be very, very slim.

The weather isn’t random, though; we know it’s warming overall, from one decade to the next. That’s what Meehl et al. were looking at: They figured that a changing climate would tweak those probabilities, goosing the rate of record-breaking highs and tamping down the rate of record-breaking lows. This wouldn’t change the fundamental fact that records get broken much less often as the years go by. (Even though the world is warming, you’d still expect fewer heat records to be set in 2000 than in 1965.) Still, one might guess that climate change would affect the rate, so that more heat records would be set than we’d otherwise expect.

That’s not what Meehl found. Between 1950 and 2006, the rate of record-breaking heat seemed unaffected by large-scale changes to the climate: The number of new records set every year went down from one decade to the next, at a rate that matched up pretty well with what you’d see if the odds were always 1/n. The study did find something more important, though: Record-breaking lows were showing up much less often than expected. From one decade to the next, fewer records of any kind were being set, but the ratio of record lows to record highs was getting smaller over time. By the 2000s, it had fallen to about 0.5, meaning that the U.S. was seeing half as many record-breaking lows as record-breaking highs. (Meehl has since extended this analysis using data going back to 1930 and up through 2015. The results came out the same.)

What does all this mean? On one hand, it’s very good evidence that climate change has tweaked the odds for record-breaking weather, at least when it comes to record lows. (Other studies have come to the same conclusion.) On the other hand, it tells us that in the U.S., at least, we’re not hitting record highs more often than we were before, and that the rate isn’t higher than what you’d expect if there weren’t any global warming. In fact, just the opposite is true: As one might expect, heat records are getting broken less often over time, and it’s likely there will be fewer during the 2010s than at any point since people started keeping track.

This may be hard to fathom, given how much coverage has been devoted to the latest bouts of record-setting heat. These extreme events are more unusual, in absolute terms, than they’ve ever been before, yet they’re always in the news. How could that be happening?

While the odds that any given record will be broken may be very, very small, the total number of potential records that could be broken—and then reported in the newspaper—is mind-blowingly enormous. To get a sense of how big this number really is, consider that the National Oceanic and Atmospheric Administration keeps a database of daily records from every U.S. weather station with at least 30 years of data, and that its website lets you search for how many all-time records have been set in any given stretch of time. For instance, the database indicates that during the seven-day period ending on Aug. 17—the date when the Washington Post published its series of “record-crushing heat” infographics—154 heat records were broken.

That may sound like a lot—154 record-high temperatures in the span of just one week. But the NOAA website also indicates how many potential records could have been achieved during that time: 18,953. In actuality, less than one percent of these were broken. You can also pull data on daily maximum temperatures for an entire month: I tried that with August 2017, and then again for months of August at 10-year intervals going back to the 1950s. Each time the query returned roughly 130,000 potential records, of which one or two thousand seemed to be getting broken every year. (There was no apparent trend toward more records being broken over time.)
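The “less than one percent” claim follows directly from the two NOAA figures quoted above:

```python
# Figures quoted above from the NOAA daily records database.
records_broken = 154          # heat records broken in the week ending Aug. 17
potential_records = 18_953    # potential records that could have been set
fraction = records_broken / potential_records
# 154 / 18,953 is about 0.8% -- "less than one percent", as stated.
print(f"{fraction:.2%} of potential weekly heat records were broken")
```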

Now let’s say there are 130,000 high-temperature records to be broken every month in the U.S. That’s only half the pool of heat-related records, since the database also lets you search for all-time highest low temperatures. You can also check whether any given highest high or highest low happens to be a record for the entire month in that location, or whether it’s a record when compared across all the weather stations everywhere on that particular day.

Add all of these together and the pool of potential heat records tracked by NOAA appears to number in the millions annually, of which tens of thousands may be broken. Even this vastly underestimates the number of potential records available for media concern. As they’re reported in the news, all-time weather records aren’t limited to just the highest highs or highest lows for a given day in one location. Take, for example, the first heat record mentioned in this column, reported in the Post: The U.S. has just endured the hottest May, June, and July of all time. The existence of that record presupposes many others: What about the hottest April, May and June, or the hottest March, April, and May? What about all the other ways that one might subdivide the calendar?

Geography provides another endless well of flexibility. Remember that the all-time record for the hottest May, June, and July applied only to the lower 48 states. Might a different set of records have been broken if we’d considered Hawaii and Alaska? And what about the records spanning smaller portions of the country, like the Midwest, or the Upper Midwest, or just the state of Minnesota, or just the Twin Cities? And what about the all-time records overseas, describing unprecedented heat in other countries or on other continents?

Even if we did limit ourselves to weather records from a single place measured over a common timescale, it would still be possible to parse out record-breaking heat in a thousand different ways. News reports give separate records, as we’ve seen, for the highest daily high and the highest daily low, but they also tell us when we’ve hit the highest average temperature over several days or several weeks or several months. The Post describes a recent record-breaking streak of days in San Diego with highs of at least 83 degrees. (You’ll find stories touting streaks of daily highs above almost any arbitrary threshold: 90 degrees, 77 degrees, 60 degrees, et cetera.) Records also needn’t focus on the temperature at all: There’s been lots of news in recent weeks about the fact that the U.K. has just endured its driest-ever early summer.

“Record-breaking” summer weather, then, can apply to pretty much any geographical location, over pretty much any span of time. It doesn’t even have to be a record—there’s an endless stream of stories on “near-record heat” in one place or another, or the “fifth-hottest” whatever to happen in wherever, or the fact that it’s been “one of the hottest” yadda-yaddas that yadda-yadda has ever seen. In the most perverse, insane extension of this genre, news outlets sometimes even highlight when a given record isn’t being set.
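The claim that potential records "number in the millions" is easy to sanity-check with a back-of-envelope count. The station count below is an assumed round figure for illustration, not a number from the article.

```python
# Order-of-magnitude count of potential daily temperature records.
# The ~10,000-station figure is an assumption, not from the article.

stations = 10_000     # assumed number of reporting stations
days_per_year = 365
record_types = 4      # highest high, lowest high, highest low, lowest low

potential_records = stations * days_per_year * record_types
print(potential_records)  # 14600000 -- "millions" annually, before even
                          # counting multi-day, regional, or seasonal records
```

And that is before multiplying by the endless combinations of date ranges and geographies the article describes.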

Loose reports of “record-breaking heat” only serve to puff up muggy weather and make it seem important. (The sham inflations of the wind chill factor do the same for winter months.) So don’t be fooled or flattered by this record-setting hype. Your summer misery is nothing special.

Summary

This article helps people not to confuse weather events with climate.  My disappointment is with the phrase “Climate Change is Real,” since it is subject to misdirection.  Engber uses that phrase referring to rising average world temperatures, without explaining that such estimates are computer-processed reconstructions, since the earth has no single “average temperature.”  More importantly, the undefined “climate change” is a blank slate to which a number of meanings can be attached.

♦  Some take it to mean: It is real that rising CO2 concentrations cause rising global warming.  Yet that is not supported by temperature records.

♦  Others think it means: It is real that using fossil fuels causes global warming.  This too lacks persuasive evidence.

Since 1965 the increase in fossil fuel consumption has been dramatic and monotonic (2020 being an exception), rising roughly 217% from 146 to 463 exajoules. Meanwhile the GMT record from HadCRUT shows multiple ups and downs, with an accumulated rise of 0.9°C over 55 years, about 7% of the starting value.
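The comparison above can be checked with a few lines of arithmetic. The consumption figures are those cited in the text; the ~13.5°C baseline global mean temperature is my assumption for illustration, since the text gives only the 7% figure.

```python
# Back-of-envelope check of the fossil fuel vs. temperature comparison.
# Consumption figures (146 and 463 exajoules) are from the text;
# the 13.5 C baseline GMT is an assumed value for illustration.

def pct_increase(start, end):
    """Percentage increase from start to end."""
    return (end - start) / start * 100

fuel_rise = pct_increase(146, 463)              # ~217% rise in consumption
temp_rise_c = 0.9                                # accumulated GMT rise (HadCRUT)
baseline_c = 13.5                                # assumed starting global mean temp
temp_rise_pct = temp_rise_c / baseline_c * 100   # ~7%

print(round(fuel_rise))      # 217
print(round(temp_rise_pct))  # 7
```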

Others know that Global Mean Temperature is a slippery calculation subject to the selection of stations.

Graph showing the correlation between Global Mean Temperature (Average T) and the number of stations included in the global database. Source: Ross McKitrick, U of Guelph

Global warming estimates combine results from adjusted records.
Conclusion

The pattern of high and low records discussed above is consistent with natural variability rather than rising CO2 or fossil fuel consumption. Those of us not alarmed about the reported warming understand that “climate change” is something nature does all the time, and that the future is likely to include periods both cooler and warmer than now.

Background Reading:

The Climate Story (Illustrated)

2021 Update: Fossil Fuels ≠ Global Warming

Man Made Warming from Adjusting Data

What is Global Temperature? Is it warming or cooling?


Our Childish Leaders

Donald J. Boudreaux writes at AIER Beware the Allure of Simple ‘Solutions’.  H/T Brownstone Institute.  Excerpts in italics with my bolds and added images.

The attitudes and opinions of today’s so-called “elite” – those public-opinion formers who Deirdre McCloskey calls “the clerisy” – are childish. Most journalists and writers working for most premier media and entertainment companies, along with most professors and public intellectuals, think, talk, and write about society with the insight of kindergartners.

This sad truth is masked by the one feature that does distinguish the clerisy from young children: verbal virtuosity. Yet beneath the fine words, beautiful phrases, arresting metaphors, and affected allusions lies a notable immaturity of thought.

Every social and economic problem is believed to have a solution,
and that solution is almost always superficial.

Unlike children, adults understand that living life well begins with accepting the inescapability of trade-offs. Contrary to what you might have heard, you cannot “have it all.” You cannot have more of this thing unless you’re willing to have less of that other thing. And what’s true for you as an individual is true for any group of individuals. We Americans cannot have our government artificially raise the cost of producing and using carbon fuels unless we are willing to pay higher prices at the pump and, thus, have less income to spend on acquiring other goods and services. We cannot use money creation to ease the pain today of COVID lockdowns without enduring the greater pain tomorrow of inflation.

While children stomp their little feet in protest when confronted with the need to make trade-offs, the necessity of trade-offs is accepted as a matter of course by adults.

No less importantly, adults, unlike children, are not beguiled by the superficial.

Pay close attention to how the clerisy (who are mostly, although not exclusively, Progressives) propose to ‘solve’ almost any problem, real or imaginary. You’ll discover that the proposed ‘solution’ is superficial; it’s rooted in the naïve assumption that social reality beyond what is immediately observable either doesn’t exist or is unaffected by attempts to rearrange surface phenomena. In the clerisy’s view, the only reality that matters is the reality that is easily seen and seemingly easily manipulated with coercion.

The clerisy’s proposed ‘solutions,’ therefore, involve simply rearranging,
or attempting to rearrange, surface phenomena.

♦  Do some people use guns to murder other people? Yes, sadly. The clerisy’s superficial ‘solution’ to this real problem is to outlaw guns.

♦  Do some people have substantially higher net financial worths than other people? Yes. The clerisy’s juvenile ‘solution’ to this fake problem is to heavily tax the rich and transfer the proceeds to the less rich.

♦  Are some workers paid wages that are too low to support a family in modern America? Yes. The clerisy’s simplistic ‘solution’ to this fake problem – “fake” because most workers earning such low wages are not heads of households – is to have government prohibit the payment of wages below some stipulated minimum.

♦  Do some people suffer substantial property damage, or even loss of life, because of hurricanes, droughts, and other bouts of severe weather? Yes. The clerisy’s lazy ‘solution’ to this real problem focuses on changing the weather by reducing the emissions of an element, carbon, that is now (too simplistically) believed to heavily determine the weather.

♦  Do prices of many ‘essential’ goods and services rise significantly in the immediate aftermath of natural disasters? Yes. The clerisy’s counterproductive ‘solution’ to this fake problem, “counterproductive” and “fake” because these high prices accurately reflect and signal underlying economic realities, is to prohibit the charging and payment of these high prices.

♦  When inflationary pressures build up because of excessive monetary growth, are these pressures vented in the form of rising prices? Yes indeed. The clerisy’s infantile ‘solution’ to the very real problem of inflation is to blame it on greed while raising taxes on profits.

♦  Is the SARS-CoV-2 virus contagious and potentially dangerous to humans? Yes. The clerisy’s simple-minded ‘solution’ to this real problem is to forcibly prevent people from mingling with each other.

♦  Do many Americans still not receive K-12 schooling of minimum acceptable quality? Yes. The clerisy’s lazy ‘solution’ to this real problem is to give pay raises to teachers and spend more money on school administrators.

♦  Do some American workers lose jobs when American consumers buy more imports? Yes. The clerisy’s ‘solution’ is to obstruct consumers’ ability to buy imports.

♦  Are some people bigoted and beset with irrational dislike or fear of blacks, gays, lesbians, and bisexuals? Yes. The clerisy’s ‘solution’ to this real problem is to outlaw “hate” and to compel bigoted persons to behave as if they aren’t bigoted.

♦  Do many persons who are eligible to vote in political elections refrain from voting? Yes. The ‘solution’ favored by at least some of the clerisy to this fake problem – “fake” because in a free society each person has a right to refrain from participating in politics – is to make voting mandatory.

The above list of simplistic and superficial ‘solutions’ to problems real and imaginary
can easily be expanded.

The clerisy, mistaking words for realities, assumes that success at verbally describing realities more to their liking proves that these imagined realities can be made real by merely rearranging the relevant surface phenomena. Members of the clerisy ignore unintended consequences. And they overlook the fact that many of the social and economic realities that they abhor are the result, not of villainy or of correctible imperfections, but of complex trade-offs made by countless individuals.

Social engineering appears doable only to those persons who, seeing only a relatively few surface phenomena, are blind to the astonishing complexity that is ever-churning beneath the surface to create those surface phenomena. To such persons, social reality appears as it does to a child: simple and easily manipulated to achieve whatever are the desires that motivate the manipulators.

The clerisy’s ranks are filled overwhelmingly with simple-minded people who mistake their felicity with words and their good intentions for serious thinking. They convey to each other, and to the unsuspecting public, the appearance of being deep thinkers while seldom thinking with more sophistication and nuance than is on display daily in every classroom of kindergartners.

Progressive Means Experts Rule

Ryan McMaken explains at Mises Why Progressives Love Government “Experts”.  Excerpts in italics with my bolds and added images. H/T zerohedge

In twenty-first-century America, ordinary people are at the mercy of well-paid, unelected government experts who wield vast power.

That is, we live in the age of the technocrats: people who claim to have special wisdom that entitles them to control, manipulate, and manage society’s institutions using the coercive power of the state.

We’re told these people are “nonpolitical” and will use their impressive scientific knowledge to plan the economy, public health, public safety, or whatever goal the regime has decided the technocrats will be tasked with bringing about.

These people include central bankers, Supreme Court justices, “public health” bureaucrats, and Pentagon generals.

The narrative is that these people are not there to represent the public or bow to political pressure. They’re just there to do “the right thing” as dictated by economic theory, biological sciences, legal theory, or the study of military tactics.  We’re also told that in order to allow these people to act as the purely well-meaning apolitical geniuses they are, we must give them their independence and not question their methods or conclusions.

We were exposed to this routine yet again last week as President Joe Biden announced he will “respect the Fed’s independence” and allow the central bankers to set monetary policy without any bothersome interference from the representatives of the taxpayers who pay all the bills and who primarily pay the price when central bankers make things worse. (Biden, of course, didn’t mention that central bankers have been spectacularly wrong about the inflation threat in recent years, with inflation rates hitting forty-year highs, economic growth going negative, and consumer credit piling up as families struggle to cope with the cost of living.)

Conveniently, Biden’s deferral to the Fed allows him to blame it later when economic conditions get even worse. Nonetheless, his placing the economy in the hands of alleged experts will no doubt appear laudable to many. This is because the public has long been taught by public schools and media outlets that government experts should have the leeway to exercise vast power in the name of “fixing” whatever problems society faces.

The Expert Class as a Tool for State Building

The success of this idea represents a great victory for progressive ideology. Progressives have long been committed to creating a special expert class as a means of building state power. In the United States, for example, the cult of expertise really began to take hold in the late nineteenth and early twentieth centuries, and it led directly to support for more government intervention in the private sector. As Maureen Flanagan notes in “Progressives and Progressivism in an Era of Reform,”

Social science expertise gave political Progressives a theoretical foundation for cautious proposals to create a more activist state…. Professional social scientists composed a tight circle of men who created a space between academia and government from which to advocate for reform. They addressed each other, trained their students to follow their ideas, and rarely spoke to the larger public.

These men founded new organizations—such as the American Economics Association—to promote this new class of experts and their plans for a more centrally planned society. Ultimately, the nature of the expert class was revolutionary. The new social scientists thought they knew better than the patricians, religious leaders, local representatives, and market actors who had long shaped local institutions. Instead,

Progressives were modernizers with a structural-instrumentalist agenda. They rejected reliance on older values and cultural norms to order society and sought to create a modern reordered society with political and economic institutions run by men qualified to apply fiscal expertise, businesslike efficiency, and modern scientific expertise to solve problems and save democracy. The emerging academic disciplines in the social sciences of economics, political economy and political science, and pragmatic education supplied the theoretical bases for this middle-class expert Progressivism.

In the Progressive view, business leaders and machine politicians lacked a rational and broad view of the needs of society. In contrast, the government experts would approach society’s problems as scientists. Johnson felt this model already somewhat existed in the Department of War, where Johnson imagined the secretary of war was “quite free from political pressure and [relied] on the counsel of the engineers.” Johnson imagined that these science-minded bureaucrats could bring a “really economic and scientific application” of policy.

“Disinterested” Central Planners

Johnson was part of a wave of experts and intellectuals attempting to develop “a new realm of state expertise” that favored apolitical technocrats who would plan the nation’s infrastructure and industry. Many historians have recognized that these efforts were fundamentally “state-building activities … [and that] their emergence marked and symbolized a watershed in which an often-undemocratic new politics of administration and interest groups displaced the nineteenth century’s partisan, locally oriented public life”. 

In short, these efforts sowed the seeds for the idealized technocracy we have today: unresponsive to the public and imbued with vast coercive power that continually displaces private discretion and private prerogatives.

Indeed, the Progressive devotion to expertise followed “the core pattern of Progressive politics,” which is “the redirection of decision making upward within bureaucracies.” Thus, in contrast to the populist political institutions of an earlier time, decision-making in the Progressive Era became more white-collar, more middle class—as opposed to the working-class party workers—and more hierarchical within bureaucracies directly controlled by the state’s executive agencies.

Who Should Rule?

In many ways, then, this aspect of Progressive ideology turned the political agenda of laissez-faire classical liberalism on its head. Liberals of the Jeffersonian and Jacksonian variety had sought to increase outside political influence in the policy-making process through elections and the appointment of party activists loyal to elected representatives. This was because liberals feared that an insulated class of government experts would function more in its own interests than those of the taxpayers.

The Progressives, however, imagined they could create a disinterested nonpolitical class of experts devoted only to objective science.

The fundamental question, then, became who should rule: insulated experts or nonexpert representatives with closer ties to the taxpayers.  We can see today that the Progressives largely succeeded in granting far greater power to today’s technocratic class of experts. The technocrats are praised for their allegedly scientific focus, and we are told to respect their independence.

If the goal was ever to protect public checks on state power, however, this was always an unworkable ideal. By creating a special class of expert bureaucrats with decades-long careers within the regime itself, we are simply creating a new class of officials able to wield state power with little accountability. Anyone with a sufficiently critical view of state power could see the danger in this. Interestingly, it was anarcho-communist Mikhail Bakunin who recognized the impossibility of solving the problem of state power by putting scientific experts in charge. Such a move only represented a transfer of power from one group to another. Bakunin warned:

The State has always been the patrimony of some privileged class or other; a priestly class, an aristocratic class, a bourgeois class, and finally a bureaucratic class.

It is not necessary, of course, to have full-blown socialism to create this “new class.” The modern state with its mixed economy in most cases already has all the bureaucratic infrastructure necessary to make this a reality. As long as we defer to this ruling class of “scientists and scholars,” the Progressives have won.

Background Why Technocrats Deliver Catastrophes

Finland’s Self-imposed Climate Lockdown

You’d think that politicians would have learned to forgo climate virtue-signaling after seeing the lawfare tactics it invites.  And yet, Finland bravely goes where smarter angels fear to tread.  As the Helsinki Times reports, New Climate Change Act into force in July.  Excerpts in italics with my bolds.

The Climate Change Act lays the foundation for national work on climate change in Finland. The reformed Act sets emission reductions targets for 2030, 2040 and 2050. Now the target of a carbon-neutral Finland by 2035 has for the first time been laid down by law.

The Government submitted the bill for approval on 9 June. The President of the Republic is to approve the Act on 10 June and it will enter into force on 1 July 2022.

“The new Climate Change Act is vital for Finland. The Climate Change Act ensures that ambitious climate work will continue across government terms. The Act shows the world how we can build a carbon-neutral welfare state by 2035. It is also a strong signal for companies that in Finland clean solutions are well worth investing in,” says Minister of the Environment and Climate Change Maria Ohisalo.

Minister of the Environment and Climate Change Maria Ohisalo at a press event in Helsinki. LEHTIKUVA

The Act lays down provisions on the climate change policy plans. The scope of the Act will be extended to also cover emissions from the land use sector, i.e. land use, forestry and agriculture, and it will for the first time include the objective to strengthen carbon sinks.

“Including land use in the Climate Change Act is a significant improvement. We have a lot of opportunities to reduce emissions and strengthen carbon sinks in the land use sector – in forests, construction and agriculture,” Minister Ohisalo says.

The previous Climate Change Act entered into force in 2015, and it set an emission reduction target only for 2050. The new Climate Change Act will include emission reduction targets for 2030 and 2040 that are based on the recommendations of the Finnish Climate Change Panel, and the target for 2050 will be updated.

The emission reduction targets are -60% by 2030, -80% by 2040 and at least -90% but aiming at -95% by 2050, compared to the levels in 1990.

Finns have lost any room to maneuver, or to walk back ill-advised policies should the future be cooler rather than the warming of which they are so certain.  The lawyers will be all over them to prevent any escape.  To use another metaphor, they are lobsters who put themselves into the pots; there will be no getting out or going free.


See Also Dutch Judges Dictate Energy Policy

See Also Climate Tyranny By Way of Criminal Law


FDA Interfered With Ivermectin, Doctors Suing

A Washington law firm has filed a federal lawsuit against the Food and Drug Administration (FDA) for interfering with the use of ivermectin as a treatment for COVID-19. H/T Epoch Times

The lawsuit was filed by Boyden Gray & Associates on behalf of three doctors who were disciplined for prescribing human-grade ivermectin to patients. The firm’s founder, attorney Boyden Gray, is a former legal adviser to the Reagan and Bush administrations.

Gray told The Epoch Times that the FDA had violated well-established law that allows doctors to prescribe an FDA-approved drug as an off-label treatment. Ivermectin was no different, he said. It was approved by the FDA in 1966. “Congress recognized the importance of letting doctors be doctors and expressly prohibited the FDA from interfering with the practice of medicine,” Gray said.

“That is exactly what the FDA has done time and time again throughout this pandemic, assuming authority it doesn’t have and trying to insert itself in the medical decisions of Americans everywhere.” The three plaintiffs in the case are Drs. Paul Marik, Mary Talley Bowden, and Robert Apter. Marik is a founder of the Front Line COVID-19 Critical Care Alliance (FLCCC), a national nonprofit that promotes alternative COVID-19 treatments to the government-touted vaccine.

“The FDA has made public statements on ivermectin that have been misleading and have raised unwarranted concern over a critical drug in preventing and treating COVID-19,” Marik told The Epoch Times.

“To do this is to ignore both statutory limits on the FDA’s authority and the significant body of scientific evidence from peer-reviewed research.”

According to Marik, more than 80 medical trials conducted since the outbreak of COVID-19 show that ivermectin is a safe and effective treatment for the virus. Gray said the FDA has engaged in unlawful interference with the use of ivermectin and should be held accountable for that. The lawsuit included several statements made by the FDA that Gray said show that the administration interfered with the use of ivermectin.

They include an Aug. 21, 2021, Twitter post by the agency: “You are not a horse. You are not a cow. Seriously, y’all. Stop it.”

Marik, a critical care specialist, was suspended by Sentara Norfolk General Hospital for prescribing ivermectin as a COVID-19 treatment. Bowden, an ear, nose, and throat specialist, was suspended from Houston Methodist Hospital. Apter was under investigation by both the Washington Medical Commission and Arizona Medical Board for prescribing ivermectin. Marik was recently informed that he was under investigation by the medical licensing board in Virginia.

Gray filed the lawsuit in U.S. District Court in Texas. The doctors are seeking a permanent injunction that would prohibit the FDA from interfering with the use of ivermectin for the treatment of COVID-19.

Background from previous post Why Ivermectin Was Disappeared

Henry F. Smith Jr., MD explains at American Thinker Why Ivermectin was Disappeared.  Excerpts in italics with my bolds.

Case #1

It’s a common occurrence in winter. A patient calls a primary physician to report a nonproductive cough, slight hoarseness, muscle aches, and a low-grade fever. The physician, and likely the patient, realize that this is almost certainly a viral upper respiratory infection. If the patient were in the office, the physician may test for a streptococcal bacterial infection, but it will likely be negative.

This is probably an infection with a rhinovirus, adenovirus, or endemic coronavirus. Despite this, the afflicted patient will happily proceed to the pharmacy to pick up their prescription for an antibiotic. The patient will feel as though the physician was proactive, something the doctor certainly understands.

This prescription, however, will be of no value to the patient and may actually cause issues. Yet pharmacies in the U.S. see this type of prescription thousands of times a day.

It occurs despite the fact that physicians are constantly reminded that gratuitous antibiotic prescriptions come with side effects and can lead to antibiotic resistance. Beyond that, there is no tangible resistance to this practice from the medical establishment or healthcare authorities.

Case #2

Now let’s imagine another patient calls in. This patient also has a dry cough, scratchy throat, muscle aches, and a low-grade fever. Only this patient had a COVID test kit at home and tested positive. The physician wants to prescribe a medication with no risk of bacterial resistance and a very benign side-effect profile. He’s read lots of literature suggesting it will be helpful. There are a significant number of double-blind studies showing it to be effective in the treatment of SARS-CoV-2. It has been used in multiple countries with excellent results. Except, in this case, the physician will find it impossible to prescribe that medication. It will be impossible because that medication is Ivermectin. And somehow it has been removed from the market.

Not only has this FDA-approved, Nobel Prize-winning drug been made unavailable, but if a physician prescribes it or advocates it as therapy, they are threatened with the potential loss of their medical license, their hospital affiliations, and their board certification.

Case #3

It gets even more ironic. I’ve noticed that some physicians are prescribing a very common antibiotic called azithromycin for their COVID patients. It is well understood that, when taken alone, it is of no value for COVID-19. There is absolutely no data showing efficacy in COVID-19. It has the same potential problems as when it is prescribed for other viral infections. Yet the practice goes on, again unimpeded.

Let’s go one step further. Levofloxacin is another antibiotic, introduced in 1996. It was unusual in that it could treat a broad variety of infections, even severe ones, while being given orally. Because of this, it was overutilized, threatening to create drug resistance.

In 2016, the FDA issued a black box warning because of several severe side effects, including tendon rupture, peripheral nerve damage, and psychosis. Since then its usage has waned.

The drug was proposed as a treatment for COVID early in the pandemic but proved to have limited antiviral activity.

Case #4

So I posed this hypothetical to several pharmacist friends: if a physician called in a prescription for azithromycin, or even levofloxacin, and gave the diagnosis of COVID-19, would they fill the prescription? The answer was yes, as there would be nothing to prevent it.  In other words, a physician is permitted to prescribe useless antibiotics for a COVID-19 infection, even those the FDA flags for serious adverse reactions.

If, as apparently the FDA believes, ivermectin is similarly useless but benign, why is it alone being blocked?

Doing the Math

Let’s do some mathematics. As of this writing, there are roughly 890,000 deaths recorded in the United States related to COVID-19. I think most people understand that a lot of these deaths are not due to the virus but from other comorbid conditions. The CDC has long stated that the number of deaths from COVID where there was no comorbid condition (In other words, healthy people who died from COVID) is roughly 7% of the total (65,000). In several meta-analyses, Ivermectin was shown to be roughly 65% effective at preventing serious disease and/or death. So, in the best-case scenario for them, our public health organizations, by suppressing Ivermectin, may be responsible for roughly 40,000 deaths.
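The arithmetic behind that estimate can be laid out explicitly, using only the figures cited in the paragraph above. Note that the exact product of the cited figures is about 62,300, which the author rounds up to 65,000 before applying the efficacy factor.

```python
# The author's back-of-envelope estimate, using the figures cited above.

total_deaths = 890_000            # recorded US COVID-19 deaths
no_comorbidity_share = 0.07       # CDC: share of deaths with no comorbidity
ivermectin_efficacy = 0.65        # ~65% from the cited meta-analyses

no_comorbidity_deaths = total_deaths * no_comorbidity_share   # ~62,300
preventable = no_comorbidity_deaths * ivermectin_efficacy     # ~40,500

print(round(no_comorbidity_deaths))  # 62300 (text rounds to ~65,000)
print(round(preventable))            # 40495 (text rounds to ~40,000)
```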

In fact, the vast majority of people who actually died from COVID had multiple comorbid conditions, so that number could be much higher.

I need to acknowledge that prescribing antibiotics for viral infections is something that the primary caregivers struggle with. Patients expect them to do something when they’re sick. They don’t appreciate being told to go home and take acetaminophen. Some may never come back and seek care elsewhere.

Yet patients have accepted that exact recipe for dealing with COVID-19, a disease they perceive may actually kill them.

So what’s the difference between prescriptions written for an anti-bacterial versus Ivermectin, an anti-parasitic agent, for a viral infection? Both primarily target infectious agents other than viruses. If anything, even if it were futile therapy, Ivermectin is safer than the antibiotics discussed. Yet it is the only medication that has been effectively banned.

Given all this, I think it’s easy to suspect that the FDA, the NIH, and the CDC actually understand the potential benefits of Ivermectin and other repurposed drugs. But they also realize that these medications threaten the profits of the pharmaceutical industry with which they are financially entwined.

Case #5

What makes this even more infuriating is the government’s warm embrace of two new antiviral medications, Pfizer’s Paxlovid and Merck’s Molnupiravir. These drugs have exactly one company-sponsored study each to vouch for their efficacy. Merck’s drug, by its own testing, is only 39% effective in reducing severe disease and/or death. There are no long-term safety data for either medication.

Yet both have received emergency use authorization, and have suddenly popped up on government-approved treatment protocols.

As I look towards the end of my career, I’ve seen a lot of profit-oriented behavior by pharmaceutical companies. I think of the me-too drugs, molecules that are only slightly different than their now off-patent predecessors aggressively marketed to physicians. I’ve seen pharmaceutical reps actually reimburse physicians for a certain number of prescriptions written for their medications. I’ve seen manipulation of the rules regarding inhaled medications to maintain their patents long after they would have expired.

But if they actively suppressed the adoption of useful medications during a pandemic, then this is beyond the pale. It would suggest a total collapse of any morality or sense of responsibility within the pharmaceutical industry and their partners in the regulatory agencies.

I hope that someday, our investigatory agencies can push past the vast political power these companies have acquired through their burgeoning profits, and find out the truth.

I’m not optimistic.

Henry F Smith Jr. MD FCCP practices Pulmonary and Sleep Medicine in Northeastern Pennsylvania.

Temps Cause CO2 Changes, Not the Reverse. June 2022 Update

Science is based on predictive power.  For example, astronomers demonstrate they know how the solar system works when they accurately predict eclipses of the sun and moon.

This post is about proving that CO2 changes in response to temperature changes, not the other way around, as is often claimed.  In order to do that we need two datasets: one of measured changes in atmospheric CO2 concentrations over time, and one of estimated Global Mean Temperature changes over time.

For a possible explanation of natural warming and CO2 emissions see Little Ice Age Warming Recovery May be Over

Climate science is unsettling because past data are not fixed, but change later on.  I ran into this previously and now again in 2021 and 2022 when I set out to update an analysis done in 2014 by Jeremy Shiers (discussed in a previous post reprinted at the end).  Jeremy provided a spreadsheet in his essay Murray Salby Showed CO2 Follows Temperature Now You Can Too posted in January 2014. I downloaded his spreadsheet intending to bring the analysis up to the present to see if the results hold up.  The two sources of data were:

Temperature anomalies from RSS here:  http://www.remss.com/missions/amsu

CO2 monthly levels from NOAA (Mauna Loa): https://www.esrl.noaa.gov/gmd/ccgg/trends/data.html

Changes in CO2 (ΔCO2)

Uploading the CO2 dataset showed that many numbers had changed (why?).

The blue line shows annual observed differences in monthly values year over year, e.g. June 2020 minus June 2019, and so on. The first 12 months (1979) provide the observed starting values from which differentials are calculated. The orange line shows that those CO2 values changed slightly in the 2020 dataset vs. the 2014 dataset, on average +0.035 ppm. But no pattern or trend was added, and deviations vary randomly between + and -. So last year I took the 2020 dataset to replace the older one for updating the analysis.

Now I find the NOAA dataset in 2021 has almost completely new values due to a method shift in February 2021, requiring a recalibration of all previous measurements.  The new picture of ΔCO2 is graphed below.

The method shift is reported at a NOAA Global Monitoring Laboratory webpage, Carbon Dioxide (CO2) WMO Scale, with a justification for the difference between X2007 results and the new results from X2019 now in force. The orange line shows that the shift has resulted in higher values, especially early on, and a slight general increase over time. However, these are small variations at the decimal level on values of 340 and above. Further, the graph shows that yearly differentials month by month are virtually the same as before. Thus I redid the analysis with the new values.

Again, note that these are annual differences by month, i.e. the value for May 2022 is the reported CO2 concentration in May 2022 minus the May 2021 CO2.  Note also how the differences have declined sharply the last two years.
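The year-over-year differencing described above is simple to reproduce. Below is a minimal sketch in Python; the monthly values are invented for illustration, not actual NOAA data:

```python
# Year-over-year CO2 differentials: the value for a month minus the value
# for the same month one year earlier (e.g. May 2022 minus May 2021).
# The first 12 months have no prior-year value, so they yield None.

def yoy_differentials(monthly_co2):
    """monthly_co2: list of monthly CO2 concentrations (ppm), oldest first."""
    return [None if i < 12 else round(monthly_co2[i] - monthly_co2[i - 12], 2)
            for i in range(len(monthly_co2))]

# Illustrative (invented) values: a steady ~2 ppm/yr rise plus a seasonal step
co2 = [336.0 + 2.0 * (i / 12) + 3.0 * ((i % 12) < 6) for i in range(24)]
deltas = yoy_differentials(co2)
print(deltas[12])   # 2.0 ppm year-over-year change for the 13th month
```

Applied to the actual Mauna Loa monthly-mean column, the same function would yield the differentials plotted above.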

Global Temperature Anomalies (ΔTemp)

The other time series was the record of global temperature anomalies according to RSS. The current RSS dataset is not at all the same as past versions.

Here we see some seriously unsettling science at work. The purple line is RSS in 2014, and the blue is RSS as of 2020. Some further increases appear in the gold 2022 RSS dataset. The red line shows alterations from the old to the new. There is a slight cooling of the data in the beginning years, then the three versions mostly match until 1997, when systematic warming enters the record. From 1997/5 to 2003/12 the average anomaly increases by 0.04C. From 2004/1 to 2012/8 the average increase is 0.15C. At the end, from 2012/9 to 2013/12, the average anomaly was higher by 0.21C. The 2022 version added slight warming over 2020 values.

RSS continues that accelerated warming to the present, but it cannot be trusted.  And who knows what the numbers will be a few years down the line?  As Dr. Ole Humlum said some years ago (regarding Gistemp): “It should however be noted, that a temperature record which keeps on changing the past hardly can qualify as being correct.”

Given the above manipulations, I went instead to the other satellite dataset UAH version 6. UAH has also made a shift by changing its baseline from 1981-2010 to 1991-2020.  This resulted in systematically reducing the anomaly values, but did not alter the pattern of variation over time.  For comparison, here are the two records with measurements through May 2022. UAH dataset for temperatures in the lower troposphere (TLT).
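The UAH baseline change works as a constant shift: referencing anomalies to 1991-2020 instead of 1981-2010 subtracts the mean anomaly over the new reference period from every value, lowering the numbers without altering the pattern. A minimal sketch with invented numbers:

```python
# Re-baselining: anomalies referenced to 1981-2010 convert to a 1991-2020
# reference by subtracting the mean anomaly over the new reference period.
# Every value shifts by the same constant, so trends and fluctuations
# are unchanged.

def rebaseline(anomalies, new_ref_mean):
    """Subtract the mean anomaly over the new reference period."""
    return [round(a - new_ref_mean, 3) for a in anomalies]

# Invented example: anomalies vs 1981-2010; suppose the 1991-2020 mean is +0.12C
old = [0.10, 0.25, -0.05, 0.40]
new = rebaseline(old, 0.12)
print(new)  # [-0.02, 0.13, -0.17, 0.28]

# Month-to-month differences are preserved exactly:
assert round(new[1] - new[0], 3) == round(old[1] - old[0], 3)
```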

Comparing UAH temperature anomalies to NOAA CO2 changes.

Here are UAH temperature anomalies compared to CO2 monthly changes year over year.

Changes in monthly CO2 synchronize with temperature fluctuations, which for UAH are anomalies now referenced to the 1991-2020 period. As stated above, CO2 differentials are calculated for the present month by subtracting the value for the same month in the previous year (for example, May 2022 minus May 2021). Temp anomalies are calculated by comparing the present month with the baseline month. Note the dropping temperatures over the last two years, slightly preceding the decline in CO2.

The final proof that CO2 follows temperature, due to stimulation of natural CO2 reservoirs, is the ability to calculate CO2 levels since 1979 with a simple mathematical formula:

For each subsequent year, the CO2 level for each month was generated as:

CO2  this month this year = a + b × Temp this month this year  + CO2 this month last year

Jeremy used Python to estimate a and b, but I used his spreadsheet to find values that place the observed and calculated CO2 levels on top of each other for comparison.

In the chart, calculated CO2 levels correlate with observed CO2 levels at 0.9985 out of 1.0000. This mathematical generation of CO2 atmospheric levels is only possible if they are driven by temperature-dependent natural sources, and not by human emissions, which are small in comparison and rise steadily and monotonically.
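The recursive formula can be sketched directly in code. Here is a minimal Python version; the coefficients a and b and the input series are invented placeholders (not the fitted values), and the Pearson correlation is computed from scratch rather than with the spreadsheet:

```python
import math

# The recursive model: CO2(this month, this year) =
#   a + b * Temp(this month, this year) + CO2(this month, last year).
# The coefficients and data below are invented for illustration.

def model_co2(first_year_co2, temps, a, b):
    """first_year_co2: 12 observed monthly values for the starting year.
    temps: monthly temperature anomalies for the following years."""
    co2 = list(first_year_co2)
    for i, t in enumerate(temps):
        co2.append(a + b * t + co2[i])   # co2[i] is the same month, previous year
    return co2

def pearson(x, y):
    """Pearson correlation coefficient, computed from first principles."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = math.sqrt(sum((a - mx) ** 2 for a in x))
    sy = math.sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy)

start = [337.0 + 0.1 * m for m in range(12)]         # invented 1979 monthlies
temps = [0.1 + 0.02 * (i // 12) for i in range(24)]  # invented anomalies
calc = model_co2(start, temps, a=1.5, b=1.0)
print(len(calc))   # 36 months: 12 observed + 24 generated
```

With the observed CO2 and temperature series in place of the invented ones, pearson(observed, calc) gives the kind of correlation figure quoted above.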

Previous Post:  What Causes Rising Atmospheric CO2?


This post is prompted by a recent exchange with those reasserting the “consensus” view attributing all additional atmospheric CO2 to humans burning fossil fuels.

The IPCC doctrine which has long been promoted goes as follows. We have a number over here for monthly fossil fuel CO2 emissions, and a number over there for monthly atmospheric CO2. We don’t have good numbers for the rest of it (oceans, soils, biosphere), though rough estimates are orders of magnitude higher, dwarfing human CO2. So we ignore nature and assume it is always a sink, explaining the difference between the two numbers we do have. Easy peasy, science settled.

What about the fact that nature continues to absorb about half of human emissions, even while FF CO2 increased by 60% over the last two decades? What about the fact that in 2020 FF CO2 declined significantly with no discernible impact on rising atmospheric CO2?

These and other issues are raised by Murry Salby and others who conclude that it is not that simple, and the science is not settled. And so these dissenters must be cancelled lest the narrative be weakened.

The non-IPCC paradigm is that atmospheric CO2 levels are a function of two very different fluxes. FF CO2 changes rapidly and increases steadily, while Natural CO2 changes slowly over time, and fluctuates up and down from temperature changes. The implications are that human CO2 is a simple addition, while natural CO2 comes from the integral of previous fluctuations.  Jeremy Shiers has a series of posts at his blog clarifying this paradigm. See Increasing CO2 Raises Global Temperature Or Does Increasing Temperature Raise CO2 Excerpts in italics with my bolds.

The following graph which shows the change in CO2 levels (rather than the levels directly) makes this much clearer.

Note the vertical scale refers to the first differential of the CO2 level, not the level itself. The graph depicts the change rate in ppm per year.

There are big swings in the amount of CO2 emitted. Taking the mean as roughly 1.6 ppmv/year, there are swings of around ±1.2 ppmv/year, nearly ±100%.

And, surprise surprise, the change in net emissions of CO2 is very strongly correlated with changes in global temperature.

This clearly indicates the net amount of CO2 emitted in any one year is directly linked to global mean temperature in that year.

For any given year the amount of CO2 in the atmosphere will be the sum of the net annual emissions of CO2 in all previous years.

For each year the net annual emission of CO2 is proportional to the annual global mean temperature.

This means the amount of CO2 in the atmosphere will be related to the sum of temperatures in previous years.

So CO2 levels are not directly related to the current temperature but to the integral of temperature over previous years.

The following graph again shows observed levels of CO2 and global temperatures but also has calculated levels of CO2 based on sum of previous years temperatures (dotted blue line).
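The paradigm in this excerpt (annual net emission proportional to temperature, atmospheric level as the running sum) can be sketched as follows; the coefficients and anomalies are invented for illustration:

```python
# Sketch of the "integral of temperature" paradigm: each year's net CO2
# emission is proportional to that year's mean temperature anomaly, and
# the atmospheric level is the running sum of those emissions.
# Coefficients and anomalies below are invented, not fitted values.

def co2_from_temperature_integral(t_anomalies, base_level, k, c):
    """Net annual emission = k * anomaly + c (ppm/yr); level = running sum."""
    levels, level = [], base_level
    for t in t_anomalies:
        level += k * t + c
        levels.append(round(level, 2))
    return levels

anoms = [0.0, 0.1, 0.2, 0.2, 0.3]          # invented annual anomalies (C)
levels = co2_from_temperature_integral(anoms, base_level=337.0, k=2.0, c=1.5)
print(levels)   # [338.5, 340.2, 342.1, 344.0, 346.1]
```

Note how a warm year raises the level permanently: the integral carries every past fluctuation forward, which is the distinction drawn above between natural CO2 and the simple addition from fossil fuels.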

Summary:

The massive fluxes from natural sources dominate the flow of CO2 through the atmosphere.  Human CO2 from burning fossil fuels is around 4% of the annual addition from all sources. Even if rising CO2 could cause rising temperatures (no evidence, only claims), reducing our emissions would have little impact.

Resources:

CO2 Fluxes, Sources and Sinks

Who to Blame for Rising CO2?

Fearless Physics from Dr. Salby

In this video presentation, Dr. Salby provides the evidence, math and charts supporting the non-IPCC paradigm.

Footnote:  As CO2 concentrations rose, BP shows Fossil Fuel consumption slumped in 2020

See also 2022 Update: Fossil Fuels ≠ Global Warming

Zero Carbon False Pretenses


Legal Definition of false pretenses: false representations concerning past or present facts that are made with the intent to defraud another. Merriam-Webster.

As we will see below, the zero carbon campaign relies on a series of false representations, primarily from omitting realities contradictory to the CO2 scare narrative.

In the aftermath of the Glasgow COP, many have noticed how incredible were the pronouncements and claims from the UK hosts as well as other speakers intending to inflame public opinion in support of the UN agenda. No one in the media applied any kind of critical intelligence to examining the veracity of facts and conclusions trumpeted before, during and after the conference. In the interest of presenting an alternate, unalarming paradigm of earth’s climate, I am reposting a previous discussion of how wrongheaded is the IPCC “consensus science.”

Background

With all the fuss about the “Green New Deal” and attempts to blame recent cold waves on rising CO2, it is wise to remember the logic of the alarmist argument.  It boils down to two suppositions:

Rising atmospheric CO2 makes the planet warmer.

Rising emissions from humans burning fossil fuels make atmospheric CO2 higher.

The second assertion is challenged in a post: Who to Blame for Rising CO2?

This post addresses the first claim.  Remember also that all of the so-called “lines of evidence” for global warming do not distinguish between human and natural causes.  Typically the evidence cited falls into these categories:

Global temperature rise
Warming oceans
Shrinking ice sheets
Glacial retreat
Decreased snow cover
Sea level rise
Declining Arctic sea ice
Extreme events

However, all of these are equivocal, involving signal and noise issues. Note also that all of them are alleged impacts from the first one.  And in any case, the fact of any changes does not in itself prove human causation.  That attribution rests solely on unvalidated climate models.  Below is a discussion of the reductionist mental process by which climate complexity and natural forces are systematically excluded to reach the pre-determined conclusion.

Original Post:  Climate Reductionism


Reductionists are those who take one theory or phenomenon to be reducible to some other theory or phenomenon. For example, a reductionist regarding mathematics might take any given mathematical theory to be reducible to logic or set theory. Or, a reductionist about biological entities like cells might take such entities to be reducible to collections of physico-chemical entities like atoms and molecules.
Definition from The Internet Encyclopedia of Philosophy

Some of you may have seen this recent article: Divided Colorado: A Sister And Brother Disagree On Climate Change

The reporter describes a story familiar to many of us. A single skeptic (the brother) is holding out against his sister and the rest of the family, who accept global warming/climate change. And of course, after putting some of their interchanges into the text, the reporter then sides against the brother by taking the word of a climate expert. From the article:

“CO2 absorbs infrared heat in certain wavelengths and those measurements were made first time — published — when Abraham Lincoln was president of the United States,” says Scott Denning, a professor of atmospheric science at Colorado State University. “Since that time, those measurements have been repeated by better and better instruments around the world.”

CO2, or carbon dioxide, has increased over time, scientists say, because of human activity. It’s a greenhouse gas that’s contributing to global warming.

“We know precisely how the molecule wiggles and waggles, and what the quantum interactions between the electrons are that cause every one of these little absorption lines,” he says. “And there’s just no wiggle room around it — CO2 absorbs heat, heat warms things up, so adding CO2 to the atmosphere will warm the climate.”

Denning says that most of the CO2 we see added to the atmosphere comes from humans — mostly through burning coal, oil and gas, which, as he puts it, is “indirectly caused by us.”

When looking at the scientific community, Denning says it’s united, as far as he knows.


A Case Study of Climate Reductionism

Denning’s comments, supported by several presentations at his website, demonstrate how some scientists (all those known to Denning) engage in a classic form of reductionism.

The full complexity of earth’s climate includes many processes, some poorly understood, but known to have effects orders of magnitude greater than the potential of CO2 warming. The case for global warming alarm rests on simplifying away everything but the predetermined notion that humans are warming the planet. It goes like this:

Our Complex Climate

Earth’s climate is probably the most complicated natural phenomenon ever studied. Not only are there many processes, but they also interact and influence each other over various timescales, causing lagged effects and multiple cycling. This diagram illustrates some of the climate elements and interactions between them.

Flows and Feedbacks for Climate Models

The Many Climate Dimensions

Further, measuring changes in the climate goes far beyond temperature as a metric. Global climate indices, like the European dataset, include 12 climate dimensions with 74 tracking measures. The set of climate dimensions includes:

  • Sunshine
  • Pressure
  • Humidity
  • Cloudiness
  • Wind
  • Rain
  • Snow
  • Drought
  • Temperature
  • Heat
  • Cold

And in addition there are compound measures combining temperature and precipitation. While temperature is important, climate is much more than that.  With this reduction, all other dimensions are swept aside, and climate change is simplified down to global warming as seen in temperature measurements.

Climate Thermodynamics: Weather is the Climate System at work.

Another distortion is the notion that weather is bad or good, depending on humans finding it favorable. In fact, all that we call weather are the ocean and atmosphere acting to resolve differences in temperatures, humidities and pressures. It is the natural result of a rotating, irregular planetary surface mostly covered with water and illuminated mostly at its equator.

The sun warms the surface, but the heat escapes very quickly by convection so the build-up of heat near the surface is limited. In an incompressible atmosphere, it would *all* escape, and you’d get no surface warming. But because air is compressible, and because gases warm up when they’re compressed and cool down when allowed to expand, air circulating vertically by convection will warm and cool at a certain rate due to the changing atmospheric pressure.
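The “certain rate” referred to here is the adiabatic lapse rate. For dry air it follows from two standard textbook constants, as a quick back-of-envelope shows:

```python
# The "certain rate" is the adiabatic lapse rate. For dry air it is
# Gamma = g / c_p: gravitational acceleration divided by the specific
# heat of air at constant pressure. (Standard textbook values; moist air
# cools more slowly, about 6.5 C/km on average.)

g = 9.81        # m/s^2, gravitational acceleration
c_p = 1004.0    # J/(kg K), specific heat of dry air at constant pressure

lapse_rate = g / c_p                          # K per metre
print(f"{lapse_rate * 1000:.1f} C per km")    # ~9.8 C per km
```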

Climate science has been obsessed with only a part of the system, namely the atmosphere and radiation, in order to focus attention on the non-condensing IR active gases. The climate is framed as a 3D atmosphere above a 2D surface. That narrow scope leaves out the powerful non-radiative heat transfer mechanisms that dominate the lower troposphere, and the vast reservoir of thermal energy deep in the oceans.

As Dr. Robert E Stevenson writes, it could have been different:

“As an oceanographer, I’d been around the world, once or twice, and I was rather convinced that I knew the factors that influenced the Earth’s climate. The oceans, by virtue of their enormous density and heat-storage capacity, are the dominant influence on our climate. It is the heat budget and the energy that flows into and out of the oceans that basically determines the mean temperature of the global atmosphere. These interactions, plus evaporation, are quite capable of canceling the slight effect of man-produced CO2.”

The troposphere is dominated by powerful heat transfer mechanisms: conduction, convection and evaporation, as well as physical kinetic movements.  All this is ignored in order to focus on radiative heat transfer, a bit player except at the top of the atmosphere.

There’s More than the Atmosphere

Once the world of climate is greatly reduced down to radiation of infrared frequencies, yet another set of blinders is applied. The most important source of radiation is of course the sun. Solar radiation in the short wave (SW) range is what we see and what heats up the earth’s surface, particularly the oceans. In addition solar radiation includes infrared, some absorbed in the atmosphere and some at the surface. The ocean is also a major source of heat into the atmosphere since its thermal capacity is 1000 times what the air can hold. The heat transfer from ocean to air is both by way of evaporation (latent heat) and also by direct contact at the sea surface (conduction).

Yet conventional climate science dismisses the sun as a climate factor, saying that its climate input is unvarying. That ignores significant fluctuations in parts of the light range, for example ultraviolet, and also solar effects such as magnetic fields and cosmic rays. Also disregarded is solar energy varying due to cloud fluctuations. The ocean is also dismissed as a source of climate change despite obvious ocean warming and cooling cycles ranging from weeks to centuries. The problem is such oscillations are not well understood or predictable, so they cannot be easily modeled.

With the sun and the earth’s surface and ocean dismissed, the only consideration left is the atmosphere.

The Gorilla Greenhouse Gas

Thus climate has been reduced down to heat radiation passing through the atmosphere comprised of gases. One of the biggest reductions then comes from focusing on CO2 rather than H2O. Of all the gases that are IR-active, water is the most prevalent and covers more of the spectrum.

The diagram below gives you the sense of proportion.


The Role of CO2

We come now to the role of CO2 in “trapping heat” and making the world warmer. The theory is that CO2 acts like a blanket by absorbing and re-radiating heat that would otherwise escape into space. By delaying the cooling while solar energy comes in constantly, CO2 is presumed to cause a buildup of heat resulting in warmer temperatures.

How the Atmosphere Processes Heat

There are 3 ways that heat (Infrared or IR radiation) passes from the surface to space.

1) A small amount of the radiation leaves directly, because all gases in our air are transparent to IR of 10-14 microns (sometimes called the “atmospheric window”). This pathway moves at the speed of light, so no delay of cooling occurs.

2) Some radiation is absorbed and re-emitted by IR active gases up to the tropopause. Calculations of the mean free path for CO2 show that energy passes from surface to tropopause in less than 5 milliseconds. This is almost the speed of light, so delay is negligible. H2O is so variable across the globe that its total effects are not measurable. In arid places, like deserts, we see that CO2 by itself does not prevent the loss of the day’s heat after sundown.

3) The bulk gases of the atmosphere, O2 and N2, are warmed by conduction and convection from the surface. They also gain energy by collisions with IR active gases, some of that IR coming from the surface, and some absorbed directly from the sun. Latent heat from water is also added to the bulk gases. O2 and N2 are slow to shed this heat, and indeed must pass it back to IR active gases at the top of the troposphere for radiation into space.

In a parcel of air each molecule of CO2 is surrounded by 2500 other molecules, mostly O2 and N2. In the lower atmosphere, the air is dense and CO2 molecules energized by IR lose it to surrounding gases, slightly warming the entire parcel. Higher in the atmosphere, the air is thinner, and CO2 molecules can emit IR into space. Surrounding gases resupply CO2 with the energy it lost, which leads to further heat loss into space.

This third pathway has a significant delay of cooling, and is the reason for our mild surface temperature, averaging about 15C. Yes, earth’s atmosphere produces a buildup of heat at the surface. The bulk gases, O2 and N2, trap heat near the surface, while IR active gases, mainly H2O and CO2, provide the radiative cooling at the top of the atmosphere. Near the top of the atmosphere you will find the -18C temperature.

Sources of CO2

Note the size of the human emissions next to the red arrow.

A final reduction comes down to how much of the CO2 in the atmosphere is there because of us. Alarmists/activists say any increase in CO2 is 100% man-made, and would be more were it not for natural CO2 sinks, namely the ocean and biosphere. The claim overlooks the fact that those sinks are also sources of CO2, and the flux from the land and sea is an order of magnitude higher than estimates of human emissions. In fact, our few Gigatons of carbon are lost within the error range of estimating natural emissions. Insects produce far more CO2 than humans do through all our activity, including that of domestic animals.

Why Climate Reductionism is Dangerous

Reducing the climate in this fashion reaches its logical conclusion in the activist notion of the “450 Scenario.” Since Cancun, the IPCC has asserted that global warming is capped at 2C by keeping CO2 concentration below 450 ppm. From the Summary for Policymakers (SPM), AR5:

Emissions scenarios leading to CO2-equivalent concentrations in 2100 of about 450 ppm or lower are likely to maintain warming below 2°C over the 21st century relative to pre-industrial levels. These scenarios are characterized by 40 to 70% global anthropogenic GHG emissions reductions by 2050 compared to 2010, and emissions levels near zero or below in 2100.

Thus is born the “450 Scenario” by which governments can be focused upon reducing human emissions without any reference to temperature measurements, which are troublesome and inconvenient. Almost everything in the climate world has been erased, and “Fighting Climate Change” is now code to mean accounting for fossil fuel emissions.

Conclusion

All propagandists begin with a kernel of truth, in this case the fact that everything acting in the world has an effect on everything else. Edward Lorenz brought this insight to bear on the climate system in a groundbreaking paper he presented in 1972, entitled “Predictability: Does the Flap of a Butterfly’s Wings in Brazil Set Off a Tornado in Texas?” Everything does matter and has an effect. Obviously humans impact the climate in places where we build cities and dams, clear forests and operate farms. And obviously we add some CO2 when we burn fossil fuels.

But it is wrong to ignore the major dominant climate realities in order to exaggerate a small peripheral factor for the sake of an agenda. It is wrong to claim that IR active gases somehow “trap” heat in the air when they immediately emit any energy absorbed, if not already lost colliding with another molecule. No, it is the bulk gases, N2 and O2, making up the mass of the atmosphere, together with the ocean delaying the cooling and giving us the mild and remarkably stable temperatures that we enjoy. And CO2 does its job by radiating the heat into space.

Since we do little to cause it, we can’t fix it by changing what we do. The climate will not stop changing because we put a price on carbon. And the sun will rise despite the cock going on strike to protest global warming.

Footnote: For a deeper understanding of the atmospheric physics relating to CO2 and climate, I have done a guide and synopsis of Murry Salby’s latest textbook on the subject:  Fearless Physics from Dr. Salby

How Climatists Eclipsed the Sun

Recently, Dr. John Robson of the Climate Discussion Nexus (CDN) interviewed CERES co-team leader, Dr. Ronan Connolly, on the role of the Sun in recent climate change. Excerpts from ICECAP in italics with my bolds, followed by a video and my transcript from the closed captions.

CDN have now published their 20 minute “explainer” video including extracts from this interview and discussion of some of CERES’ recent scientific research. Although the video covers quite a few technical points, they are explained in a very clear and accessible manner.

Topics covered include:

The significance of the debates between the two main rival satellite estimates of solar activity trends since 1978, i.e., PMOD and ACRIM.

How using either PMOD or ACRIM to calibrate the pre-satellite era solar data can give very different estimates of how much solar activity has changed since the 19th century and earlier.

How politics and the UN’s Intergovernmental Panel on Climate Change (IPCC) reports have downplayed the possible role of solar activity in recent climate change.

The urbanization bias problem of current thermometer-based estimates of global temperature trends since the 19th century.

They say you should not look directly at the sun, but when it comes to climate a lot of people take that advice to ridiculous extremes. That bright yellow ball in the sky is basically earth’s only source of energy, though a very small amount radiates from the planet’s hot core. The sun’s output has been measured to a high degree of precision by satellites in orbit since the late 1970s, and we now know that it varies over time.

Since it is our only source of energy, if it gets stronger it stands to reason that it could warm the climate.

Indeed there was a time about 20 years ago when many scientists believed that the sun had gotten a bit brighter during the 1980s and 1990s. And they even argued it was enough to explain much of the warming that had taken place.

But now agencies like the UN IPCC (Intergovernmental Panel on Climate Change), NASA and others insist the change in solar output never happened. They say the warming can only be explained by greenhouse gases, so do not look at the sun.

People, something pretty basic doesn’t add up here.

If satellites are measuring the sun’s energy precisely, how can there be disagreement about what it’s been doing? The answer unfortunately is that there’s a gap in the satellite record, a gap that came about after the 1986 Space Shuttle Challenger disaster. And as happens too much in this field, the gap quickly went from being a scientific problem to a political one. And the way that gap was handled is a story that deserves a little sunlight.

I’m John Robson, and this is a Climate Discussion Nexus backgrounder on the ACRIM gap controversy.

The name ACRIM comes from an instrument called the Active Cavity Radiometer Irradiance Monitor that satellites use to measure solar output. And the amount of solar energy that hits the earth’s atmosphere is called the total solar irradiance or TSI measured in watts per square meter.

On average the sun provides about 1367 watts of energy per square meter continuously on the upper atmosphere. For comparison, all the carbon dioxide ever released from using fossil fuels is estimated by the IPCC to have added about 2 watts per square meter of energy to the atmosphere. And so given the overwhelming role of solar output in the total it shouldn’t take much of a change in the sun’s output to have a global influence on the climate.
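The proportion can be checked with simple arithmetic from the two figures just quoted:

```python
# Simple proportion check using the figures quoted above: average TSI of
# about 1367 W/m^2 at the top of the atmosphere versus roughly 2 W/m^2
# attributed by the IPCC to all fossil-fuel CO2.

tsi = 1367.0          # W/m^2, total solar irradiance (figure quoted above)
co2_forcing = 2.0     # W/m^2, cumulative CO2 effect (figure quoted above)

ratio_percent = 100.0 * co2_forcing / tsi
print(f"CO2 forcing is about {ratio_percent:.2f}% of TSI")  # about 0.15%

# Equivalently: a solar brightening of about 0.15% would match it in size
equivalent_solar_change = co2_forcing / tsi
```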

We also have data on solar output from the pre-satellite era. For centuries astronomers have been keeping track of the number of dark circles or sunspots that appear on the surface of the sun. Galileo even wrote a book about them. The sunspot count rises and falls on a roughly 11-year cycle which provides clues to the changing strength of solar energy in the past. Scientists can also use evidence from chemical signatures in the earth, called cosmogenic isotopes, to reconstruct solar activity. As usual when you go backward in time on climate it’s only proxy data, and it’s considerably less precise than modern measurements.

Source: IPCC Assessment Report #1

But by comparing proxies to satellite data since 1979, we get some idea of how to interpret the clues. In the IPCC’s first report in 1990 they presented a graph that summarized the prevailing view of the sun’s history over the 19th and 20th centuries. It showed the familiar sunspot cycle and also suggested average solar output grew stronger in the second half of the century, but they said the changes were not large enough to cause much warming unless there were positive feedback mechanisms to amplify those changes.

But that qualification is not trivial because in fact the notion that carbon dioxide is the driver of warming itself depends on a series of positive feedback mechanisms. Because on its own the warming effect of CO2 is quite small. So there have been various proposals for amplifying mechanisms to increase its impact, which we’ll look at in more detail on another day.

When it comes to the sun basically the argument is that the sun doesn’t just affect how bright it is outside, it also influences how cloudy it is. And since some kinds of clouds have a major role in reflecting heat back into space if more solar output not only adds a bit of heat but also suppresses that kind of cloud formation, it can translate into a lot of surface warming.

So key point here: By the time of the IPCC’s third assessment report in 2001, their views about the sun’s history were getting more uncertain, not less. In AR3 in 2001, instead of having just one reconstruction of solar output, the IPCC had multiple different ones to choose from. The reconstructions all agreed that solar output followed the sunspot cycle, and they all agreed that solar output had increased over the 20th century.

Fig. 6.5 Reconstructions of total solar irradiance (TSI) by Lean et al. as well as Hoyt and Schatten 1993 updated.

But they disagreed over whether the increase was a lot or little and whether it had happened all at once early in the century or more gradually over the whole span. Since these differences arose from statistical estimates using proxy records, it didn’t look as though there would be an easy way to resolve the disagreements.

So attention turned to the modern satellite record with precise measurements of TSI available since 1978. It should have been possible to compare them with surface temperatures to see if there was any relationship. Unfortunately there was the problem we referred to at the outset: A big gap in the data. The satellites that carried the ACRIM system were first launched in 1978. From time to time satellites wear out and need to be replaced. A replacement satellite is supposed to be launched early enough so its ACRIM system overlaps with the existing one allowing the instruments to be calibrated to each other giving scientists a continuous record.
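The overlap calibration step described here can be sketched in a few lines: shift the newer instrument by the mean difference over the shared months, then splice. The point of the ACRIM gap is precisely that this step was impossible, since with no shared months there is no offset to compute. Values below are invented:

```python
# Overlap calibration sketch: when two instruments overlap in time, the
# newer one is shifted by the mean difference over the overlap so the two
# records can be spliced into one continuous series. All values invented.

def splice(old, new, overlap):
    """old, new: lists of TSI readings; overlap: number of shared months."""
    diffs = [o - n for o, n in zip(old[-overlap:], new[:overlap])]
    offset = sum(diffs) / overlap
    adjusted_new = [round(v + offset, 2) for v in new]
    return old + adjusted_new[overlap:]   # keep old values through the overlap

acrim1 = [1366.0, 1366.2, 1366.1]             # invented, W/m^2
acrim2 = [1365.6, 1365.5, 1365.8, 1365.7]     # reads ~0.6 low (invented)
combined = splice(acrim1, acrim2, overlap=2)
print(combined)   # one continuous series on the first instrument's scale
```

With a two-year gap and no overlap, any offset between the old and new ACRIM instruments has to be inferred from a third record, which is where the disagreement below begins.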

But as you can see, there’s a gap in the ACRIM record from June 1989 to October 1991, and that gap was a consequence of the Space Shuttle Challenger disaster in January 1986, which caused NASA’s satellite launch program to be suspended for several years.

By the time a new ACRIM system could be put into orbit in 1991, the old one had already been offline for two years. And the only data available to fill the gap was from a different monitor, called the Earth Radiation Budget system or ERB, which flew on the Nimbus 7 satellite launched in 1978 as part of a separate series. That satellite didn’t carry an ACRIM, and unfortunately the ERB system was not meant to monitor solar output with much precision. Its sensors were pointed toward the earth so it could monitor the climate system, and it only had a view of the sun during brief intervals of its orbit. Also it generated two data series, called ERB and ERBS in the diagram, and they disagreed with each other regarding what the sun did during the ACRIM gap.

Still, it was something to work with. In 1997 the lead scientist working on the ACRIM system, Richard Willson of Columbia University, used the satellite data and all available information on the behavior of the onboard sensors in the various satellites to construct a composite ACRIM record. A comparison of the minimum points in the solar cycle suggested an increase in TSI from the early 1980s through to the end of the 1990s, after which solar output flattened out.

Since this broadly matched the progress of temperatures after 1980, it opened the door to the possibility that the sun might be responsible for some or all of recent climate changes. The alarmists didn't like that result at all. In fact, they reacted like that Far Side cartoon with the astronauts going, "Blast! The controls are jammed! We're headed right for Mr. Sun!"

So a few years later a different team, led by Claus Fröhlich and Judith Lean, published a new reconstruction of the same data that showed: voila, no upward step, just the standard solar cycle and a steady downward trend after 1980. It's called the PMOD reconstruction, after the name of Fröhlich's institute, the Physical Meteorological Observatory in Davos. It had the convenient effect of ruling out the sun as a factor in climate change.

Now when I say convenient, I do mean in the political sense. The authors made no secret of their motivation. In a recent article reviewing the whole episode, scientist Ronan Connolly of the Center for Environmental Research and Earth Sciences (CERES) in Massachusetts found some telling quotes from the authors and others working in the field. In a 2003 interview discussing the motivation for their research, author Judith Lean stated: "The fact that some people could use Willson's results as an excuse to do nothing about greenhouse gas emissions is one reason we felt we needed to look at the data ourselves." And in a later review published in 2014, Pia Zacharias of the International Space Science Institute in Switzerland conceded that the data adjustments "are still a matter of active debate and have prevented the TSI community from coming up with a conclusive TSI composite so far."

But she went on to observe that a conclusive TSI time series is not only desirable from the perspective of the scientific community, but also when considering the rising interest of the public in questions related to climate change issues, "thus preventing climate skeptics from taking advantage of these discrepancies within the TSI community by for example putting forth a presumed solar effect as an excuse for inaction on anthropogenic warming."

We spoke with scientist Ronan Connolly recently to discuss the ACRIM gap and how the IPCC handled the controversy. As he explained, the rival PMOD group took the ACRIM data and applied a series of adjustments which got rid of that rise in solar activity in the 80s and 90s, replacing it with a decline. The net effect: according to PMOD, solar activity has been generally decreasing since at least the 1970s.

If the ACRIM composite is correct then that would be consistent with a solar contribution because some of the warming in the 80s and 90s could be due to the solar activity. And then the reduction in warming, the pause or even a slight decline depending on the metric, that could be due to a reduction in solar activity. But if PMOD is correct then solar activity can’t really explain any of the global temperature trends during the satellite era.

Which gives us two things to think about. One is that if the sun's output did get stronger over the 1980s and 1990s, then it bears some of the blame, or gets some of the credit, for warming the planet over that interval. That is a valid argument for not blaming everything on greenhouse gases, especially since the sun's subsequent quieting down coincides with two long pauses in any warming detected by satellites.

Second, we have scientists talking as if their motivation is not just finding the truth; it's preventing so-called inaction on climate change, and they feel no need to hide such a motive. On the contrary, they seem to be broadcasting it. And if you're going to come right out and tell us that your goal is to push a policy agenda whether it's scientifically justified or not, don't act surprised when we tell you we're skeptical about your results.

One group that wasn’t skeptical was the IPCC in their fourth assessment report or AR4 in 2007 they showed both the Willson series here in violet and the Piedmont series which is green. But in their next report in 2013 while they still mentioned the Willson series they dropped it from their calculations and said from now on they would only use the PMOD series that told them what they wanted to hear.

Namely, that with no increase in solar output there's no way to blame the sun for global warming, so it must be all your fault.

Which is one way to do science, but what kind of way? My own experience is that there are a lot of scientists who feel a lot of pressure to conform their work to the IPCC. The IPCC has become a very dominant political body within the scientific community.

How did the PMOD team come up with a different answer than Willson’s group? By arguing that one of the sensors on the ERB system was defective and experienced an increase in its sensitivity during its time in orbit, adding an artificial upward trend to its readings. The PMOD team corrected this supposed defect by pushing the later part of their data downward, thus erasing the increase and getting the result they were looking for.

But did the ERB system actually suffer this malfunction? In 2008 Richard Willson and one of his co-authors, physicist Nicola Scafetta of the University of Naples, tracked down Dr. Douglas Hoyt, the scientist who'd been in charge of the ERB satellite mission at the time but had since retired. They put the question to him, and Hoyt emailed them back the following:

Dear Dr. Scafetta:

Concerning the supposed increase in Nimbus 7 sensitivity at the end of September 1989 and other matters as proposed by Fröhlich’s PMOD TSI composite:

1. There is no known physical change in the electrically calibrated Nimbus 7 radiometer or its electronics that could have caused it to become more sensitive. At least, neither Lee Kyle nor I could ever imagine how such a thing could happen, and no one else has ever come up with a physical theory for the instrument that could cause it to become more sensitive.

2.  The Nimbus-7 radiometer was calibrated electrically every 12 days. The calibrations before and after the September shutdown gave no indication of any change in the sensitivity of the radiometer. Thus, when Bob Lee of the ERBS team originally claimed there was a change in Nimbus 7 sensitivity, we examined the issue and concluded there was no internal evidence in the Nimbus 7 records to warrant the correction that he was proposing. Since the result was a null one, no publication was thought necessary.

3. Thus Fröhlich's PMOD TSI composite is not consistent with the internal data or physics of the Nimbus 7 cavity radiometer.

4. The correction of the Nimbus 7 TSI values for 1979 through 1980 proposed by Fröhlich is also puzzling. The raw data was run through the same algorithm for these early years and the subsequent years, and there is no justification for Fröhlich's adjustment in my opinion.

Sincerely, Douglas Hoyt

Yeah, puzzling, though we can think of other words, like suspicious. So let's look again at the various reconstructions of solar output. In the 2007 IPCC report, here's the range they admitted was possible from the 1600s to the turn of the century. Typically the uncertainty increases as you go backwards, but there are ways to try to decrease it. In that review article I mentioned, by Ronan Connolly and 22 co-authors, when they surveyed the various ways experts have used the satellite and proxy records, they found 16 possible reconstructions of solar activity since 1600: eight yielding fairly low variability and eight fairly high.

To illuminate solar influence on temperature these authors also took a close look at the other side of the equation, surface temperature data, and constructed a new climate record for the northern hemisphere using only rural weather stations and data collected over the sea surface to avoid contamination from urban heat islands. Then they coupled this with tree ring proxy data to assemble a temperature estimate covering the same interval as the solar series.

Putting the solar and temperature data together, the sun turns out to explain either none of the observed warming, all of it, or somewhere in between, depending on which solar reconstruction you pick. So we can account for anywhere from none to almost all of the temperature change since the 19th century in terms of solar activity, depending on whether ACRIM is correct or PMOD is correct.
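The point that the answer depends on which reconstruction you feed in can be illustrated with a toy least-squares fit. All numbers below are invented for illustration; the Connolly et al. analysis is far more elaborate.

```python
# Toy illustration (all data invented) of why attribution depends on the
# TSI reconstruction: regress temperature on solar output and see how much
# variance the fit explains.

def fit_and_r2(x, y):
    """Ordinary least squares y = a + b*x; return fraction of variance explained."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sxx = sum((xi - mx) ** 2 for xi in x)
    sxy = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
    b = sxy / sxx
    a = my - b * mx
    ss_res = sum((yi - (a + b * xi)) ** 2 for xi, yi in zip(x, y))
    ss_tot = sum((yi - my) ** 2 for yi in y)
    return 1 - ss_res / ss_tot

temp = [0.0, 0.1, 0.25, 0.3, 0.45]                       # fictional warming trend
tsi_rising = [1365.0, 1365.1, 1365.3, 1365.3, 1365.5]    # "ACRIM-like": trends upward
tsi_flat = [1365.2, 1365.0, 1365.3, 1365.1, 1365.2]      # "PMOD-like": no trend
```

With these made-up series, the rising reconstruction "explains" nearly all of the warming and the flat one almost none, even though both are fits to the same temperatures. That is the whole dispute in miniature.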

Now that result doesn't mean we get to cherry-pick the result we like and say, "Aha, we've proven that the sun causes all climate change." But neither can the alarmists go, "Aha, we've proven that the sun causes none of it." And the trouble is, they do just that when they put out reports confidently declaring that warming is all due to greenhouse gases.

They don’t tell you that their calculation is based on using one specific solar reconstruction and a lot of temperature data from cities which have grown bigger and hotter since the start of the 21st century.

I’m going to leave you here with one more quote from another scientist working in the solar measurement field. In a 2012 review paper physicist Michael Lockwood discussed all the difficulties in trying to reconstruct solar output and measure its current effects and lamented:
“The academic reputation of the field of sun climate relations is poor because many studies do not address all or even some of the limitations listed above. It is also a field that in recent years has been corrupted by unwelcome political and financial influence as climate change skeptics have seized upon putative solar effects as an excuse for inaction on anthropogenic warming.”

It’s strange when scientists insist that there’s political and financial corruption in their field but it only ever goes in one direction. And it’s not the direction the funders want because, don’t forget, climate research is funded overwhelmingly by governments who believe in a man-made global warming crisis.  And it’s also weird when they say that people drawing logical conclusions about the policy implications of the sun having a significant impact on climate are “just making excuses.”

I don’t expect these scientists want any advice from me but I’m going to give it to them anyway.

When you keep telling us that your motivation is to promote a costly policy agenda whether it’s scientifically justified or not;

and you keep getting caught trying to conceal the fact that you’re not nearly as certain about your conclusions as the IPCC keeps claiming;

and you keep getting caught fiddling data series;

and when challenged you substitute abuse for argument;

It makes the general public more skeptical and not less.

So please look up, because for the Climate Discussion Nexus, I'm John Robson, and I am looking at the Sun.

Europe’s Alps: From White to Green and Back Again

The usual suspects (BBC, Science Focus, Phys.org, The Independent, Metro UK, etc.) are worried that green spaces are visible from space, and snow cover will continue to retreat, with bad consequences for the Alpine ecosystem, unless we stop burning fossil fuels. This is triggered by a new paper by Sabine Rumpf et al., From white to green: Snow cover loss and increased vegetation productivity in the European Alps. Excerpts from Science Focus in italics with my bolds.

Snow in the European Alps is melting and invasive plant species are outcompeting native Alpine plants, satellite imagery has shown. Both findings will reinforce climate change, say scientists.

The changes noticed in a new study, which uses satellite data from 1984 to 2021, show that as much as 77 per cent of the Alps has experienced greening, where areas with previously low vegetation have suddenly seen a boom in plant growth.

While the new plants do take a small amount of carbon out of the atmosphere by photosynthesis, scientists say the greening has a much bigger negative effect on climate change, as less of the Sun’s light will be reflected away from the Earth meaning the planet will get warmer.
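The albedo mechanism described here is back-of-envelope arithmetic: darker vegetated ground reflects less sunlight than snow, so more energy is absorbed. The sketch below uses assumed, illustrative values; the albedos and the area fraction are not from the Rumpf paper.

```python
# Back-of-envelope sketch of the snow-to-vegetation albedo effect.
# All parameter values are assumptions for illustration only.

S = 1361.0            # solar constant, W/m^2
insolation = S / 4    # globally averaged top-of-atmosphere insolation

snow_albedo = 0.8         # fresh snow reflects roughly 80% (assumed)
vegetation_albedo = 0.2   # green vegetation reflects roughly 20% (assumed)
area_fraction = 1e-4      # assumed fraction of Earth's surface that changed

# Extra sunlight absorbed when snow is replaced by vegetation over that area,
# expressed as a global-mean forcing in W/m^2.
extra_absorbed = insolation * (snow_albedo - vegetation_albedo) * area_fraction
```

Under these assumptions the global-mean number comes out tiny, a few hundredths of a W/m^2, though the local effect on the changed surface itself is much larger.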

The Alps are expected to see a reduction in snow mass of up to 25 per cent in the next 10-30 years, according to the Intergovernmental Panel on Climate Change’s 2019 report. As the snow melts, there will be more rock falls and landslides, which could have devastating consequences.

The new study shows that the Alps are experiencing snow cover recession that can already be seen from space, which the authors warn will only get worse as time goes on.

In the changing mountain environments, native Alpine plants have suffered while new species have thrived. This is because the plants specialised to higher elevations have had to focus on long-term living in the Alps, sacrificing the characteristics that could make them more competitive in the short term.

However, over time Alpine temperatures and snow are variable in quasi-cycles.

For example, consider "Change in temperature for the Greater Alpine Region, 1760–2007: single years and 20-year smoothed mean series" from the European Environment Agency (EEA).

Yes, there are warming and cooling periods, and a rise recently. However, summer-minus-winter half-year differences have declined over the last century, and calendar year averages peaked in 1994. So the certainty about present conditions "only getting worse" is founded on faith rather than facts.

Then consider the record of snow cover over a longer period than the last thirty years.  Rutgers Snow Lab provides this graph:

So a lot of decadal variation is evident. While 2020-21 snow extent is down from a peak in 2016, it was lower in 2007, and very much lower in 1988-1990. True, the last 30 years had generally less snow than the 30 years prior to 1990. But who is to say the next 30 won't see a return to earlier levels, still with large decadal fluctuations?

And a longer-term view of Alpine glaciers shows how much climate change has gone on without the benefit of CO2 from humans.


Summer Temperatures (May–September): A rise in temperature during a warming period will result in a glacier losing more surface area or completely vanishing. This can happen very rapidly, in only a few years, or over a longer period of time. If temperatures drop during a cooling period and summer temperatures are too low, glaciers will begin to grow and advance with each season. This too can happen very rapidly or over a longer period of time. Special thanks to Prof. em. Christian Schlüchter (Quartärgeologie, Umweltgeologie), Universität Bern, Institut für Geologie. His work is on the Western Alps, and he was so kind as to help Raymond make this graphic as correct as possible.

Summary

The combination of mild warming and higher CO2 has greatly benefited the biosphere globally, with crop yield records set nearly every year. It should not be surprising that Europe's Alps participated in this greening of the land. But I object to the notion that humans caused it or can stop it by reducing emissions. We do not control the climate or weather, and both warming and cooling periods will come and go as they always have.