Naming Heat Waves is Hype

 

Dr. Joel N. Myers, AccuWeather Founder and CEO, writes Throwing Cold Water on Extreme Heat Hype. Excerpts in italics with my bolds. H/T John Ray

A story came to my attention recently that merited comment. It appeared in London’s The Telegraph, and was headlined, “Give heat waves names so people take them more seriously, say experts, as Britain braces for hottest day.”

The story’s jumping-off point was a press release from the London School of Economics (LSE), which noted, “A failure by the media to convey the severity of the health risks from heat waves, which are becoming more frequent due to climate change, could undermine efforts to save lives this week as temperatures climb to dangerous levels.”

It added, “So how can the media be persuaded to take the risks of heat waves more seriously? Perhaps it is time … to give heat waves names [as is done] for winter storms.”

We disagree with some of the points being made.

First, and most important, we warn people all the time in plain language on our apps and on AccuWeather.com about the dangers of extreme heat, as well as all hazards. Furthermore, that is the reason we developed and patented the AccuWeather RealFeel® Temperature and our recently expanded AccuWeather RealFeel® Temperature Guide, to help people maximize their health, safety and comfort when outdoors and prepare and protect themselves from weather extremes. The AccuWeather RealFeel Temperature Guide is the only tool that properly takes into account all atmospheric conditions and translates them into actionable behavior choices for people.

Second, although average temperatures have been higher in recent years, there is no evidence so far that extreme heat waves are becoming more common because of climate change, especially when you consider how many heat waves occurred historically compared to recent decades.

After June 2019 was recognized by a number of official organizations as the hottest June on record, July may now have been the hottest month ever recorded. That’s according to the Copernicus Climate Change Service (C3S), which presented its data sets with the announcement that this July may have been marginally hotter than that of 2016, which was previously the hottest month on record.

New York City has not had a daily high temperature above 100 degrees since 2012, and it has had only five such days since 2002. However, in a previous 18-year span from 1984 through 2001, New York City had nine days at 100 degrees or higher. When the power went out in New York City earlier this month, the temperature didn’t even get to 100 degrees – it was 95, which is not extreme. For comparison, there were 12 days at 95 degrees or higher in 1999 alone.

Kansas City, Missouri, for example, experienced an average of 18.7 days a year at 100 degrees or higher during the 1930s, compared to just 5.5 a year over the last 10 years. And over the last 30 years, Kansas City has averaged only 4.8 days a year at 100 degrees or higher, which is only one-quarter of the frequency of days at 100 degrees or higher in the 1930s.

Here is a fact rarely, if ever, mentioned: 26 of the 50 states set their all-time high temperature records during the 1930s, records that still stand (some have since been tied). An additional 11 state all-time high temperature records were set before 1930, and only two states have all-time record high temperatures set in the 21st century (South Dakota and South Carolina).

So 37 of the 50 states have an all-time high temperature record that has not been exceeded for more than 75 years. Given these numbers and the decreased frequency of days of 100 degrees or higher, it cannot be said that either the frequency or magnitude of heat waves is greater today.

Climate scientist Lennart Bengtsson said: “The warming we have had the last 100 years is so small that if we didn’t have meteorologists and climatologists to measure it we wouldn’t have noticed it at all.”

Why Al Gore Keeps Yelling “Fire!”

 

Some years ago I attended seminars regarding efforts to achieve operational changes in organizations. The notion was presented that people only change their habits, i.e. leave their comfort zone, when they fear something else more than changing their behavior. An analogy was drawn to workers leaping from a burning oil platform, or tenants from a burning building.

Al Gore is fronting an agenda to unplug modern societies, and thereby end life as we know it. Thus they claim the world is on fire, and that only if we abandon our ways of living can we be saved.

The big lie is saying that the world is burning up when in fact nothing out of the ordinary is happening. The scare is produced by extrapolating dangerous, fearful outcomes from events that come and go in the normal flow of natural and seasonal climate change. They cannot admit that the things they fear have not yet occurred. We will jump only if we believe our platform, our way of life, is already crumbling.

And so we come to Al Gore recently claiming that his past predictions of catastrophe have all come true.

J. Frank Bullitt writes at Issues and Insights Gore Says His Global Warming Predictions Have Come True? Can He Prove It? Excerpts in italics with my bolds.

When asked Sunday about his 2006 prediction that we would reach the point of no return in 10 years if we didn’t cut human greenhouse gas emissions, climate alarmist in chief Al Gore implied that his forecast was exactly right.

“Some changes unfortunately have already been locked in place,” he told ABC’s Jonathan Karl.

 

“Sea level increases are going to continue no matter what we do now. But we can prevent much larger sea level increases. Much more rapid increases in temperature. The heat wave was in Europe. Now it’s in the Arctic. We’re seeing huge melting of the ice there. So, the warnings of the scientists 10 years ago, 20 years ago, 30 years ago, unfortunately were accurate.”

Despite all this gloom, he’s found “good news” in the Democratic presidential field, in which “virtually all of the candidates are agreed that this is either the top issue or one of the top two issues.”

So what has Gore been predicting for the planet? In his horror movie “An Inconvenient Truth,” he claimed:

Sea levels could rise as much as 20 feet. He didn’t provide a timeline, which was shrewd on his part. But even if he had said 20 inches over 20 years, he’d still have been wrong. Sea level has been rising for about 10,000 years and, according to the National Oceanic and Atmospheric Administration, continues to rise about one-eighth of an inch per year.

“Storms are going to grow stronger.” There’s no evidence they are stronger nor more frequent.

Mt. Kilimanjaro was losing its snow cap due to global warming. By April 2018, the mountain glaciers were receiving their greatest snowfall in years. Two months later, Kilimanjaro was “covered by snow” for “an unusually long stint.” But it’s possible that all the snow and ice will be gone soon: Kilimanjaro is a stratovolcano, with a dormant cone that could erupt.

Point of no return. If we have truly gotten this far, why even care that “virtually all” of the Democratic candidates have agreed that global warming is a top issue? If we had passed the point of no return, there’d be no reason to maintain hope. The fact Gore’s looking for a “savior” from among the candidates means that even he doesn’t believe things have gone too far.

A year after the movie, Gore was found claiming that polar bears’ “habitat is melting” and “they are literally being forced off the planet.” It’s possible, however, that there are four times as many polar bears as there were in the 1960s. Even if not, they’ve not been forced off the planet.

Also in 2007, Gore started making “statements about the possibility of a complete lack of summer sea ice in the Arctic by as early as 2013,” fact-checker Snopes, which leans so hard left that it often falls over and has to pick itself up, said, before concluding that “Gore definitely erred in his use of preliminary projections and misrepresentations of research.”

Unwilling to fully call out one of its own, Snopes added that “Arctic sea ice is, without question, on a declining trend.” A fact check shows that to be true. A deeper fact check, though, shows that while Arctic sea ice has been falling, Antarctic sea ice has been increasing.

 

Finally — just for today because sorting out Gore’s fabrications is an ongoing exercise — we remind readers of the British judge who found that “An Inconvenient Truth” contained “nine key scientific errors” and “ruled that it can only be shown with guidance notes to prevent political indoctrination,” the Telegraph reported in 2007.

Gore has been making declarative statements about global warming for about as long as he’s been in the public eye. He has yet to prove a single claim. But how can he? The few examples above show that despite his insistence to the contrary, his predictions have failed.

Even if all turned out to be more accurate than a local three-day forecast, there’s no way to say with 100% certainty that the extreme conditions were caused by human activity. Our climate is a complex system; there are too many other variables, and the science itself has limits, unlike Gore’s capacity to inflate the narrative.

Footnote: 

Lest anyone think this is all about altruism, Al Gore is positioned to become even more wealthy from the war on meat.

Generation Investment Management is connected to Kleiner Perkins, where former Vice President Al Gore is a partner and advisor.

Who’s Kleiner Perkins? It turns out they are Beyond Meat’s biggest investor, according to bizjournals.com here. Beyond Meat is a Los Angeles-based producer of plant-based meat substitutes founded in 2009 by Ethan Brown. The company went public in May and just weeks later more than quadrupled in value.

Yes, Al Gore, partner and advisor to Kleiner Perkins, Beyond Meat’s big investor, stands to haul in millions, should governments move to restrict real meat consumption and force citizens to swallow the dubious substitutes and fakes.

If taken seriously, the World Resources Institute report, backed by Gore hacks, will help move the transition over to substitute meats far more quickly.

 

July SSTs NH Anomaly

The best context for understanding decadal temperature changes comes from the world’s sea surface temperatures (SST), for several reasons:

  • The ocean covers 71% of the globe and drives average temperatures;
  • SSTs have a constant water content (unlike air temperatures), so they give a better reading of heat content variations;
  • A major El Nino was the dominant climate feature in recent years.

HadSST is generally regarded as the best of the global SST data sets, and so the temperature story here comes from that source, the latest version being HadSST3.  More on what distinguishes HadSST3 from other SST products at the end.

The Current Context

The chart below shows SST monthly anomalies as reported in HadSST3 starting in 2015 through June 2019.

A cooling pattern is clearly seen in the Tropics since their peak in 2016, joined by the NH and SH cycling downward since then.  In 2019 all regions converged to nearly the same value in April.

Now something exceptional is happening: the NH has risen 0.4C in the last two months, matching the 2015 summer peak.  Meanwhile the SH remains relatively cooler, and the Tropics are not changing much.  Despite the sharp jump in the NH, the global anomaly rose only slightly.

Note that higher temps in 2015 and 2016 were first of all due to a sharp rise in Tropical SST, beginning in March 2015, peaking in January 2016, and steadily declining back below its beginning level. Secondly, the Northern Hemisphere added three bumps on the shoulders of Tropical warming, with peaks in August of each year.  A fourth NH bump was lower and peaked in September 2018.  As noted above, July 2019 is matching the first of these upward bumps.

And as before, note that the global release of heat was not dramatic, due to the Southern Hemisphere offsetting the Northern one.  The major difference between now and 2015-2016 is the absence of Tropical warming driving the SSTs.

Note: The NH spike is unexpected since UAH ocean air temps dropped sharply in July 2019.  The discrepancy between the two datasets is surprising since previously they were quite similar.

The annual SSTs for the last five years are as follows:

Annual SSTs   Global   NH      SH      Tropics
2014          0.477    0.617   0.335   0.451
2015          0.592    0.737   0.425   0.717
2016          0.613    0.746   0.486   0.708
2017          0.505    0.650   0.385   0.424
2018          0.480    0.620   0.362   0.369

2018 annual average SSTs across the regions are close to 2014, slightly higher in SH and much lower in the Tropics.  The SST rise from the global ocean was remarkable, peaking in 2016, higher than 2011 by 0.32C.
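These relationships can be checked directly with a short script; the values are transcribed from the annual table above (the 2011 comparison is omitted since that year is not shown):

```python
# Annual HadSST3 anomalies (C) transcribed from the table above
sst = {
    "Global":  {2014: 0.477, 2015: 0.592, 2016: 0.613, 2017: 0.505, 2018: 0.480},
    "NH":      {2014: 0.617, 2015: 0.737, 2016: 0.746, 2017: 0.650, 2018: 0.620},
    "SH":      {2014: 0.335, 2015: 0.425, 2016: 0.486, 2017: 0.385, 2018: 0.362},
    "Tropics": {2014: 0.451, 2015: 0.717, 2016: 0.708, 2017: 0.424, 2018: 0.369},
}

for region, series in sst.items():
    diff = series[2018] - series[2014]      # how close 2018 came back to 2014
    peak = max(series, key=series.get)      # warmest year in the table
    print(f"{region}: 2018 minus 2014 = {diff:+.3f} C, peak year {peak}")
```

The output confirms the text: Global and NH are within a few thousandths of 2014, the SH is slightly higher, and the Tropics are much lower, with the peak in 2016 everywhere except the Tropics (2015).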

A longer view of SSTs

The graph below  is noisy, but the density is needed to see the seasonal patterns in the oceanic fluctuations.  Previous posts focused on the rise and fall of the last El Nino starting in 2015.  This post adds a longer view, encompassing the significant 1998 El Nino and since.  The color schemes are retained for Global, Tropics, NH and SH anomalies.  Despite the longer time frame, I have kept the monthly data (rather than yearly averages) because of interesting shifts between January and July.


1995 is a reasonable starting point prior to the first El Nino.  The sharp Tropical rise peaking in 1998 is dominant in the record, starting Jan. ’97 to pull up SSTs uniformly before returning to the same level Jan. ’99.  For the next 2 years, the Tropics stayed down, and the world’s oceans held steady around 0.2C above 1961 to 1990 average.

Then comes a steady rise over two years to a lesser peak Jan. 2003, but again uniformly pulling all oceans up around 0.4C.  Something changes at this point, with more hemispheric divergence than before. Over the 4 years until Jan 2007, the Tropics go through ups and downs, NH a series of ups and SH mostly downs.  As a result the Global average fluctuates around that same 0.4C, which also turns out to be the average for the entire record since 1995.

2007 stands out with a sharp drop in temperatures, so that Jan. ’08 matches the low of Jan. ’99, but starting from a lower high. The oceans all decline as well, until temps build to a peak in 2010.

Now again a different pattern appears.  The Tropics cool sharply to Jan 11, then rise steadily for 4 years to Jan 15, at which point the most recent major El Nino takes off.  But this time in contrast to ’97-’99, the Northern Hemisphere produces peaks every summer pulling up the Global average.  In fact, these NH peaks appear every July starting in 2003, growing stronger to produce 3 massive highs in 2014, 15 and 16.  NH July 2017 was only slightly lower, and a fifth NH peak still lower in Sept. 2018.  Note also that starting in 2014 SH plays a moderating role, offsetting the NH warming pulses. (Note: these are high anomalies on top of the highest absolute temps in the NH.)

What to make of all this? The patterns suggest that in addition to El Ninos in the Pacific driving the Tropic SSTs, something else is going on in the NH.  The obvious culprit is the North Atlantic, since I have seen this sort of pulsing before.  After reading some papers by David Dilley, I confirmed his observation of Atlantic pulses into the Arctic every 8 to 10 years.

But the peaks coming nearly every summer in HadSST require a different picture.  Let’s look at August, the hottest month in the North Atlantic from the Kaplan dataset.
AMO August 2018

The AMO Index is from Kaplan SST v2, the unaltered and not detrended dataset. By definition, the data are monthly average SSTs interpolated to a 5×5 grid over the North Atlantic, basically 0 to 70N. The graph shows warming began after 1992, rising to 1998, with a series of matching years since. Because the N. Atlantic has recently partnered with the Pacific ENSO, let’s take a closer look at some AMO years in the last two decades.
This graph shows monthly AMO temps for some important years. The Peak years were 1998, 2010 and 2016, with the latter emphasized as the most recent. The other years show lesser warming, with 2007 emphasized as the coolest in the last 20 years. Note the red 2018 line is at the bottom of all these tracks. The short black line shows that 2019 began slightly cooler, then tracked 2018, but has now risen to match previous summer pulses.
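As a sketch of what averaging 5×5-degree cells over the basin involves, cells can be cos(latitude)-weighted before averaging, since cells shrink toward the pole. The cell temperatures below are invented for illustration, and the real Kaplan processing also involves interpolation and (for anomalies) subtraction of a baseline climatology:

```python
import math

# Invented 5x5-degree cell monthly means (C) for the North Atlantic, 0-70N:
# (latitude of cell center, mean SST)
cells = [(2.5, 27.8), (22.5, 24.1), (42.5, 16.3), (62.5, 9.2)]

# Weight each cell by cos(latitude) to account for shrinking cell area
weights = [math.cos(math.radians(lat)) for lat, _ in cells]
basin_mean = sum(w * t for w, (_, t) in zip(weights, cells)) / sum(weights)

print(f"North Atlantic basin mean: {basin_mean:.2f} C")
```

The unweighted mean of these four cells would overstate the cold high-latitude contribution; the weighting pulls the basin mean toward the larger tropical cells.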

Summary

The oceans are driving the warming this century.  SSTs took a step up with the 1998 El Nino and have stayed there with help from the North Atlantic, and more recently the Pacific northern “Blob.”  The ocean surfaces are releasing a lot of energy, warming the air, but eventually this will have a cooling effect.  The decline after 1937 was rapid by comparison, so one wonders: how long can the oceans keep this up? If the pattern of recent years continues, NH SST anomalies may rise slightly in coming months, but once again ENSO, which has weakened, will probably determine the outcome.

Footnote: Why Rely on HadSST3

HadSST3 is distinguished from other SST products because HadCRU (Hadley Climatic Research Unit) does not engage in SST interpolation, i.e. infilling estimated anomalies into grid cells lacking sufficient sampling in a given month. From reading the documentation and from queries to the Met Office, this is their procedure:

HadSST3 imports data from gridcells containing ocean, excluding land cells. From past records, they have calculated daily and monthly average readings for each grid cell for the period 1961 to 1990. Those temperatures form the baseline from which anomalies are calculated.

In a given month, each gridcell with sufficient sampling is averaged for the month and then the baseline value for that cell and that month is subtracted, resulting in the monthly anomaly for that cell. All cells with monthly anomalies are averaged to produce global, hemispheric and tropical anomalies for the month, based on the cells in those locations. For example, Tropics averages include ocean grid cells lying between latitudes 20N and 20S.

Gridcells lacking sufficient sampling that month are left out of the averaging, and the uncertainty from such missing data is estimated. IMO that is more reasonable than inventing data to infill. And it seems that the Global Drifter Array displayed in the top image is providing more uniform coverage of the oceans than in the past.
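The cell-by-cell procedure described above can be sketched in a few lines. This is a minimal illustration with invented sample values and a hypothetical sampling threshold; the real HadSST3 processing also involves area weighting, bias adjustments and formal uncertainty estimates:

```python
# Invented sample gridcells: (latitude, observed monthly mean C,
# 1961-1990 baseline C for that cell and month, number of observations)
cells = [
    (10,  28.4, 28.1, 35),
    (-15, 27.2, 27.0, 40),
    (45,  15.9, 15.2, 22),
    (60,   8.3,  8.4,  3),   # too few samples: excluded, adds to uncertainty
]

MIN_OBS = 5   # hypothetical sufficiency threshold

# Anomaly per sufficiently sampled cell: monthly mean minus that cell's baseline
anoms = [(lat, obs - base) for lat, obs, base, n in cells if n >= MIN_OBS]

# Regional anomalies are averages over the cells in each region
global_anom = sum(a for _, a in anoms) / len(anoms)
tropics = [a for lat, a in anoms if -20 <= lat <= 20]
tropics_anom = sum(tropics) / len(tropics)

print(f"global {global_anom:+.3f} C, tropics {tropics_anom:+.3f} C")
```

Note that the under-sampled 60N cell simply drops out of the averages rather than being infilled, which is the point the text makes about HadSST3.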


USS Pearl Harbor deploys Global Drifter Buoys in Pacific Ocean

Supremes May Rein In Agency Lawmaking

This post consists of a legal discussion regarding undesirable outcomes from some Supreme Court rulings that gave excessive deference to Executive Branch agency regulators. Relying on the so-called “Chevron deference” can result in regulations going beyond what Congress intended by its laws.

Professor Mike Rappaport writes at Law and Liberty Replacing Chevron with a Sounder Interpretive Regime. Excerpts in italics with my bolds.

The Issue

The need for a new interpretive arrangement to replace Chevron is demonstrated by a climate change example cited at the end.

Importantly, this new arrangement would significantly limit agencies from using their legal discretion to modify agency statutes to combat new problems never envisioned by the enacting Congress. For example, when the Clean Air Act was passed, no one had in mind it would be addressed to anything like climate change. Yet, the EPA has used Chevron deference to change the meaning of the statute so that it can regulate greenhouse gases without Congress having to decide whether and in what way that makes sense.

Such discretion gives the EPA enormous power to pursue its own agenda without having to secure the approval of the legislative or judicial branches.

Background and Proposals

One of the most important questions within administrative law is whether the Supreme Court will eliminate Chevron deference. But if Chevron deference is eliminated, as I believe it should be, a key question is what should replace it. In my view, there is a ready alternative which makes sense as a matter of law and policy. Courts should not give agencies Chevron deference, but should provide additional weight to agency interpretations that are adopted close to the enactment of a statute or that have been followed for a significant period of time.

Chevron deference is the doctrine that provides deference to an administrative agency when it interprets a statute that it administers. In short, the agency’s interpretation will only be reversed if a court deems the interpretation unreasonable rather than simply wrong. Such deference means that the agency can select among the (often numerous) “reasonable” interpretations of a statute to pursue its agenda. Moreover, the agency is permitted to change from one reasonable interpretation to another over time based on its policy views. In conjunction with the other authorities given to agencies, such as the delegation of legislative power, Chevron deference constitutes a key part of agency power.

There is, however, a significant chance that the Supreme Court may eliminate Chevron deference. Two of the leaders of this movement are Justices Thomas and Gorsuch. But Chief Justice Roberts as well as Justices Alito and Kavanaugh have also indicated that they might be amenable to overturning Chevron. For example, in the Kisor case from this past term, which cut back on but declined to overturn the related doctrine of Auer deference, these three justices all joined opinions that explicitly stated that they thought Chevron deference was different from Auer deference, suggesting that Chevron might still be subject to overruling.

But if Chevron deference is eliminated, what should replace it? The best substitute for Chevron deference would be the system of interpretation employed in the several generations prior to the enactment of the Administrative Procedure Act. Under that system, as explained by Aditya Bamzai in his path-breaking article, judges would interpret the statute based on traditional canons of interpretation, including two—contemporaneous exposition and customary practice—that provide weight to certain agency interpretations.

Under the canon of contemporaneous exposition, an official governmental act would be entitled to weight as an interpretation of a statute (or of the Constitution) if it were taken close to the period of the enactment of the provision. This would apply to government acts by the judiciary and the legislature as well as those by administrative agencies. Thus, agency interpretations of statutes would be entitled to some additional weight if taken at the time of the statute’s enactment.

This canon has several attractive aspects. First, it has a clear connection to originalism. Contemporaneous interpretations are given added weight because they were adopted at the time of the law’s enactment and therefore are thought to be more likely to offer the correct interpretation—that is, one attuned to the original meaning. Second, this canon also promotes the rule of law by both providing notice to the public of the meaning of the statute and limiting the ability of the agency to change its interpretation of the law.

The second canon is that of customary practice or usage. Under this framework, an interpretation of a government actor in its official capacity would be entitled to weight if it were consistently followed over a period of time. Thus, the agency interpretation would receive additional weight if it became a regular practice, even if were not adopted at the time of statutory enactment.

The canon of customary practice has a number of desirable features. While it does not have a connection to originalism, it does, like contemporaneous exposition, promote the rule of law. Once a customary interpretation has taken hold, the public is better able to rely on the existing interpretation and the government is more likely to follow that interpretation.

Second, the customary interpretation may also be an attractive interpretation. That the interpretation has existed over a period of time suggests that it has not created serious problems of implementation that have led courts or the agency to depart from it. While the customary interpretation may not be the most desirable one as a matter of policy, it is unlikely to be very undesirable.

This traditional interpretive approach also responds to one of the principal criticisms of eliminating Chevron deference: that it will give significant power to a judiciary that lacks expertise and can abuse its authority. I don’t agree with this criticism, since I believe that judges are expert at interpreting statutes and are subject to less bias than agencies that exercise not merely executive power, but also judicial and legislative authority.

But even if one believed that the courts were problematic, this arrangement would leave the judiciary with much less power than a regime that provides no weight to agency interpretations. The courts would often be limited by agency interpretations that accorded with the canons—interpretations adopted when the statute was enacted or that were customarily followed. Since those interpretations would be given weight, the courts would often follow them. But while these interpretations would limit the courts, they would not risk the worst dangers of Chevron deference. This interpretive approach would not allow an agency essentially free rein to change its interpretation over time in order to pursue new programs or objectives. Once the interpretation is in place, the agency would not be able to secure judicial deference if it changed the interpretation.

Importantly, this new arrangement would significantly limit agencies from using their legal discretion to modify agency statutes to combat new problems never envisioned by the enacting Congress. For example, when the Clean Air Act was passed, no one had in mind it would be addressed to anything like climate change. Yet, the EPA has used Chevron deference to change the meaning of the statute so that it can regulate greenhouse gases without Congress having to decide whether and in what way that makes sense. Such discretion gives the EPA enormous power to pursue its own agenda without having to secure the approval of the legislative or judicial branches.

In short, if Chevron deference is eliminated, there is a traditional and attractive interpretive approach that can replace it. Hopefully, the Supreme Court will take the step it refused to take in Kisor and eliminate an unwarranted form of deference.

H2O Reduces CO2 Climate Sensitivity

Francis Massen writes at his blog meteoLCD on The Kauppinen papers, summarizing and linking to studies by Dr. Jyrki Kauppinen (Turku University in Finland) regarding the climate sensitivity problem. Excerpts in italics with my bolds.

Dr. Jyrki Kauppinen (et al.) has published several papers during the last decade on the problem of finding the climate sensitivity (list with links at end). All these papers are, at least in large part, heavy on mathematics, even if parts thereof are not too difficult to grasp. Let me try to summarize in layman’s words (if possible):

The authors note that the IPCC models trying to deliver an estimate for ECS or TCR usually take the relative humidity of the atmosphere as constant, and practically restrict themselves to allowing one major cause of global temperature change: the change in radiative forcing Q. Many factors can change Q, but overall the IPCC estimates that human-caused emissions of greenhouse gases and land-use changes (like deforestation) are the principal causes of a changing Q. If the climate sensitivity is called R, the IPCC assumes that ΔT = R·ΔQ. This assumption leads to a positive water vapour feedback factor and so to the high values of R.

Kauppinen et al. disagree: they write that one has to include in the expression for ΔT the changes of the atmospheric water mass (which may show up in changes of the relative humidity and/or low cloud cover). Putting this into an equation leads to the conclusion that the water vapour feedback is negative and, as a consequence, that climate sensitivity is much lower.

Let us insist that the authors do not write that increasing CO2 concentrations have no influence on global temperature. They do, but the influence is many times smaller than that of the hydrological cycle.
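The size of the disagreement can be illustrated with a back-of-the-envelope version of ΔT = R·ΔQ. This is a sketch, not the papers' actual calculation: the simplified CO2 forcing expression 5.35·ln(C/C0) is the standard Myhre approximation, the "IPCC-like" R is a rough textbook figure, and the low R is simply the value implied by Kauppinen's 0.24°C per doubling:

```python
import math

def co2_forcing(c_ppm, c0_ppm=280.0):
    """Simplified CO2 radiative forcing in W/m^2 (Myhre et al. approximation)."""
    return 5.35 * math.log(c_ppm / c0_ppm)

dq_doubling = co2_forcing(560.0)    # forcing for a CO2 doubling, ~3.7 W/m^2

R_ipcc_like = 0.8                   # K per (W/m^2), roughly ~3 C per doubling
R_low = 0.24 / dq_doubling          # implied by Kauppinen's 0.24 C per doubling

print(f"forcing for a CO2 doubling: {dq_doubling:.2f} W/m^2")
print(f"warming with IPCC-like R:   {R_ipcc_like * dq_doubling:.2f} C")
print(f"warming with low R:         {R_low * dq_doubling:.2f} C")
```

The order-of-magnitude gap between the two R values is exactly the "ten times lower" factor discussed below, and it comes entirely from the sign and size assumed for the water feedback.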

Here is what Kauppinen et al. find when they take real observational values (no fudge parameters!) and compare their calculated result to one of the official global temperature series:

Figure 4. [2] Observed global mean temperature anomaly (red), calculated anomaly (blue), which is the sum of the natural and carbon dioxide contributions. The green line is the CO2 contribution merely. The natural component is derived using the observed changes in the relative humidity. The time resolution is one year.

The visual correlation is quite good: the changes in low cloud cover explain almost completely the warming of the last 40 years!

In their 2017 paper, they conclude that the CO2 sensitivity is 0.24°C (about ten times lower than the IPCC consensus value). In the last 2019 paper they refine their estimate, again find R = 0.24, and give the following figure:

Figure 2. [2] Global temperature anomaly (red) and the global low cloud cover changes (blue) according to the observations. The anomalies are between summer 1983 and summer 2008. The time resolution of the data is one month, but the seasonal signal is removed. Zero corresponds to about 15°C for the temperature and 26% for the low cloud cover.

Clearly the results are quite satisfactory, and they also show clearly that this simple model cannot reproduce the spikes caused by volcanic or El Nino activity, as these natural disturbances are not included in the balance.

The authors conclude that the IPCC models cannot give a “correct” value for the climate sensitivity, as they practically ignore (at least until AR5) the influence of low cloud cover. Their finding is politically explosive in the sense that there is no need for a precipitous decarbonization (even if, in the longer run, a reduction in carbon intensity in many activities might be recommendable).

Francis Massen’s opinion

As written in part 1, Kauppinen et al. are not the first to conclude a much lower climate sensitivity than the IPCC and its derived policies do. Many papers, even if based on different assumptions and methods, come to a similar conclusion, i.e. the IPCC models give values that are (much) too high. Kauppinen et al. also show that the hydrological cycle cannot be ignored, and that the influence of low cloud cover (possibly modulated by solar activity) should not be ignored.

What makes their papers so interesting is that they rely on practically only two observational factors and are not forced to introduce various fudge parameters.

The whole problem is a complicated one, and rushing into ill-reflected and painful policies should be avoided before we have a much clearer picture.

Footnote: The four Kauppinen papers.

2011 : Major portions in climate change: physical approach. (International Review of Physics) link

2014: Influence of relative humidity and clouds on the global mean surface temperature (Energy & Environment). Link to abstract.   Link to jstor read-only version (download is paywalled).

2018: Major feedback factors and effects of the cloud cover and the relative humidity on the climate. Link.

2019: No experimental evidence for the significant anthropogenic climate change. Link.
The last two papers are on arXiv and are not peer reviewed, which in my opinion is not an argument to refute them.

Francis Massen (francis.massen@education.lu), a physicist by education, who manages and operates the meteo/climate station http://meteo.lcd.lu of the Lycée Classique de Diekirch in Luxembourg, Europe.

See Also my recent post More 2019 Evidence of Nature’s Sunscreen

Postscript:

Dr. Dai Davies summarized this perspective this way:

The most fundamental of the many fatal mathematical flaws in the IPCC-related modelling of atmospheric energy dynamics is to start with the impact of CO2 and assume water vapour as a dependent ‘forcing’. This has the tail trying to wag the dog. The impact of CO2 should be treated as a perturbation of the water cycle. When this is done, its effect is negligible.

See Davies article synopsis at Earth Climate Layers

Forget IPCC: Energy Industry Cuts Emissions, Nations Don’t

Maj. Gen. Paul Vallely writes at Town Hall Wait…Who’s Trying to Beat Climate Change?

Well, there goes the justification for Green Socialism and Nationalizing Energy Supply. Excerpts in italics with my bolds.

The energy industry is waging war against climate change – and winning.

Last week, the Environmental Partnership, a group of oil and gas firms dedicated to cutting greenhouse gas emissions, released its first annual progress report. The results are impressive — and showcase what happens when an industry unites to further the public good.

The Environmental Partnership launched in late 2017 with 26 members. Within 12 months, it more than doubled in size to 58 members — including 32 of America’s top 40 oil and gas producers. Today, its members account for nearly half of America’s oil and natural gas production.

The group focuses on cutting emissions of methane and of other gases known as “volatile organic compounds” (VOCs). Without proper monitoring and maintenance, these gases can escape from drilling rigs and pipelines and contribute to global warming.

Even before the partnership formed, firms were spending millions to reduce their carbon footprints. Methane emissions have plummeted in America’s largest energy-rich basins, even as oil and gas production has spiked.  

Production at the Appalachia Basin, which spans from Alabama to Maine, rose more than 380 percent from 2011 to 2017 — yet methane emissions dropped 70 percent. Texas’s Eagle Ford Basin, meanwhile, produced 130 percent more oil and gas, but released 65 percent less methane.  And the Permian Basin, split between Texas and New Mexico, doubled production while decreasing emissions by almost 40 percent.

But firms in the Environmental Partnership weren’t satisfied with that progress. They sought to slash emissions even further.

First, the partnership focused on updating outdated technology like high-bleed pneumatic controllers. Pneumatic controllers regulate temperature, pressure, and liquid levels at natural gas sites by opening or closing valves. To operate these valves, the controllers rely on pressurized natural gas. As their name suggests, high-bleed pneumatic controllers can release relatively large amounts of natural gas, along with methane and VOC byproducts, into the air.  

The Environmental Partnership plans to replace all high-bleed pneumatic controllers within five years. And it’s well on its way to doing so. It replaced, retrofitted, or removed more than 28,000 prior to 2018 and an additional 3,000 last year. As a result, nearly 40 participating firms no longer use high-bleed controllers at all.

Second, the partnership set out to curb methane leaks – which can sometimes happen as firms extract, store, and burn natural gas. Methane is both a potent greenhouse gas and the main ingredient in natural gas. Participating companies conducted more than 156,000 surveys across 78,000 production sites, inspecting more than 56 million individual parts.

After its thorough inspections and repairs, the Environmental Partnership found that just 0.16 percent of industry parts contained leaks — and member firms repaired 99 percent of those in 60 days or less.

Participating firms also worked to better monitor liquid removal from natural gas wells. When too much liquid, mostly water, builds up within gas wells, firms manually direct the liquid to vents that bring it to the surface. During that process, methane or volatile organic compounds can potentially escape into the atmosphere.

Over the course of 2018, the Environmental Partnership oversaw more than 130,000 manual removals to ensure environmentally safe execution.

In addition to these three initiatives, the Environmental Partnership held numerous conferences and workshops across the country to share best practices and new technologies. These conferences featured energy experts, regulators, and academics.

These meetings amount to more than feel-good powwows. The Environmental Partnership has spurred America’s largest energy producers to take a good, hard look at their operations, pinpoint the need for critical changes, and execute those reforms.

Methane emissions from natural gas systems fell over 14 percent between 1990 and 2017. The Environmental Partnership’s initiatives will undoubtedly cut these emissions even further. According to the EPA’s own estimates, reducing methane leaks and replacing high-bleed controllers can slash emissions by 40 and 60 percent, respectively.

Energy firms are weaponizing their data and tools for the common good. Let’s hope they keep up the fight in the war against climate change.

Paul E. Vallely is a retired U.S. Army major general who serves as a senior military analyst for Fox News. Gen. Vallely is the founder and chairman of Stand Up America, a public policy research organization committed to national security and energy independence.


July Land and Sea Temps Cooler After June Bump


With apologies to Paul Revere, this post is on the lookout for cooler weather, with an eye on both the land and the sea.  UAH has updated its TLT (lower troposphere temperature) dataset for July.  Previously I have done posts on its reading of ocean air temps as a prelude to updated records from HadSST3.  This month also has a separate graph of land air temps, because the comparisons and contrasts are interesting as we contemplate possible cooling in coming months and years.

Presently sea surface temperatures (SST) are the best available indicator of heat content gained or lost from earth’s climate system.  Enthalpy is the thermodynamic term for total heat content in a system, and humidity differences in air parcels affect enthalpy.  Measuring water temperature directly avoids distorted impressions from air measurements.  In addition, ocean covers 71% of the planet surface and thus dominates surface temperature estimates.  Eventually we will likely have reliable means of recording water temperatures at depth.

Recently, Dr. Ole Humlum reported from his research that air temperatures lag 2-3 months behind changes in SST.  He also observed that changes in CO2 atmospheric concentrations lag behind SST by 11-12 months.  This latter point is addressed in a previous post Who to Blame for Rising CO2?
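A lag like the ones Humlum reports is typically estimated by sliding one anomaly series against the other and finding the offset with the highest correlation. A minimal sketch of that approach on synthetic data (the series and the built-in 3-month lag are invented for illustration; this is not Humlum's actual data or method in detail):

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic monthly anomaly series: "air" follows "sst" with a 3-month lag.
n = 240                                    # 20 years of monthly values
sst = rng.normal(0.0, 1.0, n)              # stand-in SST anomalies (white noise)
true_lag = 3
air = np.roll(sst, true_lag) + rng.normal(0.0, 0.1, n)
air[:true_lag] = 0.0                       # discard the wrapped-around start

def best_lag(leader, follower, max_lag=12):
    """Lag (in months) at which `follower` correlates best with `leader`."""
    m = len(leader)
    corrs = [np.corrcoef(leader[: m - k], follower[k:])[0, 1]
             for k in range(max_lag + 1)]
    return int(np.argmax(corrs))

print(best_lag(sst, air))  # recovers the built-in 3-month lag
```

With real monthly series one would first remove the seasonal cycle and any trend before correlating; here white noise stands in for already-detrended anomalies.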

After a technical enhancement to HadSST3 delayed the March and April updates, May was posted early in June, hopefully a signal that future months will appear more promptly.  For comparison we can look at lower troposphere temperatures (TLT) from UAHv6, which are now posted for July. The temperature record is derived from microwave sounding units (MSU) on board satellites like the one pictured above. Recently there was a change in UAH’s processing of satellite drift corrections, including dropping one platform which can no longer be corrected. The graphs below are taken from the new, current dataset.

The UAH dataset includes temperature results for air above the oceans, and thus should be most comparable to the SSTs. There is the additional feature that ocean air temps avoid Urban Heat Islands (UHI).  The graph below shows monthly anomalies for ocean temps since January 2015.

June ocean air temps rose in all regions after May’s drop, pulling the global average back up to match June 2017.  Now in July all regions have dropped back to May levels.  This July is warmer than July 2018, similar to July 2017, and of course cooler than 2016.  What is different now is how synchronized all the ocean regions are.

Land Air Temperatures Tracking Downward in Seesaw Pattern

We sometimes overlook that in climate temperature records, while the oceans are measured directly with SSTs, land temps are measured only indirectly.  The land temperature records at surface stations record air temps at 2 meters above ground.  UAH gives tlt anomalies for air over land separately from ocean air temps.  The graph updated for July is below.

Here we have fresh evidence of the greater volatility of land temperatures, along with an extraordinary departure by SH land.  Despite the small amount of SH land, it rose so sharply that it pulled the global average upward against cooling elsewhere.  Note also that any SH warming in July means a milder winter in those places, especially Australia.  The overall pattern shows global land temps following NH temps.  Note how much lower NH land temps are now compared to their peaks in previous years.

The longer term picture from UAH is a return to the mean for the period starting with 1995:

TLTs include mixing above the oceans and probably some influence from nearby, more volatile land temps.  Clearly NH and global land temps have been dropping in a seesaw pattern, now more than 1°C lower than the 2016 peak.  TLT measures registered the recent cooling later than SSTs from HadSST3, but are now showing the same pattern.  It seems the warming from the three El Niños has not persisted, and without them the period since 1995 would probably have cooled.  Of course, the future has not yet been written.

Today’s Arctic Ice Precedented 150 years Ago

This map from the Canadian Ice Service shows sea ice conditions in the western part of High Arctic islands on Sept. 8, 2018. The dark blue shows a low concentration (less than 10 per cent) of ice, while white shows a high concentration (100 per cent). At this time of the year, the Arctic ice cover is the highest it has been since 2014, the National Snow and Ice Data Center said Sept. 5.

The usual alarms are sounding again this summer to celebrate the annual melting of Arctic Sea Ice prior to refreezing again. Science Daily claims:

A new study provides a 110-year record of the total volume of Arctic sea ice, using early US ships’ voyages to verify the earlier part of the record. The current sea ice volume and rate of loss are unprecedented in the 110-year record.

Had they been willing to go a little further back in time they could have confirmed what others previously concluded from the same sources.

Researchers found that ice conditions in the 19th century were remarkably similar to today’s, with observations falling within normal variability. The study is Accounts from 19th-century Canadian Arctic Explorers’ Logs Reflect Present Climate Conditions (here) by James E. Overland, Pacific Marine Environmental Laboratory/NOAA, Seattle, Wash., and Kevin Wood, Arctic Research Office/NOAA, Silver Spring, Md.  H/t GWPF.  Excerpts in italics with my bolds.

Overview

This article demonstrates the use of historical instrument and descriptive records to assess the hypothesis that environmental conditions observed by 19th-century explorers in the Canadian archipelago were consistent with a Little Ice Age as evident in proxy records.  We find little evidence for extreme cold conditions.

It is clear that the first-hand observations of 19th-century explorers are not consistent with the hypothesized severe conditions of a multi-decadal Little Ice Age. Explorers encountered both warm and cool seasons, and generally typical ice conditions, in comparison to 20th-century norms.

Analysis

There were more than seventy expeditions or scientific enterprises of various types dispatched to the Canadian Arctic in the period between 1818 and 1910. From this number, we analyzed 44 original scientific reports and related narratives; many from expeditions spanning several years. The majority of the data come from large naval expeditions that wintered over in the Arctic and had the capacity to support an intensive scientific effort. A table listing the expeditions and data types is located at http://www.pmel.noaa.gov/arctic/history.  The data cover about one-third of the possible number of years depending on data type, and every decade is represented.

Our analysis focuses on four indicators of climatic change: summer sea ice extent, annual sea ice thickness, monthly mean temperature, and the onset of melt and freeze as estimated from daily mean temperature. Historical observations in these four categories were compared with modern reference data; the reference period varied, depending on data availability.  Both sea ice extent and the onset of melt and freeze were compared to the 30-year reference period 1971–2000; monthly means are compared to the 50-year period 1951–2000. Modern sea ice thickness records are less continuous, and some terminate in the 1980s; the reference period is therefore based on 19 to 26 years of homogeneous record.
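The scoring described here (each historical observation checked against the mean and standard deviation of a modern reference period) can be sketched in a few lines. The numbers below are made up for illustration; none of them are values from the study:

```python
import statistics

# Hypothetical reference-period values (e.g. July monthly mean temps, 1951-2000, one station)
reference = [2.1, 1.8, 2.5, 2.0, 1.6, 2.3, 1.9, 2.2, 2.4, 1.7]
mu = statistics.mean(reference)       # 2.05
sigma = statistics.stdev(reference)   # ~0.30

# Hypothetical 19th-century observations for the same month and location
historical = [2.0, 1.5, 2.6, 3.1, 1.9]

# Score each historical value against the reference mean +/- one standard deviation
within = [abs(x - mu) <= sigma for x in historical]
fraction = sum(within) / len(historical)
print(f"{fraction:.0%} of historical values fall within 1 sigma of the reference mean")
# -> 40% of historical values fall within 1 sigma of the reference mean
```

The study’s headline figure (63 percent of 343 monthly means within one standard deviation) is the same computation run per station and month against the matching reference series.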

arctic-explorers-fig1

Fig.1.

(a) Proxy record of standardized summer air temperature variation derived from ice cores taken on Devon Island. This proxy record suggests that a significantly colder climate prevailed in the 19th century. Shading indicates temperatures one standard deviation warmer or colder than average for the reference period 1901–1960 [Overpeck, 1998].

(b) Historical monthly mean temperature observations compared to the 20th-century reference period 1951–2000. Sixty-three percent of 343 monthly mean temperatures recorded on 19th-century expeditions between 1819 and 1854 fall within one standard deviation of the reference mean at nearby stations (reference data from Meteorological Service of Canada, 2002; and National Climatic Data Center, 2002).

(c) Onset of melt observed by expeditions between 1820 and 1906 expressed as departures from the mean for the reference period 1971–2000. The period of melt transition observed by 19th century explorers is not inconsistent with modern values.

(d) Onset of freeze observed between 1819 and 1905 compared to the reference period 1971–2000. The onset of freeze transition is frequently consistent with modern values, but in some cases occurred earlier than usual. The incidence of an early onset of freeze represents the largest departure from present conditions evident in the historical records examined in this study. Melt and freeze transition dates for the reference period 1971–2000 were calculated from temperature data extracted from the Global Daily Climatology Network database (National Climatic Data Center, 2002).

arctic-explorers-fig2

Fig.2. The ship tracks and winter-over locations of Arctic discovery expeditions from 1818 to 1859 are surprisingly consistent with present sea ice climatology (contours represented by shades of blue). The climatology shown reflects the percent frequency of sea ice presence on 10 September, the usual date of the annual ice minimum for the reference period 1971–2000 (Canadian Ice Service, 2002). On a number of occasions, expeditions came within 150 km of completing the Northwest Passage, but even in years with unfavorable ice conditions most ships were still able to reach comparatively advanced positions within the Canadian archipelago. By 1859, all possible routes comprising the Northwest Passage had been discovered.

Summary

As stated here before, Arctic ice is part of a self-oscillating system with extents expanding and retreating according to processes internal to the ocean-ice-atmosphere components. We don’t know exactly why 19th century ice extent was less than previously or less than the 1970s, but we can be sure it wasn’t due to fossil fuel emissions.

arctic-explorers-fig3rev

Explorers encountered both favorable and unfavorable ice conditions. This drawing from the vicinity of Beechey Island illustrates the situation of the H.M.S. Resolute and the steam-tender Pioneer on 5 September 1850 [from Facsimile of the Illustrated Arctic News, courtesy of Elmer E. Rasmuson Library, Univ. of Alaska Fairbanks].

Update: At Last A Climate Policy with Teeth

Update August 7 2019:

Tired of all the tokenism in proposals to “fight climate change”?  Like this one from Singapore today: “Want to do more to fight climate change? Cut down on driving, buying stuff and eating meat.”
Or: “What will it take to kick Singapore’s growing multimillion-dollar addiction to bottled water?”

Once again the UK is at the forefront showing how to get serious in fighting climate change. Euan Mearns has the story UK Government to Announce New Energy Policies Excerpts in italics with my bolds.

Amidst Brexit chaos, the Prime Minister will today introduce a white paper to Parliament detailing the Government’s new energy strategy. Stunned by criticism that she has failed to listen, she has ensured that the new policies take full cognisance of the concerns recently raised by striking school children. The new policy has four main strands. The Downing Street press release is below the fold.

[BEGINS] In view of the grave concerns raised by 5 to 17 year old children on the impact of CO2 on Earth’s climate, Her Majesty’s government will today introduce legislation that will address the most pressing issue of our times, namely CO2 emissions and the ensuing climate mayhem that they cause (Exhibit 1, Appendix 1). CO2 has risen to record levels from 0.0280% (pre-industrial) to 0.0405% today (see endnote 1). The new energy policy has four main strands:

1. Adult only flights

As of 1 January 2020, juveniles below the age of 18 will no longer be allowed to fly on commercial flights within the UK or between the UK and foreign destinations. A reciprocal arrangement will apply to incoming flights, which will not be allowed to land on British soil if there are juveniles on board. The government appreciates this will have a major impact on family holidays and tourism. But that is the policy goal. We can no longer countenance families flying all over the place simply for the sake of seeking some sunshine. Tourism is one of the most useless and resource-wasteful activities known to Mankind. What is the point in wrecking Earth’s climate to go and gaze at the Eiffel Tower or to visit Euro Disney when an equally enjoyable time can be had at our home-grown attractions, the Blackpool Tower and Center Parcs (Figures 1 and 2)?

The government appreciates this is going to have a catastrophic impact on the airline and airport industries. That is the whole point of the policy. We can no longer countenance giving shelter to evil polluting companies on these islands. The UK will press our allies throughout the OECD to follow suit. Given time, this should also have a catastrophic impact on the airliner manufacturing sector, where we expect Rolls Royce (engines) and BAE Systems (wings) to be hardest hit. We point to the troubles of Jaguar Land Rover, caused by government policy, as a shining example of government aptitude at wrecking British industry.

2. An end to North Sea Ferries

The government is often accused of lacking foresight, and we wish to stress that we are smart enough to recognise that selfish polluting families may simply try to circumvent the adult-only flight policy by using car ferries instead. The government sees no way of tackling this problem other than to close down all ferry services between the UK and mainland Europe, the island of Ireland and all other destinations. Car ferries travelling between Scottish islands come under the jurisdiction of the Scottish Parliament.

The activity of transporting a two tonne SUV on board a ship running on filthy dirty bunker fuel needs to be consigned to history. The idea of families boarding a ship to simply drive around Europe looking at stuff, while wrecking Earth’s climate, needs to be stopped.

The government is aware that these policies may seem anti-tourism. Nothing could be further from the truth. We remain committed to a robust, albeit crippled, tourist industry. British children will simply need to learn how to enjoy beach holidays at home (Figure 3). To prove this point, children will still be allowed to travel to Europe on all-electric Eurostar trains. And really rich families will even be allowed to take cars with them, so long as they are all-electric vehicles.

3. An end to driving to School

With immediate effect, the UK Government is to introduce a ban on children being driven to school by their parents in petrol or diesel cars. We will continue to allow children of very wealthy families to be driven to school in all-electric vehicles. Plug-in hybrid electric vehicles will not face an immediate ban but will be phased out over three years.

To enforce this ban, children will be encouraged to spy on their friends (or enemies) by covertly taking pictures of children being dropped off just around the corner and sharing these images on social media. This should create a deterrent to illegal child drop-offs.

4. Phasing out of gas or oil heating systems in schools

In keeping with the recently announced policy of the Dutch Government to phase out natural gas altogether, and the allied UK policy of ceasing to build homes with gas central heating, the government will bring forward a bill to phase out gas and oil heating systems in all our schools by 2022. Schools will instead be obliged to install all-electric heating that runs exclusively on in-situ, off-grid, renewable energy systems. Using the latest SMART technology, it is anticipated that this should be simple and straightforward to achieve.

Here’s the clever part. Children of all ages (5 to 17) will be allowed to participate in designing these SMART heating systems. The Government does not have spare funds to support this initiative, so schools will have to pay for it out of existing budgets. However, since renewable energy prices have tumbled, paying for this should not be a problem. If schools struggle to meet this bill, they will be encouraged either to lay off staff or to ask parents to pay for this vital flagship policy. [ENDS]

Thank you, Euan.  There is no fool like a Climate Fool, today or all year round.