Climate Faith ≠ Climate Works

Protestors march to raise awareness of climate change and ecological issues on the second day of the Glastonbury Festival at Worthy Farm, Somerset, England, Thursday, June 27, 2019. (Photo by Grant Pollard/Invision/AP)

Michael Lynch writes at Forbes: Is The Climate Change Debate A Replay Of The Reformation? Excerpts in italics with my bolds.

During the Reformation, there was an intense debate over whether Christians could enter paradise by doing good works, or whether faith alone allowed such a benefit. (See Fatal Discord: Erasmus, Luther and the Fight for the Western Mind by Michael Massing) This reminds me of the current attitude many have towards climate change policy, where some appear to think that faith alone is sufficient to solve the problem.

In the early days of the global warming debate, I read an English writer praising his country’s example of recognizing climate change compared to American skepticism, although he did admit the British hadn’t actually taken steps to address the problem. Similarly, the U.S. has reduced greenhouse gas emissions more than most countries in the past few years, albeit mostly due to cheap natural gas, yet it remains the climate villain in the eyes of many because the president is a denier.

Additionally, a lot of energy, well, effort, goes into demonizing actors or actions that have no practical impact on climate. For example, opposing the construction of oil and gas pipelines does not reduce consumption of oil and gas, and usually increases emissions. Suing the oil or auto industries for blocking climate policies or misleading the public about climate science appeals to many, but has no measurable environmental impact. The same goes for demanding divestment from fossil fuel company stocks.

Some of the new proposals to address climate change put me in mind of the debate between faith and works, especially when they seem more for demonstration purposes than for actually reducing emissions. Numerous governments have suggested phasing out all carbon-based electricity generation or all petroleum-fueled vehicles by a point decades into the future, and these tend to be hailed by activists as representing, if not solutions, then great strides forward. New York state, for example, just proposed phasing out carbon-based electricity by 2050; France wants to ban conventional vehicles by 2040, the U.K. by 2050. But as Michael Coren notes, “So far, it’s just words.”

Which reminds me of comedian Billy West who, in the persona of a radio personality, bragged to someone about his fund-raising, adding, “…but mostly it’s just pledges.” Governments have been great at setting goals, but implementation has been seriously lacking. The setting of goals seems more an act of faith than a carrying out of works.

And we have been here before. Many other national and sub-national environmental programs were later abandoned; the 1990s saw California enact mandates for electric vehicle sales—requiring 10% of sales in 2003 be zero emission vehicles—which was adopted by a number of other states, primarily in New England. Ultimately, it was abandoned after wasting billions of dollars. Numerous locales in the U.S. signed on to requirements for oxygenated gasoline, only to back out at the last minute when the cost became apparent.

Technology mandates are a mix of demonizing the producers and demonstrations of faith: telling utilities to buy a certain portion of carbon-free electricity is calling on someone else to act, while hiding the cost of the action. Those who believe in works would do better to buy their own renewable power, either producing it directly or from an independent power producer.

Automobile efficiency standards arguably fall into this category as well, that is, making it seem as if the manufacturers are to blame for consumers’ desire to purchase large, powerful vehicles. There are very fuel-efficient vehicles for sale in the United States, and they are much cheaper than the sauropods dominating American highways, so addressing manufacturer behavior is not the issue. Mandating vehicle efficiency is rather like demanding that a portion of butchers’ sales be veggie burgers; Beyond Meat has shown that success for veggie burgers comes from satisfying consumers, not lecturing them on environmental ethics.

This is where a carbon tax comes in: it is designed to change consumer preferences, reducing carbon emissions in favor of other consumables. It would also motivate producers to meet the demand for products that require less carbon emissions, either in their production or operation. Although the impact would grow over time, it would begin immediately upon implementation, and while it could theoretically be reversed, taxes on consumption tend to be extremely persistent.
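Lynch’s argument above is about the mechanism rather than the numbers, but a rough back-of-envelope sketch may help make it concrete. The emission factors below (about 8.9 kg CO2 per gallon of gasoline, and roughly 1.0 and 0.4 kg CO2 per kWh for coal and combined-cycle gas generation) are typical published figures used here as assumptions, not values from the article.

```python
# Back-of-envelope sketch: how a carbon tax maps onto consumer prices.
# Emission factors are assumed typical values, not figures from the article.
EMISSION_FACTORS = {
    "gasoline ($/gallon)":    8.9e-3,  # tonnes CO2 per gallon (~8.9 kg)
    "coal power ($/kWh)":     1.0e-3,  # tonnes CO2 per kWh (~1.0 kg)
    "gas CCGT power ($/kWh)": 0.4e-3,  # tonnes CO2 per kWh (~0.4 kg)
}

def price_increment(tax_per_tonne: float) -> dict:
    """Price increase implied by a carbon tax, per unit of each product."""
    return {product: tax_per_tonne * ef for product, ef in EMISSION_FACTORS.items()}

for product, delta in price_increment(40.0).items():  # e.g. a $40/tonne tax
    print(f"{product}: +${delta:.3f}")
# ~ +$0.36/gallon gasoline, +$0.04/kWh coal power, +$0.016/kWh gas power:
# carbon-intensive goods become dearer in proportion to their emissions, which
# is the immediate consumption-side signal Lynch contrasts with symbolic mandates.
```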

Comment:

I like the author’s comparison of the climate faithful marching in processions to the religious faithful marching on Holy Days. He is right to point out the hypocrisy of those obsessed over CO2 demonstrating their belief while still enjoying fossil fuel benefits. And he ridicules the symbolic but ineffectual policies proposed, noting they are merely another form of showing faith rather than taking action that works.

But he ends up accepting the warmists’ unproven premise: we are sinners because we burn fossil fuels. Moreover, he seems to suggest that imposing a carbon indulgence tax overcomes the moral shortcoming. In fact, Reformers strongly opposed the Catholic Church practice of taking money for future promises it could not deliver. Now this scam returns with governments taking the opportunity to fill their coffers. Further, as Bill Gates explained, the tax has a faulty premise: there is presently no substitute for fossil fuels powering modern societies.

The good news is, today’s weather and climate are within ordinary bounds.  The bad news:  If they actually turn climate faith into works, it is the end of life as you know it.

Warmists Make Bad Investors

Terence Corcoran explains at the Financial Post: The world needs more of what Exxon is selling (and will for decades). Excerpts in italics with my bolds.

World demand for Exxon’s products, fossil fuels, is expected to increase and remain steady over the coming decade

It’s the kind of story that lights up headlines: one of Britain’s biggest fund managers started selling shares in Exxon Mobil Corp. because the global oil giant wasn’t doing enough to address climate change.

The investment fund manager, Legal & General Investment Management (LGIM), oversees $1.3 trillion, making it the 11th largest money manager in the world. Legal and General (as it is called) is also one of scores of investment management firms, activists and hand-wringing organizations that are part of the burgeoning global sustainable and environmental social finance and governance effort to promote collaborative engagement and foster responsible investment and divestment. The goal is to enhance disclosure target-setting within corporations so that they can become leaders and builders of business models that will help the planet achieve a prosperous and sustainable future and overcome the climate emergency/crisis/disaster now faced by humanity if fossil fuels are not reduced to near-zero in the not-too-distant future.

As part of this movement, LGIM is a member of an organization called Climate Action 100+: Global Investors Driving Business Decisions, a collection of meddling institutional investors around the world, mostly government-run pension plans — although Quebec’s state pension fund, the Caisse de dépôt et placement du Québec, is the only obvious Canadian member of Climate Action 100+.

Exxon was one of five companies LGIM said it had placed on the divestment list as it steps up pressure on companies to address climate change: ExxonMobil Corporation, Hormel Foods, Korean Electric Power Corporation, Kroger and Metlife. “These names,” said LGIM, “are in addition to China Construction Bank, Rosneft Oil, Japan Post Holdings, Subaru, Loblaw and Sysco Corporation, all of whom remain engaged but who have yet to take the substantive actions to warrant re-instatement.”

Meryam Omi, head of Sustainability and Responsible Investment Strategy at LGIM, said the investment firm “will continue to push companies to build business models fit for a prosperous, sustainable future.” LGIM’s name-and-shame strategy was enthusiastically endorsed last week in Forbes magazine for maintaining “a sophisticated approach to climate change.”

One has to wonder, however, about the wisdom of divesting Exxon Mobil, one of the world’s most successful fossil-fuel producers, at a time when world energy forecasters project continuing expansion of fossil fuel demand well into …

Whoa. Hold on a second. Let me go back a few paragraphs. Loblaw? Is that our Canadian Loblaw, national champion virtue-signalling food industry giant, master of green product marketing, installer of solar panels on supermarket roofs, and most recently recipient of government funding to help upgrade the company’s refrigeration units to make them more green?

By gosh, it is our Loblaw. In a release, LGIM said “Loblaw, the Canadian grocery chain,” will continue as an “exclusion candidate.” According to Angeli Benham, a corporate governance manager who leads LGIM’s “pledge engagements with the food sector,” Canada’s leading food company “has made improvements in its governance, appointing a Lead Independent Director to ensure a counter-balancing voice to the Chair/CEO role. But we believe there are still a number of necessary steps for companies of such scale, and look forward to continuing engagement and support for substantive changes in the future.” Only then, it seems, will Loblaw be removed from the “divested” list and “reinstated.”

A colleague here at FP Comment, Peter Foster, sent an email to a public affairs person at Loblaw’s head office in Toronto about LGIM’s listing of Loblaw as a climate laggard. “Did they tell you where you are falling short? Are you taking steps to regain their approval? Does this mean they don’t invest in you at all, or just in one of their funds?” There has been no reply as of deadline.

Meanwhile, back to Exxon Mobil, from which LGIM has commenced divesting. Presumably the objective is to use slow trickle-down divestiture as a form of blackmail: change your ways, Exxon, or we will take away our investment, publicly announce our intent and drive your share price down.

This may be terrific green headline-grabbing investment politics, but in the stock market world the plan seems a little naive. According to the latest forecasts — from the International Energy Agency, BP’s Energy Outlook, and McKinsey — world demand for Exxon’s products, fossil fuels, is expected to increase and remain steady over the coming decades.

Natural gas demand, for example, surged last year, and McKinsey reports that gas demand will continue to increase from about 3,500 billion cubic feet (bcf) today to a peak of about 4,200 bcf in 2035 before declining slightly back to today’s level by 2050. Over the next 30 years, oil demand will also rise from about 100 million barrels a day (MMBD) today to 108 MMBD in 2033 before falling back to 100 MMBD by 2050.

That means that Exxon and other fossil-fuel companies are forecast to produce a total of 3,000 MMBD of oil over the next 30 years and 120,000 billion cubic feet of gas.
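As a quick arithmetic check on those cumulative figures, here is the sum implied by the annual rates quoted above, taking the article’s round numbers and units at face value (an assumption, since the piece mixes daily and annual units):

```python
# Rough check of the cumulative figures quoted above, using the article's own
# round numbers and units as given (assumptions, not an independent forecast).
oil_rate_mmbd = 100           # million barrels per day, roughly flat over the period
gas_rate_bcf_per_year = 4000  # near the midpoint of the quoted 3,500-4,200 range
years = 30

oil_total_mmbd_years = oil_rate_mmbd * years           # the article's "3,000 MMBD" figure
gas_total_bcf = gas_rate_bcf_per_year * years          # the article's "120,000 bcf" figure
oil_total_barrels = oil_rate_mmbd * 1e6 * 365 * years  # same oil total expressed in barrels

print(oil_total_mmbd_years)        # 3000
print(gas_total_bcf)               # 120000
print(f"{oil_total_barrels:.1e}")  # ~1.1e12, i.e. on the order of a trillion barrels
```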

By most investment standards, this is no time to be divesting fossil-fuel stocks. If LGIM and other dumb fund management clucks agitating for sustainable investment and divestment want no part of it, then let them have their political fun. Sell, baby, sell. As they do their bit to keep the fossil-fuel stocks low, they are creating buying opportunities for smarter investors. In future, it seems, the world will need more Exxon.

Comment:  How is it that so-called professional wealth managers can be so crippled with wish dreams and political correctness?  Do they think that everyone with disposable income lives in their progressive, post-modern bubble?  I hope they lose their shirts.  (Except for the Quebec pension fund, which needs to send me a check every month.)

Climate Zealots Throw Sand into Energy Supply

Roger Conrad reports on how the US energy infrastructure is hobbled by climate activists empowered by funds and lawyers. His article at Forbes is Best Bets On Pipeline Politics. Excerpts in italics with my bolds.

It seems like a long, long time ago in a galaxy far, far away. But barely two years back, permits for new US oil and especially natural gas pipelines were basically a formality.

Back then, the only US pipeline facing significant regulatory hurdles was TC Energy Corp’s (TRP) proposed Keystone XL pipeline to bring Alberta oil sands to US markets. And on the day the Obama Administration rejected that project for the final time, officials actually approved two oil pipelines elsewhere.

Everything changed following the November 2016 presidential election. Congress’ failure in 2016 to fill empty seats on the five-member Federal Energy Regulatory Commission led to a lack of quorum in early 2017.

New approvals ground to a halt for nearly six months. That gave “keep it in the ground” advocates precious time to tap into record fundraising, fueled by a groundswell of opposition to Trump Administration policies.

One result has been legal challenges to projects on an unprecedented scale at multiple venues. Work on Enbridge Inc’s (ENB) Line 3 pipeline expansion, for example, is now completed in Canada as well as North Dakota and Wisconsin.

Courts, however, have overturned Minnesota regulators’ prior approval of the project’s Environmental Impact Statement. That’s forced officials to go through the process again, delaying completion at least until the second half of 2020.

We’ve also seen a decided shift to more restrictive energy politics in several states, notably Colorado. Others like New York have dug in further, refusing to grant water permits for long-delayed projects like the Constitution Pipeline. That’s triggered warnings of prospective natural gas shortages from New York City’s distribution utility Consolidated Edison (ED), which is restricting new customer additions.

Time equals money when it comes to multi-year, multi-billion dollar projects. Bloomberg Intelligence estimates a $2.75 million cost increase per mile of planned pipeline for every quarter of delay in construction. The projected final cost of the Line 3 expansion, for example, is already billions higher than initial estimates.
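To make the Bloomberg Intelligence rule of thumb concrete, here is a minimal sketch; the pipeline length and delay in the example are hypothetical inputs, not figures from the article:

```python
# Delay-cost arithmetic implied by the Bloomberg Intelligence estimate quoted
# above: $2.75 million per mile of planned pipeline per quarter of delay.
COST_PER_MILE_PER_QUARTER = 2.75e6  # USD

def delay_cost(miles: float, quarters_delayed: float) -> float:
    """Added cost from regulatory or legal delay, per the quoted rule of thumb."""
    return COST_PER_MILE_PER_QUARTER * miles * quarters_delayed

# Hypothetical example: a 300-mile project held up for two years (8 quarters)
print(f"${delay_cost(300, 8) / 1e9:.1f} billion")  # ~$6.6 billion of added cost
```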

Consequently, the game being played by pipeline opponents is to delay. That means mounting enough challenges to ramp up costs and ultimately convince developers to walk away. And for the first time, they have the funds to do the job.

Opponents have been particularly successful quashing projects in New England and the Northeast US. To date, they’ve failed in Texas, where several giant pipelines are under construction. Kinder Morgan Inc (KMI) has one major gas pipeline from the Permian Basin coming on full stream later this year. It has another next year and a third in early stages of development.

Ground zero now in pipeline politics is the struggle of two projects in the Middle Atlantic/Southeast US to cross the Appalachian Trail: The Atlantic Coast Pipeline and the Mountain Valley Pipeline.

These projects’ ultimate success or failure will have a huge impact on the long-term profitability of Appalachia-based gas and oil producers, which are sitting on huge reserves in the Marcellus and Utica shale. Ironically, the longer they’re delayed, the greater demand will be for Texas energy and by extension new pipelines in the state.

That will benefit Texas developers like Kinder Morgan and Plains All-American Pipeline (PAA), which is focused on oil. And it will hit pipeline companies in the East like EQM Midstream Partners LP (EQM), which faces a massive writeoff if the Mountain Valley Pipeline can’t win through.

To be sure, natural gas development especially still has plenty of support in the US. Replacing older coal-fired facilities with gas, for example, reduces operating costs and electricity rates. New plants increase utilities’ rate base, spurring earnings and dividend growth. And the prospective environmental benefits are enormous, cutting future legal liabilities.

Gas emits none of coal’s particulate matter, which is blamed for a host of respiratory woes. It emits no acid rain gases that have caused billions in property damage and creates no toxic ash.

As for carbon dioxide, equivalent sized gas power plants emit less than half what coal does. In fact, gas adoption is the single biggest reason America is still meeting greenhouse gas commitments under the Paris Accords. Finally, surging US energy production has dramatically shifted global energy politics, demonstrated by the relative lack of reaction in oil prices to elevated tensions in the Persian Gulf.

During the Obama years, those facts were more than enough to hold together a consensus for US natural gas development. And the result was a relatively easy path for pipeline approvals.

These days, that’s not enough for pipelines to succeed. The silver lining is the more difficult it becomes to build, the more valuable existing infrastructure and ultimately successful projects will be.

In the days when pipeline approvals were swift, any company raising funds economically could get projects built. These days, would-be developers need to be financially and operationally strong enough to handle legal challenges wherever they occur.

Footnote:  The Climatist Manifesto

Mission: Deindustrialize Civilization

Goal: Drive industrial corporations into Bankruptcy

Strategy: Cut off the Supply of Cheap, Reliable Energy

Tactics:

  • Raise the price of fossil fuels
  • Force the power grid to use expensive, unreliable renewables
  • Demonize Nuclear energy
  • Spread fear of extraction technologies such as fracking
  • Increase regulatory costs on energy production
  • Scare investors away from carbon energy companies
  • Stop pipelines because they are too safe and efficient
  • Force all companies to account for carbon usage and risk

See Also Why People Rely on Pipelines

Payback Upon Climate Grasshoppers

The LIA Warming Rebound Is Over

Figure 1. Graph showing the number of volcanoes reported to have been active each year since 1800 CE. Total number of volcanoes with reported eruptions per year (thin upper black line) and 10-year running mean of same data (thick upper red line). Lower lines show only the annual number of volcanoes producing large eruptions (>= 0.1 km3 of tephra or magma) and scale is enlarged on the right axis; thick red lower line again shows 10-year running mean. Global Volcanism Project Discussion

Thanks to Dr. Francis Manns for drawing my attention to the role of volcanoes as a climate factor, particularly related to the onset of the Little Ice Age (LIA), 1400 to 1900 AD. I was aware that the temperature record since about 1850 can be explained by a steady rebound of 0.5°C per century overlaid with a quasi-60-year cycle, most likely oceanic driven. See below Dr. Syun Akasofu’s 2009 diagram from his paper Two Natural Components of Recent Warming.

When I presented this diagram to my warmist friends, they would respond, “But you don’t know what caused the LIA or what ended it!” To which I would say, “True, but we know it wasn’t due to burning fossil fuels.” Now I find there is a body of evidence suggesting what caused the LIA and why the temperature rebound may be over. Part of it is the familiar observation that the LIA coincided with a period when the sun was lacking sunspots, the Maunder Minimum.
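For readers who want to see what the Akasofu-style decomposition above amounts to, here is a minimal sketch. The 0.5°C-per-century recovery and the quasi-60-year cycle come from the description above; the cycle amplitude, phase and reference year are illustrative assumptions, not fitted values.

```python
import numpy as np

def akasofu_two_component(years, rebound_per_century=0.5, cycle_amplitude=0.1,
                          cycle_period=60.0, cycle_peak_year=2000.0, ref_year=1850.0):
    """Toy version of Akasofu's decomposition: a linear LIA-recovery trend plus
    a quasi-60-year (presumably oceanic) oscillation, returned as deg C anomaly."""
    years = np.asarray(years, dtype=float)
    trend = rebound_per_century * (years - ref_year) / 100.0
    cycle = cycle_amplitude * np.cos(2 * np.pi * (years - cycle_peak_year) / cycle_period)
    return trend + cycle

anomaly = akasofu_two_component(np.arange(1850, 2021))
# The oscillation adds to the trend through roughly 1970-2000 (apparent
# acceleration) and subtracts after 2000 (apparent hiatus), while the
# underlying recovery stays at ~0.5 C per century throughout.
```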

Not to be overlooked is the climatic role of volcanic activity in inducing deep cooling episodes such as the LIA. Jihong Cole-Dai explains in a paper published in 2010 entitled Volcanoes and climate. Excerpt in italics with my bolds.

There has been strong interest in the role of volcanism during the climatic episodes of the Medieval Warm Period (MWP, 800–1200 AD) and Little Ice Age (LIA, 1400–1900 AD), when direct human influence on the climate was negligible. Several studies attempted to determine the influence of solar forcing and volcanic forcing and came to different conclusions: Crowley and colleagues suggested that increased frequency of stratospheric eruptions in the seventeenth century and again in the early nineteenth century was responsible in large part for the LIA. Shindell et al. concluded that the LIA is the result of reduced solar irradiance, as seen in the Maunder Minimum of sunspots, during the time period. Ice core records show that the number of large volcanic eruptions between 800 and 1100 AD is possibly small (Figure 1), when compared with the eruption frequency during the LIA. Several researchers have proposed that more frequent large eruptions during the thirteenth century (Figure 1) contributed to the climatic transition from MWP to LIA, perhaps as a part of the global shift from a warmer to a colder climate regime. This suggests that the volcanic impact may be particularly significant during periods of climatic transitions.

How volcanoes impact on the atmosphere and climate

Alan Robock explains Climatic Impacts of Volcanic Eruptions in Chapter 53 of the Encyclopedia of Volcanoes.  Excerpts in italics with my bolds.

The major component of volcanic eruptions is the matter that emerges as solid, lithic material or solidifies into large particles, which are referred to as ash or tephra. These particles fall out of the atmosphere very rapidly, on timescales of minutes to a few days, and thus have no climatic impacts but are of great interest to volcanologists, as seen in the rest of this encyclopedia. When an eruption column still laden with these hot particles descends down the slopes of a volcano, this pyroclastic flow can be deadly to those unlucky enough to be at the base of the volcano. The destruction of Pompeii and Herculaneum after the AD 79 Vesuvius eruption is the most famous example.

Volcanic eruptions typically also emit gases, with H2O, N2, and CO2 being the most abundant. Over the lifetime of the Earth, these gases have been the main source of the Earth’s atmosphere and ocean after the primitive atmosphere of hydrogen and helium was lost to space. The water has condensed into the oceans, the CO2 has been changed by plants into O2 or formed carbonates, which sink to the ocean bottom, and some of the C has turned into fossil fuels. Of course, we eat plants and animals, which eat the plants, we drink the water, and we breathe the oxygen, so each of us is made of volcanic emissions. The atmosphere is now mainly composed of N2 (78%) and O2 (21%), both of which had sources in volcanic emissions.

Of these abundant gases, both H2O and CO2 are important greenhouse gases, but their atmospheric concentrations are so large (even for CO2 at only 400 ppm in 2013) that individual eruptions have a negligible effect on their concentrations and do not directly impact the greenhouse effect. Global annually averaged emissions of CO2 from volcanic eruptions since 1750 have been at least 100 times smaller than those from human activities. Rather the most important climatic effect of explosive volcanic eruptions is through their emission of sulfur species to the stratosphere, mainly in the form of SO2, but possibly sometimes as H2S. These sulfur species react with H2O to form H2SO4 on a timescale of weeks, and the resulting sulfate aerosols produce the dominant radiative effect from volcanic eruptions.

The major effect of a volcanic eruption on the climate system is the effect of the stratospheric cloud on solar radiation (Figure 53.1). Some of the radiation is scattered back to space, increasing the planetary albedo and cooling the Earth’s atmosphere system. The sulfate aerosol particles (typical effective radius of 0.5 μm, about the same size as the wavelength of visible light) also forward scatter much of the solar radiation, reducing the direct solar beam but increasing the brightness of the sky. After the 1991 Pinatubo eruption, the sky around the sun appeared more white than blue because of this. After the El Chichón eruption of 1982 and the Pinatubo eruption of 1991, the direct radiation was significantly reduced, but the diffuse radiation was enhanced by almost as much. Nevertheless, the volcanic aerosol clouds reduced the total radiation received at the surface.

Crowley et al 2008 go into the details in their paper Volcanism and the Little Ice Age. Excerpts in italics with my bolds.

Although solar variability has often been considered the primary agent for LIA cooling, the most comprehensive test of this explanation (Hegerl et al., 2003) points instead to volcanism being substantially more important, explaining as much as 40% of the decadal-scale variance during the LIA. Yet, one problem that has continually plagued climate researchers is that the paleo-volcanic record, reconstructed from Antarctic and Greenland ice cores, cannot be well calibrated against the instrumental record. This is because the primary instrumental volcano reconstruction used by the climate community is that of Sato et al. (1993), which is relatively poorly constrained by observations prior to 1960 (especially in the southern hemisphere).

Here, we report on a new study that has successfully calibrated the Antarctic sulfate record of volcanism from the 1991 eruptions of Pinatubo (Philippines) and Hudson (Chile) against satellite aerosol optical depth (AOD) data (AOD is a measure of stratospheric transparency to incoming solar radiation). A total of 22 cores yield an area-weighted sulfate accumulation rate of 10.5 kg/km2, which translates into a conversion rate of 0.011 AOD per kg/km2 of sulfate. We validated our time series by comparing a canonical growth and decay curve for eruptions against Krakatau (1883), the 1902 Caribbean eruptions (primarily Santa Maria), and the 1912 eruption of Novarupta/Katmai (Alaska).
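The calibration quoted above is simply a linear conversion from ice-core sulfate deposition to aerosol optical depth; a small sketch using the paper’s numbers:

```python
# Linear calibration quoted above: 0.011 AOD per (kg/km^2) of Antarctic sulfate.
AOD_PER_SULFATE = 0.011

def aod_from_sulfate(sulfate_kg_per_km2: float) -> float:
    """Convert an ice-core sulfate accumulation rate to aerosol optical depth."""
    return AOD_PER_SULFATE * sulfate_kg_per_km2

# The 1991 Pinatubo + Hudson deposition of 10.5 kg/km^2 implies:
print(round(aod_from_sulfate(10.5), 3))  # ~0.116 AOD, the satellite-era anchor point
```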

We therefore applied the methodology to part of the LIA record that had some of the largest temperature changes over the last millennium.

Figure 2: Comparison of 30-90°N version of ice core reconstruction with Jones et al. (1998) temperature reconstruction over the interval 1630-1850. Vertical dashed lines denote levels of coincidence between eruptions and reconstructed cooling. AOD = Aerosol Optical Depth.

The ice core chronology of volcanoes is completely independent of the (primarily) tree ring based temperature reconstruction. The volcano reconstruction is deemed accurate to within 0 ± 1 years over this interval. There is a striking agreement between 16 eruptions and cooling events over the interval 1630-1850. Of particular note is the very large cooling in 1641-1642, due to the concatenation of sulfate plumes from two eruptions (one in Japan and one in the Philippines), and a string of eruptions starting in 1667 and culminating in a large tropical eruption in 1694 (tentatively attributed to Long Island, off New Guinea). This large tropical eruption (inferred from ice core sulfate peaks in both hemispheres) occurred almost exactly at the beginning of the coldest phase of the LIA in Europe and represents a strong argument against the implicit link of Late Maunder Minimum (1640-1710) cooling to solar irradiance changes.

Figure 1: Comparison of new ice core reconstruction with various instrumental-based reconstructions of stratospheric aerosol forcing. The asterisks refer to some modification to the instrumental data; for Sato et al. (1993) and the Lunar AOD, the asterisk refers to the background AOD being removed for the last 40 years. For Stothers (1996), it refers to the fact that instrumental observations for Krakatau (1883) and the 1902 Caribbean eruptions were only for the northern hemisphere. To obtain a global AOD for these estimates we used Stothers (1996) data for the northern hemisphere and our data for the southern hemisphere. The reconstruction for Agung eruption (1963) employed Stothers (1996) results from 90°N-30°S and the Antarctic ice core data for 30-90°S.

During the 18th century lull in eruptions, temperatures recovered somewhat but then cooled early in the 19th century. The sequence begins with a newly postulated unknown tropical eruption in mid-to-late 1804, which deposited sulfate in both Greenland and Antarctica. Then, there are four well-documented eruptions—an unknown tropical eruption in 1809, Tambora (1815) and a second doublet tentatively attributed in part to Babuyan (Philippines) in 1831 and Cosiguina (Nicaragua) in 1835. These closely spaced eruptions are not only large but have a temporally extended effect on climate, due to the fact that they reoccur within the 10-year recovery timescale of the ocean mixed layer.

The ocean has not recovered from the first eruption so the second eruption drives the temperatures to an even lower state.

Implications for Contemporary Climate Science

In this context Dr. Francis Manns went looking for a volcanic signature in recent temperature records. His paper is Volcano and Enso Punctuation of North American Temperature: Regression Toward the Mean. Excerpts in italics with my bolds.

Abstract: Contrary to popular media and urban mythology, the global warming we have experienced since the Little Ice Age is likely finished. A review of 10 temperature time series from US cities, ranging from the hottest in Death Valley, CA, to possibly the most isolated and remote at Key West, FL, shows rebound from the Little Ice Age (which ended in the Alps by 1840) by 1870. The United States reached temperatures like modern temperatures (1950 – 2000) by about 1870, then declined precipitously, caused principally by Krakatoa and a series of other violent eruptions. Nine of these time series started when instrumental measurement was in its infancy and the world was cooled by volcanic dust and sulphate spewed into the atmosphere and distributed by the jet streams. These ten cities represent a sample of the millions of temperature measurements used in climate models. The average annual temperatures are useful because they account for seasonal fluctuations. In addition, time series from these cities are punctuated by the El Nino Southern Oscillation (ENSO).

As should be expected, temperature at each city reacted differently to differing events. Several cities measured the effects of Krakatoa in 1883, while only Death Valley, CA and Berkeley, CA sensed the minor new volcano Paricutin in Michoacán, Mexico. The Key West time series shows rapid rebound from the Little Ice Age, as do Albany, NY, Harrisburg, PA, and Chicago, IL, long before the petroleum-industrial revolution got into full swing. Recording at most sites started during a volcanically induced temperature minimum, thus giving an impression of global warming for which industrial carbon dioxide is persuasively held responsible. Carbon dioxide, however, cannot be proven responsible for these temperatures. These and likely subsequent temperatures could be the result of regression to the normal equilibrium temperatures of the earth (for now). If one were to remove the volcanic punctuation and El Nino Southern Oscillation (ENSO) input, many would display very little alarming warming from 1815 to 2000. This review illustrates the weakness of linear regression as a measure of change. If there is a systemic reason for the global warming hypothesis, it is an anthropogenic error in both origin and termination. ENSO complements and confirms the validity of NOAA temperature data. Temperatures since 2000 during the current hiatus are not available because NOAA has closed the public website.
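Manns’s point about the weakness of linear regression can be illustrated with a toy calculation on entirely synthetic data (not his series): a flat temperature record whose early decades are depressed by volcanic cooling acquires a spurious warming trend when fitted with a straight line.

```python
import numpy as np

# Synthetic illustration only: a flat series with noise, depressed by 0.5 C
# during a hypothetical volcanic era at the start of the record.
rng = np.random.default_rng(0)
years = np.arange(1880, 2001)
temps = 12.0 + rng.normal(0, 0.3, years.size)  # flat "equilibrium" climate, deg C
temps[years <= 1918] -= 0.5                    # toy Krakatoa-to-Katla depression

slope_all, _ = np.polyfit(years, temps, 1)
slope_post, _ = np.polyfit(years[years > 1918], temps[years > 1918], 1)

print(f"trend incl. volcanic start: {slope_all * 100:+.2f} C/century")   # spuriously positive
print(f"trend after 1918 only:      {slope_post * 100:+.2f} C/century")  # ~0, the flat truth
```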

Example of time series from Manns. Numbers refer to major named volcano eruptions listed in his paper. For instance, #3 was Krakatoa.

The cooling effect is said to have lasted for 5 years after Krakatoa erupted – from 1883 to 1888. Examination of these charts, however, shows that, e.g., Krakatoa did not add to the cooling effect from earlier eruptions of Cosiguina in 1835 and Askja in 1875. The temperature charts all show rapid rebound to equilibrium temperature for the region affected, in a year or two at most.

Manns Map

Fourteen major volcanic eruptions, however, were recorded between 1883 and 1918 (Robock, 2000, and this essay). Some erupted for days or weeks and some were cataclysmic and shorter. The sum of all these eruptions from Krakatoa onward affected temperatures early in the instrumental age. Judging from wasting glaciers in the Alps, abrupt retreat began about 1860.

Manns Conclusions:

1) Four of these time series (Albany, Harrisburg, Chicago and Key West) show recovery to the range of today’s temperatures by 1870, before the eruption of Askja in 1875. The temperature rebounded very quickly after the Little Ice Age in the northern hemisphere.

Manns ENSO Map

2) Volcanic eruptions and the unrelated huge swings from ENSO largely rule global temperature. Volcanic history and the El Nino Southern Oscillation (ENSO) trump all other increments of temperature that may be hidden in the lists.

3) The sum of the eruptions from Krakatoa (1883) to Katla (1918) and Cerro Azul (1932) was a cold start for climate models.

4) It is beyond doubt that academic and bureau climate models use data that were gathered when volcanic activity had depressed global temperature. The cluster of eruptions from Krakatoa to Katla (1883–1918) was global.

5) Modern events, Mount Saint Helens and Pinatubo, moreover, were a fraction of the intensity of the late 19th and early 20th century eruptions.

6) The demise of frequent violent volcanoes has allowed the planet to regress toward a norm (for now).

Summary

These findings describe a natural process by which a series of volcanoes along with a period of quiet solar cycles ended the Medieval Warm Period (MWP), chilling the land and inducing deep oceanic cooling resulting in the Little Ice Age. With much less violent volcanic activity in the 20th century, coincidental with typically active solar cycles, a Modern Warm Period ensued with temperatures rebounding back to approximately the same as before the LIA.

This suggests that humans and the biosphere were enhanced by a warming process that has ended. The solar cycles are again going quiet and are forecast to continue that way. Presently, volcanic activity has been routine, showing no increase over the last 100 years. No one knows how long the current warm period, a benefit to us from the ocean recovering after the LIA, will last. But future periods are as likely to be cooler as to be warmer than the present.

Scientific vs. Social Authenticity

Credit: Stanislaw Pytel Getty Images

This post was triggered by an essay in Scientific American Authenticity under Fire by Scott Barry Kaufman. He raises modern issues and expresses a social and psychological sense of authenticity that left me unsatisfied.  So following that, I turn to a scientific standard much richer in meaning and closer to my understanding.

Social Authenticity

Researchers are calling into question authenticity as a scientifically viable concept

Authenticity is one of the most valued characteristics in our society. As children we are taught to just “be ourselves”, and as adults we can choose from a large number of self-help books that will tell us how important it is to get in touch with our “real self”. It’s taken as a given by everyone that authenticity is a real thing and that it is worth cultivating.

Even the science of authenticity has surged in recent years, with hundreds of journal articles, conferences, and workshops. However, the more that researchers have put authenticity under the microscope, the more muddied the waters of authenticity have become.

Many common ideas about authenticity are being overturned.
Turns out, authenticity is a real mess.

One big problem with authenticity is that there is a lack of consensus among both the general public and among psychologists about what it actually means for someone or something to be authentic. Are you being most authentic when you are being congruent with your physiological states, emotions, and beliefs, whatever they may be?

Another thorny issue is measurement. Virtually all measures of authenticity involve self-report measures. However, people often do not know what they are really like or why they actually do what they do. So a test that asks people to report how authentic they are is unlikely to be a truly accurate measure of their authenticity.

Perhaps the thorniest issue of them all though is the entire notion of the “real self”. The humanistic psychotherapist Carl Rogers noted that many people who seek psychotherapy are plagued by the question “Who am I, really?” While people spend so much time searching for their real self, the stark reality is that all of the aspects of your mind are part of you.

So what is this “true self” that people are always talking about? Once you take a closer scientific examination, it seems that what people refer to as their “true self” really is just the aspects of themselves that make them feel the best about themselves.

Even more perplexing, it turns out that most people’s feelings of authenticity have little to do with acting in accord with their actual nature. The reality appears to be quite the opposite. All people tend to feel most authentic when having the same experiences, regardless of their unique personality.

Another counterintuitive finding is that people actually tend to feel most authentic when they are acting in socially desirable ways, not when they are going against the grain of cultural dictates (which is how authenticity is typically portrayed). On the flip side, people tend to feel inauthentic when they are feeling socially isolated, or feel as though they have fallen short of the standards of others.

Therefore, what people think of as their true self may actually just be what people want to be seen as. According to social psychologist Roy Baumeister, we will report feeling highly authentic and satisfied when the way others think of us matches up with how we want to be seen, and when our actions “are conducive to establishing, maintaining, and enjoying our desired reputation.”

Conversely, Baumeister argues that when people fail to achieve their desired reputation, they will dismiss their actions as inauthentic, as not reflecting their true self (“That’s not who I am”). As Baumeister notes, “As familiar examples, such repudiation seems central to many of the public appeals by celebrities and politicians caught abusing illegal drugs, having illicit sex, embezzling or bribing, and other reputation-damaging actions.”

Kaufman Conclusion

As long as you are working towards growth in the direction of who you truly want to be, that counts as authentic in my book regardless of whether it is who you are at this very moment. The first step to healthy authenticity is shedding your positivity biases and seeing yourself for who you are, in all of your contradictory and complex splendor. Full acceptance doesn’t mean you like everything you see, but it does mean that you’ve taken the most important first step toward actually becoming the whole person you most wish to become. As Carl Rogers noted, “the curious paradox is that when I accept myself just as I am, then I can change.”

My Comment:
Kaufman describes contemporary ego-centric group-thinking, which leads to the philosophical dead end called solipsism. As an epistemological position, solipsism holds that knowledge of anything outside one’s own mind is unsure; the external world and other minds cannot be known and might not exist outside the mind.

His discussion proves the earlier assertion that authenticity (in the social or psychological sense) is indeed a mess. The author finds no objective basis to determine fidelity to reality, thus leaving everyone struggling over whether to be self-directed or other-directed. As we know from Facebook, most resolve that conflict by competing to see who can publish the most selfies while acquiring the most “friends.” This is the best Scientific American can do? The swamp is huge and deep indeed.

It reminds me of what Ross Pomeroy wrote at Real Science: “Psychology, as a discipline, is a house made of sand, based on analyzing inherently fickle human behavior, held together with poorly-defined concepts, and explored with often scant methodological rigor. Indeed, there’s a strong case to be made that psychology is barely a science.”

Scientific Authenticity

In contrast, let us consider some writing by Philip Kanarev. A practicing physicist, he is concerned with the demise of scientific thinking and teaching and calls for a return to fundamentals. His essay is Scientific Authenticity Criteria by Ph. M. Kanarev in the General Science Journal. Excerpts in italics with my bolds.

A conjunction of scientific results in the 21st century has reached a level that provides an opportunity to find and to systematize the scientific authenticity criteria of precise knowledge already gained by mankind.

Neither Euclid, nor Newton gave precise definitions of the notions of an axiom, a postulate and a hypothesis. As a result, Newton called his laws the axioms, but it was in conflict with the Euclidean ideas concerning the essence of the axioms. In order to eliminate these contradictions, it was necessary to give a definition not only to the notions of the axiom and the postulate, but also to the notion of the hypothesis. This necessity is stipulated by the fact that any scientific research begins with an assumption regarding the reason causing a phenomenon or process being studied. A formulation of this assumption is a scientific hypothesis.

Thus, the axioms and the postulates are the main criteria of authenticity of any scientific result.

An axiom is an obvious statement, which requires no experimental check and has no exceptions. Absolute authenticity of an axiom appears from this definition. It protects it by a vivid connection with reality. A scientific value of an axiom does not depend on its recognition; that is why disregarding an axiom as a scientific authenticity criterion is similar to ineffectual scientific work.

A postulate is a non-obvious statement, its reliability being proven in the way of experiment or a set of theoretic results originating from the experiments. The reliability of a postulate is determined by the level of acknowledgement by the scientific community. That’s why its value is not absolute.

An hypothesis is an unproven statement, which is not a postulate. A proof can be theoretical and experimental. Both proofs should not be at variance with the axioms and the recognized postulates. Only after that, hypothetical statements gain the status of postulates, and the statements, which sum up a set of axioms and postulates, gain the status of a trusted theory.

The first axioms were formulated by Euclid. Here are some of them:
1 – To draw a straight line from any point to any point.
2 – To produce a finite straight line continuously in a straight line.
3 – That all right angles equal one another.

Euclidean formulation concerning the parallelism of two straight lines proved to be less concise. As a result, it was questioned and analyzed in the middle of the 19th century. It was accepted that two parallel straight lines cross at infinity. Despite a complete absence of evidence of this statement, the status of an axiom was attached to it. Mankind paid a lot for such an agreement among the scientists. All theories based on this axiom proved to be faulty. The physical theories of the 20th century proved to be the principal ones among them.

In order to understand the complicated situation being formed, one has to return to Euclidean axioms and assess their completeness. It has turned out that there are no axioms, which reflect the properties of the primary elements of the universe (space, matter and time), among those of Euclid. There are no phenomena, which could compress space, stretch it or distort it, in the nature; that is why space is absolute. There are no phenomena, which change the rate of the passing of time in nature. Time does not depend on anything; that’s why we have every reason to consider time absolute. The absolute nature of space and time has been acknowledged by scientists since Euclidean times. But when his axiom concerning the parallelism of straight lines was disputed, the ideas of relativity of space and time as well as the new theories, which were based on these ideas and proved (as we noted) to be faulty, appeared.

A law of acknowledgement of new scientific achievements was introduced by Max Planck. He formulated it in the following way: “A new scientific truth does not triumph by convincing its opponents and making them see the light, but rather because its opponents eventually die, and a new generation grows up that is familiar with it”. Our attempt to report the reliability of this law to the authorities is in the history of science an unnecessary intention. Certainly, time appeared in space only after matter. But still we do not know of a source that produces elementary particles – building blocks of the material world. That’s why we have no reason to consider matter absolute. But it does not prevent us from paying attention to an interconnection of the primary elements of the universe: space, matter and time. They exist only together and regardless of each other. This fact is vivid, and we have every reason to consider an indivisible existence of space, matter and time as an axiomatic one, and to call the axiom, which reflects this fact, the Unity axiom. The philosophic essence of this axiom has been noted long ago, but the practitioners of the exact sciences have failed to pay attention to the fact that it is implemented in the experimental and analytical processes of cognition of the world. When material bodies move, the mathematical description of this motion should be based on the Unity axiom. It appears from this axiom, that an axis of motion of any object is the time function. Almost all physical theories of the 20th century are in conflict with the Unity axiom. It is painful to write about it in detail.

Let us go on analyzing the role of postulates as scientific authenticity criteria. First of all, let us recollect the famous postulate by Niels Bohr concerning the orbital motion of the electrons in atoms. This catchy model of the process of the interaction of the electrons in the atoms goes on being formed in the minds of pupils in school despite the fact that its impropriety was proven more than 10 years ago.

The role of Niels Bohr’s generalized postulate is great. Practically, it is used in the whole of modern chemistry and the larger part of physics. This postulate is based on the calculation of the spectrum of the hydrogen atom. But it is impossible to calculate the spectrum of the first orbit of the helium atom (which occupies the second place in Mendeleev’s table) with Bohr’s postulate, to say nothing of the spectra of more complicated atoms and ions. It was enough to dispute the authenticity of Bohr’s postulate, but the mission of doubt has fallen to our lot for some reason. Two years were devoted to decoding the spectrum of the first electron of the helium atom. As a result, the law of formation of the spectra of atoms and ions has taken place as well as the law of the change of binding energy of the electron with the protons of the nuclei when energy-jumps take place in the atoms. It has turned out that there is no energy of orbital motion of the electrons in these laws; there are only the energies of their linear interaction with the protons of the nuclei.

Thereafter, it has become clear that only elementary particle models can play the role of the scientific result authenticity criteria in cognition of the micro-world. From the analysis of behaviour of these models, one should derive the mathematical models, which have been ascertained analytically long ago, and describe their behaviour in the experiments that have been carried out earlier.

The ascertained models of the photons of all frequencies, the electron, the proton and the neutron meet the above-mentioned requirements. They are interconnected with each other by such a large set of theoretical and experimental information, whose impropriety cannot be proven. This is the main feature of the proximity to reality of the ascertained models of the principle elementary particles. Certainly, the process of their generation has begun from a formulation of the hypothesis concerning their structures. Sequential development of the description of these structures and their behaviour during the interactions extended the range of experimental data where the parameters of the elementary particles and their interactions were registered. For example, the formation and behaviour of electrons are governed by more than 20 constants.

We have every reason to state that the models of the photons, the electron, the proton and the neutron, which have been ascertained by us, as well as the principles of formation of the nuclei, the atoms, the ions, the molecules and the clusters already occupy a foundation for the postulates, and new scientific knowledge will cement its strength.

Science has a rather complete list of criteria in order to estimate the authenticity of scientific investigative results. The axioms (the obvious statements, which require no experimental check and have no exceptions) occupy the first place; the second place is occupied by the postulates. If the new theory is in conflict with at least one axiom, it will be rejected immediately by the scientific community without discussion. If experimental data appear which are in conflict with any postulate (as happened, for example, to Newton’s first law), the future scientific community, which has learned a lesson from the scientific cowardice of the academic elite of the 20th century, will submit such a postulate to a collective analysis of its authenticity.

Kanarev Conclusion

To the academicians who have made many mistakes in knowledge of the fields of physics and chemistry, we wish them to recover their sight in old age and be glad that these mistakes are already amended. It is time to understand that a prolongation of stuffing the heads of young people with faulty knowledge is similar to a crime that will be taken to heart emotionally in the near future.

The time has ended, when a diploma confirming higher education was enough in order to get a job. Now it is not a convincing argument for an employer; in order to be on the safe side, he hires a young graduate as a probationer at first as he wants to see what the graduate knows and what he is able to do. A new system of higher education has almost nullified a possibility for the student to have the skills of practical work according to his specialty and has preserved a requirement to have moronic knowledge, i.e. the knowledge which does not reflect reality.

My Summary

In Science, authenticity requires fidelity to axioms and postulates describing natural realities. It also means insisting that hypotheses be validated by experimental results. Climate science claims are not scientifically authentic unless or until confirmed by observations, and not simply projections from a family of divergent computer models. And despite all of the social support for climate hysteria, those fears are again more stuffing of nonsense into heads of youth and of the scientifically illiterate.

See Also Degrees of Climate Truth

Superhuman Renewables Targets

Faster than a speeding bullet! More powerful than a locomotive! Able to leap tall buildings in a single bound! It’s Superman.

New York is not the only climate cuckoo’s nest in the United States. Here are four more states promising efforts to install wind and solar power at rates that would exhaust Superman. EIA reports: Four states updated their renewable portfolio standards in the first half of 2019. Excerpts in italics with my bolds.

As of the end of 2018, 29 states and the District of Columbia had renewable portfolio standards (RPS), or policies that require electricity suppliers to source a certain portion of their electricity from designated renewable resources or eligible technologies. Four states—New Mexico, Washington, Nevada, and Maryland—and the District of Columbia have updated their RPS since the start of 2019.

States with legally binding RPS collectively accounted for 63% of electricity retail sales in the United States in 2018. In addition to the 29 states with binding RPS policies, 8 states have nonbinding renewable portfolio goals.

New Mexico increased its overall RPS target in March 2019 to 100% of electricity sales from carbon-free generation by 2045, up from the previous target of 20% renewable generation by 2020. The new policy only applies to investor-owned utilities; cooperative electric utilities have until 2050 to reach the 100% carbon-free generation goal. The target has interim goals of 50% renewable generation by 2030 and 80% renewable generation by 2040.

In April 2019, the Nevada legislature increased its RPS to 50% of sales from renewable generation by 2030, including a goal of 100% of electricity sales from clean energy by 2050. Later that month, Washington increased its RPS target to 100% of sales from carbon-neutral generation by 2045, an increase from the previous target of 15% of sales from renewable generation by 2020. In addition, the policy mandates a phaseout of coal-fired electricity generation in Washington by 2025. Nevada and Washington became the fourth and fifth states, respectively, to pass legislation for 100% clean electricity, following Hawaii, California, and New Mexico.

In May 2019, Maryland increased its overall RPS target to 50% of electricity sales from renewable generation by 2030, replacing the earlier target of 22.5% by 2024. In addition, the legislation mandates further study of the effects and the possibility of Maryland reaching 100% generation from renewables by 2040.

Update on Warming Hiatus

Figure 1. Comparison of HadCRUT3 and the latest HadCRUT4.6. Notice how all trends pivot around the 1998 El Nino peak.

Clive Best has a new post exploring the question: Whatever happened to the Global Warming Hiatus? Excerpts in italics with my bolds and comment at end.

The last IPCC assessment in 2013 showed a clear pause in global warming lasting 16 years from 1998 to 2012 – the notorious hiatus. As a direct consequence of this, AR5 estimates of climate sensitivity were reduced and CMIP5 models appeared to clearly overestimate trends. Following the first release of HadCRUT4 in 2014, the ‘headline’ then was that 2005 and 2010 were marginally warmer than 1998. This was the first dent in removing the hiatus. Since then each new version of H4 has shown further incremental warming trends, such that by 2019 the hiatus has now completely vanished. Anyone mentioning it today is likely to be ridiculed by the climate science community. So how did this reversal happen within just 5 years? I decided to find out exactly why the post-1998 temperature record changed so dramatically in such a short period of time.

In what follows I always use the same algorithm as CRU for the station data and then blend that with the Hadley SST data. I have checked that I can reproduce exactly the latest HadCRUT4.6 results based on the current 7820 stations from CRU merged with HadSST3. Back in 2012 I downloaded the original station data from CRU – CRUTEM3. I have also downloaded the latest CRUTEM4 station data.
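For readers unfamiliar with the procedure Best is reproducing, here is a heavily simplified sketch of the general idea: average station anomalies and SST anomalies into latitude bands, weight the bands by area, and blend land with ocean. This is not Best’s or CRU’s actual code (HadCRUT works on 5°×5° grid cells with its own blending weights); it is only an illustration, and all function and variable names are mine.

```python
import numpy as np

def global_anomaly(station_lat, station_anom, sst_lat, sst_anom, land_frac=0.3):
    """Crude HadCRUT-style blend for one time step: band-average land and ocean
    anomalies, cosine-weight the bands by area, then combine land and ocean."""
    bands = np.arange(-90, 90, 5)  # 5-degree latitude bands

    def banded_mean(lats, anoms):
        lats, anoms = np.asarray(lats), np.asarray(anoms)
        out = np.full(bands.size, np.nan)
        for i, b in enumerate(bands):
            sel = (lats >= b) & (lats < b + 5)
            if sel.any():
                out[i] = anoms[sel].mean()
        return out

    land, ocean = banded_mean(station_lat, station_anom), banded_mean(sst_lat, sst_anom)
    blended = np.where(np.isnan(land), ocean,
                       np.where(np.isnan(ocean), land,
                                land_frac * land + (1 - land_frac) * ocean))
    weights = np.cos(np.deg2rad(bands + 2.5))  # area weight at each band centre
    ok = ~np.isnan(blended)
    return np.sum(blended[ok] * weights[ok]) / np.sum(weights[ok])

# Toy usage with synthetic "stations" and "SST points"
rng = np.random.default_rng(1)
print(global_anomaly(rng.uniform(-60, 80, 1000), rng.normal(0.5, 0.3, 1000),
                     rng.uniform(-70, 70, 5000), rng.normal(0.3, 0.2, 5000)))
```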

Figure 1 above compares the latest HadCRUT4.6 results with the last version of HadCRUT3.

I had assumed that the reason for the apparent trend change was that CRUTEM4 had added many new weather stations in the Arctic (removing some in S. America as well), while additionally the SST data had also been updated (HadSST2 moved to HadSST3). However, as I show below, my assumption simply isn’t true.

To investigate I recalculated a ‘modern’ version of HadCRUT3 by using only the original 4100 stations (used by CRUTEM3) from the CRUTEM4 station data. The list of these stations is defined here. I then merged these with both the older HadSST2 and HadSST3 to derive annual global temperature anomalies. Figure 2 shows the result. I get almost exactly the same values as with the full 7820 stations in HadCRUT4. It certainly does not reproduce HadCRUT3!

Figure 2. The black curve is based on “modern” CRUTEM3 stations combined with HadSST3 and the yellow curve is “modern” CRUTEM3 stations with HadSST2.

This result provides two conclusions.

  1. Modern CRUTEM3 stations give a different result to the original CRUTEM3 stations.
  2. SST data is not responsible for the difference between HadCRUT4 and HadCRUT3.

To confirm point 1) I used exactly the same code to regenerate the HadCRUT3 temperature series using the original CRUTEM3 station data as opposed to the ‘modern’ values based on CRUTEM4.

Figure 3: Comparison of HadCRUT3 with my calculation using the original CRUTEM3 station data.

The original CRUTEM3 station data are those I had previously downloaded in 2012. These are combined with HadSST2 data. Now we see that the agreement with the H3 annual temperatures is very good, and indeed reproduces the hiatus.

So the conclusion is very simple. The monthly temperature values in over 4000 CRUTEM3 stations have all been continuously changed, and it is these changes alone that have transformed the 16-year-long hiatus in global warming into a rising temperature trend. Furthermore, all these updates have only affected temperatures AFTER 1998! Temperatures before 1998 have hardly changed at all, which is the second requirement needed to eliminate the hiatus.
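One way to check this claim for yourself is to difference the two archives station by station and month by month, then tabulate when the revisions fall. A rough sketch, assuming both archives have already been parsed into simple dictionaries:

from collections import Counter

def revisions_by_year(old_archive, new_archive, tol=0.05):
    """Count, per calendar year, how many monthly station values differ
    between the 2012 CRUTEM3 download and the current archive.
    Both archives are assumed to be parsed into
    {station_id: {(year, month): temperature}} dictionaries."""
    changed = Counter()
    for sid, old_rec in old_archive.items():
        new_rec = new_archive.get(sid)
        if new_rec is None:
            continue
        for (year, month), old_val in old_rec.items():
            new_val = new_rec.get((year, month))
            if new_val is not None and abs(new_val - old_val) > tol:
                changed[year] += 1
    return changed

If the revisions are what removed the hiatus, the counts (and their net effect on the global mean) should be concentrated in the years after 1998.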

P.S. I am sure there are excellent arguments as to why pair-wise ‘homogenisation’ is wonderful, but why then does it only affect data after 1998?

Comment:

Meanwhile, UAH is showing a return to the mean annual anomaly since 1995.

New York: Climate Cuckoo’s Nest

Scene from One Flew Over the Cuckoo’s Nest

This just in from the Cuckoo’s Nest, AKA the New York State Legislature: N.Y. lawmakers agree to historic climate plan by Benjamin Storrow, E&E News. Excerpts in italics with my bolds. The comments are powerful displays of the brain damage from drinking too much climate Kool-Aid.

Gov. Andrew Cuomo (D) said yesterday he has reached an agreement with legislative leaders over a bill to slash New York’s greenhouse gas emissions, setting the stage for one of the most significant state climate victories since President Trump took office.

The announcement, coming just days before the close of the legislative session, represented a big victory for climate activists, who have spent three years pushing for major legislation to curb greenhouse gases in the Empire State.

Lawmakers were still working on final amendments yesterday, but the outlines of the deal were becoming clear. The legislation calls for reducing emissions by 40% from 1990 levels by 2030 and 85% by 2050. The remaining 15% of emissions would be offset, making the state carbon neutral. The bill would also require that all electricity generation come from carbon-free sources by 2040. A Climate Action Council would be established to ensure the state meets its targets.

“I believe we have an agreement, and I believe it is going to pass,” Cuomo said in a radio interview on WAMC.

The comment ended months of speculation over the fate of climate legislation in New York. Democratic lawmakers, who seized complete control of state government when they took over the state Senate last fall, had been pushing a bill called the “Climate and Community Protection Act.” The bill would spend 40% of the state’s clean energy revenues on energy efficiency measures and renewable installations in disadvantaged communities.

That drew repeated public objections from Cuomo, who said he wanted to ensure that environmental revenue was spent on environmental programs. Ultimately, the two sides settled on a compromise: At least 35% of revenues would go to disadvantaged communities. That funding could rise as high as 40%, which would amount to $370 million in fiscal 2018-19.

“It was a question of the distribution of the funding,” Cuomo told WAMC. “I understand the politics on these issues. Everyone wants to make all these advocacy groups happy. Taxpayers’ money is taxpayers’ money. And if it’s taxpayers’ money for an environmental purpose, I want to make sure it’s going to an environmental purpose.

“This transformation to a new green economy is very expensive. We don’t have the luxury of using funding for political purposes.”

Business interests had urged Cuomo and Democratic lawmakers to slow down, saying the legislation threatened 40,000 manufacturing jobs in the state. The Business Council of New York State called zero carbon emissions “unrealistic.”

But Democratic lawmakers forged ahead, working through the weekend to iron out a deal with Cuomo before a filing deadline for legislation Sunday. They argued that the risks of climate change, coupled with the benefits of a green energy economy, outweighed the potential costs.

“It means that on Father’s Day, when I see my grandchildren next year, I’ll have a lot less uncertainty about their future than I did yesterday morning,”

said Democratic Assemblyman Steve Englebright, a champion of the climate legislation. “It means we are going to be in the vanguard among states, tackling a problem that will affect every jurisdiction here and around the globe.

New York will lead the way.”

State Sen. Todd Kaminsky (D) said New York’s action would send a major signal to markets, helping companies plan for a cleaner future. But ultimately, he said, lawmakers were responding to voters.

“Our constituents told us, ‘Don’t come back without doing something on climate,'” Kaminsky said. “The future is now. I think we’ve taken that important step.”

‘Policy mandate with teeth’

Republican control of the state Senate meant climate policy in New York had been centered in the governor’s office until this year. Cuomo has pumped out executive orders banning hydraulic fracturing, calling for the closure of the state’s remaining coal plants in 2020 and targeting a 40% reduction in emissions by 2030, among other things.

The legislation enshrines many of Cuomo’s targets into law, ensuring they will outlast the current governor. The new Climate Action Council would be required to issue recommendations on how to install 6 gigawatts of distributed solar by 2025, 9 GW of offshore wind by 2035 and 3 GW of energy storage by 2030.

But the law also represents a power shift of sorts. Environmentalists have long criticized Cuomo for failing to follow through on many of his environmental objectives. His environmental agencies have yet to produce a climate action plan, despite an executive order directing them to do so. And his coal regulation arrived earlier this year, when only two coal plants remained (Climatewire, May 10).

The legislation ensures the governor will follow through, advocates said. It would require an annual greenhouse gas inventory. The Climate Action Council would release a report every four years detailing the state’s progress (or lack thereof) toward its emissions goals.

And by putting the state’s climate commitment in law, it would open the door for citizens to sue if New York does not meet its targets.

“This isn’t just a planning document. It is a policy mandate with teeth that they can’t evade or avoid,” said Eddie Bautista, executive director of the New York City Environmental Justice Alliance.

“By inserting climate justice, it becomes a plan about humanity; it becomes a plan about the dignity and value we place on humanity. There are ways to help all of us to catalyze economic development for all of us,”

Bautista said. “That is why so many people are excited about this plan and hope it replicates.”

Unlike many of its peers, the Empire State is casting its gaze beyond power plants, the traditional realm of state climate policy. Transportation accounts for roughly a third of the state’s emissions, followed by residential heating and cooling (16%) and electricity generation (13%). The bill would direct the Climate Action Council to recommend emissions performance standards for the transportation, building and industrial sectors.

At the same time, the legislation leaves considerable questions regarding how New York intends to slash emissions. Much would depend on the Climate Action Council and the governor, who will ultimately be charged with implementing the legislation.

New York has considerable ground to make up. The state cut emissions only 8% between 1990 and 2015, according to the most recent New York greenhouse gas inventory.

Greens acknowledged the amount of work ahead but said they are optimistic.

“Now, for the first time, we actually have our commitment on climate enshrined into law,” said Conor Bambrick, who leads the air and energy program at Environmental Advocates of New York. “That is going to allow for real long-term planning and action so New York can get serious about tackling this problem, serious about getting innovative about ways to tackle this problem and doing it equitably.”

My Comment:

Sorry New York, you are not leading the way. Germany and South Australia have already gone down this rabbit hole, and you can only surpass them by inflicting even greater destruction on your economy and society.

Arctic Ice In Perspective

With the Arctic ice melting season underway, warmists are again stoking fears about ice disappearing in the North. In fact, the pattern of Arctic ice seen in historical perspective is not alarming. People are over-thinking and over-analyzing Arctic ice extents, and getting wrapped around the axle (or should I say axis). So let’s keep it simple and we can all readily understand what is happening up North.

I will use the ever-popular NOAA dataset derived from satellite passive microwave sensors. It sometimes understates the ice extents, but everyone refers to it and it is complete from 1979 to 2018. Here’s what NOAA reports (in M km2):

We are frequently told that only the March maximums and the September minimums matter, since the other months are only transitional between the two.  So the graph above shows the mean ice extent, averaging the two months March and September.
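For anyone who wants to reproduce the averaging, it is just the mean of the March and September extents for each year from the NOAA Sea Ice Index. A short sketch, assuming the monthly data has been saved as a CSV with Year, Month and Extent columns (the actual Sea Ice Index files are laid out differently, so adjust the column names):

import csv
from collections import defaultdict

def march_september_means(csv_path):
    """Return {year: mean of the March and September extents, in M km2}."""
    by_year = defaultdict(dict)
    with open(csv_path, newline="") as f:
        for row in csv.DictReader(f):
            year, month = int(row["Year"]), int(row["Month"])
            if month in (3, 9):
                by_year[year][month] = float(row["Extent"])
    return {y: (m[3] + m[9]) / 2.0 for y, m in by_year.items() if len(m) == 2}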

If I were adding this to the Ice House of Mirrors, the name would be The X-Ray Ice Mirror, because it looks into the structure of the time series.   For even more clarity and simplicity, here is the table:

NOAA NH Annual Average Ice Extents (in M km2).  Sea Ice Index v3.0 (here)

Year   Average   Change    Rate of Change
1979   11.697
1996   11.353    -0.344    -0.020 per year
2007    9.405    -1.949    -0.177 per year
2018    9.506    +0.102    +0.009 per year
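The Change and Rate of Change columns are simple arithmetic: the difference between the averaged extents of the two benchmark years, divided by the number of years between them. For example, (9.405 - 11.353) / (2007 - 1996) gives about -0.177 M km2 per year, roughly nine times the -0.020 of the first segment. A few lines of Python reproduce the table (tiny differences in the last digit come from rounding the averages to three decimals):

benchmarks = [(1979, 11.697), (1996, 11.353), (2007, 9.405), (2018, 9.506)]

for (y0, e0), (y1, e1) in zip(benchmarks, benchmarks[1:]):
    change = e1 - e0                # loss or gain in averaged extent
    rate = change / (y1 - y0)       # per-year rate over the segment
    print(f"{y1}: change {change:+.3f} M km2, rate {rate:+.3f} per year")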

The satellites involve rocket science, but this does not.  There was a small loss of ice extent over the first 17 years, then a dramatic downturn for 11 years at nine times the previous rate. That was followed by the current plateau, with no further loss of ice extent.  All the fuss is over that middle period, and we know what caused it: a lot of multi-year ice was flushed out through the Fram Strait, leaving behind more easily melted younger ice. The effects of that natural occurrence bottomed out in 2007.

Kwok et al. say this about the Variability of Fram Strait ice flux:

The average winter area flux over the 18-year record (1978–1996) is 670,000 km2, ~7% of the area of the Arctic Ocean. The winter area flux ranges from a minimum of 450,000 km2 in 1984 to a maximum of 906,000 km2 in 1995. . . The average winter volume flux over the winters of October 1990 through May 1995 is 1745 km3, ranging from a low of 1375 km3 in the 1990 flux to a high of 2791 km3 in 1994.

https://www.researchgate.net/publication/261010602/download

Conclusion:

Some complain it is too soon to say Arctic Ice is recovering, or that 2007 is a true change point.  The same people were quick to jump on a declining period after 1996 as evidence of a “Death Spiral.”

Footnote:

No one knows what will happen to Arctic ice.

Except maybe the polar bears.

And they are not talking.

Except, of course, to the admen from Coca-Cola.

We are Ignored, then Dissed, then Debated, then We Win.

Global Warming Debate Soho Forum May 8, 2019

The post title is a riff on the saying attributed to Mahatma Gandhi about facing overwhelming odds against an entrenched establishment in India: “First they ignore you, then they laugh at you, then they fight you, then you win.”

Watch: Skeptical scientist wins rare New York City climate debate against warmist scientist – Audience flips from warmist views to skeptical after debate (H/T John Ray at his blog Greenie Watch). Excerpts in italics with my bolds.

The Soho Forum, Published on May 6, 2019

Resolution: There is little or no rigorous evidence that rising concentrations of carbon dioxide are causing dangerous global warming and threatening life on the planet.

For the affirmative:

Craig Idso is the founder, former president, and currently chairman of the Center for the Study of Carbon Dioxide and Global Change. The Center was founded in 1998 as a non-profit public charity dedicated to discovering and disseminating scientific information pertaining to the effects of atmospheric carbon dioxide enrichment on climate and the biosphere.

Dr. Idso’s research has appeared many times in peer-reviewed journals, and he is the author or coauthor of several books, including The Many Benefits of Atmospheric CO2 Enrichment (Vales Lake Publishing, LLC, 2011) and CO2, Global Warming and Coral Reefs (Vales Lake Publishing, LLC, 2009).

Dr. Idso also serves as an adjunct scholar for the Cato Institute and as a policy advisor for the CO2 Coalition, the Heartland Institute and the Committee For A Constructive Tomorrow.

For the negative:

Jeffrey Bennett is an astrophysicist and educator. He has focused his career on math and science literacy. He is the lead author of bestselling college textbooks in astronomy, astrobiology, mathematics, and statistics, and of critically acclaimed books for the general public on topics including Einstein’s theory of relativity, the search for extraterrestrial life, and the importance of math to our everyday lives.

Other career highlights include serving two years as a Visiting Senior Scientist at NASA Headquarters, proposing and helping to develop the Voyage Scale Model Solar System that resides on the National Mall in Washington, DC, and creating the free Totality app that has helped tens of thousands of people learn how to view a total solar eclipse.

His book A Global Warming Primer is posted freely online at http://www.globalwarmingprimer.com.

Moderator: “We have the final vote. The yes vote on the resolution that there is no evidence that’s causing dangerous global warming: It began at 24% (of the skeptical yes vote supporting that position) and it went up to 46% (after the debate). So [skeptical argument] gained 22% points. That’s the number to beat (46%).

The no resolution (warmist position) started at 29%. It went up to 41% or up 11 points.” The winner of the debate is skeptical scientist Dr. Craig Idso with his resolution asserting that “There is little or no rigorous evidence that rising concentrations of carbon dioxide are causing dangerous global warming and threatening life on the planet.”

Flashback 2007: Scientific Smackdown: Skeptics Voted The Clear Winners Against Global Warming Believers in Heated NYC Debate – RealClimate.org’s Gavin Schmidt appeared so demoralized that he mused that debates equally split between believers of a climate ‘crisis’ and scientific skeptics are probably not “worthwhile” to ever agree to again.

The 2007 Debate was on the statement: Global Warming is not a Crisis. Clips, Transcripts and Results are here:
https://www.intelligencesquaredus.org/debates/global-warming-not-crisis

December 1, 2009, was the Munk Debate on the statement: Be it resolved, climate change is mankind’s defining crisis, and demands a commensurate response… 
PRO George Monbiot, Elizabeth May 
CON Bjørn Lomborg, Lord Nigel Lawson
RESULT CON gains 8%. CON wins
See: https://munkdebates.com/debates/climate-change

More recent is the March 2020 Karoly/Tamblyn–Happer Dialogue on Global Warming at The Best Schools.

Drs. Karoly and Happer argued the following theses:

Dr. Karoly: Science has established that it is virtually certain that increases of atmospheric CO2 due to burning of fossil fuels will cause climate change that will have substantial adverse impacts on humanity and on natural systems. Therefore, immediate, stringent measures to suppress the burning of fossil fuels are both justified and necessary.

Dr. Happer: There is no scientific basis for the claim that increases of atmospheric CO2 due to burning of fossil fuels will cause climate change that will have substantial adverse impacts on humanity and on natural systems. If fossil fuels are burnt responsibly to limit real pollutants like fly ash, oxides of nitrogen or sulfur, heavy metals, etc., the CO2 released will be a benefit to the world. Any resulting climate change will be moderate, and there will be very major benefits to agriculture and other plant life.