Inside the Sea Level Scare Machine

[Photo: beach decoration warning of rising sea levels]

Such beach decorations exhibit the fervent belief of activists that sea levels are rising fast and will flood the coastlines if we don’t stop burning fossil fuels.  As we will see below, there is a concerted effort to promote this notion, empowered by slick imaging tools to frighten the gullible.  Of course there are frequent media releases sounding the alarm.  Recently, for example:

From the Guardian:  Up to 410 million people at risk from sea level rises – study.  Excerpts in italics with my bolds.

The paper, published in Nature Communications, finds that currently 267 million people worldwide live on land less than 2 metres above sea level. Using a remote sensing method called Lidar, which pulses laser light across coastal areas to measure elevation on the Earth’s surface, the researchers predicted that by 2100, with a 1 metre sea level rise and zero population growth, that number could increase to 410 million people.

The climate emergency has caused sea levels to rise and more frequent and severe storms to occur, both of which increase flood risks in coastal environments.

Last year, a survey published by Climate and Atmospheric Science, which aggregated the views of 106 specialists, suggested coastal cities should prepare for rising sea levels that could reach as high as 5 metres by 2300, which could engulf areas home to hundreds of millions of people.

The rest of this post provides a tour of seven US cities demonstrating how the sea level scare machine promotes fear among people living or invested in coastal properties.  In each case there are warnings published in legacy print and TV media, visual simulations powered by computers and desktop publishing, and a comparison of imaginary vs. observed sea level trends.

Prime US Cities on the “Endangered” List
Newport, R.I.

Examples of Media Warnings

Bangor Daily News:  In Maine’s ‘City of Ships,’ climate change’s coastal threat is already here

Bath, the 8,500-resident “City of Ships,” is among the places in Maine facing the greatest risks from increased coastal flooding because so much of it is low-lying. The rising sea level in Bath threatens businesses along Commercial and Washington streets and other parts of the downtown, according to an analysis by Climate Central, a nonprofit science and journalism organization.

Water levels reached their highest in the city during a record-breaking storm in 1978 at a little more than 4 feet over pre-2000 average high tides, and Climate Central’s sea level team found there’s a 1-in-4 chance of a 5-foot flood within 30 years. That level could submerge homes and three miles of road, cutting off communities that live on peninsulas, and inundate sites that manage wastewater and hazardous waste along with several museums.

UConn Today:  Should We Stay or Should We Go? Shoreline Homes and Rising Sea Levels in Connecticut

As global temperatures rise, so does the sea level. Experts predict it could rise as much as 20 inches by 2050, putting coastal communities, including those in Connecticut, in jeopardy.

One possible solution is a retreat from the shoreline, in which coastal homes are removed to take them out of imminent danger. This solution comes with many complications, including reductions in tax revenue for towns and potentially diminished real estate values for surrounding properties. Additionally, it can be difficult to get people to volunteer to relocate their homes.

Computer Simulations of the Future

Newport Obs Imaged

Imaginary vs. Observed Sea Level Trends (2020 Update)

Newport past & projected 2020

Boston, Mass.

Example of Media Warnings

From WBUR Radio Boston:  Rising Sea Levels Threaten MBTA’s Blue Line

Could it be the end of the Blue Line as we know it? The Blue Line, which features a mile-long tunnel that travels underwater, and connects the North Shore with Boston’s downtown, is at risk as sea levels rise along Boston’s coast. To understand the threat sea-level rise poses to the Blue Line, and what that means for the rest of the city, we’re joined by WBUR reporter Simón Ríos and Julie Wormser, Deputy Director at the Mystic River Watershed Association.

As sea levels continue to rise, the Blue Line and the whole MBTA system face an existential threat. The MBTA is also facing a serious financial crunch, still reeling from the pandemic, as we attempt to fully reopen the city and the region. Joining us to discuss is MBTA General Manager Steve Poftak.

Computer Simulations of the Future

Boston Obs Imaged2

 

Imaginary vs. Observed Sea Level Trends (2020 Update)

Boston Past and Projected 2020

 

New York City

Example of Media Warnings

From Quartz: Sea level rise will flood the neighborhood around the UN building with two degrees warming

Right now, of every US city, New York City has the highest population living inside a floodplain. By 2100, seas could rise around the city by as much as six feet. Extreme rainfall is also predicted to rise, with roughly 1½ times more major precipitation events per year by the 2080s, according to a 2015 report by a group of scientists known as the New York City Panel on Climate Change.

But a two-degree warming scenario, which the world is on track to hit, could lock in dramatic sea level rise—possibly as much as 15 feet.

Computer Simulations of the Future

NYC Obs Imaged

Imaginary vs. Observed Sea Level Trends (2020 Update)

NYC past & projected 2020

Philadelphia, PA.

Example of Media Warnings

From NBCPhiladelphia:  Climate Change Studies Show Philly Underwater

NBC10 is looking at data and reading studies on climate change to showcase the impact. There are studies that show if the sea levels continue to rise at this rate, parts of Amtrak and Philadelphia International Airport could be underwater in 100 years.

Computer Simulations of the Future

Philly Obs Imaged

Imaginary vs. Observed Sea Level Trends (2020 Update)

Phil past & projected 2020

Miami, Florida

Examples of Media Warnings

From WLRN Miami: Miles Of Florida Roads Face ‘Major Problem’ From Sea Rise. Is State Moving Fast Enough?

One 2018 Department of Transportation study has already found that a two-foot rise, expected by mid-century, would imperil a little more than five percent — 250-plus miles — of the state’s most high-traffic highways. That may not sound like a lot, but protecting those highways alone could easily cost several billion dollars. A Cat 5 hurricane could be far worse, with a fifth of the system vulnerable to flooding. The impact to seaports, airports and railroads — likely to also be significant and expensive — is only now under analysis.

From Washington Post:  Before condo collapse, rising seas long pressured Miami coastal properties

Investigators are just beginning to try to unravel what caused the Champlain Towers South to collapse into a heap of rubble, leaving at least 159 people missing as of Friday. Experts on sea-level rise and climate change caution that it is too soon to speculate whether rising seas helped destabilize the oceanfront structure. The 40-year-old building was relatively new compared with others on its stretch of beach in the town of Surfside.

But it is already clear that South Florida has been on the front lines of sea-level rise and that the effects of climate change on the infrastructure of the region — from septic systems to aquifers to shoreline erosion — will be a management problem for years.

Computer Simulations of the Future

Florida Obs Imaged

Imaginary vs. Observed Sea Level Trends (2020 Update)

KW past & projected 2020

Houston, Texas

Example of Media Warnings

From Undark:  A $26-Billion Plan to Save the Houston Area From Rising Seas

As the sea rises, the land is also sinking: In the last century, the Texas coast sank about 2 feet into the sea, partly due to excessive groundwater pumping. Computer models now suggest that climate change will further lift sea levels somewhere between 1 and 6 feet over the next 50 years. Meanwhile, the Texas coastal population is projected to climb from 7 to 9 million people by 2050.

Protecting Galveston Bay is no simple task. The bay is sheltered from the open ocean by two low, sandy strips of land — Galveston Island and Bolivar Peninsula — separated by the narrow passage of Bolivar Roads. When a sufficiently big storm approaches, water begins to rush through that gap and over the island and peninsula, surging into the bay.

Computer Simulations of the Future

Galv Obs Imaged

Imaginary vs. Observed Sea Level Trends (2020 Update)

Galv past & projected 2020

San Francisco, Cal.

Example of Media Warnings

From San Francisco Chronicle:  Special Report: SF Bay Sea Level Rise–Hayward

Sea level rise is fueled by higher global temperatures that trigger two forces: Warmer water expands oceans while the increased temperatures hasten the melting of glaciers on Antarctica and Greenland and add yet more water to the oceans.

The California Ocean Protection Council, a branch of state government, forecasts a 1-in-7 chance that the average daily tides in the bay will rise 2 or more feet by 2070. This would cause portions of the marshes and bay trail in Hayward to be underwater during high tides. Add another 2 feet, on the higher end of the council’s projections for 2100 and they’d be permanently submerged. Highway 92 would flood during major storms. So would the streets leading into the power plant.

From San Francisco Chronicle:  Special Report: SF Bay Sea Level Rise–Mission Creek

Along San Francisco’s Mission Creek, sea level rise unsettles the waters.  Each section of this narrow channel must be tailored differently to meet an uncertain future. Do nothing, and the combination of heavy storms with less than a foot of sea level rise could send Mission Creek spilling over its banks in a half-dozen places, putting nearby housing in peril and closing the two bridges that cross the channel.

Whatever the response, we won’t know for decades if the city’s efforts can keep pace with the impact of global climatic forces that no local government can control.

Though Mission Creek is unique, the larger dilemma is one that affects all nine Bay Area counties.

Computer Simulations of the Future

SF Obs Imaged

Imaginary vs. Observed Sea Level Trends (2020 Update)

SF CA past & projected 2020

Summary: This is a relentless, high-tech communications machine raising all kinds of scary future possibilities, based upon climate model projections and the unfounded theory of CO2-driven global warming/climate change.  The graphs above are centered on the year 2000, so the added 21st-century sea level rise is projected from that year forward.  In addition, we now have observations at tidal gauges for the first 20 years, one-fifth of the total period.  The gauges in each city are the ones with the longest continuous service record, and wherever possible the locations shown in the simulations are not far from the tidal gauge.  For example, NYC’s best gauge is at the Battery, and Fulton St. is also near the southern tip of Manhattan.

Already the imaginary rises are diverging greatly from observations, yet the chorus of alarm goes on.  In fact, the added rise to 2100 projected from tidal gauges ranges from 6 to 9.5 inches, except for Galveston at 20.6 inches.  Meanwhile, models imagined rises from 69 to 108 inches.  Clearly coastal settlements must adapt to evolving conditions, but they also need reasonable rather than fearful forecasts for planning purposes.
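
For readers who want to check this arithmetic themselves, here is a minimal sketch (in Python) of how a constant tide gauge trend in mm/year translates into added inches by 2100, measured from the year 2000 baseline used in the graphs above.  The trend rates below are illustrative placeholders chosen only to reproduce the quoted ranges, not the official gauge values.

```python
# Minimal sketch: extrapolate a linear tide gauge trend (mm/yr) out to 2100.
# The rates below are illustrative placeholders, not official NOAA trend values.
MM_PER_INCH = 25.4

def added_rise_inches(trend_mm_per_yr, start_year=2000, end_year=2100):
    """Added sea level (inches) from a constant linear trend between start_year and end_year."""
    return trend_mm_per_yr * (end_year - start_year) / MM_PER_INCH

for label, rate in [("Slow gauge", 1.5), ("Faster gauge", 2.4), ("Subsiding gauge (Galveston-like)", 5.2)]:
    print(f"{label}: {rate} mm/yr -> {added_rise_inches(rate):.1f} inches added by 2100")
```

Rates of roughly 1.5 to 2.4 mm/year reproduce the 6 to 9.5 inch range, while a Galveston-like 5.2 mm/year gives about 20.5 inches; that is the scale of the observed trends, not the model imaginings.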

Footnote:  The problem of urban flooding is discussed in some depth at a previous post Urban Flooding: The Philadelphia Story

Background on the current sea level campaign is at USCS Warnings of Coastal Floodings

And as always, an historical perspective is important:

[Chart: post-glacial sea level rise]

Still No Global Warming June 2021


The post below updates the UAH record of air temperatures over land and ocean.  But as an overview consider how recent rapid cooling has now completely overcome the warming from the last 3 El Ninos (1998, 2010 and 2016).  The UAH record shows that the effects of the last one are now gone as of April 2021. (UAH baseline is now 1991-2020).

UAH Global 1995to202104 w co2 overlay

For reference I added an overlay of CO2 annual concentrations as measured at Mauna Loa.  While temperatures fluctuated up and down, ending flat, CO2 went up steadily by ~55 ppm, a 15% increase.

Furthermore, going back to warmings prior to the satellite record shows that the entire rise of 0.8C since 1947 is due to oceanic cycles, not human activity.

 

gmt-warming-events

The animation is an update of a previous analysis from Dr. Murry Salby.  These graphs use Hadcrut4 and include the 2016 El Nino warming event.  The exhibit shows that since 1947 GMT warmed by 0.8C, from 13.9C to 14.7C, as estimated by Hadcrut4.  This resulted from three natural warming events involving ocean cycles.  The most recent rise, 2013-16, lifted temperatures by 0.2C.  Previously the 1997-98 El Nino produced a plateau increase of 0.4C.  Before that, a rise from 1977-81 added 0.2C to start the warming since 1947.

Importantly, the theory of human-caused global warming asserts that increasing CO2 in the atmosphere changes the baseline and causes systemic warming in our climate.  On the contrary, all of the warming since 1947 was episodic, coming from three brief events associated with oceanic cycles. 


See Also Worst Threat: Greenhouse Gas or Quiet Sun?

June Update Ocean and Land Air Temps Continue Down

banner-blog

With apologies to Paul Revere, this post is on the lookout for cooler weather with an eye on both the Land and the Sea.  While you will hear a lot about 2020 temperatures matching 2016 as the highest ever, that spin ignores how fast the cooling is setting in.  The UAH data analyzed below show that warming from the last El Nino has now fully dissipated, with chilly temperatures setting in across all regions.  Last month, despite some warming in NH, temps in the Tropics and SH dropped sharply.

UAH has updated their TLT (temperatures in the lower troposphere) dataset for June.  Previously I have done posts on their reading of ocean air temps as a prelude to updated records from HADSST3.  This month also has a separate graph of land air temps, because the comparisons and contrasts are interesting as we contemplate possible cooling in coming months and years.  Again last month, air over land warmed slightly while oceans dropped further.

Note:  UAH has shifted their baseline from 1981-2010 to 1991-2020 beginning with January 2021.  In the charts below, the trends and fluctuations remain the same but the anomaly values change with the baseline reference shift.
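
To see why the baseline shift is only a change of reference and not of substance, here is a minimal sketch (with invented numbers, not actual UAH data) showing that re-baselining subtracts a single constant from every anomaly while leaving trends and fluctuations untouched.

```python
import numpy as np

# Toy example: shift anomalies from an old baseline (1981-2010) to a new one (1991-2020).
# The temperature series is synthetic; real anomalies come from the published UAH files.
years = np.arange(1981, 2021)
temps_c = 14.0 + 0.012 * (years - 1981) + 0.1 * np.sin(years)  # invented absolute temps

old_base = (years >= 1981) & (years <= 2010)
new_base = (years >= 1991) & (years <= 2020)

anom_old = temps_c - temps_c[old_base].mean()   # anomalies vs 1981-2010
anom_new = temps_c - temps_c[new_base].mean()   # anomalies vs 1991-2020

# The two series differ by one constant; trends and fluctuations are identical.
offset = temps_c[new_base].mean() - temps_c[old_base].mean()
assert np.allclose(anom_old - anom_new, offset)
print(f"New baseline is warmer, so every anomaly drops by about {offset:.3f} C")
```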

Presently sea surface temperatures (SST) are the best available indicator of heat content gained or lost from Earth’s climate system.  Enthalpy is the thermodynamic term for total heat content in a system, and humidity differences in air parcels affect enthalpy.  Measuring water temperature directly avoids distorted impressions from air measurements.  In addition, the ocean covers 71% of the planet’s surface and thus dominates surface temperature estimates.  Eventually we will likely have reliable means of recording water temperatures at depth.

Recently, Dr. Ole Humlum reported from his research that air temperatures lag 2-3 months behind changes in SST.  Thus the cooling oceans now portend cooling land air temperatures to follow.  He also observed that changes in CO2 atmospheric concentrations lag behind SST by 11-12 months.  This latter point is addressed in a previous post Who to Blame for Rising CO2?

After a technical enhancement to HadSST3 delayed updates in spring 2020, May resumed a pattern of HadSST updates toward the end of the following month.  For comparison we can look at lower troposphere temperatures (TLT) from UAHv6, which are now posted for June.  The temperature record is derived from microwave sounding units (MSU) on board satellites like the one pictured above.  Recently there was a change in UAH processing of satellite drift corrections, including dropping one platform which can no longer be corrected.  The graphs below are taken from the new and current dataset.

The UAH dataset includes temperature results for air above the oceans, and thus should be most comparable to the SSTs.  There is the additional feature that ocean air temps avoid Urban Heat Island (UHI) effects.  The graph below shows monthly anomalies for ocean temps since January 2015.

UAH Oceans 202106

Note that 2020 was warmed mainly by a spike in February in all regions, and secondarily by an October spike in NH alone.  At the end of 2020, November and December ocean temps plummeted in NH and the Tropics.  In January SH dropped sharply, pulling the Global anomaly down despite an upward bump in NH.  An additional drop in March left SH matching the coldest in this period.  March drops in the Tropics and NH left those regions at their coldest since 01/2015.  In June 2021, despite an uptick in NH, the Global anomaly dropped back down due to a record low in SH along with Tropical cooling.

Land Air Temperatures Tracking Downward in Seesaw Pattern

We sometimes overlook that in climate temperature records the oceans are measured directly with SSTs, while land temps are measured only indirectly.  The land temperature records at surface stations sample air temps at 2 meters above ground.  UAH gives TLT anomalies for air over land separately from ocean air temps.  The graph updated for June is below.

UAH Land 202106

Here we have fresh evidence of the greater volatility of the Land temperatures, along with an extraordinary departure by SH land.  Land temps are dominated by NH with a 2020 spike in February, followed by cooling down to July.  Then NH land warmed with a second spike in November.  Note the mid-year spikes in SH winter months.  In December all of that was wiped out.

Then January 2021 showed a sharp drop in SH, but a rise in NH more than offset it, pulling the Global anomaly upward.  In February NH and the Tropics cooled further, pulling down the Global anomaly despite slight SH land warming.  March showed all regions roughly comparable to early 2015, prior to the 2016 El Nino.  Then in April NH land dropped sharply along with the Tropics, bringing the Global land anomaly down by nearly 0.2C.  Now there is a remarkable divergence, with NH rising in May and June while SH drops sharply to a new low, along with Tropical cooling.  With NH having most of the land mass, the Global land anomaly ticked upward.

The Bigger Picture UAH Global Since 1995

UAH Global 1995to202106

The chart shows monthly anomalies from 01/1995 to the present.  The average anomaly is 0.04, since this period is nearly the same as the new baseline, lacking only the first four years.  1995 was chosen as an ENSO-neutral year.  The graph shows the 1998 El Nino, after which the mean resumed, and again after the smaller 2010 event.  The 2016 El Nino matched the 1998 peak, and in addition its NH after-effects lasted longer, followed by the NH warming of 2019-20, with temps now returning again to the mean.

TLTs include mixing above the oceans and probably some influence from nearby, more volatile land temps.  Clearly NH and Global land temps have been dropping in a seesaw pattern, more than 1C lower than the 2016 peak.  Since the ocean has 1000 times the heat capacity of the atmosphere, that cooling is a significant driving force.  TLT measures started the recent cooling later than SSTs from HadSST3, but are now showing the same pattern.  It seems obvious that despite the three El Ninos, their warming has not persisted, and without them it would probably have cooled since 1995.  Of course, the future has not yet been written.

 

2021 Update: Fossil Fuels ≠ Global Warming

gas in hands

Previous posts addressed the claim that fossil fuels are driving global warming. This post updates that analysis with the latest (2020) numbers from BP Statistics and compares World Fossil Fuel Consumption (WFFC) with three estimates of Global Mean Temperature (GMT). More on both these variables below.

WFFC

2020 statistics are now available from BP for international consumption of Primary Energy sources in the 2021 Statistical Review of World Energy.

The reporting categories are:
Oil
Natural Gas
Coal
Nuclear
Hydro
Renewables (other than hydro)

Note:  British Petroleum (BP) now uses Exajoules in place of MToe (million tonnes of oil equivalent).  It is logical to use an energy metric which is independent of the fuel source.  OTOH renewable advocates have no doubt pressured BP to stop using oil as the baseline, since their dream is a world without fossil fuel energy.

From the BP conversion table, 1 exajoule (EJ) = 1 quintillion joules (1 x 10^18).  Oil products vary from 41.6 to 49.4 gigajoules (10^9 joules) per tonne.  Comparing this annual report with previous years shows that global Primary Energy (PE) in MToe is roughly 24 times the same amount in Exajoules.  The conversion factor at the macro level varies from year to year depending on the fuel mix.  The graphs below use the new metric.
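
As a quick sanity check on those conversion factors, here is a minimal sketch using the approximate figure of 41.87 gigajoules per tonne of oil equivalent (a standard rough value; BP’s table gives product-specific numbers).

```python
# Rough check of the Mtoe <-> exajoule relationship described above.
GJ_PER_TOE = 41.87                       # approx. gigajoules in one tonne of oil equivalent
EJ_PER_MTOE = GJ_PER_TOE * 1e6 / 1e9     # 1 Mtoe = 1e6 toe; 1 EJ = 1e9 GJ

mtoe_per_ej = 1 / EJ_PER_MTOE
print(f"1 Mtoe = {EJ_PER_MTOE:.4f} EJ")   # ~0.0419 EJ
print(f"1 EJ   = {mtoe_per_ej:.1f} Mtoe") # ~23.9 Mtoe, i.e. roughly 24x
```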

This analysis combines the first three (Oil, Gas, and Coal) for total fossil fuel consumption worldwide (WFFC).  The chart below shows the patterns for WFFC compared to world consumption of Primary Energy from 1965 through 2020.

WFFC 2020

The graph shows that global Primary Energy (PE) consumption from all sources has grown continuously over more than five decades.  Since 1965, oil, gas and coal (FF, sometimes termed “Thermal”) averaged 89% of PE consumed, ranging from 94% in 1965 to 84% in 2019.  Note that last year, 2020, PE dropped 25 EJ (4%), slightly below 2017 consumption.  WFFC for 2020 dropped 27 EJ (6%), to 83% of PE, matching 2013 WFFC consumption.  For the 55-year period, the net changes were:

Oil 168%
Gas 506%
Coal 161%
WFFC 218%
PE 259%

Global Mean Temperatures

Everyone acknowledges that GMT is a fiction, since temperature is an intensive property of objects and varies dramatically over time and over the surface of the earth. No place on earth determines the “average” temperature for the globe. Yet for the purpose of detecting change in temperature, major climate data sets estimate GMT and report anomalies from it.

The UAH record consists of satellite-era global temperature estimates for the lower troposphere, a layer of air from 0 to 4 km above the surface. HadSST estimates sea surface temperatures from the oceans covering 71% of the planet. HADCRUT combines HadSST estimates with records from land stations whose elevations range up to 6 km above sea level.

Both GISS LOTI (land and ocean) and HADCRUT4 (land and ocean) use 14.0 Celsius as the climate normal, so I will add that number back into the anomalies. This is done without claiming any validity other than to achieve a reasonable sense of the magnitude of the observed fluctuations.

No doubt global sea surface temperatures are typically higher than 14C, more like 17 or 18C, and of course warmer in the tropics and colder at higher latitudes. Likewise, the lapse rate in the atmosphere means that air temperatures both from satellites and elevated land stations will range colder than 14C. Still, that climate normal is a generally accepted indicator of GMT.

Correlations of GMT and WFFC

The next graph compares WFFC to GMT estimates over the five decades from 1965 to 2020 from HADCRUT4, which includes HadSST3.

WFFC and Hadcrut 2020

Since 1965 fossil fuel consumption has increased dramatically and monotonically, rising 218% from 146 to 463 exajoules.  Meanwhile the GMT record from Hadcrut shows multiple ups and downs, with an accumulated rise of 0.9C over 55 years, 7% of the starting value.

The graph below compares WFFC to GMT estimates from UAH6 and HadSST3 for the satellite era from 1980 to 2020, a period of 40 years.

WFFC and UAH HadSST 2020

In the satellite era WFFC has increased at a compounded rate of about 1.5% per year, for a total increase of 82% since 1979.  At the same time, SST warming amounted to 0.52C, or 3.7% of the starting value.  UAH warming was 0.7C, or 5% up from 1979.  The temperature compounded rate of change is about 0.1% per year, an order of magnitude less than WFFC.  Even more obvious are the 1998 El Nino peak and the flat GMT since.
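
For those who want to verify the compounded rates, here is a minimal sketch using the figures quoted above (an 82% rise in WFFC since 1979, and warming expressed against the ~14C climate normal discussed earlier).

```python
# Sketch of the compounded-rate comparison, using the figures quoted above.
def cagr(total_growth_fraction, years):
    """Compound annual growth rate for a given total fractional increase over `years`."""
    return (1 + total_growth_fraction) ** (1 / years) - 1

years = 2020 - 1979                     # 41 years of the satellite era
wffc_rate = cagr(0.82, years)           # WFFC up 82% since 1979
sst_rate  = cagr(0.52 / 14.0, years)    # SST up 0.52C on a ~14C climate normal
uah_rate  = cagr(0.70 / 14.0, years)    # UAH lower-troposphere up 0.7C

print(f"WFFC: {wffc_rate:.2%} per year")   # about 1.5% per year
print(f"SST:  {sst_rate:.3%} per year")    # about 0.09% per year
print(f"UAH:  {uah_rate:.3%} per year")    # about 0.12% per year
```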

Summary

The climate alarmist/activist claim is straightforward: Burning fossil fuels makes measured temperatures warmer. The Paris Accord further asserts that by reducing human use of fossil fuels, further warming can be prevented.  Those claims do not bear up under scrutiny.

It is enough for simple minds to see that two time series are both rising and to think that one must be causing the other. But both scientific and legal methods assert causation only when the two variables are both strongly and consistently aligned. The above shows a weak and inconsistent linkage between WFFC and GMT.

Going further back in history shows even weaker correlation between fossil fuels consumption and global temperature estimates:


Figure 5.1. Comparative dynamics of the World Fuel Consumption (WFC) and Global Surface Air Temperature Anomaly (ΔT), 1861-2000. The thin dashed line represents annual ΔT, the bold line—its 13-year smoothing, and the line constructed from rectangles—WFC (in millions of tons of nominal fuel) (Klyashtorin and Lyubushin, 2003). Source: Frolov et al. 2009

In legal terms, as long as there is another equally or more likely explanation for the set of facts, the claimed causation is unproven. The more likely explanation is that global temperatures vary due to oceanic and solar cycles. The proof is clearly and thoroughly set forward in the post Quantifying Natural Climate Change.

Background context for today’s post is at Claim: Fossil Fuels Cause Global Warming.

June 2021 N. Atlantic Finally Cooling?

RAPID Array measuring North Atlantic SSTs.

For the last few years, observers have been speculating about when the North Atlantic will start the next phase shift from warm to cold.

Source: Energy and Education Canada

An example is this report from May 2015, The Atlantic is entering a cool phase that will change the world’s weather, by Gerald McCarthy and Evan Haigh of the RAPID Atlantic monitoring project. Excerpts in italics with my bolds.

This is known as the Atlantic Multidecadal Oscillation (AMO), and the transition between its positive and negative phases can be very rapid. For example, Atlantic temperatures declined by 0.1ºC per decade from the 1940s to the 1970s. By comparison, global surface warming is estimated at 0.5ºC per century – a rate twice as slow.

In many parts of the world, the AMO has been linked with decade-long temperature and rainfall trends. Certainly – and perhaps obviously – the mean temperature of islands downwind of the Atlantic such as Britain and Ireland show almost exactly the same temperature fluctuations as the AMO.

Atlantic oscillations are associated with the frequency of hurricanes and droughts. When the AMO is in the warm phase, there are more hurricanes in the Atlantic and droughts in the US Midwest tend to be more frequent and prolonged. In the Pacific Northwest, a positive AMO leads to more rainfall.

A negative AMO (cooler ocean) is associated with reduced rainfall in the vulnerable Sahel region of Africa. The prolonged negative AMO was associated with the infamous Ethiopian famine in the mid-1980s. In the UK it tends to mean reduced summer rainfall – the mythical “barbeque summer”.

Our results show that ocean circulation responds to the first mode of Atlantic atmospheric forcing, the North Atlantic Oscillation, through circulation changes between the subtropical and subpolar gyres – the intergyre region. This is a major influence on the wind patterns and the heat transferred between the atmosphere and ocean.

The observations that we do have of the Atlantic overturning circulation over the past ten years show that it is declining. As a result, we expect the AMO is moving to a negative (colder surface waters) phase. This is consistent with observations of temperature in the North Atlantic.

Cold “blobs” in the North Atlantic have been reported, but they are usually a winter phenomenon. For example, in April 2016 the SST anomalies looked like this:

But by September, the picture changed to this:

And we know from the Kaplan AMO dataset that 2016 summer SSTs were right up there with 1998 and 2010 as the highest recorded.

As the graph above suggests, this body of water is also important for tropical cyclones, since warmer water provides more energy.  But those are annual averages, and I am interested in the summer pulses of warm water into the Arctic.  As I have noted in my monthly HadSST3 reports, most summers since 2003 there have been warm pulses in the North Atlantic.

AMO June 2021
The AMO Index is from Kaplan SST v2, the unaltered and not detrended dataset. By definition, the data are monthly average SSTs interpolated to a 5×5 grid over the North Atlantic, basically 0 to 70N.  The graph shows warming began after the 1970s, running up to 1998, with a series of matching years since.  Since 2016, June SSTs have backed down, despite an upward bump in 2020. Because McCarthy refers to hints of cooling to come in the N. Atlantic, let’s take a closer look at some AMO years in the last two decades.
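
For readers curious how such an index is put together, here is a minimal sketch of an area-weighted North Atlantic mean on a 5×5 grid, with no detrending.  The SST field and the longitude band are synthetic assumptions; the real index uses the Kaplan SST v2 data.

```python
import numpy as np

# Sketch of an AMO-style index: area-weighted mean SST over the North Atlantic (0-70N),
# computed on a 5x5 degree grid and NOT detrended. The SST field here is synthetic;
# the real index uses the Kaplan SST v2 dataset.
lats = np.arange(2.5, 70, 5)                  # grid-cell center latitudes, 0-70N
lons = np.arange(-77.5, 0, 5)                 # rough North Atlantic longitudes (assumption)
rng = np.random.default_rng(0)
sst = 20 - 0.2 * lats[:, None] + 0.3 * rng.standard_normal((len(lats), len(lons)))

weights = np.cos(np.deg2rad(lats))[:, None]   # grid cells shrink toward the pole
weights = np.broadcast_to(weights, sst.shape)

amo_like = np.average(sst, weights=weights)   # area-weighted basin mean for one month
print(f"Area-weighted N. Atlantic mean SST (synthetic): {amo_like:.2f} C")
```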

AMO decade 062021

This graph shows monthly AMO temps for some important years. The peak years were 1998, 2010 and 2016, with the latter emphasized as the most recent. The other years show lesser warming, with 2007 emphasized as the coolest in the last 20 years. Note the red 2018 line is at the bottom of all these tracks.  Note that 2020 tracked the 2016 highs, even exceeding those temps in the first 4 months.  Now 2021 is starting out tracking the much cooler 2018.

With all the talk of AMOC slowing down and a phase shift in the North Atlantic, we await SST measurements for July, August and September to confirm if cooling is starting to set in.

July 2021 Heat Records Silly Season Again

Photo illustration by Slate. Photos by Thinkstock.

A glance at the news aggregator shows the silly season is in full swing.  Below is a partial listing of today’s headlines proclaiming the hottest whatever.

  • Last Month Was the Hottest June in North America in Recent Recorded History TIME
  • Global Warming Is Increasing the Likelihood of Frost Damage in Vineyards Martha Stewart
  • Heat records smashed in Moscow and Helsinki CGTN
  • Alberta glacial melt about 3 times higher than average during heat wave: expert The Weather Network
  • US and Canada heatwave ‘impossible’ without climate change, analysis shows Sky News
  • Heat waves caused warmest June ever in North America The Independent
  • Glacial melt: the European Alps New Age
  • The North American heatwave shows we need to know how climate change will change our weather Cyprus Mail
  • Global evidence links rise in extreme precipitation to human-driven climate change Phys.org
  • Drought-Stricken Western Districts Plan New Ways to Store Water Bloomberg
  • Amid record heat, Equilibrium Capital raises $1 billion for second greenhouse fund ImpactAlpha
  • Last Month Was Hottest June on Record in North America MTV Lebanon
  • Superior National Forest could provide refuge to wildlife as the climate warms Yale Climate Connections
  • How climate change is exacerbating record heatwaves The Telegraph
  • Lapland records hottest day for more than a century as heatwave grips region Sky News
  • Heatwave stokes North America’s warmest June on record The Raw Story

Time for some Clear Thinking about Heat Records (Previous Post)

Here is an analysis using critical intelligence to interpret media reports about temperature records this summer. Daniel Engber writes in Slate, Crazy From the Heat.

The subtitle is Climate change is real. Record-high temperatures everywhere are fake.  As we shall see from the excerpts below, the first sentence is a statement of faith, since, as Engber demonstrates, the notion does not follow from the temperature evidence. Excerpts in italics with my bolds.

It’s been really, really hot this summer. How hot? Last Friday, the Washington Post put out a series of maps and charts to illustrate the “record-crushing heat.” All-time temperature highs have been measured in “scores of locations on every continent north of the equator,” the article said, while the lower 48 states endured the hottest-ever stretch of temperatures from May until July.

These were not the only records to be set in 2018. Historic heat waves have been crashing all around the world, with records getting shattered in Japan, broken on the eastern coast of Canada, smashed in California, and rewritten in the Upper Midwest. A city in Algeria suffered through the highest high temperature ever recorded in Africa. A village in Oman set a new world record for the highest-ever low temperature. At the end of July, the New York Times ran a feature on how this year’s “record heat wreaked havoc on four continents.” USA Today reported that more than 1,900 heat records had been tied or beaten in just the last few days of May.

While the odds that any given record will be broken may be very, very small, the total number of potential records is mind-blowingly enormous.

There were lots of other records, too, lots and lots and lots—but I think it’s best for me to stop right here. In fact, I think it’s best for all of us to stop reporting on these misleading, imbecilic stats. “Record-setting heat,” as it’s presented in news reports, isn’t really scientific, and it’s almost always insignificant. And yet, every summer seems to bring a flood of new superlatives that pump us full of dread about the changing climate. We’d all be better off without this phony grandiosity, which makes it seem like every hot and humid August is unparalleled in human history. It’s not. Reports that tell us otherwise should be banished from the news.

It’s true the Earth is warming overall, and the record-breaking heat that matters most—the kind we’d be crazy to ignore—is measured on a global scale. The average temperature across the surface of the planet in 2017 was 58.51 degrees, one-and-a-half degrees above the mean for the 20th century. These records matter: 17 of the 18 hottest years on planet Earth have occurred since 2001, and the four hottest-ever years were 2014, 2015, 2016, and 2017. It also matters that this changing climate will result in huge numbers of heat-related deaths. Please pay attention to these terrifying and important facts. Please ignore every other story about record-breaking heat.

You’ll often hear that these two phenomena are related, that local heat records reflect—and therefore illustrate—the global trend. Writing in Slate this past July, Irineo Cabreros explained that climate change does indeed increase the odds of extreme events, making record-breaking heat more likely. News reports often make this point, linking probabilities of rare events to the broader warming pattern. “Scientists say there’s little doubt that the ratcheting up of global greenhouse gases makes heat waves more frequent and more intense,” noted the Times in its piece on record temperatures in Algeria, Hong Kong, Pakistan, and Norway.

Yet this lesson is subtler than it seems. The rash of “record-crushing heat” reports suggest we’re living through a spreading plague of new extremes—that the rate at which we’re reaching highest highs and highest lows is speeding up. When the Post reports that heat records have been set “at scores of locations on every continent,” it makes us think this is unexpected. It suggests that as the Earth gets ever warmer, and the weather less predictable, such records will be broken far more often than they ever have before.

But that’s just not the case. In 2009, climatologist Gerald Meehl and several colleagues published an analysis of records drawn from roughly 2,000 weather stations in the U.S. between 1950 and 2006. There were tens of millions of data points in all—temperature highs and lows from every station, taken every day for more than a half-century. Meehl searched these numbers for the record-setting values—i.e., the days on which a given weather station saw its highest-ever high or lowest-ever low up until that point. When he plotted these by year, they fell along a downward-curving line. Around 50,000 new heat records were being set every year during the 1960s; then that number dropped to roughly 20,000 in the 1980s, and to 15,000 by the turn of the millennium.

From Meehl et al 2009.

This shouldn’t be surprising. As a rule, weather records will be set less frequently as time goes by. The first measurement of temperature that’s ever taken at a given weather station will be its highest (and lowest) of all time, by definition. There’s a good chance that the same station’s reading on Day 2 will be a record, too, since it only needs to beat the temperature recorded on Day 1. But as the weeks and months go by, this record-setting contest gets increasingly competitive: Each new daily temperature must now outdo every single one that came before. If the weather were completely random, we might peg the chances of a record being set at any time as 1/n, where n is the number of days recorded to that point. In other words, one week into your record-keeping, you’d have a 1 in 7 chance of landing on an all-time high. On the 100th day, your odds would have dropped to 1 percent. After 56 years, your chances would be very, very slim.
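The 1/n rule is easy to confirm with a small simulation.  Here is a minimal sketch (not the Meehl et al. analysis, just random, trend-free “weather”) showing how quickly new all-time highs become rare:

```python
import random

# Simulate many independent "weather stations" with random, trend-free daily temperatures
# and count how often a new all-time high is set on each day. With no trend, the chance
# that day n sets a record is 1/n, so records become rare quickly.
random.seed(0)
n_stations, n_days = 2000, 365 * 5
record_counts = [0] * n_days

for _ in range(n_stations):
    current_max = float("-inf")
    for day in range(n_days):
        temp = random.gauss(0, 1)          # stationary climate: no warming trend
        if temp > current_max:
            current_max = temp
            record_counts[day] += 1

for day in (0, 6, 99, 364, n_days - 1):
    expected = n_stations / (day + 1)      # the 1/n rule
    print(f"day {day + 1:5d}: records set = {record_counts[day]:4d}, expected ~ {expected:6.1f}")
```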

The weather isn’t random, though; we know it’s warming overall, from one decade to the next. That’s what Meehl et al. were looking at: They figured that a changing climate would tweak those probabilities, goosing the rate of record-breaking highs and tamping down the rate of record-breaking lows. This wouldn’t change the fundamental fact that records get broken much less often as the years go by. (Even though the world is warming, you’d still expect fewer heat records to be set in 2000 than in 1965.) Still, one might guess that climate change would affect the rate, so that more heat records would be set than we’d otherwise expect.

That’s not what Meehl found. Between 1950 and 2006, the rate of record-breaking heat seemed unaffected by large-scale changes to the climate: The number of new records set every year went down from one decade to the next, at a rate that matched up pretty well with what you’d see if the odds were always 1/n. The study did find something more important, though: Record-breaking lows were showing up much less often than expected. From one decade to the next, fewer records of any kind were being set, but the ratio of record lows to record highs was getting smaller over time. By the 2000s, it had fallen to about 0.5, meaning that the U.S. was seeing half as many record-breaking lows as record-breaking highs. (Meehl has since extended this analysis using data going back to 1930 and up through 2015. The results came out the same.)

What does all this mean? On one hand, it’s very good evidence that climate change has tweaked the odds for record-breaking weather, at least when it comes to record lows. (Other studies have come to the same conclusion.) On the other hand, it tells us that in the U.S., at least, we’re not hitting record highs more often than we were before, and that the rate isn’t higher than what you’d expect if there weren’t any global warming. In fact, just the opposite is true: As one might expect, heat records are getting broken less often over time, and it’s likely there will be fewer during the 2010s than at any point since people started keeping track.

This may be hard to fathom, given how much coverage has been devoted to the latest bouts of record-setting heat. These extreme events are more unusual, in absolute terms, than they’ve ever been before, yet they’re always in the news. How could that be happening?

While the odds that any given record will be broken may be very, very small, the total number of potential records that could be broken—and then reported in the newspaper—is mind-blowingly enormous. To get a sense of how big this number really is, consider that the National Oceanic and Atmospheric Administration keeps a database of daily records from every U.S. weather station with at least 30 years of data, and that its website lets you search for how many all-time records have been set in any given stretch of time. For instance, the database indicates that during the seven-day period ending on Aug. 17—the date when the Washington Post published its series of “record-crushing heat” infographics—154 heat records were broken.

 

That may sound like a lot—154 record-high temperatures in the span of just one week. But the NOAA website also indicates how many potential records could have been achieved during that time: 18,953. In actuality, less than one percent of these were broken. You can also pull data on daily maximum temperatures for an entire month: I tried that with August 2017, and then again for months of August at 10-year intervals going back to the 1950s. Each time the query returned at least about 130,000 potential records, of which one or two thousand seemed to be getting broken every year. (There was no apparent trend toward more records being broken over time.)
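A back-of-the-envelope sketch of that arithmetic: the pool of potential records grows multiplicatively with stations, days, and record categories.  The station count below is a rough placeholder chosen only to reproduce the ~130,000 figure, not NOAA’s actual tally.

```python
# Back-of-the-envelope: the pool of potential records grows multiplicatively.
# The station count is a rough placeholder, not NOAA's actual accounting.
stations = 4_200           # hypothetical number of stations in the record database
days_in_month = 31
potential_daily_highs = stations * days_in_month
print(f"Potential all-time daily-high records per month: ~{potential_daily_highs:,}")  # ~130,000

# The week the Post ran its "record-crushing heat" graphics:
broken, possible = 154, 18_953
print(f"Share of the weekly pool actually broken: {broken / possible:.2%}")  # under 1 percent
```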

Now let’s say there are 130,000 high-temperature records to be broken every month in the U.S. That’s only half the pool of heat-related records, since the database also lets you search for all-time highest low temperatures. You can also check whether any given highest high or highest low happens to be a record for the entire month in that location, or whether it’s a record when compared across all the weather stations everywhere on that particular day.

Add all of these together and the pool of potential heat records tracked by NOAA appears to number in the millions annually, of which tens of thousands may be broken. Even this vastly underestimates the number of potential records available for media concern. As they’re reported in the news, all-time weather records aren’t limited to just the highest highs or highest lows for a given day in one location. Take, for example, the first heat record mentioned in this column, reported in the Post: The U.S. has just endured the hottest May, June, and July of all time. The existence of that record presupposes many others: What about the hottest April, May and June, or the hottest March, April, and May? What about all the other ways that one might subdivide the calendar?

Geography provides another endless well of flexibility. Remember that the all-time record for the hottest May, June, and July applied only to the lower 48 states. Might a different set of records have been broken if we’d considered Hawaii and Alaska? And what about the records spanning smaller portions of the country, like the Midwest, or the Upper Midwest, or just the state of Minnesota, or just the Twin Cities? And what about the all-time records overseas, describing unprecedented heat in other countries or on other continents?

Even if we did limit ourselves to weather records from a single place measured over a common timescale, it would still be possible to parse out record-breaking heat in a thousand different ways. News reports give separate records, as we’ve seen, for the highest daily high and the highest daily low, but they also tell us when we’ve hit the highest average temperature over several days or several weeks or several months. The Post describes a recent record-breaking streak of days in San Diego with highs of at least 83 degrees. (You’ll find stories touting streaks of daily highs above almost any arbitrary threshold: 90 degrees, 77 degrees, 60 degrees, et cetera.) Records also needn’t focus on the temperature at all: There’s been lots of news in recent weeks about the fact that the U.K. has just endured its driest-ever early summer.

“Record-breaking” summer weather, then, can apply to pretty much any geographical location, over pretty much any span of time. It doesn’t even have to be a record—there’s an endless stream of stories on “near-record heat” in one place or another, or the “fifth-hottest” whatever to happen in wherever, or the fact that it’s been “one of the hottest” yadda-yaddas that yadda-yadda has ever seen. In the most perverse, insane extension of this genre, news outlets sometimes even highlight when a given record isn’t being set.

Loose reports of “record-breaking heat” only serve to puff up muggy weather and make it seem important. (The sham inflations of the wind chill factor do the same for winter months.) So don’t be fooled or flattered by this record-setting hype. Your summer misery is nothing special.

Summary

This article helps people not to confuse weather events with climate.  My disappointment is with the phrase “Climate Change is Real,” since it is subject to misdirection.  Engber uses that phrase referring to rising average world temperatures, without explaining that such estimates are computer-processed reconstructions, since the earth has no “average temperature.”  More importantly, the undefined “climate change” is a blank slate to which a number of meanings can be attached.

Some take it to mean: It is real that rising CO2 concentrations cause global warming.  Yet that is not supported by temperature records.
Others think it means: It is real that using fossil fuels causes global warming.  This too lacks persuasive evidence.
WFFC and Hadcrut 2018

Over the last five decades the increase in fossil fuel consumption is dramatic and monotonic, steadily increasing by 234% from 3.5B to 11.7B oil equivalent tons. Meanwhile the GMT record from Hadcrut shows multiple ups and downs with an accumulated rise of 0.74C over 53 years, 5% of the starting value.

Others know that Global Mean Temperature is a slippery calculation subject to the selection of stations.

Graph showing the correlation between Global Mean Temperature (Average T) and the number of stations included in the global database. Source: Ross McKitrick, U of Guelph

Global warming estimates combine results from adjusted records.

Conclusion

The pattern of high and low records discussed above is consistent with natural variability rather than rising CO2 or fossil fuel consumption. Those of us not alarmed about the reported warming understand that “climate change” is something nature does all the time, and that the future is likely to include periods both cooler and warmer than now.

Background Reading:

The Climate Story (Illustrated)

2020 Update: Fossil Fuels ≠ Global Warming

Man Made Warming from Adjusting Data

What is Global Temperature? Is it warming or cooling?

NOAA US temp 2019 2021

Be Afraid of Covid Variants. Be very Afraid.


This campfire ghost story has been around since last March, but it is revived and promoted now to maintain public anxiety and perpetuate the medical-industrial complex. Some background and resources are in Daniel Horowitz’s Blaze article The Delta deception: New COVID variant might be less deadly.  Excerpts in italics with my bolds.

“This COVID variant will be the one to really get us. No, it’s this one. Well, Alpha, Beta, and Gamma weren’t a problem, but I promise you ‘the Delta’ spells the end of civilization.”

That is essentially the panic porn dressed up as science that we have been treated to ever since the virus declined in January following the winter spread, which appears to have given us a great deal of herd immunity. Despite the advent of the British and South African variants, cases, not to mention fatalities, have continued to plummet in all of the places where those variants were supposedly common. Which is why they are now repeating the same mantra about the “Delta” variant from India.

The headlines are screaming with panic over the impending doom of the Delta variant hitting the U.S.:

“WHO says delta is the fastest and fittest Covid variant and will ‘pick off’ most vulnerable” (CNBC)
“Highly contagious Delta variant could cause next COVID-19 wave: ‘This virus will still find you'” (CBS)
“Delta Variant Gains Steam in Undervaccinated U.S. Counties” (Bloomberg)
“The Delta variant might pose the biggest threat yet to vaccinated people” (Business Insider)
And of course, no list would be complete without the headline from Dr. Fauci yesterday, stating that the “Delta variant is the greatest threat in the US.”

The implication from these headlines is that somehow this variant is truly more transmissible and deadly (as the previous variants were falsely portrayed to be), that it escapes natural immunity and possibly the vaccine — and therefore, paradoxically, you must get vaccinated and continue doing all the things that failed to work for the other variants!

After each city and country began getting ascribed its own “variant,” I think the panic merchants realized that the masses would catch on to the variant scam, so they decided to rename them Alpha (British), Beta (South African), Gamma (Brazilian), and Delta (Indian), which sounds more like a hierarchy of progression and severity rather than each region simply getting hit when it’s in season until the area reaches herd immunity.

However, if people would actually look at the data, they’d realize that the Delta variant is actually less deadly. These headlines are able to gain momentum only because of the absurd public perception that somehow India got hit worse than the rest of the world. In reality, India has one-seventh the death rate per capita of the U.S.; it’s just that India got the major winter wave later, when the Western countries were largely done with it, thereby giving the illusion that India somehow suffered worse. Now, the public health Nazis are transferring their first big lie about what happened in India back to the Western world.

Fortunately, the U.K. government has already exposed these headlines as a lie, for those willing to take notice. On June 18, Public Health England published its 16th report on “SARS-CoV-2 variants of concern and variants under investigation in England,” this time grouping the variants by Greek letters.

[Table from Public Health England report: cases and deaths by variant]

As you can see, the Delta variant has a 0.1% case fatality rate (CFR) out of 31,132 Delta sequence infections confirmed by investigators. That is the same rate as the flu and is much lower than the CFR for the ancestral strain or any of the other variants. And as we know, the CFR is always higher than the infection fatality rate (IFR), because many of the mildest and asymptomatic infections go undocumented, while the confirmed cases tend to have a bias toward those who are more evidently symptomatic.
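
To make the CFR/IFR distinction concrete, here is a small sketch of the arithmetic.  The death count is back-calculated from the quoted 0.1% CFR, and the undocumented-infection multiplier is purely hypothetical, not a figure from the PHE report.

```python
# CFR vs. IFR arithmetic for the figures quoted above.
confirmed_cases = 31_132                     # confirmed Delta sequence infections
deaths = round(confirmed_cases * 0.001)      # back-calculated from the ~0.1% CFR quoted above

cfr = deaths / confirmed_cases
print(f"CFR = {deaths} / {confirmed_cases:,} = {cfr:.2%}")

# If, hypothetically, undocumented mild infections doubled the true infection count,
# the infection fatality rate would be about half the CFR:
assumed_total_infections = 2 * confirmed_cases   # illustrative assumption only
print(f"IFR (assuming 2x undocumented infections) = {deaths / assumed_total_infections:.2%}")
```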

In other words, Delta is literally the flu, with a CFR identical to it. This is exactly what every respiratory pandemic has done through history: morphed into a more transmissible and less virulent form that forces the other mutations out, since that is the one you catch. Nothing about masks, lockdowns, or experimental shots did this. To the extent this really is more transmissible, it’s going to be less deadly, as is the case with the common cold. To the extent that there are areas below the herd immunity threshold (for example, in Scotland and the northwestern parts of the U.K.), they will likely get the Delta variant (until something else supplants it), but fatalities will continue to go down.

According to the above-mentioned report, the Delta variant represented more than 75% of all cases in the U.K. since mid-May. If it really was that deadly, it should have been wreaking havoc over the past few weeks.

[Chart: U.K. hospitalization rates declining as the Delta variant became dominant]

You can see an almost perfect inverse relationship: hospitalization rates plummeted throughout April and May as the Delta variant became the dominant strain of the virus in England. Some areas might see a slight oscillation from time to time as herd immunity fills in, regardless of which variant is floating around. However, the death burden is well below that of a flu season and is no longer at epidemic levels.

Thus, the good news is that now that most countries have reached a large degree of herd immunity, there is zero threat of hospitals being overrun by any seasonal increase in various areas, no matter the variant. The bad news is that after Delta, there are Epsilon and 19 other letters of the Greek alphabet, which will enable the circuitous cycle of misinformation, fear, panic, and control to continue. And remember, as there is already a “Delta+,” the options are endless until our society finally achieves immunity to COVID panic porn.

Footnote:  The last paragraph was prescient:  Now the TV is blathering about the “lambda” variant from Peru.

SEC Warned Off Climate Disclosures

Warning banner

David Burton writes to the Securities and Exchange Commission explaining the hazards it would take on needlessly should it continue to pursue imposing Climate Change Disclosures on publicly traded enterprises.  His document, sent to the SEC Chairman, is entitled Re: Comments on Climate Disclosure.  Excerpts in italics with my bolds.

Summary of Key Points

1. Climate Change Disclosure Would Impede the Commission’s Important Mission.

The important mission of the U.S. Securities and Exchange Commission is to protect investors, maintain fair, orderly, and efficient markets, and facilitate capital formation. Mandatory climate change disclosure would impede rather than further that mission. It would affirmatively harm investors, impede capital formation and do nothing to improve the efficiency of capital markets.

2. Immaterial Climate Change “Disclosure” Would Obfuscate Rather than Inform.

The concept of materiality has been described as the cornerstone of the disclosure system established by the federal securities laws. Disclosure of material climate-related information is already required under ordinary securities law principles and Regulation S-K. Mandatory “disclosure” of immaterial, highly uncertain, highly disputable information would obfuscate rather than inform. It would harm rather than help investors.

3. Climate Models and Climate Science are Highly Uncertain.

There is a massive amount of variance among various climate models and uncertainty regarding the future of the climate.

4. Economic Modeling of Climate Change Effects is Even More Uncertain.

There is an even higher degree of variance and uncertainty associated with attempts to model or project the economic impact of highly divergent and uncertain climate models. Any estimate of the economic impact of climate change would have to rely on highly uncertain and divergent climate model results discussed below. In addition to this high degree of uncertainty would be added an entirely new family of economic ambiguity and uncertainty. Any economic estimate of the impact of climate change would also have to choose a discount rate to arrive at the present discounted value of future costs and benefits of climate change and to estimate the future costs and benefits of various regulatory or private responses. The choice of discount rate is controversial and important. Estimates would need to be made of the cost of various aspects of climate change (sea level rises, the impact on agriculture, etc.). Estimates would need to be made of the cost of various remediation techniques. Guesses would need to be made about the rate of technological change. Guesses would need to be made about the regulatory, tax and other responses of a myriad of governments. Estimates would need to be made using conventional economic techniques regarding the economic impact of those changes which, in turn, would reflect a wide variety of techniques and in many cases a thin or non-existent empirical literature. Guesses would need to be made of market responses to all of these changes since market participants will not stand idly by and do nothing as markets and the regulatory environment change. Then, after making decisions regarding all of these extraordinarily complex, ambiguous and uncertain issues, issuers would then need to assess the likely impact of climate change on their specific business years into the future – a business that may by then bear little resemblance to the issuers’ existing business.
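
To illustrate just one of those choices, here is a minimal sketch of how sensitive a present-value calculation is to the discount rate.  The $1 trillion cost and the 80-year horizon are invented purely for illustration, not estimates from the letter or from any study.

```python
# Present value of a hypothetical $1 trillion climate-related cost incurred 80 years from now,
# under different discount rates. The cost and rates are illustrative assumptions only.
future_cost = 1_000_000_000_000   # $1 trillion, 80 years out (hypothetical)
years = 80

for rate in (0.01, 0.03, 0.05, 0.07):
    present_value = future_cost / (1 + rate) ** years
    print(f"discount rate {rate:.0%}: present value ~ ${present_value / 1e9:,.0f} billion")
```

The same future cost is worth anywhere from a few billion to several hundred billion dollars today depending on that single parameter, which is the point Burton makes about the compounding uncertainty of such estimates.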

Then, the Commission would need to assess the veracity of the issuers’ “disclosure” based on this speculative house of cards. The idea that all of this can be done in a way that will meaningfully improve investors’ decision making is not credible.

5. The Commission Does Not Possess the Expertise to Competently Assess Climate Models or the Economic Impact of Climate Change.

The Commission has neither the expertise to assess climate models nor the expertise to assess economic models purporting to project the economic impact of divergent and uncertain climate projections.

6. The Commission Has Neither the Expertise nor the Administrative Ability to Assess the Veracity of Issuer Climate Change Disclosures.

The Commission does not have the expertise or administrative ability to assess the veracity, or lack thereof, of issuer “disclosures”  based on firm-specific speculation regarding the impact of climate change which would be based on firm-specific choices regarding highly divergent and uncertain economic models projecting the economic impact of climate changes based on firm-specific choices regarding highly divergent and uncertain climate models.

7. Commission Resources Are Better Spent Furthering Its Mission.

Imposing these requirements and developing the expertise to police such climate disclosure by thousands of issuers will involve the expenditure of very substantial resources. These resources would be much better spent furthering the Commission’s important mission.

8. The Costs Imposed on Issuers Would be Large.

Requiring all public companies to develop climate-modeling expertise, the ability to make macroeconomic projections based on those models, and then to make firm-specific economic assessments based on these climate and economic models will be expensive, imposing costs on issuers amounting to billions of dollars. These expenses would harm investors by reducing shareholder returns.

9. Climate Change Disclosure Requirements Would Further Reduce the Attractiveness of Becoming a Public Company, Harming Ordinary Investors and Entrepreneurial Capital Formation.

Such requirements would further reduce the attractiveness of being a registered, public company. They would exacerbate the decline in the number of public companies and the trend of companies going public later in their life cycle. This, in turn, would deny to ordinary (unaccredited) investors the opportunity to invest in dynamic, high-growth, profitable companies until most of the money has already been made by affluent accredited investors. It would further impede entrepreneurial access to public capital markets.

10. Climate Change Disclosure Requirements Would Create a New Compliance Eco-System and a New Lobby to Retain the Requirements.

The imposition of such requirements would result in the creation of a new compliance eco-system and pro-complexity lobby composed of the economists, accountants, attorneys and compliance officers who live off the revised Regulation S-K.

11. Climate Change Disclosure Requirements Would Result in Much Litigation.

The imposition of such requirements would result in much higher litigation risk and expense as private lawsuits are filed challenging the veracity of climate disclosures. These lawsuits are virtually assured because almost no climate models have accurately predicted future climate, and the economic and financial projections based on these climate models are even more uncertain. Litigation outcomes would be as uncertain as the underlying climate science, economics and the associated financial projections. This would harm investors and entrepreneurial capital formation.

12. Material Actions by Management in Furtherance of Social and Political Objectives that Reduce Returns must be Disclosed.

Many environmentally constructive corporate actions will occur in the absence of any government mandate or required disclosure. For example, energy conservation measures may reduce costs as well as emissions. No new laws or regulations are necessary to induce firms to take these actions. Assuming they are not utterly pointless, climate change disclosure laws presumably would be designed to induce management to take action that they would not otherwise take. To the extent management takes material actions in furtherance of social and political objectives (including ESG objectives) that reduce shareholder returns, whether induced by climate change disclosure requirements or taken for other reasons, they need to disclose that information. The Commission should ensure that they do so. Absent some drastic change in the underlying law by Congress, this principle would apply to any reduction in returns whether induced by ESG disclosures (climate change related or otherwise) or taken by management on its own initiative to achieve social and political objectives.

13. Fund Managers’ Attempts to Profit from SRI at the Expense of Investors Should be Policed.

Fund management firms are generally compensated from either sales commissions (often called loads) or investment management fees that are typically based on assets under management. Their compensation is not closely tied to performance. Thus, these firms will often see a financial advantage in selling “socially responsible” products that perform no better and often worse than conventional investments. It is doubtful that this is consistent with Regulation BI. Their newfound interest in socially responsible investing should be taken with the proverbial grain of salt. The Commission should monitor their efforts to profit from SRI at the expense of investors.

14. Duties of Fund Managers Should be Clarified.

The extreme concentration in the proxy advisory and fund management business is cause for concern. As few as 20 firms may exercise effective control over most public companies. The Commission should make it clear that investment advisers managing investment funds, including retirement funds or accounts, have a duty to manage those funds and to vote the shares held by the funds in the financial, economic or pecuniary interest of the millions of small investors that invest in, or are beneficiaries of, those funds and that the funds may not be managed to further the managers’ preferred political or social objectives.

15. Securities Laws are a Poor Mechanism to Address Externalities.

Externalities, such as pollution, should be addressed either by enhancing property rights or, in the case of unowned resources such as the air and waterways, by a regulatory response that carefully assesses the costs and benefits of that response. Securities disclosure is the wrong place to try to address externalities. Policing externalities is far outside the scope of the Commission’s mission and the purpose of the securities laws.

16. Climate Change Disclosure Requirements Would Have No Meaningful Impact on the Climate.

When all is said and done, climate change disclosure requirements will have somewhere between a trivial impact and no impact on climate change.

17. Efforts to Redefine Materiality or the Broader Purpose of Business should be Opposed.

Simply because some politically motivated investors seek to impose a disclosure requirement on issuers does not make such a requirement material. The effort to redefine materiality in the securities laws is part of an increasingly strident effort to redefine the purpose of businesses more generally to achieve various social or political objectives unrelated to earning a return, satisfying customers, or treating workers or suppliers fairly. This is being done under the banner of social justice; corporate social responsibility (CSR); stakeholder theory; environmental, social and governance (ESG) criteria; socially responsible investing (SRI); sustainability; diversity; business ethics; common-good capitalism; or corporate actual responsibility. The social costs of ESG and of broader efforts to repurpose business firms will be considerable. Wages will decline or grow more slowly, firms will be less productive and less internationally competitive, investor returns will decline, innovation will slow, the quality of goods and services will decline, and their prices will increase.

18. ESG Requirements will Make Management Even Less Accountable.

In large, modern corporations there is a separation of ownership and control. There is a major agent/principal problem because management and the board of directors often, to varying degrees, pursue their own interests rather than the interests of shareholders. Profitability is, however, a fairly clear measure of the success or failure of management and the board. If a firm becomes unprofitable or lags considerably in profitability, the board may well replace management, shareholders may replace the board, or another firm may attempt a takeover. Systematic implementation of regulatory ESG or CSR requirements will make management dramatically less accountable, since such requirements will come at the expense of profitability and the metrics for success or failure in achieving ESG or CSR objectives will be largely unquantifiable. For that matter, ESG or CSR requirements themselves tend to be amorphous and ever changing.

esg-smoke-and-mirrors

NOAA Loses 1M km2 of Arctic Ice in July

Arctic2021185

NOAA’s Sea Ice Index (shown in orange above) dramatically lost 1M km2 of Arctic sea ice extent in just the last four days.  Meanwhile MASIE from the National Ice Center (NIC) (cyan color) declined much less, ~400k km2, a more typical decline, since the two datasets were nearly the same on June 30, 2021.  Note that extreme drops in ice extent can happen in July, as seen last year (purple line after day 190), so the issue bears watching.  There is a history of satellite difficulties discriminating between open water and surface melt water during both the melting and refreezing seasons.  A background post below explains differences between the two datasets.

Background from previous post Support MASIE Arctic Ice Dataset

MASIE: “high-resolution, accurate charts of ice conditions”
Walt Meier, NSIDC, October 2015 article in Annals of Glaciology.

Update February 4, 2017 Below

The home page for MASIE (here) invites visitors to show their interest in the dataset and analysis tools since continued funding is not assured. The page says:
NSIDC has received support to develop MASIE but not to maintain MASIE. We are actively seeking support to maintain the Web site and products over the long term. If you find MASIE helpful, please let us know with a quick message to NSIDC User Services.

For the reasons below, I hope people will go there and express their support.

1. MASIE is Rigorous.

Note on Sea Ice Resolution:

Northern Hemisphere Spatial Coverage

Sea Ice Index (SII) from NOAA is based on 25 km cells and a 15% ice coverage threshold. That means if a 25 km x 25 km grid cell (625 km2) is estimated to have at least 15% ice, then the full 625 km2 is added to the total extent. In the mapping details, grid cell areas actually vary from 382 to 664 km2 with latitude.  And the satellites’ Field of View (FOV) is an elliptical footprint ranging from 486 to 3330 km2 depending on the channel and frequency.  More info is here.

MASIE is based on 4 km cells and 40% ice coverage. Thus, for MASIE estimates, if a grid cell is deemed to have at least 40% ice, then 16 km2 is added to the total extent.

The significantly higher resolution in MASIE means that any error in detecting ice cover at the threshold level affects only 16 km2 of the MASIE total, compared to roughly 600 km2 or more per cell in SII.  A few dozen SII cells falling just below the 15% threshold are therefore reported as a sizable loss of ice in the Arctic.
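A minimal sketch of this cell-size effect is below. The grid values and the “few dozen marginal cells” are hypothetical, and the code only illustrates threshold-based extent accounting; it is not NSIDC’s or NIC’s actual processing.

```python
# Illustrative sketch of threshold-based extent accounting (not agency code).
import numpy as np

def extent_km2(concentration, cell_area_km2, threshold):
    """Add the full cell area for every cell at or above the concentration threshold."""
    return float(np.sum(concentration >= threshold) * cell_area_km2)

# Hypothetical field of 36 marginal cells sitting just above each cutoff.
sii_cells   = np.full(36, 0.16)   # SII-style: nominal 25 km cells, ~625 km2 each, 15% cutoff
masie_cells = np.full(36, 0.41)   # MASIE-style: 4 km cells, 16 km2 each, 40% cutoff

drop_sii   = extent_km2(sii_cells, 625, 0.15)  - extent_km2(sii_cells - 0.02, 625, 0.15)
drop_masie = extent_km2(masie_cells, 16, 0.40) - extent_km2(masie_cells - 0.02, 16, 0.40)

# The same small dip in concentration removes ~22,500 km2 from the SII-style
# total but only ~576 km2 from the MASIE-style total.
print(drop_sii, drop_masie)
```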

2. MASIE is Reliable.


MASIE is an operational ice product developed from multiple sources to provide the most accurate possible description of Arctic ice for the sake of ships operating in the region.

Operational analyses combine a variety of remote-sensing inputs and other sources via manual integration to create high-resolution, accurate charts of ice conditions in support of navigation and operational forecast models. One such product is the daily Multisensor Analyzed Sea Ice Extent (MASIE). The higher spatial resolution along with multiple input data and manual analysis potentially provide more precise mapping of the ice edge than passive microwave estimates.  From Meier et al., link below.

Some people have latched onto a line from the NSIDC background page:
Use the Sea Ice Index when comparing trends in sea ice over time or when consistency is important. Even then, the monthly, not the daily, Sea Ice Index views should be used to look at trends in sea ice. The Sea Ice Index documentation explains how linear regression is used to say something about trends in ice extent, and what the limitations of that method are. Use MASIE when you want the most accurate view possible of Arctic-wide ice on a given day or through the week.

That statement was not updated to reflect recent developments:
“In June 2014, we decided to make the MASIE product available back to 2006. This was done in response to user requests, and because the IMS product output, upon which MASIE is based, appeared to be reasonably consistent.”

The fact that MASIE employs human judgment is disconcerting to climatologists as a potential source of error, so Meier and others prefer that the analysis be done by computer algorithms. Yet, as we shall see, the computer programs are themselves human inventions and, when applied uncritically by machines, produce errors of their own.

3. MASIE serves as Calibration for satellite products.

The NSIDC Background cites as support a study by Partington et al. (2003).  Reading that study, one finds that the authors preferred the ice chart data and said this:

“Passive microwave sensors from the U.S. Defense Meteorological Satellite Program have long provided a key source of information on Arctic-wide sea ice conditions, but suffer from some known deficiencies, notably a tendency to underestimate ice concentrations in summer. With the recent release of digital and quality controlled ice charts extending back to 1972 from the U.S. National Ice Center (NIC), there is now an alternative record of late twentieth century Northern Hemisphere sea ice conditions to compare with the valuable, but imperfect, passive microwave sea ice record.”

“This analysis has been based on ice chart data rather than the more commonly analyzed passive microwave derived ice concentrations. Differences between the NIC ice chart sea ice record and the passive microwave sea ice record are highly significant despite the fact that the NIC charts are semi-dependent on the passive microwave data, and it is worth noting these differences. . .In summer, the difference between the two sources of data rises to a maximum of 23% peaking in early August, equivalent to ice coverage the size of Greenland. (my bold)  For clarity: the ice chart data show higher extents than passive microwave data.

The differences are even greater for Canadian regions.

“More than 1380 regional Canadian weekly sea-ice charts for four Canadian regions and 839 hemispheric U.S. weekly sea-ice charts from 1979 to 1996 are compared with passive microwave sea-ice concentration estimates using the National Aeronautics and Space Administration (NASA) Team algorithm. Compared with the Canadian regional ice charts, the NASA Team algorithm underestimates the total ice-covered area by 20.4% to 33.5% during ice melt in the summer and by 7.6% to 43.5% during ice growth in the late fall.”

From: The Use of Operational Ice Charts for Evaluating Passive Microwave Ice Concentration Data, Agnew and Howell  http://www.tandfonline.com/doi/pdf/10.3137/ao.410405

More recently, Walter Meier, who is in charge of SII, and several colleagues compared SII and MASIE and published their findings in October 2015 (here).  The purpose of the analysis was stated thus:
Our comparison is not meant to be an extensive validation of either product, but to illustrate as guidance for future use how the two products behave in different regimes.

The abstract concludes:
Comparisons indicate that MASIE shows higher Arctic-wide extent values throughout most of the year, largely because of the limitations of passive microwave sensors in some conditions (e.g. surface melt). However, during some parts of the year, MASIE tends to indicate less ice than estimated by passive microwave sensors. These comparisons yield a better understanding of operational and research sea-ice data products; this in turn has important implications for their use in climate and weather models.

A more extensive comparison of MASIE from NIC and SII from NOAA is here.

4. MASIE continues a long history of Arctic Ice Charts.

Naval authorities have for centuries prepared ice charts for the safety of ships operating in the Arctic.  There are Russian, Danish, Norwegian, and Canadian charts, in addition to MASIE, the US version.  These estimates rely on multiple sources of data, including the NASA reports.  Charts are made with no climate ax to grind, only to get accurate locations and extents of Arctic ice each day.

Figure 16-3: Time series of April sea-ice extent in Nordic Sea (1864-1998) given by 2-year running mean and second-order polynomial curves. Top: Nordic Sea; middle: eastern area; bottom: western area (after Vinje, 2000). IPCC Third Assessment Report

Since these long-term records show a quasi-60 year cycle in ice extents, it is vital to have a modern dataset based on the same methodology, albeit with sophisticated modern tools.

Summary

Measuring anything in the Arctic is difficult, and especially sea ice that is constantly moving around.  It is a good thing to have independent measures using different methodologies, since any estimate is prone to error.

Please take the time to express your appreciation for NIC’s contribution and your support for their products at the MASIE home page.

Update February 4, 2017

In the comments Neven said MASIE was unusable because it was biased low before 2010 and high afterward.  I have looked into that and he is mistaken.  Below is the pattern that is observed most months.  March is the annual maximum and coming up soon.

march-masie-sii

As the graph shows, the two datasets were aligned through 2010, and then SII began underestimating ice extent, resulting in a negative 11-year trend.  MASIE shows the same fluctuations, but with higher extents and a slightly positive trend for March extents.  The satellite sensors have a hard time with mixed ice/water conditions (well-documented).
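For readers who want to reproduce such trend comparisons, here is a minimal sketch of how a March-extent trend line is typically estimated (ordinary least squares on annual values). The extent numbers below are made-up placeholders, not MASIE or SII data.

```python
# Hedged sketch: fit a linear trend to annual March sea-ice extents.
# The values are synthetic placeholders, not real MASIE or SII observations.
import numpy as np

years = np.arange(2006, 2021)
march_extent_mkm2 = 15.0 + 0.02 * (years - 2006) \
    + np.random.default_rng(1).normal(0, 0.2, years.size)  # fake series for illustration

slope, intercept = np.polyfit(years, march_extent_mkm2, 1)  # slope in M km2 per year
print(f"March trend: {slope:+.3f} M km2/yr over {years[0]}-{years[-1]}")
```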

More on the two datasets: NOAA has been Losing Arctic Ice

Ivermectin Invictus: The Unsung Covid Victor

Invictus Games Toronto 2017

The triumph for Ivermectin over Covid-19 is reviewed in biznews Studies on Ivermectin show positive results as Covid-19 treatment.  Excerpts in italics with my bolds.

The use of Ivermectin for the prevention and treatment of Covid-19 has been the subject of much debate. The World Health Organisation‘s recommendation against Ivermectin as an alternative treatment for Covid-19 is shrouded in suspicion as the WHO’s second biggest donor is the Bill and Melinda Gates Foundation (BMGF). Bill Gates also founded and funds The Vaccine Alliance (GAVI). The connection and clear conflict of interest is thus astounding. This 3,000 word synopsis, done by Rubin van Niekerk, is on Bryant’s peer reviewed meta analysis published in the American Journal of Therapeutics about 60 studies on the treatment impact of Ivermectin on Covid-19. Van Niekerk notes that:

“Ivermectin studies vary widely, which makes the consistently positive results even more remarkable.”

Ivermectin Meta Analysis Synopsis By Rubin van Niekerk*

Meta analysis of 60 studies on Ivermectin and Covid-19 by Bryant, published in the American Journal of Therapeutics. (Version 93 Updated 21/6/21)

This is a brief 3000-word synopsis of the analysis of all significant studies concerning the use of ivermectin for COVID-19. Search methods, inclusion criteria, effect extraction criteria (more serious outcomes have priority), all individual study data, PRISMA answers, and statistical methods are detailed. Random-effects meta-analysis results are presented for all studies, for studies within each treatment stage, for mortality results, for COVID-19 case results, for viral clearance results, for peer-reviewed studies, for Randomized Controlled Trials (RCTs), and after exclusions.

Please read the original 18 000-word comprehensive research analysis should you need more detail and insight into the methodology on Ivermectin for COVID-19: real-time meta analysis of 61 studies 

♦ Meta analysis using the most serious outcome reported shows 76% and 85% improvement for early treatment and prophylaxis (RR 0.24 [0.14-0.41] and 0.15 [0.09-0.25]), with similar results after exclusion based sensitivity analysis, restriction to peer-reviewed studies, and restriction to Randomized Controlled Trials.
81% and 96% lower mortality is observed for early treatment and prophylaxis (RR 0.19 [0.07-0.54] and 0.04 [0.00-0.58]). Statistically significant improvements are seen for mortality, ventilation, hospitalization, cases, and viral clearance. 28 studies show statistically significant improvements in isolation.

Ivermectin meta analysis

•The probability that an ineffective treatment generated results as positive as the 60 studies to date is estimated to be 1 in 2 trillion (p = 0.00000000000045). [A sketch of how a figure of this order can arise appears after these excerpts.]

•Heterogeneity arises from many factors including treatment delay, population, effect measured, variants, and regimens. The consistency of positive results is remarkable. Heterogeneity is low in specific cases, for example early treatment mortality.

•While many treatments have some level of efficacy, they do not replace vaccines and other measures to avoid infection. Only 27% of ivermectin studies show zero events in the treatment arm.

•Elimination of COVID-19 is a race against viral evolution. No treatment, vaccine, or intervention is 100% available and effective for all current and future variants. All practical, effective, and safe means should be used. Not doing so increases the risk of COVID-19 becoming endemic; and increases mortality, morbidity, and collateral damage.

Pillars Needed Missing

•Administration with food, often not specified, may significantly increase plasma and tissue concentration.

•The evidence base is much larger and has much lower conflict of interest than typically used to approve drugs.

•All data to reproduce this paper and sources are in the appendix. See [Bryant, Hariyanto, Hill, Kory, Lawrie, Nardelli] for other meta analyses, all with similar results confirming effectiveness.
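The bracketed note earlier in this list refers to the sketch below. It is not necessarily the meta-analysis’ exact calculation; it simply shows that a one-sided sign test, under the null hypothesis that an ineffective treatment is equally likely to produce a positive or negative study result, yields a probability of the quoted order when a large majority of studies are positive. The count of 56 positive studies out of 60 is an assumed number chosen for illustration.

```python
# Hedged illustration, not necessarily the paper's exact method: a one-sided
# sign test under the null that each study is equally likely to favor treatment
# or control.  The "56 of 60 positive" count is an assumption for illustration.
from math import comb

n, k = 60, 56
p_tail = sum(comb(n, i) for i in range(k, n + 1)) / 2**n
print(f"P(at least {k} of {n} studies positive by chance) = {p_tail:.1e}")  # ~4.5e-13
```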

Parasite Drug Analyzed as Possible Covid Treatment in U.K. Trial


Leftists Obsessed with Bogus Numbers

5-year-plan-in-four-years

Lubos Motl writes with insight gained from the Czech experience with imposed Communism in his blog article CO2 emissions, “cases”, … fanatical leftists love to worship meaningless quantities as measures of well-being.  Excerpts in italics with my bolds.

Leftists hate money and the conversion of things to money. Why is it so? In the old times, the leftists were the losers who didn’t have much money. The decision based on the “maximization of money” was a decision usually made by “some other people, e.g. the capitalists”, and those may have had different interests than the Marxist losers, and that’s why the Marxist losers generally didn’t like the decisions based on the maximization of the financial benefits. They had a low influence on the society’s decision making (because they were broke) and the interests of the capitalists weren’t always the same as the interests of the Marxist losers. (In reality, what was in the interest of the capitalists was ultimately good for the Marxist losers as well but the latter just didn’t understand it.)

That is the likely reason why the leftists always wanted to switch to some “more objective” measures of well-being. They saw all “subjective” (i.e. money-based) decisions as dominated by evil people, the class of enemies. Where did this leftist strategy lead?

Well, during the 40 years of communism in Czechoslovakia,
the communist party often mindlessly wanted to

maximize the production of coal and steel in tons.

Steel and coal are just two major examples that were used to “objectively measure the well-being”. You may see that within a limited context, there was a grain of truth in it. The more machines we make, the more hard work they may replace, and we need steel and coal for all those good things. But the range of validity of this reasoning was unavoidably very limited. They could have used the U.S. dollars (e.g. the total GDP, or in sustainable salaries) to measure the well-being (that should be maximized by the communist plans) but that would already be bad according to their ideology. Needless to say, it was a road to hell because in the long run, there is no reason why “tons of steel or coal” should be the same thing as “well-being” or “happiness”. And it’s not. We kept on producing lots of steel and coal that was already obsolete, that was helping to preserve technologies and industries that were no longer needed, helpful, or competitive, and the production of coal and steel substantially decreased after communism fell in 1989. We found out that we could get richer despite producing less steel and coal!

In 1989, communism was defeated and humiliated but almost all the communist rats survived. This collective trash has largely moved to the environmentalist movement that became a global warehouse for the Bolshevik human feces, also known as the watermelons. They are green on the surface but red (Bolsheviks) inside. They were willing to modify some details of their ideology or behavior but not the actual core substance. The detail that they modified was to “largely switch the sign” and consider the coal and steel to be evil.

Instead of maximizing steel and coal, the goal became to minimize the CO2 emissions.

The obsession with the CO2 emissions (which now carry the opposite sign: CO2 emissions are claimed to be bad!) is similar to the obsession of the Leninists and Stalinists with the maximization of the steel and coal production except that the current watermelons, the gr@tins of the world, are far more fanatical and unhinged than the Leninists and Stalinists have ever been. And one more thing has changed: these new, green Marxists promote these “objective measures of well-being” because it reduces the freedom, wealth, and power of everyone else. In that sense, they are still Marxists. However, they don’t protest against some people’s getting very rich as long as it is them. By this not so subtle change, we are facing a new class of Marxists who are still Marxists (more fanatical than the old ones) but who are often very rich, too. It is an extremely risky combination when such creatures become both powerful and rich.

Needless to say, the CO2 emissions aren’t the same thing as “evil”, and the reduction of the CO2 emissions is in no way the same thing as “well-being”. Instead, if you are at least a little bit rational, you know damn well that the CO2 emissions are totally obviously positively correlated with well-being. The more CO2, the better. CO2 is the gas we call life. Its increase by 50% since 1750 AD has allowed plants to have fewer pores (through which they suck CO2 from the air), which is why they lose less water and are better at water management (and at withstanding possible drought). Just the higher CO2 has increased agricultural yields per square kilometer by some 20% (greater increases were added by genetic engineering, the fight against pests etc.). And the man-made CO2 has freed us from back-breaking labor.


The obsession to minimize the CO2 emissions is completely irrational and insane, more insane than the maximization of steel and coal has ever been – but its advocates are more fanatical than the steel and coal comrades used to be. On top of that, most of the projects proposed to lower the CO2 emissions don’t even achieve that because there are always some neglected sources or sinks of CO2 (and lots of cheating everywhere, contrived public “causes” are the ideal environment for corruption, too). Also, the price of one ton of CO2 emissions is as volatile as the Bitcoin and depends on the caps that may be basically arbitrarily chosen by the rogue politicians.

Tons of CO2 are a different quantity to be extremized than tons of coal or steel. But the obsession to “mindlessly minimize or maximize these quantities” is exactly the same and builds on the leftists’ infinite hatred (often just pretended hatred, however) of money as an invention. The hatred of money is equivalent to the hatred of the “subjective conversion of costs and benefits to the same unit”. Leftists hate subjective considerations like that (which are equivalent to counting the costs and benefits in Czech crowns) because they hate “subjective thinking” in general. Well, they hate it because subjective thinking is the thinking of free people – i.e. people who aren’t politically obedient in general. They prefer “objective thinking”, i.e. an imbecile or a clique of imbeciles who are in charge, have total power over everybody, and tell everybody “what they should want and do”! When whole nations behave as herds of obedient sheep or other useless animals, the leftists are happy.

Such a general scheme is bound to lead to a decline of the society,
regardless of the detailed choice of the quantity that is worshiped
as the “objective measure of the human well-being”.

In 2020, the epoch of Covidism, to use the term of the Czech ex-president Václav Klaus, began. The most characteristic yet crazy quantity that the new leftist masters want to minimize (in this case, like the CO2 emissions, it “should be” minimized) is the number of “cases” of Covid-19, i.e. the number of positive PCR tests (or sometimes all tests, including Ag tests). From the beginning, this has been insane because most people who test PCR positive for Covid-19 aren’t seriously sick. A fraction is completely asymptomatic, and a great majority suffers through a very mild disease. On top of that, the number of positive tests depends on the number of people who are tested (because most positive people are unavoidably overlooked unless everyone is tested at least once a week); on the number of “magnifying” cycles in the PCR process; on the strategy used to pick the candidates for testing; and on lots of other things.

These are the reasons why it has been insane to be focused on the number of “cases” from 2020. But when the methodology to pick the people is constant, when the percentage of the positive tests is roughly kept constant, and when the virus doesn’t change, it becomes fair to use the number of “cases” as a measure of the total proliferation of the disease, Covid-19, in a nation or a population. However, there’s an even deeper problem, one that is related to the main topic of this essay:

Even when the testing frequency and techniques (including the selection) are constant, the number of cases may in no way be considered a measure of the well-being.

The reason is that “being PCR positive” is just a condition that increases the probability that one becomes sick or dies. And the number of deaths from Covid-19 is clearly a more important measure of the Covid-related losses than the number of cases – the filthy Coronazis love to obscure even elementary statements such as this one, however. The conversion factor e.g. from “cases” to “deaths” is the case fatality rate (CFR) and that is not a universal constant. This is particularly important in the case of the Indian “delta” variant of the virus because it also belongs among the common cold viruses. It is a coronavirus that causes a runny nose. This makes the disease much more contagious, like any common cold, especially in a totally non-immune, normally behaving urban population. On the other hand, the nose cleans the breathing organs rather efficiently and the disease is unlikely to seriously invade the lungs, where it really hurts. In fact, the runny nose indicates that this variant of the virus “likes” to play with cosmetic problems such as the runny nose; it is not even attracted to the lungs. The same comments apply to any of the hundreds of rhinoviruses, coronaviruses… that cause the common cold!

You may check the U.K. Covid graphs to see that despite the growing number of “cases” in recent weeks, the deaths are still near zero. The ratio of the two has decreased by more than one order of magnitude. A factor of 5 or so may be explained by the higher vaccination of the risk groups (older people); the remaining factor is due to the intrinsic lower case fatality rate of the delta variant. It is simply much lower than 0.1%, as every common cold virus is. That is much smaller than some 0.4% which is the expected fraction of the people in a civilized nation that die of Covid-19 (to make these estimates, I mainly use the Czech data which seem clean and I understand them extremely well: some 80% of Czechs have gone through Covid-19 and 0.3% of the population has died, so the case fatality rate must be around 0.4%).
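The arithmetic behind the “around 0.4%” figure can be checked directly from the article’s own round numbers; the sketch below just divides the author’s estimates and introduces no new data.

```python
# Back-of-envelope check of the CFR quoted above, using the article's own
# round numbers (these are the author's estimates, not official statistics).
infected_share = 0.80   # assumed share of the Czech population that has had Covid-19
died_share     = 0.003  # assumed share of the population that died of Covid-19

cfr = died_share / infected_share
print(f"Implied case fatality rate: {cfr:.2%}")   # ~0.38%, i.e. "around 0.4%"
```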

So the conversion factor from a “case” to a “death” may have dropped by a factor of 30 or more in the U.K., relative to the peak of the disease (the more classical variants of Covid-19). So it is just plain insane to pretend that “one case” is the same problem or “reduction of well-being” as “one case” half a year ago. The disease has turned into a common cold which is nearly harmless. But society has been totally hijacked by the moronic, self-serving, brutally evil leftists who have become powerful by socially preserving the (totally false) idea that “the number of cases is an important quantity that must be minimized for the society’s well-being”. It is not important at all. The number of cases means absolutely nothing today because almost all the U.K. cases are just examples of a common cold that happens to pass as “Covid” through a test because this is how the test was idiotically designed. Everyone who tries to minimize the number of cases as we know them today is a dangerously deluded psychopath and must be treated on par with war criminals, otherwise whole nations will be greatly damaged. The damage has already been grave but we face the risk of many years (like the 40 years of Czechoslovak communism) when a similar totally destructive way of thinking preserves itself by illegitimate tools that totally contradict even the most elementary Western values.

“Cases” mean nothing, especially when the character of the disease that is detected by the tests becomes vastly less serious. They mean even less than the “CO2 emissions” and even that favorite quantity of the moronic fanatical leftists hasn’t ever been a good measure of anything we should care about. Stop this insanity and treat the people “fighting to lower the cases” as war criminals right now. Thank you very much.
