No, Guardian, Ivermectin Not Discredited by Elgazzar Retraction


The hits against Ivermectin keep on coming.  Dr. Colleen Aldous and Dr. Warren Parker explain this latest smear campaign in their article Ivermectin — front-line doctors vs bureaucrats.  Excerpts in italics with my bolds.

Given the safety profile of Ivermectin, there is nothing to lose and there’s a good possibility of saving many lives and slowing the pandemic

The Ivermectin battle of ideologies on safety and efficacy pits a group of doctors who deal with dying patients every day against bureaucrat academic clinicians. These academic clinicians have dismissed all evidence, favouring a single, large randomised trial that is entirely appropriate for novel drug development but not for pandemics.

This is akin to a person suffering a heart attack and refusing to be taken to hospital in a Toyota, choosing to wait for a Rolls-Royce.

If science is pure, there should not really be a debate, but there is, and it’s purely on the interpretation of science. The Ivermectin meta-analyses have shown that subjectivity in science does happen, something the layperson is made to believe is not possible.

Unfortunately, scientific fraud has also muddied the picture on both sides of the Ivermectin divide. The Elgazzar Ivermectin study, which showed Ivermectin to be highly effective, has been removed from the preprint website for unethical scientific reporting. If this is found to be true it is unforgivable and the authors need to be dealt with.

I’ve no doubt that this will be used to discredit Ivermectin, but it is one of many trials showing efficacy and will be shown to have little weight in the meta-analyses. Just because one lawyer is guilty of corruption does not mean all lawyers are corrupt. In the same vein, a study published in the leading medical journal the Lancet showed that hydroxychloroquine as a treatment for Covid-19 was associated with an increased risk of death in patients hospitalised with the disease. However, it was found to be fraudulent and the Lancet was forced to retract the paper.

Bias can come in selecting studies to include in the analysis and the interpretation of the results. Ivermectin can be shown to work by a careful selection of studies that support it. It can be discredited by selecting studies that show it is ineffective.

The SA National Essential Medicines List Committee (NEMLC), which has published its methods on its website, has produced an in-house rapid review on Ivermectin, which continues to find that Ivermectin should not be used outside clinical trials. This review is not peer-reviewed. The scientific community emphasises the importance of peer-reviewed publication, but our regulatory authorities seem not to. To illustrate the degree of subjectivity, I was in a meeting with one of the authors of the Bryant paper and a NEMLC member. In the discussion the latter stated that while they are aware of the work done in the preprint paper, they disagree with it. Simple!

The methods used in the Ivermectin meta-analyses by Bryant et al are exact. They have a very low risk of bias in themselves. Meta-analyses pool data from several studies to report for a larger sample size than the studies themselves. The heterogeneity of the studies is addressed with rigorous methods to reduce the effect of bias from the individual studies. Bryant et al have careers in data and research analysis. They have prepared decision-to-treat recommendations for international and country-level health bodies.

Their analysis included 24 randomised controlled trials that showed both positive and negative outcomes. The recommendation, among others, is that with moderate certainty Ivermectin could reduce mortality by an average of 62%. Moderate certainty means there is a good chance it is effective to this level.
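The pooling that a meta-analysis performs can be illustrated with a minimal inverse-variance sketch. The study names and effect sizes below are hypothetical placeholders, not the actual trial data from Bryant et al., and a real analysis like theirs would use a random-effects model with heterogeneity diagnostics; this fixed-effect version only shows the basic mechanics of weighting each study by the precision of its estimate.

```python
import math

# Hypothetical (study, log risk ratio, standard error) triples --
# illustrative only, NOT the trials analysed by Bryant et al.
studies = [
    ("Trial A", math.log(0.40), 0.45),
    ("Trial B", math.log(0.55), 0.30),
    ("Trial C", math.log(1.10), 0.50),
]

# Fixed-effect inverse-variance pooling: weight each study by 1/SE^2,
# so more precise studies contribute more to the pooled estimate.
weights = [1.0 / se ** 2 for _, _, se in studies]
pooled_log_rr = sum(w * lrr for (_, lrr, _), w in zip(studies, weights)) / sum(weights)
pooled_se = math.sqrt(1.0 / sum(weights))

rr = math.exp(pooled_log_rr)
ci = (math.exp(pooled_log_rr - 1.96 * pooled_se),
      math.exp(pooled_log_rr + 1.96 * pooled_se))
print(f"Pooled RR = {rr:.2f}, 95% CI = ({ci[0]:.2f}, {ci[1]:.2f})")
```

Note how the third study, which points the other way, is pooled in rather than discarded; the certainty grading ("moderate certainty" in GRADE terms) is then a separate judgment layered on top of the pooled interval.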

From looking at their methods in their peer-reviewed publication I believe the selection and interpretation of results were unbiased and currently provide us with recommendations that are more than sufficient to validate the positive effects of Ivermectin for treating Covid-19.

Simply put, SA’s response is now guided by the recommendations of an in-house team over a peer-reviewed, rigorously prepared meta-analysis. The NEMLC document is the guidance observed by all health department facilities and also some private hospitals.

Concerning the Ivercor-Covid-19 trial, it’s a pity that all those who have stated this study is proof that Ivermectin doesn’t work did not read the paper in its entirety. The authors themselves declare in the limitations of their research that the doses given were low.

As the pandemic has progressed, experience on the ground has shown that Ivermectin is effective at higher doses. Initial recommended doses were low, having been informed by the dosages for anti-parasite treatment. Unfortunately, many trials that are now being run or are completed are using low doses based on earlier assumptions. Even the upcoming Oxford Principle trial of Ivermectin follows low dose regimes that may be insufficient to show effect.

The Lopez-Medina study in Colombia is also often cited as demonstrating that Ivermectin is ineffective. Yet it was so fraught with protocol violations that I would not have submitted the article for publication if I were the principal investigator.

The NEMLC has put the health of our people at risk by recommending against the use of Ivermectin even though it is legally available in SA for off-label use or in the compassionate use programme. Proper evidence-based medicine involves looking at all current evidence conscientiously, not just at a few trials.

During the latter half of the last century our ways of doing science have developed in times of stability and relative prosperity. However, we are in chaos now. We need new thinking. Those in authority are still pushing for their conventional methods for science, which insists that “reality must obey our models… otherwise reality cannot be correct”.

We need more than just a few clinical experts making decisions for our country now that we are hitting this third wave. I believe it is time to put together a multidisciplinary team to examine the arguments of those saying that the totality of evidence points to the necessity of making a Type 1 decision now: roll out Ivermectin.

Given the safety profile of Ivermectin, with nearly 4bn doses given since the 1980s, there is nothing to lose. At worst, it would be like taking an aspirin to ease pain for a bee sting. It won’t harm, but it may help.

If Ivermectin is used, there is a good possibility of saving many lives and slowing down the pandemic. But suppose we have to wait for that elusive large double-blind, randomised control trial (the Rolls-Royce) that will provide the ultimate certainty of the gold standard. In that case, there may be many thousands of unnecessary deaths still to come.

• Dr Aldous is a professor and healthcare scientist at the University of KwaZulu-Natal Medical School, where she runs the doctoral academy at the College of Health Sciences. She has published over 130 peer-reviewed articles in rated journals. Dr Parker, an international public health specialist, has worked in more than 20 countries on health and development concerns, with a focus on translating research into strategic policy.

Footnote:  The Bryant et al. meta-analysis study is discussed here:  Ivermectin Invictus: The Unsung Covid Victor

Why Can’t They See that HCQ or Ivermectin + nutritional supplements
is the missing public health pillar?


Climate Change Elevator Speech

On a recent post Judith Curry challenged commenters with this question: 

How would you explain the complexity and uncertainty surrounding climate change plus how we should respond (particularly with regards to CO2 emissions) in five minutes?

 The video was an impressive offering from John Shewchuk, and I thought it worth sharing here.

A Brief History of the Diversity Industry

Heather Mac Donald explains the origins and preoccupations of Diversity, Inclusion and Equity (DIE). Whoops, I mean Diversity, Equity and Inclusion (DEI), which is now an academic degree you can acquire.  Her Quillette article is Almost Four Decades After Its Birth, The Diversity Industry Thrives on Its Own Failures.

The diversity business originated in 1984, when R. Roosevelt Thomas, a Harvard business school graduate, founded the American Institute for Managing Diversity at Morehouse College. Corporations had been practicing affirmative action for years, but the women and minorities whom employers had hired to meet equal-opportunity obligations weren’t advancing up the career ladder in acceptable numbers. Thomas came up with a novel explanation. The problem wasn’t that preferentially admitted recruits were underqualified; the problem was that their supervisors didn’t know how to “manage diversity.” It was those supervisors who needed remedial training—lots of it—not the affirmative-action beneficiaries themselves.

Managerial expectations about merit and performance often reflected cultural prejudices, Thomas and the consultants who followed him insisted. “‘Qualifications’ is a code word in the business world with very negative connotations,” a consultant with the professional-services firm of Towers Perrin (as it was then called) said in 1993. If minorities don’t meet existing employment criteria, then corporations need to expand their definition of what it means to be employable, said Alan Richter, creator of the 1991 board game, The Diversity Game. Promptness, precision, and a cogent communications style were among the attributes that diversity advisors deemed likely expendable.

A lucrative new consulting practice was born, its growth driven by a constant churn in terminology. “Valuing diversity” was different from “managing diversity.” Each newly spawned phrase came with a cadre of high-priced tutors. Lewis Griggs currently offers video trainings in such subjects as “Communicating Across Differences,” “Supervising and Managing Differences,” and “Creating, Managing, Valuing, and Leveraging Diversity,” with each video purporting to contain specialized content appropriate for different parts of an organization.

“Diversity” was eventually joined by “inclusion.” “Equity” was then added, thus yielding today’s DEI (Diversity, Equity, and Inclusion) triumvirate (sometimes also going as “EDI”). The most cutting-edge organizations have lately appended a “B” (for Belonging), as at the Juilliard School in New York City. Distinguishing these terms is a core function of diversity training—and now, at Bentley, of diversity scholarship. The university’s new DEI major, the Chronicle of Higher Education reports, will help graduates understand the “nuances of and differences between diversity, equity, inclusion, and justice.”

Even by 1993, half of Fortune 500 companies had a designated diversity officer, and 40 percent of American companies had instituted diversity training. Diversity conferences were occurring regularly, attracting government and business attendees. And yet many reporters, academics, corporate consultants, and activists still insist that managers not only fail to “value diversity,” but remain complicit in creating a dangerous environment for women and racial minorities.

Example: Levi Strauss & Co., which was recognized on Forbes’s list of “Best Employers for Diversity” in 2019. The company itself boasts: “In the 1960s, we integrated our factories a decade before it was required by law. In the early 1980s, we joined the fight against HIV/AIDS early on. Furthermore, our president and CEO, Chip Bergh, was one of the first company leaders to join the CEO Action for Diversity & Inclusion™ [in 2017], and has been on the front lines of efforts to protect Dreamers knowing that diversity and inclusivity makes our company better and our country stronger (after all, Levi Strauss himself was an immigrant).”

And yet the situation for minority employees at Levi Strauss is still so dire that the company has been hosting racially segregated healing sessions with professional mental health experts. As the Washington Free Beacon recently reported, its chief executive for DEI is trying to provide a “safe space for employees to express themselves” without feeling “triggered.”

Bentley University itself has yet to yield dividends from its longstanding diversity efforts. The school has been “working for decades on issues, challenges, and opportunities” pertaining to diversity, according to its Office of Diversity and Inclusion. Over 900 faculty and administrators have attended two-day diversity retreats; numerous committees, departments, and offices have focused on improving the school’s “diversity climate.” Bentley even has its own diversity consulting outfit, the Center for Women and Business, which advises employees and managers on such diversity pitfalls as being a mere “performative ally” of oppressed colleagues (as opposed to an active ally).

And yet, despite this effort, a Bentley Racial Justice Task Force recently found that the campus still did not understand how “race and racism” operate at the university. So difficult is it to be a diverse member of Bentley that the task force, formed in July 2020, began with a moment of “restoration,” providing to all “those who had been traumatized” at the school a “time to heal” and a time to “process the pain of racial injustice.”

One of Bentley’s biggest failings, according to the task force, has been its “false confidence” in “objectivity and meritocracy.” These are the norms of a “historically and predominantly white institution (HWI/PWI),” per the task force members. Typical of HWIs/PWIs, Bentley does not pay sufficient attention to the “systemic inequality” that such white norms engender. Equally dismaying, many students and professors apparently would rather study subjects other than racism, the task force lamented, thereby betraying their “lack of understanding about why the study of race is critical to the creation of a full academic experience.”

Diversity industry proponents would argue that white supremacy is simply too ingrained in America’s institutions to be rooted out within a mere three to four decades of diversity work.

But another possible reason why diversity training has not met its stated goals is that the field is intellectually bankrupt: Its practitioners peddle empty verbiage to fix a problem that is largely imaginary. I asked Bentley’s press office what the difference is between “diversity, equity, and inclusion.” The answer was a dodge: “Rather than give students one particular view of diversity, equity, inclusion and justice, Bentley’s DEI major encourages students to compare and contrast approaches to diversity, equity, inclusion and justice from across disciplines and perspectives and show how they intersect with one another.” Other questions—how the school defines a “real discipline,” what are the core texts of this new discipline, and why Bentley’s decades of diversity work have not lessened the school’s purported racism—were ignored entirely.

Bentley sociologist Gary David says that “more and more studies have shown” that diversity training and DEI perspectives make “good business sense.” But this oft-asserted claim rests on a few studies of dubious experimental design, lacking control groups. The one thing diversity trainees reliably learn is how to answer post-training survey questions “in the way the training said they ‘should,’” reports sociologist Musa al-Gharbi. As for actually changing behaviors in a diversity-approved direction, the training is not only ineffective, it is often counterproductive, according to al-Gharbi.


Far from being institutionally racist, Bentley University, like virtually every other American college today, is filled with well-meaning adults who want all their students to succeed. Corporations, law firms, Big Tech, and government agencies are bending over backwards to hire and promote as many underrepresented minorities (i.e., blacks and Hispanics) as possible. If the number of those minorities in a college or business organization is not proportional to their population share, that underrepresentation is due first and foremost to the academic skills gap. Mention of the skills gap is taboo in diversity circles, but it is real—repeatedly documented by the National Assessment of Educational Progress exams, the SAT, the LSAT, the GREs, the GMAT, and the MCAT—and it is consequential.

Hiring based on any extraneous selection criterion inevitably lowers the average qualifications of the resulting employee group. Hiring based on race entails a particularly significant deviation from a meritocratic ideal, since the only reason why color-conscious hiring is implemented in the first place is that merit hiring often fails to produce a critical mass of black and Hispanic employees. In essence, the diversity conceit is a perpetual motion machine: If underqualified diversity hires are promoted out of diversity pressure, resentment and obfuscation follow. If they hit a glass ceiling, accusations of bias are inevitable. In either situation, a diversity consultant is waiting in the wings to teach managers that their expectations and standards are racist.

The increasing power of college diversity bureaucrats over academic affairs since the 1990s has been stunning. Diversity vice-chancellors oversee faculty hiring searches, mandate quotas regarding whom search committees may interview, and sometimes even mandate quotas regarding whom they must hire. Chief inclusion officers track departmental race and sex demographics, pressuring department chairs to correct diversity deficits. Associate provosts for diversity coordinate campaigns for required courses on identity and grievance within the curriculum. Deans of inclusion teach students to recognize their place on the great totem pole of victimization. Vice presidents for equity monitor campus speech, on the lookout for punishable microaggressions. Senior advisors on race and community lead crusades against faculty who have allegedly threatened the safety of campus victim groups through non-orthodox statements regarding race and sex.

Now that the fictions underpinning this enterprise are being enshrined as an academic discipline, the possibility that the university will return to its status as an institution dedicated to the unfettered search for knowledge—and even, dare one say it, objectivity and meritocracy—will grow yet more remote.


Inside the Sea Level Scare Machine


Such beach decorations exhibit the fervent belief of activists that sea levels are rising fast and will flood the coastlines if we don’t stop burning fossil fuels.  As we will see below there is a concerted effort to promote this notion empowered with slick imaging tools to frighten the gullible.  Of course there are frequent media releases sounding the alarms.  Recently for example:

From the Guardian Up to 410 million people at risk from sea level rises – study.  Excerpts in italics with my bolds.

The paper, published in Nature Communications, finds that currently 267 million people worldwide live on land less than 2 metres above sea level. Using a remote sensing method called Lidar, which pulses laser light across coastal areas to measure elevation on the Earth’s surface, the researchers predicted that by 2100, with a 1 metre sea level rise and zero population growth, that number could increase to 410 million people.

The climate emergency has caused sea levels to rise and more frequent and severe storms to occur, both of which increase flood risks in coastal environments.

Last year, a survey published by Climate and Atmospheric Science, which aggregated the views of 106 specialists, suggested coastal cities should prepare for rising sea levels that could reach as high as 5 metres by 2300, which could engulf areas home to hundreds of millions of people.

The rest of this post provides a tour of seven US cities demonstrating how the sea level scare machine promotes fear among people living or invested in coastal properties.  In each case there are warnings published in legacy print and tv media, visual simulations powered by computers and desktop publishing, and a comparison of imaginary vs. observed sea level trends.

Prime US Cities on the “Endangered” List
Newport, R.I.

Examples of Media Warnings

Bangor Daily News:  In Maine’s ‘City of Ships,’ climate change’s coastal threat is already here

Bath, the 8,500-resident “City of Ships,” is among the places in Maine facing the greatest risks from increased coastal flooding because so much of it is low-lying. The rising sea level in Bath threatens businesses along Commercial and Washington streets and other parts of the downtown, according to an analysis by Climate Central, a nonprofit science and journalism organization.

Water levels reached their highest in the city during a record-breaking storm in 1978 at a little more than 4 feet over pre-2000 average high tides, and Climate Central’s sea level team found there’s a 1-in-4 chance of a 5-foot flood within 30 years. That level could submerge homes and three miles of road, cutting off communities that live on peninsulas, and inundate sites that manage wastewater and hazardous waste along with several museums.

UConn Today:  Should We Stay or Should We Go? Shoreline Homes and Rising Sea Levels in Connecticut

As global temperatures rise, so does the sea level. Experts predict it could rise as much as 20 inches by 2050, putting coastal communities, including those in Connecticut, in jeopardy.

One possible solution is a retreat from the shoreline, in which coastal homes are removed to take them out of imminent danger. This solution comes with many complications, including reductions in tax revenue for towns and potentially diminished real estate values for surrounding properties. Additionally, it can be difficult to get people to volunteer to relocate their homes.

Computer Simulations of the Future

Newport Obs Imaged

Imaginary vs. Observed Sea Level Trends (2020 Update)

Newport past & projected 2020

Boston, Mass.

Example of Media Warnings

From WBUR Radio Boston:  Rising Sea Levels Threaten MBTA’s Blue Line

Could it be the end of the Blue Line as we know it? The Blue Line, which features a mile-long tunnel that travels underwater, and connects the North Shore with Boston’s downtown, is at risk as sea levels rise along Boston’s coast. To understand the threat sea-level rise poses to the Blue Line, and what that means for the rest of the city, we’re joined by WBUR reporter Simón Ríos and Julie Wormser, Deputy Director at the Mystic River Watershed Association.

As sea levels continue to rise, the Blue Line and the whole MBTA system face an existential threat. The MBTA is also facing a serious financial crunch, still reeling from the pandemic, as we attempt to fully reopen the city and the region. Joining us to discuss is MBTA General Manager Steve Poftak.

Computer Simulations of the Future

Boston Obs Imaged2

 

Imaginary vs. Observed Sea Level Trends (2020 Update)

Boston Past and Projected 2020

 

New York City

Example of Media Warnings

From Quartz: Sea level rise will flood the neighborhood around the UN building with two degrees warming

Right now, of every US city, New York City has the highest population living inside a floodplain. By 2100, seas could rise around the city by as much as six feet. Extreme rainfall is also predicted to rise, with roughly 1½ times more major precipitation events per year by the 2080s, according to a 2015 report by a group of scientists known as the New York City Panel on Climate Change.

But a two-degree warming scenario, which the world is on track to hit, could lock in dramatic sea level rise—possibly as much as 15 feet.

Computer Simulations of the Future

NYC Obs Imaged

Imaginary vs. Observed Sea Level Trends (2020 Update)

NYC past & projected 2020

Philadelphia, PA.

Example of Media Warnings

From NBCPhiladelphia:  Climate Change Studies Show Philly Underwater

NBC10 is looking at data and reading studies on climate change to showcase the impact. There are studies that show if the sea levels continue to rise at this rate, parts of Amtrak and Philadelphia International Airport could be underwater in 100 years.

Computer Simulations of the Future

Philly Obs Imaged

Imaginary vs. Observed Sea Level Trends (2020 Update)

Phil past & projected 2020

Miami, Florida

Examples of Media Warnings

From WLRN Miami: Miles Of Florida Roads Face ‘Major Problem’ From Sea Rise. Is State Moving Fast Enough?

One 2018 Department of Transportation study has already found that a two-foot rise, expected by mid-century, would imperil a little more than five percent — 250-plus miles — of the state’s most high-traffic highways. That may not sound like a lot, but protecting those highways alone could easily cost several billion dollars. A Cat 5 hurricane could be far worse, with a fifth of the system vulnerable to flooding. The impact to seaports, airports and railroads — likely to also be significant and expensive — is only now under analysis.

From Washington Post:  Before condo collapse, rising seas long pressured Miami coastal properties

Investigators are just beginning to try to unravel what caused the Champlain Towers South to collapse into a heap of rubble, leaving at least 159 people missing as of Friday. Experts on sea-level rise and climate change caution that it is too soon to speculate whether rising seas helped destabilize the oceanfront structure. The 40-year-old building was relatively new compared with others on its stretch of beach in the town of Surfside.

But it is already clear that South Florida has been on the front lines of sea-level rise and that the effects of climate change on the infrastructure of the region — from septic systems to aquifers to shoreline erosion — will be a management problem for years.

Computer Simulations of the Future

Florida Obs Imaged

Imaginary vs. Observed Sea Level Trends (2020 Update)

KW past & projected 2020

Houston, Texas

Example of Media Warnings

From Undark:  A $26-Billion Plan to Save the Houston Area From Rising Seas

As the sea rises, the land is also sinking: In the last century, the Texas coast sank about 2 feet into the sea, partly due to excessive groundwater pumping. Computer models now suggest that climate change will further lift sea levels somewhere between 1 and 6 feet over the next 50 years. Meanwhile, the Texas coastal population is projected to climb from 7 to 9 million people by 2050.

Protecting Galveston Bay is no simple task. The bay is sheltered from the open ocean by two low, sandy strips of land — Galveston Island and Bolivar Peninsula — separated by the narrow passage of Bolivar Roads. When a sufficiently big storm approaches, water begins to rush through that gap and over the island and peninsula, surging into the bay.

Computer Simulations of the Future

Galv Obs Imaged

Imaginary vs. Observed Sea Level Trends (2020 Update)

Galv past & projected 2020

San Francisco, Cal.

Example of Media Warnings

From San Francisco Chronicle:  Special Report: SF Bay Sea Level Rise–Hayward

Sea level rise is fueled by higher global temperatures that trigger two forces: Warmer water expands oceans while the increased temperatures hasten the melting of glaciers on Antarctica and Greenland and add yet more water to the oceans.

The California Ocean Protection Council, a branch of state government, forecasts a 1-in-7 chance that the average daily tides in the bay will rise 2 or more feet by 2070. This would cause portions of the marshes and bay trail in Hayward to be underwater during high tides. Add another 2 feet, on the higher end of the council’s projections for 2100, and they’d be permanently submerged. Highway 92 would flood during major storms. So would the streets leading into the power plant.

From San Francisco Chronicle Special Report: SF Bay Sea Level Rise–Mission Creek

Along San Francisco’s Mission Creek, sea level rise unsettles the waters.  Each section of this narrow channel must be tailored differently to meet an uncertain future. Do nothing, and the combination of heavy storms with less than a foot of sea level rise could send Mission Creek spilling over its banks in a half-dozen places, putting nearby housing in peril and closing the two bridges that cross the channel.

Whatever the response, we won’t know for decades if the city’s efforts can keep pace with the impact of global climatic forces that no local government can control.

Though Mission Creek is unique, the larger dilemma is one that affects all nine Bay Area counties.

Computer Simulations of the Future

SF Obs Imaged

Imaginary vs. Observed Sea Level Trends (2020 Update)

SF CA past & projected 2020

Summary: This is a relentless, high-tech communications machine raising all kinds of scary future possibilities, based upon climate model projections and the unfounded theory of CO2-driven global warming/climate change.  The graphs above are centered on the year 2000, so that the 21st-century added sea level rise is projected from that year forward.  In addition, we now have observations at tidal gauges for the first 20 years, 1/5 of the total period expected.  The gauges in each city are the ones with the longest continuous service record, and wherever possible the locations shown in the simulations are not far from the tidal gauge.  For example, NYC’s best gauge is at the Battery, and Fulton St. is also near the southern tip of Manhattan.

Already the imaginary rises are diverging greatly from observations, yet the chorus of alarm goes on.  In fact, the added rise to 2100 extrapolated from tidal gauges ranges from 6 to 9.5 inches, except for Galveston projecting 20.6 inches. Meanwhile the models imagined rises from 69 to 108 inches. Clearly coastal settlements must adapt to evolving conditions, but they also need reasonable rather than fearful forecasts for planning purposes.
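The arithmetic behind an "added rise to 2100" figure from a tidal gauge is a straight-line extrapolation of the observed trend. A minimal sketch, where the mm/yr rates are back-calculated illustrations chosen to reproduce the inch figures quoted above, not values taken from any specific gauge record:

```python
# Extrapolate an observed tidal-gauge trend linearly from 2000 to 2100.
MM_PER_INCH = 25.4

def added_rise_inches(trend_mm_per_yr, from_year=2000, to_year=2100):
    """Straight-line extrapolation of a gauge trend, converted to inches."""
    return trend_mm_per_yr * (to_year - from_year) / MM_PER_INCH

# A ~2.41 mm/yr trend (illustrative of the East Coast gauges) gives ~9.5 in
print(f"{added_rise_inches(2.41):.1f}")   # -> 9.5
# A ~5.23 mm/yr subsidence-boosted trend (Galveston-like) gives ~20.6 in
print(f"{added_rise_inches(5.23):.1f}")   # -> 20.6
```

By the same conversion, a model-imagined 69 to 108 inches by 2100 would require a sustained rate of roughly 17 to 27 mm/yr, which makes the gap between projection and gauge observation easy to quantify.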

Footnote:  The problem of urban flooding is discussed in some depth at a previous post Urban Flooding: The Philadelphia Story

Background on the current sea level campaign is at USCS Warnings of Coastal Floodings

And as always, an historical perspective is important:

post-glacial_sea_level

Still No Global Warming June 2021


The post below updates the UAH record of air temperatures over land and ocean.  But as an overview consider how recent rapid cooling has now completely overcome the warming from the last 3 El Ninos (1998, 2010 and 2016).  The UAH record shows that the effects of the last one are now gone as of April 2021. (UAH baseline is now 1991-2020).

UAH Global 1995to202104 w co2 overlay

For reference I added an overlay of CO2 annual concentrations as measured at Mauna Loa.  While temperatures fluctuated up and down, ending flat, CO2 went up steadily by ~55 ppm, a 15% increase.

Furthermore, going back to previous warmings prior to the satellite record shows that the entire rise of 0.8C since 1947 is due to oceanic cycles, not human activity.

 

gmt-warming-events

The animation is an update of a previous analysis from Dr. Murry Salby.  These graphs use Hadcrut4 and include the 2016 El Nino warming event.  The exhibit shows that since 1947 GMT warmed by 0.8C, from 13.9 to 14.7C, as estimated by Hadcrut4.  This resulted from three natural warming events involving ocean cycles. The most recent rise, 2013-16, lifted temperatures by 0.2C.  Previously the 1997-98 El Nino produced a plateau increase of 0.4C.  Before that, a rise from 1977-81 added 0.2C to start the warming since 1947.

Importantly, the theory of human-caused global warming asserts that increasing CO2 in the atmosphere changes the baseline and causes systemic warming in our climate.  On the contrary, all of the warming since 1947 was episodic, coming from three brief events associated with oceanic cycles. 

mc_wh_gas_web20210423124932

See Also Worst Threat: Greenhouse Gas or Quiet Sun?

June Update Ocean and Land Air Temps Continue Down

banner-blog

With apologies to Paul Revere, this post is on the lookout for cooler weather with an eye on both the Land and the Sea.  While you will hear a lot about 2020 temperatures matching 2016 as the highest ever, that spin ignores how fast the cooling is setting in.  The UAH data analyzed below show that warming from the last El Nino has now fully dissipated, with chilly temperatures setting in across all regions.  Last month, despite some warming in NH, temps in the Tropics and SH dropped sharply.

UAH has updated their tlt (temperatures in lower troposphere) dataset for June.  Previously I have done posts on their reading of ocean air temps as a prelude to updated records from HadSST3. This month also has a separate graph of land air temps, because the comparisons and contrasts are interesting as we contemplate possible cooling in coming months and years. Again last month, air over land warmed slightly while the oceans dropped further.

Note:  UAH has shifted their baseline from 1981-2010 to 1991-2020 beginning with January 2021.  In the charts below, the trends and fluctuations remain the same but the anomaly values change with the baseline reference shift.

Presently sea surface temperatures (SST) are the best available indicator of heat content gained or lost from earth’s climate system.  Enthalpy is the thermodynamic term for total heat content in a system, and humidity differences in air parcels affect enthalpy.  Measuring water temperature directly avoids distorted impressions from air measurements.  In addition, ocean covers 71% of the planet surface and thus dominates surface temperature estimates.  Eventually we will likely have reliable means of recording water temperatures at depth.

Recently, Dr. Ole Humlum reported from his research that air temperatures lag 2-3 months behind changes in SST.  Thus the cooling oceans now portend cooling land air temperatures to follow.  He also observed that changes in CO2 atmospheric concentrations lag behind SST by 11-12 months.  This latter point is addressed in a previous post Who to Blame for Rising CO2?

After a technical enhancement to HadSST3 delayed updates in Spring 2020, May resumed the pattern of HadSST updates toward the end of the following month.  For comparison we can look at lower troposphere temperatures (TLT) from UAHv6, which are now posted for June.  The temperature record is derived from microwave sounding units (MSU) on board satellites like the one pictured above. Recently there was a change in UAH processing of satellite drift corrections, including dropping one platform which can no longer be corrected. The graphs below are taken from the new and current dataset.

The UAH dataset includes temperature results for air above the oceans, and thus should be most comparable to the SSTs. There is the additional feature that ocean air temps avoid Urban Heat Islands (UHI).  The graph below shows monthly anomalies for ocean temps since January 2015.

UAH Oceans 202106

Note 2020 was warmed mainly by a spike in February in all regions, and secondarily by an October spike in NH alone. At the end of 2020, November and December ocean temps plummeted in NH and the Tropics. In January SH dropped sharply, pulling the Global anomaly down despite an upward bump in NH. An additional drop in March has SH matching the coldest in this period, and March drops in the Tropics and NH left those regions at their coldest since 01/2015.  In June 2021, despite an uptick in NH, the Global anomaly dropped back down due to a record low in SH along with Tropical cooling.

Land Air Temperatures Tracking Downward in Seesaw Pattern

We sometimes overlook that in climate temperature records, while the oceans are measured directly with SSTs, land temps are measured only indirectly.  The land temperature records at surface stations sample air temps at 2 meters above ground.  UAH gives tlt anomalies for air over land separately from ocean air temps.  The graph updated for June is below.

UAH Land 202106a

Here we have fresh evidence of the greater volatility of the Land temperatures, along with an extraordinary departure by SH land.  Land temps are dominated by NH with a 2020 spike in February, followed by cooling down to July.  Then NH land warmed with a second spike in November.  Note the mid-year spikes in SH winter months.  In December all of that was wiped out.

Then January 2021 showed a sharp drop in SH, but a rise in NH more than offset it, pulling the Global anomaly upward.  In February NH and the Tropics cooled further, pulling down the Global anomaly despite slight SH land warming.  March showed all regions roughly comparable to early 2015, prior to the 2016 El Nino.  Then in April NH land dropped sharply along with the Tropics, bringing the Global Land anomaly down by nearly 0.2C.  Now there is a remarkable divergence, with NH rising in May and June while SH drops sharply to a new low, along with Tropical cooling. With NH having most of the land mass, the Global land anomaly ticked upward.

The Bigger Picture UAH Global Since 1995

UAH Global 1995to202106

The chart shows monthly anomalies starting 01/1995 to present.  The average anomaly is 0.04, since this period is the same as the new baseline, lacking only the first 4 years.  1995 was chosen as an ENSO neutral year.  The graph shows the 1998 El Nino after which the mean resumed, and again after the smaller 2010 event. The 2016 El Nino matched 1998 peak and in addition NH after effects lasted longer, followed by the NH warming 2019-20, with temps now returning again to the mean.

TLTs include mixing above the oceans and probably some influence from nearby more volatile land temps.  Clearly NH and Global land temps have been dropping in a seesaw pattern, more than 1C lower than the 2016 peak.  Since the ocean has 1000 times the heat capacity of the atmosphere, that cooling is a significant driving force.  TLT measures started the recent cooling later than SSTs from HadSST3, but are now showing the same pattern.  It seems obvious that despite the three El Ninos, their warming has not persisted, and without them it would probably have cooled since 1995.  Of course, the future has not yet been written.

 

2021 Update: Fossil Fuels ≠ Global Warming

gas in hands

Previous posts addressed the claim that fossil fuels are driving global warming. This post updates that analysis with the latest (2020) numbers from BP Statistics and compares World Fossil Fuel Consumption (WFFC) with three estimates of Global Mean Temperature (GMT). More on both these variables below.

WFFC

2020 statistics are now available from BP for international consumption of Primary Energy sources. 2021 Statistical Review of World Energy. 

The reporting categories are:
Oil
Natural Gas
Coal
Nuclear
Hydro
Renewables (other than hydro)

Note:  British Petroleum (BP) now uses Exajoules to replace MToe (Million Tonnes of oil equivalents.) It is logical to use an energy metric which is independent of the fuel source. OTOH renewable advocates have no doubt pressured BP to stop using oil as the baseline since their dream is a world without fossil fuel energy.

From the BP conversion table, 1 exajoule (EJ) = 1 quintillion joules (1 x 10^18). Oil products vary from 41.6 to 49.4 gigajoules (10^9 joules) per tonne.  Comparing this annual report with previous years shows that global Primary Energy (PE) in MToe is roughly 24 times the same amount in Exajoules.  The conversion factor at the macro level varies from year to year depending on the fuel mix. The graphs below use the new metric.
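As a sketch of the unit arithmetic above, here is a minimal conversion between MToe and exajoules. One assumption is supplied by me, not by BP's table as quoted: the standard definition 1 tonne of oil equivalent (toe) = 41.868 GJ.

```python
# Sketch of the MToe <-> exajoule conversion discussed above.
# Assumption (not from the text): the standard definition
# 1 tonne of oil equivalent (toe) = 41.868 GJ.

JOULES_PER_TOE = 41.868e9   # joules in one tonne of oil equivalent
JOULES_PER_EJ = 1.0e18      # 1 exajoule = 10^18 joules

def mtoe_to_ej(mtoe: float) -> float:
    """Convert million tonnes of oil equivalent to exajoules."""
    return mtoe * 1.0e6 * JOULES_PER_TOE / JOULES_PER_EJ

def ej_to_mtoe(ej: float) -> float:
    """Convert exajoules to million tonnes of oil equivalent."""
    return ej * JOULES_PER_EJ / (1.0e6 * JOULES_PER_TOE)

print(mtoe_to_ej(1.0))   # 0.041868 EJ per MToe
print(ej_to_mtoe(1.0))   # ~23.9 MToe per EJ
```

This reproduces the "roughly 24 times" macro-level factor noted above; the year-to-year variation comes from the fuel mix, since coal and gas carry different calorific values than oil.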

This analysis combines the first three, Oil, Gas, and Coal for total fossil fuel consumption world wide (WFFC).  The chart below shows the patterns for WFFC compared to world consumption of Primary Energy from 1965 through 2020.

WFFC 2020

The graph shows that global Primary Energy (PE) consumption from all sources has grown continuously over 5 decades. Since 1965, oil, gas and coal (FF, sometimes termed “Thermal”) have averaged 89% of PE consumed, ranging from 94% in 1965 to 84% in 2019.  Note that last year, 2020, PE dropped 25 EJ (4%), to slightly below 2017 consumption.  WFFC for 2020 dropped 27 EJ (6%), to 83% of PE, matching 2013 WFFC consumption. For the 55-year period, the net changes were:

Oil 168%
Gas 506%
Coal 161%
WFFC 218%
PE 259%
Global Mean Temperatures

Everyone acknowledges that GMT is a fiction since temperature is an intrinsic property of objects, and varies dramatically over time and over the surface of the earth. No place on earth determines “average” temperature for the globe. Yet for the purpose of detecting change in temperature, major climate data sets estimate GMT and report anomalies from it.

The UAH record consists of satellite-era global temperature estimates for the lower troposphere, a layer of air from 0 to 4km above the surface. HadSST estimates sea surface temperatures from oceans covering 71% of the planet. HADCRUT combines HadSST estimates with records from land stations whose elevations range up to 6km above sea level.

Both GISS LOTI (land and ocean) and HADCRUT4 (land and ocean) use 14.0 Celsius as the climate normal, so I will add that number back into the anomalies. This is done not claiming any validity other than to achieve a reasonable measure of magnitude regarding the observed fluctuations.

No doubt global sea surface temperatures are typically higher than 14C, more like 17 or 18C, and of course warmer in the tropics and colder at higher latitudes. Likewise, the lapse rate in the atmosphere means that air temperatures both from satellites and elevated land stations will range colder than 14C. Still, that climate normal is a generally accepted indicator of GMT.
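The anomaly-to-magnitude adjustment described above is simply addition of the climate normal. A minimal sketch follows; the anomaly values in it are made up for illustration, not actual HADCRUT4 or GISS data.

```python
# Sketch of adding the 14.0 C climate normal back onto reported anomalies,
# as described above.  The anomaly values below are illustrative only.

CLIMATE_NORMAL_C = 14.0   # normal used by both GISS LOTI and HADCRUT4

def to_absolute(anomalies):
    """Convert a series of temperature anomalies (C) to rough absolute GMT."""
    return [CLIMATE_NORMAL_C + a for a in anomalies]

print(to_absolute([-0.1, 0.0, 0.8]))   # [13.9, 14.0, 14.8]
```

As the text says, this claims no validity beyond giving the fluctuations a reasonable sense of magnitude.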

Correlations of GMT and WFFC

The next graph compares WFFC to GMT estimates over the five decades from 1965 to 2020 from HADCRUT4, which includes HadSST3.

WFFC and Hadcrut 2020

Since 1965 the increase in fossil fuel consumption is dramatic and monotonic, steadily increasing by 218% from 146 to 463 exajoules.  Meanwhile the GMT record from Hadcrut shows multiple ups and downs with an accumulated rise of 0.9C over 55 years, 7% of the starting value.

The graph below compares WFFC to GMT estimates from UAH6, and HadSST3 for the satellite era from 1980 to 2020, a period of 40 years.

WFFC and UAH HadSST 2020

In the satellite era WFFC has increased by 82% since 1979, a compounded rate of about 1.5% per year (a simple average of 2% per year). At the same time, SST warming amounted to 0.52C, or 3.7% of the starting value, and UAH warming was 0.7C, or 5% up from 1979.  The temperature compounded rate of change is about 0.1% per year, an order of magnitude less than WFFC's.  Even more obvious is the 1998 El Nino peak and the flat GMT since.
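The growth rates in the paragraph above can be reproduced with the standard compound annual growth rate (CAGR) formula. This is a minimal sketch using the figures quoted in the text (+82% WFFC and +5% UAH over 1979 to 2020):

```python
# Reproducing the growth-rate comparison above with the standard
# compound annual growth rate (CAGR) formula.
# Inputs are the figures quoted in the text: +82% WFFC, +5% UAH, 1979-2020.

def cagr(total_growth: float, years: int) -> float:
    """Compound annual growth rate given total fractional growth over `years`."""
    return (1.0 + total_growth) ** (1.0 / years) - 1.0

years = 2020 - 1979                 # 41 years of the satellite era
wffc_rate = cagr(0.82, years)       # ~0.0147, i.e. ~1.5% per year
uah_rate = cagr(0.05, years)        # ~0.0012, i.e. ~0.12% per year
print(f"WFFC: {wffc_rate:.2%}/yr  UAH: {uah_rate:.2%}/yr")
```

The simple (non-compounded) average, 82/41 ≈ 2% per year, comes out a bit higher than the compounded rate; either way the temperature rate of change is an order of magnitude smaller.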

Summary

The climate alarmist/activist claim is straightforward: Burning fossil fuels makes measured temperatures warmer. The Paris Accord further asserts that by reducing human use of fossil fuels, further warming can be prevented.  Those claims do not bear up under scrutiny.

It is enough for simple minds to see that two time series are both rising and to think that one must be causing the other. But both scientific and legal methods assert causation only when the two variables are both strongly and consistently aligned. The above shows a weak and inconsistent linkage between WFFC and GMT.

Going further back in history shows even weaker correlation between fossil fuels consumption and global temperature estimates:

wfc-vs-sat

Figure 5.1. Comparative dynamics of the World Fuel Consumption (WFC) and Global Surface Air Temperature Anomaly (ΔT), 1861-2000. The thin dashed line represents annual ΔT, the bold line—its 13-year smoothing, and the line constructed from rectangles—WFC (in millions of tons of nominal fuel) (Klyashtorin and Lyubushin, 2003). Source: Frolov et al. 2009

In legal terms, as long as there is another equally or more likely explanation for the set of facts, the claimed causation is unproven. The more likely explanation is that global temperatures vary due to oceanic and solar cycles. The proof is clearly and thoroughly set forward in the post Quantifying Natural Climate Change.

Background context for today’s post is at Claim: Fossil Fuels Cause Global Warming.

June 2021 N. Atlantic Finally Cooling?

RAPID Array measuring North Atlantic SSTs.

For the last few years, observers have been speculating about when the North Atlantic will start the next phase shift from warm to cold.

Source: Energy and Education Canada

An example is this report in May 2015 The Atlantic is entering a cool phase that will change the world’s weather by Gerald McCarthy and Evan Haigh of the RAPID Atlantic monitoring project. Excerpts in italics with my bolds.

This is known as the Atlantic Multidecadal Oscillation (AMO), and the transition between its positive and negative phases can be very rapid. For example, Atlantic temperatures declined by 0.1ºC per decade from the 1940s to the 1970s. By comparison, global surface warming is estimated at 0.5ºC per century – a rate twice as slow.

In many parts of the world, the AMO has been linked with decade-long temperature and rainfall trends. Certainly – and perhaps obviously – the mean temperature of islands downwind of the Atlantic such as Britain and Ireland show almost exactly the same temperature fluctuations as the AMO.

Atlantic oscillations are associated with the frequency of hurricanes and droughts. When the AMO is in the warm phase, there are more hurricanes in the Atlantic and droughts in the US Midwest tend to be more frequent and prolonged. In the Pacific Northwest, a positive AMO leads to more rainfall.

A negative AMO (cooler ocean) is associated with reduced rainfall in the vulnerable Sahel region of Africa. The prolonged negative AMO was associated with the infamous Ethiopian famine in the mid-1980s. In the UK it tends to mean reduced summer rainfall – the mythical “barbeque summer”.

Our results show that ocean circulation responds to the first mode of Atlantic atmospheric forcing, the North Atlantic Oscillation, through circulation changes between the subtropical and subpolar gyres – the intergyre region. This is a major influence on the wind patterns and the heat transferred between the atmosphere and ocean.

The observations that we do have of the Atlantic overturning circulation over the past ten years show that it is declining. As a result, we expect the AMO is moving to a negative (colder surface waters) phase. This is consistent with observations of temperature in the North Atlantic.

Cold “blobs” in the North Atlantic have been reported, but they are usually a winter phenomenon. For example in April 2016, the SST anomalies looked like this

But by September, the picture changed to this

And we know from the Kaplan AMO dataset that 2016 summer SSTs were right up there with 1998 and 2010 as the highest recorded.

As the graph above suggests, this body of water is also important for tropical cyclones, since warmer water provides more energy.  But those are annual averages, and I am interested in the summer pulses of warm water into the Arctic. As I have noted in my monthly HadSST3 reports, most summers since 2003 there have been warm pulses in the north atlantic.

AMO June 2021
The AMO Index is from Kaplan SST v2, the unaltered, non-detrended dataset. By definition, the data are monthly average SSTs interpolated to a 5×5 grid over the North Atlantic, basically 0 to 70N.  The graph shows warming began after the 1970s, rising up to 1998, with a series of matching years since. Since 2016, June SSTs have backed down despite an upward bump in 2020. Because McCarthy refers to hints of cooling to come in the N. Atlantic, let’s take a closer look at some AMO years in the last 2 decades.

AMO decade 062021

This graph shows monthly AMO temps for some important years. The peak years were 1998, 2010 and 2016, with the latter emphasized as the most recent. The other years show lesser warming, with 2007 emphasized as the coolest in the last 20 years. Note the red 2018 line is at the bottom of all these tracks, and that 2020 tracked the 2016 highs, even exceeding those temps in the first 4 months.  Now 2021 is starting to track the much cooler 2018.

With all the talk of AMOC slowing down and a phase shift in the North Atlantic, we await SST measurements for July, August and September to confirm if cooling is starting to set in.

July 2021 Heat Records Silly Season Again

Photo illustration by Slate. Photos by Thinkstock.

A glance at the news aggregator shows the silly season is in full swing.  A partial listing of headlines today proclaiming the hottest whatever.

  • Last Month Was the Hottest June in North America in Recent Recorded History TIME
  • Global Warming Is Increasing the Likelihood of Frost Damage in Vineyards Martha Stewart
  • Heat records smashed in Moscow and Helsinki CGTN
  • Alberta glacial melt about 3 times higher than average during heat wave: expert The Weather Network
  • US and Canada heatwave ‘impossible’ without climate change, analysis shows Sky News
  • Heat waves caused warmest June ever in North America The Independent
  • Glacial melt: the European Alps New Age
  • The North American heatwave shows we need to know how climate change will change our weather Cyprus Mail
  • Global evidence links rise in extreme precipitation to human-driven climate change Phys.org
  • Drought-Stricken Western Districts Plan New Ways to Store Water Bloomberg
  • Amid record heat, Equilibrium Capital raises $1 billion for second greenhouse fund ImpactAlpha
  • Last Month Was Hottest June on Record in North America MTV Lebanon
  • Superior National Forest could provide refuge to wildlife as the climate warms Yale Climate Connections
  • How climate change is exacerbating record heatwaves The Telegraph
  • Lapland records hottest day for more than a century as heatwave grips region Sky News
  • Heatwave stokes North America’s warmest June on record The Raw Story

Time for some Clear Thinking about Heat Records (Previous Post)

Here is an analysis using critical intelligence to interpret media reports about temperature records this summer. Daniel Engber writes in Slate Crazy From the Heat

The subtitle is Climate change is real. Record-high temperatures everywhere are fake.  As we shall see from the excerpts below, the first sentence is a statement of faith, since, as Engber demonstrates, the notion does not follow from the temperature evidence. Excerpts in italics with my bolds.

It’s been really, really hot this summer. How hot? Last Friday, the Washington Post put out a series of maps and charts to illustrate the “record-crushing heat.” All-time temperature highs have been measured in “scores of locations on every continent north of the equator,” the article said, while the lower 48 states endured the hottest-ever stretch of temperatures from May until July.

These were not the only records to be set in 2018. Historic heat waves have been crashing all around the world, with records getting shattered in Japan, broken on the eastern coast of Canada, smashed in California, and rewritten in the Upper Midwest. A city in Algeria suffered through the highest high temperature ever recorded in Africa. A village in Oman set a new world record for the highest-ever low temperature. At the end of July, the New York Times ran a feature on how this year’s “record heat wreaked havoc on four continents.” USA Today reported that more than 1,900 heat records had been tied or beaten in just the last few days of May.

While the odds that any given record will be broken may be very, very small, the total number of potential records is mind-blowingly enormous.

There were lots of other records, too, lots and lots and lots—but I think it’s best for me to stop right here. In fact, I think it’s best for all of us to stop reporting on these misleading, imbecilic stats. “Record-setting heat,” as it’s presented in news reports, isn’t really scientific, and it’s almost always insignificant. And yet, every summer seems to bring a flood of new superlatives that pump us full of dread about the changing climate. We’d all be better off without this phony grandiosity, which makes it seem like every hot and humid August is unparalleled in human history. It’s not. Reports that tell us otherwise should be banished from the news.

It’s true the Earth is warming overall, and the record-breaking heat that matters most—the kind we’d be crazy to ignore—is measured on a global scale. The average temperature across the surface of the planet in 2017 was 58.51 degrees, one-and-a-half degrees above the mean for the 20th century. These records matter: 17 of the 18 hottest years on planet Earth have occurred since 2001, and the four hottest-ever years were 2014, 2015, 2016, and 2017. It also matters that this changing climate will result in huge numbers of heat-related deaths. Please pay attention to these terrifying and important facts. Please ignore every other story about record-breaking heat.

You’ll often hear that these two phenomena are related, that local heat records reflect—and therefore illustrate—the global trend. Writing in Slate this past July, Irineo Cabreros explained that climate change does indeed increase the odds of extreme events, making record-breaking heat more likely. News reports often make this point, linking probabilities of rare events to the broader warming pattern. “Scientists say there’s little doubt that the ratcheting up of global greenhouse gases makes heat waves more frequent and more intense,” noted the Times in its piece on record temperatures in Algeria, Hong Kong, Pakistan, and Norway.

Yet this lesson is subtler than it seems. The rash of “record-crushing heat” reports suggest we’re living through a spreading plague of new extremes—that the rate at which we’re reaching highest highs and highest lows is speeding up. When the Post reports that heat records have been set “at scores of locations on every continent,” it makes us think this is unexpected. It suggests that as the Earth gets ever warmer, and the weather less predictable, such records will be broken far more often than they ever have before.

But that’s just not the case. In 2009, climatologist Gerald Meehl and several colleagues published an analysis of records drawn from roughly 2,000 weather stations in the U.S. between 1950 and 2006. There were tens of millions of data points in all—temperature highs and lows from every station, taken every day for more than a half-century. Meehl searched these numbers for the record-setting values—i.e., the days on which a given weather station saw its highest-ever high or lowest-ever low up until that point. When he plotted these by year, they fell along a downward-curving line. Around 50,000 new heat records were being set every year during the 1960s; then that number dropped to roughly 20,000 in the 1980s, and to 15,000 by the turn of the millennium.

From Meehl et al 2009.

This shouldn’t be surprising. As a rule, weather records will be set less frequently as time goes by. The first measurement of temperature that’s ever taken at a given weather station will be its highest (and lowest) of all time, by definition. There’s a good chance that the same station’s reading on Day 2 will be a record, too, since it only needs to beat the temperature recorded on Day 1. But as the weeks and months go by, this record-setting contest gets increasingly competitive: Each new daily temperature must now outdo every single one that came before. If the weather were completely random, we might peg the chances of a record being set at any time as 1/n, where n is the number of days recorded to that point. In other words, one week into your record-keeping, you’d have a 1 in 7 chance of landing on an all-time high. On the 100th day, your odds would have dropped to 1 percent. After 56 years, your chances would be very, very slim.
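Engber's 1/n argument is easy to check with a quick simulation. This sketch assumes completely random (i.i.d.) daily temperatures with no trend, in which case the expected number of all-time records after n days is the harmonic number H_n = 1 + 1/2 + … + 1/n ≈ ln(n) + 0.577:

```python
import random

def count_records(n_days: int, rng: random.Random) -> int:
    """Count all-time-high records in a sequence of i.i.d. daily temps."""
    best = float("-inf")
    records = 0
    for _ in range(n_days):
        temp = rng.random()      # any continuous i.i.d. distribution works
        if temp > best:
            best = temp
            records += 1
    return records

def expected_records(n_days: int) -> float:
    """Expected record count for i.i.d. data: the harmonic number H_n."""
    return sum(1.0 / k for k in range(1, n_days + 1))

rng = random.Random(42)
n_days = 365 * 56        # 56 years of daily readings, as in Meehl's window
trials = 200
avg = sum(count_records(n_days, rng) for _ in range(trials)) / trials
# Both come out near 10.5: only about ten all-time records in 56 years.
print(round(avg, 1), round(expected_records(n_days), 1))
```

So for any single station, all-time records become vanishingly rare after a few decades of record-keeping, which is why the sheer size of the pool of potential records matters so much.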

The weather isn’t random, though; we know it’s warming overall, from one decade to the next. That’s what Meehl et al. were looking at: They figured that a changing climate would tweak those probabilities, goosing the rate of record-breaking highs and tamping down the rate of record-breaking lows. This wouldn’t change the fundamental fact that records get broken much less often as the years go by. (Even though the world is warming, you’d still expect fewer heat records to be set in 2000 than in 1965.) Still, one might guess that climate change would affect the rate, so that more heat records would be set than we’d otherwise expect.

That’s not what Meehl found. Between 1950 and 2006, the rate of record-breaking heat seemed unaffected by large-scale changes to the climate: The number of new records set every year went down from one decade to the next, at a rate that matched up pretty well with what you’d see if the odds were always 1/n. The study did find something more important, though: Record-breaking lows were showing up much less often than expected. From one decade to the next, fewer records of any kind were being set, but the ratio of record lows to record highs was getting smaller over time. By the 2000s, it had fallen to about 0.5, meaning that the U.S. was seeing half as many record-breaking lows as record-breaking highs. (Meehl has since extended this analysis using data going back to 1930 and up through 2015. The results came out the same.)
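Meehl's highs-to-lows asymmetry can be illustrated with a toy simulation. This is my own sketch, not Meehl's method: daily temperatures modeled as unit-variance Gaussian noise plus a modest linear warming trend, with running-max and running-min records counted over many trials.

```python
import random

def high_low_records(n_days, trend_per_day, rng):
    """Count all-time-high and all-time-low records in daily temps modeled
    as unit-variance Gaussian noise plus a linear warming trend."""
    hi, lo = float("-inf"), float("inf")
    n_hi = n_lo = 0
    for day in range(n_days):
        temp = rng.gauss(0.0, 1.0) + trend_per_day * day
        if temp > hi:
            hi, n_hi = temp, n_hi + 1
        if temp < lo:
            lo, n_lo = temp, n_lo + 1
    return n_hi, n_lo

rng = random.Random(0)
n_days = 365 * 56
trend = 1.0 / n_days    # warming worth one noise-sd over the whole period
highs = lows = 0
for _ in range(50):
    h, l = high_low_records(n_days, trend, rng)
    highs += h
    lows += l
# The trend suppresses record lows relative to record highs (ratio < 1).
print(lows / highs)
```

The exact ratio depends on how strong the assumed trend is relative to the noise; the point is qualitative. A warming baseline tilts the lows-to-highs ratio below 1, as Meehl observed, without making record highs common in absolute terms.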

What does all this mean? On one hand, it’s very good evidence that climate change has tweaked the odds for record-breaking weather, at least when it comes to record lows. (Other studies have come to the same conclusion.) On the other hand, it tells us that in the U.S., at least, we’re not hitting record highs more often than we were before, and that the rate isn’t higher than what you’d expect if there weren’t any global warming. In fact, just the opposite is true: As one might expect, heat records are getting broken less often over time, and it’s likely there will be fewer during the 2010s than at any point since people started keeping track.

This may be hard to fathom, given how much coverage has been devoted to the latest bouts of record-setting heat. These extreme events are more unusual, in absolute terms, than they’ve ever been before, yet they’re always in the news. How could that be happening?

While the odds that any given record will be broken may be very, very small, the total number of potential records that could be broken—and then reported in the newspaper—is mind-blowingly enormous. To get a sense of how big this number really is, consider that the National Oceanic and Atmospheric Administration keeps a database of daily records from every U.S. weather station with at least 30 years of data, and that its website lets you search for how many all-time records have been set in any given stretch of time. For instance, the database indicates that during the seven-day period ending on Aug. 17—the date when the Washington Post published its series of “record-crushing heat” infographics—154 heat records were broken.

 

That may sound like a lot—154 record-high temperatures in the span of just one week. But the NOAA website also indicates how many potential records could have been achieved during that time: 18,953. In actuality, less than one percent of these were broken. You can also pull data on daily maximum temperatures for an entire month: I tried that with August 2017, and then again for months of August at 10-year intervals going back to the 1950s. Each time the query returned at least about 130,000 potential records, of which one or two thousand seemed to be getting broken every year. (There was no apparent trend toward more records being broken over time.)

Now let’s say there are 130,000 high-temperature records to be broken every month in the U.S. That’s only half the pool of heat-related records, since the database also lets you search for all-time highest low temperatures. You can also check whether any given highest high or highest low happens to be a record for the entire month in that location, or whether it’s a record when compared across all the weather stations everywhere on that particular day.

Add all of these together and the pool of potential heat records tracked by NOAA appears to number in the millions annually, of which tens of thousands may be broken. Even this vastly underestimates the number of potential records available for media concern. As they’re reported in the news, all-time weather records aren’t limited to just the highest highs or highest lows for a given day in one location. Take, for example, the first heat record mentioned in this column, reported in the Post: The U.S. has just endured the hottest May, June, and July of all time. The existence of that record presupposes many others: What about the hottest April, May and June, or the hottest March, April, and May? What about all the other ways that one might subdivide the calendar?

Geography provides another endless well of flexibility. Remember that the all-time record for the hottest May, June, and July applied only to the lower 48 states. Might a different set of records have been broken if we’d considered Hawaii and Alaska? And what about the records spanning smaller portions of the country, like the Midwest, or the Upper Midwest, or just the state of Minnesota, or just the Twin Cities? And what about the all-time records overseas, describing unprecedented heat in other countries or on other continents?

Even if we did limit ourselves to weather records from a single place measured over a common timescale, it would still be possible to parse out record-breaking heat in a thousand different ways. News reports give separate records, as we’ve seen, for the highest daily high and the highest daily low, but they also tell us when we’ve hit the highest average temperature over several days or several weeks or several months. The Post describes a recent record-breaking streak of days in San Diego with highs of at least 83 degrees. (You’ll find stories touting streaks of daily highs above almost any arbitrary threshold: 90 degrees, 77 degrees, 60 degrees, et cetera.) Records also needn’t focus on the temperature at all: There’s been lots of news in recent weeks about the fact that the U.K. has just endured its driest-ever early summer.

“Record-breaking” summer weather, then, can apply to pretty much any geographical location, over pretty much any span of time. It doesn’t even have to be a record—there’s an endless stream of stories on “near-record heat” in one place or another, or the “fifth-hottest” whatever to happen in wherever, or the fact that it’s been “one of the hottest” yadda-yaddas that yadda-yadda has ever seen. In the most perverse, insane extension of this genre, news outlets sometimes even highlight when a given record isn’t being set.

Loose reports of “record-breaking heat” only serve to puff up muggy weather and make it seem important. (The sham inflations of the wind chill factor do the same for winter months.) So don’t be fooled or flattered by this record-setting hype. Your summer misery is nothing special.

Summary

This article helps people not to confuse weather events with climate.  My disappointment is with the phrase, “Climate Change is Real,” since it is subject to misdirection.  Engber uses that phrase referring to rising average world temperatures, without explaining that such estimates are computer-processed reconstructions, since the earth has no “average temperature.”  More importantly, the undefined “climate change” is a blank slate to which a number of meanings can be attached.

Some take it to mean: It is real that rising CO2 concentrations cause rising global warming.  Yet that is not supported by temperature records.
Others think it means: It is real that using fossil fuels causes global warming.  This too lacks persuasive evidence.
WFFC and Hadcrut 2018

Over the last five decades the increase in fossil fuel consumption has been dramatic and monotonic, rising 234% from 3.5B to 11.7B oil-equivalent tons. Meanwhile the GMT record from Hadcrut shows multiple ups and downs, with an accumulated rise of 0.74C over 53 years, about 5% of the starting value.
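The percentage figures above can be verified with a few lines of arithmetic. The ~14.8C global baseline used to recover the “5% of the starting value” claim is my assumption; the text gives only the percentage:

```python
# Check the growth figures quoted above.
fuel_start, fuel_end = 3.5, 11.7  # billion oil-equivalent tons
fuel_growth_pct = (fuel_end - fuel_start) / fuel_start * 100
print(f"Fossil fuel consumption: +{fuel_growth_pct:.0f}%")  # +234%

temp_rise_c = 0.74   # accumulated HadCRUT rise over 53 years
baseline_c = 14.8    # assumed global mean baseline (not stated in the text)
temp_rise_pct = temp_rise_c / baseline_c * 100
print(f"Temperature: +{temp_rise_pct:.0f}% of the assumed baseline")  # +5%
```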

Others know that Global Mean Temperature is a slippery calculation subject to the selection of stations.

Graph showing the correlation between Global Mean Temperature (Average T) and the number of stations included in the global database. Source: Ross McKitrick, U of Guelph

Global warming estimates combine results from adjusted records.
Conclusion

The pattern of high and low records discussed above is consistent with natural variability rather than rising CO2 or fossil fuel consumption. Those of us not alarmed about the reported warming understand that “climate change” is something nature does all the time, and that the future is likely to include periods both cooler and warmer than now.

Background Reading:

The Climate Story (Illustrated)

2020 Update: Fossil Fuels ≠ Global Warming

Man Made Warming from Adjusting Data

What is Global Temperature? Is it warming or cooling?


Be Afraid of Covid Variants. Be very Afraid.


This campfire ghost story has been around since last March, but is revived and promoted now to maintain public anxiety and perpetuate the medical-industrial complex. Some background and background resources are in Daniel Horowitz’s Blaze article The Delta deception: New COVID variant might be less deadly.  Excerpts in italics with my bolds.

“This COVID variant will be the one to really get us. No, it’s this one. Well, Alpha, Beta, and Gamma weren’t a problem, but I promise you ‘the Delta’ spells the end of civilization.”

That is essentially the panic porn dressed up as science that we have been treated to ever since the virus declined in January following the winter spread, which appears to have given us a great deal of herd immunity. Despite the advent of the British and South African variants, cases, not to mention fatalities, have continued to plummet in all of the places where those variants were supposedly common. Which is why they are now repeating the same mantra about the “Delta” variant from India.

The headlines are screaming with panic over the impending doom of the Delta variant hitting the U.S.:

“WHO says delta is the fastest and fittest Covid variant and will ‘pick off’ most vulnerable” (CNBC)
“Highly contagious Delta variant could cause next COVID-19 wave: ‘This virus will still find you'” (CBS)
“Delta Variant Gains Steam in Undervaccinated U.S. Counties” (Bloomberg)
“The Delta variant might pose the biggest threat yet to vaccinated people” (Business Insider)
And of course, no list would be complete without the headline from Dr. Fauci yesterday, stating that the “Delta variant is the greatest threat in the US.”

The implication from these headlines is that this variant is somehow truly more transmissible and deadly (as the previous variants were falsely portrayed to be), that it escapes natural immunity and possibly the vaccine, and that therefore, paradoxically, you must get vaccinated and continue doing all the things that failed to work for the other variants!

After each city and country began getting ascribed its own “variant,” I think the panic merchants realized that the masses would catch on to the variant scam, so they decided to rename them Alpha (British), Beta (South African), Gamma (Brazilian), and Delta (Indian), which sounds more like a hierarchy of progression and severity rather than each region simply getting hit when it’s in season until the area reaches herd immunity.

However, if people would actually look at the data, they’d realize that the Delta variant is actually less deadly. These headlines are able to gain momentum only because of the absurd public perception that somehow India got hit worse than the rest of the world. In reality, India has one-seventh the death rate per capita of the U.S.; it’s just that India got the major winter wave later, when the Western countries were largely done with it, thereby giving the illusion that India somehow suffered worse. Now, the public health Nazis are transferring their first big lie about what happened in India back to the Western world.

Fortunately, the U.K. government has already exposed these headlines as a lie, for those willing to take notice. On June 18, Public Health England published its 16th report on “SARS-CoV-2 variants of concern and variants under investigation in England,” this time grouping the variants by Greek letters.

Table from the PHE report showing confirmed cases and deaths by variant.

As you can see, the Delta variant has a 0.1% case fatality rate (CFR) out of 31,132 Delta sequence infections confirmed by investigators. That is the same rate as the flu and is much lower than the CFR for the ancestral strain or any of the other variants. And as we know, the CFR is always higher than the infection fatality rate (IFR), because many of the mildest and asymptomatic infections go undocumented, while the confirmed cases tend to have a bias toward those who are more evidently symptomatic.
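As a sketch of the CFR arithmetic in that paragraph: the death count of 31 below is back-derived from the stated 0.1% rate rather than copied from the report, and the 3x undercount factor is purely hypothetical:

```python
# Case fatality rate (CFR): deaths divided by *confirmed* cases.
confirmed_cases = 31_132   # Delta sequence infections (PHE report)
deaths = 31                # back-derived from the stated ~0.1% CFR
cfr = deaths / confirmed_cases * 100
print(f"CFR = {cfr:.2f}%")  # ~0.10%

# Infection fatality rate (IFR): deaths divided by *all* infections,
# including undocumented mild and asymptomatic ones. Any undercount
# factor > 1 makes the IFR lower than the CFR.
undercount = 3             # hypothetical infections per confirmed case
ifr = deaths / (confirmed_cases * undercount) * 100
print(f"IFR (assuming {undercount}x undercount) = {ifr:.3f}%")
```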

In other words, Delta is literally the flu, with a CFR identical to it. This is exactly what every respiratory pandemic in history has done: morphed into a more transmissible and less virulent form that crowds out the other mutations, since catching the dominant strain confers immunity against the rest. Nothing about masks, lockdowns, or experimental shots did this. To the extent this variant really is more transmissible, it is going to be less deadly, as is the case with the common cold. To the extent that there are areas below the herd immunity threshold (for example, Scotland and the northwestern parts of the U.K.), they will likely get the Delta variant (until something else supplants it), but fatalities will continue to go down.

According to the above-mentioned report, the Delta variant represented more than 75% of all cases in the U.K. since mid-May. If it really was that deadly, it should have been wreaking havoc over the past few weeks.

Chart from the PHE report showing U.K. hospitalization rates as Delta became the dominant strain.

Hospitalization rates plummeted throughout April and May in almost perfect inverse relationship with the Delta variant's rise to dominance in England. Some areas might see a slight oscillation from time to time as herd immunity fills in, regardless of which variant is floating around. However, the death burden is well below that of a flu season, and this is no longer an epidemic.

Thus, the good news is that now that most countries have reached a large degree of herd immunity, there is zero threat of hospitals being overrun by any seasonal increase in various areas, no matter the variant. The bad news is that after Delta, there are Epsilon and 19 other letters of the Greek alphabet, which will enable the circuitous cycle of misinformation, fear, panic, and control to continue. And remember, as there is already a “Delta+,” the options are endless until our society finally achieves immunity to COVID panic porn.

Footnote:  The last paragraph was prescient:  Now the TV is blathering about the “lambda” variant from Peru.

SEC Warned Off Climate Disclosures


David Burton writes to the Securities and Exchange Commission explaining the hazards it would needlessly take on should it proceed to impose Climate Change Disclosures on publicly traded enterprises. His document, sent to the SEC Chairman, is entitled Re: Comments on Climate Disclosure. Excerpts in italics with my bolds.

Summary of Key Points

1. Climate Change Disclosure Would Impede the Commission’s Important Mission.

The important mission of the U.S. Securities and Exchange Commission is to protect investors, maintain fair, orderly, and efficient markets, and facilitate capital formation. Mandatory climate change disclosure would impede rather than further that mission. It would affirmatively harm investors, impede capital formation and do nothing to improve the efficiency of capital markets.

2. Immaterial Climate Change “Disclosure” Would Obfuscate Rather than Inform.

The concept of materiality has been described as the cornerstone of the disclosure system established by the federal securities laws. Disclosure of material climate-related information is already required under ordinary securities law principles and Regulation S-K. Mandatory “disclosure” of immaterial, highly uncertain, highly disputable information would obfuscate rather than inform. It would harm rather than help investors.

3. Climate Models and Climate Science are Highly Uncertain.

There is a massive amount of variance among various climate models and uncertainty regarding the future of the climate.

4. Economic Modeling of Climate Change Effects is Even More Uncertain.

There is an even higher degree of variance and uncertainty associated with attempts to model or project the economic impact of highly divergent and uncertain climate models. Any estimate of the economic impact of climate change would have to rely on the highly uncertain and divergent climate model results discussed below, to which an entirely new family of economic ambiguity and uncertainty would be added. Any such economic estimate would require the following:

A choice of discount rate to arrive at the present discounted value of the future costs and benefits of climate change and of various regulatory or private responses. The choice of discount rate is controversial and important.
Estimates of the cost of various aspects of climate change (sea level rise, the impact on agriculture, etc.).
Estimates of the cost of various remediation techniques.
Guesses about the rate of technological change.
Guesses about the regulatory, tax and other responses of a myriad of governments.
Estimates, using conventional economic techniques, of the economic impact of those changes, which in turn reflect a wide variety of techniques and in many cases a thin or non-existent empirical literature.
Guesses about market responses to all of these changes, since market participants will not stand idly by and do nothing as markets and the regulatory environment change.

Then, after making decisions regarding all of these extraordinarily complex, ambiguous and uncertain issues, issuers would need to assess the likely impact of climate change on their specific business years into the future, a business that may by then bear little resemblance to the issuers' existing business.
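The sensitivity to the discount-rate choice alone can be illustrated with a toy present-value calculation. The $100 billion cost and the 50- and 100-year horizons below are invented for illustration and do not come from the letter:

```python
# Toy illustration: the chosen discount rate dominates any present-value
# estimate of costs projected far into the future.

def present_value(future_cost: float, rate: float, years: int) -> float:
    """Discount a single future cost back to today's dollars."""
    return future_cost / (1 + rate) ** years

cost = 100e9  # hypothetical $100 billion cost incurred in the future
for years in (50, 100):
    for rate in (0.01, 0.03, 0.07):
        pv = present_value(cost, rate, years)
        print(f"{years:>3} yrs at {rate:.0%}: ${pv / 1e9:,.1f}B today")

# At a 100-year horizon, moving the rate from 1% to 7% shrinks the
# present value by a factor of roughly 300, before any climate or
# economic uncertainty is even considered.
```

The same arithmetic applies symmetrically to projected benefits, which is why the choice of discount rate is described as both controversial and important.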

Then, the Commission would need to assess the veracity of the issuers’ “disclosure” based on this speculative house of cards. The idea that all of this can be done in a way that will meaningfully improve investors’ decision making is not credible.

5. The Commission Does Not Possess the Expertise to Competently Assess Climate Models or the Economic Impact of Climate Change.

The Commission has neither the expertise to assess climate models nor the expertise to assess economic models purporting to project the economic impact of divergent and uncertain climate projections.

6. The Commission Has Neither the Expertise nor the Administrative Ability to Assess the Veracity of Issuer Climate Change Disclosures.

The Commission does not have the expertise or administrative ability to assess the veracity, or lack thereof, of issuer “disclosures”  based on firm-specific speculation regarding the impact of climate change which would be based on firm-specific choices regarding highly divergent and uncertain economic models projecting the economic impact of climate changes based on firm-specific choices regarding highly divergent and uncertain climate models.

7. Commission Resources Are Better Spent Furthering Its Mission.

Imposing these requirements and developing the expertise to police such climate disclosure by thousands of issuers will involve the expenditure of very substantial resources. These resources would be much better spent furthering the Commission’s important mission.

8. The Costs Imposed on Issuers Would be Large.

Requiring all public companies to develop climate modeling expertise, the ability to make macroeconomic projections based on these models and then make firm-specific economic assessments based on these climate and economic models will be expensive, imposing costs that will amount to billions of dollars on issuers. These expenses would harm investors by reducing shareholder returns.

9. Climate Change Disclosure Requirements Would Further Reduce the Attractiveness of Becoming a Public Company, Harming Ordinary Investors and Entrepreneurial Capital Formation.

Such requirements would further reduce the attractiveness of being a registered, public company. They would exacerbate the decline in the number of public companies and the trend of companies going public later in their life cycle. This, in turn, would deny to ordinary (unaccredited) investors the opportunity to invest in dynamic, high-growth, profitable companies until most of the money has already been made by affluent accredited investors. It would further impede entrepreneurial access to public capital markets.

10. Climate Change Disclosure Requirements Would Create a New Compliance Eco-System and a New Lobby to Retain the Requirements.

The imposition of such requirements would result in the creation of a new compliance eco-system and pro-complexity lobby composed of the economists, accountants, attorneys and compliance officers that live off of the revised Regulation S-K.

11. Climate Change Disclosure Requirements Would Result in Much Litigation.

The imposition of such requirements would result in much higher litigation risk and expense as private lawsuits are filed challenging the veracity of climate disclosures. These lawsuits are virtually assured since virtually no climate models have accurately predicted future climate, and the economic and financial projections based on these climate models are even more uncertain. Litigation outcomes would be as uncertain as the underlying climate science, economics and the associated financial projections. This would harm investors and entrepreneurial capital formation.

12. Material Actions by Management in Furtherance of Social and Political Objectives that Reduce Returns must be Disclosed.

Many environmentally constructive corporate actions will occur in the absence of any government mandate or required disclosure. For example, energy conservation measures may reduce costs as well as emissions. No new laws or regulations are necessary to induce firms to take these actions. Assuming they are not utterly pointless, climate change disclosure laws presumably would be designed to induce management to take action that they would not otherwise take. To the extent management takes material actions in furtherance of social and political objectives (including ESG objectives) that reduce shareholder returns, whether induced by climate change disclosure requirements or taken for other reasons, they need to disclose that information. The Commission should ensure that they do so. Absent some drastic change in the underlying law by Congress, this principle would apply to any reduction in returns whether induced by ESG disclosures (climate change related or otherwise) or taken by management on its own initiative to achieve social and political objectives.

13. Fund Managers Attempts to Profit from SRI at the Expense of Investors Should be Policed.

Fund management firms are generally compensated from either sales commissions (often called loads) or investment management fees that are typically based on assets under management. Their compensation is not closely tied to performance. Thus, these firms will often see a financial advantage in selling “socially responsible” products that perform no better and often worse than conventional investments. It is doubtful that this is consistent with Regulation BI. Their newfound interest in socially responsible investing should be taken with the proverbial grain of salt. The Commission should monitor their efforts to profit from SRI at the expense of investors.

14. Duties of Fund Managers Should be Clarified.

The extreme concentration in the proxy advisory and fund management business is cause for concern. As few as 20 firms may exercise effective control over most public companies. The Commission should make it clear that investment advisers managing investment funds, including retirement funds or accounts, have a duty to manage those funds and to vote the shares held by the funds in the financial, economic or pecuniary interest of the millions of small investors that invest in, or are beneficiaries of, those funds and that the funds may not be managed to further the managers’ preferred political or social objectives.

15. Securities Laws are a Poor Mechanism to Address Externalities.

Externalities, such as pollution, should be addressed by either enhancing property rights or, in the case of unowned resources such as the air and waterways, by a regulatory response that carefully assesses the costs and benefits of that response. Securities disclosure is the wrong place to try to address externalities. Policing externalities is far outside the scope of the Commission's mission and the purpose of the securities laws.

16. Climate Change Disclosure Requirements Would Have No Meaningful Impact on the Climate.

When all is said and done, climate change disclosure requirements will have somewhere between a trivial impact and no impact on climate change.

17. Efforts to Redefine Materiality or the Broader Purpose of Business Should be Opposed.

Simply because some politically motivated investors seek to impose a disclosure requirement on issuers does not make such a requirement material. The effort to redefine materiality in the securities laws is part of an increasingly strident effort to redefine the purpose of businesses more generally to achieve various social or political objectives unrelated to earning a return, satisfying customers, or treating workers or suppliers fairly. This is being done under the banner of social justice; corporate social responsibility (CSR); stakeholder theory; environmental, social and governance (ESG) criteria; socially responsible investing (SRI); sustainability; diversity; business ethics; common-good capitalism; or corporate actual responsibility. The social costs of ESG and broader efforts to repurpose business firms will be considerable. Wages will decline or grow more slowly, firms will be less productive and less internationally competitive, investor returns will decline, innovation will slow, goods and services quality will decline and their prices will increase.

18. ESG Requirements Will Make Management Even Less Accountable.

In large, modern corporations there is a separation of ownership and control. There is a major agent/principal problem because management and the board of directors often, to varying degrees, pursue their own interests rather than the interests of shareholders. Profitability is, however, a fairly clear measure of the success or failure of management and the board. If a firm becomes unprofitable or lags considerably in profitability, the board may well replace management, shareholders may replace the board, or another firm may attempt a takeover. Systematic implementation of regulatory ESG or CSR requirements will make management dramatically less accountable, since such requirements will come at the expense of profitability and the metrics for success or failure in achieving ESG or CSR requirements will be largely unquantifiable. For that matter, ESG or CSR requirements themselves tend to be amorphous and ever-changing.
