Wacky “War on Nature”

The usual suspects reported U.N. Secretary-General Antonio Guterres announcing that humans are waging war on nature.  For example, the NY Daily News (in italics with my bolds):

The United Nations is calling on people worldwide to stop “waging war on nature” as the planet achieves disturbing milestones in the battle against climate change.

In a speech at Columbia University, UN Secretary-General Antonio Guterres said, “The state of the planet is broken. … This is suicidal,” The Associated Press reported Wednesday.

Guterres pointed to “apocalyptic fires and floods, cyclones and hurricanes” that have only become more frequent in recent years, and in particular, during 2020, one of the three hottest years on record.

“Human activities are at the root of our descent towards chaos,” he explained, noting this also means humans are the ones who “can solve it.”

John Osbourne explains how absurd this latest meme is in his Real Markets article The Utterly Nonsensical View That Humanity Is Waging War on Nature. Excerpts in italics with my bolds.

The narrative that humanity is waging a ‘war on nature’ is nonsense.

At Columbia University on December 2, 2020, U.N. Secretary General Antonio Guterres claimed that humanity’s “war” on the environment was coming to a head. Guterres said, “We are facing a devastating pandemic, new heights of global heating, new lows of ecological degradation and new setbacks in our work towards global goals for more equitable, inclusive and sustainable development… To put it simply, the state of the planet is broken.”

Is humanity facing the crisis that Guterres claims? Is the planet ‘broken’? Most importantly, is there any scientific basis for these claims of doom and gloom?

To answer these questions: No. Global living standards continue to rise, despite alarmists’ constant failed predictions of a dreary future. Greater prosperity has allowed developed countries to devote time and money to remediating existing damage and improving the environment; developing countries have no such luxury. Contrary to alarmists’ claims, climate and temperature changes are extensively documented and perfectly natural. Guterres’ belief that man is at war with nature is unsubstantiated.

Facts, rather than beliefs, should be the foundation of public policy.

In his speech, Guterres highlights nearly every woe in the world. He implies that any problem in the natural world, be it fires, flooding, cyclones, hurricanes, pollution, disease, or changes in sea ice and ocean temperatures, can be blamed on climate change. And in Guterres’ view, humanity is the sole driver of climate change. This couldn’t be further from the truth.

Nor is Guterres the sole proponent of such ideas. Media outlets such as CNN, the Huffington Post, and NPR repeat these tropes while rarely citing facts for their claims of impending calamity.

An examination of the science and history of natural disasters will show that deaths from natural disasters are at one of the lowest rates in history. While natural disasters are more expensive than they once were, the reasons are mundane and expected.

Climate has changed throughout history, well before humanity had any significant impact. Even with CO2’s warming effect, UN IPCC and U.S. Government data show no increase in the rates of most natural disasters from the period of natural warming (1900 to 1950) to the period the IPCC claims is one of largely human-caused warming (1950 to 2018).

This calls into question not just claims of a current CO2-driven “climate crisis” but also projections of future damage by those who promote the ‘war on nature’ narrative.

Guterres continues, “Air and water pollution are killing 9 million people annually.” Not only is his number wrong, but his reasoning is flawed. In developed countries (which have access to cheap reliable energy), air and water are cleaner than ever because of the economic growth driven by fossil fuels. Furthermore, his solutions condemn the populations of less developed countries to continue to suffer public health problems arising from lack of affordable energy. Nearly half of the deaths cited in the WHO’s number are caused by cooking indoors with dirty fuels. In the CO2 Coalition’s latest white paper, New-Tech American Coal Fired Electricity for Africa: Clean Air, Indoors and Out we offer a solution to the polluted indoor air: cheap, reliable energy using local resources. President Trump could reverse by executive order the Obama-era ban on U.S. exports of clean-coal technology to coal-rich Africa, saving thousands of lives.

As to ocean health, Guterres says, “The carbon dioxide they absorb is acidifying the seas…” Again, he misses the mark. The CO2 Coalition has analyzed decades of research and found that CO2, which is plankton food that enriches sea life, does not cause “ocean acidification” and that the term itself is misleading.

Guterres concludes by advocating humanity “flick the green switch” and transform the world’s economy by using ‘renewable energy’ to drive sustainability. While this sounds wonderful, the reality is that so-called ‘renewables’ are anything but renewable. Flipping the green switch requires us to depend on energy that is unreliable, expensive, and requires the use of dangerous pollutants. His proposed solution would be harmful to both health and global prosperity.

One thing is certain: the Secretary-General is wrong on the science and wrong on the economics. His ‘war on nature’ narrative is bunk.


In Denial of Election Fraud

Meaning of In Denial:  Refusing to look at information to avoid considering an undesirable reality.  Antonym: Facing Facts

The judiciary has now signaled, from the Supreme Court justices at the top down to state and district judges, that it will avert its eyes from any cases questioning the 2020 election.  The media likewise attaches the adjective “baseless” to all claims of election irregularities.  The results are “valid,” they say, without any effort to confront the evidence of “invalid” activity before, during, and after November 3.  The undesirable reality is that a premeditated, nationally organized criminal enterprise stole the US election, meaning that one of the major US political parties should now be called the Undemocrats.  The infographic below presents the extensive legal case against this election process and its results, summarized from numerous hearings attended by officials who actually want to know the truth.

Previous Post:  The Phishy Election

The age of distributed computers and internet connectivity results in everyone from time to time receiving phishing emails.  Just opening the link can get malware installed on your notebook, and can even generate a ransom demand from those who kidnapped your device.  The same kind of criminals working 24/7 to steal from you are suspected of using their methods to steal the Office of the President of the US.

As we know from TV shows and reading crime novels, any criminal investigation seeks to prove the perpetrators possessed three factors:  Motive, Opportunity, and Means.  No one doubts the Democrats had plenty of Motive, as evidenced by a four-year slow-moving coup against Trump from the moment the 2016 results were confirmed.  As we now see, the Opportunity came in Democratic control over big city strongholds in the large battleground states.  And as the legal affidavits confirm, the Means consisted of both Old School ballot stuffing fraud (ballot harvesting and overwriting) and New School counting fraud (using computer algorithms to warp the results). The allegations summarized in the exhibit above show how perpetrators added votes in big cities and suppressed votes in the countryside.

There is a short timetable for exposing these illegal tactics, and the media is looking to run out the clock, having already declared Biden the winner.  An example is the recent Washington Post article Swing-state counties that used Dominion voting machines mostly voted for Trump.  At least they are finally admitting that the election results are questionable.  But like all the MSM, they are resolutely averting their eyes from anything that could sour the Biden victory they so covet. Excerpts in italics with my bolds.

A review of 10 key states (Arizona, Colorado, Florida, Georgia, Michigan, Minnesota, Nevada, North Carolina, Pennsylvania and Wisconsin) finds that Dominion systems were used in 351 of 731 counties. Trump won 283 of those counties, 81 percent of the total. He won 79 percent of the counties that didn’t use Dominion systems.

The idea that Trump only lost, say, Pennsylvania, because of Dominion voting systems has to reconcile with the fact that Trump actually won more votes in counties that used Dominion systems (beating Biden by about 74,000 votes in those counties) but lost the state because he was beaten by 154,000 votes in non-Dominion counties. That same pattern holds in Wisconsin as well.

In other words, there’s nothing to suggest that counties using Dominion systems looked significantly different from counties that didn’t. The idea that Biden is president-elect because of some nefarious calculations simply doesn’t match the reality of the county-level vote results.

The WP conclusion ignores the analyses showing how the algorithms distort the results.  In order to shift votes from Trump to Biden, the perpetrators needed to identify large pools of Trump-only votes (ballots cast only in the presidential race) that could be switched to Biden-only votes.  By skimming in this way, Trump wins those counties as expected, but by enough fewer votes to lose the state. As well, it appears that forensic testing of seized ballot machines confirms that vote tabulations are converted from whole numbers to decimals so that weighting can be applied.  In one example, Biden votes were weighted at 1.2 while Trump votes were weighted at 0.8.  The tabulation thus effectively shifts votes from Trump to Biden.  Meanwhile, in the big city precincts, large Biden margins were already organized through production of mail-in ballots stuffed into the machines.
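Arithmetically, the alleged fractional weighting is just a multiplier applied to each candidate's running total. The toy sketch below uses hypothetical precinct counts (the 1.2/0.8 weights come from the example above; everything else is made up for illustration) to show how such weights would flip a margin. It illustrates the allegation's arithmetic only, not any verified behavior of actual tabulators.

```python
# Hypothetical precinct counts -- illustrative numbers only.
trump_raw, biden_raw = 1000, 900

# The alleged weighting from the example above: 1.2 for Biden, 0.8 for Trump.
trump_weighted = trump_raw * 0.8
biden_weighted = biden_raw * 1.2

raw_margin = trump_raw - biden_raw                  # Trump ahead by 100 raw votes
weighted_margin = trump_weighted - biden_weighted   # weighted margin flips negative
print(raw_margin, weighted_margin)
```

With these made-up numbers, a 100-vote raw lead becomes a 280-vote weighted deficit, which is the shape of the effect the affidavits allege.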

See: Inside the Election Black Box

The forensics report for Antrim County makes for an interesting read.

A forensic audit of Dominion Voting Systems machines and software in Michigan showed that they were designed to create fraud and influence election results, a data firm said Monday (Dec. 14).

“We conclude that the Dominion Voting System is intentionally and purposefully designed with inherent errors to create systemic fraud and influence election results,” Russell Ramsland Jr., co-founder of Allied Security Operations Group, said in a preliminary report.

“The system intentionally generates an enormously high number of ballot errors. The electronic ballots are then transferred for adjudication. The intentional errors lead to bulk adjudication of ballots with no oversight, no transparency, and no audit trail. This leads to voter or election fraud. Based on our study, we conclude that The Dominion Voting System should not be used in Michigan. We further conclude that the results of Antrim County should not have been certified,” he added.

Footnote:

Today, in the midst of an uproar over election violations, Covid restrictions, and illicit relations with Chinese Communists, this song came up on my playlist.

Lyrics:

Everybody knows that the dice are loaded
Everybody rolls with their fingers crossed
Everybody knows that the war is over
Everybody knows the good guys lost
Everybody knows the fight was fixed
The poor stay poor, the rich get rich
That’s how it goes
Everybody knows

Everybody knows that the boat is leaking
Everybody knows that the captain lied
Everybody got this broken feeling
Like their father or their dog just died
Everybody talking to their pockets
Everybody wants a box of chocolates
And a long stem rose
Everybody knows

Everybody knows that you love me baby
Everybody knows that you really do
Everybody knows that you’ve been faithful
Ah give or take a night or two
Everybody knows you’ve been discreet
But there were so many people you just had to meet
Without your clothes
And everybody knows

Everybody knows, everybody knows
That’s how it goes
Everybody knows

And everybody knows that it’s now or never
Everybody knows that it’s me or you
And everybody knows that you live forever
Ah when you’ve done a line or two
Everybody knows the deal is rotten
Old Black Joe’s still pickin’ cotton
For your ribbons and bows
And everybody knows

And everybody knows that the plague is coming
Everybody knows that it’s moving fast
Everybody knows that the naked man and woman
Are just a shining artifact of the past
Everybody knows the scene is dead
But there’s gonna be a meter on your bed
That will disclose
What everybody knows

And everybody knows that you’re in trouble
Everybody knows what you’ve been through
From the bloody cross on top of Calvary
To the beach of Malibu
Everybody knows it’s coming apart
Take one last look at this sacred heart
Before it blows
And everybody knows

Everybody knows, everybody knows
That’s how it goes
Everybody knows

Footnote:

I doubt Leonard Cohen had politics or climate change in mind when he wrote this masterpiece.  But he did have a pertinent poetic insight; namely, that social proof is an unreliable guide to the truth.

 

Fear Not Rising Temperatures or Ocean Levels


Dominick T. Armentano writes at the Independent Institute in Are Temperatures and Ocean Levels Rising Dangerously? Not Really. Excerpts in italics with my bolds.  H/T John Ray

There are two widely held climate-change beliefs that are simply not accurate. The first is that there has been a statistically significant warming trend in the U.S. over the last 20 years. The second is that average ocean levels are rising alarmingly due to man-made global warming. Neither of these perspectives is true; yet both remain important, nonetheless, since both are loaded with very expensive public policy implications.

To refute the first view, we turn to data generated by the National Oceanic and Atmospheric Administration (NOAA) for the relevant years under discussion. The table below reports the average mean temperature in the continental U.S. for the years 1998 through 2019*:

1998 54.6 degrees
1999 54.5 degrees
2000 54.0 degrees
2001 54.3 degrees
2002 53.9 degrees
2003 53.7 degrees
2004 53.5 degrees
2005 54.0 degrees
2006 54.9 degrees
2007 54.2 degrees
2008 53.0 degrees
2009 53.1 degrees
2010 53.8 degrees
2011 53.8 degrees
2012 55.3 degrees
2013 52.4 degrees
2014 52.6 degrees
2015 54.4 degrees
2016 54.9 degrees
2017 54.6 degrees
2018 53.5 degrees
2019 52.7 degrees

*National Climate Report – Annual 2019

It is apparent from the data that there has been no consistent warming trend in the U.S. over the last two decades; average mean temperatures (daytime and nighttime) have been slightly higher in some years and slightly lower in other years. On balance–and contrary to mountains of uninformed social and political commentary—annual temperatures on average in the U.S. were no higher in 2019 than they were in 1998.
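The no-consistent-trend reading can be checked mechanically by fitting an ordinary least-squares line through the table's own values. This is a minimal sketch using only the numbers printed above; it finds a small, slightly negative slope over 1998-2019, consistent with the author's reading (though with so few noisy points, any such slope is statistically weak either way).

```python
# NOAA annual mean temperatures for the contiguous U.S., 1998-2019 (deg F),
# copied from the table above.
temps = [54.6, 54.5, 54.0, 54.3, 53.9, 53.7, 53.5, 54.0, 54.9, 54.2, 53.0,
         53.1, 53.8, 53.8, 55.3, 52.4, 52.6, 54.4, 54.9, 54.6, 53.5, 52.7]
years = list(range(1998, 2020))

# Ordinary least-squares slope: cov(x, y) / var(x).
n = len(temps)
mean_x = sum(years) / n
mean_y = sum(temps) / n
slope = (sum((x - mean_x) * (y - mean_y) for x, y in zip(years, temps))
         / sum((x - mean_x) ** 2 for x in years))

print(f"trend: {slope:.3f} degrees F per year")
```

The fitted slope works out to roughly -0.03 °F per year, i.e. essentially flat over the period shown.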

The second widely accepted climate view—based on wild speculation from some op/ed writers and partisan politicians–is that average sea levels are increasing dangerously, justifying an immediate governmental response. But as we shall demonstrate below, this perspective is simply not accurate.

There is a wide scientific consensus (based on satellite laser altimeter readings since 1993) that the rate of increase in overall sea levels has been approximately 0.12 inches per year.

To put that increase in perspective, the average sea level nine years from now (in 2029) is likely to be approximately one inch higher than it is now (2020). One inch is roughly the distance from the tip of your finger to the first knuckle. Even by the turn of the next century (in 2100), average ocean levels (at that rate of increase) should be only a foot or so higher than they are at present.
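The projection arithmetic above is a simple linear extrapolation; a minimal sketch using the 0.12 inches-per-year rate quoted above (assuming, as the author does, that the rate stays constant):

```python
# Linear extrapolation of sea-level rise at the satellite-era rate quoted above.
RATE_IN_PER_YR = 0.12

def projected_rise_inches(from_year, to_year, rate=RATE_IN_PER_YR):
    """Inches of rise between two years at a constant annual rate."""
    return rate * (to_year - from_year)

print(projected_rise_inches(2020, 2029))       # about one inch by 2029
print(projected_rise_inches(2020, 2100) / 12)  # in feet by 2100: under a foot
```

Nine years at that rate gives about 1.1 inches, and eighty years gives about 9.6 inches, matching the "one inch by 2029" and "a foot or so by 2100" figures in the text.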

NYC past & projected 2020

None of this sounds particularly alarming for the general society and little of it can justify any draconian regulations or costly infrastructure investments. The exception might be for very low-lying ocean communities or for properties (nuclear power plants) that, if flooded, would present a wide-ranging risk to the general population. But even here there is no reason for immediate panic. Since ocean levels are rising in small, discrete marginal increments, private and public decision makers would have reasonable amounts of time to prepare, adjust and invest (in flood abatement measures, etc.) if required.

But are sea levels actually rising at all? Empirical evidence of any substantial increases taken from land-based measurements has been ambiguous. This suggests to some scientists that laser and tidal-based measurements of ocean levels over time have not been particularly accurate.

For example, Professor Nils-Axel Mörner (Stockholm University) is infamous in climate circles for arguing–based on his actual study of sea levels in the Fiji Islands–that “there are no traces of any present rise in sea levels; on the contrary, full stability.” And while Mörner’s views are controversial, he has at least supplied peer reviewed empirical evidence to substantiate his nihilist position on the sea-level increase hypothesis.

The world has many important societal problems and only a limited amount of resources to address them. What we don’t need are overly dramatic climate-change claims that are unsubstantiated and arrive attached to expensive public policies that, if enacted, would fundamentally alter the foundations of the U.S. economic system.

DOMINICK T. ARMENTANO is a Research Fellow at the Independent Institute and professor emeritus in economics at the University of Hartford (CT).

Update Dec.11: USCRN Comparable Temperature Results

In response to Graeme Weber’s Question, this information is presented:

Anthony Watts:

NOAA’s U.S. Climate Reference Network (USCRN) has the best quality climate data on the planet, yet it never gets mentioned in the NOAA/NASA press releases. Commissioned in 2005, it has the most accurate, unbiased, and un-adjusted data of any climate dataset.

The USCRN has no biases, and no need for adjustments, and in my opinion represents a ground truth for climate change.

This graph of the contiguous United States, updated for 2019, comes out about 0.75°F cooler than the start of the dataset in 2005.

See Also Fear Not For Fiji

Setting the Global Temperature Record Straight


Figure 4. As in Fig. 3 except for seasonal station and global anomalies. As noted in the text, the inhabitants of the Earth experience the anomalies as noted by the black circles, not the yellow squares.

The CO2 Coalition does the world a service by publishing a brief public-information pamphlet about the temperature claims trumpeted in the media to stir up climate alarm.  The pdf pamphlet is The Global Mean Temperature Anomaly Record: How it works and why it is misleading by Richard S. Lindzen and John R. Christy.  H/T John Ray.  Excerpts in italics with my bolds.

Overview

At the center of most discussions of global warming is the record of the global mean surface temperature anomaly—often somewhat misleadingly referred to as the global mean temperature record. This paper addresses two aspects of this record. First, we note that this record is only one link in a fairly long chain of inference leading to the claimed need for worldwide reduction in CO2 emissions. Second, we explore the implications of the way the record is constructed and presented, and show why the record is misleading.

This is because the record is often treated as a kind of single, direct instrumental measurement. However, as the late Stan Grotch of the Lawrence Livermore Laboratory pointed out 30 years ago, it is really the average of widely scattered station data, where the actual data points are almost evenly spread between large positive and negative values.

The average is simply the small difference of these positive and negative excursions, with the usual problem associated with small differences of large numbers: at least thus far, the approximately one degree Celsius increase in the global mean since 1900 is swamped by the normal variations at individual stations, and so bears little relation to what is actually going on at a particular one.

The changes at the stations are distributed around the one-degree global average increase. Even if a single station had recorded this increase itself, this would take a typical annual range of temperature there, for example, from -10 to 40 degrees in 1900, and replace it with a range today from -9 to 41. People, crops, and weather at that station would find it hard to tell this difference. However, the increase looks significant on the charts used in almost all presentations, because they omit the range of the original data points and expand the scale in order to make the mean change look large.
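The "small difference of large excursions" point can be made concrete with a toy calculation. The station values below are hypothetical, chosen only to show how a global mean anomaly of a fraction of a degree can emerge from station deviations many times larger, which is the averaging structure the pamphlet describes.

```python
# Hypothetical station anomalies (deg C): large scatter, small mean residue.
station_anomalies = [3.1, -2.8, 4.4, -3.9, 2.2, -1.6, 1.0, -1.8]

mean_anomaly = sum(station_anomalies) / len(station_anomalies)
largest_excursion = max(abs(a) for a in station_anomalies)

print(f"global mean anomaly: {mean_anomaly:+.2f} C")   # small residue
print(f"largest station excursion: {largest_excursion:.1f} C")
```

Here eight stations swinging by up to 4.4 °C in either direction average out to a residue of under 0.1 °C, so the mean says very little about conditions at any one station.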

The record does display certain consistent trends, but it is also quite noisy, and fluctuations of a tenth or two of a degree are unlikely to be significant. In the public discourse, little attention is paid to magnitudes; the focus is rather on whether this anomaly is increasing or decreasing. Given the noise and sampling errors, it is rather easy to “adjust” such averaging, and even change the sign of a trend from positive to negative.

The Global Temperature Record and its Role

The earth’s climate system is notoriously complex. We know, for example, that this system undergoes multiyear variations without any external forcing at all other than the steady component of the sun’s radiation (for example, the El Niño Southern Oscillation and the Quasi-Biennial Oscillation of the tropical stratosphere). We know, moreover, that these changes are hardly describable simply by some global measure of temperature. Indeed, what is presented is actually something else. You may have noticed that it is referred to as the global mean temperature anomaly.

What is being averaged is the deviation of the surface temperature from some 30-year mean at stations non-randomly scattered around the globe. As we will soon see, this average bears rather little relation to the changes at the individual stations. Moreover, as noted by Christy and McNider (2017), the temperature anomaly of the lower troposphere (measured by satellites) relative to the surface temperature is much better sampled and represents the “more climate-relevant quantity of heat content, a change in which is a [theorized] consequence of enhanced GHG forcing.”

However imprecise and lightly-relevant the surface temperature is to the physics of the issue, the narrative of a global warming disaster uses the record as the first in a sequence of often comparably questionable assumptions. The narrative first claims that changes in this dubious metric are almost entirely due to variations in CO2, even though there are quite a few other factors whose common variations are as large as or larger than the impact of changes in CO2 (for example, modest changes in the area of upper and lower level clouds or changes in the height of upper level clouds).

Then the narrative asserts that changes in CO2 were primarily due to man’s activities. There is indeed evidence that this link is likely true for changes over the past two hundred years. However, over Earth’s history, there were radical changes in CO2 levels, and these changes were largely uncorrelated with changes in temperature.

Presentations of the Global Mean Temperature Anomaly Record

In order to obscure the fact that the global means are small residues of large numbers whose precision is questionable, the common presentations plot the global mean anomalies without the scattered points and expand the scale so as to make the changes look large. These expanded graphs of global means are shown in Figures 5 and 6.

Figure 6. Global seasonal anomalies of temperature from Fig. 4 without station anomalies. Note the range here is -0.8 to +1.2 °C, or 9 times less than Figs. 2 and 4.

The frequently cited trends are evident in these graphs–most notably, the pre-CO2 warming from 1920-1940 and the warming that has been attributed to man from 1978-1998. We also see a reduced rate from 1998 (best seen in Fig. 6) until the major El Niño of 2016 occurred. Even if one could attribute all the 1978-1998 warming to the increases in CO2, the slowdown clearly shows that there is something going on that is at least as large as the response to CO2. This contradicts the IPCC attribution studies that assume, based on model results, that other sources of variability since 1950 are negligible.

Note that the results in Figures 5 and 6 are quite noisy, with large interseasonal and interannual fluctuations. This noise contributes to the uncertainty of the values, in addition to the usual sampling errors. The graphs one usually sees are a lot smoother looking than what we see in Figures 5 and 6; these have resulted from taking running means over 5 or more years. The results of such smoothing are shown in Figure 7 (smoothed over 11 years) and 8 (smoothed over 21 seasons, or about 5 years). They look much cleaner and presumably more authoritative than the unsmoothed results or the scatter diagrams, but this tends to disguise the uncertainty, which is likely on the order of 0.1-0.2 degrees. (For example, Figure 7 substantially disguises the pause following 1998; Figure 8 does this less because it is averaged over only about 5 years.)
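The smoothing described above is a plain centered running mean. This minimal sketch (with a made-up series, not the actual anomaly record) shows the mechanism by which a wide window flattens short features such as a pause:

```python
def running_mean(series, window):
    """Centered moving average with an odd window; edge points are dropped."""
    half = window // 2
    return [sum(series[i - half:i + half + 1]) / window
            for i in range(half, len(series) - half)]

# A made-up series: rise, then a flat 'pause', then rise again.
series = [0.0, 0.1, 0.2, 0.3, 0.3, 0.3, 0.3, 0.3, 0.4, 0.5, 0.6]
print(running_mean(series, 5))
```

The raw series has a five-point flat stretch, but the 5-point running mean comes out strictly increasing: the pause has been averaged away, which is the disguising effect the text attributes to Figures 7 and 8.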

Obviously, warmings or coolings of a tenth or two of a degree are without significance since possible adjustments can easily lead to changes of sign from positive to negative, yet in the popular literature much is made of such small changes. As with sausage, you might not want to know what went into these graphs, but, in this case, it is important that you do.

Some Concluding Remarks

An examination of the data that goes into calculating the global mean temperature anomaly clearly shows that any place on earth is almost as likely, at any given time, to be warmer or cooler than average. The anomaly is the small residue of the generally larger excursions we saw in Figures 1 and 2. This residue (which is popularly held to represent “climate”) is also much smaller than the temperature variations that all life on Earth regularly experiences. Figure 9 illustrates this for 14 major cities in the United States.

Indeed, the 1.2 degree Celsius global temperature change in the past 120 years, depicted as alarming in Figure 7, is only equivalent to the thickness of the “Average” line in Figure 9. As the figure shows, the difference in average temperature from January to July in these major cities ranges from just under ten degrees in Los Angeles to nearly 30 degrees in Chicago. And the average difference between the coldest and warmest moments each year ranges from about 25 degrees in Miami (a 45 degree Fahrenheit change) to 55 degrees in Denver (a 99 degree Fahrenheit change).

Figure 9. Temperature Changes People Know How to Handle

At the very least, we should keep the large natural changes in Figure 9 in mind, and not attribute them to the small residue, the global mean temperature anomaly, or obsess over its small changes.

See Also  Temperature Misunderstandings

Clive Best provides this animation of recent monthly temperature anomalies, which demonstrates how most variability in anomalies occurs over northern continents.

 

 

Yes, HCQ Works Against Covid19

Updated December 2020 is this report from hcqmeta.com HCQ is effective for COVID-19 when used early: meta analysis of 156 studies  (Version 28, December 4, 2020).  Excerpts in italics with my bolds.

HCQ is effective for COVID-19. The probability that an ineffective treatment generated results as positive as the 156 studies to date is estimated to be 1 in 36 trillion (p = 0.000000000000028).

Early treatment is most successful, with 100% of studies reporting a positive effect and an estimated reduction of 65% in the effect measured (death, hospitalization, etc.) using a random effects meta-analysis, RR 0.35 [0.27-0.46].

100% of Randomized Controlled Trials (RCTs) for early, PrEP, or PEP treatment report positive effects; the probability of this happening for an ineffective treatment is 0.00098.

There is evidence of bias towards publishing negative results. 89% of prospective studies report positive effects, and only 76% of retrospective studies do.

Significantly more studies in North America report negative results compared to the rest of the world, p = 0.0005.

Study results ordered by date, with the line showing the probability that the observed frequency of positive results occurred due to random chance from an ineffective treatment.

We analyze all significant studies concerning the use of HCQ (or CQ) for COVID-19, showing the effect size and associated p value for results compared to a control group. Methods and study results are detailed in Appendix 1. Typical meta-analyses involve subjective selection criteria, effect extraction rules, and bias evaluation, requiring an understanding of the criteria and the accuracy of the evaluations. However, the volume of studies presents an opportunity for a simple and transparent analysis aimed at detecting efficacy.

If treatment was not effective, the observed effects would be randomly distributed (or more likely to be negative if treatment is harmful). We can compute the probability that the observed percentage of positive results (or higher) could occur due to chance with an ineffective treatment (the probability of >= k heads in n coin tosses, or the one-sided sign test / binomial test). Analysis of publication bias is important and adjustments may be needed if there is a bias toward publishing positive results. For HCQ, we find evidence of a bias toward publishing negative results.
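The "k heads in n coin tosses" probability described above is the one-sided binomial tail. A minimal sketch of that calculation (the 10-of-10 case reproduces the 0.00098 figure the report quotes for RCTs; the 14-of-20 counts are illustrative, not taken from the report):

```python
from math import comb

def sign_test_p(k, n, p=0.5):
    """One-sided probability of >= k positive results in n studies,
    if each study were a fair coin flip (i.e. an ineffective treatment)."""
    return sum(comb(n, i) * p ** i * (1 - p) ** (n - i)
               for i in range(k, n + 1))

print(sign_test_p(10, 10))  # all 10 of 10 positive: about 0.00098
print(sign_test_p(14, 20))  # 14 of 20 positive: far less extreme
```

This is the standard one-sided sign test; the report's headline 1-in-36-trillion figure is the same tail probability computed over its full set of studies.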

Figure 2 shows stages of possible treatment for COVID-19. Pre-Exposure Prophylaxis (PrEP) refers to regularly taking medication before being infected, in order to prevent or minimize infection. In Post-Exposure Prophylaxis (PEP), medication is taken after exposure but before symptoms appear. Early Treatment refers to treatment immediately or soon after symptoms appear, while Late Treatment refers to more delayed treatment.

Table 1. Results by treatment stage. Two studies report results for a subset with early treatment; these are not included in the overall results.

We also note a bias towards publishing negative results by certain journals and press organizations, with scientists reporting difficulty publishing positive results [Boulware, Meneguesso]. Although 124 studies show positive results, The New York Times, for example, has only written articles for studies that claim HCQ is not effective [The New York Times, The New York Times (B), The New York Times (C)]. As of September 10, 2020, The New York Times still claims that there is clear evidence that HCQ is not effective for COVID-19 [The New York Times (D)]. As of October 9, 2020, the United States National Institutes of Health recommends against HCQ for both hospitalized and non-hospitalized patients [United States National Institutes of Health].

Treatment details. We focus here on the question of whether HCQ is effective or not for COVID-19. Significant differences exist based on treatment stage, with early treatment showing the greatest effectiveness. 100% of early treatment studies report a positive effect, with an estimated reduction of 65% in the effect measured (death, hospitalization, etc.) in the random effects meta-analysis, RR 0.35 [0.27-0.46]. Many factors are likely to influence the degree of effectiveness, including the dosing regimen, concomitant medications such as zinc or azithromycin, precise treatment delay, the initial viral load of patients, and current patient conditions.
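The RR (relative risk) figures quoted above compare the event rate in treated patients to the rate in controls. A minimal sketch of the basic calculation, with hypothetical counts chosen so the result lands on the headline 0.35 (these are not the actual study data, and a real meta-analysis pools many such ratios with study weights):

```python
def relative_risk(events_treated, n_treated, events_control, n_control):
    """Risk ratio: event rate in the treatment group over the control rate."""
    return (events_treated / n_treated) / (events_control / n_control)

# Hypothetical: 7 hospitalizations among 200 treated vs 20 among 200 controls.
rr = relative_risk(7, 200, 20, 200)
print(f"RR = {rr:.2f}")  # below 1.0 means fewer events with treatment
```

An RR of 0.35 corresponds to the "estimated reduction of 65%" phrasing used in the text, since 1 - 0.35 = 0.65.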

The Phishy Election

In the age of distributed computers and internet connectivity, everyone receives phishing emails from time to time.  Merely opening the link can get malware installed on your notebook, and can even generate a ransom demand from those who have hijacked your device.  The same kind of criminals working 24/7 to steal from you are suspected of using their methods to steal the Office of the President of the US.

As we know from TV shows and reading crime novels, any criminal investigation seeks to prove the perpetrators possessed three factors:  Motive, Opportunity, and Means.  No one doubts the Democrats had plenty of Motive, as evidenced by a four-year slow-moving coup against Trump from the moment the 2016 results were confirmed.  As we now see, the Opportunity came in Democratic control over big city strongholds in the large battleground states.  And as the legal affidavits confirm, the Means consisted of both Old School ballot stuffing fraud (ballot harvesting and overwriting) and New School counting fraud (using computer algorithms to warp the results). The allegations summarized in the exhibit below show how perpetrators added votes in big cities and suppressed votes in the countryside.

There is a short timetable for exposing these illegal tactics, and the media is looking to run out the clock, having already declared Biden the winner.  An example is the recent Washington Post article Swing-state counties that used Dominion voting machines mostly voted for Trump.  At least they are finally admitting that the election results are questionable.  But like all the MSM, they are resolutely averting their eyes from anything that could sour the Biden victory they so covet. Excerpts in italics with my bolds.

A review of 10 key states (Arizona, Colorado, Florida, Georgia, Michigan, Minnesota, Nevada, North Carolina, Pennsylvania and Wisconsin) finds that Dominion systems were used in 351 of 731 counties. Trump won 283 of those counties, 81 percent of the total. He won 79 percent of the counties that didn’t use Dominion systems.

The idea that Trump only lost, say, Pennsylvania, because of Dominion voting systems has to reconcile with the fact that Trump actually won more votes in counties that used Dominion systems (beating Biden by about 74,000 votes in those counties) but lost the state because he was beaten by 154,000 votes in non-Dominion counties. That same pattern holds in Wisconsin as well.

In other words, there’s nothing to suggest that counties using Dominion systems looked significantly different from counties that didn’t. The idea that Biden is president-elect because of some nefarious calculations simply doesn’t match the reality of the county-level vote results.

The WP conclusion ignores the analyses showing how the algorithms' effects show up in the results.  In order to shift votes from Trump to Biden, the perpetrators needed to identify large pools of Trump-only votes (ballots cast only for the Presidential race) that could be switched to Biden-only votes.  By skimming in this way, Trump wins those counties as expected, but by enough fewer votes to lose the state.  As well, it appears that forensic testing of seized ballot machines will confirm that vote tabulations are converted from whole numbers to decimals so that weighting can be applied.  In one example, Biden votes were weighted at 1.3 while Trump votes were weighted at 0.7.  Thus the race results effectively shift votes from Trump to Biden.
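To see how such fractional weighting would operate arithmetically, here is a toy sketch. The 1.3 and 0.7 weights come from the alleged example above; the ballot counts are entirely hypothetical.

```python
# Toy illustration of the alleged fractional weighting scheme.
# Ballot counts below are hypothetical, for arithmetic only.
trump_ballots, biden_ballots = 1000, 800

trump_weighted = trump_ballots * 0.7   # alleged Trump weight
biden_weighted = biden_ballots * 1.3   # alleged Biden weight

# Raw count: Trump leads 1000 to 800.
# Weighted tally: Biden leads 1040 to 700, with no ballots added.
```

The point of the illustration is that decimal tabulation can reverse a raw-count lead without changing the number of ballots, which is why whole-number tabulation is the check forensic auditors look for.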

See: Inside the Election Black Box

Meanwhile in the big city precincts, large Biden margins are already organized through production of mail-in ballots stuffed into the machines.

From Previous Post:  Election Fraud Too Big to Fail?

Arctic Ice Fears Erased in November

As noted in a previous post, alarms were raised over slower than average Arctic refreezing in October.  Those fears are now laid to rest by ice extents roaring back in November.  The image above shows the ice gains completed from October 31 to November 30, 2020. In fact 3.5 Wadhams of sea ice were added during the month.  (The metric 1 Wadham = 1 M km2 comes from the professor's predictions of an ice-free Arctic, meaning less than 1 M km2 extent.)

Some years ago reading a thread on global warming at WUWT, I was struck by one person’s comment: “I’m an actuary with limited knowledge of climate metrics, but it seems to me if you want to understand temperature changes, you should analyze the changes, not the temperatures.” That rang bells for me, and I applied that insight in a series of Temperature Trend Analysis studies of surface station temperature records. Those posts are available under this heading. Climate Compilation Part I Temperatures

This post seeks to understand Arctic Sea Ice fluctuations using a similar approach: focusing on the rates of extent changes rather than the usual study of the ice extents themselves. Fortunately, the Sea Ice Index (SII) from NOAA provides a suitable dataset for this project. As many know, SII relies on satellite passive microwave sensors to produce charts of Arctic ice extents going back to 1979.  The current Version 3 has become more closely aligned with MASIE, the modern form of Naval ice charting in support of Arctic navigation. The SII User Guide is here.

There are statistical analyses available, and the one of interest (table below) is called Sea Ice Index Rates of Change (here). As indicated by the title, this spreadsheet consists not of monthly extents, but of changes of extents from the previous month. Specifically, a monthly value is calculated by subtracting the average of the last five days of the previous month from the average of the final five days of this month. So the value presents the amount of ice gained or lost during the present month.
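The spreadsheet's method can be sketched in a few lines; the daily extent values below are hypothetical, chosen only to show the calculation.

```python
from statistics import mean

def monthly_rate_of_change(prev_month_daily, this_month_daily):
    """Monthly change per the SII rates-of-change method: average of
    the final five days of this month minus the average of the last
    five days of the previous month (units: M km2)."""
    return mean(this_month_daily[-5:]) - mean(prev_month_daily[-5:])

# Hypothetical daily extents (M km2) for the ends of two months:
october_daily = [5.0, 5.2, 5.4, 5.6, 5.8]      # last five days of October
november_daily = [8.0, 8.2, 8.4, 8.6, 8.8]     # last five days of November

change = monthly_rate_of_change(october_daily, november_daily)
# change is 3.0 M km2, i.e. 3 Wadhams gained during November.
```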

These monthly rates of change have been compiled into a baseline for the period 1980 to 2010, which shows the fluctuations of Arctic ice extents over the course of a calendar year. Below is a graph of those averages of monthly changes during the baseline period. Those familiar with Arctic Ice studies will not be surprised at the sine wave form. December end is a relatively neutral point in the cycle, midway between the September Minimum and March Maximum.

The graph makes evident the six spring/summer months of melting and the six autumn/winter months of freezing.  Note that June-August produce the bulk of losses, while October-December show the bulk of gains. Also the peak and valley months of March and September show very little change in extent from beginning to end.

The table of monthly data reveals the variability of ice extents over the last 4 decades.

The values in January show changes from the end of the previous December, and by summing twelve consecutive months we can calculate an annual rate of change for the years 1979 to 2019.
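Summing twelve consecutive monthly changes telescopes to the December-to-December difference, which a short sketch makes clear. The month-end extents below are hypothetical.

```python
# Thirteen hypothetical month-end extents (M km2), from end of the
# previous December through end of this December:
ends = [13.0, 14.5, 15.0, 14.2, 12.8, 10.9, 8.1,
        5.6, 4.3, 6.8, 9.5, 11.9, 12.7]

# Twelve monthly changes (each month-end minus the previous month-end):
monthly_changes = [b - a for a, b in zip(ends, ends[1:])]

# Their sum telescopes: only the first and last extents survive.
annual_change = sum(monthly_changes)
# annual_change equals ends[-1] - ends[0], a net loss of 0.3 M km2 here.
```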

As many know, there has been a decline of Arctic ice extent over these 40 years, averaging 40k km2 per year. But year over year, the changes shift constantly between gains and losses.

Moreover, it seems random as to which months are determinative for a given year. For example, much ado has been printed about October 2020 being slower than expected to refreeze and add ice extents. As it happens in this dataset, October has the highest rate of adding ice. The table below shows the variety of monthly rates in the record as anomalies from the 1980-2010 baseline. In this exhibit a red cell is a negative anomaly (less than baseline for that month) and blue is positive (higher than baseline).

Note that the +/– rate anomalies are distributed all across the grid, sequences of different months in different years, with gains and losses offsetting one another.  Yes, October 2020 recorded a lower than average gain, but higher than 2016.  The loss in July 2020 was the largest of the year, during the hot Siberian summer.  Note that the November 2020 ice gain anomaly was more than double the October deficit anomaly.  The bottom line presents the average anomalies for each month over the period 1979-2020.  Note that the rates of gains and losses mostly offset, and the average of all months in the bottom right cell is virtually zero.

Combining the months of October and November shows 2020 with 828k km2 more ice than the baseline for those two months, matching the 2019 ice recovery.

A final observation: The graph below shows the Yearend Arctic Ice Extents for the last 30 years.

Note: SII daily extents file does not provide complete values prior to 1988.

Year-end Arctic ice extents (last 5 days of December) show three distinct regimes: 1989-1998, 1998-2010, 2010-2019. The average year-end extent 1989-2010 is 13.4M km2. In the last decade, 2009 was 13.0M km2, and ten years later, 2019 was 12.8M km2. So for all the fluctuations, the net loss was 200k km2, or 1.5%. Talk of an Arctic ice death spiral is fanciful.
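The decadal arithmetic is simple enough to check directly, using the year-end figures quoted above:

```python
# Year-end extents from the text (M km2):
extent_2009, extent_2019 = 13.0, 12.8

loss = extent_2009 - extent_2019        # ~0.2 M km2, i.e. 200k km2
pct_loss = 100 * loss / extent_2009     # ~1.5 percent over the decade
```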

These data show a noisy, highly variable natural phenomenon. Clearly, unpredictable factors are in play, principally water structure and circulation, atmospheric circulation regimes, and also incursions and storms. And in the longer view, today’s extents are not unusual.


Illustration by Eleanor Lutz shows Earth’s seasonal climate changes. If played in full screen, the four corners present views from top, bottom and sides. It is a visual representation of scientific datasets measuring Arctic ice extents.

Nov. 2020 Ocean Air Temps Cooling

With apologies to Paul Revere, this post is on the lookout for cooler weather with an eye on both the Land and the Sea.  UAH has updated their tlt (temperatures in lower troposphere) dataset for November 2020.  Previously I have done posts on their reading of ocean air temps as a prelude to updated records from HadSST3. This month also has a separate graph of land air temps, because the comparisons and contrasts are interesting as we contemplate possible cooling in coming months and years.

Presently sea surface temperatures (SST) are the best available indicator of heat content gained or lost from earth’s climate system.  Enthalpy is the thermodynamic term for total heat content in a system, and humidity differences in air parcels affect enthalpy.  Measuring water temperature directly avoids distorted impressions from air measurements.  In addition, ocean covers 71% of the planet surface and thus dominates surface temperature estimates.  Eventually we will likely have reliable means of recording water temperatures at depth.

Recently, Dr. Ole Humlum reported from his research that air temperatures lag 2-3 months behind changes in SST.  He also observed that changes in CO2 atmospheric concentrations lag behind SST by 11-12 months.  This latter point is addressed in a previous post Who to Blame for Rising CO2?
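One common way to estimate such a lag (not necessarily Dr. Humlum's own method) is to find the shift that maximizes the correlation between the two series. A sketch with synthetic data, where the "air" series is built to trail the "SST" series by exactly three months:

```python
import math

def corr_at_lag(x, y, lag):
    """Pearson correlation between x[i] and y[i + lag]."""
    xs, ys = x[:len(x) - lag], y[lag:]
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(xs, ys))
    vx = sum((a - mx) ** 2 for a in xs)
    vy = sum((b - my) ** 2 for b in ys)
    return cov / math.sqrt(vx * vy)

# Synthetic monthly anomalies: "air" lags "sst" by exactly 3 months.
sst = [math.sin(i / 5) for i in range(60)]
air = [0.0] * 3 + sst[:-3]                 # air[i] = sst[i - 3]

# Scan candidate lags 0..6 months and pick the best-correlated one.
best_lag = max(range(7), key=lambda lag: corr_at_lag(sst, air, lag))
```

On real data the peak correlation would be well below 1.0, but the same scan recovers the 2-3 month lead of SST over air temps that Humlum reports.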

HadSST3 results will be posted later this month.  For comparison we can look at lower troposphere temperatures (TLT) from UAHv6 which are now posted for November. The temperature record is derived from microwave sounding units (MSU) on board satellites like the one pictured above.

The UAH dataset includes temperature results for air above the oceans, and thus should be most comparable to the SSTs. There is the additional feature that ocean air temps avoid Urban Heat Islands (UHI). In 2015 there was a change in UAH processing of satellite drift corrections, including dropping one platform which can no longer be corrected. The graphs below are taken from the latest and current dataset, Version 6.0.

The graph above shows monthly anomalies for ocean air temps since January 2015. After all regions peaked with the El Nino in early 2016, the ocean air temps dropped back down, with all regions showing the same low anomaly in August 2018.  Then a warming phase ensued, peaking with NH and Tropics spikes in February 2020, and a lesser rise in May 2020. As was the case in 2015-16, the warming was driven by the Tropics and NH, with SH lagging behind.

Since the peak in February 2020, all ocean regions have trended downward in a sawtooth pattern, returning to a flat anomaly in the three Summer months, close to the 0.4C average for the period.  A small rise occurred in September, mostly due to SH. In October NH spiked, coinciding with all the storm activity in the north Pacific and Atlantic.  Now in November that NH warmth is gone, and the global anomaly declined due also to dropping temps in the Tropics.

Land Air Temperatures Showing Volatility

We sometimes overlook that in climate temperature records, while the oceans are measured directly with SSTs, land temps are measured only indirectly.  The land temperature records at surface stations sample air temps at 2 meters above ground.  UAH gives tlt anomalies for air over land separately from ocean air temps.  The graph updated for November 2020 is below.

Here we see the noisy evidence of the greater volatility of the Land temperatures, along with extraordinary departures, first by NH land with SH often offsetting.   The overall pattern is similar to the ocean air temps, but obviously driven by NH with its greater amount of land surface. The Tropics synchronized with NH for the 2016 event, but otherwise follow a contrary rhythm.

SH seems to vary wildly, especially in recent months.  Note the extremely high anomaly in November 2019, the cold in March 2020, and then again a spike in April. In June 2020, all land regions converged, erasing the earlier spikes in NH and SH, and showing anomalies comparable to the 0.5C average land anomaly for this period.  After a relatively quiet Summer, land air temps rose globally in September with spikes in both NH and SH. In October the SH spike reversed, but in November NH land is showing a warm spike, pulling up the Global anomaly slightly.  Next month will show whether NH land cools off as rapidly as did the NH ocean temps.

The longer term picture from UAH is a return to the mean for the period starting with 1995.  The 2019 average rose and caused 2020 to start warmly, but the year currently lacks any El Nino or NH warm blob to sustain that warmth.

These charts demonstrate that underneath the averages, warming and cooling is diverse and constantly changing, contrary to the notion of a global climate that can be fixed at some favorable temperature.

TLTs include mixing above the oceans and probably some influence from nearby more volatile land temps.  Clearly NH and Global land temps have been dropping in a seesaw pattern, NH in July more than 1C lower than the 2016 peak.  TLT measures started the recent cooling later than SSTs from HadSST3, but are now showing the same pattern.  It seems obvious that despite the three El Ninos, their warming has not persisted, and without them it would probably have cooled since 1995.  Of course, the future has not yet been written.

Oh No! Greenland Melts in Virtual Reality “Experiments”

Phys.org sounds the alarm:  Greenland ice sheet faces irreversible melting.  But as is usual with announcements from that source, discretion and critical intelligence are advised.  The back story is a study that takes outputs from climate models projecting incredible warming and feeds them into ice models that assume melting from higher temperatures. Warning:  The paper has 93 references to “experiments” and all of them are virtual reality manipulations in the computers they programmed. Excerpts in italics with my bolds.

Professor Jonathan Gregory, Climate Scientist from the National Centre for Atmospheric Science and University of Reading, said: “Our experiments underline the importance of mitigating global temperature rise. To avoid partially irreversible loss of the ice sheet, climate change must be reversed—not just stabilized—before we reach the critical point where the ice sheet has declined too far.”

The paper is Large and irreversible future decline of the Greenland ice sheet.  And obviously, it is models all the way down.  Excerpts in italics with my bolds.

We run a set of 47 FiG experiments to study the SMB change (ΔSMB), rate of mass loss and eventual steady state of the Greenland ice sheet using the three different choices of FAMOUS–ice snow-albedo parameters, with 20-year climatological monthly mean sea surface BCs taken from the four selected CMIP5 AOGCMs for five climate scenarios (Table 2). These five are the late 20th century (1980–1999, called “historical”), the end of the 21st century under three representative concentration pathway (RCP) scenarios (as in the AR5; van Vuuren et al., 2011) and quadrupled pre-industrial CO2 (abrupt4xCO2, warmer than any RCP). The experiments have steady-state climates. This is unrealistic, but it simplifies the comparison and is reasonable since no-one can tell how climate will change over millennia into the future. Our simulations should be regarded only as indicative rather than as projections. Each experiment begins from the FiG spun-up state for MIROC5 historical climate with the appropriate albedo parameter. Although in most cases there is a substantial instantaneous change in BCs when the experiment begins, the land and atmosphere require only a couple of years to adjust.

Under constant climates that are warmer than the late 20th century, the ice sheet loses mass, its surface elevation decreases, and its surface climate becomes warmer. This gives a positive feedback on mass loss, but it is outweighed by the negative feedbacks due to declining ablation area and increasing cloudiness over the interior as the ice sheet contracts. In the ice sheet area integral, snowfall decreases less than ablation because the precipitation on the margins is enhanced by the topographic gradient and moves inland as the ice sheet retreats. Consequently, after many millennia under a constant warm climate, the ice sheet reaches a reduced steady state. Final GMSLR is less than 1.5 m in most late 21st-century RCP2.6 climates and more than 4 m in all late 21st-century RCP8.5 climates. For warming exceeding 3 K, the ice sheet would be mostly lost, and its contribution to GMSLR would exceed 5 m.

The reliability of our conclusions depends on the realism of our model. There are systematic uncertainties arising from assumptions made in its formulation. The atmosphere GCM has low resolution and comparatively simple parametrization schemes. The ice sheet model does not simulate rapid ice sheet dynamics; this certainly means that it underestimates the rate of ice sheet mass loss in coming decades, but we do not know what effect this has on the eventual steady states, which are our focus. The SMB scheme uses a uniform air temperature lapse rate and omits the phase change in precipitation in the downscaling from GCM to ice sheet model. The snow albedo is a particularly important uncertainty; with our highest choice of albedo, removal of the ice sheet is reversible.

What about observations instead of imaginary projections?  Previous post: Greenland Ice Varies, Don’t Panic

The scare du jour is about Greenland Ice Sheet (GIS) and how it will melt out and flood us all.  It’s declared that GIS has passed its tipping point, and we are doomed.  Typical is the Phys.org hysteria: Sea level rise quickens as Greenland ice sheet sheds record amount:  “Greenland’s massive ice sheet saw a record net loss of 532 billion tonnes last year, raising red flags about accelerating sea level rise, according to new findings.”

Panic is warranted only if you treat this as proof of an alarmist narrative and ignore the facts and context in which natural variation occurs. For starters, consider the last four years of GIS fluctuations reported by DMI and summarized in the eight graphs above.  Note the noisy blue lines showing how the surface mass balance (SMB) changes its daily weight by 8 or 10 gigatonnes (Gt) around the baseline mean from 1981 to 2010.  Note also the summer decrease between May and August each year before recovering to match or exceed the mean.

The other four graphs show the accumulation of SMB for each of the last four years, including 2020.  Tipping Point?  Note that in both 2017 and 2018, SMB ended about 500 Gt higher than the year began, and way higher than 2012, which added nothing.  Then came 2019, dropping below the mean, but still above 2012.  Lastly, this year is matching the 30-year average.  Note also that the charts do not integrate from previous years; i.e. each year starts at zero and shows the accumulation only for that year.  Thus the gains from 2017 and 2018 do not result in 2019 starting the year up 1000 Gt; every year starts from zero.
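The point about each chart starting from zero can be sketched directly; the daily SMB values below are hypothetical and the "years" shortened to four days each for illustration.

```python
# Hypothetical daily SMB values (Gt/day) for two abbreviated "years":
years = {
    2019: [2.0, 3.0, -1.0, 4.0],
    2020: [1.0, -2.0, 5.0, 2.0],
}

def accumulate(daily):
    """Running total of daily SMB within a single year only."""
    total, curve = 0.0, []
    for d in daily:
        total += d
        curve.append(total)
    return curve

curves = {yr: accumulate(d) for yr, d in years.items()}
# Each year's curve starts from its own first day:
# curves[2020] begins at 1.0, not at curves[2019][-1] + 1.0.
```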

The Truth about Sliding Greenland Ice

Researchers know that the small flows of water from surface melting are not the main way GIS loses ice in the summer.  Neil Humphrey explains in this article from last year: Nate Maier and Neil Humphrey Lead Team Discovering Ice is Sliding Toward Edges Off Greenland Ice Sheet.  Excerpts in italics with my bolds.

While they may appear solid, all ice sheets—which are essentially giant glaciers—experience movement: ice flows downslope either through the process of deformation or sliding. The latest results suggest that the movement of the ice on the GIS is dominated by sliding, not deformation. This process is moving ice to the marginal zones of the sheet, where melting occurs, at a much faster rate.

“The study was motivated by a major unknown in how the ice of Greenland moves from the cold interior, to the melting regions on the margins,” Neil Humphrey, a professor of geology from the University of Wyoming and author of the study, told Newsweek. “The ice is known to move both by sliding over the bedrock under the ice, and by oozing (deforming) like slowly flowing honey or molasses. What was unknown was the ratio between these two modes of motion—sliding or deforming.

“This lack of understanding makes predicting the future difficult, since we know how to calculate the flowing, but do not know much about sliding,” he said. “Although melt can occur anywhere in Greenland, the only place that significant melt can occur is in the low altitude margins. The center (high altitude) of the ice is too cold for the melt to contribute significant water to the oceans; that only occurs at the margins. Therefore ice has to get from where it snows in the interior to the margins.

“The implications for having high sliding along the margin of the ice sheet means that thinning or thickening along the margins due to changes in ice speed can occur much more rapidly than previously thought,” Maier said. “This is really important; as when the ice sheet thins or thickens it will either increase the rate of melting or alternatively become more resilient in a changing climate.

“There has been some debate as to whether ice flow along the edges of Greenland should be considered mostly deformation or mostly sliding,” Maier says. “This has to do with uncertainty of trying to calculate deformation motion using surface measurements alone. Our direct measurements of sliding- dominated motion, along with sliding measurements made by other research teams in Greenland, make a pretty compelling argument that no matter where you go along the edges of Greenland, you are likely to have a lot of sliding.”

The sliding ice does two things, Humphrey says. First, it allows the ice to slide into the ocean and make icebergs, which then float away. Second, the ice slides into a lower, warmer climate, where it can melt faster.

While it may sound dire, Humphrey notes the entire Greenland Ice Sheet is 5,000 to 10,000 feet thick.

“In a really big melt year, the ice sheet might melt a few feet. It means Greenland is going to be there another 10,000 years,” Humphrey says. “So, it’s not the catastrophe the media is overhyping.”

Humphrey has been working in Greenland for the past 30 years and says the Greenland Ice Sheet has only melted 10 feet during that time span.

Summary

The Greenland ice sheet is more than 1.2 miles thick in most regions. If all of its ice were to melt, global sea levels could be expected to rise by about 25 feet. However, this would take more than 10,000 years at current rates of melting.
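A back-of-envelope check of that timescale, using Humphrey's observed thinning of roughly 10 feet over his 30 years of fieldwork and an assumed mid-range thickness (the 6,000-foot figure is an assumption within his stated 5,000-10,000 foot range):

```python
# Humphrey's figures: ~10 ft of thinning over 30 years of observation.
melt_rate_ft_per_yr = 10 / 30                  # about 0.33 ft per year

# Assumed mid-range ice sheet thickness (ft):
thickness_ft = 6000

years_to_melt = thickness_ft / melt_rate_ft_per_yr
# ~18,000 years at the observed rate, consistent with the
# "more than 10,000 years" figure in the text.
```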

Background from Previous Post: Greenland Glaciers: History vs. Hysteria

The modern pattern of environmental scares started with Rachel Carson’s Silent Spring claiming chemicals are killing birds, only today it is windmills doing the carnage. That was followed by ever expanding doomsday scenarios, from DDT, to SST, to CFC, and now the most glorious of them all, CO2. In all cases the menace was placed in remote areas difficult for objective observers to verify or contradict. From the wilderness bird sanctuaries, the scares are now hiding in the stratosphere and more recently in the Arctic and Antarctic polar deserts. See Progressively Scaring the World (Lewin book synopsis)

The advantage of course is that no one can challenge the claims with facts on the ground, or on the ice. Correction: Scratch “no one”, because the climate faithful are the exception. Highly motivated to go to the ends of the earth, they will look through their alarmist glasses and bring back the news that we are indeed doomed for using fossil fuels.

A recent example is a team of researchers from Abu Dhabi (the hot and sandy petro kingdom) going to Greenland to report on the melting of the Helheim glacier there.  The article is NYUAD team finds reasons behind Greenland’s glacier melt.  Excerpts in italics with my bolds.

First the study and findings:

For the first time, warm waters that originate in the tropics have been found at uniform depth, displacing the cold polar water at the Helheim calving front, causing an unusually high melt rate. Typically, ocean waters near the terminus of an outlet glacier like Helheim are at the freezing point and cause little melting.

NYUAD researchers, led by Professor of Mathematics at NYU’s Courant Institute of Mathematical Sciences and Principal Investigator for NYU Abu Dhabi’s Centre for Sea Level Change David Holland, on August 5, deployed a helicopter-borne ocean temperature probe into a pond-like opening, created by warm ocean waters, in the usually thick and frozen melange in front of the glacier terminus.

Normally, warm, salty waters from the tropics travel north with the Gulf Stream, where at Greenland they meet with cold, fresh water coming from the polar region. Because the tropical waters are so salty, they normally sink beneath the polar waters. But Holland and his team discovered that the temperature of the ocean water at the base of the glacier was a uniform 4 degrees Centigrade from top to bottom at depth to 800 metres. The finding was also recently confirmed by Nasa’s OMG (Oceans Melting Greenland) project.

“This is unsustainable from the point of view of glacier mass balance as the warm waters are melting the glacier much faster than they can be replenished,” said Holland.

Surface melt drains through the ice sheet and flows under the glacier and into the ocean. Such fresh waters input at the calving front at depth have enormous buoyancy and want to reach the surface of the ocean at the calving front. In doing so, they draw the deep warm tropical water up to the surface, as well.

All around Greenland, at depth, warm tropical waters can be found at many locations. Their presence over time changes depending on the behaviour of the Gulf Stream. Over the last two decades, the warm tropical waters at depth have been found in abundance. Greenland outlet glaciers like Helheim have been melting rapidly and retreating since the arrival of these warm waters.

Then the Hysteria and Pledge of Allegiance to Global Warming

“We are surprised to learn that increased surface glacier melt due to warming atmosphere can trigger increased ocean melting of the glacier,” added Holland. “Essentially, the warming air and warming ocean water are delivering a troubling ‘one-two punch’ that is rapidly accelerating glacier melt.”

My comment: Hold on. They studied effects from warmer ocean water gaining access underneath that glacier. Oceans have roughly 1000 times the heat capacity of the atmosphere, so the idea that the air is warming the water is far-fetched. And remember also that long wave radiation of the sort that CO2 can emit cannot penetrate beyond the first millimeter or so of the water surface. So how did warmer ocean water get attributed to rising CO2? Don’t ask, don’t tell.  And the idea that air is melting Arctic glaciers is also unfounded.

Consider the basics of air parcels in the Arctic.

The central region of the Arctic is very dry. Why? Firstly because the water is frozen and releases very little water vapour into the atmosphere. And secondly because (according to the laws of physics) cold air can retain very little moisture.

Greenland has the only veritable polar ice cap in the Arctic, meaning that the climate is even harsher (10°C colder) than at the North Pole, except along the coast and in the southern part of the landmass where the Atlantic has a warming effect. The marked stability of Greenland’s climate is due to a layer of very cold air just above ground level, air that is always heavier than the upper layers of the troposphere. The result of this is a strong, gravity-driven air flow down the slopes (i.e. katabatic winds), generating gusts that can reach 200 kph at ground level.

Arctic air temperatures

Some history and scientific facts are needed to put these claims in context. Let’s start with what is known about Helheim Glacier.

Holocene history of the Helheim Glacier, southeast Greenland

Helheim Glacier ranks among the fastest flowing and most ice discharging outlets of the Greenland Ice Sheet (GrIS). After undergoing rapid speed-up in the early 2000s, understanding its long-term mass balance and dynamic has become increasingly important. Here, we present the first record of direct Holocene ice-marginal changes of the Helheim Glacier following the initial deglaciation. By analysing cores from lakes adjacent to the present ice margin, we pinpoint periods of advance and retreat. We target threshold lakes, which receive glacial meltwater only when the margin is at an advanced position, similar to the present. We show that, during the period from 10.5 to 9.6 cal ka BP, the extent of Helheim Glacier was similar to that of todays, after which it remained retracted for most of the Holocene until a re-advance caused it to reach its present extent at c. 0.3 cal ka BP, during the Little Ice Age (LIA). Thus, Helheim Glacier’s present extent is the largest since the last deglaciation, and its Holocene history shows that it is capable of recovering after several millennia of warming and retreat. Furthermore, the absence of advances beyond the present-day position during for example the 9.3 and 8.2 ka cold events as well as the early-Neoglacial suggest a substantial retreat during most of the Holocene.

Quaternary Science Reviews, Holocene history of the Helheim Glacier, southeast Greenland, A. A. Bjørk et al., 1 August 2018

The topography of Greenland shows why its ice cap has persisted for millennia despite its southerly location.  It is a bowl surrounded by ridges except for a few outlets, Helheim being a major one.

And then, what do we know about the recent history of glacier changes. Two Decades of Changes in Helheim Glacier

Helheim Glacier is the fastest flowing glacier along the eastern edge of the Greenland Ice Sheet and one of the island’s largest ocean-terminating rivers of ice. Named after the Vikings’ world of the dead, Helheim has kept scientists on their toes for the past two decades. Between 2000 and 2005, Helheim quickly increased the rate at which it dumped ice into the sea, while also rapidly retreating inland, a behavior also seen in other glaciers around Greenland. Since then, the ice loss has slowed down and the glacier’s front has partially recovered, readvancing by about 2 miles of the more than 4 miles it had initially retreated.

NASA has compiled a time series of airborne observations of Helheim’s changes into a new visualization that illustrates the complexity of studying Earth’s changing ice sheets. NASA uses satellites and airborne sensors to track variations in polar ice year after year to figure out what’s driving these changes and what impact they will have in the future on global concerns like sea level rise.

Since 1997, NASA has collected data over Helheim Glacier almost every year during annual airborne surveys of the Greenland Ice Sheet using an airborne laser altimeter called the Airborne Topographic Mapper (ATM). Since 2009 these surveys have continued as part of Operation IceBridge, NASA’s ongoing airborne survey of polar ice and its longest-running airborne mission. ATM measures the elevation of the glacier along a swath as the plane flies along the middle of the glacier. By comparing the changes in the height of the glacier surface from year to year, scientists estimate how much ice the glacier has lost.

The animation begins by showing the NASA P-3 plane collecting elevation data in 1998. The laser instrument maps the glacier’s surface in a circular scanning pattern, firing laser shots that reflect off the ice and are recorded by the laser’s detectors aboard the airplane. The instrument measures the time it takes for the laser pulses to travel down to the ice and back to the aircraft, enabling scientists to measure the height of the ice surface. In the animation, the laser data is combined with three-dimensional images created from IceBridge’s high-resolution camera system. The animation then switches to data collected in 2013, showing how the surface elevation and position of the calving front (the edge of the glacier, from where it sheds ice) have changed over those 15 years.
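The height-from-timing principle described above can be sketched in a few lines of Python. This is a simplified illustration of how a laser altimeter turns a pulse’s round-trip time into a surface elevation, not NASA’s actual ATM processing chain; the altitude and timing values are made-up examples.

```python
# Illustrative time-of-flight calculation for a laser altimeter.
# A sketch of the principle only; values below are hypothetical.
C = 299_792_458.0  # speed of light, m/s

def surface_elevation(aircraft_altitude_m, round_trip_time_s):
    """Ice-surface height: aircraft altitude minus the one-way laser range."""
    one_way_range_m = C * round_trip_time_s / 2.0
    return aircraft_altitude_m - one_way_range_m

# A pulse that returns after ~3.34 microseconds has traveled ~1,000 m
# round trip, so the surface lies ~500 m below the aircraft.
elevation_m = surface_elevation(1500.0, 1000.0 / C)
print(round(elevation_m, 3))  # 1000.0
```

Repeating such measurements along the same flight lines each year is what lets scientists difference the surfaces and estimate thinning.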

Helheim’s calving front retreated about 2.5 miles between 1998 and 2013. It also thinned by around 330 feet during that period, one of the fastest thinning rates in Greenland.
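For readers who prefer metric units, the quoted figures convert as follows (simple arithmetic on the numbers above):

```python
# Converting the quoted retreat and thinning figures to metric.
MILE_M = 1609.344   # meters per statute mile
FOOT_M = 0.3048     # meters per foot

retreat_m = 2.5 * MILE_M    # calving-front retreat, 1998-2013: ~4 km
thinning_m = 330 * FOOT_M   # surface lowering over the same period: ~100.6 m
years = 2013 - 1998

print(round(retreat_m))              # 4023 (meters)
print(round(thinning_m / years, 1))  # 6.7 (meters of thinning per year)
```

So the quoted thinning corresponds to roughly 6–7 m of surface lowering per year, which is indeed rapid by Greenland standards.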

“The calving front of the glacier most likely was perched on a ledge in the bedrock in 1998 and then something altered its equilibrium,” said Joe MacGregor, IceBridge deputy project scientist. “One of the most likely culprits is a change in ocean circulation or temperature, such that slightly warmer water entered into the fjord, melted a bit more ice and disturbed the glacier’s delicate balance of forces.”

Update September 1, 2020: Greenland Ice Math

Prompted by comments from Gordon Walleville, let’s look at Greenland ice gains and losses in context.  The ongoing SMB (surface mass balance) estimates the ice sheet’s net surface mass change from melting and sublimation losses and precipitation gains.  Dynamic ice loss is a separate calculation of the calving of chunks of ice off the edges of the sheet, as discussed in the post above.  The two factors are combined in a paper, Forty-six years of Greenland Ice Sheet mass balance from 1972 to 2018, by Mouginot et al. (2019). Excerpt in italics. (“D” refers to dynamic ice loss.)

Greenland’s SMB averaged 422 ± 10 Gt/y in 1961–1989 (SI Appendix, Fig. S1H). It decreased from 506 ± 18 Gt/y in the 1970s to 410 ± 17 Gt/y in the 1980s and 1990s, 251 ± 20 Gt/y in 2010–2018, and a minimum at 145 ± 55 Gt/y in 2012. In 2018, SMB was above equilibrium at 449 ± 55 Gt, but the ice sheet still lost 105 ± 55 Gt, because D is well above equilibrium and 15 Gt higher than in 2017. In 1972–2000, D averaged 456 ± 1 Gt/y, near balance, to peak at 555 ± 12 Gt/y in 2018. In total, the mass loss increased to 286 ± 20 Gt/y in 2010–2018 due to an 18 ± 1% increase in D and a 48 ± 9% decrease in SMB. The ice sheet gained 47 ± 21 Gt/y in 1972–1980, and lost 50 ± 17 Gt/y in the 1980s, 41 ± 17 Gt/y in the 1990s, 187 ± 17 Gt/y in the 2000s, and 286 ± 20 Gt/y in 2010–2018 (Fig. 2). Since 1972, the ice sheet lost 4,976 ± 400 Gt, or 13.7 ± 1.1 mm SLR.
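The bookkeeping in the excerpt can be checked directly: the net balance is SMB minus D, and cumulative loss converts to sea-level rise at roughly 362 Gt of ice per mm (the commonly used ocean-area conversion). A quick sanity check in Python:

```python
# Cross-checking the mass-balance bookkeeping quoted from Mouginot et al. (2019).
# Net balance = SMB (surface gains minus surface losses) - D (dynamic discharge).
smb_2018_gt = 449.0   # surface mass balance in 2018, Gt
d_2018_gt = 555.0     # dynamic ice discharge in 2018, Gt

net_2018_gt = smb_2018_gt - d_2018_gt
print(net_2018_gt)    # -106.0, matching the reported net loss of 105 +/- 55 Gt

# Cumulative loss to sea-level rise: ~361.8 Gt of ice raises global sea level ~1 mm.
GT_PER_MM_SLR = 361.8
slr_mm = 4976.0 / GT_PER_MM_SLR
print(round(slr_mm, 1))  # 13.8, consistent with the quoted 13.7 +/- 1.1 mm
```

The numbers in the paper are internally consistent: a positive SMB year (2018) can still be a net-loss year when dynamic discharge runs well above equilibrium.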

Doing the numbers: Greenland’s area is 2.1 × 10^6 km², about 80% of it ice-covered, with ice averaging 1,500 m thick. That is about 2.5 million Gton, using the simplification 1 km³ of ice ≈ 1 Gton.

The estimated loss since 1972 is 5,000 Gt (rounded), which works out to roughly 110 Gt per year.  The more recent estimates are higher, in the 200 Gt/year range.

An annual loss of 200 Gton is 0.008% of the Greenland ice sheet’s mass.

Annual snowfall: From the Lost Squadron, we know that at that particular spot the ice built up at about 1.5 m/year between 1942 and 1990 (the planes were found 75 m below the surface).
Assume a yearly precipitation of 100 mm/year over the entire ice-covered surface.
That is 1.68 × 10^6 km² × 0.0001 km ≈ 168 km³, or about 168 Gton of inflow per year.
With a net loss of about 200 Gton/year, outflow (melt, sublimation and calving) would then be roughly 368 Gton/year.

So if that 200 Gton/year rate continued (assuming, as models do, a steady trend despite the fluctuations air photos show), that ice loss would amount to a 1% loss of Greenland ice in roughly 125 years. (H/t Bengt Abelsson)
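As a sanity check, the round numbers in this update can be reproduced in a few lines of Python. All constants are the text’s own rough figures, so this is a back-of-envelope sketch, not a mass-balance model:

```python
# Back-of-envelope Greenland ice math, using the text's own round figures.
area_km2 = 2.1e6      # total area of Greenland, km^2
ice_fraction = 0.8    # ~80% ice-covered
thickness_km = 1.5    # average ice thickness (1,500 m)
# Simplification from the text: 1 km^3 of ice ~ 1 Gton.

total_mass_gt = area_km2 * ice_fraction * thickness_km
print(round(total_mass_gt))   # 2520000 -> "about 2.5 million Gton"

# Loss since 1972: ~5,000 Gt over 46 years.
print(round(5000.0 / 46))     # 109 -> roughly 110 Gt/year

# Recent loss rate as a fraction of the whole sheet.
annual_loss_gt = 200.0
print(round(100.0 * annual_loss_gt / total_mass_gt, 4))  # 0.0079 -> ~0.008%/year

# Inflow under the 100 mm/year precipitation assumption
# (note the unit conversion: km^2 x km = km^3 ~ Gt).
inflow_gt = area_km2 * ice_fraction * (100.0 / 1e6)
print(round(inflow_gt, 1))    # 168.0 Gt/year

# Time to lose 1% of the sheet at a constant 200 Gt/year.
print(round(0.01 * total_mass_gt / annual_loss_gt))  # 126 years
```

At a steady 0.008% per year, losing 1% of the sheet takes on the order of 125 years; even then, 99% of the ice would remain.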

Comment:

Once again, history is a better guide than hysteria.  Over time glaciers advance and retreat, and incursions of warm water are a key factor.  The Greenland ice cap and its glaciers are part of the Arctic’s self-oscillating climate system, operating on a quasi-60-year cycle.