WryHeat Climate Wisdom

A decade ago, when I became curious about the issue of global warming/climate change, Jonathan DuHamel was one of the voices persuading me to look critically and investigate claims carefully. He wrote a regular column in an Arizona newspaper (Arizona Daily Independent) under the banner WryHeat, exploring a wide range of scientific issues, including but not limited to global warming. This post celebrates his publishing a compilation of his articles on climate concerns over the years, entitled Summary Of Climate Change Principles & State Of The Science – A Rebuttal Of Climate Alarmism, at the Arizona Independent News Network.

The excerpts below show the themes of articles DuHamel wrote. To access the original published columns, readers can go to the link in red above, where links to each article are provided.

This post collects several past articles that review climate science and bring together some main points on the state of the climate debate. These points show that the politically correct, carbon dioxide driven meme is wrong. Readers can use these articles to counter climate alarmists. Read each article for more details. (Note: many of these articles appeared in ADI; however, the links below go to my Wryheat blog, where the articles may be expanded and updated from the ADI versions. The articles also provide additional links to more articles.) [The first heading below links to a summary pdf file with comprehensive discussion.]

Climate change in perspective

Climate change is a major issue of our times. Concern is affecting environmental, energy, and economic policy decisions. Many politicians are under the mistaken belief that legislation and regulation can significantly control our climate to forestall any deviation from “normal” and save us from a perceived crisis. This post is intended as a primer for politicians so they can cut through the hype and compare real observational data against the flawed model prognostications.
The data show that the current warming is not unusual, but part of a natural cycle; that greenhouse gases, other than water vapor, are not significant drivers of climate; that human emissions of carbon dioxide are insignificant when compared to natural emissions of greenhouse gases; and that many predictions by climate modelers and hyped by the media are simply wrong.

A simple question for climate alarmists – where is the evidence

“What physical evidence supports the contention that carbon dioxide emissions from burning fossil fuels are the principal cause of global warming since 1970?”
(Remember back in the 1970s, climate scientists and media were predicting a return to an “ice age.”)
I have posed that question to five “climate scientist” professors at the University of Arizona who claim that our carbon dioxide emissions are the principal cause of dangerous global warming. Yet, when asked the question, none could cite any supporting physical evidence.

Carbon dioxide is necessary for life on Earth

Rather than being a “pollutant,” carbon dioxide is necessary for life on Earth as we know it. Earth’s climate has been changing for at least four billion years in cycles large and small. Few in the climate debate understand those changes and their causes. Many are fixated on carbon dioxide (CO2), a minor constituent of the atmosphere, but one absolutely necessary for life as we know it. Perhaps this fixation derives from ulterior political motives for controlling the global economy. For others, the true believers, perhaps it derives from ignorance.

Carbon Dioxide and the Greenhouse Effect

The “greenhouse effect,” very simplified, is this: solar radiation penetrates the atmosphere and warms the surface of the earth. The earth’s surface radiates thermal energy (infrared radiation) back into space. Some of this radiation is absorbed and re-radiated back to the surface and into space by clouds, water vapor, methane, carbon dioxide, and other gases. Water vapor is the principal greenhouse gas; the others are minor players. It is claimed that without the greenhouse effect the planet would be an iceball, about 34°C colder than it is.* The term “greenhouse effect” with respect to the atmosphere is an unfortunate usage because it is misleading. The interior of a real greenhouse (or your automobile parked with windows closed and left in the sun) heats up because there is a physical barrier to convective heat loss. There is no such physical barrier in the atmosphere.

*There is an alternate hypothesis:

What keeps Earth warm – the greenhouse effect or something else?

Scottish physicist James Clerk Maxwell proposed in his 1871 book “Theory of Heat” that the temperature of a planet depends only on gravity, the mass of the atmosphere, and the heat capacity of the atmosphere. Temperature is independent of atmospheric composition. Greenhouse gases have nothing to do with it. Many publications since have expounded on Maxwell’s theory and have shown that it applies to all planets in the Solar System.
The Grand Canyon of Arizona provides a practical demonstration of this principle.
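The “34°C colder” figure quoted above can be checked with a simple energy-balance calculation. Below is a minimal sketch in Python; the solar constant, albedo, and observed surface temperature are round textbook values I am assuming, not figures from the article. It reproduces the commonly cited difference of roughly 33 K, so the article’s 34°C is within rounding of the same calculation.

```python
# Toy check of the "34 C colder than it is" claim via the Stefan-Boltzmann law.
# Assumed round inputs (not from the article): solar constant, albedo, observed T.
SIGMA = 5.670e-8   # Stefan-Boltzmann constant, W/(m^2 K^4)
S = 1361.0         # solar constant at Earth, W/m^2 (assumed)
ALBEDO = 0.30      # fraction of sunlight reflected (assumed)
T_obs = 288.0      # observed mean surface temperature, K (assumed)

# Energy balance for an airless Earth: absorbed sunlight S*(1-a)/4 = sigma*T^4
T_eff = (S * (1 - ALBEDO) / (4 * SIGMA)) ** 0.25

print(f"Effective radiating temperature: {T_eff:.0f} K")
print(f"Difference from observed {T_obs:.0f} K: {T_obs - T_eff:.0f} K")
```

Running this gives an effective temperature near 255 K against 288 K observed, a gap of about 33 K, which is the quantity the greenhouse-effect debate is about.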

Evidence that CO2 emissions do not intensify the greenhouse effect

The U.S. government’s National Climate Assessment report and the UN IPCC both claim that human carbon dioxide emissions are “intensifying” the greenhouse effect and causing global warming. The carbon dioxide driven global warming meme makes four specific predictions. Physical evidence shows that all four of these predictions are wrong.
“It doesn’t matter how beautiful your theory is; it doesn’t matter how smart you are. If it doesn’t agree with experiment, it’s wrong.” – Richard Feynman

An examination of the relationship between temperature and carbon dioxide

In this article, we will examine the Earth’s temperature and the carbon dioxide (CO2) content of the atmosphere at several time scales to see if there is any relationship. I stipulate that the greenhouse effect does exist. I maintain, however, that the ability of CO2 emissions to cause global warming is tiny and overwhelmed by natural forces. The main effect of our “greenhouse” is to slow cooling.

How much global warming is dangerous?

The United Nations’ IPCC and other climate alarmists say all hell will break loose if the global temperature rises more than an additional 2°C (3.6°F). That number, by the way, is purely arbitrary, with no basis in science. It also ignores Earth’s geologic history, which shows that for most of the time global temperatures have been much warmer than now. Let’s look back at a time when global temperatures are estimated to have been as much as 34°F warmer than they are now. Hell didn’t break loose then.

Effects of global warming on humans

The EPA’s “endangerment finding” classified carbon dioxide as a pollutant and claimed that global warming will have adverse effects on human health. Real research says the opposite: cold is deadlier. The scientific evidence shows that warming is good for health.

Geology is responsible for some phenomena blamed on global warming

Melting of the Greenland and West Antarctic ice sheets has been blamed on global warming, but both have a geologic origin. The “Blob,” a recent warm ocean area off the Oregon coast, responsible in part for the hot weather and drought in California, has been blamed on global warming, but it too may have a geologic cause.

The 97 percent consensus for human caused climate change debunked again

It has been claimed that 97% of climate scientists say humans are causing most of the global warming. An examination of the numbers and how those numbers have been reached show that only 8.2% of scientists polled explicitly endorse carbon dioxide as the principal driver.
Read also a more general article: On consensus in science

Conclusion:

The basic conclusion of this review is that carbon dioxide has little effect on climate and all attempts to control carbon dioxide will be a futile and expensive exercise to no end. All the dire predictions are based on flawed computer models. Carbon dioxide is a phantom menace.

“The whole aim of practical politics is to keep the populace alarmed (and hence clamorous to be led to safety) by menacing it with an endless series of hobgoblins, all of them imaginary.” – H. L. Mencken

ABOUT THE AUTHOR Jonathan DuHamel

I am a retired economic geologist and have worked as an explorationist in search of economic mineral deposits, mainly copper, molybdenum, and gold. My exploration activities have been mainly in the Western U.S. including Alaska. I have also worked in Mexico, South Africa, Ireland, and Scotland.

Exploration geologists are trained not only in the geologic sciences, but also in chemistry, physics, botany, and geostatistics. I am also trained in the natural history of the Sonoran Desert.

After graduating from The Colorado School of Mines with a Geologic Engineering degree and Master of Science degree, and before practicing as a geologist, I served as an officer in the Army Chemical Corps assigned to a unit that tested experimental weapons and equipment.

I currently reside in Tucson, AZ.

Dr. Drew: Stop the Press to Stop Coronavirus Panic

At Real Clear Politics, Coronavirus Panic Must Stop, Press Needs to Be Held Accountable for Hurting People.  Excerpts in italics with my bolds.

Dr. Drew Pinsky talks with CBS Local’s DJ Sixsmith about coronavirus: “The panic must stop. And the press, they really somehow need to be held accountable because they are hurting people.”

CBS NEWS: “So you’ve seen pandemics over the decades, how does this one compare with everything?”

DR. DREW: “A bad flu season is 80,000 dead; we’ve got about 18,000 dead from influenza this year, and we have a hundred from corona. Which should you be worried about, influenza or corona? A hundred versus 18,000? It’s not a trick question. And look, everything that’s going on with New York cleaning the subways and everyone using Clorox wipes and get your flu shot, which should be the other message, that’s good. That’s a good thing, so I have no problem with the behaviors. What I have a problem with is the panic and the fact that businesses are getting destroyed, that people’s lives are being upended, not by the virus, but by the panic. The panic must stop. And the press, they really somehow need to be held accountable because they are hurting people.”

CBS NEWS: “So, where do you think the panic started? Besides the press, like what was the impetus in terms of mass hysteria?”

DR. DREW: “I saw it, there’s footage of me on a show called The Daily Blast Live a month ago, going ‘shouldn’t we be scared about this?’ and me going ‘no, there’s gonna be a potential for panic here, shut up everybody, stop talking about it.’ I could see the panic brewing, and I could just see the way the innuendo and every opportunity for drama by the press was twisted in that direction. Let me give you an example: the World Health Organization is out now saying the fatality rate from the virus is 3.4%, right? Every publication from the WHO says 3.4% and we expect it to fall dramatically once we understand the full extent of the illness. No one ever reports the actual statement. We go 3.4%, that’s 10 times more than the, whatever, five times more than the flu virus, and yeah, it’s gonna be a little more [than the] flu probably. Still not a bad flu season.”

CBS NEWS: “Right, we’re gonna hear about more cases, more people died.”

DR. DREW: “There are probably several people in this building that probably have it and don’t know it.”

CBS NEWS: “Right, well it was also just the process of letting the public know, the stock market, the number of tests that were available, there was so much happening, I think people were freaking out as a result of that.”

DR. DREW: “I think it was a concerted effort by the press to capture your eyes, and in doing so they did it by inducing panic. Listen, the CDC and the WHO, they know what they are doing, they contain pandemics, that’s how they know how to do it, they’re doing an amazing job.”

CBS NEWS: “What about the global implications of this, because we were talking off-camera about Italy, there’s China as well, there’s some little outbreaks where you should avoid.”

DR. DREW: “There are, I would look out for places where bad flu outbreaks are happening too. I ended up getting the bird flu, I got H1N1, and it was horrible. It was no fun. … There are certain things, having been a physician for almost forty years, there are certain things I just know … and there’s certain things I just know by virtue of all the experience I’ve had, and so when I saw this one coming, the corona, I thought I know how this is gonna go, I see kind of what it is, and then I saw the excessive reaction of the press, so I had to respond. And then people, the weird part on social media towards me is people are angry with me, angry with me for trying to get them to see reality and calm down.”

Then there are wise words from Czech microbiologist Dr. Václava Adámková, posted at Lubos Motl’s website Reference Frame: Czech microbiologist on the Covid panic. Excerpts in italics with my bolds.

Well, I would criticize them for purposefully and uselessly manipulating the lay populace. And the tone in which the news is being presented – there is one case here… Well, there’s one case here, five cases a day or eight cases a day today. It’s 8 cases. During that time, much more serious infectious diseases, viral or bacterial ones, actually kill many more people. And that’s something that is not included in the context of that information. So the announcements seem populist and one-sided, and they resemble a politician’s campaign before elections, when the politician focuses on one topic and escalates it.

I am not quite a virologist, closer to a bacteriologist. Anyway, coronaviruses have been with us from the beginning. It is a large group of viruses that cause respiratory diseases, runny nose, cough, and exceptionally diseases of the lower respiratory tract. But when we test statistically for the coronaviruses every year, they cause up to 18% of respiratory infections. No one talks about it. These viruses attack all age groups, from babies to seniors. That’s how things work. Sometimes they appear along with other viruses, most often with influenza viruses. The coronaviruses have always been here, are here, and will be here. When a virus mutates, or merges genes with something, that’s how Nature and biology work. They may do whatever seems good in their context. We see it in flu, too.

I don’t really believe that the Wuhan virus differs. If we look at it from the healthcare perspective, according to symptoms – Covid is mostly about mild symptoms in the upper respiratory tract, especially among young and not immunocompromised people. And even the fatalities described in the context of this virus are compatible with the biology of this virus. Even the other coronaviruses may kill a weakened individual. But the available mortality numbers, let’s accept them, simply describe the reality. In comparison with SARS and MERS, Covid has a much lower fatality rate. Nevertheless, SARS and MERS didn’t get this much attention.

Some 3 months ago, the WHO was just warning about the infectious disease, most likely a viral and not bacterial one, that may quickly spread due to the widespread travelling. The main WHO virologist just made this speculation. It’s interesting that this has happened. It may easily spread, in theory. However, in practice, the propagation of the news occurs much more quickly than the propagation of the virus itself. It is spreading like a computer virus, not a biological virus, because the numbers of infected ones remain low. Around 80,000 Chinese is a tiny fraction of China’s 1.4 billion people. If they published how many people have flu or tuberculosis at the same moment, the numbers would be vastly higher. So I think it is like the propagation of a Trojan horse or a computer virus.

Climate Lawsuit Dominos


Posted to Energy, March 05, 2020: Curt Levey writes at InsideSources, Climate Change Lawsuits Collapsing Like Dominoes. Excerpts in italics with my bolds.

Climate change activists went to court in California recently trying to halt a long losing streak in their quest to punish energy companies for aiding and abetting the world’s consumption of fossil fuels.

A handful of California cities — big consumers of fossil fuels themselves — asked the U.S. Court of Appeals for the Ninth Circuit to reverse the predictable dismissal of their public nuisance lawsuit seeking to pin the entire blame for global warming on five energy producers: BP, Chevron, ConocoPhillips, ExxonMobil and Royal Dutch Shell.

The cities hope to soak the companies for billions of dollars of damages, which they claim they’ll use to build sea walls, better sewer systems and the like in anticipation of rising seas and extreme weather that might result from climate change.

But no plaintiff has ever succeeded in bringing a public nuisance lawsuit based on climate change.

To the contrary, these lawsuits are beginning to collapse like dominoes as courts remind the plaintiffs that it is the legislative and executive branches — not the judicial branch — that have the authority and expertise to determine climate policy.

Climate change activists should have gotten the message in 2011 when the Supreme Court ruled against eight states and other plaintiffs who brought nuisance claims for the greenhouse gas emissions produced by electric power plants.

The Court ruled unanimously in American Electric Power v. Connecticut that the federal Clean Air Act, under which such emissions are subject to EPA regulation, preempts such lawsuits.

The Justices emphasized that “Congress designated an expert agency, here, EPA … [that] is surely better equipped to do the job than individual district judges issuing ad hoc, case-by-case injunctions” and better able to weigh “the environmental benefit potentially achievable [against] our Nation’s energy needs and the possibility of economic disruption.”

The Court noted that this was true of “questions of national or international policy” in general, reminding us why the larger trend of misusing public nuisance lawsuits is a problem.

The California cities, led by Oakland and San Francisco, tried to get around this Supreme Court precedent by focusing on the international nature of the emissions at issue.

But that approach backfired in 2018 when federal district judge William Alsup concluded that a worldwide problem “deserves a solution on a more vast scale than can be supplied by a district judge or jury in a public nuisance case.” Alsup, a liberal Clinton appointee, noted that “Without [fossil] fuels, virtually all of our monumental progress would have been impossible.”

In July 2018, a federal judge in Manhattan tossed out a nearly identical lawsuit by New York City on the same grounds. The city is appealing.

Meanwhile, climate lawfare is also being waged against energy companies by Rhode Island and a number of municipal governments, including Baltimore. Like the other failed cases, these governments seek billions of dollars.

Adding to the string of defeats was the Ninth Circuit’s rejection last month of the so-called “children’s” climate suit, which took a somewhat different approach by pitting a bunch of child plaintiffs against the federal government.

The children alleged “psychological harms, others impairment to recreational interests, others exacerbated medical conditions, and others damage to property” and sought an injunction forcing the executive branch to phase out fossil fuel emissions.

Judge Andrew Hurwitz, an Obama appointee, wrote for the majority that “such relief is beyond our constitutional power.” The case for redress, he said, “must be presented to the political branches of government.”

Yet another creative, if disingenuous, litigation strategy was attempted by New York State’s attorney general, who sued ExxonMobil for allegedly deceiving investors about the impact of future climate change regulations on profits by keeping two sets of books.

That lawsuit went down in flames in December when a New York court ruled that the state failed to prove any “material misstatements” to investors.

All these lawsuits fail because they are grounded in politics, virtue signaling and — in most cases — the hope of collecting billions from energy producers, rather than in sound legal theories or a genuine strategy for fighting climate change.

But in the unlikely event these plaintiffs prevail, would they use their billion dollar windfalls to help society cope with global warming?

It’s unlikely if past history is any indication.

State and local governments that have won large damage awards in successful non-climate-related public nuisance lawsuits — tobacco litigation is the most famous example — have notoriously blown most of the money on spending binges unrelated to the original lawsuit or on backfilling irresponsible budget deficits.

The question of what would happen to the award money will likely remain academic. Even sympathetic judges have repeatedly refused to be roped by weak public nuisance or other contorted legal theories into addressing a national or international policy issue — climate change — that is clearly better left to elected officials.

Like anything built on an unsound foundation, these climate lawsuits will continue to collapse.

Curt Levey is a constitutional law attorney and president of the Committee for Justice, a nonprofit organization dedicated to preserving the rule of law.

Update March 10

Honolulu joins the domino lineup with its own MeToo lawsuit: Honolulu Sues Petroleum Companies For Climate Change Damages to City

Honolulu city officials, lashing out at the fossil fuel industry in a climate change lawsuit filed Monday, accused oil producers of concealing the dangers that greenhouse gas emissions from petroleum products would create, while reaping billions in profits.

The lawsuit, against eight oil companies, says climate change already is having damaging effects on the city’s coastline, and lays out a litany of catastrophic public nuisances—including sea level rise, heat waves, flooding and drought caused by the burning of fossil fuels—that are costing the city billions, and putting its residents and property at risk.

“We are seeing in real time coastal erosion and the consequences,” Josh Stanbro, chief resilience officer and executive director for the City and County of Honolulu Office of Climate Change, Sustainability and Resiliency, told InsideClimate News. “It’s an existential threat for what the future looks like for islanders.”  [ I wonder if Stanbro’s salary matches the length of his job title, or if it is contingent on winning the case.]

Greta’s Spurious “Carbon Budget”

Many have noticed that recent speeches written for child activist Greta Thunberg are basing the climate “emergency” on the rapidly closing “carbon budget”. This post aims to summarize how alarmists define the so-called carbon budget, and why their claims to its authority are spurious. In the text and at the bottom are links to websites where readers can access both the consensus science papers and the analyses showing the flaws in the carbon budget notion. Excerpts are in italics with my bolds.

The 2019 update on the Global Carbon Budget was reported at Future Earth article entitled Global Carbon Budget Estimates Global CO2 Emissions Still Rising in 2019. The results were published by the Global Carbon Project in the journals Nature Climate Change, Environmental Research Letters, and Earth System Science Data. Excerpts below in italics with my bolds.

History of Growing CO2 Emissions

“Carbon dioxide emissions must decline sharply if the world is to meet the ‘well below 2°C’ mark set out in the Paris Agreement, and every year with growing emissions makes that target even more difficult to reach,” said Robbie Andrew, a Senior Researcher at the CICERO Center for International Climate Research in Norway.

Global emissions from coal use are expected to decline 0.9 percent in 2019 (range: -2.0 percent to +0.2 percent) due to an estimated 10 percent fall in the United States and a 10 percent fall in Europe, combined with weak growth in coal use in China (+0.8 percent) and India (+2 percent).


Shifting Mix of Fossil Fuel Consumption

“The weak growth in carbon dioxide emissions in 2019 is due to an unexpected decline in global coal use, but this drop is insufficient to overcome the robust growth in natural gas and oil consumption,” said Glen Peters, Research Director at CICERO.

“Global commitments made in Paris in 2015 to reduce emissions are not yet being matched by proportionate actions,” said Peters. “Despite political rhetoric and rapid growth in low carbon technologies such as solar and wind power, electric vehicles, and batteries, global fossil carbon dioxide emissions are likely to be more than four percent higher in 2019 than in 2015 when the Paris Agreement was adopted.

“Compared to coal, natural gas is a cleaner fossil fuel, but unabated natural gas merely cooks the planet more slowly than coal,” said Peters. “While there may be some short-term emission reductions from using natural gas instead of coal, natural gas use needs to be phased out quickly on the heels of coal to meet ambitious climate goals.”

Oil and gas use have grown almost unabated in the last decade. Gas use has been pushed up by declines in coal use and increased demand for gas in industry. Oil is used mainly to fuel personal transport, freight, aviation and shipping, and to produce petrochemicals.

“This year’s Carbon Budget underscores the need for more definitive climate action from all sectors of society, from national and local governments to the private sector,” said Amy Luers, Future Earth’s Executive Director. “Like the youth climate movement is demanding, this requires large-scale systems changes – looking beyond traditional sector-based approaches to cross-cutting transformations in our governance and economic systems.”

Burning gas emits about 40 percent less CO2 than coal per unit energy, but it is not a zero-carbon fuel. While CO2 emissions are likely to decline when gas displaces coal in electricity production, Global Carbon Project researchers say it is only a short-term solution at best. All CO2 emissions will need to decline rapidly towards zero.
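The “about 40 percent less CO2 than coal per unit energy” comparison can be reproduced from typical fuel emission factors. The factors below are round values I am assuming for illustration (kg CO2 per GJ of thermal energy), not numbers from the Global Carbon Project:

```python
# Rough check of the "about 40 percent less" gas-vs-coal comparison.
# Emission factors are assumed typical values; real fuels vary by grade.
COAL_KG_CO2_PER_GJ = 95.0  # bituminous coal, approximate (assumed)
GAS_KG_CO2_PER_GJ = 56.0   # natural gas, approximate (assumed)

reduction = 1 - GAS_KG_CO2_PER_GJ / COAL_KG_CO2_PER_GJ
print(f"Gas emits about {reduction:.0%} less CO2 than coal per unit energy")
```

With these assumed factors the reduction comes out near 41 percent, consistent with the “about 40 percent” figure in the excerpt.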

The Premise: Rising CO2 Emissions Cause Global Warming

Atmospheric CO2 concentration is set to reach 410 ppm on average in 2019, 47 percent above pre-industrial levels.

Glen Peters on the carbon budget and global carbon emissions is a Future of Earth interview explaining the Carbon Budget notion. Excerpts in italics with my bolds.

In many ways, the global carbon budget is like any other budget. There’s a maximum amount we can spend, and it must be allocated to various countries and various needs. But how do we determine how much carbon each country can emit? Can developing countries grow their economies without increasing their emissions? And if a large portion of China’s emissions come from products made for American and European consumption, who’s to blame for those emissions? Glen Peters, Research Director at the Center for International Climate Research (CICERO) in Oslo, explains the components that make up the carbon budget, the complexities of its calculation, and its implications for climate policy and mitigation efforts. He also discusses how emissions are allocated to different countries, how emissions are related to economic growth, what role China plays in all of this, and more.

The carbon budget generally has two components: the source component, so what’s going into the atmosphere; and the sink component, so the components which are more or less going out of the atmosphere.

So in terms of sources, we have fossil fuel emissions; so we dig up coal, oil, and gas and burn them and emit CO2. We have cement, which is a chemical reaction, which emits CO2. That’s sort of one important component on the source side. We also have land use change, so deforestation. We’re chopping down a lot of trees, burning them, using the wood products and so on. And then on the other side of the equation, sort of the sink side, we have some carbon coming back out of the atmosphere. So the land sucks up about 25% of the carbon that we put into the atmosphere and the ocean sucks up about 25%. So for every ton we put into the atmosphere, only about half a ton of CO2 remains in the atmosphere. So in a sense, the oceans and the land are cleaning up half of our mess, if you like.

The other half just stays in the atmosphere. Half a ton stays in the atmosphere; the other half is cleaned up. It’s that carbon that stays in the atmosphere which is causing climate change and temperature increases and changes in precipitation and so on.

The carbon budget is like a balance, so you have something coming in and something going out, and in a sense by mass balance, they have to equal. So if we go out and we take an estimate of how much carbon have we emitted by burning fossil fuels or by chopping down forests and we try and estimate how much carbon has gone into the ocean or the land, then we can measure quite well how much carbon is in the atmosphere. So we can add all those measurements together and then we can compare the two totals — they should equal. But they don’t equal. And this is sort of part of the science, if we overestimated emissions or if we over or underestimated the strength of the land sink or the oceans or something like that. And we can also cross check with what our models say.
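The mass-balance bookkeeping Peters describes can be sketched in a few lines. The flux numbers below are illustrative assumptions roughly consistent with the ranges quoted later in this post, not official Global Carbon Project figures:

```python
# Minimal sketch of the carbon-budget mass balance described in the interview.
# All fluxes in GtCO2 per year; values are illustrative assumptions.
fossil_and_cement = 37.0   # emissions from fossil fuels and cement (assumed)
land_use_change = 5.0      # emissions from deforestation etc. (assumed)
sources = fossil_and_cement + land_use_change

land_sink = 0.25 * sources   # land takes up roughly 25% of emissions
ocean_sink = 0.25 * sources  # ocean takes up roughly 25%
sinks = land_sink + ocean_sink

airborne = sources - sinks            # what remains in the atmosphere
airborne_fraction = airborne / sources
print(f"Sources: {sources} GtCO2/yr, sinks: {sinks}, airborne: {airborne}")
print(f"Airborne fraction: {airborne_fraction:.0%}")
```

By construction, half of each ton emitted stays airborne, which is the “cleaning up half of our mess” point; the scientific question is how well the source and sink totals actually balance in the measurements.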

My Comment:

Several things are notable about the carbon cycle diagram from GCP. It claims the atmosphere gains 18 GtCO2 per year, and that this gain drives global warming. Yet estimates of emissions from burning fossil fuels and from land use combined range from 36 to 45 GtCO2 per year (40.5 +/- 4.5). The uptake by the biosphere and ocean combined ranges from 16 to 25 GtCO2 per year (20.5 +/- 4.5). The uncertainty on emissions is about 11%, while the natural sequestration uncertainty is about 22%, twice as much.

Furthermore, the fluxes from the biosphere and ocean are both presented as balanced, with no error range. The diagram assumes the natural sinks/sources are not in balance but are taking up more CO2 than they release. IPCC reported that gross fluxes generally have uncertainties of more than +/- 20% (IPCC AR4 WG1 Figure 7.3). Thus for land and ocean the estimates range as follows:

Land: 440, with uncertainty between 352 and 528, a range of 176
Ocean: 330, with uncertainty between 264 and 396, a range of 132
Nature: 770, with uncertainty between 616 and 924, a range of 308

So the natural flux uncertainty is 7.5 times the estimated human emissions of 41 GtCO2 per year.
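The uncertainty arithmetic above can be verified directly. Here is a short sketch applying the IPCC’s +/-20% gross-flux uncertainty to the fluxes just listed:

```python
# Apply the IPCC +/-20% gross-flux uncertainty to the fluxes quoted above,
# then compare the natural-flux uncertainty to human emissions (41 GtCO2/yr).
def uncertainty_range(central, pct=0.20):
    lo, hi = central * (1 - pct), central * (1 + pct)
    return lo, hi, hi - lo

for name, flux in [("Land", 440), ("Ocean", 330), ("Nature", 770)]:
    lo, hi, width = uncertainty_range(flux)
    print(f"{name}: {flux}, between {lo:.0f} and {hi:.0f}, a range of {width:.0f}")

human_emissions = 41  # GtCO2 per year, as stated above
_, _, nature_width = uncertainty_range(770)
print(f"Natural flux uncertainty is {nature_width / human_emissions:.1f} times human emissions")
```

This reproduces the ranges of 176, 132, and 308 listed above, and the 308 GtCO2/yr natural uncertainty divided by 41 GtCO2/yr of human emissions gives the stated factor of about 7.5.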

For more detail see CO2 Fluxes, Sources and Sinks and Who to Blame for Rising CO2?

The Fundamental Flaw: Spurious Correlation

Beyond the uncertainty of the amounts is a method error in claiming rising CO2 drives temperature changes. For this discussion I am drawing on work by chaam jamal at her website Thongchai Thailand. A series of articles there explain in detail how the mistake was invented and why it is faulty. A good starting point is The Carbon Budgets of Climate Science. Below is my attempt at a synopsis from her writings with excerpts in italics and my bolds.

Simplifying Climate to a Single Number

Figure 1 above shows the strong positive correlation between cumulative emissions and cumulative warming used by climate science and by the IPCC to track the effect of emissions on temperature and to derive the “carbon budget” for various acceptable levels of warming such as 2°C and 1.5°C. These so-called carbon budgets then serve as policy tools for international climate action agreements and climate action imperatives of the United Nations. And yet, all such budgets are numbers with no interpretation in the real world because they are derived from spurious correlations. Source: Matthews et al 2009

Carbon budget accounting is based on the TCRE (Transient Climate Response to Cumulative Emissions). It is derived from the observed correlation between temperature and cumulative emissions. A comprehensive explanation of an application of this relationship in climate science is found in the IPCC SR 15 2018. This IPCC description is quoted below in paragraphs #1 to #7, where the IPCC describes how climate science uses the TCRE for climate action mitigation of AGW in terms of the so-called carbon budget. Also included are some of the difficult issues in carbon budget accounting and the methods used in their resolution.

It has long been recognized that the climate sensitivity of surface temperature to the logarithm of atmospheric CO2 (ECS), which lies at the heart of the anthropogenic global warming and climate change (AGW) proposition, was a difficult issue for climate science because of the large range of empirical values reported in the literature and the so called “uncertainty problem” it implies.

The ECS uncertainty issue was interpreted in two very different ways. Climate science took the position that ECS uncertainty implies that climate action has to be greater than that implied by the mean value of ECS in order to ensure that higher values of ECS that are possible will be accommodated while skeptics argued that the large range means that we don’t really know. At the same time skeptics also presented convincing arguments against the assumption that observed changes in atmospheric CO2 concentration can be attributed to fossil fuel emissions.

A breakthrough came in 2009 when Damon Matthews, Myles Allen, and a few others almost simultaneously published almost identical papers reporting the discovery of a “near perfect” correlation (ρ≈1) between surface temperature and cumulative emissions {2009: Matthews, H. Damon, et al. “The proportionality of global warming to cumulative carbon emissions” Nature 459.7248 (2009): 829}. They had found that, irrespective of the timing of emissions or of atmospheric CO2 concentration, emitting a trillion tonnes of carbon will cause 1.0 – 2.1 C of global warming. This linear regression coefficient corresponding with the near perfect correlation between cumulative warming and cumulative emissions (note: temperature=cumulative warming), initially described as the Climate Carbon Response (CCR) was later termed the Transient Climate Response to Cumulative Emissions (TCRE).

Initially a curiosity, it gained in importance when it was found that it was in fact predicting future temperatures consistent with model predictions. The consistency with climate models was taken as a validation of the new tool and the TCRE became integrated into the theory of climate change. However, as noted in a related post the consistency likely derives from the assumption that emissions accumulate in the atmosphere.

Thereafter the TCRE became incorporated into the foundation of climate change theory particularly so in terms of its utility in the construction of carbon budgets for climate action plans for any given target temperature rise, an application for which the TCRE appeared to be tailor made. Most importantly, it solved or perhaps bypassed the messy and inconclusive uncertainty issue in ECS climate sensitivity that remained unresolved. The importance of this aspect of the TCRE is found in the 2017 paper “Beyond Climate Sensitivity” by prominent climate scientist Reto Knutti where he declared that the TCRE metric should replace the ECS as the primary tool for relating warming to human caused emissions {2017: Knutti, Reto, Maria AA Rugenstein, and Gabriele C. Hegerl. “Beyond equilibrium climate sensitivity.” Nature Geoscience 10.10 (2017): 727}. The anti ECS Knutti paper was not only published but received with great fanfare by the journal and by the climate science community in general.

The TCRE has continued to gain in importance and prominence as a tool for the practical application of climate change theory in terms of its utility in the construction and tracking of carbon budgets for limiting warming to a target such as the Paris Climate Accord target of +1.5C above pre-industrial. {Matthews, H. Damon. “Quantifying historical carbon and climate debts among nations.” Nature climate change 6.1 (2016): 60}. A bibliography on the subject of TCRE carbon budgets is included below at the end of this article (here).
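
The budget arithmetic the TCRE enables is simple (a sketch with illustrative numbers: the TCRE value is picked from within the 1.0-2.1 C per trillion tonne range reported by Matthews et al 2009, and the emissions-to-date figure is invented). The budget for a warming target is the target divided by the TCRE, and the remaining budget is that minus emissions to date:

```python
def carbon_budget(target_c, tcre):
    """Cumulative emissions (TtC) consistent with a warming target, per the TCRE."""
    return target_c / tcre

tcre = 1.5            # C of warming per trillion tonnes carbon (TtC), illustrative
target = 2.0          # Paris-style warming target in C
emitted_so_far = 0.6  # TtC already emitted, illustrative

budget = carbon_budget(target, tcre)
remaining = budget - emitted_so_far
print(f"total budget: {budget:.2f} TtC, remaining: {remaining:.2f} TtC")
```

This subtraction step is exactly where the "remaining carbon budget puzzle" discussed next arises.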

However, a mysterious and vexing issue has arisen in the practical matter of applying and tracking TCRE-based carbon budgets: the remaining carbon budget puzzle {Rogelj, Joeri, et al. “Estimating and tracking the remaining carbon budget for stringent climate targets.” Nature 571.7765 (2019): 335-342}. It turns out that midway in the implementation of a carbon budget, the remaining budget computed by subtraction does not match the TCRE budget for the latter period computed directly using the Damon Matthews proportionality of temperature with cumulative emissions for that period. The difference between the two estimates has a rational explanation in terms of the statistics of a time series of cumulative values of another time series, described in a related post.

It is shown that a time series of the cumulative values of another time series has neither time scale nor degrees of freedom and that therefore statistical properties of this series can have no practical interpretation.

It is demonstrated with random numbers that the only practical implication of the “near perfect proportionality” correlation reported by Damon Matthews is that the two time series being compared (annual warming and annual emissions) tend to have positive values. In the case of emissions we have all positive values, and during a time of global warming, the annual warming series contains mostly positive values. The correlation between temperature (cumulative warming) and cumulative emissions derives from this sign bias as demonstrated with random numbers with and without sign bias.

Figure 4: Random Numbers without Sign Bias

Figure 5: Random Numbers with Sign Bias
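
The demonstration behind these figures can be reproduced with a short simulation (a sketch of the technique, not chaam jamal's actual code): two independent random series stand in for annual warming and annual emissions, and only when both have a positive mean (sign bias) do their cumulative sums show a "near perfect" correlation.

```python
import numpy as np

rng = np.random.default_rng(42)
n = 140  # roughly the length of the instrumental record, in years

def cum_corr(bias_a, bias_b):
    """Correlation between cumulative sums of two independent random series."""
    a = rng.normal(bias_a, 1.0, n)
    b = rng.normal(bias_b, 1.0, n)
    return np.corrcoef(np.cumsum(a), np.cumsum(b))[0, 1]

no_bias = cum_corr(0.0, 0.0)    # zero-mean series: no shared drift
with_bias = cum_corr(1.0, 1.0)  # both series biased toward positive values

print(f"no sign bias:   r = {no_bias:+.3f}")
print(f"with sign bias: r = {with_bias:+.3f}")  # near +1 despite independence
```

The series are independent by construction, so the high correlation in the biased case reflects only the shared upward drift, not any relationship between the two.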

The sign bias explains the correlation between cumulative values of time series data and also the remaining carbon budget puzzle. It is shown that the TCRE regression coefficient between these time series of cumulative values derives from the positive value bias in the annual warming data. Thus, during a period of accelerated warming, the second half of the carbon budget period may contain a higher percentage of positive values for annual warming and it will therefore show a carbon budget that exceeds the proportional budget for the second half computed from the full span regression coefficient that is based on a lower bias for positive values.

In short, the bias for positive annual warming is highest for the second half, lowest for the first half, and midway between these two values for the full span – and therein lies the simple statistical explanation of the remaining carbon budget issue that climate science is trying to solve in terms of climate theory and its extension to Earth System Models. The Millar and Friedlingstein 2018 paper is yet another in a long line of studies that ignore the statistical issues with the TCRE correlation and instead try to explain its anomalous behavior in terms of climate theory, whereas in fact the explanation lies in statistical issues that have been overlooked by these young scientists.

The fundamental problem with the construction of TCRE carbon budgets and their interpretation in terms of climate action is that the TCRE is a spurious correlation that has no interpretation in terms of a relationship between emissions and warming. Complexities in these carbon budgets such as the remaining carbon budget are best understood in these terms and not in terms of new and esoteric variables such as those in earth system models.

Footnote:

An independent study by Jamal Munshi comes to a similar conclusion: Climate Sensitivity and the Responsiveness of Temperature to Atmospheric CO2

Detrended correlation analysis is used to compare global mean temperature observations and model projections in a test of the theory that surface temperature is responsive to atmospheric CO2 concentration in terms of GHG forcing of surface temperature implied by the Climate Sensitivity parameter ECS. The test shows strong evidence of GHG forcing of warming in the theoretical RCP8.5 temperature projections made with CMIP5 forcings. However, no evidence of GHG forcing by CO2 is found in observational temperatures from four sources including two from satellite measurements. The test period is set to 1979-2018 so that satellite data can be included on a comparable basis. No empirical evidence is found in these data for a climate sensitivity parameter that determines surface temperature according to atmospheric CO2 concentration or for the proposition that reductions in fossil fuel emissions will moderate the rate of warming.
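
The detrended-correlation idea can be sketched in a few lines (illustrative synthetic data, not Munshi's actual series or code): remove each series' linear trend and correlate the residuals, so a shared long-term drift no longer counts as evidence of a relationship, while a genuine year-to-year coupling survives.

```python
import numpy as np

def detrended_corr(x, y):
    """Correlation of residuals after removing each series' linear trend."""
    t = np.arange(len(x))
    rx = x - np.polyval(np.polyfit(t, x, 1), t)
    ry = y - np.polyval(np.polyfit(t, y, 1), t)
    return np.corrcoef(rx, ry)[0, 1]

rng = np.random.default_rng(0)
t = np.arange(40)  # e.g. the years 1979-2018

co2 = 340 + 2.0 * t + rng.normal(0, 1.0, 40)           # steadily rising series
temp_indep = 0.01 * t + rng.normal(0, 0.1, 40)         # trends up, wiggles unrelated
temp_coupled = 0.02 * (co2 - 340) + rng.normal(0, 0.005, 40)  # wiggles follow co2

raw_indep = np.corrcoef(co2, temp_indep)[0, 1]
det_indep = detrended_corr(co2, temp_indep)
det_coupled = detrended_corr(co2, temp_coupled)

print(f"raw corr, unrelated series:       {raw_indep:+.2f}")  # inflated by shared trend
print(f"detrended corr, unrelated series: {det_indep:+.2f}")  # near zero
print(f"detrended corr, coupled series:   {det_coupled:+.2f}")  # survives detrending
```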

Postscript on Spurious Correlations

I am not a climate, environment, geology, weather, or physics expert. However, I am an expert on statistics. So, I recognize bad statistical analysis when I see it. There are quite a few problems with the use of statistics within the global warming debate. The use of Gaussian statistics is the first error. In his first movie Gore used a linear regression of CO2 and temperature. If he had done the same regression using the number of zoos in the world, or the worldwide use of atomic energy, or sunspots, he would have gotten the same result. A linear regression by itself proves nothing. – Dan Ashley, PhD Statistics, PhD Business, Northcentral University

 

Coronavirus Data is Still Misleading

The Streetlight Effect: Looking in the light is the first reaction to a crisis, but the truth may actually be in the darkness and yet to be discovered.

Joon Yu writes at Worth Coronavirus Data Is Still Misleading. Here’s What the Latest Numbers Don’t Tell You.  Excerpts in italics with my bolds.

When the existing prevalence of a virus is high and endemic, the rise in incidence of testing can create the appearance of a rise in incidence of a virus.


The world is caught in the vortex of the coronavirus story. So what happens from here?

I don’t know, and no one else does either. That said, my intuition—based on the temporal and spatial dispersion of the first 16 domestic cases of coronavirus serologically confirmed in the United States—is that the situation is not inconsistent with a high-prevalence virus that has been endemic in America during this flu season and is still circulating. But what happens as more and more testing kits are delivered into an existing high-prevalence setting?

Prevalence starts getting counted as incidence, and that could send people running for the hills.

Consider the following analogy. Think about prevalence as the gold that was sitting in the Sierras in early 1848, and incidence as the collection of eureka moments thereafter. Just because gold diggers discover more and more gold in the Sierras doesn’t mean gold is spreading. What is spreading is the word about gold, which attracts more gold diggers, who discover more gold, forming a self-reinforcing frenzy.

The prevalence of coronavirus, of course, is more dynamic. Unlike gold, it does spread. But also unlike gold, it disappears when a patient gets better, which we know has been happening in the vast majority of cases so far. What we don’t know is the true prevalence, and how endemic it has been this season—it could be in the millions for Americans already—because we weren’t looking for it until this particular story entered our collective consciousness in recent weeks. And now the labs are playing catch up.

But here’s the catch. A surge in testing—one that seems poised to commence after a slow rollout and criticism—will inevitably show a significant increase in serologically confirmed cases. When the existing prevalence of a virus is high and endemic, the rise in incidence of testing can create the appearance of a rise in incidence of a virus.
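
Yu's arithmetic can be illustrated with a toy calculation (all numbers invented, purely to show the mechanism): hold true prevalence flat or falling while testing ramps up, and confirmed-case counts still climb week after week.

```python
# Toy model: confirmed cases = tests performed x prevalence among those tested.
# All numbers are invented for illustration only.
weeks = range(8)
tests = [1_000 * 2 ** w for w in weeks]          # testing doubles every week
prevalence = [0.05 * 0.9 ** w for w in weeks]    # true prevalence slowly falling

confirmed = [round(t * p) for t, p in zip(tests, prevalence)]
for w in weeks:
    print(f"week {w}: tests={tests[w]:>7}  prevalence={prevalence[w]:.3f}  "
          f"confirmed={confirmed[w]:>5}")
```

Confirmed counts grow steeply even though the underlying prevalence declines every week, which is exactly the prevalence-counted-as-incidence effect described above.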

Nonetheless, the demand for such circumspection, or any circumspection for that matter, during the current hysteria is understandably anemic. Instead, this is that part of the horror movie where the good intentions of good actors—the companies and agencies rising to the challenge of producing testing kits at an exponentially faster rate than during the 2003 SARS panic—end up serving the interest of the antagonist (the mob) rather than the protagonist (public interest). In an environment where an increasingly unhinged mob is already competing to paint the worst possible portrait of the next several weeks, the bad-news industrial complex is about to strike gold: it will soon get to spread the word “spread.”

From there, the panic can drive itself. As more cases are serologically confirmed, perceptions of a spreading plague will spread, triggering demand for more testing, which will lead to more confirmed cases in a self-fulfilling prophecy. Such vicious cycles that promote runaway growth of fear are anathema to a society that relies on stability, security and confidence. Feed-forward loops are the preferred algorithms of all self-expanding beasts, including cancer.

Confidence is already in short supply in some quarters.

Even basic things like numbers and definitions are being called into question. Meanwhile, people are panic selling the stock market and panic buying the remaining stock in supermarkets. Discretionary events are being cancelled in droves and handshakes are becoming an etiquette indiscretion. Adults are working from home, and kids with sniffles of any origin are being sent home from school to join them. During this “seeing-UFOs” phase of mass hysteria, everything from allergies to anxiety can start to look like the coronavirus given the fluidity of definitions and overlapping symptoms. Imagine the specter of this potentially absurd situation: The background prevalence of endemic coronavirus may be falling as the flu season fades, but the bad news bearers keep pointing to the rising incidence of test-affirmed coronavirus.

The numbers are bound to look dramatically worse in the coming days and weeks, so the worst of the panic may be ahead of us.

If all of this feels a bit like we are in the Twilight Zone, that’s because we are. What I mean is that we are already in the twilight of the flu season. If SARS-CoV-2 turns out to be just a Kafka-esque guest who has been among us for the 2019 to 2020 flu season, then at some point the meticulously recorded and earnestly reported “incidence growth” of coronavirus will stall and fall—thereby releasing the spellbound public from self-captivity and other forms of quarantine. Before we know it everyone will be saying, “I knew it,” and this horror story about the plague of the century could fade into a vague memory as if it never happened.

But before that happens, we should really get to the bottom of this while we are caught in the vortex of fear lest we want to be visited by unwanted sequels every two to five years. At the center of this powerful vortex is the principal agent problem that infected human civilization at its roots at the end of the kin tribe age of human social evolution. Whereas humans were once fed, informed and governed by those who had our best interest at heart (a biological algorithm known as inclusive fitness), in post-diaspora melting pots we are fed, informed and governed by those who have their own best interest at heart. Without mutual kin skin in the game to protect against self-dealing, powerful institutions began arising all over the ancient world that ruled over instead of on behalf of the people. Today’s fake news, fake foods and fake leadership culture are all catalyzed by the same underlying cause of misaligned incentives that have been derailing human sociality and befuddling revolutionaries for thousands of years. It was The Who—not to be confused with the WHO—who pointed out that the new boss is always the same as the old boss.

So what I hope happens to the story from here is that we begin addressing the first-order cause of human social dysfunctions rather than whack-a-moling its second-order symptoms. Simply put, our family values did not scale as we globalized, but virality has. The aggregate sum of everyone’s wonderful instincts to provide for family—the profit motive in today’s world—has produced the unintended externality of the principal agent problem in the post kin tribe era of human evolution. We propose a radically different path forward: by innovating new forms of inclusive stakeholding beyond just kin skin in the game—to align institutions with the people and people with each other—competition and natural inclinations will select for race-to-the-top global outcomes rather than race-to-the-bottom ones.

That’s a self-reinforcing trend I can get behind.

Joon Yun, MD, is the president of Palo Alto Investors and coauthor of the book Essays on Inclusive Stakeholding.

Footnote: Facts on the 2003 Global SARS Outbreak (Source: CDC)

How many people contracted SARS worldwide during the 2003 outbreak? How many people died of SARS worldwide?
During November 2002 through July 2003, a total of 8,098 people worldwide became sick with severe acute respiratory syndrome that was accompanied by either pneumonia or respiratory distress syndrome (probable cases), according to the World Health Organization (WHO). Of these, 774 died. By late July 2003, no new cases were being reported, and WHO declared the global outbreak to be over. For more information on the global SARS outbreak of 2003, visit WHO’s SARS website.

How many people contracted SARS in the United States during the 2003 outbreak? How many people died of SARS in the United States?
In the United States, only eight persons were laboratory-confirmed as SARS cases. There were no SARS-related deaths in the United States. All of the eight persons with laboratory-confirmed SARS had traveled to areas where SARS-CoV transmission was occurring.


Climate Beauty Pageants

Exxon CEO Calls Rivals’ Climate Goals a ‘Beauty Competition’ reported in the Houston Chronicle. Excerpts in italics with my bolds.

“Individual companies setting targets and then selling assets to another company so that their portfolio has a different carbon intensity has not solved the problem for the world,” Exxon Mobil CEO Darren Woods says.

Exxon Mobil Corp. dismissed long-term pledges by some of its Big Oil rivals to reduce carbon dioxide emissions as nothing more than a “beauty competition” that would do little to halt climate change.

Energy companies need to focus on global, systemic efforts to reduce greenhouse gases, rather than just replacing their own emissions-heavy assets with cleaner ones to make themselves look good, Chief Executive Officer Darren Woods said in New York on Thursday.

“Individual companies setting targets and then selling assets to another company so that their portfolio has a different carbon intensity has not solved the problem for the world,” Woods said at Exxon’s analyst day. Exxon is focused on “taking steps to solve the problem for society as a whole and not try and get into a beauty competition.”  Woods’ remarks, which echo those made by Chevron Corp. CEO Mike Wirth earlier this week, underscore the divide between U.S. and European oil explorers in their approach to addressing climate change.

Both American companies see oil and gas demand growing for decades and refuse to compete in a crowded market for renewables where they have little expertise.

Much-derided plastic even came in for some praise, with Exxon Senior Vice President Jack Williams arguing that it’s “a net benefit to society and to the environment.”

By contrast Royal Dutch Shell Plc, Repsol SA and Eni SpA have pledged to make large reductions in carbon emissions over the long term, while last month BP Plc went a step further with a target to become carbon neutral by 2050.

Companies changing their production mix “doesn’t change the demand” for oil and gas, Woods said. “If you don’t have a viable alternative set, all you’re doing is moving out from one company or one country to someplace else. It doesn’t solve the problem.”

Exxon sees world demand for oil and gas growing substantially out to 2040, even under the goals of the Paris Agreement, which seeks to limit temperature rise to 2 degrees Celsius above pre-industrial levels. Renewables such as wind and solar won’t be enough to meet demand growth on their own, according to Exxon.

In any case, it remains to be seen whether oil giants can generate big profits by producing carbon-free energy. Solar, wind and battery storage projects haven’t shown they can fund the huge dividends that underpin the industry’s investment case.

To underscore his point, Woods said that global emissions have risen 4% since the Paris Agreement was signed four years ago and energy demand is up 6%.

For the energy industry to truly address climate change, Woods believes major technological breakthroughs are needed in the fields of carbon capture, alternative fuels in transport and re-thinking industrial processes. The company is investing in all of these fields but admits that progress will take time.

Exxon is also taking steps to reduce emissions from its own operations including reducing methane emissions and gas flaring.

Speaking at the company’s annual investor day meeting, CEO Darren Woods stated that Exxon is “mindful of the current market environment.” However, Woods said that Exxon plans to maintain its current strategy of “leaning into this market when others have pulled back.”

Exxon intends to use “the strength of our balance sheet to invest through the cycle,” according to Woods. As such, it will outspend its cash flow when necessary to maintain its investment pace while also continuing to increase its dividend as it has for the last 37 consecutive years. It also aims to sell $15 billion in assets to help finance its investment plan.

While Exxon isn’t making any changes to its planned investment level, it is adjusting its development plan. Most notably, it will operate at a reduced pace in the Permian Basin over the next two years compared to its previous outlook. However, it still expects to produce more than 1 million barrels of oil equivalent per day from the region by 2024.

Exxon fully believes that energy demand will grow in the coming years. That’s why it’s taking advantage of the current environment to invest while costs are lower so that it can cash in on more favorable future market conditions.

Activists attempt to storm the Exxon Mobil bastion, here seen without their shareholder disguises.

Arctic Ice Exceeds Expectations March 5

Update on Large Arctic Ice Extents as of March 5, 2020

After crashing through the 15M km2 ceiling, both MASIE and SII show the extents holding over that amount.

In addition to surpluses in Bering and Okhotsk Seas in the Pacific, Barents Sea is now growing significant ice on the European side. At 823k km2, Bering is 145% of 2019 maximum, while Barents is 94% of 2019 max.

Background from Previous Posts

As noted in a previous February post, March marks the moment of truth regarding the Arctic maximum extent. Ten days later 2020 met the challenge.

For ice extent in the Arctic, the bar is set at 15M km2. The 13-year average peaks on day 62 at 15.04M km2 before descending. Six of the last 13 years cleared 15M, but recently only the 2014 and 2016 ice extents cleared the bar; the others came up short.

As of yesterday, 2020 cleared 15M km2 as recorded both by MASIE and SII.

During February MASIE and SII both showed ice extent hovering around the 13-year average, matching it exactly on day 52 at 14.85M km2. Then the ice cover shrank before growing strongly in the last five days to overtake the 13-year average on day 61 at 15.05M km2.

Region | 2020 day 060 | 13-yr average | 2020-Ave. | 2018 day 060 | 2020-2018 (all in km2)
 (0) Northern_Hemisphere 14999007 14987840 11167 14535979 463028
 (1) Beaufort_Sea 1070655 1070222 433 1070445 210
 (2) Chukchi_Sea 965972 963804 2168 965971 1
 (3) East_Siberian_Sea 1087137 1087039 98 1087120 18
 (4) Laptev_Sea 897845 897824 21 897845 0
 (5) Kara_Sea 919052 928455 -9403 921526 -2474
 (6) Barents_Sea 735450 634497 100953 512601 222848
 (7) Greenland_Sea 596926 621572 -24646 518130 78796
 (8) Baffin_Bay_Gulf_of_St._Lawrence 1464407 1544205 -79798 1783076 -318669
 (9) Canadian_Archipelago 854282 853074 1209 853109 1174
 (10) Hudson_Bay 1260887 1260890 -2 1260838 49
 (11) Central_Arctic 3247904 3211522 36382 3087802 160103
 (12) Bering_Sea 746111 674028 72083 340789 405322
 (13) Baltic_Sea 30173 103770 -73598 134750 -104577
 (14) Sea_of_Okhotsk 1110709 1097753 12956 1079823 30886
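
The difference columns in the table are simple subtractions; a quick check on a few rows (values in km2, copied from the table above; the table's own figures differ by 1 km2 in places due to rounding of the source data):

```python
# Regional day-60 extents (km2) copied from a few rows of the table above
regions = {
    "Barents_Sea": {"y2020": 735450, "avg": 634497, "y2018": 512601},
    "Bering_Sea":  {"y2020": 746111, "avg": 674028, "y2018": 340789},
    "Baltic_Sea":  {"y2020": 30173,  "avg": 103770, "y2018": 134750},
}

for name, r in regions.items():
    vs_avg = r["y2020"] - r["avg"]     # surplus (+) or deficit (-) vs 13-yr average
    vs_2018 = r["y2020"] - r["y2018"]  # change vs same day in 2018
    print(f"{name}: vs average {vs_avg:+}, vs 2018 {vs_2018:+}")
```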

As reported previously, Pacific sea ice is a big part of the story this year.  Out of the last 13 years, on day 52 only two years had Okhotsk ice extent higher than 2020, and only four years had higher Bering ice. Those surpluses offset a small deficit in Greenland Sea ice. And on day 61, the last push came from Bering and Okhotsk.

Typically, Arctic ice extent loses 67 to 70% of the March maximum by mid September, before the ice recovers, building toward the next March maximum.

What will the ice do this year?  Where will 2020 rank in the annual Arctic Ice High Jump competition?

Drift ice in Okhotsk Sea at sunrise.

 

NH Land & Ocean Air Warms in February


With apologies to Paul Revere, this post is on the lookout for cooler weather with an eye on both the Land and the Sea.  UAH has updated their tlt (temperatures in lower troposphere) dataset for February 2020.  Previously I have done posts on their reading of ocean air temps as a prelude to updated records from HADSST3. This month also has a separate graph of land air temps because the comparisons and contrasts are interesting as we contemplate possible cooling in coming months and years.

Presently sea surface temperatures (SST) are the best available indicator of heat content gained or lost from earth’s climate system.  Enthalpy is the thermodynamic term for total heat content in a system, and humidity differences in air parcels affect enthalpy.  Measuring water temperature directly avoids distorted impressions from air measurements.  In addition, ocean covers 71% of the planet surface and thus dominates surface temperature estimates.  Eventually we will likely have reliable means of recording water temperatures at depth.

Recently, Dr. Ole Humlum reported from his research that air temperatures lag 2-3 months behind changes in SST.  He also observed that changes in CO2 atmospheric concentrations lag behind SST by 11-12 months.  This latter point is addressed in a previous post Who to Blame for Rising CO2?

After a technical enhancement to HadSST3 delayed March and April updates, HadSST updates resumed a mid-month pattern in May. For comparison we can look at lower-troposphere temperatures (TLT) from UAHv6, which are now posted for February. The temperature record is derived from microwave sounding units (MSU) on board satellites like the one pictured above.

The UAH dataset includes temperature results for air above the oceans, and thus should be most comparable to the SSTs. There is the additional feature that ocean air temps avoid Urban Heat Islands (UHI). Recently there was a change in UAH processing of satellite drift corrections, including dropping one platform which can no longer be corrected. The graphs below are taken from the new and current dataset.

The graph below shows monthly anomalies for ocean air temps since January 2015. After a June rise in ocean air temps, all regions dropped back down to May levels in July and August. A spike occurred in September, followed by plummeting October ocean air temps in the Tropics and SH. In November that drop partly reversed, then anomalies leveled off, drifting slightly downward with continued cooling in NH.

2020 started with NH warming slightly, still cooler than the previous months back to September.  SH and Tropics also rose slightly resulting in a Global rise. Now in February there is an anomaly spike of 0.32C in NH, rarely seen in the ocean data.  The Tropics and SH also rose, resulting in an uptick Globally.

Land Air Temperatures Showing a Seesaw Pattern

We sometimes overlook that in climate temperature records, while the oceans are measured directly with SSTs, land temps are measured only indirectly.  The land temperature records at surface stations sample air temps at 2 meters above ground.  UAH gives tlt anomalies for air over land separately from ocean air temps.  The graph updated for February 2020 is below.

Here we have fresh evidence of the greater volatility of the land temperatures, along with extraordinary departures, first by SH land followed by NH. Despite the small amount of SH land, it spiked in July, then dropped in August so sharply, along with the Tropics, that it pulled the global average downward against slight warming in NH. In November SH jumped up beyond any month in this period. Despite this spike, along with a rise in the Tropics, NH land temps dropped sharply. The larger NH land area pulled the global average downward. December reversed the situation: SH dropped as sharply as it had risen, while NH rose to the same anomaly, pulling the global average up slightly.

2020 started with sharp drops in both SH and NH, with the Global anomaly dropping as a result. Now in February comes a spike of 0.42C in NH land air, nearing 2016 levels. Meanwhile SH land continued dropping. The behavior of SH and NH land temps is puzzling, to say the least. It is also a reminder that global averages can conceal important underlying volatility.

The longer-term picture from UAH is a return to the mean for the period starting in 1995. The 2019 average rose, but currently there is no El Nino to sustain it.

TLTs include mixing above the oceans and probably some influence from nearby more volatile land temps.  Clearly NH and Global land temps have been dropping in a seesaw pattern, more than 1C lower than the 2016 peak, prior to these last several months. TLT measures started the recent cooling later than SSTs from HadSST3, but are now showing the same pattern.  It seems obvious that despite the three El Ninos, their warming has not persisted, and without them it would probably have cooled since 1995.  Of course, the future has not yet been written.

Look Before Leaping into Climate Policies

Ross McKitrick writes at National Post ‘Believing the science’ on climate change doesn’t mean any policy goes.  Excerpts in italics with my bolds

Mainstream science and economics do not support much of the current climate policy agenda and certainly not the radical extremes demanded by activist groups

There’s an assumption out there that if you “accept” the science of climate change, you are obliged to support drastic measures to cut greenhouse gas (GHG) emissions. This is not true. The one does not follow from the other. Mainstream science and economics do not support much of the current climate policy agenda and certainly not the radical extremes demanded by activist groups.

Elements of Integrated Assessment Models, or IAMs.

In a recent peer-reviewed paper, my co-authors and I proved this using one of the economic models governments and academics around the world rely on. Policy-makers compute the social costs of GHG emissions using tools called “integrated assessment models” (IAMs), which contain linked climate and economic models. They run the world forward in time for a few hundred years and estimate the value of damages from a tonne of GHGs emitted today. Pardon all the acronyms but that’s called the “social cost of carbon,” or SCC, and it represents an upper bound on what we should pay per tonne to cut emissions.

The higher the SCC, the more aggressive climate policy should be. During the Obama years the U.S. Environmental Protection Agency (EPA) convened an expert group to use the three best-known IAMs to estimate the SCC from now to the middle of this century to guide regulatory rule-making. Most of their results were in the US$20 to US$60 per tonne range, depending on the discount rate (which controls how much weight to put on far-future damages). The benefit of climate policy is to get rid of this future damage. If the damage is US$60 per tonne, then policies costing more than $60 per tonne of reduction don’t make sense. You wouldn’t spend more than a dollar to save a dollar.
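The role of the discount rate described above can be seen in a toy calculation. This is an illustrative sketch only, not an actual IAM: the damage stream (a hypothetical $1 of damage per year for 200 years from one tonne emitted today) is made up, but it shows why a higher discount rate shrinks the present value of far-future damages, and hence the SCC:

```python
# Illustrative only: a hypothetical damage stream, not a real IAM.
# Shows how the discount rate controls the present value of
# far-future damages from one tonne emitted today.

def present_value(damages, rate):
    """Discount a sequence of annual damages (dollars) back to today."""
    return sum(d / (1 + rate) ** t for t, d in enumerate(damages))

# Suppose one tonne causes $1 of damage per year for 200 years (made up).
damages = [1.0] * 200

for rate in (0.025, 0.05, 0.07):
    print(f"discount rate {rate:.1%}: present value = ${present_value(damages, rate):,.2f}")
```

The same total damage is worth far less today at 7% than at 2.5%, which is why the EPA's SCC estimates spanned a wide US$20-US$60 range depending on the rate chosen.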

Like all models, IAMs depend on key parameters that are drawn from the scientific literature. It has long been known that although CO2 is a greenhouse gas, it’s also food for plants. So extra CO2 in the air benefits plant growth. Yet two of the EPA’s three IAMs assumed that boosting the carbon dioxide content of the air has no effect on agriculture, which is overly pessimistic. Only one of the models allows for a small gain in agricultural productivity as CO2 levels rise, based on estimates from the 1990s of the size of the effect. So that’s the one we used.

However, we first updated the IAM to take account of the extensive research since the 1990s looking at effects on global plant growth from rising CO2 levels. Results from satellite-based surveys and field experiments have shown larger benefits than people predicted in the 1990s, even in a warming climate, especially for the rice crop in Asia.

Also, all the IAMs assume the climate will warm by three degrees Celsius with every CO2 doubling. This is based on simulations with large climate models, but there have been many recent studies in climate journals estimating lower sensitivity based on observed ground- and satellite-measured temperature changes. So we incorporated this information into the IAM as well.
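The "three degrees per doubling" figure above is the equilibrium climate sensitivity in the standard logarithmic forcing relationship, where warming scales with the logarithm of the CO2 ratio. A minimal sketch comparing the 3 C model default against a lower observation-based value (the 1.5 C figure is an illustrative assumption, not a number from the paper):

```python
import math

def warming(c_now, c_pre, sensitivity):
    """Equilibrium warming (degrees C) for a CO2 rise from c_pre to c_now
    (ppm), given a sensitivity in degrees C per doubling of CO2."""
    return sensitivity * math.log2(c_now / c_pre)

pre = 280.0  # pre-industrial CO2, ppm
now = 560.0  # a full doubling

for s in (3.0, 1.5):  # IAM default vs. an illustrative lower estimate
    print(f"sensitivity {s} C/doubling -> {warming(now, pre, s):.2f} C warming")
```

Halving the sensitivity parameter halves the projected warming for any given emissions path, which is why this single input moves the SCC so strongly.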

Based on these updates alone, we showed that, even using a low discount rate, the social cost of carbon as of 2020 drops from US$32 per tonne to about 60 cents, and there’s a 50/50 chance it’s below zero.

It does grow over time but not by much. By 2050 it’s still under $3 per tonne and has a 46 per cent chance of being less than zero.

Note that we did not say “climate change is a hoax so we shouldn’t do anything.” We relied on scientific studies in mainstream journals, combined with one of the Obama-era EPA’s own preferred economic models, to determine if costly climate policies are justified. The answer is no, at least not for the next few decades.

Our paper was reviewed by three knowledgeable anonymous experts who were surprised by our findings and aggressively challenged them, with one strongly recommending our study be rejected. We had to rebut their extensive counterarguments in detail. We were able to defend our calculations and the journal decided in our favour.

If you don’t believe the science of climate change, then you obviously won’t support carbon taxes and other such policies. But it’s important to note that if you do accept the science, you aren’t obliged to support every policy, no matter how costly or inconvenient, that gets put forward. We should still focus on no-regrets strategies where the benefits outweigh the costs.

Ross McKitrick is a professor of economics at the University of Guelph and a senior fellow at the Fraser Institute.

Background:

As the three-legged stool graphic above shows, the climate change package rests on three premises. The first leg is the science bit, consisting of an unproven claim that observed warming is caused by humans burning fossil fuels. The second leg rests on impact studies from billions of research dollars spent uncovering any and all possible negatives from warming. And the third leg is climate initiatives (policies) showing how governments can “fight climate change.”

Many posts here address the follies of proposed climate policies under the theme Climate Initiatives.

Arctic Ice Abounds March 1

As noted in a previous post ten days ago, March marks the moment of truth regarding the Arctic maximum extent. Now, ten days later, 2020 has met the challenge.

For ice extent in the Arctic, the bar is set at 15M km2. Over the last 13 years, the average extent peaks on day 62 at 15.04M km2 before descending. Six of the last 13 years cleared 15M km2, but recently only the 2014 and 2016 ice extents did so; the others came up short.

As of yesterday, 2020 cleared 15M km2 as recorded both by MASIE and SII.

During February, MASIE and SII both showed ice extent hovering around the 13-year average, matching it exactly on day 52 at 14.85M km2. The ice cover then shrank before growing strongly over the last five days to overtake the 13-year average on day 61 at 15.05M km2.

Extents in km2 on day 060; “Average” is the 13-year day 060 average.

Region | 2020 Day 60 | Day 60 Average | 2020−Ave. | 2018 Day 60 | 2020−2018
(0) Northern_Hemisphere | 14999007 | 14987840 | 11167 | 14535979 | 463028
(1) Beaufort_Sea | 1070655 | 1070222 | 433 | 1070445 | 210
(2) Chukchi_Sea | 965972 | 963804 | 2168 | 965971 | 1
(3) East_Siberian_Sea | 1087137 | 1087039 | 98 | 1087120 | 18
(4) Laptev_Sea | 897845 | 897824 | 21 | 897845 | 0
(5) Kara_Sea | 919052 | 928455 | -9403 | 921526 | -2474
(6) Barents_Sea | 735450 | 634497 | 100953 | 512601 | 222848
(7) Greenland_Sea | 596926 | 621572 | -24646 | 518130 | 78796
(8) Baffin_Bay_Gulf_of_St._Lawrence | 1464407 | 1544205 | -79798 | 1783076 | -318669
(9) Canadian_Archipelago | 854282 | 853074 | 1209 | 853109 | 1174
(10) Hudson_Bay | 1260887 | 1260890 | -2 | 1260838 | 49
(11) Central_Arctic | 3247904 | 3211522 | 36382 | 3087802 | 160103
(12) Bering_Sea | 746111 | 674028 | 72083 | 340789 | 405322
(13) Baltic_Sea | 30173 | 103770 | -73598 | 134750 | -104577
(14) Sea_of_Okhotsk | 1110709 | 1097753 | 12956 | 1079823 | 30886

As reported previously, Pacific sea ice is a big part of the story this year.  Out of the last 13 years, on day 52 only two years had Okhotsk ice extent higher than 2020, and only four years had higher Bering ice. Those surpluses offset a small deficit in Greenland Sea ice. And on day 61, the last push came from Bering and Okhotsk.

Typically, Arctic ice extent loses 67 to 70% of its March maximum by mid-September, before rebuilding toward the next March maximum.
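The 67 to 70% seasonal loss quoted above can be turned into a rough September minimum, starting from the 13-year average maximum of 15.05M km2 cited earlier. A quick arithmetic check:

```python
# Rough check of the seasonal cycle: how much ice survives to
# mid-September if 67-70% of the March maximum melts out.
march_max = 15.05  # M km2, the 13-year average maximum cited above

for loss in (0.67, 0.70):
    sept_min = march_max * (1 - loss)
    print(f"{loss:.0%} loss -> September minimum of about {sept_min:.2f} M km2")
```

That puts the expected September minimum in the neighborhood of 4.5 to 5.0M km2, which is the range against which this year's melt season can be judged.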

What will the ice do this year?  Where will 2020 rank in the annual Arctic Ice High Jump competition?

Drift ice in Okhotsk Sea at sunrise.