Summer “Hothouse” Silliness

This summer’s heat waves are having an unfortunate side effect. Some scientists who should know better are shouting wild claims as though their heads were exploding.  Paleoclimatologists use terms like “Hothouse” Earth and “Icehouse” Earth to refer to our planet’s climate shifts over many eons.  One good old-fashioned hot summer is not a transition, or even a harbinger of a “Hothouse” world.  More importantly, the distribution of temperatures in a warmer world is not the hell on earth depicted by these folks who have lost their bearings.

A powerful post by Clive Best describes how earth’s surface temperatures change by means of changing meridional heat transfers. See Meridional Warming.

The key point for me was seeing how the best geological knowledge shows beyond a shadow of a doubt how the earth’s climate profile shifts over time, as presented in the diagram above.  It comes from esteemed paleoclimatologist Christopher Scotese.  His complete evidence and analysis can be reviewed in his article Some thoughts on Global Climate Change: The Transition from Icehouse to Hothouse (here).

In that essay Scotese shows where we are presently in this cycle between icehouse and hothouse.

As of 2015 earth is showing a GMT of 14.4C, compared to pre-industrial GMT of 13.8C.  According to the best geological evidence from millions of years of earth’s history, that puts us leaving the category “Severe Icehouse,” and nearing “Icehouse.”  So, thankfully we are warming up, albeit very slowly.

Moreover, and this is Clive Best’s point, progress toward a warming world means flattening the profile at the higher latitudes, especially the Arctic.  Equatorial locations remain at 23C throughout the millennia, while the gradient decreases in a warmer world.

A previous related post explained what is wrong with averaging temperature anomalies.  See Temperature Misunderstandings

Conclusion:

We have many, many centuries to go before the earth can warm up to the “Greenhouse” profile, let alone get to “Hothouse.”  Regional and local climates at higher latitudes will see slightly warming temperatures and smaller differences from equatorial climates.  These are facts based on solid geological evidence, not opinions or estimates from computer models.

It is still a very cold world, but we are moving in the right direction.  Stay the course.

Meanwhile, keep firing away Clive.


Fighting Plasticphobia


Despite the welcome presence of plastic items in our lives, there is mounting plasticphobia driven by the usual suspects: multimillion-dollar enterprises like Greenpeace, Sierra Club, Natural Resources Defense Council, etc. The media and websites stoke fears and concerns about traces of chemicals used in making plastic products. In this overheated climate of fear, the basic facts bear restating, so this post exposes widely circulated poppycock about plastics with facts to put things in perspective.

Definition: pop·py·cock (informal noun): nonsense.
Synonyms: rubbish, claptrap, balderdash, blather, moonshine, garbage. Origin: mid 19th century, from Dutch dialect pappekak, from pap ‘soft’ + kak ‘dung.’

Below are some points to consider in favor of plastics.  Examples below frequently mention plastic bags, bottles, and food containers, all subject to demonizing reports from activists.

Plastics are functional.

It feels like we have always had plastic. It is so widespread in our lives that it’s hard to imagine a time without it. But in reality, plastic products were only introduced in the 1950s. That was a time when the Earth’s population was 2.5 billion people and the global annual production of plastic was 1.5 million tonnes. Now, nearly 70 years later, plastic production exceeds 300 million tonnes a year and the world population is on its way to 8 billion. If this trend continues, another 33 billion tonnes of plastic will have accumulated around the planet by 2050.

Versatile plastics inspire innovations that help make life better, healthier and safer every day. Plastics are used to make bicycle helmets, child safety seats and airbags in automobiles. They’re in the cell phones, televisions, computers and other electronic equipment that makes modern life possible. They’re in the roofs, walls, flooring and insulation that make homes and buildings energy efficient. And plastics in packaging help keep foods safe and fresh.

Example: Conventional plastic shopping bags are not just a convenience, but a necessity. They are multi-use, multi-purpose bags with a shorter life: beyond carrying groceries, they facilitate impulse purchases and are reused to manage household and pet waste and, in Toronto, organics collection. They have a very high alternate-use rate in Ontario of 59.1% (Ontario MOE data).

A move to reusable bags will not eliminate the need for shorter-life bags. Householders will have to supplement their reusable bags with paper or kitchen-catcher-type bags for household and pet waste. In Ireland, the virtual elimination of plastic bags through a high bag tax led to a 77% increase in the purchase of kitchen catchers, which contain up to 76% more plastic than conventional plastic bags, and a 21% increase in plastics consumed. That the bags are a necessity is reinforced by Decima Research, which found that 76% of Canadians would purchase kitchen catchers if plastic shopping bags were not available at retail checkouts.

Beyond bags, manufacture of goods such as automobiles increasingly use plastics to reduce weight and fuel consumption, as well as meet requirements for recycling.

Plastics are Cheap.
Alternatives consume much more energy. Plastics are made mostly of petroleum refining by-products.

Paper bags generate 50 times more water pollutants, and 70% more emissions than plastic bags.
Plastic bags generate 80% less solid waste than paper bags.
Plastic bags use 40% less energy as compared to paper bags.
Even paper bags manufactured from recycled fiber use more fossil fuels than plastic bags.

On top of all this, if plastic bag bans like California’s end up causing people to use more paper bags — instead of bringing their reusable ones to the store — it’ll certainly end up being worse for the environment. Research shows that making a paper bag consumes about four times more energy than a plastic bag, and produces about four times more waste if it’s not recycled.

These numbers can vary based on agricultural techniques, shipping methods, and other factors, but when you compare plastic bags with food, it’s not even close. Yet for whatever reason, we associate plastic bags — but not food production — with environmental degradation. If we care about climate change, cutting down on food waste would be many, many times more beneficial than worrying about plastic bags.

Plastics are Durable.
Plastics are highly inert, do not easily degrade or decompose. Without sunlight, they can last for centuries.

Almost all bags are reusable; even the conventional plastic shopping bag has a reuse rate of 40–60% in Canada.

Conventional plastic shopping bags are highly recyclable in Canada because there is a strong recycling network across the country. Recycling rates are quite high in most provinces.

Plastics are Abused.
Because plastic items are useful, cheap and durable, people leave them around as litter.
Plastics should be recycled or buried in landfill.

In a properly engineered landfill, nothing is meant to degrade. No bag – reusable or conventional plastic shopping bag – will decompose in landfill, which actually helps the environment by not producing greenhouse gases like methane.

This myth is based on a common misunderstanding of the purpose of landfills and how they work. Modern landfills are engineered to entomb waste and prevent the decomposition that creates harmful greenhouse gases like methane and carbon dioxide.

Plastics are Benign.
Plastics are not toxic, nor do they release greenhouse gases.

In Canada, plastic shopping bags are primarily made from a by-product of natural gas production, ethane.

Polyethylene bags are made out of ethane, a component of natural gas. Ethane is extracted to lower the BTU value of the gas in order to meet pipeline and gas utility specifications and so that the natural gas doesn’t burn too hot when used as fuel in our homes or businesses. The ethane is converted, and its BTU value is “frozen” into a solid form (polyethylene) using a catalytic process to make a plastic shopping bag.

There have been claims that chemicals in plastics can leach into food or drink and cause cancer. In particular, there have been rumours about chemicals called Bisphenol A (BPA) and dioxins. Hoax emails have spread warnings about dioxins being released when plastic containers are reused, heated or frozen. The emails are credited to Johns Hopkins University in America, but the university denies any involvement.

Some studies have shown that small amounts of chemicals from plastic containers can end up in the food or drinks that are kept inside them. But the levels of these are very low.

Studies may also look at the effect of these chemicals on human cells. But they will often expose them to much higher levels than people are exposed to in real life. These levels are also much higher than the limits which are allowed in plastic by law. There is no evidence to show using plastic containers actually causes cancer in humans.

The European Food Safety Authority did a full scientific review of BPA in 2015 and decided there was no health risk to people of any age (including unborn children) at current BPA exposure levels. They are due to update this in 2018.

In the UK there is very strict regulation about plastics and other materials that are used for food or drink. These limits are well below the level which could cause harm in humans.

“Generally speaking, any food that you buy in a plastic container with directions to put it in the microwave has been tested and approved for safe use,” says George Pauli, associate director of Science and Policy at the US FDA’s Center for Food and Safety and Applied Nutrition.

Elizabeth Whelan, of the American Council on Science and Health, a consumer-education group in New York, thinks that the case against BPA and phthalates has more in common with those against cyclamates and Alar than with the one against lead. “The fears are irrational,” she said. “People fear what they can’t see and don’t understand. Some environmental activists emotionally manipulate parents, making them feel that the ones they love the most, their children, are in danger.” Whelan argues that the public should focus on proven health issues, such as the dangers of cigarettes and obesity and the need for bicycle helmets and other protective equipment. As for chemicals in plastics, Whelan says, “What the country needs is a national psychiatrist.”

Plastics are A Scapegoat.
Rather than using plastics responsibly, some advocate banning them.

The type of bag you use matters less than what you put into it. When it comes to both climate change and trash production, eliminating plastic bags is a symbolic move, not a substantial one. Encouraging people to cut down on food waste, on the other hand, would actually mean something.

Litter audit data from major Canadian municipalities shows that plastic shopping bags are less than 1% of litter. The City of Toronto 2012 Litter Audit shows that plastic shopping bags were 0.8% of the entire litter stream.

Focusing on the less than 1% of litter that is plastic bags does nothing about the other 99%. Litter is a people problem, not a product problem. Even if you removed all plastic shopping bag litter, 99% of the litter would remain.

Believe it or not, plastic bags are one of the most energy-efficient things to manufacture. Less than 0.05% of a barrel of crude oil is used for manufacturing plastic bags in the US, while 93% to 95% of each barrel goes to heating and fuel.

In fact, most of the plastic bags used in the US – 85% of them, to be exact – are made from natural gas. And although plastic bags are made from natural gas and crude oil, the fossil fuels they consume over their lifetime are significantly less than those of paper bags and compostable plastic.

So, banning or taxing plastic bags will do nothing to curb America’s oil consumption. After all, bags consume only a tiny fraction of each barrel.

Resources:  

https://news.grida.no/debunking-a-few-myths-about-marine-litter

http://www.allaboutbags.ca/myths.html

https://science.howstuffworks.com/environmental/green-science/paper-plastic1.htm

Top 7 Myths about Plastic Bags

https://www.plasticstoday.com/extrusion-pipe-profile/fear-plastics-and-what-do-about-it/13536413858942

https://www.cancerresearchuk.org/about-cancer/causes-of-cancer/cancer-controversies/plastic-bottles-and-food-containers

https://www.newyorker.com/magazine/2010/05/31/the-plastic-panic

https://www.webmd.com/food-recipes/features/mixing-plastic-food-urban-legend#3

Ocean Air Temps Tepid in July

Presently sea surface temperatures (SST) are the best available indicator of heat content gained or lost from earth’s climate system.  Enthalpy is the thermodynamic term for total heat content in a system, and humidity differences in air parcels affect enthalpy.  Measuring water temperature directly avoids distorted impressions from air measurements.  In addition, ocean covers 71% of the planet surface and thus dominates surface temperature estimates.  Eventually we will likely have reliable means of recording water temperatures at depth.
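To illustrate the enthalpy point: two air parcels at the same temperature can hold very different amounts of heat depending on their humidity. Here is a minimal sketch using the standard psychrometric approximation for moist-air enthalpy (the parcel values are illustrative, not measured data):

```python
# Moist-air specific enthalpy (kJ per kg of dry air), standard approximation:
# h = 1.006*T + w*(2501 + 1.86*T), with T in Celsius and w the mixing ratio (kg/kg).
def moist_enthalpy(T_c: float, w: float) -> float:
    return 1.006 * T_c + w * (2501.0 + 1.86 * T_c)

# Same 30C air temperature, different humidity: the humid parcel carries far
# more heat, which is why air temperature alone can mislead about heat content.
dry = moist_enthalpy(30.0, 0.005)    # drier parcel
humid = moist_enthalpy(30.0, 0.020)  # more humid parcel
print(round(dry, 1), round(humid, 1))
```

Measuring water temperature sidesteps this ambiguity, which is the argument for preferring SSTs.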

Recently, Dr. Ole Humlum reported from his research that air temperatures lag 2-3 months behind changes in SST.  He also observed that changes in CO2 atmospheric concentrations lag behind SST by 11-12 months.  This latter point is addressed in a previous post Who to Blame for Rising CO2?

The July update to HadSST3 will appear later this month, but in the meantime we can look at lower troposphere temperatures (TLT) from UAHv6 which are already posted for July. The temperature record is derived from microwave sounding units (MSU) on board satellites like the one pictured above.

The UAH dataset includes temperature results for air above the oceans, and thus should be most comparable to the SSTs. There is the additional feature that ocean air temps avoid Urban Heat Islands (UHI).  The graph below shows monthly anomalies for ocean temps since January 2015.

UAH Oceans 2018-07

The anomalies are holding close to the same levels as 2015. In July, both the Tropics and SH rose, while NH rose very slightly, resulting in a small increase in the Global average of air over oceans. Taking a longer view, we can look at the record since 1995, that year being an ENSO neutral year and thus a reasonable starting point for considering the past two decades.  On that basis we can see the plateau in ocean temps is persisting. Since last October all oceans have cooled, with offsetting bumps up and down.

UAHv6 TLT Monthly Ocean Anomalies (C)

Region     Average since 1995     July 2018
Global     0.13                   0.21
NH         0.16                   0.30
SH         0.11                   0.15
Tropics    0.13                   0.29

As of July 2018, global ocean temps are slightly higher than both June and the average since 1995.  NH remains virtually the same, while both SH and Tropics rose, making the global temp warmer.  Global, NH and SH are matching July temps in 2015, while the Tropics are the lowest for July since 2013.
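For concreteness, here is a minimal Python sketch comparing the July 2018 anomalies with their averages since 1995, using the values in the table above:

```python
# Values transcribed from the UAHv6 TLT ocean-anomaly table above (degrees C).
avg_since_1995 = {"Global": 0.13, "NH": 0.16, "SH": 0.11, "Tropics": 0.13}
july_2018     = {"Global": 0.21, "NH": 0.30, "SH": 0.15, "Tropics": 0.29}

# Difference of each region's July 2018 anomaly from its long-term average.
delta = {r: round(july_2018[r] - avg_since_1995[r], 2) for r in avg_since_1995}
print(delta)  # every region sits above its 1995-onward average this month
```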

The details of UAH ocean temps are provided below.  The monthly data make for a noisy picture, but seasonal fluxes between January and July are important.


The greater volatility of the Tropics is evident, leading the oceans through three major El Nino events during this period.  Note also the flat period between 7/1999 and 7/2009.  The 2010 El Nino was erased by La Nina in 2011 and 2012.  Then the record shows a fairly steady rise peaking in 2016, with strong support from warmer NH anomalies, before returning to the 22-year average.

Summary

TLTs include mixing above the oceans and probably some influence from nearby more volatile land temps.  They started the recent cooling later than SSTs from HadSST3, but are now showing the same pattern.  It seems obvious that despite the three El Ninos, their warming has not persisted, and without them it would probably have cooled since 1995.  Of course, the future has not yet been written.

 

Duped into War on Plastic

Step aside Polar Bear, It’s Turtle Time!

Every day now, everywhere in the media, someone else is lamenting the presence of plastics and proposing bans on straws and other plastic items.  Terence Corcoran in the Financial Post explains how we got here:  How green activists manipulated us into a pointless war on plastic.  Excerpts in italics with my bolds.

The disruptive Internet mass-persuasion machine controlled by the major corporate tech giants, capable of twisting and manipulating the world’s population into believing concocted stories and fake news, is at it again, this time spooking billions of people into panic over plastic. Except…hold on: Those aren’t big greedy corporations and meddling foreign governments flooding the blue planet with alarming statistics of microplastics in our water and gross videos of turtles with straws stuck up their noses and dead birds with bellies stuffed with plastic waste.

As Earth Day/Week 2018 came to a close, the greatest professional twisters and hypers known to modern mass communications — green activists and their political and mainstream media enablers — had succeeded in creating a global political wing-flap over all things plastic.

That turtle video, viewed by millions and no doubt many more through Earth Day, is a classic of the genre, along with clubbed baby seals and starving polar bears. Filmed in 2015 near waters off the coast of Guanacaste, Costa Rica, the video was uploaded as news by the Washington Post in 2017 and reworked last week by the CBC into its series on the curse of plastics: “ ‘We need to rethink the entire plastics industry’: Why banning plastic straws isn’t enough.”

New York Governor Andrew Cuomo introduced a bill to ban single-use plastic shopping bags. In Ottawa, Prime Minister Justin Trudeau wants the G7 leaders to sign on to a “zero plastics waste charter” while British Prime Minister Theresa May promised to ban plastic straws.

No need for a secret data breach, a Russian bot or a covert algorithm to transform a million personal psychological profiles into malleable wads of catatonic dough. All it takes is a couple of viral videos, churning green activists and a willing mass media. “Hello, is this CBC News? I have a video here of a plastic straw being extracted from the nose of sea turtle. Interested?”

One turtle video is worth 50-million Facebook data breaches, no matter how unlikely the chances are that more than one turtle has faced the plastic-straw problem. If the object in the unfortunate turtle’s nasal passage was a plastic straw (was it analyzed?), it would have likely come from one of the thousands of tourists who visit Costa Rica to watch hundreds of plastics-free healthy turtles storm the country’s beaches for their annual egg-hatching ritual.

That the turtles are not in fact threatened by plastic straws would be no surprise. It is also hard to see how banning straws in pubs in London and fast-food joints in Winnipeg would save turtles in the Caribbean or the Pacific Ocean.

Creating such environmental scares is the work of professional green activists. A group called Blue Ocean Network has been flogging the turtle video for three years, using a propaganda technique recently duplicated by polar-bear activists. Overall, the plastic chemical scare follows a familiar pattern. Canadians will remember Rick Smith, the former executive director of Environmental Defense Canada and co-author of a 2009 book, Slow Death by Rubber Duck. In the book, Smith warned of how the toxic chemistry of everyday life was ruining our health, reducing sperm counts and threatening mothers and all of humanity. The jacket cover of Slow Death included a blurb from Sophie Gregoire-Trudeau, who expressed alarm about all the chemicals “absorbed into our bodies every day.”

To mark Earth Day 2018, orchestrated as part of a global anti-plastics movement, Smith was back at his old schtick with an op-ed in The Globe and Mail in which he warns “We must kill plastics to save ourselves.” Smith, now head of the left-wing Broadbent Institute in Ottawa, reminded readers that since his Slow Death book, a new problem has emerged, “tiny plastic particles (that) are permeating every human on earth.” He cites a study that claimed “83 per cent of tap water in seven countries was found to contain plastic micro-fibres.”

You would think Smith would have learned by now that such data is meaningless. Back in 2009, Smith issued a similar statistical warning. “A stunning 93 per cent of Americans tested have measurable amounts of BPA in their bodies.” BPA (bisphenol A) is a chemical used in plastics that Smith claimed produced major human-health problems.

Turns out it wasn’t true. The latest — and exhaustive — science research on BPA, published in February by the U.S. National Toxicology Program, concluded that “BPA produces minimal effects that were indistinguishable from background.” Based on this comprehensive research, the U.S. Food and Drug Administration said current uses of BPA “continue to be safe for consumers.”

Might the same be true for barely-measurable amounts of micro-plastics found today in bottled water and throughout the global ecosystem? That looks possible. A new meta-analysis of the effects of exposure to microplastics on fish and aquatic invertebrates suggests there may be nothing to worry about. Dan Barrios-O’Neill, an Irish researcher who looked at the study, tweeted last week that “There are of course many good reasons to want to curb plastic use. My reading of the evidence — to date — is that negative ecological effects might not be one of them.”

Instead of responding to turtle videos and images of plastic and other garbage swirling in ocean waters with silly bans and high talk of zero plastic waste, it might be more useful to zero in on the real sources of the floating waste: governments that allow it to be dumped in the oceans in the first place.

Update: August 4, 2018

Claim: There is a “sea of plastic” the size of Texas in the North Pacific Gyre north of Hawaii

First question: have you ever seen an aerial or satellite photograph of the “sea of plastic”? Probably not, because it doesn’t really exist. But it makes a good word-picture, and after all plastic is full of deadly poisons and is killing seabirds and marine mammals by the thousands.

This is also fake news and gives rise to calls for bans on plastic and other drastic measures. Silly people are banning plastic straws as if they were a dire threat to the environment. The fact is a piece of plastic floating in the ocean is no more toxic than a piece of wood. Wood has been entering the sea in vast quantities for millions of years. And in the same way that floating woody debris provides habitat for barnacles, seaweeds, crabs, and many other species of marine life, so does floating plastic. That’s why seabirds and fish eat the bits of plastic, to get the food that is growing on them. While it is true that some individual birds and animals are harmed by plastic debris, discarded fishnets in particular, this is far outweighed by the additional food supply it provides. Plastic is not poison or pollution, it is litter.

Patrick Moore, PhD

Footnote:  Dare I say it?  (I know you are thinking it.): “They are grasping at straws, with no end in sight.”

 

 

Halting Failed Auto Fuel Standards

There are deeper reasons why US auto fuel efficiency standards are and should be rolled back.  They were instituted in denial of regulatory experience and science.  First, a parallel from physics.

In the sub-atomic domain of quantum mechanics, Werner Heisenberg, a German physicist, determined that our observations have an effect on the behavior of quanta (quantum particles).

The Heisenberg uncertainty principle states that it is impossible to know simultaneously the exact position and momentum of a particle. That is, the more exactly the position is determined, the less known the momentum, and vice versa. This principle is not a statement about the limits of technology, but a fundamental limit on what can be known about a particle at any given moment. This uncertainty arises because the act of measuring affects the object being measured. The only way to measure the position of something is using light, but, on the sub-atomic scale, the interaction of the light with the object inevitably changes the object’s position and its direction of travel.
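In symbols, the principle sets a floor on the product of the position and momentum uncertainties, where ħ is the reduced Planck constant:

```latex
\Delta x \,\Delta p \;\ge\; \frac{\hbar}{2}
```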

Now skip to the world of governance and the effects of regulation. A similar finding shows that the act of regulating produces reactive behavior and unintended consequences contrary to the desired outcomes.

US Fuel Economy (CAFE) Standards Have Backfired

An article at Financial Times explains about Energy Regulations Unintended Consequences  Excerpts below with my bolds.

Goodhart’s Law holds that “any observed statistical regularity will tend to collapse once pressure is placed upon it for control purposes”. Originally coined by the economist Charles Goodhart as a critique of the use of money supply measures to guide monetary policy, it has been adopted as a useful concept in many other fields. The general principle is that when any measure is used as a target for policy, it becomes unreliable. It is an observable phenomenon in healthcare, in financial regulation and, it seems, in energy efficiency standards.

When governments set efficiency regulations such as the US Corporate Average Fuel Economy standards for vehicles, they are often what is called “attribute-based”, meaning that the rules take other characteristics into consideration when determining compliance. The Cafe standards, for example, vary according to the “footprint” of the vehicle: the area enclosed by its wheels. In Japan, fuel economy standards are weight-based. Like all regulations, fuel economy standards create incentives to game the system, and where attributes are important, that can mean finding ways to exploit the variations in requirements. There have long been suspicions that the footprint-based Cafe standards would encourage manufacturers to make larger cars for the US market, but a paper this week from Koichiro Ito of the University of Chicago and James Sallee of the University of California Berkeley provided the strongest evidence yet that those fears are likely to be justified.

Mr Ito and Mr Sallee looked at Japan’s experience with weight-based fuel economy standards, which changed in 2009, and concluded that “the Japanese car market has experienced a notable increase in weight in response to attribute-based regulation”. In the US, the Cafe standards create a similar pressure, but expressed in terms of size rather than weight. Mr Ito suggested that in Ford’s decision to end almost all car production in North America to focus on SUVs and trucks, “policy plays a substantial role”. It is not just that manufacturers are focusing on larger models; specific models are also getting bigger.

Ford’s move, Mr Ito wrote, should be seen as an “alarm bell” warning of the flaws in the Cafe system. He suggests an alternative framework with a uniform standard and tradeable credits, as a more effective and lower-cost option.

With the Trump administration now reviewing fuel economy and emissions standards, and facing challenges from California and many other states, the vehicle manufacturers appear to be in a state of confusion. An elegant idea for preserving plans for improving fuel economy while reducing the cost of compliance could be very welcome.

The paper is The Economics of Attribute-Based Regulation: Theory and Evidence from Fuel-Economy Standards Koichiro Ito, James M. Sallee NBER Working Paper No. 20500.  The authors explain:

An attribute-based regulation is a regulation that aims to change one characteristic of a product related to the externality (the “targeted characteristic”), but which takes some other characteristic (the “secondary attribute”) into consideration when determining compliance. For example, Corporate Average Fuel Economy (CAFE) standards in the United States recently adopted attribute-basing. Figure 1 shows that the new policy mandates a fuel-economy target that is a downward-sloping function of vehicle “footprint”—the square area trapped by a rectangle drawn to connect the vehicle’s tires.  Under this schedule, firms that make larger vehicles are allowed to have lower fuel economy. This has the potential benefit of harmonizing marginal costs of regulatory compliance across firms, but it also creates a distortionary incentive for automakers to manipulate vehicle footprint.

Attribute-basing is used in a variety of important economic policies. Fuel-economy regulations are attribute-based in China, Europe, Japan and the United States, which are the world’s four largest car markets. Energy efficiency standards for appliances, which allow larger products to consume more energy, are attribute-based all over the world. Regulations such as the Clean Air Act, the Family Medical Leave Act, and the Affordable Care Act are attribute-based because they exempt some firms based on size. In all of these examples, attribute-basing is designed to provide a weaker regulation for products or firms that will find compliance more difficult.
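To make the gaming incentive concrete, here is a hedged sketch of a footprint-based schedule. The bounds and slope below are hypothetical placeholders, not the actual CAFE formula; they show only the mechanism: a larger footprint earns a laxer mpg target.

```python
# Hypothetical attribute-based fuel-economy schedule (NOT the real CAFE curve):
# the required mpg declines linearly with vehicle footprint between two bounds.
def target_mpg(footprint_sqft: float) -> float:
    """Required fuel economy (mpg) as a downward-sloping function of footprint."""
    upper_mpg, lower_mpg = 48.0, 30.0   # assumed mpg bounds (hypothetical)
    small_fp, large_fp = 41.0, 56.0     # assumed footprint bounds in sq ft (hypothetical)
    if footprint_sqft <= small_fp:
        return upper_mpg
    if footprint_sqft >= large_fp:
        return lower_mpg
    # linear interpolation between the bounds
    frac = (footprint_sqft - small_fp) / (large_fp - small_fp)
    return upper_mpg - frac * (upper_mpg - lower_mpg)

# A bigger footprint buys a laxer target -- the distortionary incentive
# for automakers to enlarge vehicles described in the paper.
assert target_mpg(45.0) > target_mpg(50.0)
```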

Summary from Heritage Foundation study Fuel Economy Standards Are a Costly Mistake Excerpt with my bolds.

The CAFE standards are not only an extremely inefficient way to reduce carbon dioxide emission but will also have a variety of unintended consequences.

For example, the post-2010 standards apply lower mileage requirements to vehicles with larger footprints. Thus, Whitefoot and Skerlos argued that there is an incentive to increase the size of vehicles.

Data from the first few years under the new standard confirm that the average footprint, weight, and horsepower of cars and trucks have indeed all increased since 2008, even as carbon emissions fell, reflecting the distorted incentives.

Manufacturers have found work-arounds to thwart the intent of the regulations. For example, the standards raised the price of large cars, such as station wagons, relative to light trucks. As a result, automakers created a new type of light truck—the sport utility vehicle (SUV)—which was covered by the lower standard and had low gas mileage but met consumers’ needs. Other automakers have simply chosen to miss the thresholds and pay fines on a sliding scale.

Another well-known flaw in CAFE standards is the “rebound effect.” When consumers are forced to buy more fuel-efficient vehicles, the cost per mile falls (since their cars use less gas) and they drive more. This offsets part of the fuel economy gain and adds congestion and road repair costs. Similarly, the rising price of new vehicles causes consumers to delay upgrades, leaving older vehicles on the road longer.
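The rebound effect is simple arithmetic. A back-of-envelope sketch with hypothetical numbers (a 25% mpg improvement and an assumed 10% rebound in miles driven):

```python
# Back-of-envelope rebound-effect arithmetic; all figures are hypothetical.
old_mpg, new_mpg = 24.0, 30.0      # a 25% efficiency improvement
miles_before = 12_000              # assumed annual miles driven
rebound = 0.10                     # assumed 10% more driving as cost per mile falls

# Fuel saving ignoring the rebound vs. the saving after extra driving.
naive_saving = miles_before / old_mpg - miles_before / new_mpg
miles_after = miles_before * (1 + rebound)
actual_saving = miles_before / old_mpg - miles_after / new_mpg

print(round(naive_saving), round(actual_saving))  # rebound erodes part of the gain
```

Under these assumptions the driver saves noticeably fewer gallons than the naive projection, which is the offset the paragraph describes.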

In addition, the higher purchase price of cars under a stricter CAFE standard is likely to force millions of households out of the new-car market altogether. Many households face credit constraints when borrowing money to purchase a car. David Wagner, Paulina Nusinovich, and Esteban Plaza-Jennings used Bureau of Labor Statistics data and typical finance industry debt-service-to-income ratios and estimated that 3.1 million to 14.9 million households would not have enough credit to purchase a new car under the 2025 CAFE standards.[34] This impact would fall disproportionately on poorer households and force the use of older cars with higher maintenance costs and with fuel economy that is generally lower than that of new cars.

CAFE standards may also have redistributed corporate profits to foreign automakers and away from Ford, General Motors (GM), and Chrysler (the Big Three), because foreign-headquartered firms tend to specialize in vehicles that are favored under the new standards.[35] 

Conclusion

CAFE standards are costly, inefficient, and ineffective regulations. They severely limit consumers’ ability to make their own choices concerning safety, comfort, affordability, and efficiency. Originally based on the belief that consumers undervalued fuel economy, the standards have morphed into climate control mandates. Under any justification, regulation gives the desires of government regulators precedence over those of the Americans who actually pay for the cars. Since the regulators undervalue the well-being of American consumers, the policy outcomes are predictably harmful.


Deaths from Heat Waves to Increase 2000%: “It Will Be Awful”

This story is from Sputnik and is presented with a straight face: ‘It Will Be Awful’: Global Warming Could Soon Increase Heat Wave Deaths 2,000%. Excerpts below in italics with my bolds and some references for context.

Deaths Classified as “Heat-Related” in the United States, 1979–2014. Source: EPA

The number of deaths from heat waves could increase by up to 2,000 percent in some areas of the world by 2080, according to a new study released Tuesday by researchers at Monash University in Melbourne, Australia.

“Future heat waves in particular will be more frequent, more intense and will last much longer,” Yuming Guo, the study’s lead researcher, said in a Tuesday statement to Reuters. “If we cannot find a way to mitigate the climate change (reduce the heat wave days) and help people adapt to heat waves, there will be a big increase of heat wave-related deaths in the future.”

The researchers developed a model to predict the number of deaths caused by heatwaves in 412 communities in 20 countries on four continents between 2031 and 2080.

The study predicted mortality caused by heat waves under different scenarios that take into account levels of greenhouse gas emissions, population density and adaptation strategies.

“We estimated heat wave-mortality associations through a two-stage time series design,” the report, which was published in PLOS Medicine, stated.

“Current and future daily mean temperature series were projected under four scenarios of greenhouse gas emissions from 1971–2099… We projected excess mortality in relation to heat waves in the future under each scenario of greenhouse gas emissions, with two assumptions for adaptation (no adaptation and hypothetical adaptation) and three scenarios of population change (high variant, median variant and low variant).”

The findings stated that the increase in mortality caused by heat waves is expected to be highest near the equator. Countries in that area are projected to see a 2,000 percent increase in heat wave-related fatalities during 2031 to 2080, compared to the 1971 to 2020 span.

India sees sharp fall in heat wave deaths (CNN, June 25, 2018)

“If people cannot adapt to future climate change, heat wave-related excess mortality is expected to increase the most in tropical and subtropical countries/regions, while European countries and the United States will have smaller increases. The more serious the greenhouse gas emissions, the higher the heat wave-related excess mortality in the future,” concluded the study.

Even if people do adapt to future climate change, heat wave-related deaths would still increase in the future under the high-variant population and serious greenhouse gas emissions situations. The projected increase in mortality is much smaller, however, than in the no-adaptation cases.

Heat-related mortality trends under recent climate warming in Spain: A 36-year observational study

• We analysed daily mortality records from 47 major cities in Spain.
• There has been a general and sustained decline in the vulnerability of the population since 1980.
• Despite the observed warming, the decline of the vulnerability has generally contributed to a progressive reduction in the number of deaths attributed to heat since 1980.

Fred Magdoff, professor emeritus of plant and soil science at the University of Vermont and co-author of “What Every Environmentalist Needs to Know About Capitalism” and “Creating an Ecological Society: Toward a Revolutionary Transformation,” told Sputnik Wednesday that the increase in heat waves will not only affect poorer countries that are close to the equator, but also countries like Japan.
(Note: Magdoff is not related to Bernie Madoff, who made off with 65 billion US$ by bilking investors.)

“Although the poor countries will have more problems with this, it also affects the north — Japan hit an all time high of 106 degrees F, and there are heat waves in Europe and the US. Clearly those in the wealthier countries are able to deal with this better, either through home air conditioning or access to ‘cooling stations,'” Magdoff told Sputnik.

“The carbon dioxide level in the atmosphere is currently about 410 ppm (it was around 320 in the 1950s), and in a relatively few decades it will reach 450, assuming current trends persist. After that, global warming may actually increase faster. Thus, I am not too surprised about the prediction for 2080 — not a pretty picture indeed. It will be awful,” Magdoff added.
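Magdoff's "relatively few decades" claim can be checked with back-of-envelope arithmetic. The growth rate below is an assumption (recent Mauna Loa observations run roughly 2 to 2.5 ppm per year):

```python
# Back-of-envelope check: how long until atmospheric CO2 reaches 450 ppm,
# starting from the ~410 ppm quoted above? The annual growth rate is an
# assumed figure, not taken from the article.
current_ppm = 410
target_ppm = 450
growth_per_year = 2.3  # assumed ppm/year, in line with recent observations

years_to_target = (target_ppm - current_ppm) / growth_per_year
print(f"About {years_to_target:.0f} years to reach {target_ppm} ppm")
```

At that assumed rate the 450 ppm mark arrives in under two decades, so "a relatively few decades" is, if anything, an understatement under a linear extrapolation.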

According to the study, adaptation strategies are needed alongside reducing greenhouse gas emissions, including opening cooling centers and painting rooftops white to reflect sunlight.

Footnote:  A previous post celebrated the fact that heat wave hysteria was muted this year.  Obviously, that didn’t last very long.

“World to burn up; Women, minorities and the poor to be hardest hit.”

July Arctic Ice Surprise

[Graph: Arctic ice extents, July, 2007 to 2018]

Early in July, a divergence of surplus 2018 ice temporarily resembled a hockey stick. Though the blade later drooped downward, ice extent remained above average throughout July. The graph above shows 2018 running 300k km2 above the 11-year average for July (2007 to 2017 inclusive). Only 2015 and 2008 had a higher July monthly average extent. Note that SII (NOAA’s Sea Ice Index) was 436k km2 lower for 2018, and the SII 11-year average is 264k km2 lower.

[Graph: Arctic ice extents at day 212]

The surprise: this is the first month of 2018 above the average. Indeed, March 2018 (the annual maximum) was almost 500k km2 below the 11-year March average. But reduced rates of melting in May, June and July have resulted in more ice extent than in most other recent Julys. At the end of July, 2018, 2017 and the 11-year average are close together, with SII 600k km2 lower and 2007 showing 824k km2 less ice.

The table below shows ice extents by region, comparing 2018 with the 11-year average (2007 to 2017 inclusive) and with 2007.

| Region | 2018 (day 212) | Day 212 average | 2018-Ave. | 2007 (day 212) | 2018-2007 |
|---|---:|---:|---:|---:|---:|
| (0) Northern_Hemisphere | 7169781 | 7084113 | 85668 | 6344860 | 824921 |
| (1) Beaufort_Sea | 898821 | 774345 | 124476 | 760576 | 138246 |
| (2) Chukchi_Sea | 540543 | 544864 | -4320 | 382350 | 158193 |
| (3) East_Siberian_Sea | 952130 | 770306 | 181824 | 445385 | 506745 |
| (4) Laptev_Sea | 338486 | 448988 | -110502 | 314382 | 24103 |
| (5) Kara_Sea | 112802 | 188689 | -75888 | 239232 | -126430 |
| (6) Barents_Sea | 525 | 34556 | -34031 | 23703 | -23177 |
| (7) Greenland_Sea | 213399 | 309333 | -95934 | 324737 | -111338 |
| (8) Baffin_Bay_Gulf_of_St._Lawrence | 252017 | 146604 | 105413 | 94179 | 157838 |
| (9) Canadian_Archipelago | 549236 | 565183 | -15947 | 510063 | 39173 |
| (10) Hudson_Bay | 253116 | 147477 | 105639 | 93655 | 159462 |
| (11) Central_Arctic | 3057671 | 3151943 | -94272 | 3154837 | -97166 |

2018 is 86k km2 (1.2%) above average. Laptev, Kara, the Greenland Sea and the Central Arctic are down. Offsetting surpluses appear in the Beaufort and East Siberian seas, as well as in Hudson Bay and Baffin Bay. Since the two bays will melt out soon, the eventual annual minimum remains to be seen.
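The headline surplus can be cross-checked against the regional table above: the regional differences should sum to roughly the hemispheric difference (they agree to within rounding in the source data), and the 1.2% figure follows directly from the 2018 extent and the 11-year average:

```python
# Arithmetic check on the regional sea-ice table above (values in km^2).
# Regional 2018-minus-average differences, copied from the table.
regional_diffs = {
    "Beaufort_Sea": 124476,
    "Chukchi_Sea": -4320,
    "East_Siberian_Sea": 181824,
    "Laptev_Sea": -110502,
    "Kara_Sea": -75888,
    "Barents_Sea": -34031,
    "Greenland_Sea": -95934,
    "Baffin_Bay_Gulf_of_St._Lawrence": 105413,
    "Canadian_Archipelago": -15947,
    "Hudson_Bay": 105639,
    "Central_Arctic": -94272,
}
total_diff = sum(regional_diffs.values())

nh_2018, nh_avg = 7169781, 7084113  # Northern Hemisphere extents
nh_diff = nh_2018 - nh_avg

print(f"Sum of regional differences: {total_diff} km^2")
print(f"Hemispheric difference:      {nh_diff} km^2")
print(f"Surplus over 11-yr average:  {nh_diff / nh_avg:.1%}")
```

The small mismatch between the regional sum and the hemispheric figure reflects rounding in the published table; the percentage confirms the quoted 1.2%.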

Postscript:

Interesting commentary from Dr. Judah Cohen in his July 30, 2018 Arctic Oscillation and Polar Vortex Analysis and Forecasts. Excerpts in italics with my bolds.

In the rest of the Impacts section I want to discuss what I wrote in the June 4th blog, as it seems to be very relevant for this summer and especially the European summer version of western North America’s winter ridiculously resilient ridge. In the blog I described how a diminishing cryosphere (snow and ice) might be contributing to more persistent and amplified waves in the atmosphere, a physical mechanism different from what I typically describe in winter. This mechanism may just be the best explanation of what occurred this summer over Europe, though the precondition of the soil, or its desiccation, is probably another important contributor to the heat and dry conditions over Northern Europe this summer:

“Over the past several blog posts I have been discussing blocking and ridging over northern Europe with a split Jet Stream across Europe with the polar branch way to the north across northern Scandinavia and a second subtropical branch across the Mediterranean. In between the two Jet Streams has been a sort of no-man’s land with weak zonal winds in the mid-troposphere across much of Central and Northern Europe. This has resulted in a warm spring so far and for the months of April and May, Europe has seen possibly the largest positive temperature departures from normal for the entire Northern Hemisphere (NH).

A warm spring is different from a hot summer, and had this atmospheric circulation occurred in July and August instead of April and May it would have created greater news headlines, straining resources and resulting in a likely spike in mortality. Europe has been experiencing more frequent hot summers over the past couple of decades, with possibly the most infamous being 2003, but even as recently as last summer, when extreme heat accompanied by forest fires was common across Southern Europe. The atmospheric circulation across Europe this spring with blocking and a split Jet Stream is consistent with an idea that Arctic change is resulting in more frequent occurrences of extreme summer weather including flooding, drought and heat waves. This is admittedly not my expertise, but I thought it could be interesting to give a brief discussion given the weather pattern across Europe this past spring, which could be laying the groundwork for an overall hot upcoming summer.

In winter the loss of sea ice has contributed to accelerated warming across the Arctic Ocean, referred to as Arctic amplification. This is hypothesized by some, including me, to influence mid-latitude weather either by weakening the zonal Jet Stream or by favoring large-scale anomalous atmospheric waves that project onto the climatological waves forced by the geography of the NH. Amplification of the climatological waves leads to a breakdown of the polar vortex followed by increases in severe winter weather across the NH. This is a topic that I write about often in my blog posts in the winter months.

Other scientists have postulated something somewhat analogous but also different for the warm season. During the warm boreal months it is much harder for the Arctic Ocean to warm rapidly relative to normal because, even with increased ice melt, the ocean remains colder than the overlying atmosphere, so energy is transferred from the atmosphere to the ocean rather than vice versa as in winter. Therefore we have not observed extreme warm events in the Arctic Ocean basin in summer as we do in winter. Instead, the rapid disappearance of snow cover in the spring and early summer has allowed the land masses that ring the Arctic Ocean to heat up much more quickly today than they did two or three decades ago. So the accelerated Arctic warming in summer is observed not over the ocean but across the adjacent land masses of Eurasia and North America.

The accelerated warming to the north can still cause a slackening of the zonal Jet Stream as the south-to-north temperature gradient weakens. Instead of one Jet Stream across the mid-latitudes, the Jet Stream splits into two pieces, one to the north and a second to the south. The northerly Jet forms along and just north of the land regions ringing the Arctic that are experiencing the most accelerated warming, approximately along the 70°N latitude, as the strong warming along the north slope of the continents, with a still relatively cold Arctic Ocean, maintains a strong temperature gradient and a Jet Stream. The southerly Jet Stream forms where the normal south-to-north temperature gradient resumes across the southern mid-latitudes, in a band between 30-45°N latitude. In between the two Jet Streams the winds are very weak. It turns out this atmospheric configuration, with a Jet Stream to the north, a Jet Stream to the south and very weak winds in between, is ideal for trapping waves that are persistent in one location and can even amplify. This is referred to as quasi-resonant amplification (QRA). When QRA occurs, atmospheric waves become trapped and persist for much longer periods than normal. This in turn leads to an increased probability of extreme weather, whether it be floods, drought or heat waves. The split Jet Stream, the persistent atmospheric waves and extreme weather have all been observed to be increasing over the past two decades. Some early papers on the subject are Petoukhov et al. 2013 and Coumou et al. 2014. The weather models are predicting this latest example of QRA over Europe to dissipate over the coming two weeks, but a recurrence of QRA over Europe or a different region this summer is not only of meteorological interest but of societal importance.”

Footnote on MASIE Data Sources:

MASIE reports are based on data primarily from NIC’s Interactive Multisensor Snow and Ice Mapping System (IMS). From the documentation, the multiple sources feeding IMS are:

Platform(s) AQUA, DMSP, DMSP 5D-3/F17, GOES-10, GOES-11, GOES-13, GOES-9, METEOSAT, MSG, MTSAT-1R, MTSAT-2, NOAA-14, NOAA-15, NOAA-16, NOAA-17, NOAA-18, NOAA-N, RADARSAT-2, SUOMI-NPP, TERRA

Sensor(s): AMSU-A, ATMS, AVHRR, GOES I-M IMAGER, MODIS, MTSAT 1R Imager, MTSAT 2 Imager, MVIRI, SAR, SEVIRI, SSM/I, SSMIS, VIIRS

Summary: IMS Daily Northern Hemisphere Snow and Ice Analysis

The National Oceanic and Atmospheric Administration / National Environmental Satellite, Data, and Information Service (NOAA/NESDIS) has an extensive history of monitoring snow and ice coverage. Accurate monitoring of global snow/ice cover is a key component in the study of climate and global change as well as daily weather forecasting.

The Polar and Geostationary Operational Environmental Satellite programs (POES/GOES) operated by NESDIS provide invaluable visible and infrared spectral data in support of these efforts. Clear-sky imagery from both the POES and the GOES sensors show snow/ice boundaries very well; however, the visible and infrared techniques may suffer from persistent cloud cover near the snowline, making observations difficult (Ramsay, 1995). The microwave products (DMSP and AMSR-E) are unobstructed by clouds and thus can be used as another observational platform in most regions. Synthetic Aperture Radar (SAR) imagery also provides all-weather, near daily capacities to discriminate sea and lake ice. With several other derived snow/ice products of varying accuracy, such as those from NCEP and the NWS NOHRSC, it is highly desirable for analysts to be able to interactively compare and contrast the products so that a more accurate composite map can be produced.

The Satellite Analysis Branch (SAB) of NESDIS first began generating Northern Hemisphere Weekly Snow and Ice Cover analysis charts derived from the visible satellite imagery in November, 1966. The spatial and temporal resolutions of the analysis (190 km and 7 days, respectively) remained unchanged for the product’s 33-year lifespan.

As a result of increasing customer needs and expectations, it was decided that an efficient, interactive workstation application should be constructed which would enable SAB to produce snow/ice analyses at a higher resolution and on a daily basis (~25 km / 1024 x 1024 grid and once per day) using a consolidated array of new as well as existing satellite and surface imagery products. The Daily Northern Hemisphere Snow and Ice Cover chart has been produced since February, 1997 by SAB meteorologists on the IMS.

Another large resolution improvement began in early 2004, when improved technology allowed the SAB to begin creation of a daily ~4 km (6144×6144) grid. At this time, both the ~4 km and ~24 km products are available from NSIDC with a slight delay. Near real-time gridded data is available in ASCII format by request.

In March 2008, the product was migrated from SAB to the National Ice Center (NIC) of NESDIS. The production system and methodology were preserved during the migration. Improved access to DMSP, SAR, and modeled data sources is expected as a short-term result of the migration, with longer-term plans of twice-daily production, GRIB2 output format, a Southern Hemisphere analysis, and an expanded suite of integrated snow and ice variables on the horizon. Source: Interactive Multisensor Snow and Ice Mapping System (IMS)


Kavanaugh’s EPA Opinions Already Endorsed by Supremes

Adam J. White wrote in Real Clear Policy, July 31, 2018: Brett Kavanaugh’s Past Opinions Endorsed by Supreme Court. Excerpts below in italics with my bolds.

If all goes according to plan, Brett Kavanaugh will soon join the Supreme Court. But his ideas arrived at the Court well before him.

For 12 years, Judge Kavanaugh has served on the U.S. Court of Appeals for the D.C. Circuit, often considered the “second-highest court in the land” because of its heavy portion of constitutional and regulatory cases. On those issues, Kavanaugh has become the intellectual leader of his generation of judges on the lower courts. And the best evidence of this is those cases in which Judge Kavanaugh’s analysis was adopted by the Supreme Court even after Kavanaugh’s colleagues on the D.C. Circuit rejected it.

Through eloquent judicial opinions and nuanced law review articles, Kavanaugh has challenged, in particular, today’s increasingly unaccountable administrative state. His uncanny ability to identify fundamental threats to our Constitution’s republican institutions, and to anticipate the Supreme Court’s own eventual response, is exemplified by three cases.

The first involved so-called “independent” agencies. Since the New Deal, the Supreme Court has recognized Congress’s discretion to create agencies with a measure of insulation against day-to-day presidential control. But when Congress attempted to layer one independent agency within another — i.e., the Sarbanes-Oxley Act’s creation of the Public Company Accounting Oversight Board, inside the SEC — Kavanaugh recognized that a line must be drawn.

“By restricting the President’s authority over the Board,” he wrote in a 2008 case, “the Act renders this Executive Branch agency unaccountable and divorced from Presidential control to a degree not previously countenanced in our constitutional structure.” Recognizing that “upholding the PCAOB here would green-light Congress to create a host of similar entities,” Kavanaugh dissented from his colleagues’ decision affirming the agency. The Supreme Court then reversed the D.C. Circuit, largely adopting Kavanaugh’s approach.

The second case involved an agency’s assertion of immense power in lieu of — or even contrary to — the laws enacted by Congress. When the Environmental Protection Agency imposed its initial suite of regulations for greenhouse gas emissions, the agency attempted to “tailor” the Clean Air Act to suit its climate policy. The EPA recognized that applying various parts of the Act to GHG emissions would lead to “absurd” results that Congress specifically sought to avoid when it created the Act in the first place. So the agency attempted simply to nullify those parts of the Act in order to maintain its climate policy.

As Kavanaugh explained in a dissenting opinion, the EPA was putting the regulatory cart before the legislative horse. If the EPA’s climate policy didn’t fit the Clean Air Act, then the EPA needed to change its policy, not the Act. Once again, Kavanaugh’s colleagues disagreed — and once again, the Supreme Court reversed the D.C. Circuit, largely adopting his approach in a 2014 case.

The third case involves judicial deference to an agency’s implausible and self-serving statutory interpretation. The Clean Air Act allows the EPA to impose certain air quality regulations when the agency concludes that such regulations are “appropriate.” When the EPA created new mercury restrictions for utilities, it refused to consider the enormous cost of those rules, claiming that such costs have no bearing on whether the rules are “appropriate.” Citing many scholars and judges, Kavanaugh concluded that it “is entirely unreasonable for EPA to exclude consideration of costs in determining whether” the regulation is “appropriate.” His colleagues rejected his approach and deferred instead to the agency. But in 2015 the Supreme Court reversed the D.C. Circuit and followed Kavanaugh.

In each of these cases, Kavanaugh sensed that the administrative state was pushing matters to a breaking point. Each time, his circuit colleagues rejected his approach, but the Supreme Court embraced it.

If Kavanaugh’s nomination succeeds and he winds up joining the Court, where his ideas already have had such influence, there are at least three places where he will likely have significant impact in reforming and modernizing the judicial doctrines surrounding the administrative state.

First, Kavanaugh has expressed reservations about the degree of “deference” that courts now give agencies’ legal interpretations. (The best example of this is his 2016 article in the Harvard Law Review.) This is an increasingly common theme among conservative judges — indeed, the justice whom Kavanaugh would replace (and for whom he once clerked), Justice Kennedy, raised the same concerns in one of his own last judicial opinions.

Second, and relatedly, Kavanaugh has called on courts to be more skeptical of agencies’ assertions of power over the most significant economic and political issues of our times. In an opinion dissenting from the D.C. Circuit’s deference to the Obama FCC’s “net neutrality” rules, Kavanaugh argued that courts should presume that Congress did not commit such vast regulatory powers to bureaucratic agencies, absent a clear statement to the contrary. In this respect, Kavanaugh echoes Chief Justice Roberts’s own un-deferential opinion in one of the Affordable Care Act cases, where Roberts — joined by the Court’s four liberal justices — agreed with the Obama administration but expressly refused to approach the case with any interpretive “deference” to the agencies’ claims of authority.

Finally, Kavanaugh raises serious questions about novel forms of agency “independence.” In a case involving the Consumer Financial Protection Bureau, Kavanaugh wrote a majority opinion holding the CFPB’s structure unconstitutional. The reason? The Dodd-Frank Act gave the CFPB an unprecedented measure of independence without the usual multimember agency structure that disperses an independent agency’s power among a more deliberative body (as in the Federal Trade Commission). Kavanaugh’s majority opinion — which echoed themes raised by Chief Justice Roberts in an earlier Supreme Court case — was eventually vacated by the full D.C. Circuit, where Democratic appointees enjoy a strong majority. But even then, Kavanaugh’s intellectual influence among other judges was made evident when his approach to the CFPB case was adopted by the federal district court in Manhattan in a different challenge to the CFPB. Even more recently, the U.S. Court of Appeals for the Fifth Circuit applied a similar analysis to the Federal Housing Finance Agency, holding the FHFA’s structure unconstitutional with a judicial opinion replete with citations to Kavanaugh.

At the Scalia Law School, I direct the Center for the Study of the Administrative State. It is no exaggeration to say that for the past decade, to study the administrative state has been, in no small part, to study Judge Kavanaugh’s D.C. Circuit opinions. With an appointment to the Supreme Court, his official title will finally match his real-world influence.

Adam J. White is a research fellow at the Hoover Institution and director of the C. Boyden Gray Center for the Study of the Administrative State. Previously, as a lawyer, he participated in some of the mentioned cases.

Postscript: A Majority Kavanaugh EPA Opinion

Last year the D.C. Court of Appeals struck down EPA rules regarding HFCs, and Judge Kavanaugh wrote the majority opinion:

“EPA’s novel reading of Section 612 is inconsistent with the statute as written. Section 612 does not require (or give EPA authority to require) manufacturers to replace non ozone-depleting substances such as HFCs,” said the opinion, written by Judge Brett Kavanaugh.

“In any event, the legislative history strongly supports our conclusion that Section 612(c) does not grant EPA continuing authority to require replacement of non-ozone-depleting substitutes… In short, although Congress contemplated giving EPA broad authority under Title VI to regulate the replacement of substances that contribute to climate change, Congress ultimately declined.”

“However, EPA’s authority to regulate ozone-depleting substances under Section 612 and other statutes does not give EPA authority to order the replacement of substances that are not ozone depleting but that contribute to climate change. Congress has not yet enacted general climate change legislation. Although we understand and respect EPA’s overarching effort to fill that legislative void and regulate HFCs, EPA may act only as authorized by Congress. Here, EPA has tried to jam a square peg (regulating non-ozone depleting substances that may contribute to climate change) into a round hole (the existing statutory landscape).”

More at Gamechanger: DC Appeals Court Denies EPA Climate Rules

Footnote:

More and more likely we are witnessing a return to Constitutional separation of powers.