Serenity Faces Ice Aug. 30


August 31 and Sept. 1 Updates Below

h/t to angech for asking an interesting question:  What are the chances of an early refreezing in the Northwest Passage?  It provoked me to look more into the recent history of ice extents in the CAA (Canadian Arctic Archipelago).

The image above shows the years in the last decade closest to 2017 in terms of ice present in the CAA.  Only 2014 had more ice in that region than this year: 513k km2 compared to 464k km2 at day 240.  It appears that the CAA annual minimum typically occurs at day 260, the same as the overall NH minimum.  The image above also shows that 2009, with 388k km2, provides the closest analog to this year for the amount of ice in McClintock Bay just in front of Serenity.  The image below gives the progression in 2009 from day 244 to 270.
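One simple way to make this kind of comparison is to rank years by how close their CAA extent is to 2017's on the same calendar day. Below is a minimal sketch using only the day-240 figures quoted above; a fuller version would pull every year from the MASIE regional spreadsheet, so treat the values as illustrative.

```python
# Rank recent years by how closely their CAA extent at day 240 matches 2017.
# Values in km2 are the ones quoted above; other years would be filled in
# from the MASIE regional data.
caa_day240 = {
    2009: 388_000,
    2014: 513_000,
    2017: 464_000,
}

target = caa_day240[2017]
analogs = sorted(
    ((year, extent) for year, extent in caa_day240.items() if year != 2017),
    key=lambda pair: abs(pair[1] - target),
)

for year, extent in analogs:
    print(f"{year}: {extent:,} km2 (difference from 2017: {extent - target:+,} km2)")
```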

As of 10:30 EST this morning, Serenity is located as shown in the image below, along with the ice extent reported by MASIE for yesterday.  Ship tracking is provided by marinevesseltraffic.com and shows that Serenity is now in convoy with two icebreakers ahead of her.

The Canadian ice chart from yesterday shows that Serenity and her escort have worked their way through open water and thinner green ice, but with some thicker stuff ahead.

With such assistance Serenity will make it through; last year an escort was largely unnecessary.  And as the images above show, in 2014 the passage would have been close to impossible.

UPDATE 16:30 EST

The convoy has reached the southern tip of Prince of Wales Island and is turning northeast. It appears the ships will sail along the thin shore ice, presumably headed for the western entrance of Bellot Strait, which passes through to Prince Regent channel.

UPDATE 7:00AM EST AUG. 31

The Serenity convoy is about to enter Bellot Strait, waiting for a cargo ship to exit the channel.

UPDATE 9:00AM EST Aug. 31

Cruisemapper shows Serenity has entered the Strait following her icebreaker.

Update 12:00 PM EST Aug. 31

Serenity and her icebreaker escort have emerged from Bellot Strait into Prince Regent channel and are turning north.  The next destination may well be Devon Island, where some interesting things were seen last year.  See Mars in the Arctic

Update 7:00 AM EST Sept. 1, 2017

Serenity and icebreaker are anchored just off Beechey Island at the southwestern tip of Devon Island.


Persistent August Arctic Ice


The image above shows ice extents for yesterday, day 239, from 2007 to 2017.  Particularly interesting is the variation in the CAA (Canadian Arctic Archipelago), crucial for the Northwest Passage.  (The region is located just north of the word “Sea” in gold.)  Note that 2016 was a fine year for cruising, with the passage completely open at this date.  That was not the case in 2014, and this year also has places frozen solid.

Crystal Serenity is going through the Northwest Passage again this year, having left Seward, Alaska on Aug. 15, 2017.  She arrives at Cambridge Bay today, having traveled in the wake of the icebreaker RRS Ernest Shackleton.  The next part of the voyage could be challenging.

The graph of August NH ice extents shows 2017 has moved above the decadal average in recent days. (Ten-year average is for 2007 to 2016 inclusive)

This year is now 850k km2 greater than 2016 and exceeds the 10-year average by 200k km2.  SII (Sea Ice Index) shows 2017 about 450k km2 lower than MASIE.  A previous post, Beware the Arctic Storms of August, discussed how late summer storms have dramatic impacts, and the graph shows both 2012 and 2016 plummeting in the last ten days.  By the end of the month, four days from now, those two years drop below 4.4M km2.
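For readers wanting to reproduce the baseline in the graph, the decadal average is simply the mean of the 2007 to 2016 extents for each day of year. Here is a minimal sketch; the extent values are placeholders standing in for the MASIE numbers, not actual data.

```python
# Ten-year (2007-2016) baseline for one day of year, plus 2017's departures from it.
# Extents in km2; the values below are placeholders, not MASIE data.
day239_extent = {
    2007: 5.39e6, 2008: 5.65e6, 2009: 5.90e6, 2010: 5.45e6, 2011: 5.20e6,
    2012: 4.95e6, 2013: 6.05e6, 2014: 6.10e6, 2015: 5.40e6, 2016: 4.80e6,
}
extent_2017 = 5.63e6  # placeholder

baseline = sum(day239_extent.values()) / len(day239_extent)
print(f"2007-2016 average:  {baseline:,.0f} km2")
print(f"2017 minus average: {extent_2017 - baseline:+,.0f} km2")
print(f"2017 minus 2016:    {extent_2017 - day239_extent[2016]:+,.0f} km2")
```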

By the way, Barents is still above average and holding steady near 50k km2.

The black line is average for the last 11 years.  2007 in purple appears close to an average year.  2014 had the highest annual extent in Barents Sea, due to higher and later maximums, holding onto ice during the summer, and recovering quickly.  In contrast, 2016 was the lowest annual extent, melting out early and recovering later.  2017 in blue started out way behind, but grew rapidly to reach average, and then persisted longer to exceed even 2014.  It will be important to see when the recovery of ice begins.

For more on why Barents Sea matters see Barents Icicles

Footnote

Some people unhappy with the higher amounts of ice extent shown by MASIE continue to claim that Sea Ice Index is the only dataset that can be used.  This is false in fact and in logic.  Why should anyone accept that the highest quality picture of ice day to day has no shelf life, or that one year’s charts cannot be compared with another year’s?  Researchers do make such comparisons, including Walt Meier, who is in charge of Sea Ice Index.  That said, I understand his interest in directing people to use his product rather than one he does not control.  As I have said before:

MASIE is rigorous, reliable, serves as calibration for satellite products, and continues the long and honorable tradition of naval ice charting using modern technologies.  More on this at my post Support MASIE Arctic Ice Dataset

Crystal Serenity and escort near Cambridge Bay in 2016.

Partying in Paris instead of Preparing in Houston

Waves pound the shore from approaching Hurricane Harvey on August 25, 2017 in Corpus Christi, Texas. (Photo by Joe Raedle/Getty Images)

“We are not ready for a return to 1950’s weather, let alone something unprecedented.”

As evidence of the above, Hurricane Harvey is a case in point.  While elected officials blather about “zero-carbon” footprints, nothing is done to prepare for weather events that have happened before and can always happen again.  International accords and conferences like the last one in Paris do nothing for a vulnerable place like Houston.

Consider the article “A Texas newsroom predicted a disaster. Now it’s close to coming true,” published at Columbia Journalism Review.  Excerpts below.

THE TEXAS TRIBUNE AND PROPUBLICA last year published a multi-part investigation looking at what would happen if Houston was hit by a major hurricane.

The reporters partnered with scientists at several universities in Texas to conduct simulations, gaming out various storm scenarios for the country’s fourth-largest city, with its rapidly growing population, huge stores of oil and natural gas, and a major NASA facility.

The conclusion: The city and region were woefully unprepared for a major hurricane, with inadequate infrastructure for evacuation and flood control. A major storm would inflict catastrophic damage, bringing “economic and ecological disaster.” The series won awards, including a Peabody and an Edward R. Murrow, but it didn’t lead to substantive policy changes or big new investments in infrastructure.

Houston is the fourth-largest city in the country. It’s home to the nation’s largest refining and petrochemical complex, where billions of gallons of oil and dangerous chemicals are stored. And it’s a sitting duck for the next big hurricane. Learn why Texas isn’t ready. March 3, 2016

A house is engulfed in flames as water and waves inundate homes on Galveston Island as Hurricane Ike approaches the coast Sept. 12, 2008.

Now the same journalists are watching nervously as Hurricane Harvey inches closer to the Texas shoreline. While landfall is expected between Corpus Christi and Houston, one of their worst-case scenarios could still come true.

“Unfortunately it might take a disaster,” Shaw adds, “before Texas wakes up and realizes we need to send some real money to protect one of the nation’s biggest ports, where we keep most of our oil and chemicals.” If Houston was directly hit by a storm of Harvey’s magnitude, Shaw says, the environmental damage would exceed the BP Deepwater Horizon oil spill.

After the series appeared, the reporters reached out to the state’s entire congressional delegation and both of its US senators, one of whom, Ted Cruz, ran for president. “So none of them can say nobody could anticipate the calamity a large storm could inflict upon their constituencies,” Klein wrote.

“Ike was supposed to be that wake-up call to do something about this,” Shaw says. “All I can hope for is that this will be another wake-up call, and Texas will ask for more action before the ‘big one.’”

A short video from 2011 explaining how to protect the area from the next big one.

See also Climate Adaptive Cities: The Smart Way Forward

X-Weather is Back! Harvey edition

With Hurricane Harvey making landfall in Texas as a Cat 4, the storm drought is over and claims of linkage to climate change can be expected.  So far (mercifully) articles in Time and the Washington Post have been more circumspect than in the past.  Has it become more respectable to examine the proof behind wild claims?  This post provides background on the X-Weathermen working hard to claim extreme weather as proof of climate change.

In the past the media has been awash with claims of “human footprints” in extreme weather events, with headlines like these:

“Global warming is making hot days hotter, rainfall and flooding heavier, hurricanes stronger and droughts more severe.”

“Global climate change is making weather worse over time”

“Climate change link to extreme weather easier to gauge”– U.S. Report

“Heat Waves, Droughts and Heavy Rain Have Clear Links to Climate Change, Says National Academies”

That last one refers to a report released by the National Academies Press: Attribution of Extreme Weather Events in the Context of Climate Change (2016).

And as usual, the headline claims are unsupported by the actual text. From the NAS report (here): (my bolds)

Attribution studies of individual events should not be used to draw general conclusions about the impact of climate change on extreme events as a whole. Events that have been selected for attribution studies to date are not a representative sample (e.g., events affecting areas with high population and extensive infrastructure will attract the greatest demand for information from stakeholders) P 107

Systematic criteria for selecting events to be analyzed would minimize selection bias and permit systematic evaluation of event attribution performance, which is important for enhancing confidence in attribution results. Studies of a representative sample of extreme events would allow stakeholders to use such studies as a tool for understanding how individual events fit into the broader picture of climate change. P 110

Correctly done, attribution of extreme weather events can provide an additional line of evidence that demonstrates the changing climate, and its impacts and consequences. An accurate scientific understanding of extreme weather event attribution can be an additional piece of evidence needed to inform decisions on climate change related actions. P. 112

The Indicative Without the Imperative


The antidote to such feverish reporting is provided by Mike Hulme in a publication: Attributing Weather Extremes to ‘Climate Change’: a Review (here).

He has an insider’s perspective on this issue, and is certainly among the committed on global warming (color him concerned). Yet here he writes objectively to inform us on X-weather, without advocacy: real science journalism and a public service, really. Excerpts below with my bolds.

Overview

In this third and final review I survey the nascent science of extreme weather event attribution. The article proceeds by examining the field in four stages: motivations for extreme weather attribution, methods of attribution, some example case studies and the politics of weather event attribution.

The X-Weather Issue

As many climate scientists can attest, following the latest meteorological extreme one of the most frequent questions asked by media journalists and other interested parties is: ‘Was this weather event caused by climate change?’

In recent decades the meaning of climate change in popular western discourse has changed from being a descriptive index of a change in climate (as in ‘evidence that a climatic change has occurred’) to becoming an independent causative agent (as in ‘climate change caused this event to happen’). Rather than being a descriptive outcome of a chain of causal events affecting how weather is generated, climate change has been granted power to change worlds: political and social worlds as much as physical and ecological ones.

To be more precise then, what people mean when they ask the ‘extreme weather blame’ question is: ‘Was this particular weather event caused by greenhouse gases emitted from human activities and/or by other human perturbations to the environment?’ In other words, can this meteorological event be attributed to human agency as opposed to some other form of agency?

The Motivations

Hulme shows what drives scientists to pursue the “extreme weather blame” question, noting four motivational factors.

Why have climate scientists over the last ten years embarked upon research to provide an answer beyond the stock phrase ‘no individual weather event can directly be attributed to greenhouse gas emissions’?  There seem to be four possible motives.

1. Curiosity
The first is because the question piques the scientific mind; it acts as a spur to develop new rational understanding of physical processes and new analytic methods for studying them.

2. Adaptation
A second argument, put forward by some, is that it is important to know whether or not specific instances of extreme weather are human-caused in order to improve the justification, planning and execution of climate adaptation.

3. Liability
A third argument for pursuing an answer to the ‘extreme weather blame’ question is inspired by the possibility of pursuing legal liability for damages caused. . . If specific loss and damage from extreme weather can be attributed to greenhouse gas emissions – even if expressed in terms of increased risk rather than deterministically – then lawyers might get interested.

The liability motivation for research into weather event attribution also bisects the new international political agenda of ‘loss and damage’ which has emerged in the last two years. . . The basic idea is to give recognition that loss and damage caused by climate change is legitimate ground for less developed countries to gain access to new international climate adaptation funds.

4. Persuasion
A final reason for scientists to be investing in this area of climate science – a reason stated explicitly less often than the ones above and yet one which underlies much of the public interest in the ‘extreme weather blame’ question – is frustration with and argument about the invisibility of climate change. . . If this is believed to be true – that only scientists can make climate change visible and real –then there is extra onus on scientists to answer the ‘extreme weather blame’ question as part of an effort to convince citizens of the reality of human-caused climate change.

Attribution Methods

Attributing extreme weather events to human influences requires different approaches, of which four broad categories can be identified.

1. Physical Reasoning
The first and most general approach to attributing extreme weather phenomena to rising greenhouse gas concentrations is to use simple physical reasoning.

General physical reasoning can only lead to broad qualitative statements such as ‘this extreme weather is consistent with’ what is known about the human-enhanced greenhouse effect. Such statements offer neither deterministic nor stochastic answers and clearly underdetermine the ‘weather blame question.’ It has given rise to a number of analogies to try to communicate the non-deterministic nature of extreme event attribution. The three most widely used ones concern a loaded die (the chance of rolling a ‘6’ has increased, but no single ‘6’ can be attributed to the biased die), the baseball player on steroids (the number of home runs hit increases, but no single home run can be attributed to the steroids) and the speeding car-driver (the chance of an accident increases in dangerous conditions, but no specific accident can be attributed to the fast-driving).
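To make the loaded-die analogy concrete, here is a tiny Monte Carlo sketch of my own (not from Hulme’s paper): bias the die so a six is more likely, and the extra sixes show up only in the aggregate count, never in any identifiable roll.

```python
import random

random.seed(42)
ROLLS = 100_000

def count_sixes(p_six, rolls=ROLLS):
    """Count sixes when each roll shows a six with probability p_six."""
    return sum(random.random() < p_six for _ in range(rolls))

fair = count_sixes(1 / 6)    # unbiased die
loaded = count_sixes(1 / 4)  # loaded die: a six is more likely

print(f"fair die:   {fair} sixes")
print(f"loaded die: {loaded} sixes")
# The excess is a statistical property of the whole sample; no single six
# can be attributed to the loading.
print(f"fraction of loaded-die sixes attributable to the bias: {1 - fair / loaded:.2f}")
```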

2. Classical Statistical Analysis
A second approach is to use classical statistical analysis of meteorological time series data to determine whether a particular weather (or climatic) extreme falls outside the range of what a ‘normal’ unperturbed climate might have delivered.

All such extreme event analyses of meteorological time series are at best able to detect outliers, but can never be decisive about possible cause(s). A different time series approach therefore combines observational data with model simulations and seeks to determine whether trends in extreme weather predicted by climate models have been observed in meteorological statistics (e.g. Zwiers et al., 2011, for temperature extremes and Min et al., 2011, for precipitation extremes). This approach is able to attribute statistically a trend in extreme weather to human influence, but not a specific weather event. Again, the ‘weather blame question’ remains underdetermined.
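A bare-bones version of this kind of test asks where an observed value falls within a long “normal” record, for example as a z-score or an empirical percentile. The series below is synthetic and stands in for a real meteorological time series; it illustrates the method, not any particular study.

```python
import numpy as np

rng = np.random.default_rng(0)
# Synthetic stand-in for a century of summer-mean temperatures (deg C).
baseline = rng.normal(loc=21.0, scale=1.2, size=100)
observed = 24.5  # the "extreme" value being tested

z_score = (observed - baseline.mean()) / baseline.std(ddof=1)
percentile = (baseline < observed).mean() * 100

print(f"z-score: {z_score:.2f}, empirical percentile: {percentile:.1f}%")
# A large z-score flags the value as an outlier relative to the record, but as
# Hulme notes, the statistic by itself says nothing about the cause.
```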


3. Fractional Attributable Risk (FAR)
Taking inspiration from the field of epidemiology, this method seeks to establish the Fractional Attributable Risk (FAR) of an extreme weather (or short-term climate) event. It asks the counterfactual question, ‘How might the risk of a weather event be different in the presence of a specific causal agent in the climate system?’

The single observational record available to us, and which is analysed in the statistical methods described above, is inadequate for this task. The solution is to use multiple model simulations of the climate system, first of all without the forcing agent(s) accused of ‘causing’ the weather event and then again with that external forcing introduced into the model.
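The quantity behind this method is usually written in terms of the two modelled probabilities: if P1 is the probability of exceeding the event threshold with the human forcing included and P0 the probability without it, then

```latex
\mathrm{FAR} = 1 - \frac{P_0}{P_1}
```

so a FAR of 0.5 would say half of the event’s risk is attributable to the forcing. The estimate is only as good as the two probabilities, which come entirely from the model runs described above.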

The credibility of this method of weather attribution can be no greater than the overall credibility of the climate model(s) used – and may be less, depending on the ability of the model in question to simulate accurately the precise weather event under consideration at a given scale (e.g. a heatwave in continental Europe, a rain event in northern Thailand) (see Christidis et al., 2013a).

4. Eco-systems Philosophy
A fourth, more philosophical, approach to weather event attribution should also be mentioned. This is the argument that since human influences on the climate system as a whole are now clearly established – through changing atmospheric composition, altered land surface characteristics, and so on – there can no longer be such a thing as a purely natural weather event. All weather — whether it be a raging tempest or a still summer afternoon — is now attributable to human influence, at least to some extent. Weather is the local and momentary expression of a complex system whose functioning as a system is now different to what it would otherwise have been had humans not been active.

Results from Weather Attribution Studies

Hulme provides a table of numerous such studies using various methods, along with his view of the findings.

It is likely that attribution of temperature-related extremes using FAR methods will always be more attainable than for other meteorological extremes such as rainfall and wind, which climate models generally find harder to simulate faithfully at the spatial scales involved. As discussed below, this limitation on which weather events and in which regions attribution studies can be conducted will place important constraints on any operational extreme weather attribution system.

Political Dimensions of Weather Attribution

Hulme concludes by discussing the political hunger for scientific proof in support of policy actions.

But Hulme et al. (2011) show why such ambitious claims are unlikely to be realised. Investment in climate adaptation, they claim, is most needed “… where vulnerability to meteorological hazard is high, not where meteorological hazards are most attributable to human influence” (p.765). Extreme weather attribution says nothing about how damages are attributable to meteorological hazard as opposed to exposure to risk; it says nothing about the complex political, social and economic structures which mediate physical hazards.

And separating weather into two categories — ‘human-caused’ weather and ‘tough-luck’ weather – raises practical and ethical concerns about any subsequent investment allocation guidelines which excluded the victims of ‘tough-luck weather’ from benefiting from adaptation funds.

Contrary to the claims of some weather attribution scientists, the loss and damage agenda of the UNFCCC, as it is currently emerging, makes no distinction between ‘human-caused’ and ‘tough-luck’ weather. “Loss and damage impacts fall along a continuum, ranging from ‘events’ associated with variability around current climatic norms (e.g., weather-related natural hazards) to [slow-onset] ‘processes’ associated with future anticipated changes in climatic norms” (Warner et al., 2012:21). Although definitions and protocols have not yet been formally ratified, it seems unlikely that there will be a role for the sort of forensic science being offered by extreme weather attribution science.

Conclusion

Thank you Mike Hulme for a sane, balanced and expert analysis. It strikes me as being another element in a “Quiet Storm of Lucidity”.

Is that light the end of the tunnel or an oncoming train?

Climate Adaptive Cities: The Smart Way Forward

A recent post, Renewables Hypocrisy, described the vain and empty spectacle of Mayors of major cities lining up to get badges claiming “100% Renewable Energy”. This post provides the alternative to such posturing. Matthew Kahn is a microeconomist studying and writing on the more rational and productive response to future climate change possibilities. His thinking is explained in a recent paper, Will Climate Change Cause Enormous Social Costs for Poor Asian Cities?, Asian Development Review, September 2017. There is much wisdom packed into this document, not only regarding Asia but everywhere.  Some excerpts below give the flavor.

Overview

Climate change could significantly reduce the quality of life for poor people in Asia. Extreme heat and drought, and the increased incidence of natural disasters will pose new challenges for the urban poor and rural farmers. If farming profits decline, urbanization rates will accelerate and the social costs of rapid urbanization could increase due to rising infectious disease rates, pollution, and congestion. This paper studies strategies for reducing the increased social costs imposed on cities by climate change.

Cities face a practical challenge and many have not embraced it

While Asia’s poor face major new risks because of climate change, there are countervailing forces in play as well. With the rise of big data, governments and individuals have greater access to real-time information about emerging threats. Increased international flows of capital have given local governments the capacity to fund public infrastructure projects.

An open question concerns the incentives for mayors and city governments in developing countries to take costly steps to improve the quality of life of the urban poor. Such investments will ultimately increase the migration of rural poor to these cities. Anticipating this effect, some mayors and city governments are discouraged from making such investments. Feler and Henderson (2011) present evidence of this dynamic in Brazil.

Asia’s collective ability to adapt to anticipated but ambiguous new climate risks hinges on the well-being of the urban poor. If this group can successfully adapt to new challenges, then Asia’s overall urbanization experience is more likely to yield long-term economic growth and improvements in living standards.

The campaign to mitigate (prevent) global warming by reducing CO2 emissions is typified by the Paris accord, an ongoing political theater providing cover for elected officials. Mitigation is not only useless, it distracts from real efforts to prepare for the future. Kahn:

Given that climate change directly impacts city quality of life, one surprising fact is that many leaders from around the world appear to devote more effort to seeing their city become a low-carbon city rather than a resilient city. To an economist, greater effort being devoted to mitigation rather than adaptation is surprising. With the former, there is a free-rider problem. Each city only contributes a small amount of the world’s total emissions; even if a city’s emissions are reduced to zero, its efforts will make no real difference in mitigating climate change.

Therefore, self-interest should drive adaptation efforts. Climate change can be perceived as a medium-term threat that will intensify after today’s leaders are no longer in power. It remains an open question whether elected officials are rewarded for tackling medium-term challenges. If real estate markets are forward looking, then current prices should reflect future threats. If a city develops a reputation for having a declining quality of life, then it will have more trouble attracting and retaining skilled workers. The threat of a brain drain (and lost tourism receipts) should incentivize local leaders to act.

It is also a case where one size does not fit all

In the spatial economics literature, cities differ with respect to their locational characteristics. Some cities feature cold winters, while others feature mountains. Real estate prices and rents adjust across cities so that those with better quality of life have higher rents and lower wages as compensation differential for living there (Rosen 2002). Climate change affects a city’s attributes. For example, a city that has enjoyed a temperate summer climate may now be much warmer during summer months.

In the typical urban economics model, a city’s attributes are all common knowledge. When considering the urban consequences of climate change, a city’s future attributes should be thought of as a random vector. For example, we do not know exactly what challenges Singapore will face in the year 2025 resulting from higher temperatures and/or rising sea levels. We can form expectations of these random variables, but we know that we do not know these future outcomes.

Facing uncertainty about how the quality of life will evolve in different cities, the theory of option value suggests it is important that Asia’s urban poor have as many possible destinations to move as possible. Such a menu of cities protects individuals and creates competition. Every city and country features locations of “higher ground” that are ostensibly more protected from the impacts of climate change. Advances in spatial mapping software can pinpoint such areas. If cities change their land use patterns to allow for higher densities in such areas, then adaptation is promoted.

If a population can self-protect from emerging threats using affordable technologies and real-time information, and if local governments can do a better job protecting urban residents from risk, then the historical relationship between risk and negative outcomes can be attenuated. This is a new version of the “Lucas critique,” which argues that as governments change the rules of the game, economic agents reoptimize and past relationships between consumption and income no longer hold (Lucas 1976).

In the case of climate adaptation, Mother Nature changes the rules of the game and economic actors change their decision making to reduce their risk exposure. Forward-looking households and firms should be making investments to become more nimble in the face of increased exposure to heat, drought, and climate volatility. The net effect should be that B1 in equation (1) shrinks toward zero over time.

Summary

Two schools of thought regarding future climates:

Mitigation: Cut down on use of fossil fuels to mitigate or prevent future global warming.
Adaptation: As changes occur, adapt our methods and practices to survive and prosper in new conditions.

The Paris Agreement and various cap-and-trade schemes intend to Mitigate future warming. Lots of gloom and doom is projected (forecast) by activists claiming mitigation is the only way. But the facts of our experience say otherwise. Building Adaptive Cities means investing in resilience, preparing for future periods both colder and warmer than the present. Key objectives include reliable, affordable energy and robust infrastructure.

See also: Adapt, Don’t Fight Climate Change

Mitigation is Bad for Us and the Planet


New Icebreaker Tanker Opens NSR

Its strong, ultra-lightweight steel-reinforced hull makes it the largest commercial ship to receive Arc7 certification and allows it to sail through ice up to 2.1 meters thick.

Alarmist media like the Guardian are claiming this event signals the demise of Arctic ice, when in fact it is a triumph of modern technology over natural challenges.  Headlines are announcing a tanker transited the Northern Sea Route (NSR) without icebreaker escort, neglecting to mention that the vessel in question is an icebreaker that functions as a tanker.

Operated by Sovcomflot on behalf of Total, Novatek, CNPC and the Silk Road Fund, this 300 meter long carrier has a capacity of 172,600 cubic meters of LNG.

From Total Inaugurates the Northern Sea Route with LNG Carrier Christophe de Margerie

It’s on its way! After loading its cargo at the Snøhvit LNG export terminal in Norway, in which Total has an 18.4% interest, the Christophe de Margerie is taking the Northern Sea Route to Boryeong in South Korea, where it will deliver a cargo for Total Gas & Power. It’s the first unescorted merchant LNG vessel ever to take this route, which makes it possible to reach Asia via the Bering Strait in 15 days versus 30 days via the Suez Canal.

This technological feat was made possible through the participation of Total teams in the design of these next-generation LNG carriers.  Packed with technology, they efficiently transport large quantities of LNG year-round, without requiring escort icebreakers during the period from July to November.  The Christophe de Margerie is the first of a total of 15 planned LNG carriers that will be gradually deployed.

They have been specially designed for Yamal LNG, a flagship Total project (20%) in northern Russia to develop the giant onshore South Tambey gas and condensate field with the construction of a liquefaction plant. Ultimately, close to 16.5 million tons of LNG a year will transit through the port of Sabetta, built specifically for the project.

Energy companies are planning for ice and are building equipment to deal with it.  This is evidence not of global warming but of human ingenuity.

Climate Hype Running Amok

The climate hype machine has switched into overdrive with the release of a draft US climate assessment report.  Read about it with caution, as explained in recent post Impaired Climate Vision.

Meanwhile a good synopsis of wrong-headed thinking about climate is provided by John Stossel in a review of Al Gore’s science fictions (here).  (Excerpts below with my bolds)

John Stossel: Al Gore and me — Don’t believe the hype

I was surprised to discover that Al Gore’s new movie begins with words from me!

While icebergs melt dramatically, Gore plays a clip of me saying, “‘An Inconvenient Truth’ won him an Oscar, yet much of the movie is nonsense. ‘Sea levels may rise 20 feet’ — absurd.” He used this comment from one of my TV shows.

The “20 feet” claim is absurd — one of many hyped claims in his movie.

His second film, “An Inconvenient Sequel,” shows lower Manhattan underwater while Gore intones: “This is global warming!”

My goodness! Stossel doubts Al Gore’s claim, but pictures don’t lie: The 9/11 Memorial is underwater! Gore is right! Stossel is an ignorant fool!

But wait. The pictures were from Superstorm Sandy. Water is pushed ashore during storms, especially “super” storms. But average sea levels haven’t risen much.

Over the past decade, they have risen about 1 inch. But this is not because we burn fossil fuels. Sea levels were rising long before we burned anything. They’ve been rising about an inch per decade for a thousand years.

In his new movie, Gore visits Miami Beach. No storm, but streets are flooded! Proof of catastrophe!

But in a new e-book responding to Gore’s film, climate scientist Roy Spencer points out that flooding in “Miami Beach occurs during high tides called ‘king tides,’ due to the alignment of the Earth, sun and moon. For decades they have been getting worse in low-lying areas of Miami Beach where buildings were being built on reclaimed swampland.”

It’s typical Al Gore scaremongering: Pick a place that floods every year and portray it as evidence of calamity.

Spencer, a former NASA scientist who co-developed the first ways of monitoring global temperatures with satellites, is no climate change “denier.” Neither am I. Climate changes.

Man probably plays a part. But today’s warming is almost certainly not a “crisis.” It’s less of a threat than real crises like malaria, terrorism, America’s coming bankruptcy, etc. Even if increasing carbon dioxide warming the atmosphere were a serious threat, nothing Al Gore and his followers now advocate would make a difference.

“What I am opposed to is misleading people with false climate science claims and alarming them into diverting vast sums of the public’s wealth into expensive energy schemes,” writes Spencer.

Gore does exactly that. He portrays just about every dramatic weather event as proof that humans have changed weather. Watching his films, you’d think that big storms and odd weather never occurred before and that glaciers never melted.

In his first movie, Gore predicted that tornadoes and hurricanes would get worse. They haven’t. Tornado activity is down.

What about those dramatic pictures of collapsing ice shelves?

“As long as snow continues to fall on Antarctica,” writes Spencer, “glaciers and ice shelves will continue to slowly flow downhill to the sea and dramatically break off into the ocean. That is what happens naturally, just as rivers flow naturally to the ocean. It has nothing to do with human activities.”

Gore said summer sea ice in the Arctic would disappear as early as 2014. Nothing like that is close to happening.

Gore’s movie hypes solar power and electric cars but doesn’t mention that taxpayers are forced to subsidize them. Despite the subsidies, electric cars still make up less than 1 percent of the market.

If electric cars do become more popular, Spencer asks, “Where will all of the extra electricity come from? The Brits are already rebelling against existing wind farms.”

I bet most Gore fans have no idea that most American electricity comes from natural gas (33 percent), coal (30 percent) and nuclear reactors (20 percent).

Gore probably doesn’t know that.

I’d like to ask him, but he won’t talk to me. He won’t debate anyone.

Critics liked “An Inconvenient Sequel.” An NPR reviewer called it “a hugely effective lecture.” But viewers were less enthusiastic. On Rotten Tomatoes, my favorite movie guide, they give “Sequel” a “tipped over popcorn bucket” score of 48 percent. Sample reviews: “Dull as can be.” “Faulty info, conflated and exaggerated.”

Clearly, Nobel Prize judges and media critics are bigger fans of big government and scaremongering than the rest of us.

Summary:  Al Gore is Jumping the Shark

“Jumping the shark” is attempting to draw attention to or create publicity for something that is perceived as not warranting the attention, especially something that is believed to be past its peak in quality or relevance. The phrase originated with the TV series “Happy Days” when an episode had Fonzie doing a water ski jump over a shark. The stunt was intended to perk up the ratings, but it marked the show’s low point ahead of its demise.

Judiciary Climate Confusion

On August 8, 2017, I hailed a ruling by the DC Court of Appeals as a “gamechanger”, since the text was the most sensible thinking from judges I have seen regarding the climate issue.  More on that decision later on.  Then yesterday we got a ruling from the same court coming down on the opposite, “same old, same old” side.  Looking at the two rulings reveals how the judiciary is struggling with claims of global warming/climate change.  Let’s look at the most recent decision first.

Overview of the August 22 Ruling on Sabal Trail Florida Pipeline Project

Activists won a huge victory when a Washington, D.C. appellate court panel sided with the Sierra Club, saying the federal agency that reviewed the project had made a huge error. In the narrow 2-1 decision, U.S. Circuit Judge Thomas B. Griffith wrote that the Federal Energy Regulatory Commission (FERC) should have considered the impact of the pipeline’s added greenhouse gas emissions.

Though the D.C. judges said there was nothing wrong with FERC’s consideration of the poor, minority residents who live along the pipeline, they did agree the agency failed to estimate the amount of carbon emissions that would be generated by Sabal Trail. (Celebratory language from thinkprogress)

Synopsis of Ruling and Dissenting Opinion

The court document is On Petitions for Review of Orders of the Federal Energy Regulatory Commission, heard by Circuit Judges Rogers, Brown, and Griffith.  Opinion for the Court filed by Circuit Judge GRIFFITH.  Opinion concurring in part and dissenting in part filed by Circuit Judge BROWN. (Excerpts below provide a synopsis, my bolds.)

The Case and the Ruling

The three segments of the project have different owners, but they share a common purpose: to serve Florida’s growing demand for natural gas and the electric power that natural gas can generate. At present, only two major natural-gas pipelines serve the state, and both are almost at capacity. Two major utilities, Florida Power & Light and Duke Energy Florida, have already committed to buying nearly all the gas the project will be able to transport. Florida Power & Light claims that without this new project, its gas needs will begin to exceed its supply this year. But the project’s developers also indicate that the increased transport of natural gas will make it possible for utilities to retire older, dirtier coal-fired power plants.

Despite these optimistic predictions, the project has drawn opposition from several quarters. Environmental groups fear that increased burning of natural gas will hasten climate change and its potentially catastrophic consequences. Landowners in the pipelines’ path object to the seizure of their property by eminent domain. And communities on the project’s route are concerned that pipeline facilities will be built in low-income and predominantly minority areas already overburdened by industrial polluters.

The Federal Energy Regulatory Commission (FERC) launched an environmental review of the proposed project in the fall of 2013. The agency understood that it would need to prepare an environmental impact statement (EIS) before approving the project, as the National Environmental Policy Act of 1969 (NEPA) requires for each “major Federal action[] significantly affecting the quality of the human environment.” See 42 U.S.C. § 4332(2)(C). FERC solicited public comment and held thirteen public meetings on the project’s environmental effects, and made limited modifications to the project plan in response to public concerns, before releasing a draft impact statement in September 2015 and a final impact statement in December 2015.

The role of the courts in reviewing agency compliance with NEPA is accordingly limited. Furthermore, because NEPA does not create a private right of action, we can entertain NEPA-based challenges only under the Administrative Procedure Act and its deferential standard of review. See Theodore Roosevelt Conservation P’ship v. Salazar, 616 F.3d 497, 507 (D.C. Cir. 2010). That is, our mandate “is ‘simply to ensure that the agency has adequately considered and disclosed the environmental impact of its actions and that its decision is not arbitrary or capricious.’”

To sum up, the EIS acknowledged and considered the substance of all the concerns Sierra Club now raises: the fact that the Southeast Market Pipelines Project will travel primarily through low-income and minority communities, and the impact of the pipeline on the city of Albany and Dougherty County in particular. The EIS also laid out a variety of alternative approaches with potential to address those concerns, including those proposed by petitioners, and explained why, in FERC’s view, they would do more harm than good. The EIS also gave the public and agency decisionmakers the qualitative and quantitative tools they needed to make an informed choice for themselves. NEPA requires nothing more.

It’s not just the journey, though, it’s also the destination. All the natural gas that will travel through these pipelines will be going somewhere: specifically, to power plants in Florida, some of which already exist, others of which are in the planning stages. Those power plants will burn the gas, generating both electricity and carbon dioxide. And once in the atmosphere, that carbon dioxide will add to the greenhouse effect, which the EIS describes as “the primary contributing factor” in global climate change. J.A. 915. The next question before us is whether, and to what extent, the EIS for this pipeline project needed to discuss these “downstream” effects of the pipelines and their cargo. We conclude that at a minimum, FERC should have estimated the amount of power-plant carbon emissions that the pipelines will make possible.

BROWN, Circuit Judge, concurring in part and dissenting in part:

I join today’s opinion on all issues save the Court’s decision to vacate and remand the pipeline certificates on the issue of downstream greenhouse emissions. Case law is clear: When an agency “‘has no ability to prevent a certain effect due to’ [its] ‘limited statutory authority over the relevant action[],’ then that action ‘cannot be considered a legally relevant cause’” of an indirect environmental effect under the National Environmental Policy Act (“NEPA”).

Here, FERC declined to engage in an in-depth examination of downstream greenhouse gas emissions because there is no causal relationship between approval of the proposed pipelines and the downstream greenhouse emissions; and, even if a causal relationship exists, any additional analysis would not meaningfully contribute to its decision making. Both determinations were reasonable and entitled to deference.

Regarding causation, the Court is correct that NEPA requires an environmental analysis to include indirect effects that are “reasonably foreseeable,” Freeport, 827 F.3d at 46, but it misunderstands what qualifies as reasonably foreseeable. The Court blithely asserts it is “not just the journey,” it is “also the destination.” Maj. Op. at 18. In fact, NEPA is a procedural statute that is all about the journey. It compels agencies to consider all environmental effects likely to result from the project under review, but it “does not dictate particular decisional outcomes.”

While the Court concludes FERC’s approval of the proposed pipelines will be the cause of greenhouse gas emissions because a significant portion of the natural gas transported through the pipeline will be burned at power plants, see Maj. Op. at 19, the truth is that FERC has no control over whether the power plants that will emit these greenhouse gases will come into existence or remain in operation.

Even if the Court is correct that the Commission has the power to deny pipeline certificates based on indirect environmental concerns, such a denial represents the limit of the Commission’s statutory power. Nothing would prevent the Florida Board from independently approving the construction or expansion of the power plants at issue. In fact, the record shows the Board has already approved some of these projects prior to the Commission reaching a decision on the proposed pipelines. JA 910–11. Moreover, there is also nothing preventing the Intervenors from pursuing an alternative method of delivery to account for the same amount of natural gas. Practical considerations point in the opposite direction. Both the Board and the Commission have concluded Florida has a need for additional natural gas, and nothing in today’s opinion takes issue with those holdings.

Thus, just as FERC in the DOE cases and the Federal Motor Carrier Safety Administration in Public Citizen did not have the legal power to prevent certain environmental effects, the Commission here has no authority to prevent the emission of greenhouse gases through newly-constructed or expanded power plants approved by the Board.

The DC Appeals Court Decision August 8, 2017

Overview

A major clarification came today from the DC Court of Appeals ordering EPA (and thus the Executive Branch Bureaucracy) to defer to Congress regarding regulation of substances claimed to cause climate change.  While the issue and arguments are somewhat obscure, the clarity of the ruling is welcome.  Basically, the EPA under Obama attempted to use ozone-depleting authority to regulate HFCs, claiming them as greenhouse gases.  The judges decided that was a stretch too far.

The EPA enacted the rule in question in 2015, responding to research showing hydrofluorocarbons, or HFCs, contribute to climate change.

The D.C. Circuit Court of Appeals’ 2-1 decision said EPA does not have the authority to enact a 2015 rule-making ending the use of hydrofluorocarbons commonly found in spray cans, automobile air conditioners and refrigerators. The three-judge panel said that because HFCs are not ozone-depleting substances, the EPA could not use a section of the Clean Air Act targeting those chemicals to ban HFCs.

“Indeed, before 2015, EPA itself maintained that Section 612 did not grant authority to require replacement of non ozone-depleting substances such as HFCs,” the court wrote.

“EPA’s novel reading of Section 612 is inconsistent with the statute as written. Section 612 does not require (or give EPA authority to require) manufacturers to replace non ozone-depleting substances such as HFCs,” said the opinion, written by Judge Brett Kavanaugh, joined by Judge Janice Brown.

Contextual Background from the Court Document On Petitions for Review of Final Action by the United States Environmental Protection Agency.  Excerpts below (my bolds).

In 1987, the United States signed the Montreal Protocol. The Montreal Protocol is an international agreement that has been ratified by every nation that is a member of the United Nations. The Protocol requires nations to regulate the production and use of certain ozone-depleting substances.

As a result, in the 1990s and 2000s, many businesses stopped using ozone-depleting substances in their products. Many businesses replaced those ozone-depleting substances with HFCs. HFCs became prevalent in many products. HFCs have served as propellants in aerosol spray cans, as refrigerants in air conditioners and refrigerators, and as blowing agents that create bubbles in foams.

In 2013, President Obama announced that EPA would seek to reduce emissions of HFCs because HFCs contribute to climate change.

Consistent with the Climate Action Plan, EPA promulgated a Final Rule in 2015 that moved certain HFCs from the list of safe substitutes to the list of prohibited substitutes. . .In doing so, EPA prohibited the use of certain HFCs in aerosols, motor vehicle air conditioners, commercial refrigerators, and foams – even if manufacturers of those products had long since replaced ozone-depleting substances with HFCs. Id. at 42,872-73.

Therefore, under the 2015 Rule, manufacturers that used those HFCs in their products are no longer allowed to do so. Those manufacturers must replace the HFCs with other substances that are on the revised list of safe substitutes.

In the 2015 Rule, EPA relied on Section 612 of the Clean Air Act as its source of statutory authority. EPA said that Section 612 allows EPA to “change the listing status of a particular substitute” based on “new information.” Id. at 42,876. EPA indicated that it had new information about HFCs: Emerging research demonstrated that HFCs were greenhouse gases that contribute to climate change. See id. at 42,879. EPA therefore concluded that it had statutory authority to move HFCs from the list of safe substitutes to the list of prohibited substitutes. Because HFCs are now prohibited substitutes, EPA claimed that it could also require the replacement of HFCs under Section 612(c) of the Clean Air Act even though HFCs are not ozone-depleting substances.

EPA’s current reading stretches the word “replace”  beyond its ordinary meaning. . .
Under EPA’s current interpretation of the word “replace,” manufacturers would continue to “replace” an ozone-depleting substance with a substitute even 100 years or more from now. EPA would thereby have indefinite authority to regulate a manufacturer’s use of that substitute. That boundless interpretation of EPA’s authority under Section 612(c) borders on the absurd.

In any event, the legislative history strongly supports our conclusion that Section 612(c) does not grant EPA continuing authority to require replacement of non-ozone-depleting substitutes.. . In short, although Congress contemplated giving EPA broad authority under Title VI to regulate the replacement of substances that contribute to climate change, Congress ultimately declined.

However, EPA’s authority to regulate ozone-depleting substances under Section 612 and other statutes does not give EPA authority to order the replacement of substances that are not ozone depleting but that contribute to climate change. Congress has not yet enacted general climate change legislation. Although we understand and respect EPA’s overarching effort to fill that legislative void and regulate HFCs, EPA may act only as authorized by Congress. Here, EPA has tried to jam a square peg (regulating non-ozone depleting substances that may contribute to climate change) into a round hole (the existing statutory landscape).

The Supreme Court cases that have dealt with EPA’s efforts to address climate change have taught us two lessons that are worth repeating here. See, e.g., Utility Air Regulatory Group v. EPA, 134 S. Ct. 2427 (2014). First, EPA’s well intentioned policy objectives with respect to climate change do not on their own authorize the agency to regulate. The agency must have statutory authority for the regulations it wants to issue. Second, Congress’s failure to enact general climate change legislation does not authorize EPA to act. Under the Constitution, congressional inaction does not license an agency to take matters into its own hands, even to solve a pressing policy issue such as climate change.

Summary

As the August 8 ruling makes clear, judges are making decisions in a legislative vacuum.  The law of the land regarding “greenhouse gases” has yet to be enacted.  Congress could put on their big boy pants and pass a declaration that CO2 cannot be considered a “pollutant” as classified by the Clean Air Act.  Failing that, rulings will come down on all sides, unless and until the Supreme Court takes up the issue.  I suppose then Justice Kennedy will make the call.

Judges are not experts in the fields of knowledge involved in cases that come before them.  Instead, they make decisions as “reasonable people”, nominated in fact as the most reasonable people we can find in our society.  How disappointing it is to see many of them accepting social proof instead of weighing the evidence undermining the notion of CO2 as the climate control knob.  How frustrating to see some of them twisting and stretching to kowtow to a hypothetical “consensus” rather than apply the law as written.

Footnote:  It appears that in July, before taking the position of reason in both these rulings, Judge Janice Brown announced her upcoming retirement.

Notes from the Judicial Climate Battleground

Flashes of lucidity from the bench in recent proceedings Climate Case: Judge Defends Rule of Law

On lawsuits fronted by children Climate War Human Shields

Legal entanglements for corporations How Climate Law Relies on Paris

On the legal case exonerating fossil fuels Claim: Fossil Fuels Cause Global Warming


August Arctic Ice Unalarming


The image above shows ice extents for day 233 from 2007 to 2017.  Particularly interesting is the variation in the CAA (Canadian Arctic Archipelago), crucial for the Northwest Passage.  (The region is located just north of the words “Ice Extent” in gold.)  Note that 2016 was a fine year for cruising, with the passage completely open at this date.  That was not the case in 2014, and this year it is also frozen solid.

The graph of August NH ice extents shows 2017 virtually tied with the decadal average as of yesterday.  This year is now 550k km2 greater than 2016 and exceeds 2007 by 250k km2.  SII (Sea Ice Index) shows 2017 about 400k km2 lower than MASIE.  A previous post, Beware the Arctic Storms of August, discussed how late summer storms have dramatic impacts, and the graph shows both 2012 and 2016 plummeting in the last five days.  By the end of the month, nine days from now, those two years drop below 4.4M km2.

The table compares 2017 day 233 ice extents with the decadal average and with 2007.  It is evident that this year’s extents are in surplus on the Canadian side and in the Central Arctic, offset by deficits on the Pacific side.

| Region | 2017 day 233 | Day 233 Average | 2017-Ave. | 2007 day 234 | 2017-2007 |
|---|---|---|---|---|---|
| (0) Northern_Hemisphere | 5634884 | 5652704 | -17820 | 5388004 | 246880 |
| (1) Beaufort_Sea | 569472 | 643245 | -73773 | 731647 | -162175 |
| (2) Chukchi_Sea | 285855 | 399788 | -113934 | 222895 | 62959 |
| (3) East_Siberian_Sea | 386603 | 517871 | -131268 | 81989 | 304614 |
| (4) Laptev_Sea | 308812 | 244158 | 64655 | 295384 | 13428 |
| (5) Kara_Sea | 65151 | 86439 | -21288 | 161780 | -96628 |
| (6) Barents_Sea | 33482 | 22883 | 10599 | 18656 | 14826 |
| (7) Greenland_Sea | 184582 | 222908 | -38326 | 335976 | -151394 |
| (8) Baffin_Bay_Gulf_of_St._Lawrence | 92400 | 37803 | 54597 | 51008 | 41392 |
| (9) Canadian_Archipelago | 494273 | 353728 | 140546 | 325028 | 169245 |
| (10) Hudson_Bay | 34936 | 43613 | -8677 | 61078 | -26141 |
| (11) Central_Arctic | 3178159 | 3079193 | 98966 | 3101306 | 76853 |
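The two difference columns are simple subtractions from the regional extents; any off-by-one discrepancies against the table come from rounding in the source values. A short sketch reproducing a few rows, with the numbers copied from the table above:

```python
# Reproduce the "2017-Ave." and "2017-2007" columns for selected regions.
# Extents in km2, copied from the table above.
rows = {
    "Beaufort_Sea":         {"y2017": 569472,  "avg": 643245,  "y2007": 731647},
    "Canadian_Archipelago": {"y2017": 494273,  "avg": 353728,  "y2007": 325028},
    "Central_Arctic":       {"y2017": 3178159, "avg": 3079193, "y2007": 3101306},
}

for region, r in rows.items():
    vs_avg = r["y2017"] - r["avg"]    # surplus (+) or deficit (-) vs. decadal average
    vs_2007 = r["y2017"] - r["y2007"]
    print(f"{region:22s} 2017-Ave.: {vs_avg:+10,d}   2017-2007: {vs_2007:+10,d}")
```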

By the way, Barents is still above average and just added some ice.

The black line is average for the last 11 years.  2007 in purple appears close to an average year.  2014 had the highest annual extent in Barents Sea, due to higher and later maximums, holding onto ice during the summer, and recovering quickly.  In contrast, 2016 was the lowest annual extent, melting out early and recovering later.  2017 in blue started out way behind, but grew rapidly to reach average, and then persisted longer to exceed even 2014.  It will be important to see when the recovery of ice begins.

For more on why Barents Sea matters see Barents Icicles

Summary and Outlook

As discussed above, the weather over these next few weeks will determine the fate of ice extents.  Here is the Arctic Oscillation and Polar Vortex Analysis and Forecast from August 21, 2017, by Dr. Judah Cohen of Atmospheric and Environmental Research (AER).

The AO is currently neutral (Figure 1), reflective of mostly mixed geopotential height anomalies across the Arctic and mixed geopotential height anomalies across the mid-latitudes of the NH (Figure 2). Geopotential height anomalies are also mixed across Greenland and Iceland (Figure 2), and therefore the NAO is also neutral. However blocking/high pressure will strengthen first across Northern Canada and eventually across Greenland forcing the AO/NAO into negative territory over the next two weeks.

New snowfall is also predicted over the Arctic sea ice over the coming two weeks, a sign that the Arctic sea ice melt season is coming to an end. It was never in doubt that once again Arctic sea ice would be well below normal this summer. However, predicted new snowfall will retard sea ice melt and a new record low minimum seems unlikely. The trajectory of sea ice extent seems similar to 2007, 2011, 2015 and 2016 and will likely bottom out close to that group of years. If the melt season ends early then there is an outside chance it could even match the years of 2008 and 2010.

Footnote

Some people unhappy with the higher amounts of ice extent shown by MASIE continue to claim that Sea Ice Index is the only dataset that can be used.  This is false in fact and in logic.  Why should anyone accept that the highest quality picture of ice day to day has no shelf life, or that one year’s charts cannot be compared with another year’s?  Researchers do make such comparisons, including Walt Meier, who is in charge of Sea Ice Index.  That said, I understand his interest in directing people to use his product rather than one he does not control.  As I have said before:

MASIE is rigorous, reliable, serves as calibration for satellite products, and continues the long and honorable tradition of naval ice charting using modern technologies.  More on this at my post Support MASIE Arctic Ice Dataset