Energy Changes Society: Transition Stories

Energy Sources and the Rise of Civilization. Source: Bill Gates

Richard Rhodes won the Pulitzer Prize for The Making of the Atomic Bomb. He has written many other books, and his latest, about energy, is called Energy: A Human History.

He takes us on a journey through the history of energy transitions, from wood to coal to oil to renewables and beyond. Some stories are well known, some much less so: a fascinating set of characters reaching all the way back to Elizabethan England.

The book also offers fascinating insights into how energy history can help us understand a possible future transition toward a lower-carbon economy, providing affordable, reliable and sustainable energy for a growing global population. Rhodes shared some transition stories in an interview with Jason Bordoff at Columbia University, in a podcast and transcript entitled Richard Rhodes — Energy: A Human History. Excerpts below in italics are from Rhodes unless otherwise indicated, with my light editing, headers, bolds and images.

Jason Bordoff: I think some people may be familiar with more recent energy innovations: electricity obviously, oil, nuclear power. But you really start with animals and with wood, what it meant for human civilization as we know it to depend on those energy sources, and how transformative it was to convert first to coal and then beyond. Talk a little bit about how big a deal that was and what it meant for the human experience as we know it.

From Wood to Coal in Elizabethan England

Richard Rhodes: Well, the story of the transition by the Elizabethan English from wood to coal was one of the most fascinating and in some ways comical, although of course it wasn’t comical for them. One of the things I wanted to do with this book was to tell the human stories behind the technologies involved, since so many books on the history of energy focus almost entirely on the technological changes.

The Wood Burning Society

And of course there are vast human stories, because changing from one source of energy to another is as much a social phenomenon as a technical one, perhaps more so. So, the Elizabethans had been cutting down their trees in vast numbers, primarily for firewood for their homes. And they burned firewood usually on stone platforms or fireplaces set against the wall that didn’t have chimneys.

They liked the smell of wood, and they thought that the smoke hardened their rafters, so either there was just a hole in the roof leading straight up from the fireplace, or they let the smoke drift through the rooms and out through the windows. Well, that was fine as long as they had enough wood, but as they cut the wood down farther and farther away from London, it got more and more expensive to transport.

Substituting Coal for Wood

So, eventually it reached the point where wood was really too expensive for the common people to afford. At that point the only alternative they had was really smelly bituminous coal from Newcastle, up the country in the northeast, and they didn’t like its characteristics compared to wood. First of all, imagine lighting a bituminous coal fire in the middle of your living room with no place for the smoke to go, and imagine what you’d be coughing up and breathing in. And then on top of that, imagine roasting your beef, your good English beef, over a coal fire with all the sulfur that’s in coal smoke.

In a way England was just one vast coal mine, made of these layers of coal which was black and dirty and smelled sulfurous when you burned it; it was literally the devil’s excrement. If the devil had hell down in the center of the earth, this is where his body waste accumulated, welling up toward the surface. Well, that obviously didn’t endear coal to the populace. So, they really struggled with it, and basically what happened is the rich kept buying wood, which they could afford, and the poor had to find a way to survive with coal, and they hated it.

A New King Adopts Coal

The transition really was a social transition. When Elizabeth died in 1603, at the start of the seventeenth century, James VI of Scotland became the King of England. And he came down to London as James I, and the Scots, who had much thinner forests up north than the English had, had already switched to coal long before. They had been working with coal for a hundred years.

And Scottish coal was better quality; it didn’t have so much sulfur in it. So when the King came to London and started burning coal in the castle, it became fashionable. Well, if the King does it, I suppose we can too, was the result, and after that the transition was much facilitated. In addition, they had to retrofit all the homes that didn’t have proper chimneys with chimneys, which is another lesson that extends across the entire history of energy transitions.

Converting Society to Burn Coal

In this regard it seems so simple: you find a new source of energy when the old one is causing you trouble, and you switch over to it. I mean, that’s the way people are talking today about wind and solar and other renewables. But it turns out it takes anywhere from 50 to 100 years to make a full-scale energy transition, because it’s not just a matter of the technology at all; it’s a matter of all sorts of social and societal changes.

In this case, for example, they had to retrofit all the chimneys. They had to open coal mines and find a way to transport the coal down to London. They had to develop markets where they would sell the coal. And most of all, you had to figure out how to burn it in your home without making the place uninhabitable. So, it took a while. It was not really until the 1650s and 1660s that coal had really taken hold in England, and then of course they had the problem of air pollution.

Inventing the Coal Industry and Society

Well, just staying with the English: once they started digging coal, they first dug, of course, the superficial layers that tended to outcrop on hillsides. So, they could easily drain their mines just by putting in what they called adits, which were basically channels for the water to flow out. But as they continued to dig deeper, having used up the superficial coal, they began to intersect the water table, and the mines began flooding. They tried pumping them out with horses and what were called gins, which were basically horse-turned pumps.

But that got more and more difficult as the mines continued to deepen; they were going down as far as 800 feet below ground to dig their coal. It’s hard to pump water that far with just a couple of horses. So, the solution they found as time went on, and this is now the early to mid eighteenth century, was to develop an engine: an early form of steam engine that was very inefficient, converting less than 1% of its fuel’s energy into work.

So, it was a big thing, the size of a house; it would sit on top of the coal mine opening at the surface and pump out the water so that the mines could continue to be mined. This was the Newcomen engine, which basically produced a vacuum that allowed atmospheric pressure to rush in and function as a pump. That limited its function to the pressure of the atmosphere, about 32 feet of lift. And therefore there continued to be a desire for innovation, a better way to pump water farther, because suppose you had, let’s say, a 300-foot shaft in a coal mine.

Newcomen Atmospheric Steam Engine

Energy Necessity Calls for Innovation

The only way you could pump with a Newcomen engine would be to put engines every 32 or so feet up and down the shaft, which was not a very efficient idea, especially since coal mines tend to release a certain amount of methane and other gases, and there were lots of explosions that people had to deal with. So, it quickly became apparent that there was a place for a better steam engine. That’s where James Watt the Scotsman came along and invented a true steam engine, one that worked by using expanding steam to push the piston back and forth.
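The 32-foot figure is simple physics: a vacuum engine can only lift water as high as atmospheric pressure can push it, h = P/(ρg). A quick back-of-the-envelope sketch, using standard textbook values, reproduces both the limit and the absurdity of staging Newcomen engines down a deep shaft:

```python
import math

# Maximum height of a water column that atmospheric pressure can support:
# h = P_atm / (rho * g). This is the hard ceiling on a vacuum pump's lift.
P_ATM = 101_325.0    # standard atmospheric pressure, Pa
RHO_WATER = 1000.0   # density of water, kg/m^3
G = 9.81             # gravitational acceleration, m/s^2
FT_PER_M = 3.281     # feet per metre

max_lift_m = P_ATM / (RHO_WATER * G)
max_lift_ft = max_lift_m * FT_PER_M
print(f"Theoretical lift: {max_lift_ft:.0f} ft")   # ~34 ft; ~32 ft in practice

# Relaying water up a 300-foot shaft at ~32 ft of lift per engine:
shaft_ft = 300
engines_needed = math.ceil(shaft_ft / 32)
print(f"Engines needed for a {shaft_ft} ft shaft: {engines_needed}")
```

The theoretical ceiling is about 34 feet; friction and imperfect vacuum brought the practical figure down to the roughly 32 feet Rhodes cites, which is why a 300-foot shaft would need around ten relay engines, and why Watt's positive-pressure engine was such an improvement.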

And it could pump as much as its capacity was built to pump. Then of course they had the problem of moving the coal from the mine down to the river or the ocean in order to barge it to London. Again, moving stuff around turns out to be a large part of the problem in dealing with these forms of energy. At first the mines were close enough to the water to simply put the coal on a cart and roll it downhill. They used rails to do that, originally wooden rails, but then they started covering the wooden rails with cast-iron plates to make them more efficient.

A late version of a Watt double-acting steam engine, built by D. Napier & Son (London) in 1859.

Transformation into a Coal Society

And you know, once you switch from wood to coal, and as the country began to industrialize, particularly with the advent of the steam engine, coal production got more and more enormous. And it wasn’t just a matter of heating homes anymore; it became a matter of running factories as well. And you couldn’t do that with bags of coal on the saddles of horses; you needed some larger-scale way to move the coal around.

Once the mines were farther back from the valleys where the rivers ran, or the canals as they came to be, you had to find a way to move the material uphill as well as downhill. And horses weren’t going to do that job, not at the scale England was operating by then. So, someone realized that if you had a small steam engine, and by then Watt’s engines could be made fairly small, you could mount it on wheels and move the coal with the steam engine, which is essentially a railroad locomotive.

One of the early steam engine locomotives.

The Coal-Based Society Emerges

This whole story is about how self-reinforcing all of these things were. You need to go deeper and deeper to get the coal for heating and cooking. You develop an innovation like the steam engine to do that, which enables an innovation to transport the coal, which then uses the very energy you were trying to extract to power that new innovation.

And once it was clear that you could move railroad carts of coal with a steam engine, someone realized that you could move people too. And England suddenly blossomed with railroads all over the country. The canal age was over and the railroad age began, and it all followed from this early transition from wood to coal. And with it came all the industrialization that followed from a stable, reliable source of continual power, which water power had not been and animal power could not match.

Here was an engine: you fed it coal and it gave you a turning wheel that would turn mills to weave cotton, turn mills to make steel, whatever you needed to do. So, it really was an innovation that went stage by stage, piggybacking from one to the next, really a fascinating transition in history.

The Downside of Coal Energy

One of the interesting things is very clear, and it’s as clear today in Beijing as it was in London in 1660. The first thing you do is get your energy; you do what you have to do to increase the energy supply to your country or your society. Then, as a kind of luxury good in a way, you start looking at how to reduce the baleful side effects, such as air pollution, that come along with that source of energy.

The first paper published by the newly formed Royal Society of London in 1664 was a study of how to improve the air in London, and it was remarkably similar to today’s ideas: move industry into the suburbs, ring the city with plant life, trees. So, this particular writer proposed all sorts of wonderful trees that put out perfume during their flowering season, to be planted in a belt around London.

The King, having just been restored to power after the Roundhead Revolution that had caused his father to be beheaded, was much too busy selling monopolies and refilling his coffers to actually do anything about it. But the point is that people were thinking about this prospect, and it was not different from what happened in Pittsburgh at the turn of the century, when the city was so filled with coal smoke that from a nearby hill you could barely see it.

Pittsburgh train station in the 1940s.

Pittsburgh Faces Coal Air Pollution

And that was pretty much true in most American cities up until the 1950s. As they went about cleaning up their air in the early 1950s, there was a proposal by the United States government to share the cost of building the first commercial nuclear power plant in the United States, at a place called Shippingport near Pittsburgh on the river. I talked to the president of Duquesne Light, which was the company that was going to be the private contractor for this power plant.

He said, you know, we sold this power plant to the city council of Pittsburgh as a green technology. People have come to think of nuclear as the devil’s excrement, but compared to burning coal, compared to burning what they had available to burn at the time, nuclear was great, with its total absence of carbon emissions. The past can really inform the present when you look at how things have been done before, why they were done that way, and what lessons they offer us in the process.

Smog in Los Angeles

The stories that I tell are very much intertwined. So let’s jump to Los Angeles in the 1950s, when what we now call smog was beginning to be a very serious problem there. The companies that refined oil in and around Los Angeles wanted to do whatever they could to clean up the air pollution, because it was commonly believed that it all came from their refineries or from trash burning.

Previously, cities and states had focused primarily on coal smoke, on smoke and its baleful effects on the atmosphere. Smog was originally smoke and fog, two words combined. But in the 50s in Los Angeles there was this photochemical phenomenon going on in the atmosphere that was making everything look brown. And the question now was, what do you do about that?

Anti-Smog Device.

Dr. Haagen-Smit at Caltech was carrying out an exercise in identifying the perfume essence of ripe pineapple. He had a room full of ripe pineapples, and he was sucking the air in the room through a machine that used liquid nitrogen to freeze the essential aroma chemicals out of the air. It was to Dr. Haagen-Smit that the Los Angeles County people turned, asking if he could identify the components of this smog that was in the air.

So, he used the same machinery, but he put the pineapples away, opened the window, sucked about 30,000 liters of California smog into the room, ran it through his machine, and ended up with a few drops of very nasty, brownish, sticky material, which was the essence of California photochemical smog, and he identified where it came from. Other sources, like the refineries and so forth, were certainly a part of it.

But the main component was automobile exhaust, and that gave Los Angeles the beginning of what turned out to be a large national struggle with the automobile manufacturers: to get them to put catalytic converters on their automobiles, and eventually to get rid of the nitrogen oxides, another component of automobile exhaust that was deadly. I repeat, this is not merely a technical book; this is really a collection of the most amazing human stories.

Haagen-Smit was of course put down by the great laboratories the automobile companies turned to in an effort to refute his work. But he was a survivor of World War II, so he knew how to make things simple, and with his simple laboratory experiments he was able to identify what needed to be done. Finally, by the 1980s, the entire country was dealing with smog by adding catalytic converters to cars.

Whale Oil and Petroleum

It’s a truism of the oil industry that petroleum saved the whales. And they say that because one of the main sources of lighting for wealthier people was whale oil, which was pretty expensive, particularly spermaceti, the very lovely refined oil that whales carry in their heads as a way of controlling their buoyancy. By heating and cooling the oil in their heads, they can adjust to neutral buoyancy, and therefore don’t sink to the bottom or rise to the top unless they want to.

So, these beautiful whales were used to make candles, and they were taken at the rate of 10,000 whales a year at the height of the whaling industry, as Herman Melville beautifully describes it in Moby Dick. But most people couldn’t afford whale oil; it was a pretty expensive item. What they actually used, and I and most people had never known about this, was something called burning fluid, which was basically the sap of the longleaf pines of the southeastern United States, refined into turpentine, which could then be mixed with plain alcohol.

And with a little bit of menthol to sweeten the smell, because burning turpentine is not a great smell, this became something called burning fluid, which is what almost everyone used in their lamps. One particular brand of burning fluid was called kerosene; we know that name from its later application to petroleum, and I’ll jump to that in a second. So most people burned such lamps, or they simply burned cheap tallow candles, which smell like burning beef fat, not a great smell in your home either. Then came the discovery of petroleum in 1859, or rather the discovery of “rock oil” or “coal oil”. If you could drill for this stuff and pump it out in vast quantities, you could make all the kerosene you wanted.

There had been petroleum seeps in various parts of the country, particularly one in Pennsylvania, where they tried to use the petroleum by soaking it up in blankets as it floated on the surface of streams where it oozed out from underground, squeezing the blankets out into a jug, and then selling that as liniment to rub on your sores and your gums, and to swallow as a health tonic and so forth, if you can imagine. Anyway, then Colonel Drake went off to Oil City, as it came to be called, drilled a well, and showed how you could pump oil out of the ground; indeed, some wells would pump it for you and spout it into the air.

Gasoline, A Dangerous Byproduct, Transforms Society

All of a sudden petroleum was the new stuff, but it wasn’t yet the new stuff for powering machinery; nobody had found that use. Its main use for the next 50 years was for lighting, once they figured out how to refine petroleum into what was now called kerosene, or for lubrication. But since the automobile hadn’t been invented, the refiners had among their waste products this stuff called gasoline, which was much too volatile to put in a lamp.

The lamp would blow up from the fumes. So they would either pour the gasoline out on the ground to evaporate into the air, or they dumped it into the streams and rivers of America in the dark of night, as so much other waste was dumped in those days. Beginning in the 1880s, there was a question among oil refiners whether they would run out of possible uses for their stuff. In a way, the automobile saved petroleum. It was the automobile that came along just at the turn of the century, and the industry took off.

Beyond the Petroleum Society

Jason Bordoff: You write about the disruptive, unexpected consequences of these innovations. As you said, the whales were being slaughtered by the tens of thousands, and then oil was discovered and significantly reduced demand for whale oil. And then you wrote about the automobile, and how one of the major problems at the turn of the century was horse populations and horse manure in cities like London and New York. Problems were sort of solved with technology innovations that we didn’t expect. Does that tell you anything about what’s coming around the corner, and maybe the level of humility we should have about our ability to anticipate it?

Richard Rhodes: We’re now in the middle of what I think is the largest energy transition in human history. And you know, I’ve written so much around this subject, and here was the chance to take a look all the way back to what was really the beginning. One of the things that I discovered when I was working on The Making of the Atomic Bomb, in fact one of the reasons I wrote that book, is that in the early 1980s we seemed to be at a crossroads where the world looked so dangerous, with all the nuclear weapons in the world.

And it seemed to me that if we went back to the beginning and took another look, there might have been alternative pathways that would have led in a safer and better direction. And I thought the same thing might be true for our energy dilemmas of today. So, that’s the reason I wrote the book.

I simply say we have to use every available energy source that isn’t carbon heavy in order to survive this largest of all energy transitions. But much of the world is just in the process of developing; that is to say, people who have lived for millennia in deep poverty are slowly beginning to see the possibility, China being the most obvious example, of moving up to the kind of middle-class lives that we in the United States pretty much take for granted.

So, we have a double problem: increasing the energy usage of large numbers of people around the world while at the same time reducing the carbon intensity of the energy we use. That is a really big challenge, bigger than people realize. And it means we’re not going to be able to sit down and say, well, nuclear is dangerous because once in a while a nuclear power plant blows up; that is true of any energy source, and it is particularly unusual in the nuclear world, by the way.

We’re going to have to find a way to work with nuclear as well as these other energy sources. You cannot power the world on renewables alone. The United States is rich enough that if it really wanted to, it could probably work out a way to run its entire energy economy on renewables, although I don’t think it would be a very efficient system; it would be a very expensive system. But the rest of the world doesn’t really have that luxury. Right now China has on the drawing boards or in development some 125 nuclear power reactors. They are not even meant to deal with global warming; they are to deal with air pollution. And the Chinese are selling coal to the rest of the world, unfortunately.

So, when Germany, for example, decided to eliminate its nuclear power and go all renewables, it found itself compelled by its own energy demands to increase its use of brown coal, which is the most carbon-intensive of all the various kinds of coal. They have actually increased their production of carbon dioxide since they decided to eliminate their nuclear power supply.

The Italians eliminated their nuclear power, so now they buy their electricity from the French, and the French of course are about 80% nuclear, which is pretty hypocritical of the Italians. I mean, this is the kind of discussion that I think we’re all going to have to have: swallow hard and look again at nuclear, look again at all the other sources of energy we can think of that are not carbon producing, to deal with what is history’s most enormous energy transition yet.


Arctic Ice Holding Up June 29

Image: MASIE Arctic ice extents on day 179, 2007 to 2018.

In June 2018, Arctic ice extent held up against previous years despite the Pacific basins of Bering and Okhotsk being ice-free.  The Arctic core is showing little change, perhaps due to increased thickness (volume) as reported by DMI.  The image above shows ice extents on day 179 for years 2007 through 2018.

The graph below shows how the Arctic ice extent has fared in June compared to the 11-year average and to some years of interest.

Graph: Northern Hemisphere ice extent, day 179. Note that 2018 started June well below the 11-year average and below other recent years.  As of day 179 (yesterday) ice extent is matching the average and 2007, and is slightly above 2017, with further losses to come as in previous years.  SII 2018 is tracking the same as MASIE this month.

The table below shows ice extents by regions comparing 2018 with 11-year average (2007 to 2017 inclusive) and 2017.

Region | Day 179 2018 | Day 179 Average | 2018-Ave. | Day 179 2007 | 2018-2007
(0) Northern_Hemisphere | 10029935 | 10054734 | -24798 | 10034293 | -4358
(1) Beaufort_Sea | 1015808 | 919074 | 96734 | 948463 | 67345
(2) Chukchi_Sea | 711178 | 732616 | -21437 | 680534 | 30645
(3) East_Siberian_Sea | 1053171 | 1032249 | 20923 | 963850 | 89321
(4) Laptev_Sea | 647574 | 745700 | -98126 | 663276 | -15702
(5) Kara_Sea | 726226 | 598140 | 128086 | 665920 | 60307
(6) Barents_Sea | 60948 | 134229 | -73281 | 177419 | -116471
(7) Greenland_Sea | 356614 | 552157 | -195543 | 627602 | -270989
(8) Baffin_Bay_Gulf_of_St._Lawrence | 714402 | 552083 | 162319 | 531706 | 182696
(9) Canadian_Archipelago | 794355 | 783057 | 11298 | 775033 | 19322
(10) Hudson_Bay | 900609 | 761919 | 138690 | 777550 | 123058
(11) Central_Arctic | 3047677 | 3217803 | -170125 | 3216654 | -168977
(12) Bering_Sea | 185 | 6350 | -6165 | 1080 | -895
(13) Baltic_Sea | 0 | 4 | -4 | 0 | 0
(14) Sea_of_Okhotsk | 0 | 17972 | -17972 | 3531 | -3531

2018 is 25k km2 below average, entirely due to Okhotsk plus Bering being ice-free.  Greenland Sea and Barents are down, offset by surpluses in Beaufort, Kara, Baffin and Hudson Bays.
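As a check on the "entirely due to Okhotsk plus Bering" claim, the anomalies can be recomputed directly from the table; this is just arithmetic on the extents (in km2) listed above:

```python
# Day-179 extents (km^2) taken from the MASIE regional table.
nh_2018, nh_avg = 10_029_935, 10_054_734
bering_2018, bering_avg = 185, 6_350
okhotsk_2018, okhotsk_avg = 0, 17_972

# Overall deficit versus the 11-year average, about -24.8k km^2.
nh_deficit = nh_2018 - nh_avg

# Combined deficit of the two ice-free Pacific basins: -24,137 km^2.
pacific_deficit = (bering_2018 - bering_avg) + (okhotsk_2018 - okhotsk_avg)

print(f"NH deficit: {nh_deficit} km^2")
print(f"Bering + Okhotsk deficit: {pacific_deficit} km^2")
```

The two Pacific basins account for roughly 24k of the roughly 25k km2 overall deficit, supporting the statement that the remaining regional surpluses and deficits largely offset one another.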


On Energy Transitions

These days the media are full of stories about people setting targets to “decarbonize” the energy sources fueling their societies. Some claim (and some have failed notoriously) to achieve zero-carbon electrification. We should take a deep breath, step back, and rationally consider what is being discussed and proposed.

The History of Energy Transitions

Thanks to Bill Gates we have this helpful graph showing the progress of human civilization resulting from shifts in the mix of energy sources.

Before the 19th century, it is all biomass, especially wood. Some historians think the Roman Empire collapsed partly because the cost of importing firewood from the far territories exceeded the benefits. More recently, the 1800s saw the rise of coal, the industrial revolution, and a remarkable social transformation, along of course with issues of mining and urban pollution. The 20th century is marked first by the discovery and use of oil, and later by natural gas. Since the chart shows proportions, it highlights how oil and gas took on greater importance; but in fact the total amount of energy produced and consumed in the modern world has grown exponentially, so energy from all sources, even biomass, has increased in absolute terms.
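The proportional-chart point can be illustrated with round, hypothetical numbers (these are illustrative only, not the chart's actual data): a source's share of the mix can collapse while its absolute output still grows, because total consumption grew far faster.

```python
# Hypothetical round numbers for world primary energy (EJ/yr) and the
# biomass share of the mix -- illustrative only, not the chart's data.
snapshots = [(1850, 25, 0.85), (2000, 400, 0.10)]

for year, total_ej, biomass_share in snapshots:
    absolute_ej = total_ej * biomass_share
    print(f"{year}: share {biomass_share:.0%}, absolute {absolute_ej:.0f} EJ")

# With these figures the biomass share falls from 85% to 10%, yet its
# absolute output still grows (about 21 EJ to 40 EJ): a shrinking slice
# of a much larger pie.
```

This is why reading a proportional chart as "biomass declined" understates what happened: every source grew in absolute terms even as shares shifted.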

The global view also hides the great disparity between advanced societies, which exploited carbon-based energy to become wealthy and to build large middle classes whose human resources multiplied social capital and extended the general prosperity, and the rest of the world. Those advanced societies have also used their wealth to protect their natural environments to a greater extent.

The 21st Century Energy Concern

Largely due to reports of rising temperatures from 1980 to 2000, alarms were sounded about global warming/climate change, along with calls to stop using carbon-based energy. To understand what “decarbonization” actually means, we have two recent resources that explain clearly what is involved and why we should be skeptical and rationally critical.

First, Master Resource describes how the anti-carbon agenda is now embedded in societal structures. Mark Krebs writes Paris Lives! “Deep Decarbonization” at DOE. Excerpts in italics with my bolds.

Despite President Trump’s announcement that the U.S. would withdraw from the Paris Agreement, the basis of that agreement, “deep decarbonization” through “beneficial electrification,” is proceeding virtually unabated. This is occurring because it serves the purposes of the electric utility industry and its environmentalist allies, e.g., the Natural Resources Defense Council (NRDC).

According to the Paris Agreement, the fundamental strategy for climate stabilization would be by “deep decarbonization” primarily through “beneficial electrification” powered with “clean energy.” But how are these terms defined exactly?

Deep decarbonization: [4]
The primary strategy of the Paris Agreement for climate stabilization through an 80% reduction in the global use of fossil fuels to “decarbonize” the World’s energy systems by 2050.

Beneficial Electrification: [5]
Replacing consumers’ direct consumption of natural gas, gasoline, and other fossil fuels with electricity (with the assumption that electricity generation will be dominated by “clean energy”).

Clean Energy:
Strictly interpreted, it’s just renewables. And more specifically, renewable electric generation. However, many variant definitions exist. For example, DOE includes nuclear, bioenergy and fuel cells as “clean.” And so-called clean coal also appears to qualify via “carbon capture & sequestration” (CCS) as does natural gas, if it is used as a feedstock to make electricity. Energy efficiency (e.g., “nega-watts”) is also deemed “clean energy” by some.

Think about it: Transitioning to a global clean energy economy means there must be a transition from something. By the process of elimination, about the only energy sources not clean are the direct use of fossil fuels. In addition to natural gas direct use, “not clean energy” also includes gasoline, propane, etc. Regardless, “clean energy” (i.e. electrification) is being put forth as the universal cure without disclosure of side effects. In essence the ‘clean energy’ future striven for by EERE exports environmental impacts to others and at high costs. Such non-climate related impacts are ignored.

Whether it’s called regulatory capture, rent seeking or political capitalism, the result is the same: Power accrues to the powerful. In addition to receiving taxpayer funding, advocates of “deep decarbonization” have profited greatly by climate change fear mongering for donations as well as from the deep pockets of Tom Steyer and the like. And now these advocates have officially joined forces with the electric utility industry as evidenced by the recent pact between NRDC and EEI that includes the pursuit of “efficient electrification of transportation, buildings, and facilities.” [21]

In large measure, EERE’s current activities should be viewed as inappropriate subsidies for deep decarbonization via electrification in contravention of President Trump’s proclamation to withdraw from the Paris Agreement. It is also contrary to President Trump’s Executive Order 13783.

Decarbonists in Denial of History

Against this backdrop of imperatives against fossil fuels, we have Lessons from technology development for energy and sustainability by Cambridge Professor Michael J. Kelly (H/T Friends of Science). Excerpts in italics with my bolds.

Abstract: There are lessons from recent history of technology introductions which should not be forgotten when considering alternative energy technologies for carbon dioxide emission reductions.

The growth of the ecological footprint of a human population about to increase from 7B now to 9B in 2050 raises serious concerns about how to live both more efficiently and with less permanent impacts on the finite world. One present focus is the future of our climate, where the level of concern has prompted actions across the world in mitigation of the emissions of CO2. An examination of successful and failed introductions of technology over the last 200 years generates several lessons that should be kept in mind as we proceed to 80% decarbonize the world economy by 2050. I will argue that all the actions taken together until now to reduce our emissions of carbon dioxide will not achieve a serious reduction, and in some cases, they will actually make matters worse. In practice, the scale and the different specific engineering challenges of the decarbonization project are without precedent in human history. This means that any new technology introductions need to be able to meet the huge implied capabilities. An altogether more sophisticated public debate is urgently needed on appropriate actions that (i) considers the full range of threats to humanity, and (ii) weighs more carefully both the upsides and downsides of taking any action, and of not taking that action.

Key Points

Only fossil fuels and nuclear fuels have the ability to power megacities in 2050, when over half of the then 9B people will live in them.

As the more severe predictions of climate change over the last 25 years are simply not happening, it makes no sense to deploy the more costly options for renewable energy.

Abandoned infrastructure projects (such as derelict wind and solar farms in the Mojave desert) remain to mock their progenitors.

In this review, I want to concentrate on the measures taken to reduce the global emissions of carbon dioxide, and how the lessons from recent history of technology introductions can inform the decarbonization project. I want to review the last 20 years in particular and see what this portends for the next 40 years, which will take us beyond 2050, the pivotal date in the public discourse. The Royal Commission on Environmental Pollution in 2000 advocated a 60% reduction of carbon dioxide emissions for the UK by 2050. 14 The date was fixed by the response to the enquiry as to when energy from nuclear fusion might supply 10% of the world’s energy needs. The answer was not before 2050, and we will need to get there without it. The revision from 60% to 80% reduction came from concern that developed countries should make allowances for developing countries using fossil fuel to escape poverty, i.e., they can take the same route as developed countries did to their relative affluence.

We have had over 20 years since the first Earth Summit in Rio de Janeiro in 1992, where 1990 emissions of carbon dioxide were agreed upon as the benchmark for reductions. Before discussing specific technologies, I want to establish the scale of the challenge in engineering, technology, and project delivery terms: this does include economics, societal attitudes, and the public discourse. I also discuss some engineering fundamentals. I will then summarize the many lessons of technology introductions, the preparation for other global challenges, and finally discuss a realistic way forward.

Scale

It is important to note the scale of the perceived problem. The entire history of modern civilization that started with the first industrial revolution has been enabled by the burning of fossil fuels. Our mobility, our health and lifestyles, our diet and its variety, our education system, particularly at the higher level, and our high culture would be quite impossible without fossil fuels, which have provided over 90% of the energy consumed on the earth since 1800. Today, geothermal, hydro- and nuclear power, together with the historic biofuels of wood and straw, account for about 15% of our energy use. 18 Even though it is 40 years since the first oil shocks kick-started the modern renewable energy developments (wind, solar, and cultivated biomass), we still get rather less than 1% of our world energy from these sources. Indeed the rate at which fossil fuels are growing is seven times that at which the low carbon energies are growing, as the ratio of fossil fuel energy used to total energy used has remained unchanged since 1990 at 85%. 19 The call to decarbonize the global economy by 80% by 2050 can now only be described as glib in my opinion, as the underlying analysis shows it is only possible if we wish to see large parts of the population die from starvation, destitution or violence in the absence of enough low-carbon energy to sustain society.

A further insight into the scale of present day energy consumption is as follows. In Europe, today, we use about 6–7 times as much energy per person per day as was used in 1800, and there are seven times as many people on earth now as compared with then. 18 The energy of 1800 was expended on heating and lighting one room in a house and producing hot water used in that same room, and on the purchase of local produce and manufactures. By examining the breakdown of today’s energy usage in the UK and Europe, 20 this energy use persists today, but with lighting and central heating of whole buildings. In addition, Europeans today use as much energy per person per day on private motoring as they used in total in 1800. They use an equal amount on mobility through public transport: trains, ships, and aeroplanes. Three times the personal consumption of 1800 is used in the manufacture and logistics of things we consume or use, such as food or manufactured goods.
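The multiplication implied by the paragraph above can be made explicit. The factors below are the approximate ones quoted in the text; applying Europe's per-capita multiple to the whole world is a deliberate simplification for illustration only:

```python
# Rough scaling implied by the text (all factors approximate, illustrative only).
per_capita_multiple = 6.5   # midpoint of the 6-7x per-person increase since 1800
population_multiple = 7     # world population now vs. 1800

# If everyone consumed at today's European per-capita rate, total energy use
# would be on the order of 45x the 1800 level.
total_multiple = per_capita_multiple * population_multiple
print(f"Total energy use is roughly {total_multiple:.0f}x the 1800 level")
```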

Over the next 20 years, the World Bank estimates that the middle class will rise from 3B to 5B, on the basis of which BP estimates a further increase in global energy demand of 40%, still to be met in the main by fossil fuels. The graph in Fig. 1(a) is on too coarse a scale to show that the total installed renewable energy capacity as of today is equal to the combined capacity of the nuclear power plants shut down in Japan and scheduled to close in Germany, making the challenge of carbon-free energy impossible to meet.

Energy return on investment (EROI)

The debate over decarbonization has focussed on technical feasibility and economics. There is one emerging measure that ties closely back to the engineering and the thermodynamics of energy production. The energy return on (energy) investment is a measure of the useful energy produced by a particular power plant divided by the energy needed to build, operate, maintain, and decommission the plant. This is a concept that owes its origin to animal ecology: a cheetah must get more energy from consuming its prey than it expended on catching it, otherwise it will die. If the animal is to breed and nurture the next generation, then the ratio of energy obtained to energy expended has to be higher still, depending on the details of energy expenditure on these other activities. Weißbach et al. 23 have analysed the EROI for a number of forms of energy production and their principal conclusion is that nuclear, hydro-, and gas- and coal-fired power stations have an EROI that is much greater than wind, solar photovoltaic (PV), concentrated solar power in a desert or cultivated biomass: see Fig. 2. In human terms, with an EROI of 1, we can mine fuel and look at it—we have no energy left over. To get a society that can feed itself and provide a basic educational system we need the EROI of our base-load fuel to be in excess of 5, and for a society with international travel and high culture we need EROI greater than 10. The new renewable energies do not reach this last level when the extra energy costs of overcoming intermittency are added in. In energy terms the current generation of renewable energy technologies alone will not enable a civilized modern society to continue!
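The EROI arithmetic and the societal thresholds described above can be sketched as follows. Only the threshold values (5 and 10) come from the text; the lifetime energy figures in the example are hypothetical placeholders, not values from Weißbach et al.:

```python
def eroi(energy_delivered, energy_invested):
    """Energy return on (energy) investment: useful energy produced
    divided by the energy to build, operate, maintain, and decommission."""
    return energy_delivered / energy_invested

def society_supported(ratio):
    """Thresholds cited in the text: EROI > 5 for a basic self-feeding
    society, EROI > 10 for one with international travel and high culture."""
    if ratio > 10:
        return "high culture"
    if ratio > 5:
        return "basic society"
    return "below viability"

# Hypothetical lifetime figures in arbitrary energy units (illustrative only).
print(society_supported(eroi(7500, 100)))  # EROI 75  -> high culture
print(society_supported(eroi(400, 100)))   # EROI 4   -> below viability
```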

Successful new technologies improve the lot of mankind

I have already referred to the use of Watt’s steam engine as a source of energy to improve harvesting, greatly aiding agricultural productivity. Notice too that the windmills of Europe stopped turning: the new source of energy was compact, moveable, reliable, available when needed, and of relatively low maintenance. This differential has widened ever since, and the recent windmills do not greatly close the gap in practical utility or cost. Later in the 19th century, electricity from steam turbines became available to lighten the darkness, power an increasing range of machinery, and increase the productivity of mankind to the extent that can be seen today when one contrasts an industrial city with a remote off-grid rural community. It is this energy which has underpinned the ability to improve sanitation, transport goods, and allow modern communications and advanced healthcare. During the 20th century, jet engines greatly reduced the time taken to get between two distant places, with semiconductor technologies eliminating that time with virtual presence anywhere anytime. The genetic engineering technologies have greatly speeded up the processes of plant breeding and the recent green revolution means that the larger population of the world now is better fed than ever before. The remaining areas of starvation are universally associated with war, and/or bad governance interfering with supply chains.

R&D in new technologies is a good use of public money

Really new technologies often span several existing sectors of private industry which would benefit or lose out if the new technology were introduced: coachmen in the age of buses and trains, pigeon carriers in the age of the telegraph. It is hard to imagine electronics rising to its current pervasive state had governments around the world not supported relevant R&D in the early stages. Today the global R&D budget exceeds $1T, and the public purse contributes much of that. 33 In many advanced countries, there is significant public support of private R&D in the perceived total public interest, an interest that is not the particular focus of any one company in the private sector. The analysis of the origin of Apple’s technologies is an exemplary case of the private capture of public investment. 34

In the last 40 years, there has continued to be public-good R&D undertaken in many countries on new energy technologies that have given rise to the first generation of renewables. The support goes well down the development channel as well as the background research. This is because the private risk of initial small scale deployment to test the effectiveness of a new technology is often too high for a single company or consortium to bear. The USA has the most effective ecosystem of innovation in the world, eclipsed only for a brief period in the 1980s by Japan. 

Premature roll-out of immature/uneconomic technologies is a recipe for failure

The virtuous role of government funding in R&D is to be contrasted with the litany of failure in recent times of subsidies in support of the premature roll-out of technologies that are uneconomic and/or immature.

In its prime, the Carrizo Plain (S. California) installation was by far the largest photovoltaic array in the world, with 100,000 1 ft × 4 ft photovoltaic panels generating 5.2 megawatts at its peak. The plant was originally constructed by ARCO in 1983 and was dismantled in the late 1990s. The used panels are still being resold throughout the world.

In the late 1980s, large scale installations were made in the Mojave desert of farms of windmills and solar panels. One can see the square kilometers of green industrial dereliction by googling the phrases ‘abandoned wind farm’ and ‘abandoned solar farm’, respectively. The useful energy generated within these farms was insufficient to pay the interest on capital and to maintain production. The companies have gone bankrupt, and there is no one to decommission the infrastructure and return the sites to their pristine condition. I note that the remains are there to be mocked as an infrastructure project gone wrong, and they will remain for decades, a modern version of the hubris of Ozymandias or the builders of the Tower of Babel. It is important to note that some (but not all 36 ) second and third generation wind and solar farms in the Mojave desert have fared and are faring better, 37 but the lesson here remains that premature roll-out of unready technology is unwise.

The primary problem is the use of public money, i.e., subsidies, to encourage the roll-out. Subsidies have a plethora of unintended consequences in the energy infrastructure sector. During the economic crisis of 2008/9 many of the subsidies were reduced or withdrawn in the USA, and many small companies went bankrupt. This has continued with subsidy reductions in the UK, Germany, Spain and elsewhere, with further bankruptcies in the alternative energy sector. 38 Indeed there is an index for the stock value of alternative energy companies, RENIXX, that lost 80% of its value between 2008 and 2013, although it has recovered a little of that fall more recently. It is certainly not the place for pension fund investments: if the market were mature and stable, a 40-year programme to renew the global energy infrastructure should be the place for pension funds. 39 The reason for these failures so far is that the technologies are uneconomic over their lifecycles and immature in terms of the energy return on their investment (as in section “Energy return on investment (EROI)” above). In China, public subsidies continue, with solar panels being sold at about a 30% loss on the cost of production. 40 That is a political strategy at work rather than an industrial strategy. In democracies, there is unlikely to be multiparty, multigovernment consensus lasting for the multidecadal timescales implied by major infrastructure change.

There is an unintended and unwanted social consequence of the roll-out of these new technologies. There is ample evidence in the UK of increasing fuel poverty (i.e., households spending over 10% of disposable income keeping warm in winter) in the regions of wind farm deployment, where higher electricity bills are needed to cover the rent of the land from (usually already rich) landowners, a direct reversal of the process whereby cheap energy over the last century has lifted a significant fraction of the world’s poor from their poverty. 41 Renewable energy supplements are viewed as socially divisive.
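The fuel-poverty definition quoted here (over 10% of disposable income spent keeping warm) reduces to a simple ratio test; the household figures in the example below are hypothetical:

```python
def in_fuel_poverty(heating_spend, disposable_income, threshold=0.10):
    """UK definition cited in the text: a household is fuel-poor if winter
    heating costs exceed 10% of disposable income."""
    return heating_spend / disposable_income > threshold

# Hypothetical annual figures in pounds (illustrative only).
print(in_fuel_poverty(1800, 20000))  # 9% of income   -> False
print(in_fuel_poverty(2500, 20000))  # 12.5% of income -> True
```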

Technology breakthroughs are not pre-programmable

When public commentators such as Thomas L Friedman enter the debate about energy technologies, they urge more research to produce a breakthrough energy technology, in his case a ‘plentiful supply of clean green cheap electrons’. 43 It is salutary to realize that all but two of the energy technologies used today have counterparts in biblical times, the only newcomers being nuclear energy and solar photovoltaics. The delivery of coal, gas, wind, water, and solar energy may be quite different today from then, but the underlying principles of operation have not changed. Since nuclear fusion was first demonstrated, there has been a 60-year effort to tame it as a source of electrical energy, but so far without success. One can ask the experts whether they might have made more progress with more money, but the challenges have remained profound. Even if there were a breakthrough tomorrow in the basic processes, it would still take of order 40 years (rather than 20, in my opinion) to complete the further engineering and technology work and deploy fusion reactors able to provide (say) 10% of the world’s electricity. We must get to 2050 without it.

Finance is limited, so actions at scale must be prioritized

The sum involved in renewing the energy infrastructure in the UK is about £200B over the next decade. 46 A large element of this cost is to make good the lack of infrastructure investment over the last 20 years since a privatized energy market was introduced. In addition the large scale modification of the grid to cope with multiple renewable energy inputs has to be included. There remains a dispute in the public domain as to where these costs lie. The grid as we know it in major industrial countries has evolved over a period of 100 years on the basis of a relatively few large sources of energy connected to the grid which circulates power to substations from which it is transmitted to individual end-users in a broadcast mode. With multiple small and independent sources of energy from wind and solar installations, the grid topology has to change to cope with this very different quality of energy. The conventional suppliers of energy say that they should not have to cover these extra costs which should be book-kept with the renewable energies in the overall balance sheet of costs. A similar book-keeping problem arises with the costs of back-up to intermittent renewable energies. The combined cycle gas turbine generators that have delivered base load electricity to the grid in Germany are now being asked to act in back-up mode, with frequent acceleration and deceleration of the turbines, for which purpose they were not designed and they shorten the in-service life time as a result. 47 The cost analyses of future energy are bedevilled by the assignments of additional consequential costs. In practice as consumers, we buy the energy provided by electricity, blind to the particular way it is produced.

The scale of the costs of these energy bills is such that one cannot make mistakes in infrastructure investment decisions. A wrong investment is a missed opportunity on a large scale.

Finally, it is as well to remember that there are only ever two sources of payment, the consumer and the taxpayer, and the only issue at stake between them is the directness with which costs are recovered. 

The way forward

It is surely time to review the current direction of the decarbonization project, which can be assumed to start in about 1990, the reference point from which carbon dioxide emission reductions are measured. No serious inroads have been made into the lion’s share of energy that is fossil fuel based. Some moves represent total madness. The closure of all but one of the aluminium smelters that used gas-fired electricity in the UK (because of rising electricity costs from the green tariffs that are over and above any global background fossil fuel energy costs) reduces our nation’s carbon dioxide emissions. 62 However, the aluminium is now imported from China where it is made with more primitive coal-based sources of energy, making the global problem of emissions worse! While the UK prides itself on reducing indigenous carbon dioxide emissions by 20% since 1990, the attribution of carbon emissions by end use shows a 20% increase over the same period. 63

It is also clear that we must de-risk all energy infrastructure projects over the next two decades. While the level of uncertainty remains high, the ‘insurance policy’ justification of urgent large-scale intervention is untenable, and we do not pay premiums that would bankrupt us. Certain things we do not insure against, such as a potential future mega-tsunami, 64 or a supervolcano, 65 or indeed a meteor strike, even though there have been over 20 of these since 2000 with the local power of the Hiroshima bomb! 66 Using a significant fraction of the global GDP to possibly capture the benefits of a possibly less troublesome future climate leaves more urgent actions not undertaken.

Two important points remain. The first is that there is no alternative to business as usual carrying on, with one caveat expressed in the following paragraph. Since energy use has a cost, it is normal business practice to minimize energy use, by increasing energy efficiency (see especially the recent improvement in automobile performance), 67 using less resource material and more effective recycling. These drivers have become more intense in recent years, but they were always there for a business trying to remain competitive.

The second is that, over the next two decades, the single place where the greatest impact on carbon dioxide emissions can be achieved is in the area of personal behaviour. Its potential dwarfs that of new technology interventions. Within the EU over the last 40 years there has been a notable change in public attitudes and behaviour in such diverse arenas as drinking and driving, smoking in confined public spaces, and driving without a seatbelt. If society were to regard the profligate consumption of any materials and resources, including all forms of fuel and electricity, as deeply antisocial, it has been estimated that we could live at something like our present standard of living on half the energy we consume today in the developed world. 68 This would mean fewer miles travelled, fewer material possessions, shorter supply chains, and less use of the internet. While there is no public appetite to follow this path, the short-term technology fix path is no panacea.

Conclusions

Over the last 200 years, fossil fuels have provided the route out of grinding poverty for many people in the world (but still less than half of all people), and Fig. 1 shows that this trend is certain to continue for at least the next 20 years based on the technologies of scale that are available today. A rapid decarbonization is simply impossible over the next 20 years unless the trend of a growing number who succeed in improving their lot is stalled by rich and middle-class people downgrading their own standard of living. The current backlash against subsidies for renewable energy systems in the UK, EU and USA is a sign that all is not well with current renewable energy systems in meeting the aspirations of humanity.

Finally, humanity is owed a serious investigation of how we have gone so far with the decarbonization project without a serious challenge in terms of engineering reality. Have the engineers been supine and lacking in courage to challenge the orthodoxy? Or have their warnings been too gentle and dismissed or not heard? Scientists and politicians can take too much comfort from undoubted engineering successes over the last 200 years. When the sums at stake are on the scale of 1–10% of the world’s GDP, this is a serious business.

See also:  Climateers Tilting at Windmills Updated

Figure 12: Figure 9 with Y-scale expanded to 100% and thermal generation included, illustrating the magnitude of the problem the G20 countries still face in decarbonizing their energy sectors.

Global Warming Hole Found: Minus 144 F


A high ridge in Antarctica on the East Antarctic Plateau where temperatures in several hollows can dip down to minus 144 F.

National Geographic Coldest Place on Earth Found—Here’s How
“It’s a place where Earth is so close to its limit, it’s almost like another planet.”

Just how cold can it get on Earth’s surface? About minus 144°F, according to recent satellite measurements of the coldest known place on the planet.

Scientists recorded this extreme temperature on the ice sheet deep in the middle of Antarctica during the long, dark polar winter. As they report this week in Geophysical Research Letters, the team thinks this is about as cold as it can possibly get in our corner of the solar system.

“It’s a place where Earth is so close to its limit, it’s almost like another planet,” says study leader Ted Scambos, a researcher at the National Snow and Ice Data Center at the University of Colorado, Boulder.

The measurement smashes the previous record for the coldest known air temperature in the natural world: a frigid minus 128.6°F felt in 1983 at the Russian Vostok Station, not far from the South Pole. Humans can’t inhale air that cold for more than a few breaths—it would cause our lungs to hemorrhage. Russian scientists ducking out to check on the weather station would wear masks that warmed the air before they breathed it in.

DEATHLY HOLLOWS
While the East Antarctic ice sheet looks flat at the surface, it actually domes ever so slightly from center to edge like a vast, icy turtle shell. Vostok is perched near the top of the dome, on about 2.2 miles of ice, but it’s not quite at the apex. Scambos’s team suspected that it could get even colder at the very highest parts of the ice sheet.

There aren’t any weather stations perched at the peak of the ice sheet, and there isn’t anyone there to check on them in the dead of Antarctic winter. But satellites can sense the temperature at the surface of the ice as they pass overhead. So Scambos and his colleagues sifted through several years of satellite data, mapping out when and where temperatures dipped low.

Sure enough, they found about a hundred little pockets of exceptional cold scattered across the highest parts of the ice sheet. The coldest spots were in shallow depressions in the ice, little hollows where the surface isn’t perfectly smooth. That’s probably because cold air sinks into these depressions like it sinks into a river valley or a canyon, says John Turner, a polar scientist with the British Antarctic Survey who was not involved in the study.

“They’re such shallow dips, you probably couldn’t even see them with your eyes,” he says.

The air warms up by a few degrees right above the surface, which is where the scientists at Vostok had recorded the previous coldest temperature. By comparing the satellite measurements to data from the nearest weather stations, Scambos and his team figured out that the air temperatures in this region would be a little warmer near human-head height, about minus 137°F. But right at the surface, where your feet would touch the snow, they saw temperatures of minus 144°F.

“But you hope your feet wouldn’t ever touch the snow,” Scambos says. “That would not be fun at all.”
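For readers more used to Celsius, the three temperatures quoted in this story convert as follows (the new record is commonly reported as about minus 98°C):

```python
def f_to_c(temp_f):
    """Convert degrees Fahrenheit to degrees Celsius."""
    return (temp_f - 32) * 5 / 9

# Vostok 1983 record, air at head height, and the snow-surface reading.
for f in (-128.6, -137, -144):
    print(f"{f} F = {f_to_c(f):.1f} C")
# -128.6 F = -89.2 C, -137 F = -93.9 C, -144 F = -97.8 C
```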

OBSCURING THE VIEW
Only very special conditions lead to such extreme cold. First, it has to be the dead of winter, long after the midnight sun sets for the season. Then, the air needs to be still for a few days, and the sky needs to be perfectly clear, without a wisp of a cloud or a shimmer of diamond dust above the ice sheet.

As cold as it may be, ice radiates a tiny amount of heat. Normally, most of that heat is captured by water vapor in the atmosphere and gets beamed back down to Earth’s surface, trapping warmth in the lower atmosphere.

But during dry spells in Antarctica, when most of the water vapor has been wrung out of the atmosphere, “it starts to open a window that isn’t usually open anywhere else on Earth,” Scambos says. Then, faint heat emitted by the ice sheet can escape all the way to space, leaving the ice surface even colder.

The ultra-clear conditions that enable these chilly events are also ideal for looking out into space, which is why scientists placed a telescope just a few miles from the extreme cold spots Scambos’ team pinpointed.

“Water vapor is our nemesis,” says Craig Kulesa, an astronomer from the University of Arizona who runs the High Elevation Antarctic Terahertz Telescope, or, somewhat satirically, HEAT. “We put our telescope at this superbly dry site, but if we put it 10 miles away, would it be any better?”

It may be a question worth considering as the climate changes around the globe, though there’s nowhere else on this planet they could go where conditions would be better. Water vapor concentrations in the atmosphere are increasing, which in turn means more of the ice-emitted heat gets trapped near the surface—keeping it warmer. So, the perfectly clear conditions that are ideal for looking into space will become less frequent—and any scientists hoping to break the record for sensing extreme cold on Earth may be running out of time.

“As we see increases in greenhouse gas and water vapor concentrations, we’re expecting warming across the Antarctic of about 3 to 4°C,” says Turner. “Seeing any new temperature lows will be more and more unlikely. The odds are just getting smaller.”

 

US District Court grants Chevron’s motion to dismiss climate change case


World Pipelines, Tuesday, 26 June 2018 12:00

The US District Court for the Northern District of California has issued a ruling dismissing the climate change lawsuits filed against Chevron Corporation by the cities of San Francisco and Oakland. The court dismissed the complaint as requiring foreign and domestic policy decisions that are outside the proper purview of the courts.

As the court described, “the scope of plaintiffs’ theory is breathtaking. It would reach the sale of fossil fuels anywhere in the world, including all past and otherwise lawful sales.”

“It is true,” the court continued, “that carbon dioxide released from fossil fuels has caused (and will continue to cause) global warming. But against that negative, we must weigh this positive: our industrial revolution and the development of our modern world has literally been fuelled by oil and coal. Without these fuels, virtually all of our monumental progress would have been impossible. All of us have benefitted. Having reaped the benefit of that historic progress, would it really be fair to now ignore our own responsibility in the use of fossil fuels and place the blame for global warming on those who supplied what we demanded? Is it really fair, in light of those benefits, to say that the sale of fossil fuels was unreasonable?”

The court concluded by dismissing the claims and deferring to the policy judgments of the legislative and executive branches of the federal government: “The dangers raised in the complaints are very real. But those dangers are worldwide. Their causes are worldwide. The benefits of fossil fuels are worldwide. The problem deserves a solution on a more vast scale than can be supplied by a district judge or jury in a public nuisance case. While it remains true that our federal courts have authority to fashion common law remedies for claims based on global warming, courts must also respect and defer to the other co-equal branches of government when the problem at hand clearly deserves a solution best addressed by those branches.”

“Reliable, affordable energy is not a public nuisance but a public necessity,” said R. Hewitt Pate, Chevron’s Vice President and General Counsel. “Tackling the difficult international policy issues of climate change requires honest and constructive discussion. Using lawsuits to vilify the men and women who provide the energy we all need is neither honest nor constructive.”

The court’s decision dismisses a lawsuit that the cities of San Francisco and Oakland filed against BP, Chevron, Conoco-Phillips, ExxonMobil and Royal Dutch Shell, seeking to hold a selected group of oil and gas companies responsible for the potential effects of global climate change. The suit, filed in 2017, claims that the production and sale of oil and gas are a public nuisance because they result in greenhouse gas emissions that contribute to worldwide climate change and rising sea levels. The US Supreme Court and other courts around the country have previously rejected similar claims brought by the same lawyers. Those courts – like the court today – found that America’s environmental policies must be determined by national policymakers like the Environmental Protection Agency, not courts of law.

Several other US cities and counties, including New York City and King County, Washington, recently filed nearly identical cases against the same oil and gas companies. Many were filed by the same lawyers. The energy companies have filed motions to dismiss those cases as well. As Chevron has repeatedly emphasised in its court filings, Chevron supports meaningful efforts to address climate change and accepts internationally recognised climate science, but climate change is a global issue that requires global engagement, not lawsuits. Chevron is taking prudent, practical and cost-effective actions to mitigate potential climate change risks, including managing emissions, testing new technologies, and increasing efficiency.

Chevron Corporation is one of the world’s leading integrated energy companies. Through its subsidiaries that conduct business worldwide, the company is involved in virtually every facet of the energy industry. Chevron explores for, produces and transports crude oil and natural gas; refines, markets and distributes transportation fuels and lubricants; manufactures and sells petrochemicals and additives; generates power; and develops and deploys technologies that enhance business value in every aspect of the company’s operations. Chevron is based in San Ramon, California.

Judge Alsup’s Ruling

Footnote: It will be claimed that the court has confirmed dangerous man-made warming. But IPCC science was stipulated by both plaintiffs and defendants, so there was no disagreement for the court to resolve. The science was not at issue between the parties. It doesn’t mean the science holds up under scrutiny, only that such examination was not pertinent here.

NOAA Climate Intrigue

Defenders of the Federal status quo (AKA swamp denizens) are aroused over an apparent move to refocus the mission statement of the National Oceanic and Atmospheric Administration (NOAA). The Union of Concerned Scientists (UCS) raised the alarm, which was, as usual, taken up by the New York Times. At a recent Department of Commerce summit, the acting head of NOAA, Rear Admiral Timothy Gallaudet, proposed a new mission statement for the agency. The proposed change in wording is as follows.

The mission of NOAA has been:

  • To understand and predict changes in climate, weather, oceans and coasts;
  • To share that knowledge and information with others; and
  • To conserve and manage coastal and marine ecosystems and resources.

In his presentation, Rear Admiral Gallaudet suggested the mission statement would change to:

  • To observe, understand and predict atmospheric and ocean conditions;
  • To share that knowledge and information with others; and
  • To protect lives and property, empower the economy, and support homeland and national security.

Comment on NOAA Mission Statement

Note the word “observe” is added to give emphasis to NOAA’s responsibility to obtain and maintain data records relating to the ocean and atmosphere. Instead of the words “changes to climate, weather, oceans and coasts,” NOAA is tasked to predict “atmospheric and ocean conditions.” This suggests a move away from climatological considerations toward more immediate support for adapting to natural events. It also suggests that coastal land management is outside NOAA’s scope.

Readers will note the proposed wording drops “conserve and manage” from the mission, replaced by the more explicit “To protect lives and property, empower the economy, and support homeland and national security.” The latter phrase would be consistent with the larger thrust of the Commerce Department. (See Commerce priorities at end.)

Background:

On September 1, 2017, Rear Admiral Gallaudet was nominated by President Trump; the nomination was warmly welcomed by scientists.

The University Corporation for Atmospheric Research (UCAR) congratulates Rear Admiral Timothy Gallaudet, a former oceanographer of the Navy, on his nomination to assistant secretary of commerce for oceans and atmosphere. In that position, Gallaudet will serve as the second-in-command at the National Oceanic and Atmospheric Administration (NOAA).

Gallaudet, who also served as commander of the Navy’s Meteorology and Oceanography Command, is a 32-year Navy veteran. He holds master’s and doctoral degrees in oceanography from the Scripps Institution of Oceanography.

“Tim’s mixture of operational expertise and scientific knowledge make him an ideal choice for this position,” said UCAR President Antonio Busalacchi. “His understanding of the vital collaborations between NOAA, private forecasting companies, and the academic community can help foster the movement of research to operational forecasting and advance the nation’s weather prediction capabilities. Furthermore, his knowledge of Earth system science and his ability to align that science with budget and programs will be essential to moving NOAA forward in the next few years.”

NOAA runs the National Weather Service, engages in weather and climate research, and operates weather satellites and a climate data center. The agency also works to better understand and protect the nation’s coasts, oceans, and fisheries.

UCAR is a nonprofit consortium of more than 100 colleges and universities focused on research and training in the atmospheric and related sciences.

September 25, 2017

In his answers to the confirmation committee’s questionnaire, Gallaudet listed the top three challenges he sees facing NOAA. He identified the first challenge as implementing the Weather Research and Forecasting Innovation Act that Congress passed earlier this year.

“If confirmed, I would make it my top priority to meet the intent of this law, especially the aspects concerning improvement to severe weather, tornado and hurricane warnings, and satellite data collection program management. … Finally I will need to work with the NOAA Administrator as well as NESDIS and NWS leadership to focus on the NOAA satellite programs which are growing at an unsustainable rate and that have been delayed numerous times.”

October 2017

Gallaudet was confirmed and on October 11, President Trump nominated Barry Myers, chief executive of the private weather forecasting company AccuWeather, to run NOAA. The appointment breaks from the recent precedent of scientists leading the agency tasked with a large, complex, and technically demanding portfolio. Myers has a bachelor’s degree in business administration and economics, a master’s degree in business from Pennsylvania State University, and a law degree from Boston University School of Law. Myers has been an adviser to five directors of NOAA’s National Weather Service and a representative of the U.N. World Meteorological Organization, according to a biography from AccuWeather. He must be confirmed by the Senate before taking the post.

December 17, 2017

In his Senate confirmation hearing, Myers sought to assure members of the Senate Commerce, Science, and Transportation Committee that he has a deep appreciation for NOAA’s scientific mission. In response to pointed questions from Democratic senators, Myers vowed to uphold NOAA’s scientific integrity policies and champion free and open data. And, for the first time in public since his nomination, he concurred with the mainstream scientific consensus on climate change and promised to support NOAA’s climate research portfolio. The full inquisition is described by the American Institute of Physics in NOAA Nominee Barry Myers Embraces Science at Confirmation Hearing.

April 11, 2018

Timothy Gallaudet testifies at budget hearings. NOAA Budget Cuts Get Chilly Reception in Congress

In his opening statement at the April 11 hearing, Gallaudet explained that the $4.5 billion budget request for NOAA focuses on two priorities. The first is “reducing the impacts of extreme weather and water events, by implementing the Weather Research and Forecasting Innovation Act,” which was enacted last April. The second is increasing sustainable economic contributions of U.S. fisheries and other ocean resources.

Gallaudet also touted NOAA’s successes over the last year in responding effectively to the record-setting hurricane season, saying the agency’s efforts “saved thousands of lives despite Hurricanes Harvey, Irma, [and] Maria, being three of the five most costly hurricanes in history.” He also highlighted the “perfect” recent launches of two flagship weather satellites — Geostationary Operational Environmental Satellite-S (GOES-S) and Joint Polar Satellite System-1 (JPSS-1).

Later in the hearing, Gallaudet described further investments NOAA is making in high-performance computing and modeling to support operational weather prediction. In describing the Global Forecast System FV3 experimental model that is being transitioned to the National Weather Service, Gallaudet said,

This model out-performed the European models for the hurricane track forecasts for the three Category 4 hurricanes that made landfall [last year]. Our goal is to regain world leadership, take number one back for our weather modeling. We’re on track to do it. We expect to do that before 2020.

Gallaudet assured Rep. Matt Cartwright (D-PA) that climate is “embedded” within NOAA’s weather and water forecasting priority, explaining that it includes consideration of “scales that are in weeks to seasonal and even sub-seasonal and climate types of scales.”

Although the administration has proposed deep cuts for climate research and grant programs, including termination of the $48 million Competitive Climate Research grant program and the $6 million Arctic Climate Research Program, Gallaudet assured members that climate research would continue, “because there’s much we still don’t know.”

When Cartwright pressed further on his concerns about the White House’s treatment of climate change, Gallaudet reassured him that the White House has “not zeroed out our climate work,” pointing to the Climate Prediction Center’s publication of seasonal and long-range outlooks as well as recent collaboration between NOAA and the U.S. Navy on Arctic sea ice forecasting. In addressing similar concerns brought up by Sen. Maggie Hassan (D-NH) at the April 12 hearing, he added that the White House is “supporting much of our Arctic-related research that is driven primarily by climate change,” and that he has not been directed to eliminate or remove the phrase “climate change” from reports.

May 14, 2018

Senate Should Confirm Barry Myers to Lead NOAA

NOAA – the National Oceanic and Atmospheric Administration – needs its leader! President Trump nominated Barry Lee Myers, the CEO of AccuWeather, to the post in mid-October. The Senate Commerce Committee has twice advanced Myers’ nomination to the full Senate. All that’s needed to fill this important job is a majority vote on the Senate floor, which both Democrats and Republicans expect to happen. Unfortunately, partisan politics keeps getting in the way, delaying the vote.

Senate offices have received more than 60 letters from individuals and organizations supporting his confirmation, including strong backing from the past four leaders of the U.S. National Weather Service who served under both Democratic and Republican administrations. In addition, the seafood industry has overwhelmingly advocated his confirmation with letters of support from seafood processors and others in the fisheries industry ranging from ship captains to sport fishermen.

Also, as a recognized leader in the sciences, Myers has demonstrated respect for quality-tested science when making decisions related to all areas of the agency’s responsibilities, including the nation’s fisheries, weather, oceanographic and climate challenges.

Myers worked closely with lawmakers to help secure enactment of last year’s Weather Research and Forecasting Innovation Act. The American Meteorological Society conferred its highest award for Excellence in Meteorology on him. He also has demonstrated a deep knowledge about NOAA and is committed to making the agency the best it can be, second to none in the world.

Prompt confirmation of Myers will benefit the public and the U.S. economy in the days, weeks and months ahead by solidifying the NOAA leadership team. With the unprecedented threat of catastrophic storms, the agency’s mission – protecting life and property and expanding American economic competitiveness – is on the line. The Senate should quickly confirm Barry Myers as NOAA administrator.

Conrad C. Lautenbacher Jr., VADM USN (ret.), CEO of GeoOptics, is a former under secretary of commerce for oceans and atmosphere and administrator of NOAA.

Robert Vanasse is executive director of the National Coalition for Fishing Communities.

Overview of the Strategic Plan of the US Department of Commerce, 2018 to 2022

Knowing that innovation is a key driver of economic advancement, we are placing an increased emphasis on the commercial opportunities of space exploration and aquaculture while our scientists are conducting foundational research in areas ranging from artificial intelligence to quantum computing. Our patent professionals are also working to improve the protection of intellectual property so that creators can profit from their inventions.

U.S. businesses must export more, and our workers deserve a level playing field. Enforcing our trade laws to ensure that trade is free, fair, and reciprocal is a top priority of the Department. We are also joining with all federal agencies in cutting red tape that drives up costs and puts American workers and businesses at a disadvantage.

To maintain America’s leadership in next-generation technologies, we are making important advances in data, cybersecurity, and encryption technology. Our economists and statisticians are improving Commerce data that American businesses and communities use to plan investments and identify growth opportunities. Every level of the Department will be engaged to ensure that we conduct the most accurate, secure, and technologically-advanced decennial census in history.

Finally, teams across the Department are working to keep Americans safe by predicting extreme weather events earlier and more accurately, preventing sensitive technology from getting in the hands of terrorists, rogue regimes, and strategic competitors, and deploying a nationwide public safety broadband network that allows better coordination among first responders.

Thank you to every employee at the Department and to our industry and government partners for your dedication to our mission.

Post by Wilbur Ross, Secretary of Commerce

Sloppy Science + Bad Reporting = Fake Scare

 

Abusing science to incite fear is not confined to global warming/climate change. Medical science has also been debased by taking up the appeal to public alarm. The current example is the exploitation of ovarian cancer, as explained by Warren Kindzierski writing in the Financial Post: How weaselly science and bad reporting consistently find cancer links that don’t exist. (Weaselly: stretching facts with words such as “this could,” “can,” “may,” “might,” “probably,” or “likely” cause cancer.)

Last month, the Quebec court authorized a class-action suit against two brands of baby powder that alleges that regular use of talc powder by women in their genital area is linked to a higher risk of ovarian cancer. Part of the allegations relate to claims that an ovarian cancer risk from powdered talc use is demonstrated by nearly four decades of scientific studies. Cosmetic talc has certainly been the subject of much scientific debate, study and, increasingly, legal challenge.

However, the cosmetic talc-ovarian cancer link is commonly misunderstood. Published biomedical studies cover both sides, suggesting a talc-ovarian cancer link and showing no link. Even today in prominent journals, letters to the editor — penned by scientists — rage back and forth, defending their studies or attacking the other side’s studies.

Now this is civilized, real science.

This bouncing back and forth of positive versus negative effects between talc and ovarian cancer is referred to as “vibration of effects” by John Ioannidis, a professor of medicine and of health research and policy at Stanford University. Studies vary depending on how they are done. Why is this? Well, getting scientists to agree on important things like methods, what data to use, and how to analyze and interpret effects from subtle human exposures is next to impossible. It would be no problem if one were studying cancer risks in populations receiving large exposures over long durations; but such situations are non-existent.
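The statistical point can be illustrated with a toy simulation (my own sketch, not from Kindzierski’s article; the function names `simulate_study` and `odds_ratio` are hypothetical). Even when the true odds ratio is exactly 1.0 (no effect at all), repeated studies of a weak exposure scatter on both sides of the null, so some appear to “find” harm while others appear to “find” protection:

```python
import random

random.seed(42)

def simulate_study(n_cases=500, n_controls=500, exposure_prob=0.3):
    # True odds ratio is 1.0: exposure is equally likely in cases and controls
    cases = [random.random() < exposure_prob for _ in range(n_cases)]
    controls = [random.random() < exposure_prob for _ in range(n_controls)]
    return cases, controls

def odds_ratio(cases, controls):
    a = sum(cases)                 # exposed cases
    b = len(cases) - a             # unexposed cases
    c = sum(controls)              # exposed controls
    d = len(controls) - c          # unexposed controls
    return (a * d) / (b * c)       # standard 2x2-table odds ratio

# Twenty independent studies of the same null association:
# sampling noise alone pushes estimates above and below 1.0
ors = [odds_ratio(*simulate_study()) for _ in range(20)]
harmful = sum(o > 1.0 for o in ors)
print(f"{harmful} of 20 studies point toward harm, {20 - harmful} toward protection")
```

Add in the real-world analyst’s freedom to choose subgroups, adjustments, and cutoffs, and it is easy to see how a literature on a subtle exposure fills up with contradictory positive and negative findings.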

The truth is that the ability of any biomedical method, epidemiology included, to discriminate cancer risks in people from small exposures to a physical or chemical agent does not exist.

Most cancers are caused by a number of factors. As a result, establishing cancer causation is complex — unless a particular risk factor is overwhelming. Epidemiology studies cannot and do not realistically replicate this complexity, at least not very well. That is why the U.S. National Institutes of Health and the National Cancer Institute list a number of key risk factors for ovarian cancer, and talc is not one of them.

The institute states that it is not clear whether talc affects ovarian cancer risk. An expert U.S. cosmetic-ingredient review panel assessed the safety of cosmetic talc in 2015. It thoroughly analyzed numerous studies investigating whether or not a relationship exists between cosmetic use of talc in the perineal area and ovarian cancer. The panel determined that these studies do not support a causal link. They also agreed that there is no known physiological mechanism by which talc can plausibly migrate from the perineum to the ovaries. The news coverage of the lawsuit has been silent on that evidence.

Part of the public’s misunderstanding about talc comes from scientists offering opinions about cancer from small exposures. Too many scientists use weasel words to stretch facts: “This could,” “can,” “may,” “might,” “probably,” “likely” cause cancer. Flimsy so-called evidence from their studies that suffer from vibration of effects and their speculations are voraciously inhaled by naïve journalists. Stretched facts miraculously get reported as facts to the public — or worse, misused for litigation purposes.

The woman’s bathroom is a chemical exposure chamber with literally dozens of cosmetic products used at various times. Both skin contact and inhalation regularly occur with grooming products. However, repeated uses of small amounts of cosmetic talc or any other cosmetic product do not amount to overwhelming exposures despite the claims of some scientists and media. Overwhelming exposures — the ones that cause effects — are those that occur with laboratory rats and mice. Underwhelming exposures are what occur to people in the real world.

It is highly speculative that repeated use of small amounts of cosmetic talc is a definitive cause of ovarian cancer. It is not a definitive cause; it is only suggestive. Prominent organizations such as the U.S. National Cancer Institute and expert panels should make clear statements about such cancer risks, but they do not. Selective methods in epidemiology studies, speculation by scientists and inaccurate reporting by news media are ingredients used to transform weak suggestive evidence from underwhelming cosmetic talc exposure into something that is mistakenly claimed to be harmful for the public.

And that is why we end up with class action suits against cosmetic companies.

Warren Kindzierski is an associate professor in The School of Public Health at the University of Alberta.

Perverse Green Capitalists


Politicians and media pundits like to say that Climate Change is the biggest threat to modern society. I am coming around to agree, but not in the way they are thinking. True, there is fresh evidence that we can defeat radical Islam, but we are already losing to radical climatism. I refer to climate alarm and activism, which have come to dominate the environmental movement and impose an agenda for social re-engineering. And now we have fresh evidence that even capitalists are working to undermine the infrastructure supporting modern civilization.

As we approach the year 2020, we confront the spectacle of financiers raiding shareholder wealth in order to cripple the energy industry, which they see as threatening the climate. A score of 20/20 used to indicate perfect eyesight, so that perceptions could be trusted. This is the opposite: people who should know better have drunk the climatist Kool-Aid and are now running the asylum.

Dan Eberhart exposes this latest twist in his Forbes article Corporate Resolutions On Social Issues Serve Activists, Not Shareholders. Excerpts in italics with my bolds.

America’s growing energy dominance is helping transform our economy and revitalize the forgotten parts of our nation.

Through innovation and free-market principles, America’s oil and natural gas sector has moved us from an age of scarcity to a future of abundance. As a nation, we are once again the world’s biggest producer, with all of the economic, trade and national security benefits that portends.

But there is a move afoot by wealthy investment firms and environmental activists to undermine that success and turn back to a time of scarcity by making climate change an issue in the boardrooms of energy producers big and small. Under the guise of socially responsible investing or ESG – environmental, social and governance – they are attempting to “decarbonize” our economy one corporation at a time.

America’s success in the energy sector is directly attributable to the strength of our economic freedom and competitive markets – just look at Venezuela, Angola, Mexico, Iran, Libya or Russia for the grim alternative.

The numbers are astounding. Domestic oil production reached 10.9 million barrels a day this month and is expected to continue its ascent to record-setting levels well into next year, according to the U.S. Energy Information Administration (EIA). By 2019, surging domestic production is expected to drive down our use of imported oil to the lowest level since 1959.

The use of hydraulic fracturing to squeeze ever more oil and gas from tight shale rock is a key driver of the energy boom. Production from America’s seven major shale formations is forecast to hit 7.2 million barrels a day by the end of this month, according to EIA.

It’s the communities in and around these formations – located almost exclusively in what are often derided as “fly-over states”– that are seeing the everyday benefits of jobs, rising wages and increasing confidence in the economy. The resurrection of the energy sector is turning small towns once on the verge of becoming ghost towns into bustling centers of activity.

There’s no guarantee the good times will continue, though, especially if companies stop searching for new supplies of oil and gas. For those who subscribe to the ideas of socially responsible investing, the end of energy dominance can’t come soon enough.

Proxy advisory firms Glass Lewis, ISS and others are increasingly advising their large shareholder clients to turn America’s boardrooms into a battleground over climate change. In the process, they are undermining the financial stability of traditional energy companies by attempting to force directors to invest in renewable energy instead of fossil fuels.

Shareholders are, of course, within their rights to propose resolutions and pursue changes to the way corporations are governed. But, increasingly, the aim of these resolutions has shifted from securing better returns to achieving political change when our political leaders have disagreed with the direction these activists wish to go.

From the perspective of corporate leaders, this new frontier of so-called social responsibility looks more like the age of proxy pirates, who unfurl the Jolly Roger and swing aboard the boardroom deck intent on striking fear in the hearts of the captains of industry.

These attacks on corporate governance and fiduciary responsibility were once rare but are growing in frequency. In the early 2000s during the era of “peak oil” – when many believed our oil supplies were running out on their own – fewer than 200 shareholder proposals each year focused on environmental or social factors, according to Proxy Preview.

Over the past decade, the number of shareholder proposals motivated purely by political aims has increased in lockstep with our growing energy security. And the trend is growing. According to Institutional Shareholder Services, more than two-thirds of the proposals filed this year were related to social or environmental pet causes.

The rising prevalence of climate-risk resolutions threatens to destabilize America’s energy sector, reversing the benefits of energy dominance and forcing change regardless of the economic and security costs to society.

Oil and gas projects take years, sometimes decades, to develop. If companies don’t invest today, consumers may find themselves paying more for imported energy.

The efforts of investment firms like BlackRock, Vanguard and State Street are distorting the market and scaring off investment that will, if allowed to continue unanswered, result in future supply shortages and higher prices for consumers.

Dan Eberhart Bio
I am CEO of Canary, one of the largest privately-owned oilfield services companies in the United States. I’ve served as a consultant to the energy industry in North America, Asia and Africa. My commentaries have been published in The Hill, Real Clear Energy, and the Economist. I have appeared on Fox News, CNN and CNBC. I am the author of The Switch. I was honored to be named to Hart Energy’s “30 Under 40” list and to be included on several U.S. trade missions to sub-Saharan Africa. I have undergraduate degrees in economics and political science from Vanderbilt University and a law degree from Tulane Law School. A Georgia native, I currently live in Phoenix, Arizona, with my wife and daughter.

Comment: I am all for corporate responsibility, which used to mean doing due diligence to get the facts and acting accordingly as a reasonable good citizen. Instead, people are falling prey to ideologues, and investors are being steered toward con artists. Behind all of this are the Climatists, true believers in the unproven notion that humans control the climate, and not in a good way.

The Climatist Game Plan (From Previous post Climatist Manifesto)

Mission: Deindustrialize Civilization

Goal: Drive industrial corporations into Bankruptcy

Strategy: Cut off the Supply of Cheap, Reliable Energy

Tactics:

  • Raise the price of fossil fuels
  • Force the power grid to use expensive, unreliable renewables
  • Demonize Nuclear energy
  • Spread fear of extraction technologies such as fracking
  • Increase regulatory costs on energy production
  • Scare investors away from carbon energy companies
  • Stop pipelines because they are too safe and efficient
  • Force all companies to account for carbon usage and risk

Progress:

  • UK steel plants closing their doors.
  • UK coal production scheduled to cease this year.
  • US coal giant Peabody close to shutting down.
  • Smaller US oil companies going bankrupt in record numbers.
  • Etc.

Collateral Damage:

  • 27,000 extra deaths in UK from energy poverty.
  • Resource companies in Canada cut 17,000 jobs in a single month.
  • Etc.

For more info on progress see: http://business.financialpost.com/fp-comment/terence-corcoran-clean-green-and-catastrophic

Summary:

Radical climatism is playing the endgame while others are sleeping or debating the holes in the science. Truly, the debate is over (without ever having happened) now that all nations have signed up to the Paris COP doctrine. Political leaders are willing, even enthusiastic, dupes, while climatist tactics erode the foundations of industrial society. Deaths and unemployment are unavoidable, but then activists think the planet already has too many people anyway.

ISIS was an immediate threat, but there is a deeper and present danger already doing damage to the underpinnings of Life As We Know It. It is the belief in Climate Change and the activists executing their game plan.  Make no mistake: they are well-funded, well-organized and mean business.  And the recent behavior of valve-turners, acting illegally to shut off supplies of fossil fuel energy, shows they are willing to go very far to impose their will upon the rest of us.

 

 

Canadian Climate Turns Against Activists

 

In olden days kings ruled by fiat, but nowadays you need the people’s consent,
a disappointment to Obama and now Trudeau.

The Liberal federal government led by Justin Trudeau is running up against a deeply ingrained and widespread skepticism in the population. In a previous post, Uncensored: Canadians View Global Warming, I noted that the principal finding in a recent survey was buried in the report and hidden by the media. Belief in man-made global warming is a minority view in Canada, as shown below:
The political implications of that lack of support for climate activism are starting to become manifest. Ed Whitcomb writes in the Ottawa Citizen, Climate change politics are undermining federalism. Excerpts in italics below with my bolds and images.

Prime Minister Trudeau between BC Premier John Horgan and Alberta Premier Rachel Notley. The two provinces are at war over expanding the oil pipeline.

Canada’s largest province is about to reject the federal climate change policy. Saskatchewan never accepted it and Alberta could reject it in 2019. Maybe it’s time for reflection.

The current policy calls for the provinces to implement a federal government plan. That, however, is a contradiction of federalism, a system which reflects the fact that the feds and the provinces have different interests. Policies to deal with pollution in Ontario may be inappropriate for Newfoundland or Saskatchewan. The current federal government overlooked such differences when it decided that there was only one solution to global warming, a carbon tax, and only two acceptable ways to implement it, cap-and-trade or a carbon levy. Unfortunately, not all Canadians and provinces accept these assumptions, and the consensus is shrinking.

In 2015, Saskatchewan’s then-premier Brad Wall pointed out that his economy was far more dependent on fossil fuels than were other provinces. A carbon tax would be disproportionately costly, which was unacceptable. That dispute is going to court and no one knows what the outcome will be.

Alberta’s NDP government endorsed the federal scheme, providing the federal government got a pipeline built. That linked dealing with climate change to increasing energy production, linked reducing gas emissions to raising them. But the pipeline has not been built, and bitumen is unlikely to flow before a provincial election which could empower the United Conservative Party. The UCP is strongly opposed to the federal climate plan. It, and the incoming Ontario Conservative government, oppose carbon taxes because everyone will pay them whether or not that reduces their consumption of carbon. The two parties believe governments always spend any money that is available (and in fact Alberta, British Columbia and Ontario have not returned all their carbon tax revenue to taxpayers).

Federal Environment Minister Catherine McKenna preaching with Justin Trudeau in the choir.

In challenging Ontario’s upcoming withdrawal from cap-and-trade, the federal government is introducing a new and very dangerous interpretation of federalism. No one questioned that Ontario’s program was within its jurisdiction. Now the federal government is saying that if Ontario repeals its own law, it will be replaced by the imposition of a federal tax exclusively within Ontario’s borders. In effect it will be a “provincial” tax, not a “national” or “federal” one applied to all Canadians.

But the federal government has no mandate to force Ontario to retain one of its own programs if its government wants to repeal it. In effect, the federal level is trying to use its taxation power to make the environment an exclusive federal responsibility.

The courts might uphold the federal government’s right to collect such a tax but the political battle could be fatal. If it can prevent Doug Ford repealing an existing Ontario law, then it can prevent other provinces repealing other provincial laws. In that case, there is no federalism, no division of power, and no independent provincial jurisdiction. Quebec could not repeal its cap-and-trade law – just the threat the separatists need to rise from their death-bed.

The federal government can forge ahead with a series of political and court battles, or it can go back to the drawing board, in which case there seem to be two options. One is co-operative federalism – namely, call a heads-of-government meeting and confirm Canada’s Paris goals; each province’s share of those goals; the federal right to implement policies within its jurisdiction; each province’s right to implement its own policies as it wishes; and confirm that they will all co-operate to avoid duplication or contradictory policies.

The second option is for the federal government to raise its existing national carbon tax on gasoline and other forms of fossil fuel. It has full constitutional power to do so, can do it any time, the revenue can be returned to taxpayers, and it could be completely transparent. Actually, if it had done this in 2016, Canada would already be on the way to meeting its Paris goals, rather than locked in an increasingly ugly and unnecessary federal-provincial, regional, political and ideological battle.

It’s not too late to get it right but that does mean going back to the drawing board.

Ed Whitcomb is the author of Rivals for Power: Ottawa and the Provinces, the contentious history of the Canadian federation, and of short histories of all 10 provinces.

Comment:  Whitcomb does not question the climate change ideology or see the uselessness of the Paris Accord.  In the event Trudeau imposes a widely unpopular federal tax on carbon emissions, the backlash could overturn his administration.

USCS Warnings of Coastal Flooding

Be not confused: USCS is not the US Coastal Service; it stands for the Union of Super Concerned Scientists, or UCS for short. Using their considerable PR skills and budgets, they have plastered warnings in the media targeting major coastal cities, designed to strike terror in anyone holding real estate in those places. Example headlines include:

Sea level rise could put thousands of homes in this SC county at risk, study says The State, South Carolina

Taxpayers in the Hamptons among the most exposed to rising seas Crain’s New York Business

Adapting to Climate Change Will Take More Than Just Seawalls and Levees Scientific American

The Biggest Threat Facing the City of Miami Smithsonian Magazine

What Does Maryland’s Gubernatorial Race Mean For Flood Management? The Real News Network

Study: Thousands of Palm Beach County homes impacted by sea-level rise WPTV, Florida

Sinking Land and Climate Change Are Worsening Tidal Floods on the Texas Coast Texas Observer

Sea Level Rise Will Threaten Thousands of California Homes Scientific American

300,000 coastal homes in US, worth $120 billion, at risk of chronic floods from rising seas USA Today

That last headline captures the thrust of the UCS study Underwater: Rising Seas, Chronic Floods, and the Implications for US Coastal Real Estate (2018):

Sea levels are rising. Tides are inching higher. High-tide floods are becoming more frequent and reaching farther inland. And hundreds of US coastal communities will soon face chronic, disruptive flooding that directly affects people’s homes, lives, and properties.

Yet property values in most coastal real estate markets do not currently reflect this risk. And most homeowners, communities, and investors are not aware of the financial losses they may soon face.

This analysis looks at what’s at risk for US coastal real estate from sea level rise—and the challenges and choices we face now and in the decades to come.

The report and supporting documents give detailed dire warnings state by state, and even down to counties and townships. An example of the damage projections is this table estimating 2030 impacts:

State  Homes at Risk  Value at Risk  Property Tax at Risk  Population in At-Risk Homes
AL  3,542 $1,230,676,217 $5,918,124  4,367
CA  13,554 $10,312,366,952 $128,270,417  33,430
CT  2,540 $1,921,428,017 $29,273,072  5,690
DC  – $0 $0  –
DE  2,539 $127,620,700 $2,180,222  3,328
FL  20,999 $7,861,230,791 $101,267,251  32,341
GA  4,028 $1,379,638,946 $13,736,791  7,563
LA  26,336 $2,528,283,022 $20,251,201  63,773
MA  3,303 $2,018,914,670 $17,887,931  6,500
MD  8,381 $1,965,882,200 $16,808,488  13,808
ME  788 $330,580,830 $3,933,806  1,047
MS  918 $100,859,844 $1,392,059  1,932
NC  6,376 $1,449,186,258 $9,531,481  10,234
NH  1,034 $376,087,216 $5,129,494  1,659
NJ  26,651 $10,440,814,375 $162,755,196  35,773
NY  6,175 $3,646,706,494 $74,353,809  16,881
OR  677 $110,461,140 $990,850  1,277
PA  138 $18,199,572 $204,111  310
RI  419 $299,462,350 $3,842,996  793
SC  5,779 $2,882,357,415 $22,921,550  8,715
TX  5,505 $1,172,865,533 $19,453,940  9,802
VA  3,849 $838,437,710 $8,296,637  6,086
WA  3,691 $1,392,047,121 $13,440,420  7,320

The methodology, of course, is climate models all the way down. They explain:

Three sea level rise scenarios, developed by the National Oceanic and Atmospheric Administration (NOAA) and localized for this analysis, are included:

  • A high scenario that assumes a continued rise in global carbon emissions and an increasing loss of land ice; global average sea level is projected to rise about 2 feet by 2045 and about 6.5 feet by 2100.
  • An intermediate scenario that assumes global carbon emissions rise through the middle of the century then begin to decline, and ice sheets melt at rates in line with historical observations; global average sea level is projected to rise about 1 foot by 2035 and about 4 feet by 2100.
  • A low scenario that assumes nations successfully limit global warming to less than 2 degrees Celsius (the goal set by the Paris Climate Agreement) and ice loss is limited; global average sea level is projected to rise about 1.6 feet by 2100.
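Since the scenario endpoints are quoted in feet while the tide gauge records discussed below are plotted in millimeters and centimeters, a one-line conversion helps when comparing them. A minimal sketch, using the scenario values from the bullets above:

```python
# Convert the NOAA scenario endpoints from feet to centimeters.
FT_TO_CM = 30.48

scenarios_ft = {
    "high, 2100": 6.5,          # ~198 cm
    "intermediate, 2100": 4.0,  # ~122 cm
    "low, 2100": 1.6,           # ~49 cm
}
for name, feet in scenarios_ft.items():
    print(f"{name}: {feet} ft ≈ {feet * FT_TO_CM:.0f} cm")
```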

Oh, and they did not forget the disclaimer:

Disclaimer
This research is intended to help individuals and communities appreciate when sea level rise may place existing coastal properties (aggregated by community) at risk of tidal flooding. It captures the current value and tax base contribution of those properties (also aggregated by community) and is not intended to project changes in those values, nor in the value of any specific property.

The projections herein are made to the best of our scientific knowledge and comport with our scientific and peer review standards. They are limited by a range of factors, including but not limited to the quality of property-level data, the resolution of coastal elevation models, the potential installment of defensive measures not captured by those models, and uncertainty around the future pace of sea level rise. More information on caveats and limitations can be found at http://www.ucsusa.org/underwater.

Neither the authors nor the Union of Concerned Scientists are responsible or liable for financial or reputational implications or damages to homeowners, insurers, investors, mortgage holders, municipalities, or any other entities. The content of this analysis should not be relied on to make business, real estate or other real world decisions without independent consultation with professional experts with relevant experience. The views expressed by individuals in the quoted text of this report do not represent an endorsement of the analysis or its results.

The need for a disclaimer becomes evident when looking into the details. The NOAA reference is Global and Regional Sea Level Rise Scenarios for the United States, NOAA Technical Report NOS CO-OPS 083.

Since the text emphasizes four examples of their scenarios, let’s consider them here. First there is San Francisco, a city currently suing oil companies over sea level rise. From tidesandcurrents comes this tidal gauge record:
It’s a solid, long-term record, providing a century of measurements from 1900 through 2017. The graph below compares the present observed trend with climate model projections out to 2100.

Since the record is set at zero in 2000, the difference in 21st-century expectations is stark. Instead of the existing trend continuing out to around 20 cm, the models project a 2.5-meter rise by 2100.
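The comparison being made here is easy to reproduce: extrapolate the gauge’s fitted linear trend forward and set it beside the model projection. A sketch, assuming a long-term trend of roughly 1.96 mm/yr for the San Francisco gauge (approximately NOAA’s published rate, used here for illustration):

```python
# Compare a tide gauge's observed linear trend, extrapolated to 2100,
# with the model projection quoted in the text (both relative to a 2000 zero point).

def extrapolate_mm(trend_mm_per_yr, base_year, target_year):
    """Linear extrapolation of sea level rise (mm) from the base year."""
    return trend_mm_per_yr * (target_year - base_year)

sf_trend = 1.96         # mm/yr, approximate long-term trend for San Francisco
observed = extrapolate_mm(sf_trend, 2000, 2100)  # 196 mm, i.e. about 20 cm
projected = 2500.0      # mm, the model projection cited above

print(f"Trend extrapolation to 2100: {observed:.0f} mm")
print(f"Model projection to 2100: {projected:.0f} mm ({projected / observed:.1f}x)")
```

The same arithmetic applies to the other gauges below; only the fitted trend changes.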

New York City is represented by the Battery tidal gauge:
Again, a respectable record with good 20th-century coverage. And the models say:
The red line projects a 2500 mm rise vs. 284 mm, almost a factor of 10 more. The divergence is evident even in the first 17 years.

Florida comes in for a lot of attention, especially the Keys, so here is Key West:
A similar pattern to NYC Battery gauge, and here is the projection:
The pattern is established: Instead of a rise of about 30 cm, the models project 250 cm.

Finally, probably the worst case, and well known to all already, is Galveston, Texas:
The water has been rising there for a long time, so maybe the models got this one close.
Galveston past & projected
The gap is less than the others since the rising trend is much higher, but the projection is still four times the past. Galveston is at risk, all right, but we didn’t need this analysis to tell us that.

A previous post, Unbelievable Climate Models, goes into why they are running so hot and so extreme, and why they cannot be trusted.

July 16, 2018 Footnote:

Recently there was a flap over future sea levels in Rhode Island, so I took a look at Newport, RI, the best tidal gauge record there. Same story:
Newport past & projected