Weather is Not Climate (again): Marine Heat Waves

Currently, the fashion is to prove global warming/climate change by pointing to “extreme” weather events. This week we have a new candidate for alarms: Marine Heat Waves. For example:

Suffering in the heat—the rise in marine heatwaves is harming ocean species (Phys.org)

Marine heat waves threaten fish, corals (SBS)

Ocean heat waves remake Pacific and Caribbean habitats (Ars Technica)

Study: More Marine Heat Waves Threaten Fish, Corals (Voice of America, Mar. 4)

More marine heatwaves threaten fish and corals — study (Gulf Times, Mar. 4)

Ocean Heat Waves Are Threatening Marine Life (The New York Times, Mar. 4)


Variation in sea surface temperatures is not new. Cliff Mass of the University of Washington, Seattle, educated us some years ago regarding a persistent patch of North Pacific warm water he named "The BLOB." A series of posts at his blog covered this event, which started in autumn 2013 and waxed and waned until finally disappearing in 2018. Most informative is The Blob Strengthens. Excerpts in italics with my bolds.

The original BLOB, named by Washington State Climatologist Nick Bond, formed the previous winter (2013-2014). The BLOB was defined as a persistent region of anomalously warm water in the northeast Pacific. With the air reaching the Northwest generally passing over the BLOB, the result was warmer than normal temperatures.

And by the first week of this month, the BLOB seems to have returned, and with it its evil twin, El Niño, indicated by the warm waters in the eastern tropical Pacific. Now we have a problem. Note that the temperatures in the BLOB are 2-3°C (roughly 4-5°F) above normal.

The effects of the BLOB have become more than a little evident to everyone living in our region. Temperatures are way above normal because of the warming effects of the ocean…it is hard for our minimum temperatures to fall much below the ocean temperatures this time of the year. Want to see evidence of this? Here are the surface air temperatures at Seattle Tacoma Airport for the last 4 weeks, with the average highs and lows shown. We have been warmer than normal, with minimum temperatures consistently 3-4F above normal.

The BLOB itself is not an independent player. It has been forced by an anomalous atmospheric circulation, including anomalous high pressure (ridging) centered north of our region (see map showing the height (pressure) anomalies (difference from normal) at 500 hPa (about 18,000 ft) for the last 30 days). Yellow indicates higher heights than normal.

An article from that time (2016) at Climate Central took mainly the alarmist view, but also quoted a reasonable statement from Cliff Mass: California Drought, Marine Heat More Likely With Warming

“The atmospheric variability that forced the warm blob is the same that forced the drought,” said Emanuele Di Lorenzo, an ocean and climate dynamics professor at Georgia Tech who coauthored the analysis, published in Nature Climate Change. “This atmospheric variability is increasing under greenhouse gases.”

The new findings could help scientists predict when similar marine heatwaves and droughts will strike in the future. They also suggest such heatwaves will become more common and intense, which could mean greater drought risks in the West. (By increasing evaporation and reducing snowfall, warmer temperatures are already making Western droughts worse.)

“This could potentially provide predictability,” said Cliff Mass, a University of Washington atmospheric sciences professor who wasn’t involved with the research. “This is natural variability that we’re dealing with.”

What Are “Marine Heat Waves”? (From Marine Heatwaves)

We use a recently developed definition of marine heatwaves (Hobday et al. 2016). A marine heatwave is defined as when seawater temperatures exceed a seasonally varying threshold (usually the 90th percentile) for at least 5 consecutive days. Successive heatwaves with gaps of 2 days or less are considered part of the same event.
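The Hobday definition above is essentially an algorithm, and a minimal sketch in code may make it concrete. The function and data below are illustrative only (a real seasonal threshold varies day by day, derived from a 30-year climatology); the run/merge/duration logic follows the definition as quoted.

```python
# Sketch of the Hobday et al. (2016) event definition: days above a
# seasonally varying 90th-percentile threshold form a marine heatwave if
# the run lasts at least 5 days; events separated by gaps of 2 days or
# fewer are merged into one. Data here are made up for illustration.

def detect_mhw(sst, threshold, min_duration=5, max_gap=2):
    """Return (start, end) index pairs (inclusive) of marine heatwaves."""
    hot = [t > th for t, th in zip(sst, threshold)]
    # Collect runs of consecutive hot days
    runs, start = [], None
    for i, h in enumerate(hot):
        if h and start is None:
            start = i
        elif not h and start is not None:
            runs.append((start, i - 1))
            start = None
    if start is not None:
        runs.append((start, len(hot) - 1))
    # Merge runs separated by max_gap days or fewer
    merged = []
    for r in runs:
        if merged and r[0] - merged[-1][1] - 1 <= max_gap:
            merged[-1] = (merged[-1][0], r[1])
        else:
            merged.append(r)
    # Keep only events lasting at least min_duration days
    return [(s, e) for s, e in merged if e - s + 1 >= min_duration]

# Example: a 6-day warm spell with a 1-day dip in the middle still counts
sst       = [14, 15, 17, 17, 17, 16, 17, 17, 14, 14]
threshold = [16] * 10  # a real threshold varies with the season
print(detect_mhw(sst, threshold))  # → [(2, 7)]
```

Note how the 1-day dip on day 5 does not split the event, exactly as the "gaps of 2 days or less" clause requires.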

Marine heatwaves can be caused by a whole range of factors, and not all factors are important for each event. The most common drivers are ocean currents, which can build up areas of warm water, and air–sea heat flux, i.e., warming through the ocean surface from the atmosphere. Winds can enhance or suppress the warming in a marine heatwave, and climate modes like El Niño can change the likelihood of events occurring in certain regions.

[Note: the phrase about the atmosphere warming the ocean is misleading. The ocean has 1000 times the heat capacity of the air, and the net heat transfer is upward. From Columbia U. on the Ocean/Atmosphere Heat Flux:

Solar heating of the ocean on a global average is 168 watts per square meter

Net infrared radiation cools the ocean, on a global average by 66 watts per square meter.

On global average the oceanic heat loss by conduction is only 24 watts per square meter. (If the ocean is colder than the atmosphere (which of course happens), the air in contact with the ocean cools, becoming denser and hence more stable, more stratified. As such, conduction does a poor job of carrying atmospheric heat into the cool ocean.)

On global average the heat loss by evaporation is 78 watts per square meter. (The largest heat loss for the ocean is due to evaporation, which links heat exchange with the hydrological cycle.)]
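The Columbia figures quoted above form a closed budget: the three upward loss terms sum exactly to the absorbed solar input, with evaporation the largest single term. A trivial check:

```python
# Global-average ocean surface heat budget from the Columbia figures
# quoted above. The ocean sheds the absorbed solar heating through three
# upward fluxes, so the terms should (and do) balance.
solar_in    = 168  # W/m^2 absorbed from the sun
infrared    = 66   # W/m^2 net longwave (infrared) loss
conduction  = 24   # W/m^2 sensible heat loss by conduction
evaporation = 78   # W/m^2 latent heat loss by evaporation

total_out = infrared + conduction + evaporation
print(total_out)                # 168, matching the solar input
print(evaporation / total_out)  # ~0.46: nearly half the loss is evaporative
```

The balance underlines the note's point: the net flow of heat at the sea surface is from ocean to atmosphere, not the other way around.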

The trigger for the current concern is Marine heatwaves threaten global biodiversity and the provision of ecosystem services, published March 4, 2019, in Nature Climate Change. Dan A. Smale is lead author, with 17 co-authors. The media were quick to misinterpret the study and claim a link to the burning of fossil fuels. Excerpts in italics with my bolds.

Abstract: The global ocean has warmed substantially over the past century, with far-reaching implications for marine ecosystems. Concurrent with long-term persistent warming, discrete periods of extreme regional ocean warming (marine heatwaves, MHWs) have increased in frequency. Here we quantify trends and attributes of MHWs across all ocean basins and examine their biological impacts from species to ecosystems. Multiple regions in the Pacific, Atlantic and Indian Oceans are particularly vulnerable to MHW intensification, due to the co-existence of high levels of biodiversity, a prevalence of species found at their warm range edges or concurrent non-climatic human impacts. The physical attributes of prominent MHWs varied considerably, but all had deleterious impacts across a range of biological processes and taxa, including critical foundation species (corals, seagrasses and kelps). MHWs, which will probably intensify with anthropogenic climate change, are rapidly emerging as forceful agents of disturbance with the capacity to restructure entire ecosystems and disrupt the provision of ecological goods and services in coming decades.

My Comments

The authors managed to produce a hockey stick graph by attaching a high-resolution instrumental record to low-resolution proxy estimates of the past. The method is described in the paper:

Global time series and regional trends in total MHW days were derived using a combination of satellite-based, remotely sensed SSTs and in situ-based seawater temperatures. First, total MHW days were calculated globally over 1982–2015 at 1/4° resolution from the National Oceanic and Atmospheric Administration (NOAA) Optimum Interpolation SST V2 high-resolution data. Then, proxies for total MHW days globally over 1900–2016 were developed on the basis of five monthly gridded SST datasets (HadISST v.1.1, ERSST v.5, COBE 2, CERA-20C and SODA si.3). A final proxy time series was calculated by averaging across the five datasets. The five monthly datasets were used since no global daily SST observations are available before 1982.

The three peaks in the modern record are clearly the result of the major El Niños of 1997, 2009 and 2015. And it is likely that mining the daily satellite records since 1982 identified marine heat waves that would not show up in the monthly proxy datasets.
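The resolution point can be illustrated with a toy series. The numbers below are invented, not from the study: a short, sharp warm spell that clears a daily 90th-percentile threshold can vanish entirely once the same series is averaged to monthly resolution, as in the pre-1982 proxy datasets.

```python
# Illustrative only: a 6-day spike that registers as a marine heatwave in
# daily data disappears when the month is averaged, because the cool days
# drag the monthly mean back below any plausible monthly threshold.
daily_sst = [15.0] * 24 + [18.0] * 6  # 6-day spike in a 30-day "month"
daily_threshold = 16.5                # hypothetical daily 90th percentile
monthly_threshold = 16.0              # hypothetical monthly 90th percentile

spike_days = sum(t > daily_threshold for t in daily_sst)
monthly_mean = sum(daily_sst) / len(daily_sst)

print(spike_days)                        # 6 days over the daily threshold
print(monthly_mean)                      # 15.6, below the monthly threshold
print(monthly_mean > monthly_threshold)  # False: the event disappears
```

So a count of "total MHW days" built from daily satellite data will mechanically exceed one built from monthly proxies, quite apart from any real trend.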


This is another example of a natural process that threatens our livelihoods but which we struggle to predict and adapt to. As with other short-term weather events, humankind has a great stake in better understanding, in order to forecast, prepare and manage adaptations as required. There have always been major variations in warming and cooling of sea surface temperatures. And yet global average anomalies vary by a few tenths of a degree Celsius, with significant differences between the two hemispheres. This implies both that marine heat waves are offset by cold waves elsewhere, and that the well-mixed CO2 molecules are not to blame.

See Also: On Climate “Signal” and Weather “Noise”

Empirical Evidence: Oceans Make Climate

Cold Waves and CO2

To put this year’s winter cold into perspective, there is an informative article by Jon Erdman, America’s Coldest Outbreaks (January 17, 2018). Excerpts with my bolds, showing CO2 concentrations at the referenced dates. Note that temperatures are in degrees Fahrenheit.

The Clear Number 1 – February 1899: Atmospheric CO2 295 ppm.
The cold wave during the first two weeks of February 1899 is far and away the gold standard for cold outbreaks in U.S. history.

What made this outbreak worthy of its lofty status was the magnitude, areal coverage and longevity of the cold.

For the first and only time on record, every state in the Union (recall, there were only 45 states at the time) dipped below zero. Subzero cold invaded parts of south-central Texas, the Gulf Coast beaches and northwest Florida.

Tallahassee, Florida, dipped to -2 degrees on Feb. 13, 1899, the only subzero low in the city’s history. This remains the all-time record low for the Sunshine State.

All-time record lows were set in a dozen states, from the Plains to the Ohio Valley, Southeast and District of Columbia. In addition to Florida, state record lows in Louisiana (-16 in Minden), Nebraska (-47 in Camp Clarke) and Ohio (-39 in Milligan) still stand today.

Dozens of cities still hold onto their all-time record low from this cold wave, including Atlanta (-9), Grand Rapids, Michigan (-24), and Wichita, Kansas (-22). Temperatures as frigid as -61 degrees (Montana), -59 degrees (Minnesota) and -50 degrees (Wisconsin) were recorded.

The Mississippi River froze solid north of Cairo, Illinois, and ice not only clogged the river in New Orleans, but also flowed into the Gulf of Mexico a few days after the heart of the cold outbreak.

Ice jams triggered floods along parts of the Ohio, Tennessee, Cumberland and James Rivers. Ice skating was the activity of choice as the San Antonio River froze.

Lacking snow cover, the ground froze to a depth of 5 feet in Chicago, damaging water, gas and other pipes.

New York City engineers found trusses on the Brooklyn Bridge had contracted 14 feet due to the cold, according to Extreme American Weather, by Tim Vasquez. Due to frozen aqueducts from Catskills reservoirs, the city of Newark was forced to draw water from other rivers and bays.

Adding insult to injury, a massive snowstorm punctuated the cold outbreak from the Gulf Coast to New England Feb. 11-14.

Cape May, New Jersey, picked up 34 inches of snow, the nation’s capital was buried by 21 inches and 15.5 inches fell in New York City, overwhelming city crews and isolating suburbs.

In Florida, snow fell in Fort Myers, Tampa saw measurable snow for one of only two times in its history, and Jacksonville picked up 1.9 inches of snow. New Orleans was blanketed by 3 inches of snow.

Here are some other notable cold outbreaks since the massive 1899 outbreak.

Winter 2013-2014 Atmospheric CO2 399 ppm

Ice builds up along Lake Michigan as temperatures dipped well below zero on January 6, 2014 in Chicago, Illinois. Chicago hit a record low of -16 degree Fahrenheit as an arctic air mass brought the coldest temperatures in about two decades into the city.
(Scott Olson/Getty Images)

– December 2013 – February 2014 was among the top 10 coldest such periods on record in seven Midwest states.

– An early January 2014 outbreak brought the coldest temperatures of the 21st century, to date, for some cities.

– The winter was among the top five snowiest on record in at least 10 major cities.

Late January-Early February 1996 Atmospheric CO2 363 ppm

– Minnesota state record: -60 degrees near Tower on Feb. 2, 1996. WCCO radio’s Mike Lynch broadcasted live from Tower that morning, during which he blew soap bubbles which then froze on the ground as a crowd watched.

– All Minnesota public schools shut down.

– Fears of natural gas shortage in northern Illinois prompted requests to reduce consumption.

Mid-Late January 1994 Atmospheric CO2 359 ppm
– 14 cities set all-time record lows, including Indianapolis (-27), Cleveland (-20) and Harrisburg, Pennsylvania (-22). Pittsburgh (-22) beat its previous all-time record set during the February 1899 outbreak.

– Both Pittsburgh (52 hours) and Cleveland (56 hours) set their record stretch of subzero cold.

– Indiana state record low set: -36 degrees at New Whiteland on Jan. 19

– 35 counties in Ohio plunged to -30 degrees or colder on Jan. 19.

– Worcester, Massachusetts, had seven straight days with subzero lows, a record stretch.

– Crown Point, New York, dipped to -48 degrees on Jan. 27.

– Coldest month on record in Caribou, Maine, with an average temperature of -0.7 degrees.

December 1990 Atmospheric CO2 356 ppm
– Most destructive freeze in California since 1949. Fifty percent of California’s citrus crop damaged.

– Record 18-day freeze streak in Salt Lake City

– 2,000 children stranded in Seattle schools due to heavy snow on Dec. 18

– Randolph, Utah, bottomed out at -45 degrees on Dec. 22.

December 1989 Atmospheric CO2 354 ppm
– All-time record lows in Kansas City (-23), Topeka, Kansas (-26), Lake Charles, Louisiana (-4), and Wilmington, North Carolina (0).

– First Christmas Day snow (trace) on record in Tallahassee. Miami had a rare freeze while Key West dipped to 44 degrees.

– 14 inches of snow fall at Myrtle Beach, South Carolina, on Christmas Eve.

– At the time, it was the fourth coldest December on record for the entire U.S.

President Reagan Inauguration – Jan. 1985 Atmospheric CO2 346 ppm
Due to the cold, President Ronald Reagan takes the oath of office for his second term as President in the Capitol Rotunda on Jan. 21, 1985.

– 13.2 inches of snow in San Antonio, Texas (Jan. 12), crushed the previous 24-hour snow record there. Austin and Houston (3 inches each) also were blanketed by this snowstorm.

– All-time record lows were set in Chicago (-27), Jacksonville, Florida (7), and Macon, Georgia (-6)

– State record lows were set in Virginia (-30 at Mountain Lake) and North Carolina (-34 atop Mt. Mitchell).

– $1.2 billion in damage to Florida’s citrus crop

– Ronald Reagan’s second inauguration was the coldest Inauguration Day on record (7 degrees). The ceremony was moved indoors and parade cancelled.

Late December 1983 Atmospheric CO2 343 ppm
– $2 billion damage to agriculture, mainly due to freezing temperatures in central and northern Florida.

– As measured using the old formula, wind chills reached 100 degrees below zero over much of North Dakota on Dec. 22.

– Williston, North Dakota tied its all-time record low (-50) on Dec. 23. (Check out the hourly observations from that day.)

– Sioux Falls, South Dakota, remained below zero from the morning of Dec. 16 until Christmas Day afternoon.

– Over 125 daily low-temperature records were broken on Christmas Day. Tampa’s Christmas Day high was only 38 degrees.

Remembering the “Freezer Bowl” AFC Championship game in Cincinnati, Ohio, on Jan. 10, 1982.

January 1982 Atmospheric CO2 341 ppm
– 85 deaths were attributed to the cold wave, according to the National Climatic Data Center.

– Chicago’s O’Hare and Midway Airports set all-time record lows (-26).

– Milwaukee, Wisconsin, plunged to -26 degrees on Jan. 17, their coldest temperature in 111 years.

– Montgomery, Alabama (-2), Jackson, Mississippi (-5), and Atlanta (-5) each plunged below zero.

– Snow at rush hour on Jan. 11 slickened streets, stranding motorists in Atlanta.

– Natural gas lines froze, and up to 7 million experienced brownouts, according to Tim Vasquez.

– The second coldest game in National Football League history, the “Freezer Bowl”, was played in Cincinnati, where a kickoff temperature of -9 degrees greeted the warm-weather San Diego Chargers.

– Hundreds of cases of frostbite were treated at the stadium, including Bengals quarterback Kenny Anderson’s frostbitten ear.

Tonawanda, New York – Post Blizzard of 1977
Photo of a house almost completely buried in snow in the aftermath of the “Blizzard of ’77” in Tonawanda, New York.  (Jeff Wurstner/Wikipedia)

January 1977 Atmospheric CO2 334 ppm
– 69 first-order weather stations shivered through their record coldest month, according to Weather Underground’s Christopher Burt.

– South Carolina state record set: -20 degrees near Long Creek

– Temperatures did not rise above freezing the entire month in a swath from eastern Iowa to western Pennsylvania northward, according to Burt.

– Snow fell as far south as Miami and Homestead, Florida, the farthest south occurrence of snow in the U.S. Two inches of snow fell in Winter Haven, Florida.

– 35 percent of Florida’s citrus crop was damaged; rolling blackouts were needed in Florida due to heavy power demand.

– President Jimmy Carter walked 1.5 miles in the Inauguration Parade with temperatures just below freezing on Jan. 20.

– The “Buffalo Blizzard of ’77” added a foot of snow to the 33 inches of snow on the ground, accompanied by wind gusts to 75 mph, producing snow drifts up to 30 feet high, paralyzing the city.

January 1949 Atmospheric CO2 311 ppm
– Coldest month on record in Boise, Idaho, and Spokane, Washington.

– Coldest winter at virtually every weather station in California, Nevada, Idaho and Oregon, according to Burt.

– A series of blizzards in the Great Basin and Plains claimed 150,000 sheep and cattle, isolating ranches from Wyoming to South Dakota.

– The Army airlifted supplies to snowbound ranchers.

– Snow fell in San Diego. One of only three measurable snowfalls on record in Downtown Los Angeles, as well.

– All-time record low set in San Antonio, Texas (0 degrees).

Winter of 1935-1936 Atmospheric CO2 310 ppm
– Coldest Plains winter of record.

– Low temperatures dropped below -50 degrees on four separate days in Malta, Montana.

– Parshall, North Dakota, plunged to -60 degrees on Feb. 15, still the state record low today.

– Langdon, North Dakota, remained below zero for an incredible 41 straight days, the longest stretch on record in the Lower 48 states, according to Burt.

Winter of 2019 Atmospheric CO2 409 ppm

Ice builds up along the shore of Lake Michigan as temperatures dipped to lows around -20 degrees on January 31st, 2019, in Chicago, Illinois. Businesses and schools closed, Amtrak suspended service into the city, more than a thousand flights were canceled, and mail delivery was suspended as the city coped with record-setting low temperatures.  (Photo: Scott Olson/Getty Images)

A cyclist rides through the falling snow in the Financial District, January 30th, 2019, in New York City
(Photo: Drew Angerer/Getty Images)

Frost forms on the backs of Galloway cows on February 1st, 2019, in Crianlarich, Scotland. Temperatures plummeted to -15 degrees Celsius on the coldest night of the year. (Photo: Jeff J. Mitchell/Getty Images)


Clearly CO2 neither causes nor prevents outbreaks of arctic cold invading North America. Concerning ourselves with GHGs is no substitute for ensuring reliable, affordable energy and robust infrastructure.

Jonathan Erdman is a senior meteorologist and has been an incurable weather geek since a tornado narrowly missed his childhood home in Wisconsin when he was 7.


Polar Vortex Update Jan. 23

Figure i. Animation of observed 10 mb geopotential heights (contours) and geopotential height anomalies (m; shading) for 15 December 2018 – 18 January 2019. Source: Dr. Judah Cohen

Excerpts from AER Arctic Oscillation blog by Judah Cohen, January 21, 2019 in italics with my bolds.

There is increasing confidence that the stratosphere and troposphere are going to couple by any accepted metric. The GFS forecast clearly shows downward propagation of positive polar cap geopotential height anomalies from the stratosphere to the troposphere, the surface AO is predicted to turn decisively negative and high latitude blocking is the norm rather than exception over the next two weeks. Also, warm temperatures are predicted across the North American Arctic including Alaska and Greenland. Therefore, relatively cold temperatures are expected to be widespread across the Northern Hemisphere (NH) including Northern Asia, Northern Europe and Eastern North America. Relatively warm temperatures are also expected in the Barents-Kara seas, the region of the Arctic with the greatest negative sea ice extent anomalies. I would expect the relatively cold pattern to last at a minimum of four weeks and up to eight weeks.

There is some question based on the latest model runs how long the relatively cold pattern will persist. Of course, there is the possibility that after a relatively cold couple of weeks the pattern turns overall milder for the remainder of the winter. But as I have discussed many times the coupling from the stratosphere to the troposphere is described as “dripping paint.” That is because the downward propagation or coupling doesn’t come at once but in pieces. Therefore, the turn to colder and possibly snowier conditions is often episodic and not continuous. So, if there is a transition to milder weather it would be a relaxation of the overall colder pattern and not a complete reversal. I would just add that this has been an extreme event in the stratosphere and sometimes an extreme event in the stratosphere does not translate into an extreme event in the troposphere and that could be true for this event as well.

With the help of my colleague Karl Pfeiffer I created an animation of the ongoing PV disruption from mid-December through last Friday shown in Figure i. Some readers have stated in the past that they enjoy the animations and here is an extended version. Maybe they are not much more than bubble gum for the brain, but I am always fascinated by PV splits.

The predicted NH temperature pattern is classic negative AO with cold temperature widespread across northern Eurasia including Europe and eastern North America. And unlike recent winters, temperatures are not relatively mild across the pan-Arctic but locally in Alaska and Greenland, again classic mild locations during negative AO regimes. I do think that the warm Arctic/cold continents pattern is distinct from the negative AO pattern as argued in Cohen et al. 2018. In my opinion the upcoming predicted NH temperature pattern projects more strongly onto the negative AO than the warm Arctic/cold continents pattern. One distinction in my mind is the continuous stripe of cold temperatures along the Eurasian north slope or the land areas adjacent to the Arctic ocean, they are solidly below normal in the negative AO pattern but mild in the warm Arctic/cold continents pattern. Also, as I argued in an earlier blog the timing of the troposphere-stratosphere coupling nicely matches the timing expected based on extensive October Siberian snow cover extent. Waiting for the remainder of the winter before passing judgement but so far this winter the relationship is strong.

Currently the stratospheric PV remains split into two pieces or daughter vortices. The major daughter vortex is now centered over Hudson Bay and a minor daughter vortex is centered over the Urals with ridging centered near the North Pole (Figure 12). The daughter vortex over the Urals is predicted to drift west across Siberia and fill with time while the other daughter vortex over Hudson Bay remains nearly stationary. However, the anomalous warmth in the polar stratosphere is gone and is a sign that the stratospheric PV is recovering. The cold temperatures in the stratosphere are focused in Siberia and western North America and could be a sign where the coldest temperatures at the surface may be focused as well during the month of February, something to watch.

Niagara Falls January 21, 2019 h/t Mike Clegg

Niagara Falls January 21, 2019 h/t yorkeryan


Self-Serving Global Warmism


To believe humans are dangerously warming earth’s climate, you have to swallow a bunch of unbelievable notions. You have to think the atmosphere drives temperature, instead of the ocean with 1000 times the heat capacity. You have to disregard the sun despite its obvious effects from summer to winter and longer term. You have to think CO2 drives radiative heat transfers, instead of H2O which does 95% of the radiative work. You have to think rises in CO2 cause temperatures to rise, rather than the other way around. You have to forget it was warmer than now in the Middle Ages, warmer still in the Roman era, and warmest of all during Minoan times.  And on and on. The global warmist narrative is full of ideas upside down and backwards, including many reversals of cause and effect.

It is like a massive hot air balloon, so why doesn’t it deflate? Answer:  It is because so many interests are served by keeping it alive and pumping up public fears. In this brief video, Richard Lindzen explains how it serves politicians, NGOs and the media to be on the global warming bandwagon.

In addition, there are businesses and industries that can and do contribute to global warming fears to further their own interests. For example, Terence Corcoran explains how the insurance industry benefits by promoting global warming in his Financial Post article Why insurers keep hyping ‘climate risks’ that don’t materialize. Excerpts in italics with my bolds.

Insurers are urging the government to invest in natural, green infrastructure even though engineers call it ineffective

For more than two decades, insurance firms facing rising property damage costs in Canada and abroad have sought some kind of salvation in the environmental movement’s climate change crusade.

The latest insurance industry initiative wanders even deeper into the quagmire of green policy advocacy. Combating Canada’s Rising Flood Costs, a new report from the Insurance Bureau of Canada (IBC), urged governments across the country to adopt “natural infrastructure” to limit escalating climate change risks.

The report continues the insurance industry’s 20-year practice of hyping climate risks. At an industry conference in 1999, one executive warned: “The increase in extreme weather events (in Canada) is part of a global trend in which climate change has played a significant role.”

The evidence was non-existent then, and not much has changed in the interim, despite the industry’s claim that climate-driven flood risk is escalating. According to the insurers, Canada needs all levels of government to turn to natural and “green” infrastructure before installing traditional “grey” infrastructure.

The first priority is to retain existing ponds, streams, trees and other natural infrastructure systems, according to the report. The second is to rebuild and replace natural infrastructure that has been lost. And the third — building new and replacing old sewers, pipes, concrete drainways, diversions, improved building techniques — should be undertaken only on a “build what you must” basis.

However, that’s not what the Ontario Society of Professional Engineers recommends. In an April report for provincial officials it said: “Numerous studies have demonstrated that green infrastructure does not provide a flood risk reduction benefit.” The engineers advised that protective plumbing, pump-station modifications and sanitary-sewer improvements are among the measures that should be taken to control urban flooding.

Insurers have an understandable self-interest in promoting infrastructure spending and government policies, laws and regulations that would protect their businesses from rising insurance claims. But the report reads like a document from the World Wildlife Fund. It was sponsored by the IBC and “generously supported” by Intact Financial Corp., Canada’s largest insurance company. The University of Waterloo-based Intact Centre on Climate Adaptation (funded by Intact, which has given millions to the centre) was also involved.

Despite the heavy corporate involvement, the CBC opened up about 10 minutes of The National, its flagship news show, to the industry report when it was released last month. Would The National give pipeline, mining and telecom companies 10 minutes to promote their views?

The stars of The National that night were Blair Feltmate, head of the Centre on Climate Adaptation, and CBC News meteorologist Johanna Wagstaffe. Both repeated the insurance industry’s 20-year-old claims that climate devastation is ravaging Canada through extreme weather events — and warned the public to look out for rising insurance premiums if nothing is done. Here’s a sample:

Wagstaffe: “Every single extreme weather event is connected to a warming climate because… as we see longer and hotter summers, we see more moisture being held in our atmosphere, we see higher water levels, that means every single event is amplified by climate change.”

Feltmate: “I totally agree. So all the modelling on climate change that’s been done over the last many years by groups like the Intergovernmental Panel on Climate Change, which is a group of several hundred climate scientists… their predictions are that, yes, climate change has happened, is happening and will continue to happen. And we’re seeing the expression of extreme weather events as a result of that.”

Feltmate added the magnitude of flooding, which is the No. 1 cost due to climate change in the country, is increasing.

Such climate warnings have been official insurance industry mantra since the 1990s. Flooding and extreme weather are becoming more frequent, the industry said again and again.

Not true, according to the latest IPCC science report released this month. The impacts chapter said: “There is low confidence due to limited evidence, however, that anthropogenic climate change has affected the frequency and the magnitude of floods.” Furthermore, from 1950 to 2012 “precipitation and (fluvial) runoff have… decreased over most of Africa, East and South Asia, eastern coastal Australia, southeastern and northwestern United States, western and eastern Canada.”

Despite a lack of evidence, the industry recently claimed conditions are so bad in Canada that “weather events that used to occur every 40 years now happen every six years” — a factoid attributed to a 2012 IBC-commissioned report by veteran Western University climatologist and climate-policy activist Gordon McBean. He cited an Environment Canada report to support the 40-to-six claim, but in 2016 Canadian Underwriter magazine published a note quoting an Environment Canada official who said studies “have not shown evidence to support” the 40-to-six year frequency shift. The claim has since been scrubbed from the insurance industry’s communications on climate issues.

The insurers have a newer warning widget in the form of a graphic that appears to show a dramatic rise in catastrophic insurance losses due to climate change. A trend line rises from the mid-1980s to 2017 to a $5-billion peak with the 2016 Fort McMurray fire (see first accompanying chart). The new IBC flood report said these numbers illustrate the financial impacts of climate change and extreme weather events that are being felt by a growing number of homeowners and communities. These losses “averaged $405 million per year between 1983 and 2008, and $1.8 billion between 2009 and 2017.”

The graphic contains three dubious elements as a source for a flood report. First is an inconsistency in the source of data, a problem identified by Robert Muir, a professional engineer and member of an infrastructure task force at the Ontario Society of Professional Engineers. The 1983–2007 data set was collected through informal industry surveys, while the 2008–2017 data are tabulated systematically by an independent agency.

Data inconsistency may explain the bizarre result that the insurance industry had zero losses due to floods, water, rain and storm perils in four of 17 years between 1983 and 2000.

Second, the IBC graph also counts fire losses, including the Fort McMurray fire of 2016 — an event unrelated to flood risk. Removal of fire losses significantly flattens the curve (see the second accompanying chart). If the 2013 floods in Alberta and Toronto are treated as possible one-off freak events, the average insurance losses come to $182 million in the 1990s, $198 million during the 2000s and $268 million over the past nine years, which is not a dramatic shift considering there are many other explanations for insurance losses, including increasing individual wealth beyond mere per capita GDP values, urbanization, failure of governments to maintain decaying ancient water infrastructures, and the risks people take by moving into flood-prone areas.

The insurance industry has an obvious motive in highlighting flood risk. It is part of a concerted climate campaign by NGOs, governments and sustainable development advocates. As one executive put it at a 2016 conference, the objective is to “monetize” the flood risk, an idea the IBC is pushing with the help of a relatively new “flood model” that identifies high-risk areas.

When risks are real, people should of course take steps to avoid them or get protection, including taking out insurance. But the industry seems to be heading in a questionable direction by promoting insurance for climate risks that may not exist and at the same time advocating for green protective infrastructure (see below) that will cost more and may — if the engineers are right — increase the risk.

Hurricane Science Expert Q&A

Here is a briefing on the state of hurricane science regarding any discernible effects from humans burning fossil fuels: Hurricane Florence raises questions about link between climate change, severe storms. Storm expert David Nolan explains what we know and what we’re still trying to figure out.

The questions are posed by NBC News, a source of many stories promoting climate alarm/activism. The answers are from David Nolan, professor and chair of the department of atmospheric sciences at the University of Miami’s Rosenstiel School of Marine and Atmospheric Sciences and a noted expert on hurricanes and tropical weather. Excerpts are in italics with my bolds.

Just where are we with hurricane science? What have we learned, and what questions remain to be answered? And what about the role of climate change in the formation and propagation of severe storms?

Q: The National Hurricane Center today upgraded Florence to a Category 4 storm. What exactly does that mean?
A: It means that, by their best estimate, there are wind speeds somewhere at the surface of 130 miles per hour or greater. This estimate comes from a combination of satellite images, and, in this case, from NOAA [National Oceanic and Atmospheric Administration] aircraft that have been flying in Florence this morning.

Q: How many categories are there?
A: The categories go from 1 to 5.

Q: Could there ever be a Category 6?

A: No. Fives themselves are very rare. And reaching higher speeds — like 170 or 180 mph — is extremely rare. So it doesn’t make sense to make a category for something that will still be extremely rare, even if it happens a little more, like once every five years instead of once every 10 years.
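
The Saffir-Simpson scale Nolan is describing can be sketched as a simple lookup. This is a minimal illustration using the National Hurricane Center’s published wind thresholds (1-minute sustained winds, in mph); the function name is my own:

```python
# Saffir-Simpson hurricane wind scale (1-minute sustained winds, mph).
# Thresholds per the National Hurricane Center; Category 5 is open-ended,
# which is why there is no "Category 6".
def saffir_simpson_category(wind_mph):
    """Return the hurricane category (1-5), or 0 if below hurricane strength."""
    thresholds = [(157, 5), (130, 4), (111, 3), (96, 2), (74, 1)]
    for cutoff, category in thresholds:
        if wind_mph >= cutoff:
            return category
    return 0  # tropical storm or weaker

print(saffir_simpson_category(130))  # Florence at 130 mph -> 4
```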

Q: What in general dictates whether a Category 4 storm will turn into an even more destructive Category 5?

A: The conditions that are most favorable are warm ocean temperatures, like above 85 degrees [Fahrenheit], and light winds in the larger environment around the storm. Storms become category 5 in only the most favorable conditions.

Satellite images show three Atlantic hurricanes, from left, Florence, Isaac and Helene. (NOAA)

Q:  Is severe weather getting more severe and more frequent?

A: Whether or not severe weather is actually getting more severe is not clear. It is clear that the most extreme rainfall events have increased in frequency, and this is consistent with our understanding of how global warming will change the weather.

Q: Some hurricanes seem to hit land and then quickly dissipate, causing little damage. Others, like Harvey last year, give way to heavy rainfall and flooding. What determines which course a storm will take?

A: There are two factors. First, whether or not the storm keeps moving inland steadily, or whether it lingers near the coast. This is determined by the steering patterns of the larger atmosphere around it, as the hurricane is essentially carried along by the even larger weather systems around it.

Second, it depends on the kind of terrain the storm is over. In the case of Harvey, the land [in and around Houston] is relatively flat and smooth and also still near the Gulf of Mexico, so Harvey did not dissipate quickly.

Q: You said climate change seems to be changing global weather patterns to make extreme rainfall events more frequent. Can you explain exactly what’s happening?

A: The main reason is that warmer air can hold more water vapor. So when air rises and forms clouds and then rain, more water is released and then more water falls to the ground as rain.
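
Nolan’s point that warmer air can hold more water vapor follows from the Clausius-Clapeyron relation. A quick sketch using the Magnus approximation for saturation vapor pressure (the constants are a standard textbook fit, not from the interview) shows the capacity rising by roughly 6–7% per degree Celsius:

```python
import math

def saturation_vapor_pressure_hpa(temp_c):
    """Magnus approximation for saturation vapor pressure over water (hPa)."""
    return 6.112 * math.exp(17.62 * temp_c / (243.12 + temp_c))

# Each degree of warming raises the moisture-holding capacity of air
# by roughly 6-7 percent.
for t in (0.0, 20.0):
    ratio = saturation_vapor_pressure_hpa(t + 1) / saturation_vapor_pressure_hpa(t)
    print(f"{t:4.0f} C -> +{(ratio - 1) * 100:.1f}% per degree")
```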

Q: But there’s no evidence that climate change is making hurricanes more frequent?

A: There is not. Unfortunately, the existing modern records of hurricanes are only of good quality for about 60 years. Because hurricane activity varies so much from year to year, that record is not long enough to say for sure if there is a clear trend upward due to global warming.

There has been an enormous amount of research on whether TC numbers or strength will increase in the future because of global warming. But the results of those studies are mixed and sometimes contradictory, so we can’t make a conclusive statement yet. (TC refers to tropical cyclones (hurricanes) that occur each year, in each ocean.)

Q: What exactly is the difference between a hurricane and a cyclone?

A: Physically, they are the same thing. They are called hurricanes in the Atlantic and in the Eastern Pacific, typhoons in the West Pacific and cyclones around Australia and India.

Q: What causes these storms to form, and what makes them move as they do?

A: Hurricanes form when areas of disturbed weather — rain and thunderstorms — over the ocean start to organize into a swirling pattern. As the winds increase, they extract more and more energy and water from the ocean, thus getting stronger and larger. As for their motion, they are carried along by the larger weather patterns around them, the usual lows and highs that most people often see on weather maps.

Q: How big can hurricanes get?

A: The areas of hurricanes with significant weather (winds and rain) are usually about 200 miles across. Some can be larger, as much as 300 miles. Some are quite small, only 50 miles.

Q: Do they always swirl in the same direction?

A: In the northern hemisphere, they rotate counterclockwise. In the southern hemisphere, it is the opposite. They get their rotation from the Earth’s rotation, which has an opposite sense depending on whether you are in the Northern or Southern Hemisphere.

Q: You’re an expert in the use of computer modeling to study hurricanes. What have you learned from your research?

A: Most of my research has been about hurricane formation. We’ve used computer models to understand the physical processes by which hurricanes form. There are many “disturbances” over the oceans every summer, but most of them do not become hurricanes. We want to understand why some of them do.

Q: How does computer modeling work?

A: Computer models attempt to simulate the motions of the atmosphere. The first step is to assemble a digital “image” of the weather right now, much the same way that a camera image is made up of pixels of many different colors. But next, it uses the laws of physics and mathematics to determine how each part of the atmosphere will change with time, as they are influenced by the other pixels around them.
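
A toy illustration of the “pixels influenced by their neighbors” idea is a one-dimensional diffusion step, vastly simpler than any real atmospheric model but showing the same update pattern, each grid cell’s next value computed from its current neighbors:

```python
def step(field, diffusivity=0.2):
    """Advance a 1-D 'atmosphere' one time step: each cell moves toward
    the average of its neighbors, mimicking how each model grid point is
    updated from the state of the grid points around it."""
    n = len(field)
    return [
        field[i] + diffusivity * (field[(i - 1) % n] - 2 * field[i] + field[(i + 1) % n])
        for i in range(n)
    ]

# A sharp "disturbance" smooths out as the simulation is stepped forward.
state = [0.0] * 8
state[4] = 1.0
for _ in range(5):
    state = step(state)
print([round(x, 3) for x in state])
```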

Q: Any new findings?

A: Our work showed the importance of moisture in the middle levels of the atmosphere, around 10,000 to 20,000 feet, in the regions where hurricanes tend to form. Higher-than-average moisture is much more favorable for hurricanes to form.

Q: In addition to computers, aircraft and satellites, are there any new tools that hurricane scientists are now using to facilitate their research?

A: The new generation of satellites, such as the new GOES 16 which recently became operational, are excellent. They make it much easier to see what is going on in these storms. The other developing advance is the use of drones. There are large drones, such as the NASA Global Hawk aircraft, which is about the size of a corporate jet and can fly over a hurricane for 24 hours straight. And there are small drones that can be dropped into a hurricane out of one of the NOAA aircraft, and can get much closer to the ocean’s surface than the NOAA planes (with people in them) are allowed to fly.

Q: What’s next for hurricane science?

A: Many scientists these days are trying to better understand “rapid intensification,” which is when a hurricane’s winds increase by two or more categories in a single day. But there has been a lot of progress on that, and the computer models have become pretty good at predicting this, just as they are for Hurricane Florence right now.

The other very popular topic is how hurricane activity will (or will not) change with global warming. While everyone seems to think it will make it worse, there is no proof of that yet.

Q: As a hurricane researcher, is there some scenario that keeps researchers up at night?

A: I think it does make us more aware that bad events can and will happen. But we also understand that the chances of it happening to any one place is also very small.

Q: The National Weather Service website has a list of common misperceptions about hurricanes. What do you think are the most common ones people have?

A: I’m not sure about most common. But one that I think is most dangerous is that many people have the perception that they have experienced hurricane conditions before. Many people experience fringe effects of a hurricane and think they have been through a hurricane. Real hurricane conditions (sustained winds of 75 mph or higher) are actually much worse than people realize.


In theory, global warming (whatever its causes) should produce more moisture and extreme rainfall. In practice there is no evidence that this has happened. It is also not clear that extreme weather events are more severe than in the past, or that hurricanes are more frequent. The idea of a Category 6 hurricane is an alarmist fantasy, akin to the notion of a geologic period called the “anthropocene.” “Climate Change” is still something we see in the rear-view mirror, not a causal agent in nature.

Hurricanes Not a Problem for Nuclear Power

NASA-NOAA satellite image of the Atlantic Ocean captured on September 11, 2018 at 11:45 AM EDT showing Hurricane Florence approaching the east coast with Tropical Storm Isaac and Hurricane Helene fast on her heels. (NASA/NOAA)

Dr. James Conca explains in his Forbes article Hurricane Florence No Problem For Nuclear Power Plants. Excerpts in italics with my bolds.

Along with most everyone else, nuclear power plants in North and South Carolina, as well as Virginia, have been preparing for the natural onslaught.

Hurricane Florence will most likely hit two nuclear power plants operated by Duke Energy – their 1,870 megawatt (MW) Brunswick and 932MW Harris nuclear plants in North Carolina. If Florence turns north, it could also hit Dominion Energy’s 1,676MW Surry plant in Virginia. Brunswick is expected to get a direct hit.

The Brunswick Nuclear Power Plant, two miles north of Southport, North Carolina will get a direct hit by Hurricane Florence. But there’s no worry as nuclear plants are the most resistant to severe weather of all energy sources. The plant produces over 15 billion kWhs a year and provides power to over 4 million people. (Doc Searls)

The United States Nuclear Regulatory Commission (NRC) is watching carefully. But no one is really worried that much will happen, contrary to lots of antinuclear fearmongering. Power outages will occur as lines and transformers are destroyed and non-nuclear buildings get damaged, and it might take a few days to a few weeks to bring power back up, something that affects all energy sources.

‘We anticipate Hurricane Florence to be an historic storm that will impact all customers,’ said Grace Rountree, a spokeswoman for Duke. These reactors provide power to about 4 million customers in the two Carolinas.

The Brunswick plant has withstood several hurricanes since the two reactors there began operation in the mid-1970s, including Category 3 Hurricane Diana in 1984 and Category 3 Hurricane Fran in 1996. Category 4 Hurricane Hugo, the most often-compared with Florence, made landfall about 150 miles southwest of Brunswick in South Carolina in 1989.

Following protocols, the reactors at the nuclear plants have started shutting down before the hurricane is scheduled to arrive. While all nuclear reactors are protected against extreme winds, including tornado-strength gusts up to 300 mph, they shut down as a protective measure.

Food, water and other necessities are kept onsite at these nuclear plants to prepare for potential isolation of the site, and staff needed during the storm are brought in to ensure proper resources are available for an extended period.

The Carolinas have a heavy concentration of power reactors – 12 of the country’s 99 reactors. Four more reactors are in Virginia and five are in coastal Delaware and Maryland. These reactors provide enough electricity to power 30 cities the size of Raleigh.

Nuclear is the only energy source immune to all extreme weather events – by design. Plants have steel-reinforced concrete containments with over 4-foot thick walls. The buildings housing the reactors, vital equipment and used fuel have steel-reinforced concrete walls up to 7 feet thick, which are built to withstand any category hurricane or tornado. They can even withstand a plane flying directly into them.

Whether it’s hurricanes, floods, earthquakes, heat waves or severe cold, nuclear performs more reliably than anything else. There’s no better reason to retain our nuclear fleet, and even expand it, to give us a diverse energy mix that can handle any natural disaster that can occur.

James Conca

I have been a scientist in the field of the earth and environmental sciences for 33 years, specializing in geologic disposal of nuclear waste, energy-related research, planetary surface processes, radiobiology and shielding for space colonies, subsurface transport and environmental clean-up of heavy metals. I am a Trustee of the Herbert M. Parker Foundation, Adjunct at WSU, an Affiliate Scientist at LANL and consult on strategic planning for the DOE, EPA/State environmental agencies, and industry including companies that own nuclear, hydro, wind farms, large solar arrays, coal and gas plants. I also consult for EPA/State environmental agencies and industry on clean-up of heavy metals from soil and water. For over 25 years I have been a member of Sierra Club, Greenpeace, the NRDC, the Environmental Defense Fund and many others, as well as professional societies including the American Nuclear Society, the American Chemical Society, the Geological Society of America and the American Association of Petroleum Geologists.

Florence Not a Climate Fortune Teller

Some rare and timely common sense from David Kreutzer: Hurricane Florence Is Not an Omen About Climate Change. Excerpts from the Daily Signal article in italics with my bolds.

In today’s hyper-politicized world of climate science, hardly a thunderstorm passes without somebody invoking the “scientists say” trope to blame it on carbon emissions.

The logic seems to be: If it’s bad, it was caused by carbon emissions, and we are only going to see more and worse. More and worse floods, droughts, tornadoes, and of course, hurricanes.

The problem with this argument is that overall, we are not seeing more floods, droughts, tornadoes, or hurricanes in spite of the steady rise in the small amount of carbon dioxide, and in spite of the mild warming of the planet. The data show that there is no significant upward trend in any of these weather events.

These are not the conclusions of climate skeptics. They are conclusions drawn by the Intergovernmental Panel on Climate Change and our own National Oceanic and Atmospheric Administration.

This week, the Carolina coast and some yet-to-be-determined inland counties will endure the heavy and destructive rains of Hurricane Florence. Without a doubt, some places will see records broken.

As the hurricane arrives, talking heads will hit the airwaves claiming that “scientists say” it was caused by carbon emissions. Some may spin it more subtly, saying that while we cannot identify which storms are caused by increased levels of atmospheric carbon dioxide, the storms today are getting stronger and more frequent.

But this simply is not true. We are not seeing more frequent hurricanes, nor are we seeing a greater number of major hurricanes.

The Intergovernmental Panel on Climate Change said as much in its latest science report:

Current data sets indicate no significant observed trends in global tropical cyclone frequency over the past century and it remains uncertain whether any reported long-term increases in tropical cyclone frequency are robust, after accounting for past changes in observing capabilities. … No robust trends in annual numbers of tropical storms, hurricanes, and major hurricanes counts have been identified over the past 100 years in the North Atlantic basin.

Be on the alert for those who quote the Intergovernmental Panel on Climate Change as saying there has been an upward trend in hurricanes since the 1970s. That is a misleading claim. Hurricane landfalls actually fell for the decades before the 1970s.

Cherry-picking endpoints can produce “trends” that are either up or down. The fact is that for the past century, there is no trend.

Furthermore, there was never a time when the climate was stable (as some would claim), when weather events happened with smooth regularity. There have always been cycles—years and decades that included large numbers of hurricanes, and others with few.

Whether carbon dioxide levels rise, fall, or stay the same, we will continue to see hurricanes. Some of these hurricanes will be immensely destructive of both property and human life. Some will break records for wind and/or rain. And they will be tragic.

The fact that tragic weather events have not stopped is not evidence that carbon emissions are leading us to a climate catastrophe. Perhaps we will see a decades-long increase in one category or another (it has happened before), but that will not prove the predictions of catastrophic climate change one way or the other.

Even if all of the mild (though uneven) warming that seems to have occurred over the past century were due to man-made carbon emissions, that would still not be a reason to fear for the future. The overall story does not point to climate catastrophe.

But weather catastrophes will continue to strike, and we will still face the danger wrought by nature’s wrath. Hurricane Florence is shaping up to be exactly such a storm.

X-Weather is Back! Kerala edition


2018 Kerala India Flooding and Rainfall Event

Because of the obsession with CO2, flooding in Kerala, S. India, is the latest example of the desire to pin the blame on “climate change”. Jonathan Eden wrote a balanced article in WION: Why it’s so hard to detect the fingerprints of global warming on monsoon rains. Excerpts in italics with my bolds.

Unlike temperature, rainfall varies hugely in space and time. Even the most sophisticated climate models struggle to simulate physical processes such as convection and evaporation that drive rainfall activity. On top of that, global warming is not expected to change the frequency and intensity of rainfall extremes in the same way in all parts of the world.

The choice to focus solely on the rainfall itself is particularly relevant for flooding events. Though accusations of poor decision-making and mismanagement of water resources are beginning to appear in the Kerala aftermath, the floods simply would not have occurred without a significant amount of rain. Few of those suffering lost homes and livelihoods are likely to care much about where the rain came from or the intricacies of the weather conditions that led to it.

Disentangling these contributory factors takes time. In comparison to droughts and heatwaves, short-term hazards such as floods do not usually give us much chance to report concrete findings while the media and general public are still engaged in the event. In-depth studies may not publish their results for many months, sometimes even years after the event in question.

Many of these issues are not exclusive to extreme rainfall. The excellent US National Academies report on Attribution of Extreme Weather Events in the Context of Climate Change describes the shortcomings in our efforts to attribute a variety of extremes. But for rainfall in particular there is a discrepancy between what we understand about the general effect of global warming and our rather lesser ability to quantify the climate change fingerprint on specific events.

2017 Hurricane Harvey X-Weathermen Event

With Hurricane Harvey making landfall in Texas as a Cat 4, the storm drought is over and claims of linkage to climate change can be expected. So far (mercifully) articles in Time and the Washington Post have been more circumspect than in the past. Has it become more respectable to examine the proof behind wild claims? This post provides background on the X-Weathermen working hard to claim extreme weather as proof of climate change.

In the past the media has been awash with claims of “human footprints” in extreme weather events, with headlines like these:

“Global warming is making hot days hotter, rainfall and flooding heavier, hurricanes stronger and droughts more severe.”

“Global climate change is making weather worse over time”

“Climate change link to extreme weather easier to gauge”– U.S. Report

“Heat Waves, Droughts and Heavy Rain Have Clear Links to Climate Change, Says National Academies”

That last one refers to a paper released by the National Academy of Sciences Press: Attribution of Extreme Weather Events in the Context of Climate Change (2016)

And as usual, the headline claims are unsupported by the actual text. From the NAS report (here): (my bolds)

Attribution studies of individual events should not be used to draw general conclusions about the impact of climate change on extreme events as a whole. Events that have been selected for attribution studies to date are not a representative sample (e.g., events affecting areas with high population and extensive infrastructure will attract the greatest demand for information from stakeholders) P 107

Systematic criteria for selecting events to be analyzed would minimize selection bias and permit systematic evaluation of event attribution performance, which is important for enhancing confidence in attribution results. Studies of a representative sample of extreme events would allow stakeholders to use such studies as a tool for understanding how individual events fit into the broader picture of climate change. P 110

Correctly done, attribution of extreme weather events can provide an additional line of evidence that demonstrates the changing climate, and its impacts and consequences. An accurate scientific understanding of extreme weather event attribution can be an additional piece of evidence needed to inform decisions on climate change related actions. P. 112

The Indicative Without the Imperative


The antidote to such feverish reporting is provided by Mike Hulme in a publication: Attributing Weather Extremes to ‘Climate Change’: a Review (here).

He has an insider’s perspective on this issue, and is certainly among the committed on global warming (color him concerned). Yet here he writes objectively to inform us on X-weather, without advocacy: real science journalism and a public service, really. Excerpts below with my bolds.


In this third and final review I survey the nascent science of extreme weather event attribution. The article proceeds by examining the field in four stages: motivations for extreme weather attribution, methods of attribution, some example case studies and the politics of weather event attribution.

The X-Weather Issue

As many climate scientists can attest, following the latest meteorological extreme one of the most frequent questions asked by media journalists and other interested parties is: ‘Was this weather event caused by climate change?’

In recent decades the meaning of climate change in popular western discourse has changed from being a descriptive index of a change in climate (as in ‘evidence that a climatic change has occurred’) to becoming an independent causative agent (as in ‘climate change caused this event to happen’). Rather than being a descriptive outcome of a chain of causal events affecting how weather is generated, climate change has been granted power to change worlds: political and social worlds as much as physical and ecological ones.

To be more precise then, what people mean when they ask the ‘extreme weather blame’ question is: ‘Was this particular weather event caused by greenhouse gases emitted from human activities and/or by other human perturbations to the environment?’ In other words, can this meteorological event be attributed to human agency as opposed to some other form of agency?

The Motivations

Hulme shows what drives scientists to pursue the “extreme weather blame” question, noting four motivational factors.

Why have climate scientists over the last ten years embarked upon research to provide an answer beyond the stock phrase ‘no individual weather event can directly be attributed to greenhouse gas emissions’?  There seem to be four possible motives.

The first is because the question piques the scientific mind; it acts as a spur to develop new rational understanding of physical processes and new analytic methods for studying them.

A second argument, put forward by some, is that it is important to know whether or not specific instances of extreme weather are human-caused in order to improve the justification, planning and execution of climate adaptation.

A third argument for pursuing an answer to the ‘extreme weather blame’ question is inspired by the possibility of pursuing legal liability for damages caused. . . If specific loss and damage from extreme weather can be attributed to greenhouse gas emissions – even if expressed in terms of increased risk rather than deterministically – then lawyers might get interested.

The liability motivation for research into weather event attribution also bisects the new international political agenda of ‘loss and damage’ which has emerged in the last two years. . . The basic idea is to give recognition that loss and damage caused by climate change is legitimate ground for less developed countries to gain access to new international climate adaptation funds.

4. Persuasion
A final reason for scientists to be investing in this area of climate science – a reason stated explicitly less often than the ones above and yet one which underlies much of the public interest in the ‘extreme weather blame’ question – is frustration with and argument about the invisibility of climate change. . . If this is believed to be true – that only scientists can make climate change visible and real –then there is extra onus on scientists to answer the ‘extreme weather blame’ question as part of an effort to convince citizens of the reality of human-caused climate change.

Attribution Methods

Attributing extreme weather events to human influences requires different approaches, of which four broad categories can be identified.

1. Physical Reasoning
The first and most general approach to attributing extreme weather phenomena to rising greenhouse gas concentrations is to use simple physical reasoning.

General physical reasoning can only lead to broad qualitative statements such as ‘this extreme weather is consistent with’ what is known about the human-enhanced greenhouse effect. Such statements offer neither deterministic nor stochastic answers and clearly underdetermine the ‘weather blame question.’ It has given rise to a number of analogies to try to communicate the non-deterministic nature of extreme event attribution. The three most widely used ones concern a loaded die (the chance of rolling a ‘6’ has increased, but no single ‘6’ can be attributed to the biased die), the baseball player on steroids (the number of home runs hit increases, but no single home run can be attributed to the steroids) and the speeding car-driver (the chance of an accident increases in dangerous conditions, but no specific accident can be attributed to the fast-driving).

2. Classical Statistical Analysis
A second approach is to use classical statistical analysis of meteorological time series data to determine whether a particular weather (or climatic) extreme falls outside the range of what a ‘normal’ unperturbed climate might have delivered.

All such extreme event analyses of meteorological time series are at best able to detect outliers, but can never be decisive about possible cause(s). A different time series approach therefore combines observational data with model simulations and seeks to determine whether trends in extreme weather predicted by climate models have been observed in meteorological statistics (e.g. Zwiers et al., 2011, for temperature extremes and Min et al., 2011, for precipitation extremes). This approach is able to attribute statistically a trend in extreme weather to human influence, but not a specific weather event. Again, the ‘weather blame question’ remains underdetermined.


3. Fractional Attributable Risk (FAR)
Taking inspiration from the field of epidemiology, this method seeks to establish the Fractional Attributable Risk (FAR) of an extreme weather (or short-term climate) event. It asks the counterfactual question, ‘How might the risk of a weather event be different in the presence of a specific causal agent in the climate system?’

The single observational record available to us, and which is analysed in the statistical methods described above, is inadequate for this task. The solution is to use multiple model simulations of the climate system, first of all without the forcing agent(s) accused of ‘causing’ the weather event and then again with that external forcing introduced into the model.

The credibility of this method of weather attribution can be no greater than the overall credibility of the climate model(s) used – and may be less, depending on the ability of the model in question to simulate accurately the precise weather event under consideration at a given scale (e.g. a heatwave in continental Europe, a rain event in northern Thailand) (see Christidis et al., 2013a).
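
The FAR calculation Hulme describes reduces to a one-line formula, FAR = 1 − P(natural)/P(forced). Here is a Monte-Carlo sketch with made-up ensembles; the threshold, distributions and sample counts are purely illustrative, not drawn from any actual attribution study:

```python
import random

random.seed(42)

def exceedance_probability(samples, threshold):
    """Fraction of ensemble members in which the extreme occurs."""
    return sum(s > threshold for s in samples) / len(samples)

# Two hypothetical model ensembles of, say, summer peak temperatures:
# one "natural" world and one with the external forcing included.
natural = [random.gauss(30.0, 2.0) for _ in range(100_000)]
forced = [random.gauss(31.0, 2.0) for _ in range(100_000)]

threshold = 35.0  # the observed extreme
p0 = exceedance_probability(natural, threshold)
p1 = exceedance_probability(forced, threshold)
far = 1 - p0 / p1  # Fractional Attributable Risk
print(f"P(natural)={p0:.4f}  P(forced)={p1:.4f}  FAR={far:.2f}")
```

Note the probabilistic framing: the result says the forcing raised the risk of exceeding the threshold, not that it caused any particular event, which is exactly the loaded-die caveat above.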

4. Eco-systems Philosophy
A fourth, more philosophical, approach to weather event attribution should also be mentioned. This is the argument that since human influences on the climate system as a whole are now clearly established – through changing atmospheric composition, altered land surface characteristics, and so on – there can no longer be such a thing as a purely natural weather event. All weather — whether it be a raging tempest or a still summer afternoon — is now attributable to human influence, at least to some extent. Weather is the local and momentary expression of a complex system whose functioning as a system is now different to what it would otherwise have been had humans not been active.

Results from Weather Attribution Studies

Hulme provides a table of numerous such studies using various methods, along with his view of the findings.

It is likely that attribution of temperature-related extremes using FAR methods will always be more attainable than for other meteorological extremes such as rainfall and wind, which climate models generally find harder to simulate faithfully at the spatial scales involved. As discussed below, this limitation on which weather events and in which regions attribution studies can be conducted will place important constraints on any operational extreme weather attribution system.

Political Dimensions of Weather Attribution

Hulme concludes by discussing the political hunger for scientific proof in support of policy actions.

But Hulme et al. (2011) show why such ambitious claims are unlikely to be realised. Investment in climate adaptation, they claim, is most needed “… where vulnerability to meteorological hazard is high, not where meteorological hazards are most attributable to human influence” (p.765). Extreme weather attribution says nothing about how damages are attributable to meteorological hazard as opposed to exposure to risk; it says nothing about the complex political, social and economic structures which mediate physical hazards.

And separating weather into two categories, ‘human-caused’ weather and ‘tough-luck’ weather, raises practical and ethical concerns about any subsequent investment allocation guidelines that would exclude the victims of ‘tough-luck’ weather from benefiting from adaptation funds.

Contrary to the claims of some weather attribution scientists, the loss and damage agenda of the UNFCCC, as it is currently emerging, makes no distinction between ‘human-caused’ and ‘tough-luck’ weather. “Loss and damage impacts fall along a continuum, ranging from ‘events’ associated with variability around current climatic norms (e.g., weather-related natural hazards) to [slow-onset] ‘processes’ associated with future anticipated changes in climatic norms” (Warner et al., 2012:21). Although definitions and protocols have not yet been formally ratified, it seems unlikely that there will be a role for the sort of forensic science being offered by extreme weather attribution science.


Thank you Mike Hulme for a sane, balanced and expert analysis. It strikes me as another element in a “Quiet Storm of Lucidity”.

Is that light the end of the tunnel or an oncoming train?

August 2018 Atlantic Hurricane Outlook

Note: MDR refers to the Main Development Region for tropical storms.

Joe D’Aleo provides the explanation (H/T iceagenow) in Then the Rains Came.

It has been a changeable and at times extreme spring and summer. The cold and snow of March gave way to a cold April and some very chilly days well into the spring. Warmth with some very hot days followed in the early to mid summer. Then the rains came with strong thunderstorms. The wet August spell put an end to the borderline drought of the last few years. It became very muggy, keeping nighttime temperatures up and air conditioners on.

The changes all have to do with the wind direction. The jet stream brought chilly air masses (and snow) into eastern Canada even into June. The winds around these cool air masses turn to the northeast here in New England, coming in off cool land and water. Then increasingly warm air masses built north into the Canadian prairies and came east, and the surface winds turned northwesterly. In summer, warm air crossing the Appalachians and sinking down into the Merrimack Valley and coast heats by compression by 5F or more. Our hottest days come with these ‘downslope winds’. Historically, all the 100F days have come with a west to northwest wind.
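That 5F figure is consistent with simple dry-adiabatic compression: unsaturated descending air warms at roughly 9.8 C per kilometer of descent. A back-of-envelope sketch, where the 300 m (about 1000 ft) descent is an illustrative assumption rather than a measured elevation drop:

```python
# Dry adiabatic warming of descending air: a rough check on the
# ~5F downslope heating described above.
DRY_ADIABATIC_LAPSE_C_PER_M = 0.0098  # about 9.8 C per km of descent

def compression_warming_f(descent_m):
    """Temperature gain (deg F) for unsaturated air descending descent_m meters."""
    delta_c = DRY_ADIABATIC_LAPSE_C_PER_M * descent_m
    return delta_c * 9.0 / 5.0

# Air sinking ~300 m (about 1000 ft) from the ridges into the valley:
print(f"{compression_warming_f(300):.1f} F")  # prints "5.3 F"
```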

When late July and August came, our surface winds turned southwesterly.

This change was caused by a sharp cooling of the subtropical Atlantic Ocean waters relative to 2017.

The cool subtropical Atlantic pattern leads to a stronger than normal Atlantic high pressure system called the ‘Bermuda High’. In these patterns, this regular feature of our climate expands south and west and acts as a pump for moisture, much like the monsoon flow of southeastern and eastern Asia. This causes nights to be warm and muggy, and days very warm and showery here in the east.

Uncomfortable, yes; unprecedented heat, no. You may be surprised that most of the extreme heat records for the region, the country and the world occurred in the early 20th century or earlier. The 1930s was the record decade for the United States as a whole. For the east, the 1950s was the warmest decade, but extreme heat has occurred even in cold periods.

Summary: Effect on the Hurricane Season

As a general rule, when the subtropical Atlantic is warm, we have more hurricane activity, when it is cool, we have fewer storms.

Why the Left Coast is Burning

It is often said that truth is the first casualty in the fog of war. That is especially true of the war against fossil fuels and the smoke from wildfires. The forests are burning in California, Oregon and Washington, all of them steeped in liberal, progressive and post-modern ideology. There are human reasons that fires are out of control in those places, and CO2 emissions are not among them. As we shall see, Zinke is right and Brown is wrong. Here are some truths the media are not telling you in their drive to blame global warming/climate change. Text below is excerpted from sources linked at the end.

1. The World and the US are not burning.

The geographic extent of this summer’s forest fires won’t come close to the aggregate record for the U.S. Far from it. Yes, there are some terrible fires now burning in California, Oregon, and elsewhere, and the total burnt area this summer in the U.S. is likely to exceed the 2017 total. But as the chart above shows, the burnt area in 2017 was less than 20% of the record set way back in 1930. The same is true of the global burnt area, which has declined over many decades.

In fact, this 2006 paper reported the following:

“Analysis of charcoal records in sediments [31] and isotope-ratio records in ice cores [32] suggest that global biomass burning during the past century has been lower than at any time in the past 2000 years. Although the magnitude of the actual differences between pre-industrial and current biomass burning rates may not be as pronounced as suggested by those studies [33], modelling approaches agree with a general decrease of global fire activity at least in past centuries [34]. In spite of this, fire is often quoted as an increasing issue around the globe [11,26–29].”

People have a tendency to exaggerate the significance of current events. Perhaps the youthful can be forgiven for thinking hot summers are a new phenomenon. Incredibly, more “seasoned” folks are often subject to the same fallacies. The fires in California have so impressed climate alarmists that many of them truly believe global warming is the cause of forest fires in recent years, including the confused bureaucrats at Cal Fire, the state’s firefighting agency. Of course, the fires have given fresh fuel to self-interested climate activists and pressure groups, an opportunity for greater exaggeration of an ongoing scare story.

This year, however, and not for the first time, a high-pressure system has been parked over the West, bringing southern winds up the coast along with warmer waters from the south, keeping things warm and dry inland. It’s just weather, though a few arsonists and careless individuals always seem to contribute to the conflagrations. Beyond all that, the impact of a warmer climate on the tendency for biomass to burn is considered ambiguous for realistic climate scenarios.

2. Public forests are no longer managed due to litigation.

According to a 2014 white paper titled ‘Twenty Years of Forest Service Land Management Litigation’ by Amanda M.A. Miner, Robert W. Malmsheimer, and Denise M. Keele: “This study provides a comprehensive analysis of USDA Forest Service litigation from 1989 to 2008. Using a census and improved analyses, we document the final outcome of the 1125 land management cases filed in federal court. The Forest Service won 53.8% of these cases, lost 23.3%, and settled 22.9%. It won 64.0% of the 669 cases decided by a judge based on cases’ merits. The agency was more likely to lose and settle cases during the last six years; the number of cases initiated during this time varied greatly. The Pacific Northwest region along with the Ninth Circuit Court of Appeals had the most frequent occurrence of cases. Litigants generally challenged vegetative management (e.g. logging) projects, most often by alleging violations of the National Environmental Policy Act and the National Forest Management Act. The results document the continued influence of the legal system on national forest management and describe the complexity of this litigation.”

There is abundant evidence that when any forest project posits vegetative management, salvage or otherwise, as a pretense for a logging operation, litigation is likely to ensue; in addition to NEPA, the USFS uses the Property Clause to address any potential removal of ‘forest products’. Nevertheless, the USFS currently spends more than 50% of its total budget, about $1.8 billion annually, on wildfire suppression alone, while spending little on wildfire prevention.

3. Mega fires are the unnatural result of fire suppression.

And what of the “mega-fires” burning in the West, like the huge Mendocino Complex Fire and last year’s Thomas Fire? Unfortunately, many decades of fire suppression measures — prohibitions on logging, grazing, and controlled burns — have left the forests with too much dead wood and debris, especially on public lands. From the last link:

“Oregon, like much of the western U.S., was ravaged by massive wildfires in the 1930s during the Dust Bowl drought. Megafires were largely contained due to logging and policies to actively manage forests, but there’s been an increasing trend since the 1980s of larger fires.

Active management of the forests and logging kept fires at bay for decades, but that largely ended in the 1980s over concerns about old growth trees and the northern spotted owl. Lawsuits from environmental groups hamstrung logging and government planners cut back on thinning trees and road maintenance.

[Bob] Zybach [a forester] said Native Americans used controlled burns to manage the landscape in Oregon, Washington and northern California for thousands of years. Tribes would burn up to 1 million acres a year on the west coast to prime the land for hunting and grazing, Zybach’s research has shown.

‘The Indians had lots of big fires, but they were controlled,’ Zybach said. ‘It’s the lack of Indian burning, the lack of grazing’ and other active management techniques that caused fires to become more destructive in the 19th and early 20th centuries before logging operations and forest management techniques got fires under control in the mid-20th Century.”

4. Bad federal forest administration started in 1990s.

Bob Zybach feels like a broken record. Decades ago he warned government officials that allowing Oregon’s forests to grow unchecked by proper management would result in catastrophic wildfires.

While some want to blame global warming for the uptick in catastrophic wildfires, Zybach said a change in forest management policies is the main reason Americans are seeing a return to more intense fires, particularly in the Pacific Northwest and California where millions of acres of protected forests stand.

“We knew exactly what would happen if we just walked away,” Zybach, an experienced forester with a PhD in environmental science, told The Daily Caller News Foundation.

Zybach spent two decades as a reforestation contractor before heading to graduate school in the 1990s. Then the Clinton administration in 1994 introduced its plan to protect old growth trees and spotted owls by strictly limiting logging.  Less logging also meant government foresters weren’t doing as much active management of forests — thinnings, prescribed burns and other activities to reduce wildfire risk.

Zybach told Evergreen magazine that year the Clinton administration’s plan for “naturally functioning ecosystems” free of human interference ignored history and would fuel “wildfires reminiscent of the Tillamook burn, the 1910 fires and the Yellowstone fire.”

Between 1952 and 1987, western Oregon saw only one major fire above 10,000 acres. The region’s relatively fire-free streak ended with the Silver Complex Fire of 1987 that burned more than 100,000 acres in the Kalmiopsis Wilderness area, torching rare plants and trees the federal government set aside to protect from human activities. The area has burned several more times since the 1980s.

“Mostly fuels were removed through logging, active management — which they stopped — and grazing,” Zybach said in an interview. “You take away logging, grazing and maintenance, and you get firebombs.”

Now, Oregonians are dealing with 13 wildfires engulfing 185,000 acres. California is battling nine fires scorching more than 577,000 acres, mostly in the northern forested parts of the state managed by federal agencies.

The Mendocino Complex Fire quickly spread to become the largest wildfire in California since the 1930s, engulfing more than 283,000 acres. The previous wildfire record was set by 2017’s Thomas Fire that scorched 281,893 acres in Southern California.

While bad fires still happen on state and private lands, most of the massive blazes happen on or around lands managed by the U.S. Forest Service and other federal agencies, Zybach said. Poor management has turned western forests into “slow-motion time bombs,” he said.

A feller buncher removing small trees that act as fuel ladders and transmit fire into the forest canopy.

5. True environmentalism is not nature love, but nature management.

While wildfires do happen across the country, poor management by western states has turned entire regions into tinderboxes. Letting nature play out its course so close to civilization is the course California and Oregon have chosen.

Many in heartland America and along the Eastern Seaboard see logging and firelines when they travel to a rural area. This is part and parcel of life outside the city, where everyone knows that a few minor eyesores make their houses and communities safer from the primal fury of wildfires.

In other words, leaving the forests to “nature” and protecting the endangered Spotted Owl created denser forests (300-400 trees per acre rather than 50-80) with more fuel from the 129 million diseased and dead trees, producing more intense and destructive fires. Yet California spends more than ten times as much money on electric vehicle subsidies ($335 million) as on reducing fuel in a mere 60,000 of 33 million acres of forests ($30 million).

Rancher Ross Frank worries that funding to fight fires in Western communities like Chumstick, Wash., has crowded out important land management work. Rowan Moore Gerety/Northwest Public Radio

Once again, global warming “science” is a camouflage for political ideology and gratifying myths about nature and human interactions with it. On the one hand, progressives seek “crises” that justify more government regulation and intrusion that limit citizen autonomy and increase government power. On the other, well-nourished moderns protected by technology from nature’s cruel indifference to all life can afford to indulge myths that give them psychic gratification at little cost to their daily lives.

As usual, bad cultural ideas lie behind these policies and attitudes. Most important is the modern fantasy that before civilization human beings lived in harmony and balance with nature. The rise of cities and agriculture began the rupture with the environment, “disenchanting” nature and reducing it to mere resources to be exploited for profit. In the early 19th century, the growth of science that led to the industrial revolution inspired the Romantic movement to contrast industrialism’s “Satanic mills” and the “shades of the prison-house” with a superior natural world and its “beauteous forms.” In an increasingly secular age, nature now became the Garden of Eden, and technology and science the signs of the fall that banished us from the paradise enjoyed by humanity before civilization.

The untouched nature glorified by romantic environmentalism, then, is not our home. Ever since the cave men, humans have altered nature to make it more conducive to human survival and flourishing. After the retreat of the ice sheets changed the environment and animal species on which people had depended for food, humans in at least four different regions of the world independently invented agriculture to better manage the food supply. Nor did the American Indians, for example, live “lightly on the land” in a pristine “forest primeval.” They used fire to shape their environment for their own benefit. They burned forests to clear land for cultivation, to create pathways to control the migration of bison and other game, and to promote the growth of trees more useful for them.

Remaining trees and vegetation on the forest floor are more vigorous after removal of small trees for fuels reduction.

And today we continue to improve cultivation techniques and foods to make them more reliable, abundant, and nutritious, not to mention more various and safe. We have been so successful at managing our food supply that today one person out of ten provides food that used to require nine out of ten, obesity has become the plague of poverty, and famines result from political dysfunction rather than nature.

That’s why untouched nature, the wild forests filled with predators, has not been our home. The cultivated nature improved by our creative minds has. True environmentalism is not nature love, but nature management: applying skill and technique to make nature more useful for humans, at the same time conserving resources so that those who come after us will be able to survive. Managing resources and exploiting them for our benefit without destroying them is how we should approach the natural world. We should not squander resources or degrade them, not because of nature, but because when we do so, we are endangering the well-being of ourselves and future generations.


The annual burnt area from wildfires has declined over the past ninety years both in the U.S. and globally. Even this year’s wildfires are unlikely to come close to the average burn extent of the 1930s. The large wildfires this year are due to a combination of decades of poor forest management along with a weather pattern that has trapped warm, dry air over the West. The contention that global warming has played a causal role in the pattern is balderdash, but apparently that explanation seems plausible to the uninformed, and it is typical of the propaganda put forward by climate change interests.


Footnote:  So how do you want your forest fires, some small ones now or mega fires later?