Previous posts described how the Northwest Passage was treacherously laden with ice this year. The image above shows the flash freezing in this region over the last four days. On Sept. 19 the Canadian Arctic Archipelago (CAA) ice extent was 320k km2, close to its annual minimum. Yesterday MASIE showed 450k km2, a 40% increase.
The graph shows MASIE reporting ice extents totalling 4.74M km2 yesterday, 124k km2 above the 11-year average (2007 to 2017 inclusive). NOAA’s Sea Ice Index is 119k km2 lower; 2007 was 462k km2 lower, and 2012 had 1.2M km2 less ice extent. A dip on day 252 to 4.43M km2 will certainly be the daily minimum for the year. With typical refreezing to month end, we can expect the September monthly average to exceed 2007 by at least 300k to 400k km2.
People are overthinking and over-analyzing Arctic Ice extents, and getting wrapped around the axle (or should I say axis). So let’s keep it simple and we can all readily understand what is happening up North.
I will use the ever popular NOAA dataset derived from satellite passive microwave sensors. It sometimes understates the ice extents, but everyone refers to it and it is complete from 1979 to 2017. Here’s what NOAA reports (in M km2):
If I were adding this to the Ice House of Mirrors, the name would be The X-Ray Ice Mirror, because it looks into the structure of the time series. For even more clarity and simplicity, here is the table:
NOAA NH Annual Average Ice Extents (in M km2). Sea Ice Index v3.0 (here)
Year | Average | Change | Rate of Change
1979 | 12.328 | |
1994 | 12.011 | -0.317 | 0.021 per year
2007 | 10.474 | -1.537 | 0.118 per year
2017 | 10.393 | -0.081 | 0.008 per year
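The Change and Rate of Change columns are simple arithmetic on the annual averages. A minimal Python sketch, using the values copied from the table above:

```python
# Reproduce the Change and Rate of Change columns from the NOAA NH
# annual average ice extents above (values in M km2).
extents = {1979: 12.328, 1994: 12.011, 2007: 10.474, 2017: 10.393}

years = sorted(extents)
for prev, curr in zip(years, years[1:]):
    change = extents[curr] - extents[prev]   # M km2, negative = ice lost
    rate = abs(change) / (curr - prev)       # M km2 lost per year
    print(f"{prev}-{curr}: change {change:+.3f}, {rate:.3f} per year")
```

Running this reproduces the table exactly: 0.021 per year over 1979–1994, 0.118 per year over 1994–2007, and 0.008 per year over 2007–2017.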
The satellites involve rocket science, but this does not. There was a small loss of ice extent over the first 15 years, then a dramatic downturn for 13 years, at six times the previous rate. That was followed by the current plateau with virtually no further loss of ice extent. All the fuss is over that middle period, and we know what caused it. A lot of multi-year ice was flushed out through the Fram Strait, leaving behind more easily melted younger ice. The effects from that natural occurrence bottomed out in 2007.
Kwok et al say this about the Variability of Fram Strait ice flux:
The average winter area flux over the 18-year record (1978–1996) is 670,000 km2, ~7% of the area of the Arctic Ocean. The winter area flux ranges from a minimum of 450,000 km2 in 1984 to a maximum of 906,000 km2 in 1995… The average winter volume flux over the winters of October 1990 through May 1995 is 1745 km3, ranging from a low of 1375 km3 in the 1990 flux to a high of 2791 km3 in 1994.
Some complain it is too soon to say Arctic Ice is recovering, or that 2007 is a true change point. The same people were quick to jump on a declining period after 1994 as evidence of a “Death Spiral.”
Update September 25, 2018
Oh the irony! Tamino accuses me of emulating Michael Mann and “hiding the decline” of sea ice, as opposed to temperatures. In that case, where is my Nobel Prize?
But my humble efforts to tell the truth pale in comparison with Mann’s campaign to erase the medieval warm period. His temperature trickery is described in Rise and Fall of the Modern Warming Spike.
One week ago on day 252 MASIE reported the lowest daily extent of the year at 4.43M km2. One week later the image above shows how the ice edges have refrozen and extended. Note also the significant snowfall in both Canada and Russia.
In mid-September we can see that the long-predicted collapse of Arctic ice is postponed for yet another year. The graph shows MASIE reporting ice extents above 4.5M km2 for the month of September. A dip on day 252 to 4.43M km2 will likely be the daily minimum for the year, since 200k km2 of ice has been added in the last week. The graph also shows that 2018 is presently 96k km2 above the 11-year average ice extent, 350k km2 more than 2016, 472k km2 more than 2007, and 1.2M km2 (a full Wadham!) more than the record-setting 2012.
Interestingly, in September until yesterday NOAA’s officially referenced Sea Ice Index (SII) was showing more ice than MASIE, by about 200k km2. That means the SII September monthly result will continue the plateau in Arctic ice since 2007.
The table below shows ice extents in the various basins comprising the Arctic Ocean on day 259 for 2018 and 2007 in comparison to the 11-year averages (2007 to 2017 inclusive).
Region | 2018 (day 259) | Day 259 Average | 2018-Ave. | 2007 (day 259) | 2018-2007
(0) Northern_Hemisphere | 4622484 | 4526053 | 96431 | 4150314 | 472171
(1) Beaufort_Sea | 560233 | 471483 | 88750 | 515813 | 44420
(2) Chukchi_Sea | 163790 | 150187 | 13604 | 48053 | 115737
(3) East_Siberian_Sea | 467586 | 270891 | 196695 | 311 | 467275
(4) Laptev_Sea | 27262 | 140510 | -113248 | 238846 | -211584
(5) Kara_Sea | 235 | 23504 | -23268 | 52498 | -52263
(6) Barents_Sea | 0 | 23798 | -23798 | 7420 | -7420
(7) Greenland_Sea | 83223 | 203153 | -119930 | 329643 | -246420
(8) Baffin_Bay_Gulf_of_St._Lawrence | 40128 | 29596 | 10532 | 32287 | 7841
(9) Canadian_Archipelago | 346993 | 274872 | 72121 | 237669 | 109324
(10) Hudson_Bay | 6051 | 7085 | -1034 | 4355 | 1697
(11) Central_Arctic | 2926584 | 2929894 | -3309 | 2682144 | 244440
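The headline percentages follow directly from the day-259 table; a quick check in Python (extents in km2, copied from the table):

```python
# Verify the percentage comparisons from the day-259 regional table (km2).
nh_2018, nh_avg, nh_2007 = 4622484, 4526053, 4150314
esib_2018, esib_avg = 467586, 270891

def pct_diff(a, b):
    """Percent difference of a relative to b."""
    return 100 * (a - b) / b

print(f"2018 vs 11-year average:  {pct_diff(nh_2018, nh_avg):.1f}%")    # ~2%
print(f"2018 vs 2007:             {pct_diff(nh_2018, nh_2007):.1f}%")   # ~11%
print(f"East Siberian vs average: {pct_diff(esib_2018, esib_avg):.1f}%")  # ~73%
```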
Presently, 2018 is about 100k km2 or 2% above average, and 472k km2 or 11% more than 2007. The Eastern Arctic shows surpluses in East Siberian, Chukchi, Beaufort, and CAA. The Western Arctic shows deficits in Laptev, Kara, Barents and Greenland seas. The Central Arctic is matching average, while East Siberian sea ice is 73% above average.
As reported previously, the Northwest Passage through Nunavut was closed this year due to excessive and thick multiyear ice blocking the way. The chart below shows the conditions as of Sept. 14.
It won’t get any better than this for yachts attempting the passage west, since ice conditions of more than 3/10 (green) block their progress. A post at the Northwest Passage blog, S/V CRYSTAL Escape from Prince Regent Inlet, explains how the passage is closing.
Breaking through the ice corridor, already close to the shore, we suddenly saw something unusual. Ice that until then had been static began to flow very rapidly towards the shore! In this way it closed the route ahead of us and, much worse, cut off our retreat. We turned back and rushed to escape. It was the only option.
We pushed the throttle to the top, and the open water in front of the bow was disappearing before our eyes. Giant ice floes moved towards the shore like in a river current. We slipped out through channels with maybe a meter of clearance on either side, and the path behind us closed within a few dozen seconds.
After about thirty minutes of this crazy slalom we reached ice moving slowly enough that the danger of being trapped and pressed against the shore was averted. That was the good news. The bad news was that this time we were unable to get out of Prince Regent Inlet. We would wait for the next chance. (Note: the log is translated from Polish.)
Update September 19, 2018
Here is the latest ice chart showing that in Franklin strait, all the green conditions (<3/10) are now gone, while red and brown are taking over.
Bottom line: They succeeded in getting out and are now docked on the Greenland coast.
The questions are posed by NBC News, a source of many stories promoting climate alarm/activism. The answers are from David Nolan, professor and chair of the department of atmospheric sciences at the University of Miami’s Rosenstiel School of Marine and Atmospheric Sciences and a noted expert on hurricanes and tropical weather. Excerpts are in italics with my bolds.
Just where are we with hurricane science? What have we learned, and what questions remain to be answered? And what about the role of climate change in the formation and propagation of severe storms?
Q: The National Hurricane Center today upgraded Florence to a Category 4 storm. What exactly does that mean? A: It means that, by their best estimate, there are wind speeds somewhere at the surface of 130 miles per hour or greater. This estimate comes from a combination of satellite images, and, in this case, from NOAA [National Oceanic and Atmospheric Administration] aircraft that have been flying in Florence this morning.
Q: How many categories are there? A: The categories go from 1 to 5.
Q: Could there ever be a Category 6?
A: No. Fives themselves are very rare. And reaching higher speeds — like 170 or 180 mph — is extremely rare. So it doesn’t make sense to make a category for something that will still be extremely rare, even if it happens a little more, like once every five years instead of once every 10 years.
Q: What in general dictates whether a Category 4 storm will turn into an even more destructive Category 5?
A: The conditions that are most favorable are warm ocean temperatures, like above 85 degrees [Fahrenheit], and light winds in the larger environment around the storm. Storms become category 5 in only the most favorable conditions.
Satellite images show three Atlantic hurricanes, from left, Florence, Isaac and Helene. NOAA
Q: Is severe weather getting more severe and more frequent?
A: Whether or not severe weather is actually getting more severe is not clear. It is clear that the most extreme rainfall events have increased in frequency, and this is consistent with our understanding of how global warming will change the weather.
Q: Some hurricanes seem to hit land and then quickly dissipate, causing little damage. Others, like Harvey last year, give way to heavy rainfall and flooding. What determines which course a storm will take?
A: There are two factors. First, whether or not the storm keeps moving inland steadily, or whether it lingers near the coast. This is determined by the steering patterns of the larger atmosphere around it, as the hurricane is essentially carried along by the even larger weather systems around it.
Second, it depends on the kind of terrain the storm is over. In the case of Harvey, the land [in and around Houston] is relatively flat and smooth and also still near the Gulf of Mexico, so Harvey did not dissipate quickly.
Q: You said climate change seems to be changing global weather patterns to make extreme rainfall events more frequent. Can you explain exactly what’s happening?
A: The main reason is that warmer air can hold more water vapor. So when air rises and forms clouds and then rain, more water is released and then more water falls to the ground as rain.
Q: But there’s no evidence that climate change is making hurricanes more frequent?
A: There is not. Unfortunately, the existing modern records of hurricanes are only of good quality for about 60 years. Because hurricane activity varies so much from year to year, that’s not long enough to say for sure if there is a clear trend upward due to global warming.
There has been an enormous amount of research on whether TC numbers or strength will increase in the future because of global warming. But the results of those studies are mixed and sometimes contradictory, so we can’t make a conclusive statement yet. (TC refers to tropical cyclones (hurricanes) that occur each year, in each ocean.)
Q: What exactly is the difference between a hurricane and a cyclone?
A: Physically, they are the same thing. They are called hurricanes in the Atlantic and in the Eastern Pacific, typhoons in the West Pacific and cyclones around Australia and India.
Q: What causes these storms to form, and what makes them move as they do?
A: Hurricanes form when areas of disturbed weather — rain and thunderstorms — over the ocean start to organize into a swirling pattern. As the winds increase, they extract more and more energy and water from the ocean, thus getting stronger and larger. As for their motion, they are carried along by the larger weather patterns around them, the usual lows and highs that most people often see on weather maps.
Q: How big can hurricanes get?
A: The areas of hurricanes with significant weather (winds and rain) are usually about 200 miles across. Some can be larger, as much as 300 miles. Some are quite small, only 50 miles.
Q: Do they always swirl in the same direction?
A: In the northern hemisphere, they rotate counterclockwise. In the southern hemisphere, it is the opposite. They get their rotation from the Earth’s rotation, which has an opposite sense depending on whether you are in the Northern or Southern Hemisphere.
Q: You’re an expert in the use of computer modeling to study hurricanes. What have you learned from your research?
A: Most of my research has been about hurricane formation. We’ve used computer models to understand the physical processes by which hurricanes form. There are many “disturbances” over the oceans every summer, but most of them do not become hurricanes. We want to understand why some of them do.
Q: How does computer modeling work?
A: Computer models attempt to simulate the motions of the atmosphere. The first step is to assemble a digital “image” of the weather right now, much the same way that a camera image is made up of pixels of many different colors. But next, the model uses the laws of physics and mathematics to determine how each part of the atmosphere will change with time, as each is influenced by the other pixels around it.
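As a toy illustration of that pixel-by-pixel stepping, here is a deliberately simplified one-dimensional sketch (not a real weather model): a "weather" field on a ring of grid pixels, where each pixel's next value is computed from its neighbors, step by step in time.

```python
# Toy illustration of a gridded model: a 1-D field on a ring of "pixels",
# each updated from its upwind neighbor at every time step.
# Deliberately simplified -- real models solve full 3-D physics.
import math

n = 50                                    # number of grid "pixels"
u, dx, dt = 1.0, 1.0, 0.5                 # wind speed, grid spacing, time step
field = [math.exp(-((i - 10) ** 2) / 20.0) for i in range(n)]  # initial blob

for _ in range(40):                       # march the state forward in time
    # first-order upwind advection: each cell is nudged by the cell upwind
    field = [field[i] - u * dt / dx * (field[i] - field[i - 1])
             for i in range(n)]

peak = max(range(n), key=lambda i: field[i])
print(f"blob peak drifted from pixel 10 to pixel {peak}")
```

After 40 steps at half a pixel per step, the blob has drifted roughly 20 pixels downwind, which is the essence of how a forecast “carries” weather systems along.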
Q: Any new findings?
A: Our work showed the importance of moisture in the middle levels of the atmosphere, around 10,000 to 20,000 feet, in the regions where hurricanes tend to form. Higher-than-average moisture is much more favorable for hurricanes to form.
Q: In addition to computers, aircraft and satellites, are there any new tools that hurricane scientists are now using to facilitate their research?
A: The new generation of satellites, such as the new GOES 16 which recently became operational, are excellent. They make it much easier to see what is going on in these storms. The other developing advance is the use of drones. There are large drones, such as the NASA Global Hawk aircraft, which is about the size of a corporate jet and can fly over a hurricane for 24 hours straight. And there are small drones that can be dropped into a hurricane out of one of the NOAA aircraft, and can get much closer to the ocean’s surface than the NOAA planes (with people in them) are allowed to fly.
Q: What’s next for hurricane science?
A: Many scientists these days are trying to better understand “rapid intensification,” which is when a hurricane’s winds increase by two or more categories in a single day. But there has been a lot of progress on that, and the computer models have become pretty good at predicting this, just as they are for Hurricane Florence right now.
The other very popular topic is how hurricane activity will (or will not) change with global warming. While everyone seems to think it will make it worse, there is no proof of that yet.
Q: As a hurricane researcher, is there some scenario that keeps researchers up at night?
A: I think it does make us more aware that bad events can and will happen. But we also understand that the chances of it happening to any one place is also very small.
Q: The National Weather Service website has a list of common misperceptions about hurricanes. What do you think are the most common ones people have?
A: I’m not sure about most common. But one that I think is most dangerous is that many people have the perception that they have experienced hurricane conditions before. Many people experience fringe effects of a hurricane and think they have been through a hurricane. Real hurricane conditions (sustained winds of 75 mph or higher) are actually much worse than people realize.
Summary
In theory, global warming (for whatever causes) should produce more moisture and extreme rainfall. In practice there is no evidence that this has happened. It is also not clear that extreme weather events are more severe than in the past, or that hurricanes are more frequent. The idea of a category six hurricane is an alarmist fantasy, akin to the notion of a geologic period called the “anthropocene.” “Climate Change” is still something we see in the rear view mirror, not a causal agent in nature.
NASA-NOAA satellite image of the Atlantic Ocean captured on September 11, 2018 at 11:45 AM EDT showing Hurricane Florence approaching the east coast with Tropical Storm Isaac and Hurricane Helene fast on her heels. NASA/NOAA
Along with most everyone else, nuclear power plants in North and South Carolina, as well as Virginia, have been preparing for the natural onslaught.
Hurricane Florence will most likely hit two nuclear power plants operated by Duke Energy – their 1,870 megawatt (MW) Brunswick and 932MW Harris nuclear plants in North Carolina. If Florence turns north, it could also hit Dominion Energy’s 1,676MW Surry plant in Virginia. Brunswick is expected to get a direct hit.
The Brunswick Nuclear Power Plant, two miles north of Southport, North Carolina, will get a direct hit from Hurricane Florence. But there’s no worry, as nuclear plants are the most resistant to severe weather of all energy sources. The plant produces over 15 billion kWhs a year and provides power to over 4 million people. DOC SEARLS
The United States Nuclear Regulatory Commission (NRC) is watching carefully. But no one is really worried that much will happen, contrary to lots of antinuclear fearmongering. Power outages will occur as lines and transformers are destroyed and non-nuclear buildings get damaged, and it might take a few days to a few weeks to bring power back up, something that applies to all energy sources.
‘We anticipate Hurricane Florence to be an historic storm that will impact all customers,’ said Grace Rountree, a spokeswoman for Duke. These reactors provide power to about 4 million customers in the two Carolinas.
The Brunswick plant has withstood several hurricanes since the two reactors there began operation in the mid-1970s, including Category 3 Hurricane Diana in 1984 and Category 3 Hurricane Fran in 1996. Category 4 Hurricane Hugo, the most often-compared with Florence, made landfall about 150 miles southwest of Brunswick in South Carolina in 1989.
Following protocols, the reactors at the nuclear plants have started shutting down before the hurricane is scheduled to arrive. While all nuclear reactors are protected against extreme winds, including tornado-strength gusts up to 300 mph, they shut down as a protective measure.
Food, water and other necessities are kept onsite at these nuclear plants to prepare for potential isolation of the site, and staff needed during the storm are brought in to ensure proper resources are available for an extended period.
The Carolinas have a heavy concentration of power reactors – 12 of the country’s 99 reactors. Four more reactors are in Virginia and five are in coastal Delaware and Maryland. These reactors provide enough electricity to power 30 cities the size of Raleigh.
Nuclear is the only energy source immune to all extreme weather events – by design. Plants have steel-reinforced concrete containments with over 4-foot thick walls. The buildings housing the reactors, vital equipment and used fuel have steel-reinforced concrete walls up to 7 feet thick, which are built to withstand any category hurricane or tornado. They can even withstand a plane flying directly into them.
Whether it’s hurricanes, floods, earthquakes, heat waves or severe cold, nuclear performs more reliably than anything else. There’s no better reason to retain our nuclear fleet, and even expand it, to give us a diverse energy mix that can handle any natural disaster that can occur.
James Conca
I have been a scientist in the field of the earth and environmental sciences for 33 years, specializing in geologic disposal of nuclear waste, energy-related research, planetary surface processes, radiobiology and shielding for space colonies, subsurface transport and environmental clean-up of heavy metals. I am a Trustee of the Herbert M. Parker Foundation, Adjunct at WSU, an Affiliate Scientist at LANL and consult on strategic planning for the DOE, EPA/State environmental agencies, and industry including companies that own nuclear, hydro, wind farms, large solar arrays, coal and gas plants. I also consult for EPA/State environmental agencies and industry on clean-up of heavy metals from soil and water. For over 25 years I have been a member of Sierra Club, Greenpeace, the NRDC, the Environmental Defense Fund and many others, as well as professional societies including the American Nuclear Society, the American Chemical Society, the Geological Society of America and the American Association of Petroleum Geologists.
In today’s hyper-politicized world of climate science, hardly a thunderstorm passes without somebody invoking the “scientists say” trope to blame it on carbon emissions.
The logic seems to be: If it’s bad, it was caused by carbon emissions, and we are only going to see more and worse. More and worse floods, droughts, tornadoes, and of course, hurricanes.
The problem with this argument is that overall, we are not seeing more floods, droughts, tornadoes, or hurricanes in spite of the steady rise in the small amount of carbon dioxide, and in spite of the mild warming of the planet. The data show that there is no significant upward trend in any of these weather events.
These are not the conclusions of climate skeptics. They are conclusions drawn by the Intergovernmental Panel on Climate Change and our own National Oceanic and Atmospheric Administration.
This week, the Carolina coast and some yet-to-be-determined inland counties will endure the heavy and destructive rains of Hurricane Florence. Without a doubt, some places will see records broken.
As the hurricane arrives, talking heads will hit the airwaves claiming that “scientists say” it was caused by carbon emissions. Some may spin it more subtly, saying that while we cannot identify which storms are caused by increased levels of atmospheric carbon dioxide, the storms today are getting stronger and more frequent.
But this simply is not true. We are not seeing more frequent hurricanes, nor are we seeing a greater number of major hurricanes.
The Intergovernmental Panel on Climate Change said as much in its latest science report:
Current data sets indicate no significant observed trends in global tropical cyclone frequency over the past century and it remains uncertain whether any reported long-term increases in tropical cyclone frequency are robust, after accounting for past changes in observing capabilities. … No robust trends in annual numbers of tropical storms, hurricanes, and major hurricanes counts have been identified over the past 100 years in the North Atlantic basin.
Be on the alert for those who quote the Intergovernmental Panel on Climate Change as saying there has been an upward trend in hurricanes since the 1970s. That is a misleading claim. Hurricane landfalls actually fell for the decades before the 1970s.
Cherry-picking endpoints can produce “trends” that are either up or down. The fact is that for the past century, there is no trend.
Furthermore, there was never a time when the climate was stable (as some would claim), when weather events happened with smooth regularity. There have always been cycles—years and decades that included large numbers of hurricanes, and others with few.
Whether carbon dioxide levels rise, fall, or stay the same, we will continue to see hurricanes. Some of these hurricanes will be immensely destructive of both property and human life. Some will break records for wind and/or rain. And they will be tragic.
The fact that tragic weather events have not stopped is not evidence that carbon emissions are leading us to a climate catastrophe. Perhaps we will see a decades-long increase in one category or another; it has happened before. But that will not prove the predictions of catastrophic climate change one way or the other.
Even if all of the mild (though uneven) warming that seems to have occurred over the past century were due to man-made carbon emissions, that would still not be a reason to fear for the future. The overall story does not point to climate catastrophe.
But weather catastrophes will continue to strike, and we will still face the danger wrought by nature’s wrath. Hurricane Florence is shaping up to be exactly such a storm.
This week brings yet another unimaginable calamity predicted if the Paris Accord is not fulfilled. That’s right, the coordinated reports in the media raise the alarm: The Insects Are Coming For Us (unless we mend our ways!)
Global warming will help insects, hurt crops NBC News
Climate change may boost pests, stress food supplies Axios
Climate Change Will Lead To More Crop-Destroying Insects IFLScience
Global Warming Means More Insects Threatening Food Crops — A Lot More, Study Warns InsideClimate News
Global warming will likely help bugs devour more crops CBC.ca
Global warming could spur more and hungrier crop-eating bugs ABC News
Global warming could spur more crop-eating bugs CTV.ca
Global warming will make insects hungrier, eating up key crops: study AFP
Crop losses due to insects could nearly double in Europe’s bread basket due to climate EurekAlert!
Climate change projected to boost insect activity and crop loss, researchers say EurekAlert!
Rise in insect pests under climate change to hit crop yields, study says Carbon Brief
Swarms of insects will destroy crops across Europe and America by 2050 due to global warming Daily Mail
Global warming: More insects, eating more crops Phys.org
Climate change to accelerate crop losses from insects Cornell Alliance for Science
Climate Change Means Insects Are Coming for Our Food The Atlantic
Well, at least we know who is keen to reprint press releases from Alarmist Central. I am not an entomologist, nor are the journalists who are piling on this story. So let’s hear from some insect experts.
Adult insects generally are of smaller body size when larvae are reared at higher temperatures. For example, females of Bicyclus butterflies reared at 20°C were larger than those reared at 27°C. Moreover, females laid larger eggs when they were reared or acclimatized for 10 days at the lower temperature compared to the higher temperature.
Figure legend: LDT: actual lower developmental threshold; T0: predicted lower developmental threshold; UDT: upper developmental threshold; TO: thermal optimum (maximum) for developmental rate. The thermal optimum for population growth is usually at moderate temperatures, not at such high extremes.
Development time (dt) is the time required to complete a specified stage or instar and can be described as dt = SET/(T-T0). SET is the sum of effective temperatures or “thermal constant,” expressed as the number of degree days. T0 is the lower developmental threshold (LDT, or base temperature Tb), the hypothetical temperature at which developmental time would be infinite or developmental rate would be zero. The product of developmental time and the amount by which ambient temperature exceeds the threshold was found to be constant (= SET); that is, development will take a fixed number of degree days essentially independent of the temperature at which the animal is reared. The thermal parameters are determined in defined conditions (set of constant temperatures, suitable nutrition).
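The degree-day relationship quoted above can be sketched numerically. The SET and T0 values below are illustrative placeholders, not data for any particular species:

```python
# Degree-day model from the text: dt = SET / (T - T0).
# SET = thermal constant (degree-days); T0 = lower developmental threshold.
# The numbers here are illustrative, not for any real species.
def development_time(T, SET=300.0, T0=10.0):
    """Days to complete a stage at constant rearing temperature T (deg C)."""
    if T <= T0:
        return float("inf")   # at or below the threshold, development stops
    return SET / (T - T0)

for T in (12, 20, 30):
    print(f"{T} deg C -> {development_time(T):.1f} days")
```

Note the defining property: doubling (T - T0) halves development time, so the product dt * (T - T0) stays fixed at SET regardless of rearing temperature.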
The LDT and SET values are population-specific characteristics. The LDT values are similar for all developmental stages of a given population, even when they develop in diverse seasons and experience disparate temperature fluctuations. The stability of LDT is manifested as developmental rate isometry, that is, the percentage of time spent in a particular stage at any constant physiological temperature is a stable fraction of the entire developmental time.
Tropical species have higher values of LDT than temperate ones. SET decreases as LDT increases. Insects that have spread to temperate zones from the tropical regions often maintain a high LDT and can reproduce and develop only in the hot season, spending most of the year in a state of dormancy.
A general response of insects to temperatures just below their LDT or above their UDT is the cessation of development and reproduction while the insects remain active and feed. The larvae may slowly grow and the adults accumulate reserves. These processes are terminated at more extreme temperatures.
During cooling, motility gradually decreases. At a certain temperature, the neural and muscular activities are impaired and the insect lapses into cold stupor (chill coma). The stupor point is as high as 12°C in tropical insects including stored product pests, and in honey bees; around 5°C in many temperate species; near 0°C in most overwintering insects; and even below the freezing point in species living in very cold areas.
Gradual warming above UDT, which is for many species around 35°C but is never sharply delimited, increases the metabolic rate, loss of water, and motility. Around 40°C, the water loss increases sharply: the spiracles are wide open and the melting of cuticular lipids permits evaporation through the body surface. Exhaustion of water and nutrients leads to rapid decrease of motility and a drop of transpiration. At a certain temperature, heat stupor occurs. Survival at temperatures above the threshold is a function of temperature and length of exposure. Warming to the absolute upper lethal temperature, which is usually around 50-55°C, causes fast, irreversible tissue damage and death.
The data for terrestrial ectotherms discussed previously point to species from mid-latitudes in particular being closest to their thermal maxima. Moreover, although data are still quite scanty, species may have only a limited capacity to deal with changes in upper thermal limits. Under an expected 2–4 °C warming scenario (IPCC 2007), mid-latitude populations near limits are likely to face the threat of extinction because they cannot adapt to new environmental conditions.
There is almost no information on how thermal limits are influenced by combinations of stressors. Changes in the conditions that organisms experience during thermal stress could lead to quite unpredictable upper thermal limits (Terblanche et al. 2011; Overgaard, Kristensen & Sørensen 2012). Moreover, thermal stress can influence susceptibility to other selective agents; tropical Bicyclus anynana butterflies lose immune function as measured by phenoloxidase (PO) activity and haemocyte numbers when exposed to warm conditions, and the effects are particularly marked when adults have a limited food supply.
Summary
These scares always sound plausible, but on closer inspection are simplistic and unrealistic. The above shows that each type of insect has a range of temperatures they can tolerate and allow them to develop. They are stressed and populations decrease when colder than the lower limit and also when hotter than the upper limit. Every species will adapt to changing conditions as they always have. Those at their upper limit will decline, not increase, and their place will be taken by others. Of course, if it gets colder, the opposite occurs. Don’t let them scare you that insects are taking over.
Unlike temperature, rainfall varies hugely in space and time. Even the most sophisticated climate models struggle to simulate physical processes such as convection and evaporation that drive rainfall activity. On top of that, global warming is not expected to change the frequency and intensity of rainfall extremes in the same way in all parts of the world.
The choice to focus solely on the rainfall itself is particularly relevant for flooding events. Though accusations of poor decision-making and mismanagement of water resources are beginning to appear in the Kerala aftermath, the floods simply would not have occurred without a significant amount of rain. Few of those suffering lost homes and livelihoods are likely to care much about where the rain came from or the intricacies of the weather conditions that led to it.
Disentangling these contributory factors takes time. In comparison to droughts and heatwaves, short-term hazards such as floods do not usually give us much chance to report concrete findings while the media and general public are still engaged in the event. In-depth studies may not publish their results for many months, sometimes even years after the event in question.
Many of these issues are not exclusive to extreme rainfall. The excellent US National Academies report on Attribution of Extreme Weather Events in the Context of Climate Change describes the shortcomings in our efforts to attribute a variety of extremes. But for rainfall in particular there is a discrepancy between what we understand about the general effect of global warming and our rather lesser ability to quantify the climate change fingerprint on specific events.
2017 Hurricane Harvey X-Weathermen Event
With Hurricane Harvey making landfall in Texas as a Cat 4, the major hurricane landfall drought is over and claims of linkage to climate change can be expected. So far (mercifully) articles in Time and Washington Post have been more circumspect than in the past. Has it become more respectable to examine the evidence behind wild claims? This post provides background on the X-Weathermen working hard to claim extreme weather as proof of climate change.
In the past the media has been awash with claims of “human footprints” in extreme weather events, with headlines like these:
“Global warming is making hot days hotter, rainfall and flooding heavier, hurricanes stronger and droughts more severe.”
“Global climate change is making weather worse over time”
“Climate change link to extreme weather easier to gauge”– U.S. Report
“Heat Waves, Droughts and Heavy Rain Have Clear Links to Climate Change, Says National Academies”
That last one refers to a paper released by the National Academy of Sciences Press: Attribution of Extreme Weather Events in the Context of Climate Change (2016)
And as usual, the headline claims are unsupported by the actual text. From the NAS report (here): (my bolds)
Attribution studies of individual events should not be used to draw general conclusions about the impact of climate change on extreme events as a whole. Events that have been selected for attribution studies to date are not a representative sample (e.g., events affecting areas with high population and extensive infrastructure will attract the greatest demand for information from stakeholders) P 107
Systematic criteria for selecting events to be analyzed would minimize selection bias and permit systematic evaluation of event attribution performance, which is important for enhancing confidence in attribution results. Studies of a representative sample of extreme events would allow stakeholders to use such studies as a tool for understanding how individual events fit into the broader picture of climate change. P 110
Correctly done, attribution of extreme weather events can provide an additional line of evidence that demonstrates the changing climate, and its impacts and consequences. An accurate scientific understanding of extreme weather event attribution can be an additional piece of evidence needed to inform decisions on climate change related actions. P. 112
The Indicative Without the Imperative
The antidote to such feverish reporting is provided by Mike Hulme in a publication: Attributing Weather Extremes to ‘Climate Change’: a Review (here).
He has an insider’s perspective on this issue, and is certainly among the committed on global warming (color him concerned). Yet here he writes objectively to inform us on X-weather, without advocacy: real science journalism, and a public service. Excerpts below with my bolds.
Overview
In this third and final review I survey the nascent science of extreme weather event attribution. The article proceeds by examining the field in four stages: motivations for extreme weather attribution, methods of attribution, some example case studies and the politics of weather event attribution.
The X-Weather Issue
As many climate scientists can attest, following the latest meteorological extreme one of the most frequent questions asked by media journalists and other interested parties is: ‘Was this weather event caused by climate change?’
In recent decades the meaning of climate change in popular western discourse has changed from being a descriptive index of a change in climate (as in ‘evidence that a climatic change has occurred’) to becoming an independent causative agent (as in ‘climate change caused this event to happen’). Rather than being a descriptive outcome of a chain of causal events affecting how weather is generated, climate change has been granted power to change worlds: political and social worlds as much as physical and ecological ones.
To be more precise then, what people mean when they ask the ‘extreme weather blame’ question is: ‘Was this particular weather event caused by greenhouse gases emitted from human activities and/or by other human perturbations to the environment?’ In other words, can this meteorological event be attributed to human agency as opposed to some other form of agency?
The Motivations
Hulme shows what drives scientists to pursue the “extreme weather blame” question, noting four motivational factors.
Why have climate scientists over the last ten years embarked upon research to provide an answer beyond the stock phrase ‘no individual weather event can directly be attributed to greenhouse gas emissions’? There seem to be four possible motives.
1. Curiosity The first is because the question piques the scientific mind; it acts as a spur to develop new rational understanding of physical processes and new analytic methods for studying them.
2. Adaptation A second argument, put forward by some, is that it is important to know whether or not specific instances of extreme weather are human-caused in order to improve the justification, planning and execution of climate adaptation.
3. Liability A third argument for pursuing an answer to the ‘extreme weather blame’ question is inspired by the possibility of pursuing legal liability for damages caused. . . If specific loss and damage from extreme weather can be attributed to greenhouse gas emissions – even if expressed in terms of increased risk rather than deterministically – then lawyers might get interested.
The liability motivation for research into weather event attribution also bisects the new international political agenda of ‘loss and damage’ which has emerged in the last two years. . . The basic idea is to give recognition that loss and damage caused by climate change is legitimate ground for less developed countries to gain access to new international climate adaptation funds.
4. Persuasion A final reason for scientists to be investing in this area of climate science – a reason stated explicitly less often than the ones above and yet one which underlies much of the public interest in the ‘extreme weather blame’ question – is frustration with and argument about the invisibility of climate change. . . If this is believed to be true – that only scientists can make climate change visible and real –then there is extra onus on scientists to answer the ‘extreme weather blame’ question as part of an effort to convince citizens of the reality of human-caused climate change.
Attribution Methods
Attributing extreme weather events to human influences requires different approaches, of which four broad categories can be identified.
1. Physical Reasoning The first and most general approach to attributing extreme weather phenomena to rising greenhouse gas concentrations is to use simple physical reasoning.
General physical reasoning can only lead to broad qualitative statements such as ‘this extreme weather is consistent with’ what is known about the human-enhanced greenhouse effect. Such statements offer neither deterministic nor stochastic answers and clearly underdetermine the ‘weather blame question.’ It has given rise to a number of analogies to try to communicate the non-deterministic nature of extreme event attribution. The three most widely used ones concern a loaded die (the chance of rolling a ‘6’ has increased, but no single ‘6’ can be attributed to the biased die), the baseball player on steroids (the number of home runs hit increases, but no single home run can be attributed to the steroids) and the speeding car-driver (the chance of an accident increases in dangerous conditions, but no specific accident can be attributed to the fast-driving).
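Hulme's loaded-die analogy is easy to make concrete. The short simulation below (a hypothetical illustration of my own, not from his paper) compares a fair die with one biased toward sixes: the frequency of sixes rises by roughly half, yet no individual six can be pinned on the bias.

```python
import random

random.seed(42)  # fixed seed so the run is reproducible

def count_sixes(p_six, n):
    """Count how many of n rolls come up 'six' when each roll
    has probability p_six of showing a six."""
    return sum(1 for _ in range(n) if random.random() < p_six)

n = 100_000
fair = count_sixes(1 / 6, n)    # unbiased die: baseline chance of a six
loaded = count_sixes(1 / 4, n)  # loaded die: chance of a six raised to 1/4

risk_ratio = loaded / fair
print(f"sixes (fair die):   {fair}")
print(f"sixes (loaded die): {loaded}")
print(f"risk ratio: {risk_ratio:.2f}")  # sixes are ~50% more frequent
```

The risk ratio describes the population of rolls, not any single roll; that is exactly the stochastic, non-deterministic character of extreme event attribution the analogy is meant to convey.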
2. Classical Statistical Analysis A second approach is to use classical statistical analysis of meteorological time series data to determine whether a particular weather (or climatic) extreme falls outside the range of what a ‘normal’ unperturbed climate might have delivered.
All such extreme event analyses of meteorological time series are at best able to detect outliers, but can never be decisive about possible cause(s). A different time series approach therefore combines observational data with model simulations and seeks to determine whether trends in extreme weather predicted by climate models have been observed in meteorological statistics (e.g. Zwiers et al., 2011, for temperature extremes and Min et al., 2011, for precipitation extremes). This approach is able to attribute statistically a trend in extreme weather to human influence, but not a specific weather event. Again, the ‘weather blame question’ remains underdetermined.
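The time-series approach can be sketched in a few lines. The numbers below are invented, standing in for a 30-year observed record of seasonal rainfall: we ask how far the event of interest sits outside the baseline distribution. A large z-score flags an outlier, but, as Hulme notes, detection says nothing about cause.

```python
import statistics

# Hypothetical 30-year baseline of seasonal rainfall totals (mm) --
# made-up values standing in for a real meteorological record.
baseline = [312, 298, 305, 327, 290, 301, 315, 288, 320, 295,
            308, 299, 311, 285, 330, 302, 296, 318, 307, 293,
            325, 300, 309, 287, 316, 304, 292, 321, 297, 310]

this_year = 380.0  # the extreme event under scrutiny

mean = statistics.mean(baseline)
sd = statistics.stdev(baseline)
z = (this_year - mean) / sd  # standardised distance from the baseline

print(f"baseline mean {mean:.1f} mm, sd {sd:.1f} mm, z-score {z:.1f}")
# An outlier is detected, but nothing here identifies *why* it
# happened -- detection is not attribution.
```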
3. Fractional Attributable Risk (FAR) Taking inspiration from the field of epidemiology, this method seeks to establish the Fractional Attributable Risk (FAR) of an extreme weather (or short-term climate) event. It asks the counterfactual question, ‘How might the risk of a weather event be different in the presence of a specific causal agent in the climate system?’
The single observational record available to us, and which is analysed in the statistical methods described above, is inadequate for this task. The solution is to use multiple model simulations of the climate system, first of all without the forcing agent(s) accused of ‘causing’ the weather event and then again with that external forcing introduced into the model.
The credibility of this method of weather attribution can be no greater than the overall credibility of the climate model(s) used – and may be less, depending on the ability of the model in question to simulate accurately the precise weather event under consideration at a given scale (e.g. a heatwave in continental Europe, a rain event in northern Thailand) (see Christidis et al., 2013a).
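The FAR arithmetic itself is simple once the two ensembles exist: FAR = 1 - P(event without human forcing) / P(event with forcing). Here is a minimal sketch using made-up Gaussian "ensembles" in place of real model runs; the threshold, means and spread are arbitrary assumptions for illustration only.

```python
import random

random.seed(0)  # fixed seed for a reproducible run

def exceedance_prob(mean, sd, threshold, n=200_000):
    """Fraction of simulated seasonal values exceeding a threshold.
    Stands in for an ensemble of climate-model simulations."""
    return sum(1 for _ in range(n) if random.gauss(mean, sd) > threshold) / n

threshold = 30.0  # e.g. a heatwave temperature, arbitrary units

p_natural = exceedance_prob(25.0, 2.0, threshold)  # runs without the forcing
p_forced = exceedance_prob(26.0, 2.0, threshold)   # runs with forcing included

far = 1.0 - p_natural / p_forced
print(f"P(event | natural): {p_natural:.4f}")
print(f"P(event | forced):  {p_forced:.4f}")
print(f"FAR: {far:.2f}")
```

Note that everything hangs on the two probabilities, which come entirely from the models: shift the assumed forced-world mean slightly and the FAR moves with it. That is precisely why Hulme says the method's credibility can be no greater than that of the models behind it.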
4. Eco-systems Philosophy A fourth, more philosophical, approach to weather event attribution should also be mentioned. This is the argument that since human influences on the climate system as a whole are now clearly established – through changing atmospheric composition, altered land surface characteristics, and so on – there can no longer be such a thing as a purely natural weather event. All weather — whether it be a raging tempest or a still summer afternoon — is now attributable to human influence, at least to some extent. Weather is the local and momentary expression of a complex system whose functioning as a system is now different to what it would otherwise have been had humans not been active.
Results from Weather Attribution Studies
Hulme provides a table of numerous such studies using various methods, along with his view of the findings.
It is likely that attribution of temperature-related extremes using FAR methods will always be more attainable than for other meteorological extremes such as rainfall and wind, which climate models generally find harder to simulate faithfully at the spatial scales involved. As discussed below, this limitation on which weather events and in which regions attribution studies can be conducted will place important constraints on any operational extreme weather attribution system.
Political Dimensions of Weather Attribution
Hulme concludes by discussing the political hunger for scientific proof in support of policy actions.
But Hulme et al. (2011) show why such ambitious claims are unlikely to be realised. Investment in climate adaptation, they claim, is most needed “… where vulnerability to meteorological hazard is high, not where meteorological hazards are most attributable to human influence” (p.765). Extreme weather attribution says nothing about how damages are attributable to meteorological hazard as opposed to exposure to risk; it says nothing about the complex political, social and economic structures which mediate physical hazards.
And separating weather into two categories — ‘human-caused’ weather and ‘tough-luck’ weather – raises practical and ethical concerns about any subsequent investment allocation guidelines which excluded the victims of ‘tough-luck weather’ from benefiting from adaptation funds.
Contrary to the claims of some weather attribution scientists, the loss and damage agenda of the UNFCCC, as it is currently emerging, makes no distinction between ‘human-caused’ and ‘tough-luck’ weather. “Loss and damage impacts fall along a continuum, ranging from ‘events’ associated with variability around current climatic norms (e.g., weather-related natural hazards) to [slow-onset] ‘processes’ associated with future anticipated changes in climatic norms” (Warner et al., 2012:21). Although definitions and protocols have not yet been formally ratified, it seems unlikely that there will be a role for the sort of forensic science being offered by extreme weather attribution science.
Conclusion
Thank you Mike Hulme for a sane, balanced and expert analysis. It strikes me as being another element in a “Quiet Storm of Lucidity”.
Is that light the end of the tunnel or an oncoming train?
It has been a changeable and at times extreme spring and summer. The cold and snow of March gave way to a cold April and some very chilly days well into the spring. Warmth with some very hot days followed in the early to mid summer. Then the rains came with strong thunderstorms. The wet August put an end to the borderline drought of the last few years. It became very muggy, keeping nighttime temperatures up and air conditioners on.
The changes all have to do with the wind direction. The jet stream brought chilly air masses (and snow) into eastern Canada even into June. The winds around these cool air masses turn to the northeast here in New England, coming in off cool land and water. Then increasingly warm air masses built north into the Canadian prairies and came east. The surface winds turned northwesterly. In summer, warm air crossing the Appalachians and sinking down into the Merrimack Valley and coast warms by compression 5F or more. Our hottest days come with these ‘downslope winds’. Historically, all the 100F days have come with a west to northwest wind.
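That 5F figure is consistent with basic physics: unsaturated air descending warms at the dry adiabatic lapse rate, roughly 9.8C per kilometre. A quick back-of-envelope check (the ~300 m descent is my assumption for the terrain, not a figure from the text):

```python
# Unsaturated air warms by compression as it descends, at the dry
# adiabatic lapse rate of about 9.8 C per kilometre of descent.
LAPSE_C_PER_M = 0.0098

def downslope_warming_f(descent_m):
    """Warming in degrees F for air sinking descent_m metres."""
    return LAPSE_C_PER_M * descent_m * 9.0 / 5.0

# Air spilling roughly 300 m down off the hills toward the coast:
warming = downslope_warming_f(300)
print(f"{warming:.1f} F of compressional warming")  # about 5.3 F
```

A few hundred metres of descent is enough to account for the observed downslope heating, no exotic mechanism required.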
When late July and August came, our surface winds turned southwesterly.
This change was caused by a sharp cooling of the subtropical Atlantic Ocean waters relative to 2017.
The Atlantic cool subtropical pattern leads to that stronger than normal Atlantic high pressure called the ‘Bermuda High’. In these patterns, this regular feature of our climate expands south and west and acts as a pump for moisture much like we see in the southeastern and eastern Asia monsoon flow. This causes nights to be warm and muggy, and days very warm and showery here in the east.
Uncomfortable yes, unprecedented heat no. You may be surprised that most of the extreme heat records for the region, country and world occurred in the early 20th century or earlier. The 1930s was the record decade in the United States as a whole. For the east, the 1950s was the warmest but extreme heat has occurred even in cold periods.
Summary: Effect on the Hurricane Season
As a general rule, when the subtropical Atlantic is warm, we have more hurricane activity, when it is cool, we have fewer storms.
It is often said that truth is the first casualty in the fog of war. That is especially true of the war against fossil fuels and smoke from wildfires. The forests are burning in California, Oregon and Washington, all of them steeped in liberal, progressive and post-modern ideology. There are human reasons that fires are out of control in those places, and it is not due to CO2 emissions. As we shall see, Zinke is right and Brown is wrong. Here are some truths the media are not telling you in their drive to blame global warming/climate change. Text below is excerpted from sources linked at the end.
1. The World and the US are not burning.
The geographic extent of this summer’s forest fires won’t come close to the aggregate record for the U.S. Far from it. Yes, there are some terrible fires now burning in California, Oregon, and elsewhere, and the total burnt area this summer in the U.S. is likely to exceed the 2017 total. But as the chart above shows, the burnt area in 2017 was less than 20% of the record set way back in 1930. The same is true of the global burnt area, which has declined over many decades.
“Analysis of charcoal records in sediments [31] and isotope-ratio records in ice cores [32] suggest that global biomass burning during the past century has been lower than at any time in the past 2000 years. Although the magnitude of the actual differences between pre-industrial and current biomass burning rates may not be as pronounced as suggested by those studies [33], modelling approaches agree with a general decrease of global fire activity at least in past centuries [34]. In spite of this, fire is often quoted as an increasing issue around the globe [11,26–29].”
People have a tendency to exaggerate the significance of current events. Perhaps the youthful can be forgiven for thinking hot summers are a new phenomenon. Incredibly, more “seasoned” folks are often subject to the same fallacies. The fires in California have so impressed climate alarmists that many of them truly believe global warming is the cause of forest fires in recent years, including the confused bureaucrats at Cal Fire, the state’s firefighting agency. Of course, the fires have given fresh fuel to self-interested climate activists and pressure groups, an opportunity for greater exaggeration of an ongoing scare story.
This year, however, and not for the first time, a high-pressure system has been parked over the West, bringing southern winds up the coast along with warmer waters from the south, keeping things warm and dry inland. It’s just weather, though a few arsonists and careless individuals always seem to contribute to the conflagrations. Beyond all that, the impact of a warmer climate on the tendency for biomass to burn is considered ambiguous for realistic climate scenarios.
2. Public forests are no longer managed due to litigation.
According to a 2014 white paper titled ‘Twenty Years of Forest Service Land Management Litigation’ by Amanda M.A. Miner, Robert W. Malmsheimer, and Denise M. Keele: “This study provides a comprehensive analysis of USDA Forest Service litigation from 1989 to 2008. Using a census and improved analyses, we document the final outcome of the 1125 land management cases filed in federal court. The Forest Service won 53.8% of these cases, lost 23.3%, and settled 22.9%. It won 64.0% of the 669 cases decided by a judge based on cases’ merits. The agency was more likely to lose and settle cases during the last six years; the number of cases initiated during this time varied greatly. The Pacific Northwest region along with the Ninth Circuit Court of Appeals had the most frequent occurrence of cases. Litigants generally challenged vegetative management (e.g. logging) projects, most often by alleging violations of the National Environmental Policy Act and the National Forest Management Act. The results document the continued influence of the legal system on national forest management and describe the complexity of this litigation.”
There is abundant evidence to support the position that when any forest project posits vegetative management in forests as a pretense for a logging operation, salvage or otherwise, litigation is likely to ensue, and in addition to NEPA, the USFS uses the Property Clause to address any potential removal of ‘forest products’. Nevertheless, the USFS currently spends more than 50% of its total budget on wildfire suppression alone; about $1.8 billion annually, while there is scant spending for wildfire prevention.
3. Mega fires are the unnatural result of fire suppression.
And what of the “mega-fires” burning in the West, like the huge Mendocino Complex Fire and last year’s Thomas Fire? Unfortunately, many decades of fire suppression measures — prohibitions on logging, grazing, and controlled burns — have left the forests with too much dead wood and debris, especially on public lands. From the last link:
“Oregon, like much of the western U.S., was ravaged by massive wildfires in the 1930s during the Dust Bowl drought. Megafires were largely contained due to logging and policies to actively manage forests, but there’s been an increasing trend since the 1980s of larger fires.
Active management of the forests and logging kept fires at bay for decades, but that largely ended in the 1980s over concerns about cutting too many old growth trees and harming the northern spotted owl. Lawsuits from environmental groups hamstrung logging and government planners cut back on thinning trees and road maintenance.
[Bob] Zybach [a forester] said Native Americans used controlled burns to manage the landscape in Oregon, Washington and northern California for thousands of years. Tribes would burn up to 1 million acres a year on the west coast to prime the land for hunting and grazing, Zybach’s research has shown.
‘The Indians had lots of big fires, but they were controlled,’ Zybach said. ‘It’s the lack of Indian burning, the lack of grazing’ and other active management techniques that caused fires to become more destructive in the 19th and early 20th centuries before logging operations and forest management techniques got fires under control in the mid-20th Century.”
4. Bad federal forest administration started in the 1990s.
Bob Zybach feels like a broken record. Decades ago he warned government officials that allowing Oregon’s forests to grow unchecked by proper management would result in catastrophic wildfires.
While some want to blame global warming for the uptick in catastrophic wildfires, Zybach said a change in forest management policies is the main reason Americans are seeing a return to more intense fires, particularly in the Pacific Northwest and California where millions of acres of protected forests stand.
“We knew exactly what would happen if we just walked away,” Zybach, an experienced forester with a PhD in environmental science, told The Daily Caller News Foundation.
Zybach spent two decades as a reforestation contractor before heading to graduate school in the 1990s. Then the Clinton administration in 1994 introduced its plan to protect old growth trees and spotted owls by strictly limiting logging. Less logging also meant government foresters weren’t doing as much active management of forests — thinnings, prescribed burns and other activities to reduce wildfire risk.
Zybach told Evergreen magazine that year the Clinton administration’s plan for “naturally functioning ecosystems” free of human interference ignored history and would fuel “wildfires reminiscent of the Tillamook burn, the 1910 fires and the Yellowstone fire.”
Between 1952 and 1987, western Oregon saw only one major fire above 10,000 acres. The region’s relatively fire-free streak ended with the Silver Complex Fire of 1987 that burned more than 100,000 acres in the Kalmiopsis Wilderness area, torching rare plants and trees the federal government set aside to protect from human activities. The area has burned several more times since the 1980s.
“Mostly fuels were removed through logging, active management — which they stopped — and grazing,” Zybach said in an interview. “You take away logging, grazing and maintenance, and you get firebombs.”
Now, Oregonians are dealing with 13 wildfires engulfing 185,000 acres. California is battling nine fires scorching more than 577,000 acres, mostly in the northern forested parts of the state managed by federal agencies.
The Mendocino Complex Fire quickly spread to become the largest wildfire in California since the 1930s, engulfing more than 283,000 acres. The previous wildfire record was set by 2017’s Thomas Fire that scorched 281,893 acres in Southern California.
While bad fires still happen on state and private lands, most of the massive blazes happen on or around lands managed by the U.S. Forest Service and other federal agencies, Zybach said. Poor management has turned western forests into “slow-motion time bombs,” he said.
A feller buncher removing small trees that act as fuel ladders and transmit fire into the forest canopy.
5. True environmentalism is not nature love, but nature management.
While wildfires do happen across the country, poor management by western states has served to turn entire regions into tinderboxes. Letting nature play out its course so close to civilization is the course California and Oregon have taken.
Many in heartland America and along the Eastern Seaboard often see logging and firelines if they travel to a rural area. This is part and parcel of life outside of the city, where everyone knows that, thanks to a few minor eyesores, their houses and communities are safer from the primal fury of wildfires.
In other words, leaving the forests to “nature” and protecting the endangered Spotted Owl created denser forests (300 to 400 trees per acre rather than 50 to 80) with more fuel from the 129 million diseased and dead trees that create more intense and destructive fires. Yet California spends more than ten times as much money on electric vehicle subsidies ($335 million) as on reducing fuel in a mere 60,000 of 33 million acres of forests ($30 million).
Rancher Ross Frank worries that funding to fight fires in Western communities like Chumstick, Wash., has crowded out important land management work. Rowan Moore Gerety/Northwest Public Radio
Once again, global warming “science” is a camouflage for political ideology and gratifying myths about nature and human interactions with it. On the one hand, progressives seek “crises” that justify more government regulation and intrusion that limit citizen autonomy and increase government power. On the other, well-nourished moderns protected by technology from nature’s cruel indifference to all life can afford to indulge myths that give them psychic gratification at little cost to their daily lives.
As usual, bad cultural ideas lie behind these policies and attitudes. Most important is the modern fantasy that before civilization human beings lived in harmony and balance with nature. The rise of cities and agriculture began the rupture with the environment, “disenchanting” nature and reducing it to mere resources to be exploited for profit. In the early 19th century, the growth of science that led to the industrial revolution inspired the Romantic movement to contrast industrialism’s “Satanic mills” and the “shades of the prison-house,” with a superior natural world and its “beauteous forms.” In an increasingly secular age, nature now became the Garden of Eden, and technology and science the signs of the fall that has banished us from the paradise enjoyed by humanity before civilization.
The untouched nature glorified by romantic environmentalism, then, is not our home. Ever since the cave men, humans have altered nature to make it more conducive to human survival and flourishing. After the retreat of the ice sheets changed the environment and animal species on which people had depended for food, humans in at least four different regions of the world independently invented agriculture to better manage the food supply. Nor did the American Indians, for example, live “lightly on the land” in a pristine “forest primeval.” They used fire to shape their environment for their own benefit. They burned forests to clear land for cultivation, to create pathways to control the migration of bison and other game, and to promote the growth of trees more useful for them.
Remaining trees and vegetation on the forest floor are more vigorous after removal of small trees for fuels reduction.
And today we continue to improve cultivation techniques and foods to make them more reliable, abundant, and nutritious, not to mention more various and safe. We have been so successful at managing our food supply that today one person out of ten provides food that used to require nine out of ten, obesity has become the plague of poverty, and famines result from political dysfunction rather than nature.
That’s why untouched nature, the wild forests filled with predators, has not been our home. The cultivated nature improved by our creative minds has. True environmentalism is not nature love, but nature management: applying skill and technique to make nature more useful for humans, at the same time conserving resources so that those who come after us will be able to survive. Managing resources and exploiting them for our benefit without destroying them is how we should approach the natural world. We should not squander resources or degrade them, not because of nature, but because when we do so, we are endangering the well-being of ourselves and future generations.
Conclusion
The annual burnt area from wildfires has declined over the past ninety years both in the U.S. and globally. Even this year’s wildfires are unlikely to come close to the average burn extent of the 1930s. The large wildfires this year are due to a combination of decades of poor forest management along with a weather pattern that has trapped warm, dry air over the West. The contention that global warming has played a causal role in the pattern is balderdash, but apparently that explanation seems plausible to the uninformed, and it is typical of the propaganda put forward by climate change interests.