When is it Warming? – The Real Reason for the Pause

June 21 E.M. Smith made an intriguing comment on the occasion of Summer Solstice (NH) and Winter Solstice (SH):

“This is the time when the sun stops the apparent drift in the sky toward one pole, reverses, and heads toward the other. For about 2 more months, temperatures lag this change of trend. That is the total heat storage capacity of the planet. Heat is not stored beyond that point and there can not be any persistent warming as long as winter brings a return to cold.

I’d actually assert that there are only two measurements needed to show the existence or absence of global warming. Highs in the hottest month must get hotter and lows in the coldest month must get warmer. BOTH must happen, and no other months matter as they are just transitional.

I’m also pretty sure that the comparison of dates of peaks between locations could also be interesting. If one hemisphere is having a drift to, say, longer springs while the other is having longer falls, that’s more orbital mechanics than CO2 driven and ought to be reflected in different temperature trends / rates of drift.”

https://chiefio.wordpress.com/2015/06/21/summer-solstice-is-here/

Notice that global temperature tracks the seasons of the NH. The reason is simple: the NH has twice as much land as the Southern Hemisphere (SH), and oceans, with their greater heat capacity, do not change temperature as much as land does. So the nearly 4 °C swing in the Earth's average temperature each year follows the seasons of the NH. This is especially interesting because right now the Earth gets the most energy from the sun in January, owing to its orbit: perihelion, when the Earth is closest to the sun, currently takes place in January.

http://theinconvenientskeptic.com/2010/10/how-the-northern-hemisphere-drives-the-modern-climate/
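The perihelion effect is easy to quantify. As a sketch (the eccentricity value 0.0167 is the standard figure for Earth's orbit, not taken from the post), solar flux falls off as the inverse square of the Earth-sun distance:

```python
# Top-of-atmosphere solar flux scales as 1/r^2 with Earth-sun distance.
# Compare perihelion (early January) with aphelion (early July).
e = 0.0167                 # Earth's orbital eccentricity (standard value)
r_perihelion = 1.0 - e     # distances in units of the semi-major axis
r_aphelion = 1.0 + e

flux_ratio = (r_aphelion / r_perihelion) ** 2
print(f"January flux exceeds July flux by {100 * (flux_ratio - 1):.1f}%")  # ~6.9%
```

So the planet as a whole receives about 7% more sunlight in January than in July, yet global temperature still peaks with NH summer, which underscores the dominance of the NH land masses.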

Observations and Analysis:

My curiosity piqued by Chiefio's comment, I went looking for data with which to test his proposition. As it happens, Berkeley Earth provides data tables of monthly Tmax and Tmin by hemisphere (NH and SH), built from land station records. Setting aside any concerns about adjustments or infilling, I did the analysis taking the BEST data tables at face value. Since land surface temperatures are more variable than sea surface temperatures, this seems a reasonable dataset in which to look for the patterns mentioned.

Tmax Records

NH and SH long-term trends are the same, 0.07C/decade, and in both there was cooling before 1979 and above-average warming since. However, since 1950 the NH warmed more strongly, mostly prior to 1998, while the SH has warmed strongly since 1998. (Trends below are in C/yr.)

Tmax Trends    NH Tmax   SH Tmax
All years        0.007     0.007
1998-2013        0.018     0.030
1979-1998        0.029     0.017
1950-1979       -0.003    -0.003
1950-2013        0.020     0.014

Summer Comparisons:

NH summer months are June, July and August (6-8), and SH summer is December, January and February (12-2). The trends for each of those months were computed and the annual trends subtracted, to show whether summer months were warming more than the rest of the year. (Trends below are in C/yr.)

Summer Trends,      NH Tmax  NH Tmax  NH Tmax  SH Tmax  SH Tmax  SH Tmax
Month less Annual    Jun(6)   Jul(7)   Aug(8)  Dec(12)   Jan(1)   Feb(2)
All years            -0.002   -0.004   -0.004    0.000    0.003    0.002
1998-2013             0.026    0.002    0.006    0.022    0.004   -0.029
1979-1998             0.003   -0.004   -0.003   -0.014   -0.029    0.001
1950-1979            -0.002   -0.002   -0.005    0.004    0.005   -0.005
1950-2013            -0.002   -0.003   -0.002   -0.002   -0.002   -0.002
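The "month less annual" comparison can be sketched in a few lines. This is only an illustration, not the author's actual workflow: the array contents here are random stand-in data, where a real run would parse the BEST monthly tables.

```python
import numpy as np

def ols_slope(x, y):
    """Least-squares linear trend, in C per year."""
    return np.polyfit(x, y, 1)[0]

# Stand-in data: one value per (year, month) from 1950 through 2013.
years = np.arange(1950, 2014).repeat(12)
months = np.tile(np.arange(1, 13), 2013 - 1950 + 1)
rng = np.random.default_rng(0)
temps = rng.normal(15.0, 1.0, years.size)

# Annual-mean trend over the whole record
uyears = np.unique(years)
annual = np.array([temps[years == y].mean() for y in uyears])
annual_trend = ols_slope(uyears, annual)

# Single-month trend (June shown), less the annual trend
june = months == 6
june_excess = ols_slope(years[june], temps[june]) - annual_trend
print(f"June trend less annual trend: {june_excess:+.3f} C/yr")
```

A positive value would mean June warming faster than the year as a whole; the tables above apply the same subtraction to each summer and winter month.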

NH summer months are cooler than average overall and since 1950. Warming does appear since 1998, with a large anomaly in June and also warming in August. SH shows no strong pattern of Tmax warming in summer months: a hot December trend since 1998 is offset by a cold February. Overall, SH summers are just above average, and since 1950 they have been slightly cooler.

Tmin Records

Both NH and SH show Tmin rising 0.12C/decade, warming much more strongly than Tmax. The SH shows that average warming persisting throughout the record, slightly higher prior to 1979. NH Tmin is more variable, showing a large jump in 1979-1998, at a rate of 0.25C/decade. (Trends below are in C/yr.)

Tmin Trends    NH Tmin   SH Tmin
All years        0.012     0.012
1998-2013        0.010     0.010
1979-1998        0.025     0.011
1950-1979        0.006     0.014
1950-2013        0.022     0.014

Winter Comparisons:

SH winter months are June, July and August (6-8), and NH winter is December, January and February (12-2). The trends for each of those months were computed and the annual trends subtracted, to show whether winter months were warming more than the rest of the year. (Trends below are in C/yr.)

Winter Trends,      NH Tmin  NH Tmin  NH Tmin  SH Tmin  SH Tmin  SH Tmin
Month less Annual   Dec(12)   Jan(1)   Feb(2)   Jun(6)   Jul(7)   Aug(8)
All years             0.007    0.008    0.007    0.005    0.003    0.004
1998-2013            -0.045   -0.035   -0.076   -0.043   -0.024   -0.019
1979-1998            -0.018   -0.005    0.024    0.034    0.008   -0.008
1950-1979             0.008    0.005    0.007    0.008    0.012    0.013
1950-2013             0.001    0.007    0.008   -0.001   -0.002    0.002

NH winter Tmin warming is stronger than the SH Tmin trends, but shows quite strong cooling since 1998; an anomalously warm February is the exception in the period 1979-1998. Both NH and SH show higher Tmin warming in winter months, with some irregularities. Most of the SH Tmin warming came before 1979, with strong cooling since 1998; June was anomalously warm in the period 1979-1998.

Summary

Tmin did trend higher in winter months, but not consistently. Winter Tmin warmed mostly from 1950 to 1979, and has been much cooler than other months since 1998.

Tmax has not warmed in summer more than in other months, with the exception of two anomalous months since 1998: NH June and SH December.

Conclusion:

I find no convincing pattern of summer Tmax warming carrying over into winter Tmin warming. In other words, summers are not adding warming more than other seasons. There is no support for concerns over summer heat waves increasing as a pattern.

It is interesting to note that the plateau in temperatures since the 1998 El Nino is matched by winter months cooler than average during that period, leading to my discovering the real reason for lack of warming recently.

The Real Reason for the Pause in Global Warming

These data suggest warming trends are coming from less cold overnight temperatures as measured at land weather stations. Since stations exposed to urban heat sources typically show higher minimums overnight and in winter months, this pattern is likely an artifact of human settlement activity rather than CO2 from fossil fuels.

Thus the Pause (more correctly, the Plateau) in global warming is caused by the end-of-century completion of urbanization around most surface stations. With no additional warming from new urban heat sources, temperatures have remained flat for more than 15 years.

 

Data is here:
http://berkeleyearth.lbl.gov/regions/northern-hemisphere
http://berkeleyearth.lbl.gov/regions/southern-hemisphere

Ecomodernist Critiques the Encyclical

Updated June 30, 2015  New GWPF paper by Indur M. Goklany linked below.

Mark Lynas Responds to the papal encyclical

“While the Pope bemoans the burning of coal, oil and gas he does so without recognizing that increasing energy consumption in developing countries is a precondition for poverty reduction. He seems to have no understanding of trade-offs — or the fact that pretty much all the projected future carbon emissions increases will come from the developing world.

Consider that while the coal boom creates global warming, it also frees people from burning wood — the toxic smoke of which killed four million people last year. And as Europe, the US and China have gotten richer, all have stepped up efforts to replace coal with solar, wind, natural gas and nuclear.”

“Laudato Si is very relevant to the emerging ecomodernist movement because it makes explicit the asceticism, romanticism and reactionary paternalism inherent in many aspects of traditional environmentalist thinking. It also helpfully draws out the religiously-originated narratives that underpin a lot of green themes of sinfulness/redemption and end-times doomsaying on issues like climate change.”

By Mark Lynas, Ted Nordhaus and Michael Shellenberger – See more at: http://www.marklynas.org/2015/06/a-pope-against-progress/#sthash.IKnWQMrm.dpuf

Mitigation is Bad for Us and the Planet

In the excitement over the Pope’s encyclical and upcoming Paris conference, people are not talking about how CO2 mitigation (treaties, carbon pricing or regulations), if successful, would put civilization onto an unsustainable path.

Fortunately, Matt Ridley is shining some light in this direction:

Until now, green thinking has wanted us to go back to nature: to reject innovations such as genetically modified food, give up commerce and consumption and energy and materials and live simpler lives so that nature is not abused and the climate is not wrecked. The eco-modernists, who include the veteran Californian green pioneer Stewart Brand and the British green campaigner Mark Lynas, say this is a mistake. “Absent a massive human die-off, any large-scale attempt at recoupling human societies to nature using these [ancestral] technologies would result in an unmitigated ecological and human disaster.”

The Ecomodernist Manifesto promises a much needed reformation in the green movement. Its 95 theses should be nailed to the door of the Vatican when the pope’s green-tinged encyclical comes out next month, because unlike the typical eco-wail, it contains good news for the poor. It says: no, we are not going to stop you getting rich and adopting new technologies and leaving behind the misery of cooking over wood fires in smoky huts with no artificial light. No, we do not want you to stay as subsistence farmers. Indeed, the quicker we can get you into a city apartment with a car, a phone, a fridge and a laptop, the better. Because then you won’t be taking wood and bushmeat from the forest.

http://www.rationaloptimist.com/blog/eco-modernism-and-sustainable-intensification.aspx

Now, I am not thrilled with using the term “manifesto” or with references to the “anthropocene” era, but there is a positive direction in this.

“We offer this statement in the belief that both human prosperity and an ecologically vibrant planet are not only possible, but also inseparable. By committing to the real processes, already underway, that have begun to decouple human well-being from environmental destruction, we believe that such a future might be achieved. As such, we embrace an optimistic view toward human capacities and the future.”

These folks are saying we can use fossil fuels in a way that benefits both humans and nature, and in fact, if we don’t do so, our civilization is not sustainable.  Using fossil fuels is good for people and the planet and is the only sustainable way forward.

Strong stuff, but it doesn’t fit into the usual boxes.

More on Ecomodernism here:

http://thebreakthrough.org/index.php/voices/michael-shellenberger-and-ted-nordhaus/an

From Indur M. Goklany in The Pontifical Academies’ BROKEN MORAL COMPASS

But the statement is fatally flawed. It is riddled with sins of omission and commission bolstered by wishful thinking. For instance, it ignores decades of well documented empirical data that show that human wellbeing has advanced throughout the world and that the terrestrial biosphere’s productivity has increased above preindustrial levels, allowing it to support more biomass, in no small part because of carbon dioxide emissions from humanity’s use of fossil fuels. The advances in human wellbeing include reductions in poverty, hunger, malnutrition, death and disease, and increases in life expectancy and the standard of living across the world. The poor have been major beneficiaries of these advances.

http://www.thegwpf.org/content/uploads/2015/06/Vatican-compass.pdf

Dueling Encyclicals

With the Vatican declaring UN IPCC science to be Christian Truth, I am reminded of the saying attributed to Aristotle (384-322 BC):

“Give me a child until he is 7 and I will show you the man.”

If Aristotle knew what we know today about how oceans make the climate, how might he convey that meaning to one of his young Greek students?

Perhaps he would tell the story this way.

Poseidon, Lord of the Oceans

I am Poseidon and I rule the oceans, and with them I make the climate what it is.

I store the sun’s energy in my ocean water so that our world is neither too hot nor too cold.

I add water and energy into the air and together we spread warmth from the tropics to the poles. There are many obstacles and delays along the way, and there are clashes between hot and cold, which you know as storms.

The land masses make basins to collect water and energy and I send heat to each basin to form its own climate. Water heat is transported slowly, between basins and from equator to pole and back again.

The water in the air returns as rain falling on land and sea. Near the poles the water freezes and stays, sometimes for many years, until rejoining the ocean. Always the water returns and the cycles continue.

Do not be afraid of the future. Respect the oceans, take care of the land and each other, and all will be well.

The Climate According to Poseidon

Basics of Ocean Acidification

Updates added below June 20 and 24, 2015

Update below July 2, 2015: Ocean pH is actually trending alkaline

Update below September 15, 2015: Extensive discussion of ocean chemistry

If surface temperatures don’t skyrocket soon, expect to hear a lot in the coming months about “ocean acidification.”  This sounds scary, and that is the point of emphasizing it in the run-up to the Paris COP.

So here’s the basic chemistry of CO2 and H2O:

[Chemistry figure not recovered in this copy.]
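In case the figure is unavailable, the equilibria in question are the standard carbonate chemistry of seawater (textbook relations, supplied here as a sketch, not recovered from the original image):

```latex
\mathrm{CO_2\,(aq) + H_2O \;\rightleftharpoons\; H_2CO_3
  \;\rightleftharpoons\; H^+ + HCO_3^-
  \;\rightleftharpoons\; 2\,H^+ + CO_3^{2-}}
```

Dissolving more CO2 pushes these equilibria to the right, releasing hydrogen ions and lowering pH; that shift is what the rest of this section debates.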

That seems straightforward, so what is the problem, and how serious is it? What does the IPCC have to say about this issue?

What does it say in the SPM (Summary for Policy Makers)?

For this issue, I looked at the topic of ocean acidification and fish productivity. The SPM asserts on Page 17 that fish habitats and production will fall and that ocean acidification threatens marine ecosystems.

“Open-ocean net primary production is projected to redistribute and, by 2100, fall globally under all RCP scenarios. Climate change adds to the threats of over-fishing and other non-climatic stressors, thus complicating marine management regimes (high confidence).” Pg 17 SPM

“For medium- to high-emission scenarios (RCP4.5, 6.0, and 8.5), ocean acidification poses substantial risks to marine ecosystems, especially polar ecosystems and coral reefs, associated with impacts on the physiology, behavior, and population dynamics of individual species from phytoplankton to animals (medium to high confidence).” Pg 17 SPM

So, the IPCC agrees that ocean acidification is a serious problem due to rising CO2 emissions from burning fossil fuels.

What does it say in the Working Group Reports?

But wait a minute.  Let’s see what is in the working group reports that are written by scientists, not politicians.

WGII Report, Chapter 6 covers Ocean Systems. There we find a different story with more nuance and objectivity:

“Few field observations conducted in the last decade demonstrate biotic responses attributable to anthropogenic ocean acidification” pg 4

“Due to contradictory observations there is currently uncertainty about the future trends of major upwelling systems and how their drivers (enhanced productivity, acidification, and hypoxia) will shape ecosystem characteristics (low confidence).” Pg 5

“Both acclimatization and adaptation will shift sensitivity thresholds but the capacity and limits of species to acclimatize or adapt remain largely unknown” Pg 23

“Production, growth, and recruitment of most but not all non-calcifying seaweeds also increased at CO2 levels from 700 to 900 µatm” Pg 25

“Contributions of anthropogenic ocean acidification to climate-induced alterations in the field have rarely been established and are limited to observations in individual species” Pg. 27

“To date, very few ecosystem-level changes in the field have been attributed to anthropogenic or local ocean acidification.” Pg 39

Ocean Chemistry on the Record

Contrast the IPCC headlines with the Senate testimony of John T. Everett, in which he said:

“There is no reliable observational evidence of negative trends that can be traced definitively to lowered pH of the water. . . Papers that herald findings that show negative impacts need to be dismissed if they used acids rather than CO2 to reduce alkalinity, if they simulated CO2 values beyond triple those of today, while not reporting results at concentrations of half, present, double and triple, or as pointed out in several studies, they did not investigate adaptations over many generations.”

“In the oceans, major climate warming and cooling and pH (ocean pH about 8.1) changes are a fact of life, whether it is over a few years as in an El Niño, over decades as in the Pacific Decadal Oscillation or the North Atlantic Oscillation, or over a few hours as a burst of upwelling (pH about 7.59-7.8) appears or a storm brings acidic rainwater (pH about 4-6) into an estuary.”
http://www.epw.senate.gov/public/index.cfm?FuseAction=Files.View&FileStore_id=db302137-13f6-40cc-8968-3c9aac133b16

Many organisms benefit from less alkaline water.

(Added in thanks to David A.’s comment below)

In addition, IPCC has ignored extensive research showing positive impacts on marine life from lower pH. These studies are catalogued at CO2 Science with this summary:

There are numerous observations of improvement in calcification of disparate marine life at realistic rates of pH change due to increased CO2.

“In the final graphical representations of the information contained in our Ocean Acidification Database, we have plotted the averages of all responses to seawater acidification (produced by additions of both HCl and CO2) for all five of the life characteristics of the various marine organisms that we have analyzed over the five pH reduction ranges that we discuss in our Description of the Ocean Acidification Database Tables, which pH ranges we illustrate in the figure below.”

“The most striking feature of Figure 11 is the great preponderance of data located in positive territory, which suggests that, on the whole, marine organisms likely will not be harmed to any significant degree by the expected decline in oceanic pH. If anything, in fact, the results suggest that the world’s marine life may actually slightly benefit from the pH decline, which latter possibility is further borne out by the scatter plot of all the experimental data pertaining to all life characteristic categories over the same pH decline range, as shown below in Figure 12.”

At a pH decline from control of 0.125, calcification, metabolism, fertility, growth and survival all moved into positive territory.

http://www.co2science.org/data/acidification/acidification.php

Summary

The oceans are buffered by extensive mineral deposits and will never become acidic. Marine life is well-adapted to the fluctuations in pH that occur all the time.

This is another example of climate fear-mongering: it never happened before, it’s not happening now, but it surely will happen if we don’t DO SOMETHING!

Conclusion

Many know of the Latin phrase “caveat emptor,” meaning “Let the buyer beware”.

When it comes to climate science, remember also “caveat lector”–”Let the reader beware”.

Update added June 20, 2015

For additional commentary on ocean acidification:

http://www.newsmax.com/FastFeatures/ocean-acidification-global-warming-quotes-debate/2015/05/06/id/642876/

Update added June 24, 2015

Patrick Moore also provides a thorough debunking here:

“It is a fact that people who have saltwater aquariums sometimes add CO2 to the water in order to increase coral growth and to increase plant growth. The truth is CO2 is the most important food for all life on Earth, including marine life. It is the main food for photosynthetic plankton (algae), which in turn is the food for the entire food chain in the sea.”

http://news.heartland.org/editorial/2015/05/27/why-coral-reefs-and-shellfish-will-not-die-ocean-acidification

Update added July 2, 2015

Scientists have had pH meters and measurements of the oceans for one hundred years. But experts decided that computer simulations in 2014 were better at measuring the pH in 1910 than the pH meters were. In the chart at the link below, the red line is the model’s recreation of ocean pH, and the blue stars are the data points, the empirical evidence.

What we have here is one of the basic foundations of the climate change scare, falling ocean pH levels with increased atmospheric CO2, being contradicted by the empirical ocean pH data that alarmist climate scientists didn’t want to show anyone, because it undercut their ‘increasing ocean acidity’ narrative.

http://joannenova.com.au/2015/01/oceans-not-acidifying-scientists-hid-80-years-of-ph-data/

Update added September 15, 2015

In summary, recent research publications are using a term (OA) that is technically incorrect, misleading, and pejorative; it could not be found in the oceanography literature before about 15 years ago. . .

The claim that the surface-water of the oceans has declined in pH from 8.2 to 8.1, since the industrial revolution, is based on sparse, contradictory evidence, at least some of which is problematic computer modeling. Some areas of the oceans, not subject to algal blooms or upwelling, may be experiencing slightly lower pH values than were common before the industrial revolution. However, forecasts for ‘average’ future pH values are likely exaggerated and of debatable consequences. The effects of alkaline buffering and stabilizing biological feedback loops seem to be underappreciated by those who carelessly throw around the inaccurate term “ocean acidification.”

http://wattsupwiththat.com/2015/09/15/are-the-oceans-becoming-more-acidic/
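For scale, the disputed decline from pH 8.2 to 8.1 discussed above can be translated into hydrogen-ion concentration with one line of arithmetic (a sketch using the pH values quoted in the excerpt):

```python
# pH is -log10([H+]), so a 0.1-unit pH drop multiplies [H+] by 10**0.1.
h_before = 10.0 ** -8.2   # claimed pre-industrial surface pH
h_after = 10.0 ** -8.1    # claimed present-day surface pH

increase = h_after / h_before - 1.0
print(f"[H+] increase for a 0.1 pH drop: {100 * increase:.0f}%")  # ~26%

# Even at pH 8.1 the water remains alkaline (above neutral pH 7),
# which is why critics object to the term "acidification."
assert 8.1 > 7.0
```

So the claimed change is a roughly 26% rise in hydrogen-ion concentration, while the water stays well on the alkaline side of neutral.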


Comet Lander Wakes Up!

In November 2014, in a stunning historic achievement, the European Space Agency succeeded not only in putting the Rosetta probe into orbit around a comet, but, amazingly, in placing the Philae lander on the comet’s surface.

Subsequently the lander’s batteries died and signals stopped for seven months, but they have now resumed.

Philae is best known for transmitting startling images such as these:

[Images not recovered in this copy.]

OK, I’m not so sure about that last one.

How About That Blob? (June 13 Update)

June 13, 2015

As hoped for by Paris COP promoters, and by Californians looking for El Nino precipitation, the Blob in the North Pacific has intensified and may at least partly fulfill both expectations.

HADSST3 results for May are now in, and the sea surface temperature warming anomaly is up:

Global +0.12C over last May,
NH +0.16C over last May.

That will show up also in air temperature estimates, since 71% of the earth’s surface is covered by oceans. For example, UAH TLT anomalies show Global oceans +0.06C over last May, but Global land -0.1C, so Global UAH is only up +0.02C over May 2014. (Note: UAH uses satellites to measure air temperatures many meters above land or ocean, while surface datasets like HADCRUT, BEST, GISTEMP use the measured SSTs in their global mean temperature estimates).
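The land/ocean weighting in that paragraph can be checked with simple arithmetic. This is a sketch using the 71% ocean fraction and the UAH anomalies quoted above; the small mismatch with the stated +0.02C reflects rounding of the inputs.

```python
# Area-weighted combination of land and ocean temperature anomalies.
ocean_frac = 0.71            # fraction of Earth's surface covered by ocean
land_frac = 1.0 - ocean_frac

ocean_anom = 0.06            # C, UAH global ocean, May 2015 vs May 2014
land_anom = -0.10            # C, UAH global land, May 2015 vs May 2014

global_anom = ocean_frac * ocean_anom + land_frac * land_anom
print(f"Area-weighted global anomaly: {global_anom:+.3f} C")  # ~ +0.014 C
```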

The Blob difference shows up in the UAH NH results: the NH anomaly is +0.07C over last year, with the same increase showing over land and ocean.  Interestingly, UAH shows the North Pole cooler than a year ago, the TLT over the Arctic being 0.06C lower.  The South Pole land air temps are a whopping 0.2C colder than last May.

As far as Arctic ice is concerned, the Blob probably caused the Bering Sea to melt out more than one month earlier than last year.  About 10% of the water entering the Arctic Ocean comes from the Bering, so there should be some impact on ice melting in the immediate BCE region (Beaufort, Chukchi and East Siberian Seas). So far in that region, 2015 is tracking last year’s melt at a slightly lower extent (-4%), not yet a significant effect from the Blob.

More on Arctic Ice melt season here:

https://rclutz.wordpress.com/2015/06/02/arctic-ice-watch-june-daily/

Background on the Blob

Many have noticed the warm water anomaly in the Northern Pacific, which shows up like a weak El Nino, but in a somewhat unexpected and out-of-the-ordinary pattern. The warm Pacific SSTs last year almost pushed 2014 to a new record average surface temperature, and fossil fuel activists are pinning their Paris hopes on this year.

So it is timely that the meteorologist who named this event has provided a clear explanation of the natural causes of the Blob phenomenon.

From Nicholas Bond (excerpted from post linked below):

Blob 101
The development of the blob of unusually warm water can be attributed largely to an unusual weather pattern that set up shop over a large region extending from the North Pacific Ocean across North America from October 2013 into February 2014.

This pattern featured a strong and long-lasting weather pattern with higher-than-normal pressure – called a ridge – over the ocean centered offshore of the Pacific Northwest. This ridge of high pressure reduced the number and intensity of storms making landfall, leading to reduced precipitation west of the Continental Divide compared to seasonal norms.

In a study published earlier this month, my colleagues and I fingered the stubborn high-pressure ridge mentioned above, and in particular the weak winds associated with it. The result was a lower-than-normal rate in how quickly heat is transferred from the ocean to the atmosphere, and slower movement of cooler water into the formation region of the blob.

In other words, the unusual atmospheric conditions produced less cooling than typical for the season from fall 2013 through much of the following winter, yielding the sea surface temperature anomaly pattern. So we can essentially blame the ridge for the blob, but what caused the ridge in the first place?

The ocean circulation – that is, the currents – and the weather during the past year, which was unusual in its own right, combined to cause the blob to evolve into a wide strip of relatively warm water along the entire West Coast of North America (see the image in the post linked below).

This happens to be a pattern that has occurred before in association with decades-long shifts in ocean temperature known as the Pacific Decadal Oscillation (PDO). Previous expressions of the PDO have had major and wide-ranging impacts on the marine ecosystem including salmon and other species of fish; recent developments are receiving a great deal of attention from fishery-oceanographers along the West Coast.

http://theconversation.com/what-is-the-warm-blob-in-the-pacific-and-what-can-it-tell-us-about-our-future-climate-40140

In Praise of Michael Crichton

I am grateful that Michael Crichton spoke out so forcefully and effectively on global warming before he died in 2008. His 2004 novel “State of Fear” presaged the media circus of the decade that followed. In particular, the attitudes and behavior of the climate fear-mongers in the novel seemed incredible at the time, but were validated in spades by the Climategate revelations.

To be honest, I was not that thrilled reading the story in 2005, lacking any awareness that such a campaign was underway. I read anything Crichton wrote because of my interest in science and technology and his focus on their near-future effects. His other novels were much more compelling: Jurassic Park, Congo, Disclosure and Airframe, to name a few.

But State of Fear included not only a novel but many pages of graphs and analyses showing the weaknesses in alarmist claims, along with transcripts of Crichton speeches that laid waste to climate claims. The appendices got into my awareness much more than the story itself did.

Judging by what others have said on blogs, I was not the only one for whom this book triggered a skeptical stance toward global warming alarm. It was a wake up call for some, and for others, like myself, it was an inoculation against the viral media onslaught to come.

Let me explain with a brief tangent

Edward De Bono wrote extensively some decades ago on the subject of lateral thinking, or “thinking outside of the box.” His studies of human problem solving showed emphatically that everything depends on the sequence in which information enters a person’s awareness.

Thus, on a topic where I have no opinion, my mind is open to various perspectives. Fairly soon, however, I am likely to form a gestalt, or paradigm, that makes sense of what I know, which involves discounting or dismissing facts that don’t fit. This occurs because uncertainty and ambiguity are uncomfortable, even painful if the matter is of great consequence.

Over time, I accept any and all information that reinforces my gestalt, and become increasingly resistant to facts that challenge or contradict my paradigm. This occurs because it is even more painful to discard a gestalt that I used to organize my thinking, since I now become disoriented, unable to process new information that comes to me.

This is background to understand how serious is the educational propagation of the man-made climate change notion. I went through this process regarding global warming starting in 2009 as an adult with a background of a degree in organic chemistry, and I came out skeptical of the IPCC consensus perspective. That outcome would be less likely if I were a young student today.

And the critical event was Crichton’s novel, evidence and speeches that preceded my exposure to the alarms, and predisposed me to question and examine critically. My takeaway message from Crichton had been: “When you hear scary things about the climate, don’t take them at face value–investigate and get a second or third opinion.”

In fact that experience regarding climate science led me to the slogan of this blog: Reading between the lines and underneath the hype.

Michael Crichton had two principal concerns about science and society, which led to his criticism of global warming. First, he warned against governments capturing science as a tool to cow the population into funding and submitting to politicians’ policies. Second, he thought scientists in many fields were far too certain and trusting of their knowledge and tools, especially computerized systems.

Some Quotes:

On Rampant Media Speculation (2002):

But just in terms of the general emotional tenor of life, I often think people are nervous, jittery in this media climate of what if, what if, maybe, perhaps, could be…when there is usually no sensible reason to feel nervous.

Like a bearded nut in robes on the sidewalk proclaiming the end of the world is near, the media is just doing what makes it feel good, not reporting hard facts. We need to start seeing the media as a bearded nut on the sidewalk, shouting out false fears. It’s not sensible to listen to it.

On Environmentalism (2003):

Today, one of the most powerful religions in the Western World is environmentalism. Environmentalism seems to be the religion of choice for urban atheists. Why do I say it’s a religion? Well, if you look carefully at the core beliefs, you will see that environmentalism is in fact a perfect 21st century remapping of traditional Judeo-Christian beliefs and myths.

And so, sadly, with environmentalism it increasingly seems facts aren’t necessary, because the tenets of environmentalism are all about belief. It’s about whether you are going to be a sinner or be saved, whether you are going to be on the side of salvation or on the side of doom, whether you are going to be one of us or one of them.

Am I exaggerating to make a point? I am afraid not. Because we know a lot more about the world than we did forty or fifty years ago. And what we know now is no longer supportive of many core environmental myths, yet the myths do not die. Let’s examine some of those beliefs. . .

I want to argue that it is now time for us to make a major shift in our thinking about the environment, similar to the shift that occurred around the first Earth Day in 1970, when this awareness was first heightened. But this time around, we need to get environmentalism out of the sphere of religion. We need to stop the mythic fantasies, and we need to stop the doomsday predictions. We need to start doing hard science instead.

On Politicized Science (2003):

Finally, I would remind you to notice where the claim of consensus is invoked. Nobody says the consensus of scientists agrees that E=mc2. Nobody says the consensus is that the sun is 93 million miles away. It would never occur to anyone to speak that way. Consensus is invoked only in situations where the science is not solid enough. Which means, in turn, that if somebody tells you the consensus of scientists believes something or other, you should be immediately suspicious.

Once you abandon strict adherence to what science tells us, once you start arranging the truth in a press conference, then anything is possible. In one context, perhaps you will get mobilization against nuclear war. But in another context, you get Lysenkoism. In another, you get Nazi euthanasia. The danger is always there, if you subvert science to political ends.

That is why it is so important for the future of science that the line between what science can say with certainty, and what it cannot, be drawn clearly—and defended.

What, then, can we say were the lessons of Nuclear Winter? I believe the lesson was that given a catchy name, a strong policy position, and an aggressive media campaign, nobody will dare to criticize the science, and in short order, a terminally weak thesis can be established as fact. After that, any criticism becomes beside the point. The war is already over, without a shot being fired. That was the lesson, and we had a textbook application soon afterward, with second hand smoke.

And so, in this elastic anything-goes world where science—or nonscience—is the handmaiden of questionable public policy, we arrive at last at global warming. It is not my purpose here to rehash the details of this most magnificent of the demons haunting the world. I would just remind you of the now-familiar pattern by which these things are established. Dramatic announcements are carefully contrived. Evidentiary uncertainties are glossed over in the unseemly rush for an overarching policy, and for grants to support the policy by delivering findings that are desired by the patron. Next, the isolation of those scientists who won’t get with the program, and the characterization of those scientists as outsiders and “skeptics” in quotation marks—suspect individuals with suspect motives, industry flunkies, reactionaries, or simply anti-environmental nutcases. In short order, debate ends, even though prominent scientists in many fields are uncomfortable about how things are being done.

To an outsider, the most significant innovation in the global warming controversy is the overt reliance that is being placed on models. Back in the days of nuclear winter, computer models were invoked to add weight to a conclusion: “These results are derived with the help of a computer model.” But now, large-scale computer models are seen as generating data in themselves. No longer are models judged by how well they reproduce data from the real world—increasingly, models provide the data. As if they were themselves a reality. And indeed they are, when we are projecting forward. There can be no observational data about the year 2100. There are only model runs.

Stepping back, I have to say the arrogance of the model makers is breathtaking. There have been, in every century, scientists who say they know it all. Since climate may be a chaotic system—no one is sure about that—these predictions are inherently doubtful, to be polite. But more to the point, even if the models get the science spot-on, they can never get the sociology. To predict anything about the world a hundred years from now is simply absurd.

Wow. Talk about hitting the nail on the head a decade in advance.

For those who want to read more (highly recommended):

State of Fear pdf is here:

Click to access 0060820640.pdf

State of Fear ebook is here:
http://books4all.cc:8080/mobile?num=100000

And a final word from the novel:

Our planet is five billion years old, and it has been changing constantly all during that time. […] Our atmosphere is as violent as the land beneath it. At any moment there are one thousand five hundred electrical storms across the planet. Eleven lightning bolts strike the ground each second. A tornado tears across the surface every six hours. And every four days, a giant cyclonic storm, hundreds of miles in diameter, spins over the ocean and wreaks havoc on the land.

The nasty little apes that call themselves human beings can do nothing except run and hide. For these same apes to imagine they can stabilize this atmosphere is arrogant beyond belief. They can’t control the climate.

The reality is, they run from the storms.

Thank you Michael Crichton. I hope we can still pull the plug on the present day State of Fear.

Climate Models Explained

A comment by Dr. R.G. Brown of Duke University posted on June 11 at WUWT.


Overview of the structure of a state-of-the-art climate model. From the NOAA website http://www.research.noaa.gov/climate/t_modeling.html

First about the way weather models work

That is not quite what they do in GCMs. There are two reasons for this. One is that a global grid of 2 million temperatures sounds like a lot, but it’s not. Remember the atmosphere has depth, and they have to initialize at least to the top of the troposphere, and if they use 1 km thick cells there are 9 or 10 layers. Say 10. Then they have 500 million square kilometers of area to cover. Even if the grid itself has two million cells, each cell still covers 250 square km. This isn’t terrible — 16x16x1 km cells (20 million of them, assuming they follow the usual practice of slabs 1 km thick) are small enough that they can actually resolve largish individual thunderstorms — but it is still orders of magnitude larger than distinct weather features like individual clouds or smaller storms or tornadoes or land features (lakes, individual hills and mountains) that can affect the weather.
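The cell-size arithmetic in this passage can be reproduced in a few lines. This is just a back-of-envelope sketch using the rounded figures quoted above (500 million km² of surface, 2 million horizontal cells, 10 vertical layers):

```python
# Back-of-envelope check of the grid arithmetic in the comment above.
# Figures are the rounded ones quoted in the text, not exact values.

EARTH_SURFACE_KM2 = 500e6   # Earth's surface area, rounded as in the comment
HORIZONTAL_CELLS = 2e6      # "a global grid of 2 million temperatures"
LAYERS = 10                 # ~10 slabs of 1 km up to the top of the troposphere

area_per_cell = EARTH_SURFACE_KM2 / HORIZONTAL_CELLS   # km^2 per horizontal cell
side = area_per_cell ** 0.5                            # side of a square cell, km
total_cells = HORIZONTAL_CELLS * LAYERS                # 3-D cell count

print(f"{area_per_cell:.0f} km^2 per cell")    # 250 km^2
print(f"~{side:.0f} x {side:.0f} km cells")    # ~16 x 16 km
print(f"{total_cells:.0f} total 3-D cells")    # 20000000
```

This recovers the numbers in the quote: 250 km² per cell, roughly 16×16 km on a side, and 20 million cells in three dimensions.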

There is also substantial error in their initial conditions — as you say, they smooth temperatures sampled at a lot fewer than 2 million points to cover vast tracts of the grid where there simply are no thermometers, and even where they have surface thermometers they do not generally have soundings (temperature measurements from e.g. balloons that ride up the air column at a location), so they do not know the temperature in depth. The model initialization has to do things like take the surface temperature guess (from a smoothing model) and guess the temperature profile overhead using things like the adiabatic lapse rate, a comparative handful of soundings, knowledge of the cloudiness of the cell obtained from satellite or radar (where available), or just plain rules of thumb (all built into a model used to initialize the model).

Then there is the ocean. Sea surface temperatures matter a great deal, but so do temperatures down to some depth (more for climate than for weather, but when large scale phenomena like hurricanes come along, the heat content of the ocean down to some depth very much plays a role in their development) so they have to model that, and the better models often contain at least one if not more layers down into the dynamic ocean. The Gulf Stream, for example, is a river in the Atlantic that transports heat and salinity and moves around 200 kilometers in a day on the surface, less at depth, which means that fluctuations in surface temperature, fed back or altered by precipitation or cloudiness or wind, move across many cells over the course of a day.  (My Bold)

Even with all of the care I describe above and then some, weather models computed at close to the limits of our ability to compute (and get a decent answer faster than nature “computes” it by making it actually happen) track the weather accurately for a comparatively short time — days — before small variations between the heavily modeled, heavily under-sampled model initial conditions and the actual initial state of the weather plus errors in the computation due to many things — discrete arithmetic, the finite grid size, errors in the implementation of the climate dynamics at the grid resolution used (which have to be approximated in various ways to “mimic” the neglected internal smaller scaled dynamics that they cannot afford to compute) cause the models to systematically diverge from the actual weather.

If they run the model many times with small tweaks of the initial conditions, they have learned empirically that the distribution of final states they obtain can be reasonably compared to the climate for a few days more in an increasingly improbable way, until around a week or ten days out the variation is so great that they are just as well off predicting the weather by using the average weather for a date over the last 100 years and a bit of sense, just as is done in almanacs.

In other words, the models, no matter how many times they are run or how carefully they are initialized, produce results with no “lift” over ordinary statistics at around 10 days. (My bold)
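The ensemble procedure described above — many runs from slightly perturbed initial conditions, then statistics over the bundle of trajectories — can be sketched with a toy chaotic system. The logistic map here is a stand-in chosen only because it is cheap and chaotic, not any kind of weather model:

```python
# Minimal sketch of perturbed-initial-condition ensemble forecasting.
# The "model" is the chaotic logistic map, a hypothetical stand-in.
import random

def toy_model(x0, steps=100, r=3.9):
    """Iterate the chaotic logistic map from initial state x0."""
    x = x0
    for _ in range(steps):
        x = r * x * (1 - x)
    return x

random.seed(0)
base_state = 0.5
# 50 runs, each from the base state plus a tiny perturbation
runs = [toy_model(base_state + random.uniform(-1e-6, 1e-6)) for _ in range(50)]

mean = sum(runs) / len(runs)          # the ensemble "central prediction"
spread = (min(runs), max(runs))       # the envelope of trajectories

print(f"ensemble mean: {mean:.3f}")
print(f"envelope: {spread[0]:.3f} .. {spread[1]:.3f}")
```

Even with perturbations of one part in a million, the final states scatter widely across the interval: this is the divergence of nearby trajectories that limits deterministic forecasts and forces the shift to ensemble statistics.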

Evolution of state-of-the-art Climate Models from the mid 70s to the mid 00s. From IPCC (2007)


Now How Climate Models Work:

Then here is the interesting point. Climate models are just weather models run in exactly this way, with one exception. Since they know that the model will produce results indistinguishable from ordinary static statistics two weeks in, they don’t bother initializing them all that carefully. The idea is that no matter how they initialize them, after running them out to weeks or months the bundle of trajectories they produce from small perturbations will statistically “converge” at any given time to what is supposed to be the long time statistical average, which is what they are trying to predict.

This assumption is itself dubious, as neither the weather nor the climate is stationary and it is most definitely non-Markovian so that the neglected details in the initial state do matter in the evolution of both, and there is also no theorem of which I am aware that states that the average or statistical distribution of a bundle of trajectories generated from a nonlinear chaotic model of this sort will in even the medium run be an accurate representation of the nonstationary statistical distribution of possible future climates. But it’s the only game in town, so they give it a try.

They then run this re-purposed, badly initialized weather model out until they think it has had time to become a “sample” for the weather for some stationary initial condition (fixed date, sunlight, atmosphere, etc) and then they vary things like CO_2 systematically over time while integrating and see how the run evolves over future decades. The bundle of future climate trajectories thus generated from many tweaks of initial conditions and sometimes the physical parameters as well is then statistically analyzed, and its mean becomes the central prediction of the model and the variance or envelope of all of the trajectories become confidence intervals of its predictions.

The problem is that they aren’t really confidence intervals because we don’t really have any good reason to think that the integration of the weather ten years into the future at an inadequate grid size, with all of the accumulation of error along the way, is actually a sample from the same statistical distribution that the real weather is being drawn from subject to tiny perturbations in its initial state. The climate integrates itself down to the molecular level, not on a 16×16 km grid, and climate models can’t use that small a grid size and run in less than infinite time, so the highest resolution I’ve heard of is 100×100 km^2 cells (10^4 square km, which is around 50,000 cells, not two million).
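The resolution comparison in this paragraph — a weather model's roughly 16×16 km cells against a climate model's 100×100 km cells — can be checked the same way (a sketch using the figures as quoted, with the Earth's actual surface area of about 510 million km²):

```python
# Sketch of the grid-resolution comparison in the passage above.
# Cell sizes are the figures quoted in the text.

EARTH_SURFACE_KM2 = 510e6       # Earth's surface area, ~510 million km^2

weather_cell_km2 = 16 * 16      # weather model: ~256 km^2 per cell
climate_cell_km2 = 100 * 100    # climate model: 10^4 km^2 per cell

weather_cells = EARTH_SURFACE_KM2 / weather_cell_km2   # ~2 million cells
climate_cells = EARTH_SURFACE_KM2 / climate_cell_km2   # 51,000 ("around 50,000")

print(f"weather grid: ~{weather_cells:,.0f} horizontal cells")
print(f"climate grid: ~{climate_cells:,.0f} horizontal cells")
print(f"each climate cell is {climate_cell_km2 / weather_cell_km2:.0f}x the area")
```

The climate grid works out to 51,000 cells, matching the "around 50,000" in the quote; each climate cell covers about 39 times the area of a weather cell, which is why individual thunderstorms vanish entirely at that resolution.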

At this grid size they cannot see individual thunderstorms at all. Indeed, many extremely dynamic features of heat transport in weather have to be modeled by some sort of empirical “mean field” approximation of the internal cell dynamics — “average thunderstormicity” or the like as thunderstorms in particular cause rapid vertical transport of a lot of heat up from the surface and rapid transport of chilled/chilling water down to the surface, among other things. The same is true of snowpack — even small errors in average snowpack coverage make big differences in total heat received in any given winter and this can feed back to kick a model well off of the real climate in a matter of years.

So far, it looks like (not unlike the circumstance with weather) climate models can sometimes track the climate for a decade or so before they diverge from it.   (My Bold)

They suffer from many other ailments as well — if one examines the actual month to month or year to year variance of the “weather” they predict, it has the wrong amplitude and decay times compared to the actual climate, which is basically saying (via the fluctuation-dissipation theorem) that they have the physics of the open system wrong. The models heavily exaggerate the effect of aerosols and tend to overreact to things like volcanic eruptions that dump aerosols into the atmosphere.

The models are tuned to cancel the exaggerated effect of aerosols with an exaggerated feedback on top of CO_2 driven warming to make them “work” to track the climate over a 20 year reference period. Sadly, this 20 year reference period was chosen to be the single strongest warming stretch of the 20th century, ignoring cooling periods and warming periods that preceded it and (probably as a consequence) diverging from the flat-to-slightly-cooling period we’ve been in for the last 16 or so years (or more, or less, depending on who you are talking to, but even the IPCC formally recognizes “the pause”, the hiatus, the lack of warming for this interval, in AR5). It is a serious problem for the models and everybody knows it.

The IPCC then takes the results of many GCMs and compounds all the errors by super-averaging their results (which has the effect of hiding the fluctuation problem from inquiring eyes), ignoring the fact that some models truly suck in all respects at predicting the climate while others do much better (because the ones that do better predict less long-run warming, and that isn’t the message they want to convey to policy makers), and transforms the envelope into a completely unjustifiable assertion of “statistical confidence”.

This is a simple lie. Each model, one at a time, can have the confidence interval produced by the spread in long-run trajectories from perturbing its initial conditions compared to the actual trajectory of the climate and turned into a p-value. The p-value measures how improbable our particular real climate would be if the null hypothesis were true — “This climate model is a perfect model in that its bundle of trajectories is a representation of the actual distribution of future climates”. If that probability is low, especially if it is very low, we would under ordinary circumstances reject the huge bundle of assumptions tied up as “the hypothesis” represented by the model itself and call the model “failed”, back to the drawing board.

One cannot do anything with the super-average of 36-odd non-independent grand average per-model results. To even try to apply statistics to this shotgun blast of assumptions one has to use something called the Bonferroni correction, which basically makes the p-value for failure of individual models in the shotgun blast much, much larger (because they have 36 chances to get it right, which means that even if all 36 are wrong, pure chance can — no, probably will — make a bad model come out within a p = 0.05 cutoff, as long as the models aren’t too wrong yet).

By this standard, “the set of models in CMIP5” has long since failed. There isn’t the slightest doubt that their collective prediction is statistical nonsense. It remains to be seen if individual models in the collection deserve to be kept in the running as not failed yet, because even applying the Bonferroni correction to the “ensemble” of CMIP5 is not good statistical practice. Each model should really be evaluated on its own merits, as one doesn’t expect the “mean” or “distribution” of individual model results to have any meaning in statistics (note that this is NOT like perturbing the initial conditions of ONE model, which is a form of Monte Carlo statistical sampling and is something that has some actual meaning).
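The Bonferroni point can be illustrated with a toy calculation. These are hypothetical numbers for illustration, not an actual CMIP5 evaluation, and the runs are treated as independent purely for simplicity:

```python
# Toy illustration of the multiple-comparisons problem described above:
# with 36 chances to "pass", a nominal p = 0.05 cutoff is far too lenient.
# Hypothetical numbers, not actual CMIP5 statistics.

n_models = 36
alpha = 0.05

# Probability that at least one of 36 wrong models clears p = 0.05 by chance
# (treating the models as independent for the sake of the illustration):
p_any_pass = 1 - (1 - alpha) ** n_models
print(f"chance at least one bad model passes: {p_any_pass:.2f}")  # ~0.84

# Bonferroni-corrected per-model cutoff keeping the family-wise rate at 0.05:
alpha_corrected = alpha / n_models
print(f"Bonferroni per-model cutoff: {alpha_corrected:.4f}")      # ~0.0014
```

With 36 tries, even a collection of entirely wrong models has an 84% chance that at least one slips under the uncorrected cutoff, which is why each model must be held to a far stricter per-model threshold (or, better, evaluated on its own merits as the comment argues).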

Hope this helps.

rgb
http://wattsupwiththat.com/2015/06/09/huge-divergence-between-latest-uah-and-hadcrut4-revisions-now-includes-april-data/#comment-1960561

In the conclusion of a recent paper, Valerio Lucarini adds:

We have briefly recapitulated some of the scientific challenges and epistemological issues related to climate science. We have discussed the formulation and testing of theories and numerical models, which, given the presence of unavoidable uncertainties in observational data, the nonrepeatability of world-experiments, and the fact that relevant processes occur in a large variety of spatial and temporal scales, require a rather different approach than in other scientific contexts.

In particular, we have clarified the presence of two different levels of unavoidable uncertainties when dealing with climate models, related to the complexity and chaoticity of the system under investigation. The first is related to the imperfect knowledge of the initial conditions, the second is related to the imperfect representation of the processes of the system, which can be referred to as structural uncertainties of the model. We have discussed how Monte Carlo methods provide partial but very popular solutions to these problems. A third level of uncertainty is related to the need for a, definitely non-trivial, definition of the appropriate metrics in the process of validation of the climate models. We have highlighted the difference between metrics aimed at providing information of great relevance for the end-user from those more focused on the audit of the most important physical processes of the climate system.

It is becoming clearer and clearer that the current strategy of incremental improvements of climate models is failing to produce a qualitative change in our ability to describe the climate system, also because the gap between the simulation and the understanding of the climate system is widening (Held 2005, Lucarini 2008a). Therefore, the pursuit of a “quantum leap” in climate modeling – which definitely requires new scientific ideas rather than just faster supercomputers – is becoming more and more of a key issue in the climate community (Shukla et al. 2009).

Lucarini goes further, proposing a thermodynamic perspective:

While acknowledging the scientific achievements obtained along the above-mentioned lines, we propose a different approach for addressing the big picture of a complex system like the climate. An alternative way of providing a new, satisfactory theory of climate dynamics able to tackle simultaneously balances of physical quantities and dynamical instabilities is to adopt a thermodynamic perspective, along the lines proposed by Lorenz (1967). We consider simultaneously two closely related approaches: a phenomenological outlook based on the macroscopic theory of non-equilibrium thermodynamics (see e.g., de Groot and Mazur 1962), and a more fundamental outlook based on the paradigm of ergodic theory (Eckmann and Ruelle 1985) and more recent developments of non-equilibrium statistical mechanics (Ruelle 1998, 2009).

The concept of the energy cycle of the atmosphere introduced by Lorenz (1967) allowed for defining an effective climate machine such that the atmospheric and oceanic motions simultaneously result from the mechanical work (then dissipated in a turbulent cascade) produced by the engine, and re-equilibrate the energy balance of the climate system. One of the fundamental reasons why a comprehensive understanding of climate dynamics is hard to achieve lies in the presence of such a nonlinear closure. Recently, Johnson (2000) introduced a Carnot engine–equivalent picture of the climate system by defining effective warm and cold reservoirs and their temperatures.

From Modelling Complexity: the case of Climate Science, V. Lucarini

Click to access 1106.1265.pdf

For more on Climate models see:

https://rclutz.wordpress.com/2015/03/24/temperatures-according-to-climate-models/

https://rclutz.wordpress.com/2015/03/25/climate-thinking-out-of-the-box/

Spitsbergen Triangle: Ground Zero for Climate Mysteries

Credit to Dr. Bernaerts for his writings on this subject, excerpts of which appear below.

The Island Nexus for Ocean Currents

From the Dutch: spits – pointed, bergen – mountains

Spitsbergen is the largest and only permanently populated island of the Svalbard archipelago in northern Norway. Constituting the westernmost bulk of the archipelago, it borders the Arctic Ocean, the Norwegian Sea, and the Greenland Sea. Spitsbergen covers an area of 39,044 km2 (15,075 sq mi), making it the largest island in Norway and the 36th-largest in the world.

The fact is that the winter temperatures made a jump of more than eight degrees Celsius at the gate of the Arctic Basin, after 1918. Nowadays, one century later, the event is still regarded as “one of the most puzzling climate anomalies of the 20th century”.

Dr. Bernaerts:

The overriding aspect of the location is the sea; the sea around Spitsbergen, particularly the Norwegian, the Greenland, and the Barents Seas (the Nordic Sea). The Norwegian Sea is a huge basin, 3000 metres deep. This huge water mass stores a great amount of energy, which can transfer warmth into the atmosphere for a long time. In contrast the Barents Sea, to the southeast of Spitsbergen, has an average depth of just around 230 metres. In- and outflow are so high that the whole water body is completely renewed in less than 5 years. However, both sea areas are strongly influenced by the water masses coming from the south. The most important element is a separate branch of the North Atlantic Gulf Current, which brings very warm and very salty water into the Norwegian Sea and into the Spitsbergen region. Water temperature and degree of saltiness play a decisive role in the internal dynamics of the sea body. And what might be the role of the huge basin of the Arctic Ocean, 3000 metres deep and about 15 million square kilometres in size?

The difference from the other seas mentioned is tremendous. The Arctic Ocean was widely ice covered in the first half of the 20th century; the other seas only partly, on a seasonal basis. Intensive heat transfer takes place continuously only between the open sea and the atmosphere. Compact sea ice reduces this transfer by about 90% or more; broken or floating ice changes the proportion only marginally. In this respect an ice-covered Arctic Ocean has not an oceanic but a ‘continental’ impact on the climate.

The Arctic Ocean is permanently supplied with new water from the Gulf Current, which enters the sea close to the surface near Spitsbergen. This current is called the West Spitsbergen Current. The arriving water is relatively warm (6 to 8°C) and salty (35.1 to 35.3‰) and has a mean speed of ca. 30 cm/sec. The warm Atlantic water represents almost 90% of all water masses the Arctic receives; the other ~10% comes via the Bering Strait or rivers. Because the warm Atlantic water usually reaches the edge of the Arctic Ocean at Spitsbergen in open water, the cooling process starts well before it enters the Polar Sea.

A further highly significant climate aspect of global dimension is the water masses the Arctic releases back to the oceans. The outflow occurs mainly via the Fram Strait between Northeast Greenland and Spitsbergen; together with very cold water from the Norwegian Sea basin, this deep water spreads below the permanent thermocline into the three oceans.

http://www.arctic-heats-up.com/pdf/chapter_2.pdf

The Spitsbergen Event 1918-1919

Beginning around 1850 the Little Ice Age ended and the climate began warming. Before that, at least since 1650, which marked the first climatic minimum after the Medieval Warm Period, the Little Ice Age brought bitterly cold winters to many parts of the world, most thoroughly documented in the Northern Hemisphere in Europe and North America. Decreased solar activity and increased volcanic activity are considered the causes. The subsequent recovery was slow, and was set back once more by the last major volcanic eruption, that of Krakatoa in 1883. Up to the 1910s the warming of the world was modest.

Suddenly that changed. In the Arctic the temperatures literally exploded in winter 1918/19. The extraordinary event, lasting from 1918 to 1939, is clearly demonstrated in the graph showing the ‘Arctic Annual Mean Temperature Anomalies 1880 – 2004’. This extraordinary event has a number of facets which could have been researched and explained. Meanwhile almost a full century has passed, and what do we know about this event today? Very little!

Studies considering the causation of the warming offer sketchy rather than well founded ideas. Here are a few examples:
• Natural variability is the most likely cause (Bengtsson, 2004).
• We theorize that the Arctic warming in the 1920s/1930s was due to natural fluctuations internal to the climate system (Johannessen, 2004).
• The low Arctic temperatures before 1920 were caused by volcanic aerosol loading and solar radiation, but after 1920 increasing greenhouse gas concentrations dominated the temperatures (Overpeck, 1997).
• The earlier warming shows large region-to-region, month-to-month, and year-to-year variability, which suggests that these composite temperature anomalies are due primarily to natural variability in weather systems (Overland, 2004).
• A combination of a global warming signal and fortuitous phasing of intrinsic climate patterns (Overland, 2008).

Arctic Regime Change

These explanations (and others, such as CO2 or the AMOC) do not come to grips with how extreme and abrupt this event was. In the spring of 1917, sea ice reached all the way to Spitsbergen, the only time in a century.

And the next year, temperatures rocketed upward, as shown by the weather station there:

A look at the SST history shows clearly an event as dramatic as a super El Nino causing a regime change. But this is the Atlantic, not the Pacific. Cooling followed, but temperatures stayed at a higher level than before.

Summary

The warming at Spitsbergen is one of the most outstanding climatic events since the volcanic eruption of Krakatoa in 1883. The dramatic warming at Spitsbergen may hold key aspects for understanding how climate ticks. The following elaboration approaches the matter from different angles, but along a straight line of thought, namely:

  • WHERE: the warming was caused and sustained by the northern part of the Nordic Sea, in the sea area of West Spitsbergen, the passage of the West Spitsbergen Current.
  • WHEN: the commencement of the warming can be dated with high precision, to within a few months; it was definitely in place by January 1919.
  • WHY: the sudden and significant temperature deviation around the winter of 1918/19 was with considerable probability caused, at least partly, by the devastating naval war which took place around the British Isles between 1914 and 1918.

There is much more evidence and analysis supporting Dr. Bernaerts’ conclusions here:

http://climate-ocean.com/arctic-book/index.html


Conclusion: Unless your theory of climate change can make sense of the Spitsbergen Event, it cannot inspire confidence. You may not be entirely convinced by Dr. Bernaerts’ explanation, but at least he has one; nobody else has even tried.