IPCC Racketeers Order Hit on Exxon

A lot of alarmist voices are charging Exxon with all kinds of misdeeds with respect to climate science. The usual suspects are involved, including Bill McKibben, Naomi Oreskes and Bob Ward.

InsideClimateNews broke the story, with the others piling on. Exxon is fighting back and tells its story here:  http://www.exxonmobilperspectives.com/2015/10/21/when-it-comes-to-climate-change-read-the-documents/

The documents referred to are here.

Exxon’s Position:

“Reading the documents shows that these allegations are based on deliberately cherry-picked statements attributed to various ExxonMobil employees to wrongly suggest definitive conclusions were reached decades ago by company researchers. These statements were taken completely out of context and ignored other readily available statements demonstrating that our researchers recognized the developing nature of climate science at the time which, in fact, mirrored global understanding.

What these documents actually demonstrate is a robust culture of scientific discourse on the causes and risks of climate change that took place at ExxonMobil in the 1970s and ’80s and continues today. They point to corporate efforts to fill the substantial gaps in knowledge that existed during the earliest years of climate change research.

They also help explain why ExxonMobil would work with the Intergovernmental Panel on Climate Change and leading universities like MIT and Stanford on ways to expand climate science knowledge.”

The Royal Society

The list of documents includes an exchange with the Royal Society and its spokesman, Bob Ward. He criticizes Exxon’s publications for not saying the same things as IPCC documents. He accuses Exxon of funding “organizations that have been misinforming the public about the science of climate change.” That sounds very much like the RICO20 letter.

Kenneth Cohen of ExxonMobil responded in a letter to Lord Rees, President of the Royal Society at that time:

“The Royal Society should welcome the diversity of opinions on all scientific issues. Taking the position that any person or organization that disagrees with the Royal Society on an important scientific issue should be publicly vilified is surely counterproductive for the development of scientific theory, ignores freedom of expression and is hardly consistent with the Society’s stated objective of promoting excellence in science.”

Cohen’s full letter is here:
http://insideclimatenews.org/sites/default/files/documents/Exxon%20Letter%20to%20Royal%20Society%20%282006%29.pdf

Exxon says that it is part of the solution, not the problem, and asks people to read the documents and decide for themselves.  Sounds reasonable.

Background on RICO and IPCC:

https://rclutz.wordpress.com/2015/09/18/anti-racketeering-initiative/

To My Prime Minister, Justin Trudeau

Congratulations on winning a majority in the election and earning the opportunity to form the government. I voted for Ramez Ayoub, your newly elected MP in my riding north of Montreal.

The point of this letter is to alert you to the issue of climate change. It was little discussed during the campaign, but it will be immediately forced onto your attention due to the conference next month in Paris.

The knee-jerk reaction would be to declare the Conservatives wrong on this issue and that Liberals will demonstrate change by reversing Canada’s position. That would be unfortunate and premature, considering all of the pitfalls and ramifications tied to this.

For an example of how to mismanage this issue, you need only look south to the US self-imposed predicament. President Obama picked a radical environmentalist, John Holdren, as his science adviser. Uncritically following that advice, Obama has now painted himself into an ideological corner, and will find it difficult to deny claims for payment of reparations from dozens of developing countries.

You could make the same mistake by appointing David Suzuki as your science adviser. He is a renowned environmentalist and biologist, but has no expertise in climate science, energy or economics. The so-called climate consensus surveys of scientists carefully excluded anyone not working for government or academia. That sort of unbalanced approach is wrong-headed.

Despite the pressure to make early commitments on this issue, I urge you to keep a cool head, have a scientific curiosity, and pick a team of advisers providing a balance of environmental and industrial perspectives. You might want to make an announcement that your government will respect the scientific and economic realities concerning the climate, including attention to cost-benefit analyses of policy proposals.

Sincerely,

Ronald R. Clutz
Therese-De-Blainville Riding

https://rclutz.wordpress.com/category/science-and-society/


Just Say No!

There was a time when our leaders appealed to reason and common sense:

But today, CO2 Fever is upon us, and there is no CO2 rumor too outrageous to be broadcast, repeated and exaggerated.

Just in the last few hours, we have these headlines (from Google News) threatening global warming:

Climate change clips wings of migratory birds
Miami and New Orleans will sink
Global warming could lead to worldwide wars
Coral reefs are dying
25 million Americans could lose their homes to global warming and rising seas
Ocean food chains will collapse
Climate change major threat to global economic stability
Etc., etc., etc.

Meanwhile, the good news about CO2 is not mentioned in the press, unless you look very hard for it.

London 12 October: In an important new report published today by the Global Warming Policy Foundation, former IPCC delegate Dr Indur Goklany calls for a reassessment of carbon dioxide, which he says has many benefits for the natural world and for humankind.

Dr Goklany said: “Carbon dioxide fertilises plants, and emissions from fossil fuels have already had a hugely beneficial effect on crops, increasing yields by at least 10-15%. This has not only been good for humankind but for the natural world too, because an acre of land that is not used for crops is an acre of land that is left for nature”.

http://www.thegwpf.org/climate-doomsayers-ignore-benefits-of-carbon-dioxide-emissions/

In the Foreword, world-renowned physicist Freeman Dyson says this:

“To any unprejudiced person reading this account, the facts should be obvious: that the non-climatic effects of carbon dioxide as a sustainer of wildlife and crop plants are enormously beneficial, that the possibly harmful climatic effects of carbon dioxide have been greatly exaggerated, and that the benefits clearly outweigh the possible damage.”
The full document can be accessed here:

Click to access benefits1.pdf

CO2 hysteria is addictive. Here’s what it does to your brain:

Just say No!

Ice House of Mirrors

In the fable of Snow White, the evil stepmother asked her magic mirror: “Who is the fairest in the land?” The mirror had always flattered her, but this time it did not. As we all know, the mirror told the truth; the queen was angry, and people had to suffer.

We expect mirrors to tell the truth, to show us the objective reality. That is why it is amusing at the carnival sideshow to gaze into mirrors that make normal people look obese or like a beanpole, or otherwise distort one’s appearance.

This post is about what Arctic Ice Extent looks like in the mirrors available to the public.

Mirror #1

If you wonder what is happening with Arctic Ice, the first (and maybe only) depiction you encounter will look like this:

I call this The Incredibly Shrinking Ice Mirror, because it is certainly scary (Death Spiral comes to mind). Ice is obviously going to hell in a hand-basket. Once your fright abates, you might wonder about the scale or how much ice is measured. And you might notice this is about September; other months of the year are excluded from view.

You wouldn’t know from this chart that scary looking 2012 had one of the higher March maximums and on average was not so far out. But that wouldn’t be as exciting.

Mirror #2

Just for fun, let’s make a mirror from the same dataset, just one month later. I call this The Incredibly Persisting Ice Mirror.

[Chart: October monthly average Arctic ice extents]

You won’t find this one on the Internet because it is politically incorrect and pretty boring. But it is just as valid as Mirror #1. It shows a very slow, unalarming decline, with something unusual in 2007 but recovery after that.

Is it a distortion? Absolutely, it is incomplete in the same way as Mirror #1, but gives the opposite impression, just by choosing a different month. But at least this one informs your impression with the actual monthly average extents in M km2 (no anomalies or %s).

Mirror #3

There are more informative pictures of Arctic Ice dynamics if you look for them. For example, there is this presentation of the complete dataset:

NOAA NH Ice Extent

I call this The Bird’s Eye Ice Mirror because you can see the big picture from the satellite passive microwave sensors: the full range of annual variation, the actual measured extents and averages. Note the trend line looks much more like October than September. Is there a reason September is preferred?

Mirror #4

As I have pointed out, there are other views of ice extent patterns, such as this:

[Chart: MASIE annual Arctic ice extents]

Let’s call it The Navigator’s Eye Ice Mirror because it is the accumulation of the ice extents you could expect to observe at sea level, from buoys or from the deck of a ship operating in the Arctic region (source: MASIE ice charts).

Of course, there are other mirrors trying in their own ways to tell us about Arctic Ice.

Here are two Magnifying Ice Mirrors, giving you closeups of what is happening with the ice:

JAXA 2006 to 2015

Let’s not forget The Rear-view Ice Mirrors showing that there were ice observations long before the satellite record started in 1979.

Figure 16-3: Time series of April sea-ice extent in Nordic Sea (1864-1998) given by 2-year running mean and second-order polynomial curves. Top: Nordic Sea; middle: eastern area; bottom: western area (after Vinje, 2000). IPCC Third Assessment Report

Conclusion:

We know as a fact of life that any mirror contains some distortion or bias, even those trying to tell the truth. So it is wise to look at several of them, and pay attention to the frames, before concluding what is happening. Be sure to have a chuckle when you pass by Mirror #1, although setting energy policies and investing billions of research dollars based on that distortion is not amusing.

Footnote:

Some commentators wondered whether the statistics are affected by icebreakers.  I don’t know, but there is evidence of a Norwegian icebreaker in operation:


Pause Deniers Busted

Updates October 3 and 30 below

With Paris COP drawing near, the lack of warming this century is inconvenient and undermines the cause.

As Dr. Judith Curry said, “I have been expecting to start seeing papers on the ‘hiatus is over.’ Instead I am seeing papers on ‘the hiatus never happened.’”

One paper that was trumpeted came out of my alma mater, Stanford.  It garnered the expected headlines from the usual places:

Global Warming “Hiatus” Never Happened: Eos

There never was any global warming “pause”: Washington Post

The text is here: Debunking the climate hiatus

http://download.springer.com/static/pdf/969/art%253A10.1007%252Fs10584-015-1495-y.pdf?originUrl=http%3A%2F%2Flink.springer.com%2Farticle%2F10.1007

The write-up has statistical razzle-dazzle and lots of opaque sentences, but let’s not get lost in the weeds.

Let’s not talk about the multiple tamperings to the land records they chose to study.  Let’s even overlook their including the bogus upward adjustments to the SSTs by Karl et al.  Bob Tisdale dissected that here: https://bobtisdale.wordpress.com/2015/06/04/noaancdcs-new-pause-buster-paper-a-laughable-attempt-to-create-warming-by-adjusting-past-data/

We don’t need to get into the technicalities of why they stopped with 2013 data, the suitability of the tests applied or their interpretations of the results.

Here’s what you need to know about this study:

They ignored the satellite records (RSS and UAH), the gold standard of temperature measurements, because the absence of warming there is undeniable.

For the land and ocean datasets they analyzed, they ignored the huge divergence between observations and the predictions (projections) from climate models.

Conclusion:

Natural variability in the climate system has neutralized any warming from increased CO2 this century, and also offset most, if not all of the secular rise in temperature since the Little Ice Age.  The models did not forecast this; they can only project warming, and do so at rates several times higher than observations.  The models fail for three reasons:  high sensitivity to CO2; positive feedback from water vapor; and lack of thermal inertia by the oceans.

For more on climate models and temperature projections:
https://rclutz.wordpress.com/2015/03/24/temperatures-according-to-climate-models/

The Stanford football team was impressive beating highly-rated Southern Cal on their home field last Saturday.  The work of the research team, however, looks like pandering rather than science.  They need to up their game: No cookies.

Update October 3

I found the time to look into the details of this paper and the statistical trick comes to light.

They took as the null hypothesis: “Temperatures are not rising.”  After applying several statistical tests, they conclude that the statement is not supported by the data, so we cannot say with certainty temperatures are not rising.

And what about the other null hypothesis: “Temperatures are rising.”  Silence.

I suspect they didn’t want to admit that the same statistical tests would also disprove that statement.

A reasonable person concludes: when you cannot say for sure that temperatures are rising, nor that they are not rising, that would surely indicate a plateau in temperatures.
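To make that asymmetry concrete, here is a toy sketch of my own (not code from the Stanford paper; the flat series and the 0.02-per-step “model trend” are invented for illustration). It fits an ordinary least-squares trend and tests it against both null hypotheses:

```python
import math

def ols_slope(y):
    """OLS slope and its standard error for y against time index 0..n-1."""
    n = len(y)
    mx, my = (n - 1) / 2, sum(y) / n
    sxx = sum((i - mx) ** 2 for i in range(n))
    slope = sum((i - mx) * (yi - my) for i, yi in enumerate(y)) / sxx
    intercept = my - slope * mx
    ssr = sum((yi - intercept - slope * i) ** 2 for i, yi in enumerate(y))
    se = math.sqrt(ssr / (n - 2) / sxx)
    return slope, se

# A flat series with a deterministic wiggle standing in for noise.
flat = [0.1 * (-1) ** i for i in range(30)]
slope, se = ols_slope(flat)

t_vs_zero = slope / se             # H0: "temperatures are not rising"
t_vs_model = (slope - 0.02) / se   # H0: "temperatures are rising at 0.02/step"

# |t| < 2 means the data cannot reject that hypothesis.  The flat series
# fails to reject "no trend" yet decisively rejects the warming trend.
print(abs(t_vs_zero) < 2, abs(t_vs_model) > 2)
```

Testing only the first hypothesis, as the paper did, hides the fact that the same data can speak clearly against the second.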

Update October 30–Another classic from Josh

A Welcome Voice in the Climate Debate

Someone has written a much-needed book, adding a welcome voice to the rational consideration of climate matters. The book is entitled Doubt and Certainty in Climate Science; it is free and can be downloaded here:

Click to access longhurst-final.pdf

About the author

Alan R. Longhurst is a biological oceanographer who has studied the ecology of the continental shelf of the Gulf of Guinea (1954-63), and the trophic structure and flux of energy through the pelagic ecosystems of the eastern Pacific (1963-71), the Barents Sea (1973), the Canadian Arctic (1983-89) and the Northwest Atlantic (1978-94). He coordinated the international EASTROPAC expeditions in the 1960s and directed the NOAA SW Science Center on the Scripps campus at La Jolla (1967-71), the Marine Ecology Laboratory at the Bedford Institute of Oceanography (1977-79) and was Director-General of that Institute (1970-86). He has published 80-odd research papers and his most recent books are “Ecological Geography of the Sea” (Elsevier, 1998 & 2007) and “Mismanagement of Marine Fisheries” (Cambridge, 2010).

I recommend this climate science book as readable, thorough, considered and well-documented.  He also offers insightful personal experiences from his oceanographic career. I particularly appreciate his emphasis on the ocean’s complex role in climate dynamics. His discussion of surface temperature measurements also has echoes in my own analyses of the records.

For a review and overview by Dr. Judith Curry, her post is here:

http://judithcurry.com/2015/09/20/new-book-doubt-and-certainty-in-climate-science/

Longhurst concludes with this:

Perhaps the one thing that would shake the collective certainty would be if the simple, single value used to represent global surface temperature continued to languish at around the same value as it has for the last 15 years for, say, another 5 years? Of course, it may not – simply because the next Nino will quickly reduce the area of cold, upwelled water exposed at the sea surface and global SST will suddenly rise, as it did in 1998. In fact, as I write, this is occurring and the anticipated announcement has already been made by NOAA that this year we experienced the warmest July ever recorded.

But if a new Gleissberg cycle makes itself felt when the equatorial Pacific has settled back into its ‘normal’ Trade Wind state, and if the new cycle overwhelms the effect on SAT measurements of urbanisation and land use change so that the GSMT index cools significantly, then the earth sciences will have a heavy bill to be paid in the arena of public support. And the more so if a Convention concerning measures agreed to be taken has already been signed into effect…

The Climates, They are A-changing.

Updated below with comments and additional links September 17-19

Seeing a lot more of this lately, along with hearing the geese honking. And in the next month or so, we expect that trees around here will lose their leaves. It definitely is climate change of the seasonal variety.

Interestingly, the science on this is settled: It is all due to reduction of solar energy because of the shorter length of days (LOD). The trees drop their leaves and go dormant because of less sunlight, not because of lower temperatures. The latter is an effect, not the cause.

Of course, the farther north you go, the more remarkable the seasonal climate change. St. Petersburg, Russia has their balmy “White Nights” in June when twilight is as dark as it gets, followed by the cold, dark winter and a chance to see the Northern Lights.

And as we have been monitoring, the Arctic ice has been melting from sunlight in recent months, but will now begin to build again in the darkness to its maximum in March.

We can also expect in January and February for another migration of millions of Canadians (nicknamed “snowbirds”) to fly south in search of a summer-like climate to renew their memories and hopes. As was said to me by one man in Saskatchewan (part of the Canadian wheat breadbasket region): “Around here we have Triple-A farmers: April to August, and then Arizona.” Here’s what he was talking about: Quartzsite Arizona annually hosts 1.5M visitors, mostly between November and March.

Of course, this is just North America. Similar migrations occur in Europe, and in the Southern Hemisphere the climates are changing in the opposite direction; it is currently springtime there. Since it is so obviously the sun causing this seasonal change, the question arises: does the sunlight vary on longer-than-annual timescales?

The Solar-Climate Debate

And therein lies a great, enduring controversy between those (like the IPCC) who dismiss the sun as a driver of multi-decadal climate change, and those who see a connection between solar cycles and Earth’s climate history. One side can be accused of ignoring the sun because of a prior commitment to CO2 as the climate “control knob”.

The other side is repeatedly denounced as “cyclomaniacs” in search of curve-fitting patterns to prove one or another thesis. It is also argued that a claim of 60-year cycles cannot be validated with only 150 years or so of reliable data. That point has weight, but it is usually made by those on the CO2 bandwagon, despite temperature and CO2 trends correlating for only two decades during the last century.

One scientist in this field is Nicola Scafetta, who presents the basic concept this way:

“The theory is very simple in words. The solar system is characterized by a set of specific gravitational oscillations due to the fact that the planets are moving around the sun. Everything in the solar system tends to synchronize to these frequencies beginning with the sun itself. The oscillating sun then causes equivalent cycles in the climate system. Also the moon acts on the climate system with its own harmonics. In conclusion we have a climate system that is mostly made of a set of complex cycles that mirror astronomical cycles. Consequently it is possible to use these harmonics to both approximately hindcast and forecast the harmonic component of the climate, at least on a global scale. This theory is supported by strong empirical evidences using the available solar and climatic data.”

He goes on to say:

“The global surface temperature record appears to be made of natural specific oscillations with a likely solar/astronomical origin plus a noncyclical anthropogenic contribution during the last decades. Indeed, because the boundary condition of the climate system is regulated also by astronomical harmonic forcings, the astronomical frequencies need to be part of the climate signal in the same way the tidal oscillations are regulated by soli-lunar harmonics.”

He has concluded that “at least 60% of the warming of the Earth observed since 1970 appears to be induced by natural cycles which are present in the solar system.” For the near future he predicts a stabilization of global temperature until about 2016 and cooling until 2030-2040.

https://tallbloke.wordpress.com/2014/07/28/nicola-scafetta-global-temperatures-and-sunspot-numbers-are-they-related-yes-but-non-linearly/

A Deeper, but Accessible Presentation of Solar-Climate Theory

I have found this presentation by Ian Wilson to be persuasive while honestly considering all of the complexities involved.

The author raises the question: What if there is a third factor that not only drives the variations in solar activity that we see on the Sun but also drives the changes that we see in climate here on the Earth?

The linked article is quite readable by a general audience, and comes to a similar conclusion as Scafetta above: there is a connection, but it is not simple cause and effect. And yes, length of day (LOD) is a factor beyond the annual cycle.

Click to access IanwilsonForum2008.pdf

It is fair to say that we are still at the theorizing stage of understanding a solar connection to earth’s climate. And at this stage, investigators look for correlations in the data and propose theories (explanations) for what mechanisms are at work. Interestingly, despite the lack of interest from the IPCC, solar and climate variability is a very active research field these days.

A summary of current studies is provided at NoTricksZone:

http://notrickszone.com/2015/09/14/already-23-papers-supporting-sun-as-major-climate-factor-in-2015-burgeoning-evidence-no-longer-dismissible/#sthash.2MviVRWR.dpbs

Ian Wilson has much more to say at his blog: http://astroclimateconnection.blogspot.com.au/

Once again, it appears that the world is more complicated than a simple cause and effect model suggests.

For everything there is a season, a time for every purpose under heaven.

What has been will be again, what has been done will be done again; there is nothing new under the sun.

(Ecclesiastes 3:1 and 1:9)

Update Sept. 17: Commentary with Dr. Arnd Bernaerts

ArndB comments:

Fine writing, Ron, well done!
No doubt the sun is by far the most important factor in our not living on a globe with temperatures down to minus 200°C. That makes me hesitant to comment on “solar and climate variability” or “the sun drives climate” (currently at NTZ – link above). Today I merely request, humbly, that the claimed correlation be based on at least some evidence that the sun has ever caused a significant climatic shift during the last one million years, beyond a bit of air-temperature variability due to solar cycles, which necessarily occurs in correlation with the intake and release of solar radiation by the oceans and seas.

Interestingly, the UK Met Office just released a report (Sept. 2015, 21 pages) titled:
“Big Changes Underway in the Climate System?”
attributing the most likely changes to the current status of El Niño, the PDO and the AMO, and, of course, carbon dioxide, with a bit of speculation on reduced solar energy (see the following excerpt at the link)

Click to access Changes_In_The_Climate_System.pdf

From p. 13: “It is well established that trace gases such as carbon dioxide warm our planet through the “greenhouse effect”. These gases are relatively transparent to incoming sunlight, but trap some of the longer-wavelength radiation emitted by the Earth. However, other factors, both natural and man-made, can also change global temperatures. For example, a cooling could be caused by a downturn of the amount of energy received from the sun, or an increase in the sunlight reflected back to space by aerosol particles in the atmosphere. Aerosols increase temporarily after volcanic eruptions, but are also generated by pollution such as sulphur dioxide from factories.
These “external” factors are imposed on the climate system and may also affect the ENSO, PDO and AMO variations…”

My Reply:

Thanks Arnd for engaging in this topic.

My view is that the ocean makes the climate by means of its huge storage of solar energy and the fluctuations and oscillations in the processes that distribute that energy globally and to the poles. In addition, the ocean is the most affected by any variation in incoming solar energy, both from the sun putting out more or less, and from clouds and aerosols blocking more or less of the incoming radiation (albedo, or brightness variability).

https://rclutz.wordpress.com/2015/04/21/the-climate-water-wheel/

The oscillations you mention, including the present El Nino (and Blob) phenomenon, show natural oceanic variability over years and decades. Other ocean cycles occur over multi-decadal and centennial scales, and are still being analyzed.

At the other end of the scale, I am persuaded that the earth switches between the “hot house” and the “ice house” mainly due to orbital cycles, which are an astronomical phenomenon. These are strong enough to overwhelm the moderating effect of the ocean thermal flywheel.

The debate centers on the extent to which solar activity has contributed to climate change over the last 3000 years of our current interglacial period, including current solar cycles.

Update September 19

Additional studies showing a solar-climate connection are here: https://translate.google.com/translate?hl=&sl=de&tl=en&u=http%3A%2F%2Fwww.kaltesonne.de%2Fsonne-macht-klima-neues-aus-europa%2F

Everywhere Elsewhere Climate Claims

We often hear reports that something is occurring around the world, and then someone responds: “That’s not happening where I live.” And the rebuttal is, “Your neighborhood is not typical of the rest of the world.” In other words, the claim is: this trend is going on everywhere elsewhere despite your not observing it.

For a month now we have been reading in the media about how July was the hottest month in recorded history.

“July was Earth’s hottest month on record, NOAA says” http://www.bbc.com/news/world-us-canada-34009289

And at the same time, we read reports about how cool the summer was in Canada, in the US, in the UK, in parts of Europe and how cold was the winter in Australia.

“What a washout! A British summer to forget. In the UK July was colder than average, and we had 140% of average rainfall.” http://www.theguardian.com/uk-news/2015/aug/16/washout-british-summer-witness-holiday-experts

“The July contiguous U.S. average temperature was 73.9°F, 0.2°F above the 20th century average and ranked near the middle in the 121-year period of record.” http://www.ncdc.noaa.gov/sotc/national/201507

“Wetter than normal summer for most of Canada except B.C.” http://www.vancitybuzz.com/2015/08/wetter-than-normal-summer-canada-except-bc/

“A large swath stretching from eastern Scandinavia into western Siberia was cooler than average, with part of western Russia much cooler than average. Cooler than average temperatures were also observed across parts of eastern and southern Asia and scattered areas in central and northern North America.” (Source: NOAA)

So the question arises: Is there global warming unseen in most observations? How would we know what was observed in July and whether it was unusual or not?

NOAA provides this analysis of July 2015.

Continental Temperature Anomalies July 2015

Continent       Anomaly (1910-2000)     Trend (1910-2015)     Rank (out of 106 years)
                °C      °F              °C      °F
North America   0.53    0.95            0.08    0.14          16th warmest / 90th coolest (ties: 1941)
South America   1.43    2.57            0.14    0.25          5th warmest / 102nd coolest
Europe          1.53    2.75            0.12    0.21          6th warmest / 101st coolest
Africa          1.20    2.16            0.10    0.18          2nd warmest / 105th coolest
Asia            0.70    1.26            0.07    0.13          10th warmest / 97th coolest
Oceania         0.57    1.03            0.11    0.19          26th warmest / 81st coolest

https://www.ncdc.noaa.gov/sotc/global-regions/201507

The table shows that no continent had its warmest July ever.  Africa came close, as did South America, which means a milder mid-winter than usual in the southern hemisphere.  So how can they claim a record July?

The answer is provided by another NOAA analysis.

Global Analysis of July 2015

July 2015        Anomaly °C (°F)             Rank (out of 136 years)   Record year(s): °C (°F)
Land             +0.96 ±0.18 (+1.73 ±0.32)   6th warmest               Warmest: 1998, +1.11 (+2.00)
                                             131st coolest             Coolest: 1884, -0.68 (-1.22)
Ocean            +0.75 ±0.07 (+1.35 ±0.13)   1st warmest               Warmest: 2015, +0.75 (+1.35)
                                             136th coolest             Coolest: 1911, -0.50 (-0.90)
Land and Ocean   +0.81 ±0.14 (+1.46 ±0.25)   1st warmest               Warmest: 2015, +0.81 (+1.46)
                                             136th coolest             Coolest: 1904/1911, -0.47 (-0.85)


So there you have it.  Once again the ocean is making the climate, with July SSTs higher because of the Blob and the long-developing El Nino.  And we can expect that with all the heat now being released upward from the water, there will be cooling of SSTs and a La Nina in response.

Climate Models Explained

A comment by Dr. R.G. Brown of Duke University posted on June 11 at WUWT.


Overview of the structure of a state-of-the-art climate model. From the NOAA website http://www.research.noaa.gov/climate/t_modeling.html

First, about the way weather models work:

That is not quite what they do in GCMs. There are two reasons for this. One is that a global grid of 2 million temperatures sounds like a lot, but it’s not. Remember the atmosphere has depth, and they have to initialize at least to the top of the troposphere; if they use 1 km thick cells there are 9 or 10 layers. Say 10. Then they have 500 million square kilometers of area to cover. Even if the grid itself has two million cells, each cell still covers 250 square km. This isn’t terrible — 16x16x1 km cells (20 million of them, assuming they follow the usual practice of slabs 1 km thick) are small enough that they can actually resolve largish individual thunderstorms — but it is still orders of magnitude larger than distinct weather features like individual clouds, smaller storms, tornadoes or land features (lakes, individual hills and mountains) that can affect the weather.
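The cell-size arithmetic in that paragraph can be checked directly; here is a quick back-of-envelope sketch using the figures from the comment (surface area, cell count and layer count are taken from the text above):

```python
# Back-of-envelope check of the grid arithmetic in the quote above.
surface_km2 = 500e6        # ~500 million km^2 of Earth surface to cover
horizontal_cells = 2e6     # a "2 million cell" horizontal grid
layers = 10                # ~1 km slabs up to the tropopause

cell_area_km2 = surface_km2 / horizontal_cells  # area per horizontal cell
cell_side_km = cell_area_km2 ** 0.5             # side of a square cell
total_cells = horizontal_cells * layers         # total 3-D cells

# 250 km^2 per cell, roughly 16 km on a side, 20 million cells in all.
print(cell_area_km2, round(cell_side_km, 1), total_cells)
```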

There is also substantial error in their initial conditions — as you say, they smooth temperatures sampled at a lot fewer than 2 million points to cover vast tracts of the grid where there simply are no thermometers, and even where they have surface thermometers they do not generally have soundings (temperature measurements from e.g. balloons that ride up the air column at a location), so they do not know the temperature in depth. The model initialization has to do things like take the surface temperature guess (from a smoothing model) and guess the temperature profile overhead using things like the adiabatic lapse rate, a comparative handful of soundings, knowledge of the cloudiness or whatever of the cell obtained from satellite or radar (where available), or just plain rules of thumb (all built into a model used to initialize the model).

Then there is the ocean. Sea surface temperatures matter a great deal, but so do temperatures down to some depth (more for climate than for weather, but when large scale phenomena like hurricanes come along, the heat content of the ocean down to some depth very much plays a role in their development) so they have to model that, and the better models often contain at least one if not more layers down into the dynamic ocean. The Gulf Stream, for example, is a river in the Atlantic that transports heat and salinity and moves around 200 kilometers in a day on the surface, less at depth, which means that fluctuations in surface temperature, fed back or altered by precipitation or cloudiness or wind, move across many cells over the course of a day.  (My Bold)

Even with all of the care I describe above and then some, weather models computed at close to the limits of our ability to compute (and still get a decent answer faster than nature “computes” it by making it actually happen) track the weather accurately for a comparatively short time — days — before the models systematically diverge from the actual weather. The divergence is driven by small differences between the heavily modeled, heavily under-sampled model initial conditions and the actual initial state of the weather, plus errors in the computation due to many things: discrete arithmetic, the finite grid size, and errors in the implementation of the dynamics at the grid resolution used (which have to be approximated in various ways to “mimic” the neglected internal smaller-scale dynamics that they cannot afford to compute).

If they run the model many times with small tweaks of the initial conditions, they have learned empirically that the distribution of final states they obtain can still be usefully compared to the actual weather for a few days more, with steadily decreasing reliability, until around a week or ten days out the variation is so great that they are just as well off predicting the weather by using the average weather for that date over the last 100 years and a bit of sense, just as is done in almanacs.

In other words, the models, no matter how many times they are run or how carefully they are initialized, produce results with no “lift” over ordinary statistics at around 10 days. (My bold)
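The loss of forecast "lift" comes from sensitive dependence on initial conditions. A minimal illustration, using the Lorenz-63 system (the classic toy caricature of atmospheric convection, not an actual weather model):

```python
# A minimal illustration of why forecasts lose "lift": the Lorenz-63 system,
# a classic toy caricature of atmospheric convection (not a weather model).
def lorenz_run(x0, steps=10_000, dt=0.005, s=10.0, r=28.0, b=8.0 / 3.0):
    """Integrate the Lorenz-63 equations with forward Euler from (x0, 1, 1)."""
    x, y, z = x0, 1.0, 1.0
    for _ in range(steps):
        x, y, z = (x + dt * s * (y - x),
                   y + dt * (x * (r - z) - y),
                   z + dt * (x * y - b * z))
    return x, y, z

a = lorenz_run(1.0)
b = lorenz_run(1.0 + 1e-9)  # perturb the initial state by one part in a billion
print(a)
print(b)
# After ~50 model time units the two trajectories bear no useful resemblance
# to each other: a point forecast has finite skill no matter how good the model.
```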

Evolution of state-of-the-art Climate Models from the mid 70s to the mid 00s. From IPCC (2007)


Now How Climate Models Work:

Then here is the interesting point. Climate models are just weather models run in exactly this way, with one exception. Since they know that the model will produce results indistinguishable from ordinary static statistics two weeks in, they don’t bother initializing them all that carefully. The idea is that no matter how they initialize them, after running them out for weeks or months the bundle of trajectories they produce from small perturbations will statistically “converge” at any given time to what is supposed to be the long-time statistical average, which is what they are trying to predict.

This assumption is itself dubious, as neither the weather nor the climate is stationary and it is most definitely non-Markovian so that the neglected details in the initial state do matter in the evolution of both, and there is also no theorem of which I am aware that states that the average or statistical distribution of a bundle of trajectories generated from a nonlinear chaotic model of this sort will in even the medium run be an accurate representation of the nonstationary statistical distribution of possible future climates. But it’s the only game in town, so they give it a try.

They then run this re-purposed, badly initialized weather model out until they think it has had time to become a “sample” for the weather for some stationary initial condition (fixed date, sunlight, atmosphere, etc) and then they vary things like CO_2 systematically over time while integrating and see how the run evolves over future decades. The bundle of future climate trajectories thus generated from many tweaks of initial conditions and sometimes the physical parameters as well is then statistically analyzed, and its mean becomes the central prediction of the model and the variance or envelope of all of the trajectories become confidence intervals of its predictions.
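The bookkeeping described here, reducing a bundle of perturbed runs to a central prediction plus an envelope, can be sketched as follows. The "model" is just a trending random walk, purely illustrative, not a GCM:

```python
import random

# A schematic of reducing a bundle of perturbed runs to a central prediction
# plus an envelope. The "model" here is a trending random walk standing in
# for a climate model; only the bookkeeping is the point.
def toy_run(seed, years=50, trend=0.02, noise=0.1):
    rng = random.Random(seed)
    anomaly, path = 0.0, []
    for _ in range(years):
        anomaly += trend + rng.gauss(0.0, noise)  # forced trend + internal "weather"
        path.append(anomaly)
    return path

runs = [toy_run(seed) for seed in range(100)]   # 100 perturbed initializations
finals = [path[-1] for path in runs]
mean = sum(finals) / len(finals)                # the "central prediction" at year 50
lo, hi = min(finals), max(finals)               # the envelope of trajectories
print(f"year-50 mean {mean:+.2f}, envelope [{lo:+.2f}, {hi:+.2f}]")
```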

The problem is that they aren’t really confidence intervals because we don’t really have any good reason to think that the integration of the weather ten years into the future at an inadequate grid size, with all of the accumulation of error along the way, is actually a sample from the same statistical distribution that the real weather is being drawn from subject to tiny perturbations in its initial state. The climate integrates itself down to the molecular level, not on a 16×16 km grid, and climate models can’t use that small a grid size and run in less than infinite time, so the highest resolution I’ve heard of is 100×100 km^2 cells (10^4 square km, which is around 50,000 cells, not two million).

At this grid size they cannot see individual thunderstorms at all. Indeed, many extremely dynamic features of heat transport in weather have to be modeled by some sort of empirical “mean field” approximation of the internal cell dynamics — “average thunderstormicity” or the like as thunderstorms in particular cause rapid vertical transport of a lot of heat up from the surface and rapid transport of chilled/chilling water down to the surface, among other things. The same is true of snowpack — even small errors in average snowpack coverage make big differences in total heat received in any given winter and this can feed back to kick a model well off of the real climate in a matter of years.

So far, it looks like (not unlike the circumstance with weather) climate models can sometimes track the climate for a decade or so before they diverge from it.   (My Bold)

They suffer from many other ailments as well — if one examines the actual month to month or year to year variance of the “weather” they predict, it has the wrong amplitude and decay times compared to the actual climate, which is basically saying (via the fluctuation-dissipation theorem) that they have the physics of the open system wrong. The models heavily exaggerate the effect of aerosols and tend to overreact to things like volcanic eruptions that dump aerosols into the atmosphere.

The models are tuned to cancel the exaggerated effect of aerosols with an exaggerated feedback on top of CO_2-driven warming to make them “work” to track the climate over a 20-year reference period. Sadly, this 20-year reference period was chosen to be the single strongest warming stretch of the 20th century, ignoring the cooling periods and warming periods that preceded it, and (probably as a consequence) the models diverge from the flat-to-slightly-cooling period we have been in for the last 16 or so years (or more, or less, depending on who you are talking to; even the IPCC formally recognizes “the pause”, “the hiatus”, the lack of warming over this interval, in AR5). It is a serious problem for the models and everybody knows it.

The IPCC then takes the results of many GCMs and compounds all of their errors by super-averaging their results, which has the effect of hiding the fluctuation problem from inquiring eyes. It ignores the fact that some models truly suck in all respects at predicting the climate while others do much better (because the ones that do better predict less long-run warming, and that isn’t the message they want to convey to policy makers), and it transforms the envelope of this average into a completely unjustifiable assertion of “statistical confidence”.

This is a simple lie. For each model, one at a time, the spread in long-run trajectories produced by perturbing its initial conditions yields a confidence interval that can be compared to the actual trajectory of the climate and turned into a p-value. The p-value here measures the probability of observing our particular real climate under the null hypothesis: “this climate model is a perfect model, in that its bundle of trajectories represents the actual distribution of future climates”. If that probability is low, and especially if it is very low, we would under ordinary circumstances reject the huge bundle of assumptions tied up in the model as “the hypothesis”, call the model failed, and go back to the drawing board.
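The per-model test described above can be sketched with toy numbers: treat the model's bundle of trajectories as its predicted distribution and compute an empirical two-sided p-value for the observed outcome under it. All values here are hypothetical:

```python
import random

# A sketch of the per-model test described above, with made-up numbers:
# treat the model's bundle of trajectories as its predicted distribution
# and ask how improbable the observed outcome is under it (an empirical
# two-sided p-value).
rng = random.Random(42)
ensemble = [rng.gauss(0.8, 0.15) for _ in range(200)]  # modeled warming, deg C (toy)
observed = 0.2                                         # "real" warming, deg C (toy)

mu = sum(ensemble) / len(ensemble)
# Fraction of ensemble members at least as far from the ensemble mean
# as the observation is:
extreme = sum(abs(x - mu) >= abs(observed - mu) for x in ensemble)
p_value = extreme / len(ensemble)
print(f"empirical p = {p_value:.3f}")
# A very small p says the real outcome lies far out in the model's tail,
# so the model-as-hypothesis would ordinarily be rejected.
```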

One cannot do anything with the super-average of 36-odd non-independent grand-average per-model results. To even try to apply statistics to this shotgun blast of assumptions, one has to use something called the Bonferroni correction, which basically makes the p-value for failure of individual models in the shotgun blast much, much larger (because they have 36 chances to get it right, which means that even if all 36 are wrong, pure chance can — no, probably will — make a bad model come out within a p = 0.05 cutoff as long as the models aren’t too wrong yet).
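The multiple-comparisons problem is easy to quantify. Even if all 36 models are wrong, the chance that at least one clears an uncorrected p = 0.05 cutoff by luck alone is high (this simple calculation assumes independent tests, which the models are not):

```python
# Why 36 simultaneous tests need a correction: even if every model is wrong,
# the chance that at least one slips under an uncorrected p = 0.05 cutoff
# by luck alone is large (assuming, for simplicity, independent tests,
# which the models are not).
alpha = 0.05
n_models = 36
p_any_pass = 1 - (1 - alpha) ** n_models
print(f"P(at least one of {n_models} wrong models passes) = {p_any_pass:.2f}")
# The Bonferroni correction compensates by testing each model against
# alpha / n_models instead:
print(f"Bonferroni per-model cutoff: {alpha / n_models:.4f}")
```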

By this standard, “the set of models in CMIP5” has long since failed. There isn’t the slightest doubt that their collective prediction is statistical nonsense. It remains to be seen whether individual models in the collection deserve to be kept in the running as not yet failed, because even applying the Bonferroni correction to the “ensemble” of CMIP5 is not good statistical practice. Each model should really be evaluated on its own merits, as one doesn’t expect the “mean” or “distribution” of individual model results to have any meaning in statistics (note that this is NOT like perturbing the initial conditions of ONE model, which is a form of Monte Carlo statistical sampling and does have some actual meaning).

Hope this helps.

rgb
http://wattsupwiththat.com/2015/06/09/huge-divergence-between-latest-uah-and-hadcrut4-revisions-now-includes-april-data/#comment-1960561

In the conclusion of a recent paper, Valerio Lucarini adds:

We have briefly recapitulated some of the scientific challenges and epistemological issues related to climate science. We have discussed the formulation and testing of theories and numerical models, which, given the presence of unavoidable uncertainties in observational data, the nonrepeatability of world-experiments, and the fact that relevant processes occur in a large variety of spatial and temporal scales, require a rather different approach than in other scientific contexts.

In particular, we have clarified the presence of two different levels of unavoidable uncertainties when dealing with climate models, related to the complexity and chaoticity of the system under investigation. The first is related to the imperfect knowledge of the initial conditions, the second is related to the imperfect representation of the processes of the system, which can be referred to as structural uncertainties of the model. We have discussed how Monte Carlo methods provide partial but very popular solutions to these problems. A third level of uncertainty is related to the need for a, definitely non-trivial, definition of the appropriate metrics in the process of validation of the climate models. We have highlighted the difference between metrics aimed at providing information of great relevance for the end-user from those more focused on the audit of the most important physical processes of the climate system.

It is becoming clearer and clearer that the current strategy of incremental improvements of climate models is failing to produce a qualitative change in our ability to describe the climate system, also because the gap between the simulation and the understanding of the climate system is widening (Held 2005, Lucarini 2008a). Therefore, the pursuit of a “quantum leap” in climate modeling – which definitely requires new scientific ideas rather than just faster supercomputers – is becoming more and more of a key issue in the climate community (Shukla et al. 2009).

Lucarini goes further, under the heading “Our proposal: a Thermodynamic perspective”:

While acknowledging the scientific achievements obtained along the above mentioned line, we propose a different approach for addressing the big picture of a complex system like climate is. An alternative way for providing a new, satisfactory theory of climate dynamics able to tackle simultaneously balances of physical quantities and dynamical instabilities is to adopt a thermodynamic perspective, along the lines proposed by Lorenz (1967). We consider simultaneously two closely related approaches, a phenomenological outlook based on the macroscopic theory of non-equilibrium thermodynamics (see e.g., de Groot and Mazur 1962), and, a more fundamental outlook, based on the paradigm of ergodic theory (Eckmann and Ruelle 1985) and more recent developments of the non-equilibrium statistical mechanics (Ruelle 1998, 2009).

The concept of the energy cycle of the atmosphere introduced by Lorenz (1967) allowed for defining an effective climate machine such that the atmospheric and oceanic motions simultaneously result from the mechanical work (then dissipated in a turbulent cascade) produced by the engine, and re-equilibrate the energy balance of the climate system. One of the fundamental reasons why a comprehensive understanding of climate dynamics is hard to achieve lies on the presence of such a nonlinear closure. Recently, Johnson (2000) introduced a Carnot engine–equivalent picture of the climate system by defining effective warm and the cold reservoirs and their temperatures.

From Modelling Complexity: the case of Climate Science, V. Lucarini

Available as arXiv:1106.1265.

For more on Climate models see:

https://rclutz.wordpress.com/2015/03/24/temperatures-according-to-climate-models/

https://rclutz.wordpress.com/2015/03/25/climate-thinking-out-of-the-box/

On Climate Theories–Response to David A.

David, thanks for elaborating on your thinking and questions on this topic. There is much uncertain and unknown about the functioning of our climate system. I listen when a seasoned expert such as John Christy says:

“The reason there is so much contention regarding “global warming” is relatively simple to understand: In climate change science we basically cannot prove anything about how the climate will change as a result of adding extra greenhouse gases to the atmosphere.

So we are left to argue about unprovable claims.”
http://www.centredaily.com/2014/03/20/4093680/john-r-christy-climate-science.html

So everyone is theorizing and wondering if and when the best theory will win–that is, become the new conventional wisdom. According to Christy, the science is far from settled, and he has examined the datasets extensively, having built some of them himself.

I have also learned a lot from Nullius in Verba, who is one of the best at explaining these things to us laymen. For example, he comments:

“It would be slightly more accurate to say that the lapse rate is the vertical temperature gradient at which convection switches off and therefore stops cooling the surface.

The sun warms the surface, but the heat escapes very quickly by convection so the build-up of heat near the surface is limited. In an incompressible atmosphere, it would *all* escape, and you’d get no surface warming. But because air is compressible, and because gases warm up when they’re compressed and cool down when allowed to expand, air circulating vertically by convection will warm and cool at a certain rate due to the changing atmospheric pressure. Air cools as it rises and expands, and warms as it descends and is compressed. This warming/cooling effect means that hot air no longer rises when it would cool faster from expansion than the surrounding air. Cold air can sit on top of warm air and be stable. The adiabatic lapse rate is why the tops of mountains are colder than their bottoms.
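The warming-on-compression effect described in the quote pins the dry adiabatic lapse rate at gravity divided by the specific heat of air, which a two-line calculation with standard textbook values confirms:

```python
# The compression argument in the quote fixes the dry adiabatic lapse rate
# at Gamma = g / c_p; standard textbook values give the familiar ~9.8 K/km.
g = 9.81      # gravitational acceleration, m/s^2
c_p = 1004.0  # specific heat of dry air at constant pressure, J/(kg K)

lapse_rate_K_per_km = g / c_p * 1000.0
print(f"dry adiabatic lapse rate ~ {lapse_rate_K_per_km:.1f} K/km")
# Latent-heat release in moist air lowers the environmental average
# to roughly 6.5 K/km.
```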

It’s a bit like the way a pot of boiling water sticks at a temperature of 100 C. If you turn the gas up, the water boils more vigorously, carrying more energy off as steam, which balances the extra energy supplied and keeps the temperature still at exactly 100 C. The rate at which heat escapes is very non-linear – extremely fast for temperatures above the threshold, extremely slow for temperatures below it. So long as the system is driven hard enough, it will get driven up against the non-linear limit and held there. The lapse rate does the same thing, except that instead of fixing the temperature, it fixes its gradient so you get a rigid slope that can freely float up and down in level.

The temperature at the average altitude of emission to space converges on the temperature that radiates the same energy the Earth absorbs. All levels above and below it are held in a fixed relationship to it by the lapse rate. The temperature at any other level is the temperature at the emission altitude plus the lapse rate times the difference in heights. Hence, the temperature at the surface differs by the lapse rate times the average height of emissions to space.
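The relationship in this paragraph can be put in numbers with textbook values: the effective emission temperature balances absorbed sunlight, and the surface sits warmer by the lapse rate times the average emission height. The 6.5 K/km lapse rate and ~5 km emission height are standard rough figures, not taken from the quote:

```python
# The paragraph's relationship in numbers, with textbook values. The 6.5 K/km
# lapse rate and ~5 km average emission height are standard rough figures,
# assumed here rather than taken from the quote.
sigma = 5.67e-8   # Stefan-Boltzmann constant, W/(m^2 K^4)
S = 1361.0        # solar constant, W/m^2
albedo = 0.3

# Effective emission temperature: radiates away exactly what Earth absorbs.
T_emit = (S * (1 - albedo) / (4 * sigma)) ** 0.25   # ~255 K
lapse = 6.5            # K/km
emission_height = 5.0  # km

T_surface = T_emit + lapse * emission_height        # ~288 K, i.e. ~+15 C
print(f"T_emit ~ {T_emit:.0f} K, T_surface ~ {T_surface:.0f} K")
```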

It’s interesting to consider what would happen if you had a strongly absorbing greenhouse material but a zero lapse rate. You’d get lots of backradiation, but no greenhouse warming. By marvelous happenstance we do have such a physical situation in the oceans. Water absorbs all thermal radiation within about 20 microns, making it something like 20,000 times more powerful a greenhouse material than the atmosphere. It’s a (relatively) easy calculation to show that if radiation was the only way heat could be transported, as the backradiation argument assumes, the temperature a metre down would be several thousand degrees! But water is almost incompressible, having a lapse rate of around 0.1 C/km, and so convection nullifies it entirely. Fortunate, eh? . . .

“The direction of net energy flow is determined only by the difference in temperatures, not the amount of stuff. If you have a big body at a cold temperature next to a small body at a very hot temperature, the cold body might be emitting more heat overall because of its bigger surface area, but the net flow is still from the hot body to the cold. Most of the heat emitted from the big cold body doesn’t hit the small body, because it’s so small. Only the temperature matters.

The way this is arranged varies depending on the configuration, but it always happens. People have had a lot of fun over the years trying to construct exotic arrangements of mirrors and radiators and insulators and heat engines to try to break the rule, but nobody has succeeded yet. The second law of thermodynamics is one of the most thoroughly challenged and tested of all the laws of physics. I do encourage people to try, though. The prize on offer is a perpetual motion machine to the lucky winner who defeats it!”
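The size-versus-temperature point in the quote is quick to check with the Stefan-Boltzmann law; the geometry here is deliberately oversimplified (the small hot body is assumed to see only the large cold one):

```python
# The "only temperature sets the direction" point, in numbers: net radiative
# exchange goes as sigma * (T_hot^4 - T_cold^4) times area and geometry, so
# its sign depends only on the temperatures. Toy geometry: the small hot
# body is assumed to see only the big cold body (view factor 1).
sigma = 5.67e-8               # Stefan-Boltzmann constant, W/(m^2 K^4)
T_hot, A_hot = 500.0, 0.01    # small hot body: 500 K, 0.01 m^2
T_cold, A_cold = 300.0, 10.0  # big cold body: 300 K, 10 m^2

total_cold_emission = sigma * A_cold * T_cold**4          # big body emits more overall
net_hot_to_cold = sigma * A_hot * (T_hot**4 - T_cold**4)  # but net flow is hot -> cold
print(f"cold body emits {total_cold_emission:.0f} W in total, yet the net "
      f"flow is {net_hot_to_cold:.1f} W from hot to cold")
```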

hat tip to Homer Simpson

Nullius in Verba holds forth here:
http://bishophill.squarespace.com/discussion/post/2471448?currentPage=5

David, I am not a fan of thought experiments about hypothetical worlds with or without CO2. I have read too many threads that go around in circles until everyone turns into wheels.

I do like what E.M. Smith (Chiefio) said sometime ago:

“It is peculiar that everyone is so taken in by the whole notion of the so-called ’radiative greenhouse effect’ being such an ingrained necessity, such a self-evident, requisite part, as it were, of our atmosphere’s inner workings. The ’truth’ and the ’reality’ of the effect is completely taken for granted, a priori. And yet, the actual effect is still only a theoretical construct.

In fact, when looking at the real Earth system, it’s quite evident that this effect is not what’s setting the surface temperature of our planet.

The whole thing can be stated in a simple, yet accurate manner.

The Earth, a rocky sphere at a distance from the Sun of ~149.6 million kilometers, where the Solar irradiance comes in at 1361.7 W/m2, with a mean global albedo, mostly from clouds, of 0.3 and with an atmosphere surrounding it containing a gaseous mass held in place by the planet’s gravity, producing a surface pressure of ~1013 mb, with an ocean of H2O covering 71% of its surface and with a rotation time around its own axis of ~24h, boasts an average global surface temperature of +15°C (288K).

Why this specific temperature? Because, with an atmosphere weighing down upon us with the particular pressure that ours exerts, this is the temperature level the surface has to reach and stay at for the global convectional engine to be able to pull enough heat away fast enough from it to be able to balance the particular averaged out energy input from the Sun that we experience.

It’s that simple.”

Update 1, May 5, 2015

David, an additional point of some importance: There is empirical support for the lapse rate existing independent of IR activity.

Global warmists share an assumption that CO2 raises the effective radiating altitude, thereby warming the troposphere and the surface. Now this notion can be found in textbooks and indeed operates in all the climate models. Yet there is no empirical evidence supporting it. What data there is (radiosonde balloon readings) detects no effect from IR active gases upon the temperature profile in the atmosphere.

“It can be seen from the infra-red cooling model of Figure 19 that the greenhouse effect theory predicts a strong influence from the greenhouse gases on the barometric temperature profile. Moreover, the modeled net effect of the greenhouse gases on infra-red cooling varies substantially over the entire atmospheric profile.

However, when we analysed the barometric temperature profiles of the radiosondes in this paper, we were unable to detect any influence from greenhouse gases. Instead, the profiles were very well described by the thermodynamic properties of the main atmospheric gases, i.e., N2 and O2, in a gravitational field.”

“While water vapour is a greenhouse gas, the effects of water vapour on the temperature profile did not appear to be related to its radiative properties, but rather its different molecular structure and the latent heat released/gained by water in its gas/liquid/solid phase changes.

“For this reason, our results suggest that the magnitude of the greenhouse effect is very small, perhaps negligible. At any rate, its magnitude appears to be too small to be detected from the archived radiosonde data.” (p. 18 of the referenced paper)

Open Peer Rev. J., 2014; 19 (Atm. Sci.), Ver. 0.1. http://oprj.net/articles/atmospheric-science/19 page 18 of 28

In summary David, it is observed and accepted by all that there is a ~33C difference between the temperature at the surface and at the effective radiating level (the tropopause, where convection stops). Warmists attribute that increase in temperature to the IR activity of CO2.

Others, including me, contend that it is the mass of the atmosphere, mostly O2 and N2, that delays the loss of heat from the surface until IR-active gases are able to cool the planet effectively without obstruction. That retention of heat in the atmosphere is measurable in the lapse rate. And 90% of the IR activity is due to H2O, especially in the lower troposphere.