Duped into War on Plastic

Step aside, Polar Bear: it’s Turtle Time!

Every day now, everywhere in the media, someone else is lamenting the presence of plastics and proposing bans on straws and other plastic items. Terence Corcoran in the Financial Post explains how we got here: How green activists manipulated us into a pointless war on plastic. Excerpts in italics with my bolds.

The disruptive Internet mass-persuasion machine controlled by the major corporate tech giants, capable of twisting and manipulating the world’s population into believing concocted stories and fake news, is at it again, this time spooking billions of people into panic over plastic. Except…hold on: Those aren’t big greedy corporations and meddling foreign governments flooding the blue planet with alarming statistics of microplastics in our water and gross videos of turtles with straws stuck up their noses and dead birds with bellies stuffed with plastic waste.

As Earth Day/Week 2018 came to a close, the greatest professional twisters and hypers known to modern mass communications — green activists and their political and mainstream media enablers — had succeeded in creating a global political wing-flap over all things plastic.

That turtle video, viewed by millions and no doubt many more through Earth Day, is a classic of the genre, along with clubbed baby seals and starving polar bears. Filmed in 2015 in waters off the coast of Guanacaste, Costa Rica, the video was uploaded as news by the Washington Post in 2017 and reworked last week by the CBC into its series on the curse of plastics: “ ‘We need to rethink the entire plastics industry’: Why banning plastic straws isn’t enough.”

New York Governor Andrew Cuomo introduced a bill to ban single-use plastic shopping bags. In Ottawa, Prime Minister Justin Trudeau wants the G7 leaders to sign on to a “zero plastics waste charter” while British Prime Minister Theresa May promised to ban plastic straws.

No need for a secret data breach, a Russian bot or a covert algorithm to transform a million personal psychological profiles into malleable wads of catatonic dough. All it takes is a couple of viral videos, churning green activists and a willing mass media. “Hello, is this CBC News? I have a video here of a plastic straw being extracted from the nose of sea turtle. Interested?”

One turtle video is worth 50 million Facebook data breaches, no matter how unlikely the chances are that more than one turtle has faced the plastic-straw problem. If the object in the unfortunate turtle’s nasal passage was a plastic straw (was it analyzed?), it would likely have come from one of the thousands of tourists who visit Costa Rica to watch hundreds of plastics-free healthy turtles storm the country’s beaches for their annual egg-laying ritual.

That the turtles are not in fact threatened by plastic straws would be no surprise. It is also hard to see how banning straws in pubs in London and fast-food joints in Winnipeg would save turtles in the Caribbean or the Pacific Ocean.

Creating such environmental scares is the work of professional green activists. A group called Blue Ocean Network has been flogging the turtle video for three years, using a propaganda technique recently duplicated by polar-bear activists. Overall, the plastic chemical scare follows a familiar pattern. Canadians will remember Rick Smith, the former executive director of Environmental Defence Canada and co-author of a 2009 book, Slow Death by Rubber Duck. In the book, Smith warned of how the toxic chemistry of everyday life was ruining our health, reducing sperm counts and threatening mothers and all of humanity. The jacket cover of Slow Death included a blurb from Sophie Gregoire-Trudeau, who expressed alarm about all the chemicals “absorbed into our bodies every day.”

To mark Earth Day 2018, orchestrated as part of a global anti-plastics movement, Smith was back at his old schtick with an op-ed in The Globe and Mail in which he warns “We must kill plastics to save ourselves.” Smith, now head of the left-wing Broadbent Institute in Ottawa, reminded readers that since his Slow Death book, a new problem has emerged, “tiny plastic particles (that) are permeating every human on earth.” He cites a study that claimed “83 per cent of tap water in seven countries was found to contain plastic micro-fibres.”

You would think Smith would have learned by now that such data is meaningless. Back in 2009, Smith issued a similar statistical warning. “A stunning 93 per cent of Americans tested have measurable amounts of BPA in their bodies.” BPA (bisphenol A) is a chemical used in plastics that Smith claimed produced major human-health problems.

Turns out it wasn’t true. The latest — and exhaustive — science research on BPA, published in February by the U.S. National Toxicology Program, concluded that “BPA produces minimal effects that were indistinguishable from background.” Based on this comprehensive research, the U.S. Food and Drug Administration said current uses of BPA “continue to be safe for consumers.”

Might the same be true for barely measurable amounts of micro-plastics found today in bottled water and throughout the global ecosystem? That looks possible. A new meta-analysis of the effects of exposure to microplastics on fish and aquatic invertebrates suggests there may be nothing to worry about. Dan Barrios-O’Neill, an Irish researcher who looked at the study, tweeted last week that “There are of course many good reasons to want to curb plastic use. My reading of the evidence — to date — is that negative ecological effects might not be one of them.”

Instead of responding to turtle videos and images of plastic and other garbage swirling in ocean waters with silly bans and high talk of zero plastic waste, it might be more useful to zero in on the real sources of the floating waste: governments that allow it to be dumped in the oceans in the first place.

Update: August 4, 2018

Claim: There is a “sea of plastic” the size of Texas in the North Pacific Gyre north of Hawaii

First question: have you ever seen an aerial or satellite photograph of the “sea of plastic”? Probably not, because it doesn’t really exist. But it makes a good word-picture, and, after all, plastic is full of deadly poisons and is killing seabirds and marine mammals by the thousands.

This is also fake news and gives rise to calls for bans on plastic and other drastic measures. Silly people are banning plastic straws as if they were a dire threat to the environment. The fact is a piece of plastic floating in the ocean is no more toxic than a piece of wood. Wood has been entering the sea in vast quantities for millions of years. And in the same way that floating woody debris provides habitat for barnacles, seaweeds, crabs, and many other species of marine life, so does floating plastic. That’s why seabirds and fish eat the bits of plastic, to get the food that is growing on them. While it is true that some individual birds and animals are harmed by plastic debris, discarded fishnets in particular, this is far outweighed by the additional food supply it provides. Plastic is not poison or pollution, it is litter.

Patrick Moore, PhD

Footnote:  Dare I say it?  (I know you are thinking it.): “They are grasping at straws, with no end in sight.”

 

 

Raw Water: More Post-Modern Insanity


Contemporary style-setters display great nostalgia for pre-industrial ways of living, without ever having to subsist in the natural world. Thus they advocate getting energy from burning trees or from windmills so that evil fossil fuels can be left in the ground. Now these Luddites want to turn back scientific progress in water purification, claiming that untreated water is superior.

John Robson explains in the National Post article Raw water is proof the comforts of pampered modernity have gone too far. Excerpts below with my bolds.

With the raw-water craze, people are deliberately drinking unhealthy water for their health, writes John Robson. (Postmedia News)

In case you’re also in hiding from the insanity we call “popular culture,” there’s this new trend where you get healthy by drinking “naturally probiotic” water that hasn’t been treated to remove animal poop. No, I mean to remove essential minerals, ions and, um, animal poop.

The National Post says people aren’t just deliberately drinking unhealthy water for their health, they’re paying nearly $10 per litre for non-vintage Eau de Lac. Yet they would riot if asked to pay such a price for gasoline or, indeed, to drink ditch water from their tap.

Many reputable people have leapt up to condemn this fad as obviously unhealthy. But they are getting the same sani-wiped elbow that common sense, authority and pride in past achievement now routinely receive. (Can I just note here that the Oprah for President boom, which in our fast-paced social-media times lasted roughly 17 hours, foundered partly because she rose to fame and fortune peddling outrageous quackery? Donald Trump did not invent or patent contempt for logic and evidence.)

Raw water is hardly the only fad that gains strength the more reputable opinion condemns it. And let’s face it: reputable opinion has dug itself a pretty deep hole with its propensity for disregarding evidence and silencing dissent. I don’t just mean in the bad old days. But there must be some kind of golden mean between believing every news story with “experts say” in the headline and refusing to vaccinate your children or boil your water.

Seriously. Raw water? Doesn’t everybody know if you must drink from a tainted source it is vital to cook the stuff first? Tea wasn’t healthy primarily because of the plant’s alleged medicinal properties. Boiling water to make it meant you killed the bacteria … before they killed you.

My late friend Tom Davey, publisher of Environmental Science & Engineering, was routinely indignant that people could be induced to pay premium prices for bottled water when safe tap water was the single greatest environmental triumph in human history. But today some trendies are willing to pay premium prices to avoid safe tap water, partly on the basis of the same hooey about trace elements that made “mineral” water popular, partly out of paranoia once the purview of anti-fluoridation Red-baiters, and partly out of amazing scientific ignorance including about the presence of vital nutrients in food, especially if you don’t just eat the super-processed kind.

There. I said it. Some of what we ingest is overly processed, relentlessly scientifically improved until it becomes harmful (a problem by no means restricted to food). But some isn’t, including tap water.

I realize safe drinking water was hailed as an achievement back when mainstream environmentalists wanted the planet to be nice for people. Today’s far greater skepticism about whether human and environmental well-being are compatible creates considerable reluctance to make our well-being a significant measure of progress. But I am in the older camp. Without being insensible to the “crowding out” of ecosystems even by flourishing human communities, let alone poor ones, I still believe we can live well in harmony with nature, and only thus.

Some conservative associates think my deep unease with factory farming requires me to line my hat with tin foil. Other people believe my support for conservatism requires me to line my head with it. But I can only fit so much metal into either, and I draw the line at deliberately drinking the kind of water that used to bring us cholera epidemics.

Would it be impolite to cite this trend as proof that modernity has more money than brains, that the more a life of luxury is delivered to us as a birthright rather than being a hard-won and inherently precarious achievement, the less we are able to count our blessings or act prudently?

By all means save the whales. Get plastic out of the oceans. Protect ugly as well as cute species and their ecosystems. Know that man cannot flourish cut off from nature, and weep at Saruman’s conversion of the Shire from bucolic to industrial in the Lord of the Rings. But you can’t do yourself or the Earth any good while dying of dysentery you brought on yourself by pampered stupidity.

Ross Pomeroy adds an essay at RealClearScience: ‘Raw’ Water Is Insulting (my bolds).

In 2015, 844 million people lacked access to even a basic drinking water service. These people, almost entirely from developing areas in Africa and Asia, are forced to play roulette by drinking water potentially contaminated with bacteria and viruses that cause diseases like diarrhea, cholera, dysentery, typhoid, and polio, as well as a variety of parasitic infections. Globally, a half million people die each year from diarrhea contracted via contaminated drinking water, many of them children. Another 240 million suffer from schistosomiasis, a parasitic flatworm infection spread by freshwater snails.

Here in the United States, we generally don’t have to worry about waterborne illness. That’s because our tap water travels through a rigorous system of mechanical filtration and chemical treatment which expunges contaminants, resulting in H2O that’s clean, refreshing, and among the safest in the world.

Raw water is insulting; insulting to the health of those that drink it, to the intelligence of those who consider it, and to the hundreds of millions of people around the world who yearn for treated water free from raw contamination.

 

Progressively Scaring the World (Lewin book synopsis)

 

Bernie Lewin has written a thorough history explaining a series of environmental scares building up to the current obsession with global warming/climate change. The story is enlightening to people like me who were not paying attention when much of this activity was going down (prior to the Copenhagen COP, in my case). It also provides a rich description of happenings behind the scenes.

As Lewin explains, it is a particularly modern idea to scare the public with science, and thereby advance a policy agenda. The power of this approach is evident these days, but his book traces it back to more humble origins and describes the process bringing us to the present state of full-blown climate fear. It is a cautionary tale.

“Those who don’t know history are doomed to repeat it.”
― Edmund Burke (1729-1797)

This fearful belief evolved through a series of expanding scares. This article provides only some highlights, while the book exposes the maneuvers and the players, their interests and tactics. Quotes from Lewin appear in italics, with my titles, summaries and bolds.

 

In the Beginning: The DDT Scare

The Context

A new ‘environmentalism’ arose through a broadening of specific campaigns against environmental destruction and pollution. It began to target more generally the industries and technologies deemed inherently damaging. Two campaigns in particular facilitated this transition, as they came to face up squarely against the dreams of a fantastic future delivered by unfettered sci-tech progress.

One of these challenged the idea that we would all soon be tearing through the sky and crossing vast oceans in just a few hours while riding our new supersonic jets. But even before the ‘Supersonic Transportation Program’ was announced in 1963, another campaign was already gathering unprecedented support. This brought into question the widely promoted idea that a newly invented class of chemicals could safely bring an end to so much disease and destruction—of agriculture, of forests, and of human health—through the elimination of entire populations of insects. Pg.16

When the huge DDT spraying programs began, the Sierra Club’s immediate concern was the impact on nature reserves. But then, as the movement against DDT developed, and as it became increasingly involved, it began to broaden its interest and transform. By the end of the 1960s it and other similar conservation organisations were leading the new environmentalism in a broader campaign against DDT and other technological threats to the environment. Pg.18

The Alarm

This transformation was facilitated by the publication of a single book that served to consolidate the case against the widespread and reckless use of organic pesticides: Silent Spring. The author, Rachel Carson, had published two popular books on ocean ecology and a number of essays on ecological themes before Silent Spring came out in 1962. As with those earlier publications, one of the undoubted contributions of the book was the education of the public in a scientific understanding of nature. Pg.18

We will never know how Carson would have responded to the complete ban on DDT in the USA. She was suffering from cancer while writing Silent Spring and died shortly after publication (leaving the royalties from its sale to the Sierra Club), but the ban was not achieved for another decade. What we do know is that a full ban was never her intention. She supported targeted poisoning programs in place of blanket spraying, and she urged the authorities to look for alternative and ‘integrated control’, along the lines of the ‘Integrated Pest Management’ approach that is common and accepted today. Pg.19

The Exaggeration

Overall, by today’s standards at least, Carson’s policy position was moderate, and so we should be careful not to attribute to her the excesses of her followers. The trouble with Carson was otherwise: it was in her use and abuse of science to invoke in her readers an overwhelming fear. In Silent Spring, scientific claims find dubious grounding in the evidence. Research findings are exaggerated, distorted and then merged with the purely anecdotal and the speculative, to great rhetorical effect. Pg.19

Historically, the most important area of distortion is in linking organic pesticides with human cancers. The scientific case for DDT as a carcinogen has never been strong and it certainly was not strong when Silent Spring was published. Of course, uncertainty remained, but Carson used the authority of science to go beyond uncertainty and present DDT as a dangerous carcinogen. And it was not just DDT; Carson depicts us ‘living in a sea of carcinogens’, mostly of our own making, and for which there is ‘no safe dose’. Pg.19

The Legacy

If we are to understand how the EPA ban came about, it is important to realise that this action succeeded in breaking a policy stalemate that was becoming increasingly hazardous for the increasingly embattled Nixon administration. On one side of this stalemate were the repeated scientific assessments pointing to a moderate position, while on the other side were calls for more and more extreme measures fuelled by more and more outrageous claims. Pg.21

Such sober assessments by scientific panels were futile in the face of the pseudo-scientific catastrophism that was driving the likes of the Audubon Society into a panic over the silencing of the birds. By the early 1970s two things were clear: public anxiety over DDT would not go away, and yet the policy crisis would not be resolved by heeding the recommendations of scientific committees. Instead, resolution came through the EPA, and the special role that it found for itself following the publication of the Sweeney report. Pg.22

Summary

The DDT scare demonstrated an effective method: claim that a chemical pollutant is a serious public health risk, cancer being the most alarming of all. The media stoked the fear, and politicians acted to quell anxiety despite the weak scientific case. A precedent was also set for a governmental entity (the EPA in this case) to overrule expert advice in response to public opinion.

The SST Scare

The Context

The contribution to the demise of the SST of the environmentalists’ campaign is sometimes overstated, but that is of less concern to our story than the perception that this was their victory. While the DDT campaign was struggling to make headway, the SST campaign would be seen as an early symbolic triumph over unfettered technological progressivism. It provided an enormous boost to the new movement and helped to shape it. Back in 1967, the Sierra Club had first come out campaigning against the SST for the sonic shockwaves sweeping the (sparsely populated) wilderness over which it was then set to fly. But as they began to win that argument, tension was developing within the organisation, with some members wishing to take a stronger, more general and ethical stand against new and environmentally damaging technologies such as this. Pg.27

With popular support for environmental causes already blooming across the country, and with the SST program already in jeopardy, scientists finally gained their own position of prominence in the controversy when they introduced some new pollution concerns. . . If that wasn’t enough, environmental concerns were also raised in the most general and cursory terms about the aircraft’s exhaust emissions. These first expressions of pollution concerns would soon be followed by others, from scientists who were brought into the debate to air speculation about various atmospheric catastrophes that would ensue if these supersonic birds were ever allowed to fly. Pg.27

The Alarm

What did make the front page of the New York Times on 2 August 1970 was concern about another climatic effect highlighted in the executive summary of the report. The headline trumpeted ‘Scientists ask SST delay pending study of pollution’ (see Figure 2.1).  The conference had analysed the effect of emissions from a fleet of 500 aircraft flying in the stratosphere, and concerns were raised that the emission of water vapour (and to a lesser extent other emissions) might absorb sunlight sufficiently to have a local or even global effect on climate. . . The climatic change argument remained in the arsenal of the anti-SST campaigners through to the end, but it was soon outgunned by much more dramatic claims about possible damage to the ozone layer. Pg.30

Throughout the 1970s, scientific speculation drove a series of ozone scares, each attracting significant press attention. These would climax in the mid-1980s, when evidence of ozone-depleting effects of spray-can propellants would be discovered in the most unlikely place. This takes us right up to the start of the global warming scare, presenting along the way many continuities and parallels. Indeed, the push for ozone protection up to the 1980s runs somewhat parallel with the global warming movement until the treaty process to mitigate ozone damage suddenly gained traction and became the very model for the process to mitigate global warming. The ozone story therefore warrants a much closer look. Pg.31

For Harold Johnston of the University of California, the real problem with SST exhaust would not be water vapour but oxides of nitrogen. Working all night, the next morning he presented Xerox copies of handwritten work projecting 10–90% depletion. In high traffic areas, there would be no stopping these voracious catalysts: the ozone layer would all but disappear within a couple of years. Even when Johnston later settled for a quotable reduction by half, there could be no quibbling over the dangers to nature and humanity of such massive environmental destruction. Pg.44

A New York Times reporter contacted Johnston to confirm his claims and although the report he delivered was subdued, the story remained alarming. It would take less than a year of full-fleet operations, Dr Johnston said in a telephone interview, for SSTs to deplete half of the stratospheric ozone that shields the earth from the sun’s ultraviolet radiation. Scientists argued in the SST debate last March that even a 1 percent reduction of ozone would increase radiation enough to cause an additional 10,000 cases of skin cancer a year in the United States. The next day, 19 May 1971, a strong negative vote demolished the funding bill. All but a few stalwarts agreed that one more vote in the House and it was all over for Boeing’s SST. After that final vote, on 30 May, the New York Times followed up on its initial story with a feature on Johnston’s claims. This was written by their leading science writer, Walter Sullivan, an influential science communicator important to our story. Pg.48

The Exaggeration

It is true that in 1971 the link between skin cancer and sun exposure was fairly well established in various ways, including by epidemiological studies that found fair-skinned communities in low latitudes tended to record higher rates. However, the link to ultraviolet light exposure (specifically, the UV-B band) is strongest among those cancers that are most common but are also rarely lethal. The link with the rarer and most dangerous cancers, the malignant melanomas, is not so strong, especially because they often appear on skin that is not usually exposed to the sun. Pg.43

Thus, sceptics of the fuss over the risk of a few percent thinning of the already variable ozone layer would point out that the anti-SST crowd did not seem overly worried about the modern preference for sunshine, which was, on the very same evidence, already presenting a risk many orders of magnitude greater: a small depletion in the ozone layer would be the equivalent of moving a few miles south. To the dismay of their environmentalist opponents, the bolder among these sceptics would recommend the same mitigation measures recommended to the lifestyle migrants—sunscreen, sunglasses and sunhats. Pg.43

But in 1971 there was no way to directly measure stratospheric NOx. No one was even sure whether there was any up there. Nor was there any way to confirm the presence—and, if so, the concentration— of many of the other possibly relevant reactive trace gases. This left scientists only guessing at natural concentrations, and for NOx, Johnston and others had done just that. These ‘best guesses’ were then the basis for modelling of the many possible reactions, the reaction rates, and the relative significance of each in the natural chemistry of the cold thin air miles above. All this speculation would then form the basis of further speculations about how the atmosphere might respond to the impacts of aircraft that had not yet flown; indeed none had even been built. Pg.46

The Legacy

But already the message had got through to where it mattered: to the chair of the Senate Committee on Aeronautical and Space Science, Clinton Anderson. The senator accepted Johnston’s theory on the strength of Sullivan’s account, which he summarised in a letter to NASA before concluding that ‘we either need NOx-free engines or a ban on stratospheric flight’.  And so it turned out that directly after the scrapping of the Boeing prototype, the overriding concern about supersonic exhaust pollution switched from water vapour to NOx. Pg.49

As startling as Johnston’s success appears, it is all the more extraordinary to consider how all the effort directed at solving the NOx problem was never distracted by a rising tide of doubt. The more the NOx effect was investigated, the more complex the chemistry seemed to be and the more doubtful became the original scientific foundations of the scare. In cases of serial uncertainty, the multiplying of best-guess estimates of an effect can shift one way and then the other as the science progresses. But this was never the case with NOx, nor with the SST-ozone scare generally. Pg.50

Summary

The SST Scare moved attention to the atmosphere and the notion of trace gases causing environmental damage, again linked to cancer risk. While ozone was the main issue, climate change was also raised along with interest in carbon dioxide emissions. Public policy was moved to withdraw funding for American SST production and later to ban European SSTs from landing in the US. It also demonstrated that fears could be promoted regarding a remote part of nature poorly known or understood. Models were built projecting fearful outcomes from small changes in atmospheric regions where data was mostly lacking.

 

The CFC Scare

The Context

Presumptions about the general state of a system’s stability are inevitable in situations of scant evidence, and they tend to determine positions across the sceptic/alarmist divide. Of course, one could suppose a stable system, in which a relatively minor compensatory adjustment might have an alarming impact on civilisation, like the rapid onset of a few metres of rise in sea level. But it is the use of such phrases as ‘disturbing the delicate balance of nature’ or ‘a threat to life on Earth’ that are giveaways to a supposition of instability. Hence Scorer’s incredulity regarding Johnston’s leap towards his catastrophic conclusion: ‘How could it be alleged seriously that the atmosphere would be upset by introducing a small quantity of the most commonly and easily formed compounds of the two elements which comprise 99% of it?’ Pg.68

Meanwhile, ‘Sherry’ Rowland at the University of California was looking around for a new interest. Since 1956 he had been mostly researching the chemistry of radioactive isotopes under funding from the Atomic Energy Commission. Hearing of Lovelock’s work, he was intrigued by the proposal that nearly all the CFCs ever produced might still be out there. Were there no environmental conditions anywhere that would degrade these chemicals? He handed the problem to his post-doctoral research assistant, Mario Molina. Molina eventually concluded that indeed there were no ‘sinks’ for CFCs anywhere in the ocean, soils or lower atmosphere. Thus we should expect that CFCs would drift around the globe, just as Lovelock had proposed, and that they would do so for decades, even centuries. . . or forever? Could mankind have created an organic compound that is so noble that it is almost immortal? Pg.75

The Alarm

The ozone effect that Molina had stumbled upon was different to those previously proposed from rockets and aeroplanes in one important respect: it would be tremendously delayed. Like a hidden cancer, the CFCs would build up quietly and insidiously in the lower atmosphere until their effect on the ozone miles above was eventually detectable, decades later. But when unequivocal evidence finally arrived to support the theory, it would be too late. By then there would be no stopping the destruction of the thin veil protecting us from the Sun’s carcinogenic rays. What Molina had stumbled upon had, in double-dose, one sure element of a good environmental scare. Pg.77

According to Walter Sullivan, they had calculated that spray-can CFCs have already accumulated sufficiently in the upper air to begin depleting the ozone that protects the earth from lethal ultraviolet radiation.  On current emission trends, 30% of the ozone layer would be destroyed as early as 1994. This was no longer a story about saving the sky for our grandchildren. These scientists had found an effect, already in train, with ‘lethal’ consequences for all living things during the lifetime of most of the New York Times’ massive and influential readership. Pg.82

During 1988, the second wave of global environmentalism would reach its peak in the USA, with CFC pollution its first flagship cause. Mid-March saw the US Congress voting unanimously to ratify the Montreal Protocol. It was only the second country to do so, while resistance remained strong in Europe. The following day, NASA announced the results of a huge two-year study of global ozone trends. Pg.107

The new scientific evidence came from a re-analysis of the ozone record. This found that the protective layer over high-population areas in the midlatitudes of the northern hemisphere had been depleted by between 1.7% and 3% from 1969 to 1986. These trends had been calculated after removing the effect of ‘natural geophysical variables’ so as to better approximate the anthropogenic influence. As such, these losses across just 15 years were at much faster rates than expected by the previous modelling of the CFC effect. Pg.107

The statements of the scientists (at least as quoted) made it clear to the press that this panel of experts had interpreted the empirical evidence as showing that a generalised CFC-driven depletion had already begun, and at a much faster rate than expected from the modelling used to inform the Montreal Protocol.  Pg.109

This linking by scientists of the breakup of the southern vortex with low ozone readings in southern Australia during December 1987 morphed into the idea that the ozone hole itself had moved over southern Australia. All sorts of further exaggerations and extrapolations ensued, including the idea of the hole’s continuing year-round presence. An indication of the strength of this mythology is provided by a small survey in 1999 of first-year students in an atmospheric science course at a university in Melbourne. This found that 80% of them believed the ozone hole to be over Australia, 97% believed it to be present during the summer and nearly 80% blamed ozone depletion for Australia’s high rate of skin cancer. Pg.114

After the London ‘Save the Ozone Layer Conference’, the campaign to save the ozone layer was all but won. It is true that a push for funding to assist poor country compliance did gain some momentum at this conference, and it was thought that this might stymie agreement, but promises of aid were soon extracted, and these opened the way for agreement on a complete global phase-out of CFC production. Pg.119

The Exaggeration

Here we had Harvard scientists suggesting that hairspray destruction of the ozone layer had already begun. Verification of the science behind this claim could not have played any part in the breaking of the scare, for there was nothing to show. It turned out that McElroy and Wofsy had not shown their work to anyone, anywhere. Indeed, the calculations they reported to Sullivan were only submitted for publication a few days after the story ran in the New York Times. By that time already, the science did not matter; when McElroy and Wofsy’s calculations finally appeared in print in February 1975, the response to the scare was in full swing, with spray-can boycotts, with ‘ban the can’ campaigns, and with bills to that effect on the table in Congress. Pg.82

It [the NAS panel] was on track to deliver its findings by April 1976 when it was hit with the shocking discovery of a new chlorine ‘sink’. On receiving this news, it descended into confusion and conflict, and this made impossible the timely delivery of its much-anticipated report. The new ‘sink’ was chlorine nitrate. When chlorine reacts to form chlorine nitrate its attack on ozone is neutralised. It was not that chlorine nitrate had previously been ignored, but that it was previously considered very unstable. However, late in 1975 Rowland concluded it was actually quite stable in the mid-stratosphere, and therefore the two most feared ozone eaters—NOx and CFCs—would neutralise each other: not only could natural NOx moderate the CFC effect, but hairsprays and deodorants could serve to neutralise any damage Concorde might cause. Pg.84

Now, at the height of the spray-can scare, there was a shift back to climate. This was reinforced when others began to point to the greenhouse effect of CFCs. An amazing projection, which would appear prominently in the NAS report, was that CFCs alone would increase global mean temperature by 1°C by the end of the century—and that was only at current rates of emissions! In all this, McElroy was critical of Rowland (and others) for attempting to maintain the momentum of the scare by switching to climatic change as soon as doubts about the cancer scare emerged. It looked like the scientists were searching for a new scientific justification of the same policy outcome. Pg.87

The Legacy

The ban on the non-essential uses of spray-can CFCs that came into force in January 1978 marked a peak in the rolling ozone scares of the 1970s. Efforts to sustain the momentum and extend regulation to ‘essential’ spray cans, to refrigeration, and on to a complete ban, all failed. The tail-end of the SST-ozone scare had also petered out after the Franco-British consortium finally won the right to land their Concorde in New York State in 1977. And generally in the late 1970s, the environmental regulation movement was losing traction, with President Carter’s repeated proclamations of an environmental crisis becoming increasingly shrill (more on that below). Eventually, in 1981, Ronald Reagan’s arrival at the White House gave licence and drive to a backlash against environmental regulation that had been building throughout the 1970s. Long before Reagan’s arrival, it was made clear in various forums that further regulatory action on CFCs could only be premised on two things: international cooperation and empirical evidence. Pg.89

To some extent, the demand for better science had always been resisted. From the beginning, advocates conceded that direct and unequivocal evidence of CFC-caused depletion might be impossible to gain before it is too late.  But concerns over whether the science was adequate went deeper. The predictions were based on simple models of a part of our world that was still remote and largely unknown. Pg.91

Summary

The CFC scare brought the focus of dangerous behavior down from the stratosphere to spray cans in the hands of ordinary people, along with their use of air conditioners so essential to life in the sunny places people prefer. Speculation about ozone holes over the polar regions was also more down to earth. And for the first time all of this concern produced an international treaty with extraordinary cooperation against CFCs, with UNEP soaring into prominence and gaining much credit for guiding the policy process.

The CO2 Scare

The Context

In the USA during the late 1970s, scientific interest in the potential catastrophic climatic consequences of carbon dioxide emissions came to surpass other climatic concerns. Most importantly, it came to surpass the competing scientific and popular anxiety over global cooling and its exacerbation by aerosol emissions. However, it was only during the late 1980s that the ‘carbon dioxide question’ broke out into the public discourse and transformed into the campaign to mitigate greenhouse warming. For more than a decade before the emergence of this widespread public concern, scientists were working on the question under generous government funding. Pg.122

The proven trigger for the release of funding was to forewarn of catastrophe, to generate public fear and so motivate administrators and politicians to fund investigations targeting the specific issue. The dilemma for the climatic research leadership was that calls for more research to assess the level of danger would fail unless declarations of danger were already spreading fear. Pg.143

The scare that would eventually triumph over all preceding global environmental scares, and the scare that would come to dominate climatic research funding, began with a coordinated, well-funded program of research into potentially catastrophic effects. It did so before there was any particular concern within the meteorological community about these effects, and before there was any significant public or political anxiety to drive it. It began in the midst of a debate over the relative merits of coal and nuclear energy production. Pg.144

The Alarm

In February 1979, at the first ever World Climate Conference, meteorologists would for the first time raise a chorus of warming concern. These meteorologists were not only Americans. Expert interest in the carbon dioxide threat had arisen during the late 1970s in Western Europe and Russia as well. However, there seemed to be nothing in particular that had triggered this interest. There was no new evidence of particular note. Nor was there any global warming to speak of. Global mean temperatures remained subdued, while in 1978 another severe winter descended over vast regions of North America. The policy environment also remained unsympathetic. Pg.184

At last, during the early 1980s, Nature gave some clear signals that it was coming out on the side of the warmers. In the early 1980s it started to become clear that the four-decade general cooling trend was over. Weather station records in the northern mid-latitudes began again to show an upward trend, which was traceable back to a turnaround during the 1970s. James Hansen was early in announcing this shift, and in doing so he also excited a foreboding of manmade warming. Pg.193

Besides, there was a much grander diluvian story that continued to gain currency: the semi-submerged West Antarctic ice sheet might detach and slide into the sea. This was for some an irresistible image of terrible beauty: displacement on a monumental scale, humanity unintentionally applying the lever of industrial emissions to cast off this inconceivably large body of ice. As if imagining some giant icy Archimedes slowly settling into his overflowing bath, Hansen calculated the consequential displacement to give a sea-level rise of 5 or 6 metres within a century. Pg.195

Moreover, it had the imprimatur of the American Association for the Advancement of Science; the AAAS journal, Science, was esteemed in the USA above all others. Thus we can forgive Sullivan his credulity of this string of claims: that the new discovery of ‘clear evidence’ shows that emissions have ‘already warmed the climate’, that this supports a prediction of warming in the next century of ‘almost unprecedented magnitude’, and that this warming might be sufficient to ‘melt and dislodge the ice cover of West Antarctica’. The cooling scare was barely in the grave, but the warmers had been rehearsing in the wings. Now their most daring member jumped out and stole the show. Pg.196

But Hansen went beyond this graph and beyond the conclusion of his published paper to firstly make a strong claim of causation, and then, secondly, to relate this cause to the heat being experienced that year (indeed, the heat being experienced in the hearing room even as he spoke!). He explained that ‘the Earth is warmer in 1988 than at any time in the history of instrumental measurements’. He had calculated that ‘there is only a 1 percent chance of an accidental warming of this magnitude. . . ’ This could only mean that ‘the greenhouse effect has been detected, and it is changing our climate now’. Hansen’s detection claim was covered by all the main television network news services and it won for him another New York Times front page headline: Global warming has begun, expert tells Senate. Pg.224

The Exaggeration

Where SCOPE 29 looked toward the time required for a doubling of the atmospheric concentration of carbon dioxide, at Villach the policy recommendation would be based on new calculations for the equivalent effect when all emitted greenhouse gases were taken into account. The impact of the new calculations was to greatly accelerate the rate of the predicted warming. According to SCOPE 29, on current rates of emissions, doubling of the carbon dioxide concentration would be expected in 2100. At Villach, the equivalent warming effect of all greenhouse gases was expected as early as 2030. Pg.209

This new doubling date slipped under a psychological threshold: the potential lifetime of the younger scientists in the group. Subsequently, these computations were generally rejected and the agreed date for ‘the equivalent of CO2 doubling’ was pushed out at least 20 years; indeed, never again would there be a doubling estimate so proximate with the time in which it was made. Pg.209
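The arithmetic behind these shifting doubling dates is simple compound growth. Below is a minimal sketch in Python (not Lewin’s or the Villach group’s actual method; the 280 ppm pre-industrial baseline and a mid-1980s concentration near 345 ppm are standard figures, but the growth rates are hypothetical, chosen only to show how strongly the assumed rate moves the date):

```python
import math

def doubling_year(start_year, start_ppm, baseline_ppm, annual_growth):
    """Year when 'equivalent CO2' reaches twice the pre-industrial
    baseline, assuming constant exponential growth from start_ppm."""
    years = math.log(2 * baseline_ppm / start_ppm) / math.log(1 + annual_growth)
    return start_year + years

# CO2 alone, with a slow (hypothetical) growth rate: doubling lands near 2100.
print(round(doubling_year(1985, 345, 280, 0.004)))  # -> 2106

# Fold all greenhouse gases into a faster (hypothetical) effective growth
# rate of 'equivalent CO2': the same formula pulls the date in to about 2030.
print(round(doubling_year(1985, 345, 280, 0.011)))  # -> 2029
```

The point is not these particular numbers but the sensitivity: a shift of well under one percentage point in the assumed growth rate moves the doubling date by some seventy years, which is the lever the Villach recalculation pulled and the later revisions pushed back.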

Like so many of the consensus statements from this time on, this one is twisted so that it gives the appearance of saying more than it actually does. In this way, those pushing for dramatic effect and those concerned not to overstate the case can come to agreement. In fact, this passage of the statement brings the case for alarm down to the reliability of the modelling, which is pretty much true of SCOPE 29. Pg.210

In other words, the Impact on Climate Change working group concluded that the models are not yet ready to make predictions (however vaguely) about the impact of greenhouse gas emissions on the global climate.  Pg.210

The Legacy

Today, emissions targets dominate discussions of the policy response to global warming, and total emissions rates are tacitly assumed to be locked to a climatic response of one, two or so many degrees of warming. Today’s discussion sits on top of a solid foundation of dogma established across several decades and supposedly supported by a scientific consensus, namely that there is a direct cause–effect temperature response to emissions. Pg.219

One of the main recommendations for mitigating these dire consequences is a comprehensive global treaty to protect the atmosphere. On the specific issue of global warming, the conference statement calls for the stabilisation of atmospheric concentrations of one greenhouse gas, namely carbon dioxide. It estimates that this would require a reduction of current global emissions by more than 50%. However, it suggests an initial goal for nations to reduce their current rates of carbon dioxide emission by 20% by 2005. This rather arbitrary objective would become the headline story: ‘Targets agreed to save climate’. And it stuck. In the emissions-reduction policy debate that followed, this ‘Toronto target’ became the benchmark. For many years to come—indeed, until the Kyoto Protocol of 1997—it would be a key objective of sustainable development’s newly launched flagship. Pg.221

Summary

The framework for international action is established presuming that CO2 emissions directly cause global warming and that all nations must collectively cut their use of fossil fuels. However, the drive for a world treaty is hampered by a lack of proof and scientists’ mixed commitment to the policy goals.

 

The IPCC Scare

The Context

Before winter closed in at the end of 1988, North America was brimming with warming enthusiasm. In the USA, global warming was promised attention no matter who won the presidential election. In Canada, after the overwhelming success of the Toronto conference, the government continued to promote the cause, most enthusiastically through its environment minister Tom McMillan. Elsewhere among world leaders, enthusiasm was also building. The German chancellor, Helmut Kohl, had been a long-time campaigner against fossil fuels. Pg.224

In February 1989, the year got off to a flying start with a conference in Delhi organised by India’s Tata Energy Research Institute and the Woods Hole Research Center, which convened to consider global warming from the perspective of developing countries. The report of the conference produced an early apportionment of blame and a call for reparations. It proclaimed that the global warming problem had been caused by the industrially developed countries and therefore its remediation should be financed by them, including by way of aid to underdeveloped countries. This call was made after presenting the problem in the most alarming terms: Global warming is the greatest crisis ever faced collectively by humankind, unlike other earlier crises, it is global in nature, threatens the very survival of civilisation, and promises to throw up only losers over the entire international socio-economic fabric. The reason for such a potential apocalyptic scenario is simple: climate change of geological proportions are occurring over time-spans as short as a single human lifetime. Pg.226

Throughout 1989, the IPCC working groups conducted a busy schedule of meetings and workshops at venues around the northern hemisphere. Meanwhile, the outpouring of political excitement that had been channelled into the process brought world attention to the IPCC. By the time of its second full session in June 1989, its treaty development mandate had become clearer: the final version of the resolution that had passed at the UN General Assembly the previous December—now called ‘Protection of global climate for present and future generations of mankind’—requested that the IPCC make recommendations on strengthening relevant existing international legal instruments and on ‘elements for inclusion in a possible future international convention on climate’. Pg.242

The Alarm

The general feeling in the research community that the policy process had surged ahead of the science often had a different effect on those scientists engaged with the global warming issue through its expanded funding. For them, the situation was more as President Bush had intimated when promising more funding: the fact that ‘politics and opinion have outpaced the science’ brought the scientists under pressure ‘to bridge the gap’. Pg.253

This is what became known as the ‘first detection’ program. With funding from DoE and elsewhere, the race was soon on to find ways to achieve early detection of the climate catastrophe signal. More than 10 years later, this search was still ongoing as the framework convention to mitigate the catastrophe was being put in place. It was not so much that the ‘conventional wisdom’ was proved wrong; in other words, that policy action did not in fact require empirical confirmation of the emissions effect. It was more that the policy action was operating on the presumption that this confirmation had already been achieved. Pg.254

The IPCC has warned that if CO2 emissions are not cut by 60 percent immediately, the changes in the next 60 years may be so rapid that nature will be unable to adapt and man incapable of controlling them.  The policy action to meet this threat—the UN Framework Convention on Climate Change—went on to play a leading role as the headline outcome of the entire show. The convention drafted through the INC negotiation over the previous two years would not be legally binding, but it would provide for updates, called ‘protocols’, specifying mandatory emissions limits. Towards the end of the Earth Summit, 154 delegations put their names to the text. Pg.266

The Exaggeration

It may surprise readers that even within the ‘carbon dioxide community’ it was not hard to find the view that the modelling of the carbon dioxide warming was failing validation against historical data and, further upon this admission, the suggestion that their predicted warming effect is wrong. In fact, there was much scepticism of the modelling freely expressed in and around the Carbon Dioxide Program in these days before the climate treaty process began. Those who persisted with the search for validation got stuck on the problem of better identifying background natural variability. There did at least seem to be agreement that any recent warming was well within the bounds of natural variability. Pg.261

During the IPCC review process, Wigley was asked to answer the question that he had avoided in SCOPE 29: When is detection likely to be achieved? He responded with an addition to the IPCC chapter explaining that we would have to wait until the half-degree of warming that had already occurred during the 20th century is repeated. Only then are we likely to determine just how much of it is human-induced. If the carbon dioxide driven warming is at the high end of the predictions, then this would be early in the 21st century, but if the warming is slow then we may not know until 2050 (see Figure 15.1). In other words, scientific confirmation that carbon dioxide emissions are causing global warming is not likely for decades. Pg.263

These findings of the IPCC Working Group 1 assessment presented a political problem. This was not so much that the working group was giving the wrong answers; it was that it had got stuck on the wrong questions, questions obsolete to the treaty process. The IPCC first assessment was supposed to confirm the scientific rationale for responding to the threat of climate change, the rationale previously provided by the consensus statement coming out of the 1985 Villach conference. After that, it would provide the science to support the process of implementing a coordinated response. But instead of confirming the Villach findings, it presented a gaping hole in the scientific rationale. Pg.263

Scientist-advocates would continue their activism, but political leaders who pledged their support for climate action had invested all scientific authority for this action in the IPCC assessment. What did the IPCC offer in return? It had dished up dubiously validated model projections and the prospect of empirical confirmation perhaps not for decades to come. Far from legitimising a treaty, the scientific assessment of Working Group 1 provided governments with every reason to hesitate before committing to urgent and drastic action. Pg.263

In 1995, the IPCC was stuck between its science and its politics. The only way it could save itself from the real danger of political oblivion would be if its scientific diagnosis could shift in a positive direction and bring it into alignment with policy action. Without a positive shift in the science, it is hard to see how even the most masterful spin on another assessment could serve to support momentum towards real commitment in a binding protocol. With ozone protection, the Antarctic hole had done the trick and brought on agreement in the Montreal Protocol. But there was nothing like that in sight for the climate scare. Without a shift in the science, the IPCC would only cause further embarrassment and so precipitate its further marginalisation. Pg.278

For the second assessment, the final meeting of the 70-odd Working Group 1 lead authors was scheduled for July 1995 in Asheville, North Carolina. This meeting was set to finalise the drafting of the chapters in response to review comments. It was also (and mostly) to finalise the draft Summary for Policymakers, ready for intergovernmental review. The draft Houghton had prepared for the meeting was not so sceptical on the detection science as the main text of the detection chapter drafted by Santer; indeed it contained a weak detection claim. However, it matched the introduction to the detection chapter, where Santer had included the claim that ‘the best evidence to date suggests . . . a pattern of climate response to human activities is identifiable in observed climate records’.

This detection claim appeared incongruous with the scepticism throughout the main text of the chapter and was in direct contradiction with its Concluding Summary. It represented a change of view that Santer had only arrived at recently due to a breakthrough in his own ‘fingerprinting’ investigations. These findings were so new that they were not yet published or otherwise available, and, indeed, Santer’s first opportunity to present them for broader scientific scrutiny was when Houghton asked him to give a special presentation to the Asheville meeting. Pg.279

However, the results were also challenged at Asheville: Santer’s fingerprint finding and the new detection claim were vigorously opposed by several experts in the field. One of the critics, John Christy, recalls challenging Santer on his data selection.  Santer recalls disputing the quality of the datasets used by Christy.  Debates over the scientific basis of the detection claim dominated the meeting, sometimes continuing long after the formal discussions had finished and on into the evening. Pg.280

In September, a draft summary of the entire IPCC second assessment was leaked to the New York Times, the new detection claim revealed on its front page. Pg.281

The UK Independent headlined ‘Global Warming is here, experts agree’ with the subheading: ‘Climate of fear: Old caution dropped as UN panel of scientists concur on danger posed by greenhouse gases.’ The article explains the breakthrough: “The panel’s declaration, after three days of torturous negotiation in Madrid, marks a decisive shift in the global-warming debate. Sceptics have claimed there is no sound evidence that climate has been changed by the billions of tonnes of carbon dioxide and other heat-trapping ‘greenhouse gases’ spewed into the atmosphere each year, mostly from the burning of fossil fuels and forests. But the great majority of governments and climate scientists now think otherwise and are now prepared to say so. ‘The balance of evidence suggests a discernible human influence on global climate’, the IPCC’s summary of its 200-page report says. The last such in-depth IPCC report was published five years ago and was far more cautious.” Pg.283

The Legacy

Stories appearing in the major newspapers over the next few days followed a standard pattern. They told how the new findings had resolved the scientific uncertainty and that the politically motivated scepticism that this uncertainty had supported was now untenable. Not only was the recent success of the attribution finding new to this story; also new was the previous failure. Before this announcement of the detection breakthrough, attention had rarely been drawn to the lack of empirical confirmation of the model predictions, but now this earlier failure was used to give a stark backdrop to the recent success, maximising its impact and giving a scientific green light to policy action. Thus, the standard narrative became: success after the previous failure points the way to policy action. Pg.284

With so many political actors using the authority of the IPCC’s detection finding to justify advancing in that direction, it is hard to disagree with Lewin’s assessment. Another authority might well have been used to carry the treaty politics forward, but the fact that this particular authority was available, and was used, meant that the IPCC was hauled back into the political picture, where it remains the principal authority on the science to this day. Pg.301

What we can see from all this activity by scientists in the close vicinity of the second and third IPCC assessments is the existence of a significant body of opinion that is difficult to square with the IPCC’s message that the detection of the catastrophe signal provides the scientific basis for policy action. Most of these scientists chose not to engage the IPCC in public controversy, and so their views had little effect on the public image of the panel. But even where the scientific basis of the detection claims drew repeated and pointed criticism from those prepared to engage in the public controversy, these objections had very little impact on the IPCC’s public image. Pg.310

Today, after five full assessments and with another on the way, the IPCC remains the pre-eminent authority on the science behind every effort to head off a global climate catastrophe. Pg.310

Summary

Today the IPCC is a testament to the triumph of politics over science, of style and rhetoric over substance and evidence. A “bait and switch” gambit was applied at the right moment to produce the message wanted by the committed. Fooled by the finesse, the media then trumpeted the “idea whose time has come,” and the rest is history, as they say.   And yet, despite IPCC claims to the contrary, the detection question is still not answered for those who demand evidence.

Thank you Bernie Lewin and GWPF for setting the record straight, and for demonstrating how this campaign is sustained by unfounded fears.

A continuing supply of hot air keeps scare balloons inflated.

The Alarmist Beehive

This post is about bees since they are also victims of science abuse by environmental activists, aided and abetted by the media. The full story is told by Jon Entine at Slate: Do Neonics Hurt Bees? Researchers and the Media Say Yes. The Data Do Not.
A new, landmark study provides plenty of useful information. If only we could interpret it accurately. Synopsis below.

Futuristic Nightmare Scenarios

“Neonicotinoid Pesticides Are Slowly Killing Bees.”

No, there is no consensus evidence that neonics are “slowly killing bees.” No, this study did not add to the evidence that neonics are driving bee health problems. And yet . . .

Unfortunately, and predictably, the overheated mainstream news headlines also generated a slew of even more exaggerated stories on activist and quack websites where undermining agricultural chemicals is a top priority (e.g., Greenpeace, End Times Headlines, and Friends of the Earth). The takeaway: The “beepocalypse” is accelerating. A few news outlets, such as Reuters (“Field Studies Fuel Dispute Over Whether Banned Pesticides Harm Bees”) and the Washington Post (“Controversial Pesticides May Threaten Queen Bees. Alternatives Could Be Worse.”), got the contradictory findings of the study and the headline right.

But based on the study’s data, the headline could just as easily have read: “Landmark Study Shows Neonic Pesticides Improve Bee Health”—and it would have been equally correct. So how did so many people get this so wrong?

Bouncing off a database can turn your perspective upside down.

Using Data as a Trampoline rather than Mining for Understanding

This much-anticipated two-year, $3.6 million study is particularly interesting because it was primarily funded by two major producers of neonicotinoids, Bayer Crop Science and Syngenta. They had no involvement with the analysis of the data. The three-country study was led by the Centre for Ecology and Hydrology, or CEH, in the U.K.—a group known for its skepticism of pesticides in general and neonics in particular.

The raw data—more than 1,000 pages of it (only a tiny fraction is reproduced in the study)—are solid, a reservoir of important information for entomologists and ecologists trying to figure out the challenges facing bees. It’s particularly important because, to date, the problem with much of the research on neonicotinoids has been the wide gulf between the findings from laboratory-based studies and field studies.

Some, but not all, lab studies have claimed neonics cause health problems in honeybees and wild bees, endangering the world food supply. This has been widely and often breathlessly echoed in the popular media—remember the execrably reported Time cover story on “A World Without Bees.” But the doses and time of exposure have varied dramatically from lab study to lab study, so many entomologists remain skeptical of these sweeping conclusions. Field studies have consistently shown a different result—in the field, neonics seem to pose little or no harm. The overwhelming threat to bee health, entomologists now agree, is a combination of factors led by the deadly Varroa destructor mite, the miticides used to control it, and beekeeping practices. Next to these factors, neonics are seen as relatively inconsequential.

The Bees are all right. Carry on.

Disparity between Field and Lab Research (sound familiar?)

Jon Entine addressed this disparity between field and lab research in a series of articles at the Genetic Literacy Project, and specifically summarized two dozen key field studies, many of which were independently funded and executed. This study was designed in part to bridge that gulf. And the devil is in the interpretation.

Overall, the data collected from 33 different fields covered 42 analyses and 258 endpoints—a staggering number. The paper presented only a sliver of that data—a selective glimpse of what the research, in its entirety, showed.

What patterns emerged when examining the entire data set? . . . In sum, of 258 endpoints, 238—92 percent—showed no effects. (Four endpoints didn’t yield data.) Only 16 showed effects. Negative effects showed up 9 times—3.5 percent of all outcomes; 7 showed a benefit from using neonics—2.7 percent.

As one scientist pointed out, in statistics there is a widely accepted standard that random results are generated about 5 percent of the time—which means by chance alone we would expect 13 results meaninglessly showing up positive or negative.
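To make that arithmetic concrete, here is a minimal Python sketch (purely illustrative; the endpoint counts come from the passage above, while the independence assumption is mine, not the study's):

```python
# Quick sanity check of the "expected by chance" arithmetic above.
# Assumption (mine, not the study's): each of the 254 endpoints that
# yielded data is an independent test at a 5 percent significance level.
alpha = 0.05
endpoints_with_data = 258 - 4  # four endpoints yielded no data

expected_by_chance = alpha * endpoints_with_data
print(f"Expected chance 'effects': {expected_by_chance:.1f}")  # ~12.7, i.e. about 13

observed_effects = 9 + 7  # negative plus positive effects reported
print(f"Observed 'effects': {observed_effects}")  # 16, close to the chance expectation
```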

Norman Carreck, science director of the International Bee Research Association, who was not part of either study, noted that the small number of significant effects “makes it difficult to draw any reliable conclusions.”

Moreover, Bees Are Not in Decline

The broader context of the bee health controversy is also important to understand; bees are not in sharp decline—neither in North America nor in Europe, where neonics are under a temporary ban that shows signs of becoming permanent, nor worldwide. Earlier this week, Canada reported that its honeybee colonies grew 10 percent year over year and now stand at about 800,000. That’s a new record, and the growth tracks the increased use of neonics, which are critical to canola crops in Western Canada, where 80 percent of the nation’s honey originates.

Managed beehives in the U.S. had been in steady decline since the 1940s, as farm land disappeared to urbanization, but began stabilizing in the mid-1990s, coinciding with the introduction of neonicotinoids. They hit a 22-year high in the last count.

Global hive numbers have steadily increased since the 1960s except for two brief periods—the emergence of the Varroa mite in the late 1980s and the brief outbreak of colony collapse disorder, mostly in the U.S., in the mid-2000s.

Conclusion

So the bees, contrary to widespread popular belief, are actually doing all right in terms of numbers, although the Varroa mite remains a dangerous challenge. But still, a cadre of scientists well known for their vocal opposition to and lobbying against neonics have already begun trying to leverage the misinterpretation of the data. Within hours of the release of the study, advocacy groups opposed to intensive agricultural techniques had already begun weaponizing the misreported headlines.

But viewing the data from the European study in context makes it even more obvious that sweeping statements about the continuing beepocalypse and the deadly dangers to bees from pesticides, and neonicotinoids in particular, are irresponsible. That’s on both the scientists and the media.

Summary

The comparison with climate alarmism is obvious. The data is equivocal and subject to interpretation. Lab studies cannot be replicated in the real world. Activists make mountains out of molehills. Reasonable, balanced analysts are ignored or silenced. Media outlets proclaim the end of life as we know it to capture ears and eyeballs for advertisers, and to build their audiences (CNN: “All the fear all the time”). Business as usual for Crisis Inc.


Ocean Oxygen Misdirection


The climate scare machine is again promoting the fear of suffocating oceans. For example, an article this week by Chris Mooney in the Washington Post: It’s Official, the Oceans are Losing Oxygen.

A large research synthesis, published in one of the world’s most influential scientific journals, has detected a decline in the amount of dissolved oxygen in oceans around the world — a long-predicted result of climate change that could have severe consequences for marine organisms if it continues.

The paper, published Wednesday in the journal Nature by oceanographer Sunke Schmidtko and two colleagues from the GEOMAR Helmholtz Centre for Ocean Research in Kiel, Germany, found a decline of more than 2 percent in ocean oxygen content worldwide between 1960 and 2010.

Climate change models predict the oceans will lose oxygen because of several factors. Most obvious is simply that warmer water holds less dissolved gases, including oxygen. “It’s the same reason we keep our sparkling drinks pretty cold,” Schmidtko said.

But another factor is the growing stratification of ocean waters. Oxygen enters the ocean at its surface, from the atmosphere and from the photosynthetic activity of marine microorganisms. But as that upper layer warms up, the oxygen-rich waters are less likely to mix down into cooler layers of the ocean because the warm waters are less dense and do not sink as readily.
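The solubility point (“warmer water holds less dissolved gases”) is easy to illustrate. The sketch below uses rounded textbook values for dissolved-oxygen saturation in fresh water at 1 atm; the numbers and the linear interpolation are illustrative approximations, not anything taken from the Nature paper:

```python
# Illustrative only: approximate dissolved-O2 saturation for fresh water
# at 1 atm, from rounded textbook solubility values (not from the paper).
O2_SAT_MG_L = {0: 14.6, 10: 11.3, 20: 9.1, 30: 7.6}

def o2_saturation(temp_c: float) -> float:
    """Linearly interpolate saturation (mg/L) between tabulated points."""
    temps = sorted(O2_SAT_MG_L)
    if temp_c <= temps[0]:
        return O2_SAT_MG_L[temps[0]]
    if temp_c >= temps[-1]:
        return O2_SAT_MG_L[temps[-1]]
    for lo, hi in zip(temps, temps[1:]):
        if lo <= temp_c <= hi:
            frac = (temp_c - lo) / (hi - lo)
            return O2_SAT_MG_L[lo] + frac * (O2_SAT_MG_L[hi] - O2_SAT_MG_L[lo])

for t in (5, 15, 25):
    print(f"{t:2d} C -> ~{o2_saturation(t):.1f} mg/L")  # falls as water warms
```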

And of course, other journalists pile on with ever more catchy headlines.

The World’s Oceans Are Losing Oxygen Due to Climate Change

How Climate Change Is Suffocating The Oceans

Overview of Oceanic Oxygen

Once again climate alarmists/activists have seized upon an actual environmental issue, but misdirect the public toward their CO2 obsession, and away from practical efforts to address a real concern. Some excerpts from scientific studies serve to put things in perspective.

How the Ocean Breathes

Variability in oxygen and nutrients in South Pacific Antarctic Intermediate Water by J. L. Russell and A. G. Dickson

The Southern Ocean acts as the lungs of the ocean, drawing in oxygen and exchanging carbon dioxide. A quantitative understanding of the processes regulating the ventilation of the Southern Ocean today is vital to assessments of the geochemical significance of potential circulation reorganizations in the Southern Hemisphere, both during glacial-interglacial transitions and into the future.

Traditionally, the change in the concentration of oxygen along an isopycnal due to remineralization of organic material, known as the apparent oxygen utilization (AOU), has been used by physical oceanographers as a proxy for the time elapsed since the water mass was last exposed to the atmosphere. The concept of AOU requires that newly subducted water be saturated with respect to oxygen and is calculated from the difference between the measured oxygen concentration and the saturated concentration at the sample temperature.

This study has shown that the ratio of oxygen to nutrients can vary with time. Since Antarctic Intermediate Water provides a necessary component to the Pacific equatorial biological regime, this relatively high-nutrient, high-oxygen input to the Equatorial Undercurrent in the Western Pacific plays an important role in driving high rates of primary productivity on the equator, while limiting the extent of denitrifying bacteria in the eastern portion of the basin. 
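Here is a minimal sketch of the AOU bookkeeping described above, using the standard sign convention (saturation minus measured, so positive AOU means oxygen has been consumed at depth); the example numbers are hypothetical:

```python
# Minimal sketch of apparent oxygen utilization (AOU). Standard sign
# convention: AOU = saturation concentration minus measured concentration,
# so positive AOU means oxygen was consumed (respired) since the water
# was last at the surface. Example numbers are hypothetical.
def apparent_oxygen_utilization(o2_measured: float, o2_saturated: float) -> float:
    """AOU in the same units as its inputs (commonly umol/kg)."""
    return o2_saturated - o2_measured

o2_sat = 310.0  # saturation at the sample's temperature/salinity, umol/kg
o2_obs = 250.0  # measured dissolved oxygen, umol/kg
print(f"AOU = {apparent_oxygen_utilization(o2_obs, o2_sat):.0f} umol/kg")  # 60
```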

Uncertain Measures of O2 Variability and Linkage to Climate Change

A conceptual model for the temporal spectrum of oceanic oxygen variability by Taka Ito and Curtis Deutsch

Changes in dissolved O2 observed across the world oceans in recent decades have been interpreted as a response of marine biogeochemistry to climate change. Little is known, however, about the spectrum of oceanic O2 variability. Using an idealized model, we illustrate how fluctuations in ocean circulation and biological respiration lead to low-frequency variability of thermocline oxygen.

Because the ventilation of the thermocline naturally integrates the effects of anomalous respiration and advection over decadal timescales, short-lived O2 perturbations are strongly damped, producing a red spectrum, even in a randomly varying oceanic environment. This background red spectrum of O2 suggests a new interpretation of the ubiquitous strength of decadal oxygen variability and provides a null hypothesis for the detection of climate change influence on oceanic oxygen. We find a statistically significant spectral peak at a 15–20 year timescale in the subpolar North Pacific, but the mechanisms connecting to climate variability remain uncertain.

The spectral power of oxygen variability increases from inter-annual to decadal frequencies, which can be explained using a simple conceptual model of an ocean thermocline exposed to random climate fluctuations. The theory predicts that the bias toward low-frequency variability is expected to level off as the forcing timescales become comparable to that of ocean ventilation. On time scales exceeding that of thermocline renewal, O2 variance may actually decrease due to the coupling between physical O2 supply and biological respiration [Deutsch et al., 2006], since the latter is typically limited by the physical nutrient supply.
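The “red spectrum” null hypothesis is easy to reproduce in miniature. The sketch below is a generic AR(1) stand-in for a reservoir that integrates random forcing with a finite damping timescale; it is not the Ito and Deutsch model, and the 10-year timescale is an assumption for illustration:

```python
# Minimal simulation of the conceptual point above: a reservoir that
# integrates random (white-noise) forcing with a finite damping time
# produces "red" low-frequency variability even with no trend at all.
# Generic AR(1) illustration, not the Ito and Deutsch model; the
# 10-year damping timescale is an assumption for illustration.
import numpy as np

rng = np.random.default_rng(0)
n_years = 2000
tau = 10.0                # assumed ventilation/damping timescale (years)
phi = np.exp(-1.0 / tau)  # equivalent AR(1) coefficient for annual steps

o2_anom = np.zeros(n_years)
forcing = rng.standard_normal(n_years)  # random respiration/advection anomalies
for t in range(1, n_years):
    o2_anom[t] = phi * o2_anom[t - 1] + forcing[t]

# Periodogram: variance piles up at low frequencies (a red spectrum).
power = np.abs(np.fft.rfft(o2_anom)) ** 2 / n_years
freqs = np.fft.rfftfreq(n_years, d=1.0)  # cycles per year

low = power[(freqs > 0) & (freqs < 0.02)].mean()  # periods longer than 50 yr
high = power[freqs > 0.4].mean()                  # periods shorter than 2.5 yr
print(f"low-frequency power is ~{low / high:.0f}x the high-frequency power")
```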

Climate Model Projections are Confounded by Natural Variability

Natural variability and anthropogenic trends in oceanic oxygen in a coupled carbon cycle–climate model ensemble by T. L. Frolicher et al.

Internal and externally forced variability in oceanic oxygen (O2) are investigated on different spatiotemporal scales using a six-member ensemble from the National Center for Atmospheric Research CSM1.4-carbon coupled climate model. The oceanic O2 inventory is projected to decrease significantly in global warming simulations of the 20th and 21st centuries.

The anthropogenically forced O2 decrease is partly compensated by volcanic eruptions, which cause considerable interannual to decadal variability. Volcanic perturbations in oceanic oxygen concentrations gradually penetrate the ocean’s top 500 m and persist for several years. While well identified on global scales, the detection and attribution of local O2 changes to volcanic forcing is difficult because of unforced variability.

Internal climate modes can substantially contribute to surface and subsurface O2 variability. Variability in the North Atlantic and North Pacific is associated with changes in the North Atlantic Oscillation and Pacific Decadal Oscillation indexes. Simulated decadal variability compares well with observed O2 changes in the North Atlantic, suggesting that the model captures key mechanisms of late 20th century O2 variability, but the model appears to underestimate variability in the North Pacific.

Our results suggest that large interannual to decadal variations and limited data availability make the detection of human-induced O2 changes currently challenging.

The concentration of dissolved oxygen in the thermocline and the deep ocean is a particularly sensitive indicator of change in ocean transport and biology [Joos et al., 2003]. Less than a percent of the combined atmosphere and ocean O2 inventory is found in the ocean. The O2 concentration in the ocean interior reflects the balance between O2 supply from the surface through physical transport and O2 consumption by respiration of organic material.

Our modeling study suggests that over recent decades internal natural variability tends to mask simulated century-scale trends in dissolved oxygen from anthropogenic forcing in the North Atlantic and Pacific. Observed changes in oxygen are similar or even smaller in magnitude than the spread of the ensemble simulation. The observed decreasing trend in dissolved oxygen in the Indian Ocean thermocline and the boundary region between the subtropical and subpolar gyres in the North Pacific has reversed in recent years [McDonagh et al., 2005; Mecking et al., 2008], implicitly supporting this conclusion.

The presence of large-scale propagating O2 anomalies, linked with major climate modes, complicates the detection of long-term trends in oceanic O2 associated with anthropogenic climate change. In particular, we find a statistically significant link between O2 and the dominant climate modes (NAO and PDO) in the North Atlantic and North Pacific surface and subsurface waters, which are causing more than 50% of the total internal variability of O2 in these regions.

To date, the ability to detect and interpret observed changes is still limited by lack of data. Additional biogeochemical data from time series and profiling floats, such as the Argo array (http://www.argo.ucsd.edu), are needed to improve the detection of ocean oxygen and carbon system changes and our understanding of climate change.

The Real Issue is Ocean Dead Zones, Both Natural and Man-made

Since 1994, Robert Diaz and the World Resources Institute (report here) in Washington, D.C., have identified and mapped 479 dead zones around the world. That’s more than nine times as many as scientists knew about 50 years ago.

What triggers the loss of oxygen in ocean water is the explosive growth of sea life fueled by the release of too many nutrients. As they grow, these crowds can simply use up too much of the available oxygen.

Many nutrients entering the water — such as nitrogen and phosphorus — come from meeting the daily needs of some seven billion people around the world, Diaz says. Crop fertilizers, manure, sewage and exhaust spewed by cars and power plants all end up in waterways that flow into the ocean. Each can contribute to the creation of dead zones.

Ordinarily, when bacteria steal oxygen from one patch of water, more will arrive as waves and ocean currents bring new water in. Waves also can grab oxygen from the atmosphere.

Dead zones develop when this ocean mixing stops.

Rivers running into the sea dump freshwater into the salty ocean. The sun heats up the freshwater on the sea surface. This water is lighter than cold saltier water, so it floats atop it. When there are not enough storms (including hurricanes) and strong ocean currents to churn the water, the cold water can get trapped below the fresh water for long periods.

Dead zones are seasonal events. They typically last for weeks or months. Then they’ll disappear as the weather changes and ocean mixing resumes.

Solutions are Available and do not Involve CO2 Emissions

Helping dead zones recover

The Black Sea is bordered by Europe and Asia. Dead zones used to develop here that covered an area as large as Switzerland. Fertilizers running off of vast agricultural fields and animal feedlots in the former Soviet Union were a primary cause. Then, in 1989, parts of the Soviet Union began revolting. Two years later, this massive nation broke apart into 15 separate countries.

The political instability hurt farm activity. In short order, use of nitrogen and phosphorus fertilizers by area farmers declined. Almost at once, the size of the Black Sea’s dead zone shrank dramatically. Now if a dead zone forms there, it’s small, Rabalais says. Some years there is none.

Chesapeake Bay, the United States’ largest estuary, has its own dead zone. And the area affected has expanded over the past 50 years due to pollution. But since the 1980s, farmers, landowners and government agencies have worked to reduce the nutrients flowing into the bay.

Farmers now plant cover crops, such as oats or barley, that use up fertilizer that once washed away into rivers. Growers have also established land buffers to absorb nutrient runoff and to keep animal waste out of streams. People have even started to use laundry detergents made without phosphorus.

In 2011, scientists reported that these efforts had achieved some success in shrinking the size of the bay’s late-summer dead zones.

The World Resources Institute lists 55 dead zones as improving. “The bottom line is if we take a look at what is causing a dead zone and fix it, then the dead zone goes away,” says Diaz. “It’s not something that has to be permanent.”

Summary

Alarmists/activists are again confusing the public with their simplistic solution for a complex situation. And actual remedies are available, just not the agenda preferred by climatists.


Waste Management Saves the Ocean


Fearmongers Fan Forest Fire Flames

Playing Climate Whack-a-Mole again, this time with a recent outbreak of claims that forest fires will be worse due to climate change. For example:

Climate change found to double impact of forest fires

Over the past 30 years, human-caused climate change has nearly doubled the amount of forest area lost to wildfires in the western United States, a new study has found.

The result puts hard numbers to a growing hazard that experts say both Canada and the U.S. must prepare for as western forests across North America grow warmer and drier and increasingly spawn wildfires that cannot be contained.

The PNAS study referenced is: Impact of anthropogenic climate change on wildfire across western US forests

Abstract: Increased forest fire activity across the western United States in recent decades has contributed to widespread forest mortality, carbon emissions, periods of degraded air quality, and substantial fire suppression expenditures. Although numerous factors aided the recent rise in fire activity, observed warming and drying have significantly increased fire-season fuel aridity, fostering a more favorable fire environment across forested systems. We demonstrate that human-caused climate change caused over half of the documented increases in fuel aridity since the 1970s and doubled the cumulative forest fire area since 1984. This analysis suggests that anthropogenic climate change will continue to chronically enhance the potential for western US forest fire activity while fuels are not limiting.

It’s Not That Simple

As usual, the screaming headlines exaggerate a more complex and nuanced situation. From the study author:

“We see these big increases in fuel aridity across the Western forests” from 1984 to 2015, Dr. Abatzoglou tells the Monitor in a phone interview. “But what we find is about half, only half, not all of it, but about half of it, is attributable to the human-caused climate change.”

That means that other factors, like natural variability, have compounded with climate change to result in longer fire seasons, larger individual fires, and a ninefold increase in the area burnt over the past 30 years.

Another factor could be the amount of dry, unburned wood that has built up over decades of efforts to suppress wildfires, suggests Abatzoglou. So one solution going forward could be to allow controlled fires to burn off that tinder when conditions are wetter. “We do see a really strong coupling between climate change and forest fire in the Western United States,” he says, but “I think there may be ways to weaken that relationship” – by minimizing fuel for these massive wildfires.

h/t Junk Science

Case Study: Fort McMurray

That PNAS analysis is built upon climate models. On the ground, specific events show more clearly how human and natural factors contribute to a forest fire, and it is not so obviously linked to rising CO2. Blair King writes about Fort McMurray at Huffington Post (here):

We Can’t Blame Climate Change For The Fort McMurray Fires

So what actually caused the fire to be so severe? Well, it appears to be a combination of the effects of El Nino and historic forest management decisions. To explain: after the Slave Lake fire in 2011 the Alberta Government sought advice on the fire situation. The result was the Flat Top Complex Wildfire Review Committee Report, which made a number of recommendations and concluded:

Before major wildfire suppression programs, boreal forests historically burned on an average cycle ranging from 50 to 200 years as a result of lightning and human-caused wildfires. Wildfire suppression has significantly reduced the area burned in Alberta’s boreal forests. However, due to reduced wildfire activity, forests of Alberta are aging, which ultimately changes ecosystems and is beginning to increase the risk of large and potentially costly catastrophic wildfires.

Essentially the report acknowledged that the trees surrounding Fort McMurray are hard-wired for fire and if they are not managed properly then these types of catastrophic fires will become more common. The warm weather may have accelerated the fire season, but the stage was set for such a fire and not enough work was done to avoid it.

The Haines Index

No matter how a fire ignites, a forest fire becomes dangerous because of the weather conditions allowing it to start and grow. The potential for forest fires is often indicated by the Haines Index (HI), which has been widely used for operational fire-weather forecasts in regions of the United States, Canada, and Australia. Several studies have shown a positive correlation between HI and observed fire activity.
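For the curious, the mid-elevation variant of the index combines a stability score (850–700 hPa temperature difference) and a moisture score (850 hPa dewpoint depression), each from 1 to 3. The sketch below uses the commonly tabulated thresholds; treat it as illustrative and check an operational fire-weather reference before relying on it:

```python
# Hedged sketch of the mid-elevation Haines Index: a stability score
# (850-700 hPa temperature difference) plus a moisture score (850 hPa
# dewpoint depression), each 1-3, summing to 2 (low) .. 6 (high potential
# for erratic fire behavior). Thresholds are the commonly tabulated ones
# for the mid-level variant; verify against an operational reference.
def haines_mid(t850_c: float, t700_c: float, td850_c: float) -> int:
    lapse = t850_c - t700_c        # bigger difference = less stable air
    depression = t850_c - td850_c  # bigger depression = drier air

    stability = 1 if lapse < 6 else (2 if lapse <= 10 else 3)
    moisture = 1 if depression < 6 else (2 if depression <= 12 else 3)
    return stability + moisture

# Hypothetical sounding: warm, dry air at 850 hPa under a cool 700 hPa layer.
print(haines_mid(t850_c=20.0, t700_c=8.0, td850_c=2.0))  # -> 6 (high potential)
```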

An important study (here) demonstrates HI is strongly linked to ocean oscillation events such as El Nino.

The Interannual Variability of the Haines Index over North America

The Haines index (HI) is a fire-weather index that is widely used as an indicator of the potential for dry, low-static-stability air in the lower atmosphere to contribute to erratic fire behavior or large fire growth. This study examines the interannual variability of HI over North America and its relationship to indicators of large-scale circulation anomalies. The results show that the first three HI empirical orthogonal function modes are related respectively to El Nino–Southern Oscillation (ENSO), the Arctic Oscillation (AO), and the interdecadal sea surface temperature variation over the tropical Pacific Ocean. During the negative ENSO phase, an anomalous ridge (trough) is evident over the western (eastern) United States, with warm/dry weather and more days with high HI values in the western and southeastern United States. During the negative phase of the AO, an anomalous trough is found over the western United States, with wet/cool weather and fewer days with high HI, while an anomalous ridge occurs over the southern United States–northern Mexico, with an increase in the number of days with high HI. After the early 1990s, the subtropical high over the eastern Pacific Ocean and the Bermuda high were strengthened by a wave train that was excited over the tropical western Pacific Ocean and resulted in warm/dry conditions over the southwestern United States and western Mexico and wet weather in the southeastern United States. The above conditions are reversed during the positive phase of ENSO and AO and before the early 1990s.

Forests Benefit from More CO2 and Milder Temperatures

A previous post (here) provided links to studies showing how US and other forests have generally become more healthy and resilient over recent decades.  For example:

There is strong evidence from the United States and globally that forest growth has been increasing over recent decades to the past 100+ years. Future prospects for forests are not clear because different models produce divergent forecasts. However, forest growth models that incorporate more realistic physiological responses to rising CO2 are more likely to show future enhanced growth. Overall, our review suggests that United States forest health has improved over recent decades and is not likely to be impaired in at least the next few decades.

Figure 9–Subregional Timber Supply Model projections of total hardwood and softwood removals volumes, in billion cubic feet (bcf), by private owners (where NIPF stands for nonindustrial private forest land), 1999 to 2040, under the base scenario. USDA South Region

As well, the outlook for timber production is optimistic (here):

Although models suggest that global timber productivity will likely increase with climate change, regional production will exhibit large variability, as illustrated in Table 1. In boreal regions natural forests would migrate to the higher latitudes. Countries affected would likely include Russia and Canada. Warming could be accompanied by increased forest management in northern parts of some of these countries, particularly the Nordic countries, as increased tree growth rates are experienced. Climate change will also substantially impact other services, such as seed availability, nuts, berries, hunting, resins, and plants used in pharmaceutical and botanical medicine and the cosmetics industry, and these impacts will also be highly diverse and regionalized.

Another factor to consider is the effects of impacts other than climate change such as land-use change and tree plantation establishment. In many regions, these effects may be more important than the direct impact of climate. Indeed, over the past half-century industrial wood production has been increasingly shifting from native forests to planted forests. (my bold)

Summary

Again we see a jump to conclusions in favor of a simple (simplistic) reduction of a complex reality. In legal terms, blaming forest fires on CO2 is a rush to judgment. Yes, forests are affected by more CO2 and milder temperatures: they grow stronger and more resilient to all kinds of threats, including forest fires. The incidence of forest fires is associated strongly with drier conditions resulting from ocean oscillations and accompanying atmospheric circulations. Humans play a role in land usage, timber management and fire-suppression policies.

See link for more on playing Climate Whack-a-Mole

Postscript:

I am proud of making a post title composed entirely of F-words.  Apologies for any problems caused to stutterers.  Old joke about the fate of a stutterer during an Air Force parachute training exercise.  All the chutes of the trainees opened, except for one man who kept falling past the others.  He was overheard saying:  “Th-th-th-th-thr-three, Fa-fa-fa-fa-fou-four, Fi-fi- . . .”

Safety, water, economics come before climate

Climate activists like to put their banner up and take credit for opposition demonstrations against development projects involving energy.  But the truth is, climate change is the last thing on the minds of those objecting to the projects.  This report gets at the realities:

Safety, water, economics come before climate for foes of local energy projects (here)

OTTAWA — New research suggests that polarizing debates over the impacts of climate change are not the driving force behind local opposition to major energy projects.

And that’s something governments and regulators need to consider as they push the transition to clean energy infrastructure such as tidal power, wind farms and hydroelectricity.

A report to be released Thursday at an industry-sponsored energy conference looks at six controversial case studies across Canada, ranging from the Northern Gateway pipeline proposal in northern British Columbia to a gas-fired electricity plant in Oakville, Ont., and shale gas exploration in rural New Brunswick.

The joint project of the University of Ottawa and the Canada West Foundation found that local communities are demanding a greater role in major infrastructure, whether it be wind farms, hydroelectric dams or pipelines.

The study concludes that “the world of elite, centralized decision-making is a thing of the past.”

Notwithstanding the pitched public battles over climate science and environment policy, the researchers found that in the cases they studied, global warming was not a principal driver of most local opposition.

“Climate change bore hardly at all on the local community attitudes in any of the cases,” writes lead author Michael Cleland.

Using public opinion research and interviews with project opponents, proponents and local authorities, the report found a “far more important” list of concerns: safety; the need or rationale for the project; economics; local environmental impacts such as water contamination; poor consultation and communication; and local involvement in decision-making.

The findings have implications that go far beyond today’s headlines over stalled oil pipeline applications.

As the study’s authors write, “the vast majority of future decisions will focus on new ‘clean’ energy infrastructure to underpin a very low GHG economy.

“As the case studies show, clean energy may be as controversial as hydrocarbon energy at the local community level.”

See also Ban Ki-moon Listen to the Masses (here)

And Power (and $) to the People (here)

Anthrax–Russia–Global Warming

It’s a trifecta for media alarms with can’t-miss top lines like these in just the last two days:

Russian officials are blaming global warming for a recent Anthrax outbreak in the far north region of Yamal that hospitalized dozens of nomadic tribesmen and killed one 12-year-old boy.

Thawing Russian Arctic Permafrost Leads to Anthrax Outbreak
(that one adds in the permafrost bogeyman)

Climate Change Blamed for the Anthrax Outbreak in Russia

Etc., etc., etc.

Sputnik News tells what you need to know (and said so back in May 2011)

Global warming can uncover and expose anthrax cattle burial sites in the Arctic and cause the spread of dangerous infections, Russia’s Emergencies Ministry warned on Wednesday.

“Climatic anomaly impacts on permafrost zones, enhances the danger of exposing anthrax cattle burial grounds,” a ministry spokesman said.

There are more than 100,000 anthrax cattle burial sites in Russia, about 400 of which are located in the Arctic region, he said. (My bold)

Anthrax is an acute disease caused by Bacillus anthracis and affects both humans and animals. It can form dormant spores that are able to survive in harsh conditions for decades or even centuries, the spokesman warned.

Some Russian ministries and other government agencies are known for their tendency to issue dire warnings to ensure more federal funding, especially ahead of each new fiscal year.  (My bold)

MOSCOW, May 25 (RIA Novosti)

The facts of the outbreak: 90 hospitalized, 20 cases confirmed. One died.

Footnote:

The Sputnik News article is dated in 2011 (h/t manic) and provides context for this recent news. The Tass story on August 1, 2016, tells of those infected and an investigation into the one death. No mention of global warming. That was added by the BBC and others, based on a comment by a Russian WWF employee.

CNN has a factual report on this event here:

http://www.cnn.com/2016/07/28/health/anthrax-thawed-reindeer-siberia/index.html