Head, Heart and Science Updated

A man who has not been a socialist before 25 has no heart. If he remains one after 25 he has no head.—King Oscar II of Sweden

H/T to American Elephants for linking to this Jordan Peterson video:  The Fatal Flaw in Leftist Thought.  He has an outstanding balance between head and heart, and also applies scientific analysis to issues, in this case the problem of identity politics and leftist ideology.

As usual Peterson makes many persuasive points in this talk.  I was struck by his point that we have established the boundary of extremism on the right, but no such boundary exists on the left.  Our society rejects right wingers who cross the line and assert racial superiority.  Conservative voices condemn that position along with the rest.

We know from the Soviet excesses that the left can go too far, but what is the marker?  Left wingers have the responsibility to set the boundary and sanction the extremists.  Peterson suggests that the fatal flaw is the attempt to ensure equality of outcomes for identity groups, and explains why that campaign is impossible.

From Previous Post on Head, Heart and Science

Recently I had an interchange with a friend from high school days, and he got quite upset with this video by Richard Lindzen. So much so that he looked up attack pieces in order to dismiss Lindzen as a source. This experience impressed some things upon me.

Climate Change is Now Mostly a Political Football (at least in USA)

My friend attributed his ill humor to the current political environment. He readily bought into slanderous claims, including references to Lindzen being bought and paid for by the Koch brothers. At this point, Bernie and Hillary only disagree about who is the truest believer in Global Warming. Once we get into the general election process, “Fighting Climate Change” will intensify as a wedge issue, wielded by smug righteous believers on the left against the anti-science neanderthals on the right.

So it is a hot label for social-media driven types to identify who is in the tribe (who can be trusted) and the others who cannot.  For many, it is not any deeper than that.

The Warming Consensus is a Timesaver

My friend acknowledged that his mind was made up on the issue because 95+% of scientists agreed. It was extremely important for him to discredit Lindzen as untrustworthy in order to maintain the unanimity. When a Warmist uses “The Scientists say: ______,” it is much the same as a Christian reference: “The Bible says: _______.” In both cases, you can fill in the blank with whatever you like, and attribute your idea to the Authority. And most importantly, you can keep the issue safely parked in a No Thinking Zone. There are plenty of confusing things going on around us, and no one wants one more ambiguity requiring time and energy.

Science Could Lose the Delicate Balance Between Head and Heart

Decades ago Arthur Eddington wrote about the tension between the attitudes of artists and scientists in their regard for nature. On the one hand are people filled with the human impulse to respect, adore and celebrate the beauty of life and the world. On the other are people driven by the equally human need to analyze, understand and know what to expect from the world. These are Yin and Yang, not mutually exclusive, and all of us have some of each.

Most of us can recall the visceral response in the high school biology lab when assigned to dissect a frog. Later on, crayfish were preferred (less disturbing to artistic sensibilities). For all I know, recent generations have been spared this rite of passage, to their detriment. For in the conflict between appreciating things as they are, and the need to know why and how they are, we are exposed to deeper reaches of the human experience. If you have ever witnessed, as I have, a human body laid open on an autopsy table, then you know what I mean.

Anyone, scientist or artist, can find awe in contemplating the mysteries of life. There was a time when it was feared that the march of science was so advancing the boundaries of knowledge that the shrinking domain of the unexplained left ever less room for God and religion. Practicing scientists knew better. Knowing more leads to discovering more unknowns; answers produce cascades of new questions. The mystery abounds, and the discovery continues. Eddington:

It is pertinent to remember that the concept of substance has disappeared from fundamental physics; what we ultimately come down to is form. Waves! Waves!! Waves!!! Or for a change — if we turn to relativity theory — curvature! Energy which, since it is conserved, might be looked upon as the modern successor of substance, is in relativity theory a curvature of space-time, and in quantum theory a periodicity of waves. I do not suggest that either the curvature or the waves are to be taken in a literal objective sense; but the two great theories, in their efforts to reduce what is known about energy to a comprehensible picture, both find what they require in a conception of “form”.

What do we really observe? Relativity theory has returned one answer — we only observe relations. Quantum theory returns another answer — we only observe probabilities.

It is impossible to trap modern physics into predicting anything with perfect determinism because it deals with probabilities from the outset.
― Arthur Stanley Eddington

Works by Eddington on Science and the Natural World are here.

Summary

The science problem today lies not with the scientists themselves, but with those attempting to halt science’s progress for the sake of political power and wealth.

Eddington:
Religious creeds are a great obstacle to any full sympathy between the outlook of the scientist and the outlook which religion is so often supposed to require … The spirit of seeking which animates us refuses to regard any kind of creed as its goal. It would be a shock to come across a university where it was the practice of the students to recite adherence to Newton’s laws of motion, to Maxwell’s equations and to the electromagnetic theory of light. We should not deplore it the less if our own pet theory happened to be included, or if the list were brought up to date every few years. We should say that the students cannot possibly realise the intention of scientific training if they are taught to look on these results as things to be recited and subscribed to. Science may fall short of its ideal, and although the peril scarcely takes this extreme form, it is not always easy, particularly in popular science, to maintain our stand against creed and dogma.
― Arthur Stanley Eddington

But enough about science. It’s politicians we need to worry about:

Footnote:

“Asked in 1919 whether it was true that only three people in the world understood the theory of general relativity, [Eddington] allegedly replied: ‘Who’s the third?’”

Postscript:  For more on how we got here see Warmists and Rococo Marxists.

Chicxulub asteroid Apocalypse? Not so fast.

The Daily Mail would have you believe: “Apocalyptic asteroid that wiped out the dinosaurs 66 million years ago triggered 100,000 years of global warming. Chicxulub asteroid triggered a global temperature rise of 5°C (9°F).”

This notion has been around for years, but is dredged up now to promote fears of CO2 and global warming. And maybe it’s because of a new Jurassic Park movie coming this summer.  But it doesn’t take much looking around to discover experts who have a sober, reasonable view of the situation.

Gerta Keller, Professor of Geosciences at Princeton, has studied this issue since the 1990s and tells all at her website CHICXULUB: THE IMPACT CONTROVERSY. Excerpts below with my bolds.

Introduction to The Impact Controversy

In the 1980s, as the impact-kill hypothesis of Alvarez and others gained popular and scientific acclaim, and the mass extinction controversy took an increasingly rancorous turn in scientific and personal attacks, fewer and fewer dared to voice critique. Two scientists stand out: Dewey McLean (VPI) and Chuck Officer (Dartmouth College). Dewey proposed as early as 1978 that Deccan volcanism was the likely cause for the KTB mass extinction; Officer also proposed a likely volcanic cause. Both were vilified and ostracized by the increasingly vocal group of impact hypothesis supporters. By the middle of the 1980s Vincent Courtillot (Institut de Physique du Globe de Paris) also advocated Deccan volcanism, though not as primary cause but rather as supplementary to the meteorite impact. Since 2008 Courtillot has strongly advocated Deccan volcanism as the primary cause for the KTB mass extinction.

(Overview from Tim Clarey, Ph.D., questioning the asteroid) In secular literature and movies, the most popular explanation for the dinosaurs’ extinction is an asteroid impact. The Chicxulub crater in Mexico is often referred to as the “smoking gun” for this idea. But do the data support an asteroid impact at Chicxulub?

The Chicxulub crater isn’t visible on the surface because it is covered by younger, relatively undeformed sediments. It was identified from a nearly circular gravity anomaly along the northwestern edge of the Yucatán Peninsula (Figure 1). There’s disagreement on the crater’s exact size, but its diameter is approximately 110 miles—large enough for a six-mile-wide asteroid or meteorite to have caused it.

Although some of the expected criteria for identifying a meteorite impact are present at the Chicxulub site—such as high-pressure and deformed minerals—not enough of these materials have been found to justify a large impact. And even these minerals can be caused by other circumstances, including rapid crystallization and volcanic activity.

The biggest problem is what is missing. Iridium, a chemical element more abundant in meteorites than on Earth, is a primary marker of an impact event. A few traces were identified in the cores of two drilled wells, but no significant amounts have been found in any of the ejecta material across the Chicxulub site. The presence of an iridium-rich layer is often used to identify the K-Pg (Cretaceous-Paleogene) boundary, yet ironically there is virtually no iridium in the ejecta material at the very site claimed to be the “smoking gun”!

In addition, secular models suggest melt-rich layers resulting from the impact should have exceeded a mile or two in thickness beneath the central portion of the Chicxulub crater. However, the oil wells and cores drilled at the site don’t support this. The thickest melt-rich layers encountered in the wells were between 330 and 990 feet—nowhere near the expected thicknesses of 5,000 to 10,000 feet—and several of the melt-rich layers were much thinner than 300 feet or were nonexistent.

Finally, the latest research even indicates that the tsunami waves claimed to have been generated by the impact across the Gulf of Mexico seem unlikely.

Summary from Keller

The Cretaceous-Tertiary boundary (KTB) mass extinction is primarily known for the demise of the dinosaurs, the Chicxulub impact, and the frequently rancorous thirty years-old controversy over the cause of this mass extinction. Since 1980 the impact hypothesis has steadily gained support, which culminated in 1990 with the discovery of the Chicxulub crater on Yucatan as the KTB impact site and “smoking gun” that proved this hypothesis. In a perverse twist of fate, this discovery also began the decline of this hypothesis, because for the first time it could be tested directly based on the impact crater and impact ejecta in sediments throughout the Caribbean, Central America and North America.

Two decades of multidisciplinary studies amassed a database with a sum total that overwhelmingly reveals the Chicxulub impact predates the KTB mass extinction. It’s been a wild and frequently acrimonious ride through the landscape of science and personalities. The highlights of this controversy, the discovery of facts inconsistent with the impact hypothesis, the denial of evidence, misconceptions, and misinterpretations are recounted here. (Full paper in Keller, 2011, SEPM 100, 2011).

Chicxulub Likely Happened ~100,000 years Before the KTB Extinction

Figure 42. Planktic foraminiferal biostratigraphy, biozone ages calculated based on time scales where the KTB is placed at 65Ma, 65.5Ma and 66Ma, and the relative age positions of the Chicxulub impact, Deccan volcanism phases 2 and 3 and climate change, including the maximum cooling and maximum warming (greenhouse warming) and the Dan-2 warm event relative to Deccan volcanism.

Most studies surrounding the Chicxulub impact crater have concentrated on the narrow interval of the sandstone complex or so-called impact-tsunami. Keller et al. (2002, 2003) placed that interval in zone CF1 based on planktic foraminiferal biostratigraphy, and specifically the range of the index species Plummerita hantkeninoides, which spans the topmost Maastrichtian. The age of zone CF1 was estimated to span the last 300ky of the Maastrichtian based on the old time scale of Cande and Kent (1995) that places the KTB at 65Ma. The newer time scale (Gradstein et al., 2004) places the KTB at 65.5Ma, which reduces zone CF1 to 160ky.

By early 2000 our team embarked on an intensive search for impact spherules below the sandstone complex throughout NE Mexico. Numerous outcrops were discovered with impact spherule layers in planktic foraminiferal zone CF1 below the sandstone complex and we suggested that the Chicxulub impact predates the KTB by about 300ky (Fig. 42; Keller et al., 2002, 2003, 2004, 2005, 2007, 2009; Schulte et al., 2003, 2006).

Time scales change with improved dating techniques. Gradstein et al. (2004) proposed to place the KTB at 65.5 Ma (Abramovich et al., 2010). This time scale is now undergoing further revision (Renne et al., 2013), placing the KTB at 66 Ma, which reduces zone CF1 to less than 100ky. By this time scale, the Chicxulub impact predates the KTB by less than 100ky based on impact spherule layers in the lower part of zone CF1. See Fig. 42 for illustration.

Unfortunately, this wide interest rarely resulted in integrated interdisciplinary studies or joint discussions to search for common solutions to conflicting results. Increasingly, in a perverse twist of science, new results came to be judged by how well they supported the impact hypothesis, rather than how well they tested it. An unhealthy US versus THEM culture developed where those who dared to question the impact hypothesis, regardless of the solidity of the empirical data, were derided, dismissed as poor scientists, blocked from publication and grant funding, or simply ignored. Under this assault, more and more scientists dropped out, leaving a nearly unopposed ruling majority claiming victory for the impact hypothesis. In this adverse, high-stress environment just a small group of scientists doggedly pursued evidence to test the impact hypothesis.

No debate has been more contentious during the past thirty years, or has more captured the imagination of scientists and public alike, than the hypothesis that an extraterrestrial bolide impact was the sole cause for the KTB mass extinction (Alvarez et al., 1980). How did this hypothesis evolve so quickly into a virtually unassailable “truth” where questioning could be dismissed by phrases such as “everybody knows that an impact caused the mass extinction”, “only old fashioned Darwinian paleontologists can’t accept that the mass extinction was instantaneous”, “paleontologists are just bad scientists, more like stamp collectors”, and “it must be true because how could so many scientists be so wrong for so long.” Such phrases are reminiscent of the beliefs that the Earth is flat, that the world was created 6000 years ago, that Noah’s flood explains all geological features, and the vilification of Alfred Wegener for proposing that continents moved over time.

Update Published at National Geographic February 2018 by Shannon Hall: Volcanoes, Then an Asteroid, Wiped Out the Dinosaurs

What killed the dinosaurs? Few questions in science have been more mysterious—and more contentious. Today, most textbooks and teachers tell us that nonavian dinosaurs, along with three-fourths of all species on Earth, disappeared when a massive asteroid hit the planet near the Yucatán Peninsula some 66 million years ago.

But a new study published in the journal Geology shows that an episode of intense volcanism in present-day India wiped out several species before that impact occurred.

The result adds to arguments that the eruptions plus the asteroid caused a one-two punch. The volcanism provided the first strike, destabilizing the climate so much that the meteor—the more deafening blow—was able to spell disaster for Tyrannosaurus rex and its late Cretaceous kin.

A hotter climate certainly helped send the nonavian dinosaurs to their early grave, says Paul Renne, a geochronologist at the University of California, Berkeley, who was not involved in the study. That’s because the uptick in temperature was immediately followed by a cold snap—a drastic change that likely set the stage for planet-wide disaster.

Imagine that some life managed to adapt to those warmer conditions by moving closer toward the poles, Renne says. “If you follow that with a major cooling event, it’s more difficult to adapt, especially if it’s really rapid,” he says.

In this scenario, volcanism likely sent the world into chaos, driving many extinctions alone and increasing temperatures so drastically that most of Earth’s remaining species couldn’t protect themselves from that second punch when the asteroid hit.

“The dinosaurs were extremely unlucky,” Wignall says.

But it will be hard to convince Sean Gulick, a geophysicist at the University of Texas at Austin, who co-led recent efforts to drill into the heart of the impact crater in Mexico. He points toward several studies that have suggested that ecosystems remained largely intact until the time of the impact.

Additionally, a forthcoming paper might make an even stronger case that the impact drove the extinction alone, notes Jay Melosh, a geophysicist at Purdue University who has worked on early results from the drilling project. It looks as though the divisive debate will continue with nearly as much ferocity as the events that rocked our world 66 million years ago.

Summary:

So if the Chicxulub asteroid didn’t kill the dinosaurs, what did? Paleontologists have advanced all manner of other theories over the years, including the appearance of land bridges that allowed different species to migrate to different continents, bringing with them diseases to which native species hadn’t developed immunity. Keller and Adatte do not see any reason to stray so far from the prevailing model. Some kind of atmospheric haze might indeed have blocked the sun, making the planet too cold for the dinosaurs — it just didn’t have to have come from an asteroid. Rather, they say, the source might have been massive volcanoes, like the ones that blew in the Deccan Traps in what is now India at just the right point in history.

For the dinosaurs that perished 65 million years ago, extinction was extinction and the precise cause was immaterial. But for the bipedal mammals who were allowed to rise once the big lizards were finally gone, it is a matter of enduring fascination.

This science seems as settled as climate change/global warming, and with many of the same shenanigans.

Saving the Internet is like Saving the Climate

Some striking parallels appear in the fights over “net neutrality” and “climate change.” Firstly, both issues are mostly political. Politicians like those above alarmed over the internet are the same ones alarmed about the climate.  (Pictured are Democrat senators Franken, Sanders, Booker and Markey.)

Then there is the comparison that both concerns involve a fear of the future. Impressionable youngsters and others have been told they have a right to a “stable climate” and to “net neutrality.” Advocates ignore how benign and beneficial our warming climate has been since 1850, but assure us that extreme heat and dire consequences surely lie ahead. The same politicians ignore the fact that the internet worked amazingly well prior to the Obama 2015 Internet Order, and brought great innovation and affordable services without heavy-handed regulations.

The proposed policies are also comparable. Evil corporations cannot be trusted to deal fairly with their customers and must be controlled by government bureaucracy. Laws must be written to dictate to network operators what they can offer and what prices they can demand. Fossil fuel producers must be hounded, sued, taxed, and constrained by any means possible.

Also, underneath the simplistic political positions lie technical complexities and realities overlooked in the rush to enact “solutions.” I have read and posted a lot on global warming/climate change and know how unfounded the alarmist claims are. So I am inclined to be skeptical when the same people harp about the internet.

In the last few days, some things appear true to me. Prior to 2015 the FCC  classified the internet as a Title I service and all the growth and innovation happened under that “light-touch” regulation. The 2015 FCC Internet Order under the Obama administration reclassified the Internet as Title II, mandating “heavy-handed” regulation, including price controls and even the potential for taxes to pay for regulatory costs. This move was justified to protect consumers against discrimination by providers.  That order is now being repealed by the present Trump-appointed FCC head.

Skeptics pointed out at the time that Title II worked extremely well to protect telephone monopolies and prevent innovation for decades. Notice that AT&T is in full support of restoring Title II regulation. Smart phones and Voice Over Internet escaped Title II regulation, and who knows what future inventions will come without government interference.

We can also see virtue signaling on display in both campaigns. The Senate action this week is unlikely to succeed, but the point was always to rally the faithful for the mid-term elections. Underneath the feel-good notion of net neutrality is the impulse to control and limit choices by putting bureaucrats in charge.

Everyone wants the same thing: free competition so that the best ideas and services can rise and prosper, privacy so that personal information cannot be exploited against one’s interests, and freedom to choose and to pay accordingly. But how to get there? Going back to the 2010 FCC Internet Order, which is what the recent FCC decision accomplishes, is a good start.

In both climate and internet issues, there are important matters to address by means of facts and analysis rather than knee-jerk politics. Rolling out widely accessible broadband networks is expensive and won’t happen if builders and operators are unprofitable. And based on past experience, it will also not work as a government project. As for climate, officials should be more humble. No one knows what future weather will be, and most likely there will be both periods colder and warmer than the present. The proper role of government is not to attempt control of the weather, but to prepare for the contingencies with robust infrastructure and reliable, affordable energy.

Some sources:

Forbes:  The FCC’s Net Neutrality Repeal: The End Of The Internet Or A Path To A Legislative Compromise?

Mediashift: Your Guide to Net Neutrality (2018 Edition) 

Boston Globe:  The real reason the Net neutrality fight goes on

American Consumer Institute:  Did the FCC Lie about Net Neutrality? (2015)

Forbes:  Am I The Only Techie Against Net Neutrality? (2014)

Journal on Telecom and High Tech Law:  Unintended Consequences of Net Neutrality Regulation   (2007)

EU Joins Alarmist Beehive

Update April 28, 2018

The European Union has decided to ban bee-killing pesticides

This post is about bees since they are also victims of science abuse by environmental activists, aided and abetted by the media. The full story is told by Jon Entine at Slate: Do Neonics Hurt Bees?  Researchers and the Media Say Yes. The Data Do Not.
A new, landmark study provides plenty of useful information. If only we could interpret it accurately. Synopsis below.

Futuristic Nightmare Scenarios

“Neonicotinoid Pesticides Are Slowly Killing Bees.”

No, there is no consensus evidence that neonics are “slowly killing bees.” No, this study did not add to the evidence that neonics are driving bee health problems. And yet . . .

Unfortunately, and predictably, the overheated mainstream news headlines also generated a slew of even more exaggerated stories on activist and quack websites where undermining agricultural chemicals is a top priority (e.g., Greenpeace, End Times Headlines, and Friends of the Earth). The takeaway: The “beepocalypse” is accelerating. A few news outlets, such as Reuters (“Field Studies Fuel Dispute Over Whether Banned Pesticides Harm Bees”) and the Washington Post (“Controversial Pesticides May Threaten Queen Bees. Alternatives Could Be Worse.”), got the contradictory findings of the study and the headline right.

But based on the study’s data, the headline could just as easily have read: “Landmark Study Shows Neonic Pesticides Improve Bee Health”—and it would have been equally correct. So how did so many people get this so wrong?


Bouncing off a database can turn your perspective upside down.

Using Data as a Trampoline rather than Mining for Understanding

This much-anticipated two-year, $3.6 million study is particularly interesting because it was primarily funded by two major producers of neonicotinoids, Bayer Crop Science and Syngenta. They had no involvement with the analysis of the data. The three-country study was led by the Centre for Ecology and Hydrology, or CEH, in the U.K.—a group known for its skepticism of pesticides in general and neonics in particular.

The raw data—more than 1,000 pages of it (only a tiny fraction is reproduced in the study)—are solid. It’s a reservoir of important information for entomologists and ecologists trying to figure out the challenges facing bees. It’s particularly important because to date, the problem with much of the research on neonicotinoids has been the wide gulf between the findings from laboratory-based studies and field studies.

Some, but not all, results from lab research have claimed neonics cause health problems in honeybees and wild bees, endangering the world food supply. This has been widely and often breathlessly echoed in the popular media—remember the execrably reported Time cover story on “A World Without Bees.” But the doses and time of exposure have varied dramatically from lab study to lab study, so many entomologists remain skeptical of these sweeping conclusions. Field studies have consistently shown a different result—in the field, neonics seem to pose little or no harm. The overwhelming threat to bee health, entomologists now agree, is a combination of factors led by the deadly Varroa destructor mite, the miticides used to control it, and beekeeping practices. Relative to these factors, neonics are seen as relatively inconsequential.

The Bees are all right. Carry on.

Disparity between Field and Lab Research (sound familiar?)

Jon Entine addressed this disparity between field and lab research in a series of articles at the Genetic Literacy Project, and specifically summarized two dozen key field studies, many of which were independently funded and executed. This study was designed in part to bridge that gulf. And the devil is in the interpretation.

Overall, the data collected from 33 different fields covered 42 analyses and 258 endpoints—a staggering number. The paper only presented a sliver of that data—a selective glimpse of what the research, in its entirety, showed.

What patterns emerged when examining the entire data set? . . . In sum, of 258 endpoints, 238—92 percent—showed no effects. (Four endpoints didn’t yield data.) Only 16 showed effects. Negative effects showed up 9 times—3.5 percent of all outcomes; 7 showed a benefit from using neonics—2.7 percent.

As one scientist pointed out, in statistics there is a widely accepted standard that random results are generated about 5 percent of the time—which means by chance alone we would expect 13 results meaninglessly showing up positive or negative.
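The chance-results arithmetic above can be sketched numerically. A minimal check, using the figures quoted from the study, under the assumption (for illustration only) that the data-yielding endpoints behave like independent tests at the conventional 5 percent significance level:

```python
from math import comb

endpoints = 258 - 4      # 254 endpoints actually yielded data
alpha = 0.05             # conventional statistical significance level
observed_effects = 16    # 9 negative + 7 positive effects reported

# Expected number of "significant" findings arising from chance alone
expected_by_chance = alpha * endpoints
print(round(expected_by_chance))  # 13, matching the figure quoted above

# Tail probability of seeing 16 or more chance effects out of 254
# (binomial model; independence of endpoints is assumed, not established)
p_tail = sum(comb(endpoints, k) * alpha**k * (1 - alpha) ** (endpoints - k)
             for k in range(observed_effects, endpoints + 1))
print(f"P(at least 16 chance effects) = {p_tail:.2f}")
```

On these assumptions, 16 apparent effects out of 254 endpoints sits comfortably within what chance alone would produce, which is why the small number of significant results resists any reliable conclusion.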

Norman Carreck, science director of the International Bee Research Association, who was not part of either study, noted that the small number of significant effects “makes it difficult to draw any reliable conclusions.”

Moreover, Bees Are Not in Decline

The broader context of the bee health controversy is also important to understand; bees are not in sharp decline—not in North America nor in Europe, where neonics are under a temporary ban that shows signs of becoming permanent, nor worldwide. Earlier this week, Canada reported that its honeybee colonies grew 10 percent year over year and now stand at about 800,000. That’s a new record, and the growth tracks the increased use of neonics, which are critical to canola crops in Western Canada, where 80 percent of the nation’s honey originates.

Managed beehives in the U.S. had been in steady decline since the 1940s, as farm land disappeared to urbanization, but began stabilizing in the mid-1990s, coinciding with the introduction of neonicotinoids. They hit a 22-year high in the last count.

Global hive numbers have steadily increased since the 1960s except for two brief periods—the emergence of the Varroa mite in the late 1980s and the brief outbreak of colony collapse disorder, mostly in the U.S., in the mid-2000s.

Conclusion

So the bees, contrary to widespread popular belief, are actually doing all right in terms of numbers, although the Varroa mite remains a dangerous challenge. But still, a cadre of scientists well known for their vocal opposition to and lobbying against neonics have already begun trying to leverage the misinterpretation of the data. Within hours of the release of the study, advocacy groups opposed to intensive agricultural techniques had already begun weaponizing the misreported headlines.

But viewing the data from the European study in context makes it even more obvious that sweeping statements about the continuing beepocalypse and the deadly dangers to bees from pesticides, and neonicotinoids in particular, are irresponsible. That’s on both the scientists and the media.

Summary

The comparison with climate alarmism is obvious. The data are equivocal and subject to interpretation. Lab studies cannot be replicated in the real world. Activists make mountains out of molehills. Reasonable, balanced analysts are ignored or silenced. Media outlets proclaim the end of life as we know it to capture ears and eyeballs for advertisers, and to build their audiences (CNN: “All the fear all the time”). Business as usual for Crisis Inc.

Update April 29

Entine posted regarding the just-announced full EU ban: Global consensus finds neonicotinoids not driving honeybee health problems—Why is Europe so determined to ban them?

The whole article is enlightening, and especially this part describing the research protocols:

“The BRGD (Bee Research Guidance Document) insists that, in order to be considered valid, field experiments must demonstrate that 90 percent of the hive has been exposed to the neonic. The biggest problem with this is that there are generally no neonic residues detectable in crops by the time bees are foraging on them, and if there are residues, the amount is miniscule.”

“The authors of the 2017 CEH study (cited by innumerable reporters as condemning neonics) noted that neonic “residues were detected infrequently and rarely exceeded [1.5 parts per billion].” (To put 1.5 parts per billion in context, the EPA has determined that levels below 25 parts per billion have no effect at all on bees.)”

“At the same time, the bee-hive is a dynamic community and has a considerable capacity to detoxify itself from contaminants. So even the vanishingly small quantities brought into the hives by foragers might very well be wholly or partially eliminated before researchers could test for them.”

“The BRGD thus presents those researchers with a Catch 22: In order to meet the 90th percent exposure requirement they would have to massively over-treat their crops with neonics, creating a foraging environment that simply would not occur in real life. But this defeats the entire purpose of a Tier III field trial, which is to recreate realistic, controlled conditions to see how bees are affected—or not—in the real world. The BRGD requirement has the effect of ‘forcing’ certain pesticides to fail or the studies that don’t comply are invalidated.”
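The residue figures quoted in the excerpts lend themselves to a one-line sanity check. A minimal sketch, restating only the numbers given above (1.5 ppb measured in the CEH study versus the EPA's 25 ppb no-effect level for bees):

```python
# Residue levels from the excerpts above: the 2017 CEH study rarely found
# neonic residues above 1.5 ppb, while the EPA puts the no-effect level
# for bees at 25 ppb.
measured_residue_ppb = 1.5
epa_no_effect_ppb = 25.0

safety_margin = epa_no_effect_ppb / measured_residue_ppb
print(f"Measured residues sit about {safety_margin:.0f}x below "
      f"the EPA no-effect level")   # -> about 17x
```

In other words, even the worst residues reported were more than an order of magnitude below the level the EPA judges to have no effect at all.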

Only Two Energy Sources

 

This incisive (cutting to the core) essay is from Darrin Qualman: There are just two sources of energy. Excerpts below in italics with my bolds.

Our petro-industrial civilization produces and consumes a seemingly diverse suite of energies: oil, coal, ethanol, hydroelectricity, gasoline, geothermal heat, hydrogen, solar power, propane, uranium, wind, wood, dung. At the most foundational level, however, there are just two sources of energy. Two sources provide more than 99 percent of the power for our civilization: solar and nuclear. Every other significant energy source is a form of one of these two. Most are forms of solar.

When we burn wood we release previously captured solar energy. The firelight we see and the heat we feel are energies from sunlight that arrived decades ago. That sunlight was transformed into chemical energy in the leaves of trees and used to form wood. And when we burn that wood, we turn that chemical-bond energy back into light and heat. Energy from wood is a form of contemporary solar energy because it embodies solar energy mostly captured years or decades ago, as distinct from fossil energy sources such as coal and oil that embody solar energy captured many millions of years ago.

Straw and other biomass are a similar story: contemporary solar energy stored as chemical-bond energy then released through oxidation in fire. Ethanol, biodiesel, and other biofuels are also forms of contemporary solar energy (though subsidized by the fossil fuels used to create fertilizers, fuels, etc.).

Coal, natural gas, and oil products such as gasoline and diesel fuel are also, fundamentally, forms of solar energy, but not contemporary solar energy: fossil. The energy in fossil fuels is the sun’s energy that fell on leaves and algae in ancient forests and seas. When we burn gasoline in our cars, we are propelled to the corner store by ancient sunlight.

Wind power is solar energy. Heat from the sun creates air-temperature differences that drive air movements that can be turned into electrical energy by wind turbines, mechanical work by windmills, or geographic motion by sailing ships.

Hydroelectric power is solar energy. The sun evaporates and lifts water from oceans, lakes, and other water bodies, and that water falls on mountains and highlands where it is aggregated by terrain and gravity to form the rivers that humans dam to create hydro-power.

Of course, solar energy (both photovoltaic electricity and solar-thermal heat) is solar energy.

Approximately 86 percent of our non-food energy comes from fossil-solar sources such as oil, natural gas, and coal. Another 9 percent comes from contemporary solar sources, mostly hydro-electric, with a small but rapidly growing contribution from wind turbines and solar photovoltaic panels. In total, then, 95 percent of the energy we use comes from solar sources—contemporary or fossil. As is obvious upon reflection, the Sun powers the Earth.

The only major energy source that is not solar-based is nuclear power: energy from the atomic decay of unstable, heavy elements buried in the ground billions of years ago when our planet was formed. We utilize nuclear energy directly, in reactors, and also indirectly, when we tap geothermal energies (atomic decay provides 60-80 percent of the heat from within the Earth). Uranium and other radioactive elements were forged in the cores of stars that exploded before our Earth and Sun were created billions of years ago. The source for nuclear energy is therefore not solar, but nonetheless stellar; energized not by our sun, but by another. Our universe is energized by its stars.

There are two minor exceptions to the rule that our energy comes from nuclear and solar sources: Tidal power results from the interaction of the moon’s gravitational field and the initial rotational motion imparted to the Earth; and geothermal energy is, in its minor fraction, a product of residual heat within the Earth, and of gravity. Tidal and geothermal sources provide just a small fraction of one percent of our energy supply.

Some oft-touted energy sources are not mentioned above. Because some are not energy sources at all. Rather, they are energy-storage media. Hydrogen is one example. We can create purified hydrogen by, for instance, using electricity to split water into its oxygen and hydrogen atoms. But this requires energy inputs, and the energy we get out when we burn hydrogen or react it in a fuel cell is less than the energy we put in to purify it. Hydrogen, therefore, functions like a gaseous battery: energy carrier, not energy source.

Knowing that virtually all energy flows have their origins in our sun or other stars helps us critically evaluate oft-heard ideas that there may exist undiscovered energy sources. To the contrary, it is extremely unlikely that there are energy sources we’ve overlooked. The solution to energy supply constraints and climate change is not likely to be “innovation” or “technology.” Though some people hold out hope for nuclear fusion (creating a small sun on Earth rather than utilizing the conveniently-placed large sun in the sky) it is unlikely that fusion will be developed and deployed this century. Thus, the suite of energy sources we now employ is probably the suite that will power our civilization for generations to come. And since fossil solar sources are both limited and climate-disrupting, an easy prediction is that contemporary solar sources such as wind turbines and solar photovoltaic panels will play a dominant role in the future.

Summary

Understanding that virtually all energy sources are solar or nuclear in origin reduces the intellectual clutter and clarifies our options. We are left with three energy supply categories when making choices about our future:
Fossil solar: oil, natural gas, and coal;
Contemporary solar: hydroelectricity, wood, biomass, wind, photovoltaic electricity, ethanol and biodiesel (again, often energy-subsidized from fossil-solar sources); and
Nuclear.
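Qualman's share arithmetic can be tallied in a few lines. The figures below are his stated approximations (roughly 86 percent fossil solar and 9 percent contemporary solar, with nuclear and a sliver of tidal/geothermal making up the rest), not precise statistics:

```python
# Approximate global energy shares as stated in the essay.
shares_percent = {
    "fossil solar (oil, natural gas, coal)": 86.0,
    "contemporary solar (hydro, wind, PV, biomass)": 9.0,
}

solar_total = sum(shares_percent.values())
non_solar = 100.0 - solar_total  # nuclear, plus tidal/geothermal remainder

print(f"Solar-derived share: {solar_total:.0f}%")   # -> 95%
print(f"Everything else:     {non_solar:.0f}%")     # -> 5%
```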

Footnote: The author ends with support for windmills and solar panels, but drops nuclear without explanation. Also, there are presently unsolved problems in substituting those intermittent power sources for fossil fuels. Details are at Climateers Tilting at Windmills.

Fortunately, we have time to adapt to the ongoing slight fluctuations in weather while assembling the longer-term transition to nuclear and other energy sources. Unfortunately, a recent study of energy subsidies in the US shows only a small amount is directed toward nuclear, and the sole purpose is decommissioning.

Update April 27

Some good news today: Secretary of Energy Rick Perry Announces $60 Million for U.S. Industry Awards in Support of Advanced Nuclear Technology Development

WASHINGTON, D.C. – U.S. Secretary of Energy Rick Perry announced today that the U.S. Department of Energy (DOE) has selected 13 projects to receive approximately $60 million in federal funding for cost-shared research and development for advanced nuclear technologies. These selections are the first under DOE’s Office of Nuclear Energy’s U.S. Industry Opportunities for Advanced Nuclear Technology Development funding opportunity announcement (FOA), and subsequent quarterly application review and selection processes will be conducted over the next five years. DOE intends to apply up to $40 million of additional FY 2018 funding to the next two quarterly award cycles for innovative proposals under this FOA.

“Promoting early-stage investment in advanced nuclear power technology will support a strong, domestic, nuclear energy industry now and into the future,” said Secretary Perry. “Making these new investments is an important step to reviving and revitalizing nuclear energy, and ensuring that our nation continues to benefit from this clean, reliable, resilient source of electricity. Supporting existing as well as advanced reactor development will pave the way to a safer, more efficient, and clean baseload energy that supports the U.S. economy and energy independence.”

Game Changer? Brewing Fuel

There’s been a lot of crazy talk regarding energy coming from the Golden State, but there are also serious scientists in California, especially at Caltech, where Steven Koonin studied, taught and served as Provost. This recent announcement caught my eye: Scientists breed bacteria that make tiny high-energy carbon rings. Text below with my bolds.

Caltech scientists have created a strain of bacteria that can make small but energy-packed carbon rings that are useful starting materials for creating other chemicals and materials. These rings, which are otherwise particularly difficult to prepare, now can be “brewed” in much the same way as beer.

Brewing equipment. Pike Microbrewery, Seattle. Source: Lonely Planet

The bacteria were created by researchers in the lab of Frances Arnold, Caltech’s Linus Pauling Professor of Chemical Engineering, Bioengineering and Biochemistry, using directed evolution, a technique Arnold developed in the 1990s. The technique allows scientists to quickly and easily breed bacteria with the traits that they desire. It has previously been used by Arnold’s lab to evolve bacteria that create carbon-silicon and carbon-boron bonds, neither of which is found among organisms in the natural world. Using this same technique, they set out to build the tiny carbon rings rarely seen in nature.

Familiar energetic organic compounds found in nature.

“Bacteria can now churn out these versatile, energy-rich organic structures,” Arnold says. “With new lab-evolved enzymes, the microbes make precisely configured strained rings that chemists struggle to make.”

In a paper published this month in the journal Science, the researchers describe how they have now coaxed Escherichia coli bacteria into creating bicyclobutanes, a group of chemicals that contain four carbon atoms arranged so they form two triangles that share a side. To visualize its shape, imagine a square piece of paper that’s lightly creased along a diagonal.

Source: Wikipedia

Bicyclobutanes are difficult to make because the bonds between the carbon atoms are bent at angles that put them under a great deal of strain. Bending these bonds away from their natural shape takes a lot of energy and can result in unwanted byproducts if the conditions for their synthesis aren’t just right. But it’s the strain that makes bicyclobutanes so useful. The bent bonds act like tightly wound springs: they pack a lot of energy that can be used to drive chemical reactions, making bicyclobutanes useful precursors to a variety of chemical products, such as pharmaceuticals, agrochemicals, and materials. When strained rings, like bicyclobutanes, are incorporated into larger molecules, they can imbue those molecules with interesting properties—for example, the ability to conduct electricity but only when an external force is applied—making them potentially useful for creating smart materials that are responsive to their environments.

Unlike other carbon rings, such as cyclohexanes and cyclopentanes, bicyclobutanes are rarely found in nature. This could be due to their inherent instability or the lack of suitable biological machinery for their assembly. But now, Arnold and her team have shown that bacteria can be genetically reprogrammed to produce bicyclobutanes from simple commercial starting materials. As the E. coli cells go about their bacterial business, they churn out bicyclobutanes. The setup is kind of like adding sugar to yeast and letting it ferment into alcohol.

“To our surprise, the enzymes can be engineered to efficiently make such crazy carbon rings under ambient conditions,” says graduate student Kai Chen, lead author on the paper. “This is the first time anyone has introduced a non-native pathway for bacteria to forge these high-energy structures.”

Chen and his colleagues, postdocs Xiongyi Huang, Jennifer Kan, and graduate student Ruijie Zhang, did this by giving the bacteria a copy of a gene that encodes an enzyme called cytochrome P450. The enzyme had previously been modified through directed evolution by the Arnold lab and others to create molecules containing small rings of three carbon atoms—essentially half of a bicyclobutane group.

“The beauty is that a well-defined active-site environment was crafted in the enzyme to greatly facilitate formation of these high-energy molecules,” Huang says.

The precision with which the bacterial enzymes do their work also allows the researchers to efficiently make the exact strained rings they want, with a precise configuration and in a single chiral form. Chirality is a property of molecules in which they can be “right-handed” or “left-handed,” with each form being the mirror image of the other. It matters because living things are selective about which “handedness” of a molecule they use or produce. For instance, all living things exclusively use the right-handed form of the sugar ribose (the backbone of RNA), and many chiral pharmaceutical chemicals are only effective in one handedness; in the other, they can be toxic.

Chiral forms of a molecule are difficult to separate from one another, but by changing the genetic code of the bacteria, the researchers can ensure the enzymes favor one chiral product over another. Mutations in the genes tuned the enzymes to forge a broad range of bicyclobutanes with high precision.

Kan says advancements like theirs are pushing chemistry in a greener direction.

“In the future, instead of building chemical plants for making the products we need to improve lives, wouldn’t it be great if we could just program bacteria to make what we want?” Kan says.

The paper, titled “Enzymatic Construction of Highly Strained Carbocycles,” appears in the April 5 issue of Science.

 

In California Everything Causes Cancer, So Labels Say

Best attempt at a scary cup of coffee.

An update on left coasters freaking out about cancerous items is provided by Sara Chodosh in Popular Science: California needs to stop saying everything causes cancer.

Unsurprisingly, it is the nanny state doing the fear mongering. Excerpts below with my bolds.

You may have heard that coffee gives you cancer. Or that everything gives you cancer—if you live in California.

The reason: Proposition 65. It’s a California state law that requires businesses with 10 or more employees to provide reasonable warning about the use of any chemicals the state has decided could cause cancer, birth defects, or other reproductive harm. One of these chemicals is acrylamide, which a rodent study pinned as a possible carcinogen. It’s found in almost everything that’s cooked at a high temperature. And because a particularly litigious law firm recently sued the state for not properly warning residents about acrylamide in coffee, California is now on the verge of requiring all coffee shops and manufacturers to include a warning on the beverage that it may cause cancer.

The problem, of course, is that coffee doesn’t cause cancer. Acrylamide might cause cancer at very high doses, but the amount that you’ll find in your food is harmless. You’ve actually been unintentionally eating it for your whole life, because it’s in everything from potato chips to roasted asparagus. … No human studies suggest it’s carcinogenic at any realistic dose.

But coffee is only the beginning.

By California’s logic, all sorts of things should have warning labels. We wanted to make a joking list of ridiculous items that would need a cautionary sign according to Prop 65—but then we did our research. Turns out the state of California already slaps a warning on just about everything. Here’s just a small sample of things that could kill you out west:

Tiffany lamps
In order to abide by Prop 65’s rule on lead in furniture, Tiffany-style lamps have to have a warning label. The ornate lampshades have lead in them, and since lead is carcinogenic (weakly, but still!), they have to get a label. We do worry about lead paint in houses, because the flakes can get onto the floor and babies love putting stuff on the floor into their mouths, but lead in a lamp is usually encased or otherwise pretty solid. All the same, don’t let your kid (or your spouse) lick a lampshade or an electrical wire.

Amusement parks
The metal dust and diesel fumes given off by your favorite amusement park rides could give you cancer, and the state of California needs you to know. Also, the food you eat there might be fried, which could give you more cancer, and you might drink a beer there which also could give you cancer. The whole park is basically a death trap. Spending a day in Disneyland isn’t likely to expose you to enough of any of the worrisome chemicals to cause harm, but the warning does make one wonder why, if amusement parks are so hazardous, Californians aren’t jumping to protect the people who work there every day.

Hotels
Sometimes people smoke in hotels. They also drink alcoholic beverages. Both of these things can give you cancer, and so whenever you enter a hotel (or “other lodging establishment”) you must be warned. California hotels now often carry an actual warning label advising you about these dangers (yes, seriously), lest you wander into one unwittingly.

Boats
Engine exhaust from your “recreational vehicle”—along with the carbon monoxide and other engine-related chemicals—poses you a threat. Therefore, your boat must carry a warning label that advises you to avoid exposure to everything the boat does. If you’re thinking “but cars do that too,” don’t worry: passenger vehicles also carry the warning.

Wooden furniture and flooring
When wooden furniture is made, from sofas to bed frames, there tends to be some wood dust. You know, because it’s made of wood. But wood dust is dangerous, and it doesn’t matter that it’s really only a problem if you regularly inhale the levels of wood dust that sawmill workers are exposed to. Furniture that might still contain wood dust has to carry a label all the same.

Tuna
Mercury can cause birth defects, and therefore all fish high in mercury (like tuna, but also swordfish, marlin, king mackerel, and tilefish) falls under the Prop 65 guidelines. You definitely shouldn’t be eating a ton of fish while pregnant, but most of us should be more worried about mercury poisoning, which anyone can get from consuming too much fish—not that Prop 65 warns you about that.

Pumpkin puree
Apparently there’s acrylamide in your pumpkin pie and you’ve been eating it for years. California’s got your back.

Potatoes
According to the state of California, we should all be soaking our potatoes in water for 15 to 30 minutes before cooking them, and we should never fry or roast them to a deep golden brown. We should carefully make them a light brown so as to avoid the acrylamide produced in the cooking process. So… have fun with that.

All alcohol
It’s well-known that consuming lots of alcohol on a regular basis increases your risk of cancer. Alcohol doesn’t give you cancer, but it can make you more likely to get it. A few beers a week—or maybe even a glass of wine a day—isn’t going to do you in, but high daily consumption can certainly push a person’s risk of getting cancer higher. Studies suggest that increased alcohol consumption might even be (partly) to blame for the rise in colorectal cancer in young people. So if we’re going to applaud California for any of their labels, it’ll be for this one. Licking lamps and driving boats probably won’t give you cancer, but a life of alcoholism might. Will a label on alcohol do more than teach people to tune out cancer risk warnings? Unclear. But it’s true we should all try to keep our drinking in moderation.

Look: no lifestyle can protect you from every kind of cancer. Genetic mutations are an inevitability, and cancer can strike anyone. You can certainly decrease your risk, mainly by not smoking or being overweight, since those two factors contribute to many cancer cases. You can exercise regularly, eat a balanced diet, and try not to drink too much. But at the end of the day, you shouldn’t be making every tiny decision based on whether it might contribute to your cancer risk—because simply being alive and having cells that continue to replicate puts you at risk of developing the disease. So sit back, relax, and do the best you can with the body you’ve got. And have a nice cup of coffee while you’re at it.

Weird Liberal Science

Now this one I take personally, having earned a degree in Organic Chemistry. Alex Berezow exposes a liberal journalist who disparages all chemicals with no comprehension of the science. In this case Nicholas Kristof demonstrates how his employer, the New York Times, misleads and spreads irrational fears in its mission to sell copies to its clientèle on the Upper West Side of NYC. He seems to be channeling Rachel Carson (Silent Spring), who wrote about carcinogens everywhere as she was dying of the disease.

The article is NYT’s Nicholas Kristof Would Flunk An 8th Grade Science Test. Excerpts below with my bolds.

It’s often helpful for journalists who do not have specialized knowledge of complex scientific topics to write about them anyway, because if they can understand them and figure out how to communicate them, they can perform a tremendous public service. However, if journalists don’t take the time to understand complex topics and get the very basics wrong, they do the public a massive disservice and end up looking like buffoons.

Which brings us to veteran New York Times columnist Nicholas Kristof, who studied law and fancies himself an expert in chemistry and toxicology. Chemists and toxicologists disagree.

His latest diatribe — which was easily and thoroughly debunked by my colleagues Dr. Chuck Dinerstein and Ana Dolaskie — begins with the single most shameless act of fearmongering I have ever seen from a major media outlet. He shows a bunch of common household products, all of which are perfectly safe, and asks, “What poisons are in your body?”

Look at all these lethal things: toothpaste, soap, shower curtains. It’s amazing we all aren’t dead yet. Mr. Kristof’s “research” — if you can even call it that — relied heavily on well-known anti-science activists, such as the Environmental Working Group.

When we criticized his scientific ignorance, Mr. Kristof doubled down, as the scientifically ignorant always do.

It’s interesting that his immediate defense is to lie about his writings. He isn’t only afraid of endocrine disruptors; he’s afraid he’ll get cancer from popcorn, and he’s worried that some enigmatic chemicals somewhere out there are causing diabetes, obesity, and autism. This level of paranoia is what we would expect from a chemtrail conspiracy theorist, not a public intellectual.

Mr. Kristof’s reference to DES is typical chemophobic scaremongering. He points out a chemical that really is bad (DES), which in his mind justifies his demonization of every other chemical of which he is afraid. That’s the chemistry equivalent of saying that all Muslims are suspicious because 9/11 happened.

Mr. Kristof has demonstrated time and again that he is entirely ignorant of the basic principles of chemistry and toxicology. And given that he has been widely criticized by all sorts of science writers, he’s also completely impervious to being educated by actual experts.

Consider what Deborah Blum, a chemistry writer, wrote about him:

“Whenever Nicholas Kristof writes a piece about the evil, awful world of chemicals out there, I feel a twitchy need to kick something. Or someone. Possibly right there in The New York Times newsroom.”

In perhaps the biggest indication that Mr. Kristof is fundamentally anti-science, he ignores evidence that he dislikes. That’s utterly taboo for scientists, but par-for-the-course for NYT op-ed columnists. Writing in Forbes, Trevor Butterworth says:

“[Kristof] applies no statistical or experimental criticism to these studies: they always “really” find what they claim to have found; and he seems unaware of the many non-industry funded studies or regulatory agency assessments that contradict them. There is no mention, for instance, of the 15-page point-by-point rebuttal written by the Food and Drug Administration to the Natural Resources Defense Council’s petition to ban BPA, a rebuttal which relies, primarily, on non-industry funded research.”

Chemjobber, a blog that promotes jobs in chemistry, had this to say of Mr. Kristof:

“I am a little at wit’s end to understand how to help intelligent people like Mr. Kristof see past their clear fear of chemicals, the distrust they have of chemical companies and their seeming dismissal of regulatory agencies. It seems to me that he is all too credulous to the claims of organizations like the Silent Spring Institute that are incentivized to generate as much fear and doubt around chemicals as possible.”

The New York Times Has Only One Editorial Standard

The real problem is that the New York Times has only one editorial standard: To publish whatever sells more copies to their Upper West Side clientele. That means throwing biotechnology and chemistry under the bus while embracing organic food, acupuncture, and other forms of witchcraft.

 

Updated: Fears and Facts about Reservoirs and GHGs

 

A previous post explained how methane has been hyped in support of climate alarmism/activism. Now we have an additional campaign to disparage hydropower because of methane emissions from dam reservoirs. File this under “They have no shame.” Excerpts below with my bolds.

On March 5, 2018, a study was published in Environmental Research Letters: Greenhouse gas emissions of hydropower in the Mekong River Basin can exceed those of fossil fuel energy sources.

“The hydropower related emissions started in the Mekong in mid-1960’s when the first large reservoir was built in Thailand, and the emissions increased considerably in early 2000’s when hydropower development became more intensive. Currently the emissions are estimated to be around 15 million tonnes of CO2e per year, which is more than total emissions of all sectors in Lao PDR in year 2013,” says Dr Timo Räsänen who led the study. The GHG emissions are expected to increase when more hydropower is built. However, if construction of new reservoirs is halted, the emissions will decline slowly in time.

Another recent example of the claim is from Asia Times: Global hydropower boom will add to climate change.

The study, published in BioScience, looked at the carbon dioxide (CO2), methane (CH4), and nitrous oxide (N2O) emitted from 267 reservoirs across six continents. In total, the reservoirs studied have a surface area of more than 77,287 square kilometers (29,841 square miles). That’s equivalent to about a quarter of the surface area of all reservoirs in the world, which together cover 305,723 sq km – roughly the combined size of the United Kingdom and Ireland.

“The new study confirms that reservoirs are major emitters of methane, a particularly aggressive greenhouse gas,” said Kate Horner, Executive Director of International Rivers, adding that hydropower dams “can no longer be considered a clean and green source of electricity.”

In fact, methane’s effect is 86 times greater than that of CO2 when considered on a two-decade timescale. Importantly, the study found that methane is responsible for 90% of the global warming impact of reservoir emissions over 20 years.

Alarmists are Wrong about Hydropower

Now CH4 is proclaimed the primary culprit in the case against hydropower. As usual, there is a kernel of truth buried beneath this obsessive campaign: flooding of biomass does result in decomposition accompanied by some release of CH4 and CO2. From HydroQuebec: Greenhouse gas emissions and reservoirs

Impoundment of hydroelectric reservoirs induces decomposition of a small fraction of the flooded biomass (forests, peatlands and other soil types) and an increase in the aquatic wildlife and vegetation in the reservoir.

The result is higher greenhouse gas (GHG) emissions after impoundment, mainly CO2 (carbon dioxide) and a small amount of CH4 (methane).

However, these emissions are temporary and peak two to four years after the reservoir is filled.

During the ensuing decade, CO2 emissions gradually diminish and return to the levels given off by neighboring lakes and rivers.

Hydropower generation, on average, emits 50 times less GHGs than a natural gas generating station and about 70 times less than a coal-fired generating station.

The Facts about Tropical Reservoirs

Activists estimate that methane emissions from dams and reservoirs across the planet, including hydropower, are significantly larger than previously thought, at approximately 1 gigaton per year.

Activists also claim that dams in boreal regions like Quebec are not the problem, but tropical reservoirs are a big threat to the climate. Contradicting that is an intensive study of Brazilian dams and reservoirs: Greenhouse Gas Emissions from Reservoirs: Studying the Issue in Brazil.

The Itaipu Dam is a hydroelectric dam on the Paraná River located on the border between Brazil and Paraguay. The name “Itaipu” was taken from an isle that existed near the construction site. In the Guarani language, Itaipu means “the sound of a stone”. The American composer Philip Glass has also written a symphonic cantata named Itaipu, in honour of the structure.

Five Conclusions from Studying Brazilian Reservoirs

1) The budget approach is essential for a proper grasp of the processes going on in reservoirs. This approach involves taking into account the ways in which the system exchanged GHGs with the atmosphere before the reservoir was flooded. Older studies measured only the emissions of GHG from the reservoir surface or, more recently, from downstream de-gassing. But without the measurement of the inputs of carbon to the system, no conclusions can be drawn from surface measurements alone.

2) When you consider the total budgets, most reservoirs acted as sinks of carbon in the short run (our measurements covered one year in each reservoir). In other words, they received more carbon than they exported to the atmosphere and to downstream.

3) Smaller reservoirs are more efficient as carbon traps than the larger ones.

4) As for the GHG impact, in order to determine it, we should add the methane (CH4) emissions to the fraction of carbon dioxide (CO2) emissions which comes from the flooded biomass and organic carbon in the flooded (terrestrial) soil. The other CO2 emissions, arising from the respiration of aquatic organisms or from the decomposition of terrestrial detritus that flows into the reservoir (including domestic sewage), are not impacts of the reservoir. From this sum, we should deduct the amount of carbon that is stored in the sediment and which will be kept there for at least the life of the reservoir (usually more than 80 years). This “stored carbon” ranges from as little as 2 percent of the total carbon output to more than 25 percent, depending on the reservoirs.

5) When we assess the GHG impacts following the guidelines just described, all of FURNAS’s reservoirs have lower emissions than the cleanest European oil plant. The worst case – Manso, which was sampled only three years after the impoundment, and therefore in a time in which the contribution from the flooded biomass was still very significant – emitted about half as much carbon dioxide equivalents (CO2 eq) as the average oil plant from the United States (CO2 eq is a metric measure used to compare the emissions from various greenhouse gases based upon their global warming potential, GWP. CO2 eq for a gas is derived by multiplying the tons of the gas by the associated GWP.) We also observed a very good correlation between GHG emissions and the age of the reservoirs. The reservoirs older than 30 years had negligible emissions, and some of them had a net absorption of CO2eq.
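The CO2-equivalent metric described in point 5 reduces to a single multiplication: tons of the gas times its global warming potential. A minimal sketch of that definition, with the 86 figure borrowed from the 20-year methane GWP cited elsewhere in this post:

```python
def co2_equivalent(tons_of_gas: float, gwp: float) -> float:
    """Tonnes of CO2-equivalent: tons of the gas times its GWP,
    exactly as defined in the excerpt above."""
    return tons_of_gas * gwp

# One tonne of methane at the 20-year GWP of 86 cited in this post:
print(co2_equivalent(1.0, 86.0))   # -> 86.0
```

Note that the answer depends entirely on which GWP you plug in; the 20-year and 100-year figures for methane differ by roughly a factor of three, which is why the choice of timescale does so much rhetorical work in these debates.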

Keeping Methane in Perspective

Over the last 30 years, CH4 in the atmosphere increased from 1.6 ppm to 1.8 ppm, compared to CO2, presently at 400 ppm. So all the dam building over three decades, along with all other land use, contributed to a minuscule 0.2 ppm increase in a gas whose concentration is more than 200 times lower than that of the trace gas CO2.

Background Facts on Methane and Climate Change

Methane pollution surrounding Porter Ranch, LA (Photo credit: Energy Efficiency Team)


The US Senate is considering a measure to repeal, with prejudice, an Obama anti-methane regulation. The story from the activist source Climate Central:
Senate Mulls ‘Kill Switch’ for Obama Methane Rule

The U.S. Senate is expected to vote soon on whether to use the Congressional Review Act to kill an Obama administration climate regulation that cuts methane emissions from oil and gas wells on federal land. The rule was designed to reduce oil and gas wells’ contribution to climate change and to stop energy companies from wasting natural gas.

The Congressional Review Act is rarely invoked. It was used this month to reverse a regulation for the first time in 16 years and it’s a particularly lethal way to kill a regulation as it would take an act of Congress to approve a similar regulation. Federal agencies cannot propose similar regulations on their own.

The Claim Against Methane

Now some Republican senators are hesitant to take this step because of claims like this one in the article:

Methane is 86 times more potent as a greenhouse gas than carbon dioxide over a period of 20 years and is a significant contributor to climate change. It warms the climate much more than other greenhouse gases over a period of decades before eventually losing its potency. Atmospheric carbon dioxide remains a potent greenhouse gas for thousands of years.

Essentially the journalist is saying: as afraid as you are of CO2, you should be 86 times more afraid of methane. Which also means that if CO2 is not a warming problem, your fear of methane is 86 times zero. The thousands-of-years claim is also bogus, but that is beside the point of this post, which is methane.

IPCC Methane Scare

The article helpfully provides a link to Chapter 8 of the IPCC AR5 Working Group 1 report, Anthropogenic and Natural Radiative Forcing.

The document is full of sophistry and creative accounting in order to produce as scary a number as possible. Table 8.7 provides the number for CH4 potency of 86 times that of CO2.  They note they were able to increase the Global Warming Potential (GWP) of CH4 by 20% over the estimate in AR4. The increase comes from adding in more indirect effects and feedbacks, as well as from increased concentration in the atmosphere.

In the details are some qualifying notes like these:

Uncertainties related to the climate–carbon feedback are large, comparable in magnitude to the strength of the feedback for a single gas.

For CH4 GWP we estimate an uncertainty of ±30% and ±40% for 20- and 100-year time horizons, respectively (for 5 to 95% uncertainty range).
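Applying the quoted uncertainty to the headline numbers shows how wide the stated range actually is. A quick calculation, assuming the ±30% applies to the 20-year GWP of 86 from Table 8.7:

```python
# The +/-30% uncertainty quoted above, applied to the 20-year GWP of 86.
gwp_20yr = 86
low = gwp_20yr * (1 - 0.30)   # ~60.2
high = gwp_20yr * (1 + 0.30)  # ~111.8
print(round(low, 1), round(high, 1))
```

In other words, the "86 times" figure carries a 5-to-95% range spanning roughly 60 to 112.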


Methane Facts from the Real World
From Sea Friends (here):

Methane is natural gas, CH4, which burns cleanly to carbon dioxide and water. Methane is eagerly sought after as fuel for electric power plants because of its ease of transport and because it produces the least carbon dioxide for the most power. Cars can also be powered with compressed natural gas (CNG) for short distances.

In many countries CNG has been widely distributed as the main home heating fuel. As a consequence, methane has leaked into the atmosphere in large quantities, though leakage is now firmly controlled. Grazing animals also produce methane in their complicated stomachs, and methane escapes from rice paddies and peat bogs like the Siberian permafrost.

It is thought that methane is a very potent greenhouse gas because it absorbs some infrared wavelengths 7 times more effectively than CO2, molecule for molecule, and by weight even 20 times. As we have seen previously, this also means that within a distance of metres, its effect has saturated, and further transmission of heat occurs by convection and conduction rather than by radiation.

Note that when H2O is present in the lower troposphere, there are few photons left for CH4 to absorb:

Even if the IPCC radiative greenhouse theory were true, methane occurs only in minute quantities in air, 1.8ppm versus CO2 of 390ppm. By weight, CH4 is only 5.24Gt versus CO2 3140Gt (on this assumption). If it truly were twenty times more potent, it would amount to an equivalent of 105Gt CO2 or one thirtieth that of CO2. A doubling in methane would thus have no noticeable effect on world temperature.
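The arithmetic in the quoted passage can be checked directly, using its own figures (5.24 Gt CH4, 3140 Gt CO2, and the assumed factor of 20 by weight):

```python
# Checking the arithmetic of the quoted passage.
ch4_gt = 5.24     # total atmospheric CH4, gigatonnes (quoted figure)
co2_gt = 3140.0   # total atmospheric CO2, gigatonnes (quoted figure)
potency = 20      # assumed weight-for-weight potency factor

co2_eq = ch4_gt * potency   # ~104.8 Gt, the "105 Gt" in the text
ratio = co2_gt / co2_eq     # ~30, i.e. "one thirtieth" of CO2
print(round(co2_eq, 1), round(ratio))
```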

However, the factor of 20 is entirely misleading because absorption is proportional to the number of molecules (=volume), so the factor of 7 (7.3) is correct and 20 is wrong. With this in mind, the perceived threat from methane becomes even less.

Further still, methane has risen from 1.6 ppm to 1.8 ppm over 30 years (1980-2010); assuming it has not stopped rising, this amounts to a doubling in two to three centuries. In other words, methane can never have any measurable effect on temperature, even if the IPCC radiative greenhouse theory were right.
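The doubling-time claim follows from simple extrapolation of the observed trend:

```python
# Extrapolating the observed CH4 trend, as in the quoted passage.
rise_ppm = 1.8 - 1.6        # observed rise over 1980-2010
years = 30
rate = rise_ppm / years     # ppm per year at the observed pace
years_to_double = 1.8 / rate  # time to add another 1.8 ppm
print(round(years_to_double))  # ~270 years, i.e. between 2 and 3 centuries
```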

Because only a small fraction of the rise of methane in air can be attributed to farm animals, it is ludicrous to worry about this aspect, to try to farm with smaller emissions of methane, or to tax it or trade credits.

The fact that methane in air has been leveling off in the past two decades, even though we do not know why, implies that it plays absolutely no role as a greenhouse gas.

More information at THE METHANE MISCONCEPTIONS by Dr Wilson Flood (UK) here

Summary:

Natural Gas (75% methane) burns the cleanest with the least CO2 for the energy produced.

Leakage of methane is already addressed by efficiency improvements for its economic recovery, and will apparently be subject to even more regulations.

The atmosphere is a methane sink where the compound is oxidized through a series of reactions, producing one CO2 and two H2O molecules after a few years.

GWP (Global Warming Potential) is CO2-equivalent heat trapping based on laboratory measurements, not real-world effects.

Any IR absorption by methane is limited by H2O absorbing in the same low energy LW bands.

There is no danger this century from natural or man-made methane emissions.
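The oxidation chemistry in the summary (CH4 + 2 O2 → CO2 + 2 H2O) can be verified with a quick mass balance, using approximate molar masses in g/mol:

```python
# Mass balance of the methane oxidation noted above: CH4 + 2 O2 -> CO2 + 2 H2O.
# Molar masses are approximate (g/mol).
M = {"CH4": 16.04, "O2": 32.00, "CO2": 44.01, "H2O": 18.02}

reactants = M["CH4"] + 2 * M["O2"]   # ~80.04 g/mol
products = M["CO2"] + 2 * M["H2O"]  # ~80.05 g/mol
print(round(reactants, 2), round(products, 2))  # masses balance to rounding
```

Each CH4 molecule thus leaves the atmosphere as exactly one CO2 and two H2O, as the summary states.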

Conclusion

Senators and the public are being bamboozled by opaque scientific bafflegab. The plain truth is much different. The atmosphere is a methane sink: CH4’s infrared absorption saturates within the first few metres, and the gas itself is oxidized within a few years. The amount of CH4 in the air is minuscule, even compared to the trace gas CO2, and it is not accelerating. Methane is the obvious choice for signaling virtue on the climate issue, since governmental actions will not make a bit of difference anyway, except perhaps to do some economic harm.

Give a daisy a break (h/t Derek here)

Daisy methane

Footnote:

For a more thorough and realistic description of atmospheric warming see:

Fearless Physics from Dr. Salby