Mark Imisides writes at The Spectator Australia on politics deep in the thrall of climate change. Excerpts in italics with my bolds and some added images.
The issue of Climate Change stormed across our TV screens at the recent federal election (Australia). Funded by a renewables investor, the Climate 200 candidates mopped up – taking eight seats from the major parties.
How did something that was considered a slightly oddball theory in the late 80s morph into a multiheaded monster that seems to have every world government, including ours, in its thrall?
By any measure, the Climate Change industry is huge. It’s quite common to hear people use the term ‘big oil’, but given that the Climate Change industry, at $1.5 trillion, is rapidly catching up to the oil industry ($2.1 trillion), it’s entirely valid to use the term ‘big climate’. Andrew Urban recently pointed this out, with several cases of people who have cashed in big-time from Climate Change alarmism.
How did it get so big? How did it become the colossus that it now is? As it happens, there is a back story that few people outside of the scientific sector are aware of.
In simple terms, it was a perfect storm of four factors that all came together in the late 80s. As I was doing my PhD at the time, I got to see this firsthand. Very few people understand this process, largely because those outside the sector don’t know that science is just as prone as any other industry to the shifting sands of zeitgeist – trends that emerge that shape and mould how science is done, funded, and perceived.
These four factors were to revolutionise science, and although not all the changes were bad, their lasting legacy will be the entry of science into political discourse, the grubby touching of research funds by hands soiled by self-interest, greed, and corruption.
Revolutionary Factor 1: Chernobyl Ended Science’s Golden Age
The first of these factors was the accident at Chernobyl in 1986. What? What on earth does Chernobyl have to do with climate change?
The answer is that there is no direct connection. It was, however, the final nail in the coffin of science’s golden age. Coming out of the Second World War, science was seen as having the answers to all the world’s ills. It was seen as having won the war (largely due to the atomic bomb) and was the way of the future. This optimism is expressed beautifully in the film A Beautiful Mind, where the incoming students at Princeton in 1948 are given a stirring lecture by the mathematician Helinger on the future of science in the post-war world.
And so science flourished. Antibiotics became readily available, DNA was characterised, and the contraceptive pill introduced a new era of sexual freedom. The synthetic chemical industry also boomed, as did the nuclear power industry. We had wonder chemicals like DDT that eradicated malaria-carrying mosquitoes, aerosol fly sprays, and cheap reliable power thanks to the power of the atom.
But then, over time, mistakes happened, largely because environmental science, as a discipline, didn’t exist. Or, to put it another way, the world was seen as an infinitely-sized bucket into which chemicals could be poured, and the concept of man-made chemicals exerting any influence on the world in which we live just didn’t exist. It just never occurred to anyone.
But then reality began to impinge on this mindset. Amongst other things, the persistence of non-biodegradable pesticides like DDT became a problem, CFC propellants threatened the ozone layer, and we had Sellafield and Three Mile Island. Neither of the latter two incidents actually killed anyone, but they gave everyone an awful fright.
But Chernobyl was the last straw. Scientists, it seemed, couldn’t be trusted after all.
It was the China Syndrome for real.
Revolutionary Factor 2: Global Recession Crashed Research Funding
This mistrust brought with it added scrutiny, and this coincided with the second factor in this list – the worldwide recession in the late 80s (as Paul Keating famously said, the recession we had to have).
The combination of the loss of trust in scientists with the fact that there wasn’t as much money to splash around as before resulted in a massive change in the emphasis of scientific research. Before this time it was pretty easy to get funding for any pet project that you had, and you didn’t have to justify it much in terms of its importance or relevance. But after this time, funds suddenly became much more difficult to get. To get funding, you had to prove the relevance of your research. You had to justify why it was important in terms of practical outcomes.
In scientific terms, we could say that the emphasis switched from pure research to applied research.
This change in emphasis had a dramatic effect on university faculties. Prior to this time, if you went onto any campus you would find a science faculty that contained four departments: physics, chemistry, geology, and biology. When this change from pure to applied research happened, it hit the physics departments hardest. Of the four classic sciences, physics is the most fundamental and the least applied. That is, it forms the basis on which knowledge in the other three disciplines is developed, but it is not as directly applicable to the real world.
And so, people who were either physics academics or postgraduate students suddenly had very bleak career prospects. Consequently, from about this time physics departments began to disappear from university campuses, being subsumed into faculties with titles such as ‘Physical Sciences’ and so on. Suddenly, there were a lot of highly qualified physicists looking for work.
Revolutionary Factor 3: Warming Replaced the Cooling Trend
The third factor that came into play at about this time was that the world had started warming again. It cooled from about 1940 until 1975, but by the late 80s a warming trend was emerging. And, as it happens, this fed right into Margaret Thatcher’s political agenda – the fourth and final factor in our list.
Revolutionary Factor 4: Thatcher’s Power Struggle with the Coal Unions
Thatcher had an awful problem with coal unions and wanted to break their power. She did this by approaching the Royal Society and essentially saying that she wanted to demonise coal and get people to use nuclear power, and there was money on the table if they could do that.
Thus, the IPCC was born, and every government department around the world devoted to ‘climate science’ dates from about this time.
But who could they employ? The term ‘climate scientist’ didn’t exist. Well, as it happens, questions of heat transfer fall squarely into the lap of physicists, and, oh look, there were plenty of them looking for work. They had a precious job offer if they would just say the right things.
Thus, the discipline of ‘climate science’ was born, degrees were established, and people began selecting it as a career option from the undergraduate level.
Well, so far so good. When new scientific knowledge is revealed, new career paths, and even new language, are established. For example, with the advent of electricity in the early 19th century, largely as a result of the efforts of Volta, the term ‘electrician’ was coined.
Climate Science Discredited Early On
But here is where the similarity between ‘climate science’ and other new technologies like electricity (or quantum mechanics or biogenetics) ends. In each of these latter cases, the science has developed from its infancy into a mature discipline, as research and enquiry revealed further information and deeper understandings.
With ‘climate science’, however, the very opposite happened. It was barely a decade old when it was comprehensively disproved, by two independent lines of evidence. That is, it was proven that the notion of CO2 warming the world had no scientific basis, and that the computer models that predicted its influence were wrong.
The first of these was the absence of the ‘tropospheric hotspot’ that was predicted by the computer models. Anthony Watts has a very good, even-handed discussion of it on his website.
The second, and more significant, study resulted from the Vostok ice-core record published in 1999, which showed that atmospheric CO2 concentration followed temperature changes, not the other way around.
As an Analytical Chemist, it was this second study that struck a chord with me. From the moment I first saw Al Gore’s graph with its overlaid temperature and CO2 curves, the question I asked was, ‘How do they know which one is cause and which one is effect? Haven’t they ever heard of Henry’s Law?’
Henry’s Law, for the layman, relates the concentration of a compound in the aqueous phase to its concentration in the vapour phase above it. In simplest terms, the solubility of a gas in a liquid decreases with temperature. That is, the hotter a liquid gets, the less gas can dissolve in it. So if the ocean were heating, it would release CO2 into the atmosphere, resulting in higher gaseous concentrations.
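This temperature dependence can be sketched numerically. The following is a minimal illustration, not anything from the article itself: the Henry’s Law constant for CO2 and its van ’t Hoff temperature coefficient are representative literature values, assumed here purely for demonstration.

```python
# Sketch of Henry's Law for CO2 in water: solubility falls as temperature rises.
# Constants are representative literature values (assumptions for illustration):
# k_H ~ 0.034 mol/(L*atm) at 298.15 K, van 't Hoff coefficient ~2400 K.
import math

K_H_REF = 0.034       # mol/(L*atm), Henry's Law constant for CO2 at T_REF
T_REF = 298.15        # K, reference temperature
VANT_HOFF_C = 2400.0  # K, d(ln k_H)/d(1/T) for CO2 in water

def henry_constant(temp_k: float) -> float:
    """Henry's Law constant for CO2 at temperature temp_k (kelvin)."""
    return K_H_REF * math.exp(VANT_HOFF_C * (1.0 / temp_k - 1.0 / T_REF))

def dissolved_co2(temp_k: float, p_co2_atm: float = 4e-4) -> float:
    """Equilibrium dissolved CO2 (mol/L) at partial pressure p_co2_atm."""
    return henry_constant(temp_k) * p_co2_atm

cold = dissolved_co2(277.15)  # water at ~4 C
warm = dissolved_co2(293.15)  # water at ~20 C
print(f"dissolved CO2 at  4 C: {cold:.3e} mol/L")
print(f"dissolved CO2 at 20 C: {warm:.3e} mol/L")
# The warmer sample holds less CO2: a warming ocean outgasses CO2.
```

Running the sketch shows the colder water holding noticeably more CO2 at the same partial pressure, which is the outgassing argument in miniature.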
The question of which one is cause and which one is effect is determined simply by which one leads and which one lags, and the Vostok ice cores showed that CO2 changes lagged behind temperature changes by several hundred years. Quod Erat Demonstrandum.
In other words, the Vostok ice core data comprehensively disproves the notion that man-made CO2 is heating the planet. A theory was advanced, accepted by many, but then disproven.
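The lead/lag test described above can be sketched with synthetic data. To be clear, this is an illustration of the technique only, using made-up series rather than the actual ice-core record: generate one series, delay a copy of it by a known amount, and recover that delay with a cross-correlation scan.

```python
# Sketch: finding which of two series leads and which lags, via
# cross-correlation. Synthetic data only (an assumption for illustration),
# not the real ice-core record.
import numpy as np

rng = np.random.default_rng(0)
n, true_lag = 500, 8
series_a = np.cumsum(rng.normal(size=n))  # stand-in "temperature" record
series_b = np.roll(series_a, true_lag)    # stand-in "CO2", lagging behind
series_b[:true_lag] = series_b[true_lag]  # patch the samples wrapped by roll

# Normalise both series, then scan candidate lags; the correlation peak
# marks the lag at which series_b best matches a delayed copy of series_a.
a = (series_a - series_a.mean()) / series_a.std()
b = (series_b - series_b.mean()) / series_b.std()
lags = list(range(-20, 21))
corr = []
for k in lags:
    if k >= 0:
        corr.append(np.mean(a[:n - k] * b[k:]))   # b shifted forward by k
    else:
        corr.append(np.mean(a[-k:] * b[:n + k]))  # b shifted backward by k
best_lag = lags[int(np.argmax(corr))]
print("estimated lag:", best_lag)  # positive: series_b lags series_a
```

A positive peak lag says the second series trails the first, which is exactly the dog-and-tail question the ice-core analyses were answering.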
This has happened many times in science, and it is the very mechanism by which scientific knowledge is advanced.
Perhaps the most spectacular example of this is the luminiferous ether theory, which was spawned by a simple observation: if an alarm clock was placed under a glass bell jar and a vacuum was drawn, then when the alarm went off, no sound was evident, despite the fact that you could see the hammer striking the bells.
They concluded, correctly, that sound requires a medium (air) to travel through, yet light evidently did not, since the clock remained visible in the vacuum. The only explanation possible, it seemed, was that there was some other, as yet unknown, medium through which light moved. Thus, the luminiferous ether was proposed.
This was widely accepted, until Michelson and Morley sought to measure it, in 1887. They discovered, in simple terms, that it just wasn’t there. Thus, the theory was overturned, almost overnight. As it happens it took Michelson and Morley some time to realise the implication of their experiment, but when they did, the conclusion was inescapable – there was no luminiferous ether.
This led to further studies into the nature of light, and before long the photoelectric effect was discovered, and the passage of light through a vacuum was elegantly explained. Thus, the luminiferous ether theory has been consigned to the dustbin of history, along with phlogiston theory and a whole lot of other theories that seemed like a good idea at the time.
Anthropogenic Global Warming Escaped Scientific Death
Why hasn’t that happened to the notion of Anthropogenic Global Warming? Why, when it was discovered that CO2 concentration followed temperature, and not the other way around, didn’t people say ‘temperature is the dog, and CO2 is the tail’? Why, in 2023, is the tail wagging the dog?
The simple answer is that science is no longer driving the bus.
What is happening is the type of positive feedback loop that climate scientists talk about. Such loops are almost unheard of in the physical world, but they are alive and well in politics.
It is always in a government’s interest to create a crisis. It is a well-known political phenomenon that people cling to the incumbents in a crisis, and governments go to great lengths to make people think that they are saving people from the crisis. Many people, for example, attribute John Howard’s success in the 2001 election to the Tampa crisis, and George W. Bush’s popularity shot up after 9/11.
The other side of this feedback loop is that if you are employed to investigate ‘climate change’, well, you’d jolly well better find it, or the government money will dry up.
So if the scientists say the right things, they get employment and ‘research’ funding. The greater the crisis they report on, the happier the government is, and the more money they get paid.
This is the reason that these ‘climate scientists’ avoid scrutiny. They don’t have to justify their research to an ARC or CRC committee, and they don’t have to produce results for scrutiny. The funding is already guaranteed, in perpetuity, with a blank cheque.
No scientist is going to say ‘there is no climate crisis’ or he will be out of a job, and no government is going to say ‘there is no climate crisis’ or they’ll be out on their ear.
Energy to Be the Sacrificial Lamb Instead
The consequence of this is an energy crisis, with people dying from the cold because they cannot pay their bills. Game shows in the UK now offer the payment of energy bills as a lucrative prize, and here in Australia we face the prospect of skyrocketing energy bills because of the comically inept Chris Bowen and his massive uncosted plan to completely replace our electricity with renewables.
So what do we do about it? I think there is hope for sanity to return, but we have to prosecute the case in the right way.