NOAA Loses 1M km2 of Arctic Ice in July

[Figure: Arctic sea ice extents at day 185, 2021: NOAA Sea Ice Index (orange) vs. MASIE (cyan)]

NOAA’s Sea Ice Index (shown in orange above) lost a dramatic 1M km2 of Arctic sea ice extent in just the last four days.  Meanwhile MASIE from the National Ice Center (NIC) (cyan) declined much less, ~400k km2, a more typical rate, after the two datasets were nearly identical on June 30, 2021.  Note that extreme drops in ice extent can happen in July, as seen last year (purple line after day 190), so the issue bears watching. Satellites have a documented history of difficulty discriminating between open water and surface melt water during both the melting and refreezing seasons.  A background post below explains the differences between the two datasets.

Background from previous post Support MASIE Arctic Ice Dataset

MASIE: “high-resolution, accurate charts of ice conditions”
Walt Meier, NSIDC, October 2015 article in Annals of Glaciology.

Update February 4, 2017 Below

The home page for MASIE (here) invites visitors to show their interest in the dataset and analysis tools since continued funding is not assured. The page says:
NSIDC has received support to develop MASIE but not to maintain MASIE. We are actively seeking support to maintain the Web site and products over the long term. If you find MASIE helpful, please let us know with a quick message to NSIDC User Services.

For the reasons below, I hope people will go there and express their support.

1. MASIE is Rigorous.

Note on Sea Ice Resolution:

Northern Hemisphere Spatial Coverage

Sea Ice Index (SII) from NOAA is based on 25 km cells and 15% ice coverage. That means if a 25x25 km grid cell, or 625 km2, is estimated to have at least 15% ice, then the full 625 km2 is added to the total extent. In the mapping details, grid cells actually vary from 382 to 664 km2 with latitude.  And the satellites’ Field of View (FOV) is an ellipse ranging from 486 to 3330 km2 depending on the channel and frequency.  More info is here.

MASIE is based on 4 km cells and 40% ice coverage. Thus, for MASIE estimates, if a grid cell is deemed to have at least 40% ice, then 16 km2 is added to the total extent.

The significantly higher resolution of MASIE means that any error in detecting ice cover at the threshold level affects only 16 km2 of the MASIE total, compared to at least ~600 km2 per cell in SII.  A few dozen SII cells falling below the 15% threshold show up as a sizable loss of ice in the Arctic.
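To make the arithmetic concrete, here is a minimal sketch (in Python) of how a threshold-based extent calculation behaves at the two resolutions. The uniform 20% concentration field is invented for illustration only; it shows how the same ice cover can count fully toward SII extent yet not at all toward MASIE extent.

```python
import numpy as np

def extent_km2(concentration, cell_km, threshold):
    # Each cell at or above the threshold contributes its FULL area,
    # no matter how much of the cell is actually ice.
    return np.count_nonzero(concentration >= threshold) * cell_km ** 2

# Same hypothetical 100 km x 100 km patch, 20% ice concentration everywhere.
sii_conc = np.full((4, 4), 0.20)      # SII-style: 25 km cells, 15% threshold
masie_conc = np.full((25, 25), 0.20)  # MASIE-style: 4 km cells, 40% threshold

print(extent_km2(sii_conc, 25, 0.15))   # 10000 km2 counted as ice
print(extent_km2(masie_conc, 4, 0.40))  # 0 km2 counted as ice
```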

2. MASIE is Reliable.


MASIE is an operational ice product developed from multiple sources to provide the most accurate possible description of Arctic ice for the sake of ships operating in the region.

Operational analyses combine a variety of remote-sensing inputs and other sources via manual integration to create high-resolution, accurate charts of ice conditions in support of navigation and operational forecast models. One such product is the daily Multisensor Analyzed Sea Ice Extent (MASIE). The higher spatial resolution along with multiple input data and manual analysis potentially provide more precise mapping of the ice edge than passive microwave estimates.  From Meier et al., link below.

Some people have latched onto a line from the NSIDC background page:
Use the Sea Ice Index when comparing trends in sea ice over time or when consistency is important. Even then, the monthly, not the daily, Sea Ice Index views should be used to look at trends in sea ice. The Sea Ice Index documentation explains how linear regression is used to say something about trends in ice extent, and what the limitations of that method are. Use MASIE when you want the most accurate view possible of Arctic-wide ice on a given day or through the week.

That statement was not updated to reflect recent developments:
“In June 2014, we decided to make the MASIE product available back to 2006. This was done in response to user requests, and because the IMS product output, upon which MASIE is based, appeared to be reasonably consistent.”

The fact that MASIE employs human judgment is discomforting to climatologists as a potential source of error, so Meier and others prefer that the analysis be done by computer algorithms. Yet, as we shall see, the computer programs are themselves human inventions, and when applied uncritically by machines they produce errors of their own.

3. MASIE serves as Calibration for satellite products.

The NSIDC Background cites as support a study by Partington et al. (2003).  Reading that study, one finds that the authors preferred the ice chart data and said this:

“Passive microwave sensors from the U.S. Defense Meteorological Satellite Program have long provided a key source of information on Arctic-wide sea ice conditions, but suffer from some known deficiencies, notably a tendency to underestimate ice concentrations in summer. With the recent release of digital and quality controlled ice charts extending back to 1972 from the U.S. National Ice Center (NIC), there is now an alternative record of late twentieth century Northern Hemisphere sea ice conditions to compare with the valuable, but imperfect, passive microwave sea ice record.”

“This analysis has been based on ice chart data rather than the more commonly analyzed passive microwave derived ice concentrations. Differences between the NIC ice chart sea ice record and the passive microwave sea ice record are highly significant despite the fact that the NIC charts are semi-dependent on the passive microwave data, and it is worth noting these differences. . . In summer, the difference between the two sources of data rises to a maximum of 23% peaking in early August, equivalent to ice coverage the size of Greenland.” (my bold)  For clarity: the ice chart data show higher extents than passive microwave data.

The differences are even greater for Canadian regions.

“More than 1380 regional Canadian weekly sea-ice charts for four Canadian regions and 839 hemispheric U.S. weekly sea-ice charts from 1979 to 1996 are compared with passive microwave sea-ice concentration estimates using the National Aeronautics and Space Administration (NASA) Team algorithm. Compared with the Canadian regional ice charts, the NASA Team algorithm underestimates the total ice-covered area by 20.4% to 33.5% during ice melt in the summer and by 7.6% to 43.5% during ice growth in the late fall.”

From: The Use of Operational Ice Charts for Evaluating Passive Microwave Ice Concentration Data, Agnew and Howell  http://www.tandfonline.com/doi/pdf/10.3137/ao.410405

More recently Walter Meier, who is in charge of SII, and several colleagues compared SII and MASIE and published their findings in October 2015 (here).  The purpose of the analysis was stated thus:
Our comparison is not meant to be an extensive validation of either product, but to illustrate as guidance for future use how the two products behave in different regimes.

The abstract concludes:
Comparisons indicate that MASIE shows higher Arctic-wide extent values throughout most of the year, largely because of the limitations of passive microwave sensors in some conditions (e.g. surface melt). However, during some parts of the year, MASIE tends to indicate less ice than estimated by passive microwave sensors. These comparisons yield a better understanding of operational and research sea-ice data products; this in turn has important implications for their use in climate and weather models.

A more extensive comparison of MASIE from NIC and SII from NOAA is here.

4. MASIE continues a long history of Arctic Ice Charts.

Naval authorities have for centuries prepared ice charts for the safety of ships operating in the Arctic.  There are Russian, Danish, Norwegian, and Canadian charts, in addition to MASIE, the US version.  These estimates rely on multiple sources of data, including the NASA reports.  Charts are made with no climate ax to grind, only to get accurate locations and extents of Arctic ice each day.

Figure 16-3: Time series of April sea-ice extent in Nordic Sea (1864-1998) given by 2-year running mean and second-order polynomial curves. Top: Nordic Sea; middle: eastern area; bottom: western area (after Vinje, 2000). IPCC Third Assessment Report


Since these long-term records show a quasi-60 year cycle in ice extents, it is vital to have a modern dataset based on the same methodology, albeit with sophisticated modern tools.

Summary

Measuring anything in the Arctic is difficult, and especially sea ice that is constantly moving around.  It is a good thing to have independent measures using different methodologies, since any estimate is prone to error.

Please take the time to express your appreciation for NIC’s contribution and your support for their products at the MASIE home page.

Update February 4, 2017

In the comments Neven said MASIE was unusable because it was biased low before 2010 and high afterward.  I have looked into that and he is mistaken.  Below is the pattern that is observed in most months.  March is the annual maximum and is coming up soon.

[Figure: March monthly average Arctic ice extents, MASIE vs. SII]

As the graph shows, the two datasets were aligned through 2010, and then SII began underestimating ice extent, resulting in a negative 11-year trend.  MASIE shows the same fluctuations, but with higher extents and a slightly positive trend for March extents.  The satellite sensors have a well-documented hard time with mixed ice/water conditions.
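For readers who want to check such trend claims themselves, here is a minimal sketch of the trend computation in Python. The extent values below are invented placeholders; substitute the actual March series from MASIE and the Sea Ice Index before drawing conclusions.

```python
import numpy as np

years = np.arange(2006, 2017)

# Placeholder March average extents in Mkm2 -- illustrative values only.
masie = np.array([15.1, 15.0, 15.3, 15.2, 15.1, 15.2,
                  15.4, 15.3, 15.2, 15.4, 15.3])
sii   = np.array([15.1, 15.0, 15.2, 15.1, 14.9, 14.8,
                  14.9, 14.7, 14.6, 14.5, 14.4])

for name, series in (("MASIE", masie), ("SII", sii)):
    slope, _ = np.polyfit(years, series, 1)  # ordinary least squares line
    print(f"{name}: trend {slope * 1000:+.0f} thousand km2 per year")
```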

More on the two datasets: NOAA has been Losing Arctic Ice

Ivermectin Invictus: The Unsung Covid Victor

[Photo: Invictus Games Toronto 2017]

The triumph of Ivermectin over Covid-19 is reviewed in the biznews article Studies on Ivermectin show positive results as Covid-19 treatment.  Excerpts in italics with my bolds.

The use of Ivermectin for the prevention and treatment of Covid-19 has been the subject of much debate. The World Health Organisation’s recommendation against Ivermectin as an alternative treatment for Covid-19 is shrouded in suspicion, as the WHO’s second biggest donor is the Bill and Melinda Gates Foundation (BMGF). Bill Gates also founded and funds The Vaccine Alliance (GAVI). The connection and clear conflict of interest is thus astounding. This 3,000-word synopsis by Rubin van Niekerk covers Bryant’s peer-reviewed meta-analysis, published in the American Journal of Therapeutics, of 60 studies on the treatment impact of Ivermectin on Covid-19. Van Niekerk notes that:

“Ivermectin studies vary widely, which makes the consistently positive results even more remarkable.”

Ivermectin Meta Analysis Synopsis By Rubin van Niekerk*

Meta-analysis of 60 studies on Ivermectin and Covid-19 by Bryant, published in the American Journal of Therapeutics. (Version 93, updated 21/6/21)

This is a brief 3000-word synopsis of the analysis of all significant studies concerning the use of ivermectin for COVID-19. Search methods, inclusion criteria, effect extraction criteria (more serious outcomes have priority), all individual study data, PRISMA answers, and statistical methods are detailed. Random-effects meta-analysis results are presented for all studies, for studies within each treatment stage, for mortality results, for COVID-19 case results, for viral clearance results, for peer-reviewed studies, for Randomized Controlled Trials (RCTs), and after exclusions.

Please read the original 18,000-word comprehensive research analysis should you need more detail and insight into the methodology: Ivermectin for COVID-19: real-time meta analysis of 61 studies

♦ Meta analysis using the most serious outcome reported shows 76% and 85% improvement for early treatment and prophylaxis (RR 0.24 [0.14-0.41] and 0.15 [0.09-0.25]), with similar results after exclusion based sensitivity analysis, restriction to peer-reviewed studies, and restriction to Randomized Controlled Trials.
81% and 96% lower mortality is observed for early treatment and prophylaxis (RR 0.19 [0.07-0.54] and 0.04 [0.00-0.58]). Statistically significant improvements are seen for mortality, ventilation, hospitalization, cases, and viral clearance. 28 studies show statistically significant improvements in isolation.
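An aside on the notation: RR is the risk ratio (risk of the outcome under treatment divided by risk under control), and the brackets give its 95% confidence interval. Below is a minimal, self-contained sketch of the standard calculation from a 2x2 table; the counts are invented for illustration and are not from any of the studies above.

```python
import math

def risk_ratio_ci(a, b, c, d, z=1.96):
    """Risk ratio with 95% CI from a 2x2 table (Katz log method).
    a/b = events/non-events on treatment; c/d = events/non-events on control."""
    rr = (a / (a + b)) / (c / (c + d))
    se = math.sqrt(1/a - 1/(a + b) + 1/c - 1/(c + d))
    lo = math.exp(math.log(rr) - z * se)
    hi = math.exp(math.log(rr) + z * se)
    return rr, lo, hi

# Invented counts: 4/100 events on treatment vs 20/100 on control.
print(risk_ratio_ci(4, 96, 20, 80))  # RR 0.20, roughly [0.07, 0.56]
```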

[Chart: Ivermectin meta-analysis results]

•The probability that an ineffective treatment generated results as positive as the 60 studies to date is estimated to be 1 in 2 trillion (p = 0.00000000000045). A sketch of this kind of sign-test calculation appears after these bullets.

•Heterogeneity arises from many factors including treatment delay, population, effect measured, variants, and regimens. The consistency of positive results is remarkable. Heterogeneity is low in specific cases, for example early treatment mortality.

•While many treatments have some level of efficacy, they do not replace vaccines and other measures to avoid infection. Only 27% of ivermectin studies show zero events in the treatment arm.

•Elimination of COVID-19 is a race against viral evolution. No treatment, vaccine, or intervention is 100% available and effective for all current and future variants. All practical, effective, and safe means should be used. Not doing so increases the risk of COVID-19 becoming endemic; and increases mortality, morbidity, and collateral damage.


•Administration with food, often not specified, may significantly increase plasma and tissue concentration.

•The evidence base is much larger and has much lower conflict of interest than typically used to approve drugs.

•All data to reproduce this paper and sources are in the appendix. See [Bryant, Hariyanto, Hill, Kory, Lawrie, Nardelli] for other meta analyses, all with similar results confirming effectiveness.
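The “1 in 2 trillion” figure quoted above is the kind of number a one-sided binomial sign test produces: if an ineffective drug had a 50/50 chance of looking positive in any given study, how likely is it that nearly all of 60 independent studies lean positive? A minimal sketch follows; the 56-of-60 split is an assumption chosen because it reproduces the quoted p-value’s magnitude, not a figure stated in the excerpt.

```python
from scipy.stats import binomtest

n_studies = 60
n_positive = 56  # assumed split, for illustration only

# One-sided tail probability under the null of a 50/50 coin flip per study.
result = binomtest(n_positive, n_studies, p=0.5, alternative="greater")
print(result.pvalue)  # ~4.5e-13, i.e. about 1 in 2 trillion
```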

Parasite Drug Analyzed as Possible Covid Treatment in U.K. Trial


Leftists Obsessed with Bogus Numbers

[Image: “5-year plan in four years”]

Lubos Motl writes with insight gained from the Czech experience with imposed Communism in his blog article CO2 emissions, “cases”, … fanatical leftists love to worship meaningless quantities as measures of well-being.  Excerpts in italics with my bolds.

Leftists hate money and the conversion of things to money. Why is it so? In the old times, the leftists were the losers who didn’t have much money. The decision based on the “maximization of money” was a decision usually made by “some other people, e.g. the capitalists”, and those may have had different interests than the Marxist losers, and that’s why the Marxist losers generally didn’t like the decisions based on the maximization of the financial benefits. They had a low influence on the society’s decision making (because they were broke) and the interests of the capitalists weren’t always the same as the interests of the Marxist losers. (In reality, what was in the interest of the capitalists was ultimately good for the Marxist losers as well, but the latter just didn’t understand it.)

That is the likely reason why the leftists always wanted to switch to some “more objective” measures of well-being. They saw all “subjective” (i.e. money-based) decisions to be dominated by evil people, the class of enemies. Where did this leftist strategy go?

Well, during the 40 years of communism in Czechoslovakia,
the communist party often mindlessly wanted to

maximize the production of coal and steel in tons.

Steel and coal are just two major examples that were used to “objectively measure the well-being”. You may see that within a limited context, there was a grain of truth in it. The more machines we make, the more hard work they may replace, and we need steel and coal for all those good things. But the range of validity of this reasoning was unavoidably very limited. They could have used U.S. dollars (e.g. the total GDP, or sustainable salaries) to measure the well-being (that should be maximized by the communist plans) but that would already be bad according to their ideology. Needless to say, it was a road to hell because in the long run, there is no reason why “tons of steel or coal” should be the same thing as “well-being” or “happiness”. And it’s not. We kept on producing lots of steel and coal that was already obsolete, that was helping to preserve technologies and industries that were no longer needed, helpful, or competitive, and the production of coal and steel substantially decreased after communism fell in 1989. We found out that we could get richer despite producing less steel and coal!

In 1989, communism was defeated and humiliated but almost all the communist rats survived. This collective trash has largely moved to the environmentalist movement that became a global warehouse for the Bolshevik human feces, also known as the watermelons. They are green on the surface but red (Bolsheviks) inside. They were willing to modify some details of their ideology or behavior but not the actual core substance. The detail that they modified was to “largely switch the sign” and consider the coal and steel to be evil.

Instead of maximizing steel and coal, the goal became to minimize the CO2 emissions.

The obsession with the CO2 emissions (which now carry the opposite sign: CO2 emissions are claimed to be bad!) is similar to the obsession of the Leninists and Stalinists with the maximization of the steel and coal production except that the current watermelons, the gr@tins of the world, are far more fanatical and unhinged than the Leninists and Stalinists have ever been. And one more thing has changed: these new, green Marxists promote these “objective measures of well-being” because it reduces the freedom, wealth, and power of everyone else. In that sense, they are still Marxists. However, they don’t protest against some people’s getting very rich as long as it is them. By this not so subtle change, we are facing a new class of Marxists who are still Marxists (more fanatical than the old ones) but who are often very rich, too. It is an extremely risky combination when such creatures become both powerful and rich.

Needless to say, the CO2 emissions aren’t the same thing as “evil”, and the reduction of the CO2 emissions is in no way the same thing as “well-being”. Instead, if you are at least a little bit rational, you know damn well that the CO2 emissions are totally obviously positively correlated with the well-being. The more CO2, the better. CO2 is the gas we call life. Its increase by 50% since 1750 AD has allowed the plants to have fewer pores (through which they suck CO2 from the air), which is why they are losing less water and are better at water management (and at withstanding possible drought). Just the higher CO2 has increased the agricultural yields per square kilometer by some 20% (greater increases were added by genetic engineering, the fight against pests etc.). And the man-made CO2 has freed us from back-breaking labor etc.


The obsession to minimize the CO2 emissions is completely irrational and insane, more insane than the maximization of steel and coal has ever been – but its advocates are more fanatical than the steel and coal comrades used to be. On top of that, most of the projects proposed to lower the CO2 emissions don’t even achieve that because there are always some neglected sources or sinks of CO2 (and lots of cheating everywhere, contrived public “causes” are the ideal environment for corruption, too). Also, the price of one ton of CO2 emissions is as volatile as the Bitcoin and depends on the caps that may be basically arbitrarily chosen by the rogue politicians.

Tons of CO2 are a different quantity to be extremized than tons of coal or steel. But the obsession to “mindlessly minimize or maximize these quantities” is exactly the same and builds on the leftists’ infinite hatred (often just pretended hatred, however) of money as an invention. The hatred towards money is equivalent to the hatred towards the “subjective conversion of costs and benefits to the same unit”. Leftists hate subjective considerations like that (which are equivalent to counting the costs and benefits in Czech crowns) because they hate “subjective thinking” in general. Well, they hate it because subjective thinking is the thinking of free people – i.e. people who aren’t politically obedient in general. They prefer “objective thinking”, i.e. an imbecile or a clique of imbeciles who are in charge, have total power over everybody, and tell everybody “what they should want and do”! When whole nations behave as herds of obedient sheep or other useless animals, the leftists are happy.

Such a general scheme is bound to lead to a decline of the society,
regardless of the detailed choice of the quantity that is worshiped
as the “objective measure of the human well-being”.

In 2020, the epoch of Covidism, to use the term of the Czech ex-president Václav Klaus, began. The most characteristic yet crazy quantity that the new leftist masters want to minimize (in this case, like the CO2 emissions, it “should be” minimized) is the number of “cases” of Covid-19, i.e. the number of positive PCR tests (or sometimes all tests, including Ag tests). From the beginning, it’s been insane because most people who test PCR-positive for Covid-19 aren’t seriously sick. A fraction is completely asymptomatic; a great majority suffers through a very mild disease. On top of that, the number of positive tests depends on the number of people who are tested (because most positive people are unavoidably overlooked unless everyone is tested at least once a week); on the number of amplification cycles in the PCR process; on the strategy to pick the candidates for testing; and lots of other things.

These are the reasons why it has been insane to be focused on the number of “cases” from 2020. But when the methodology to pick the people is constant, when the percentage of the positive tests is roughly kept constant, and when the virus doesn’t change, it becomes fair to use the number of “cases” as a measure of the total proliferation of the disease, Covid-19, in a nation or a population. However, there’s an even deeper problem, one that is related to the main topic of this essay:

Even when the testing frequency and techniques (including the selection) are constant, the number of cases may in no way be considered a measure of the well-being.

The reason is that “being PCR positive” is just a condition that increases the probability that one becomes sick, or dies. And the number of deaths from Covid-19 is clearly a more important measure of the Covid-related losses than the number of cases – the filthy Coronazis love to obscure even elementary statements such as this one, however. The conversion factor from “cases” to “deaths” is the case fatality rate (CFR), and that is not a universal constant. This is particularly important in the case of the Indian “delta” variant of the virus because it also belongs among the common cold viruses. It is a coronavirus that causes a runny nose. This makes the disease much more contagious, like any common cold (in a totally non-immune, normally behaving urban population). On the other hand, the nose cleans the breathing organs rather efficiently and the disease is unlikely to seriously invade the lungs where it really hurts. In fact, the runny nose indicates that this variant of the virus “likes” to play with cosmetic problems such as the runny nose; it is not even attracted to the lungs. The same comments apply to any of the hundreds of rhinoviruses and coronaviruses that cause common cold!

You may check the U.K. Covid graphs to see that despite the growing number of “cases” in recent weeks, the deaths are still near zero. The ratio of the two has decreased by more than one order of magnitude. A factor of 5 or so may be explained by the higher vaccination of the risk groups (older people); the remaining factor is due to the intrinsically lower case fatality rate of the delta variant. It is simply much lower than 0.1%, as every common cold virus is. That is much smaller than some 0.4%, which is the expected fraction of the people in a civilized nation that die of Covid-19 (to make these estimates, I mainly use the Czech data, which seem clean and which I understand extremely well: some 80% of Czechs have gone through Covid-19 and 0.3% of the population has died, so the case fatality rate must be around 0.4%).
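The Czech back-of-envelope estimate in that parenthesis is simple division; a one-line check in Python:

```python
# Figures quoted above: ~80% of Czechs infected, ~0.3% of the population dead.
infected_share = 0.80
dead_share = 0.003

cfr = dead_share / infected_share  # deaths per case
print(f"implied case fatality rate: {cfr:.2%}")  # 0.38%, i.e. "around 0.4%"
```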

So the conversion factor from a “case” to a “death” may have dropped by a factor of 30 or more in the U.K., relative to the peak of the disease (the more classical variants of Covid-19). So it is just plain insane to pretend that “one case” is the same problem or “reduction of well-being” as “one case” half a year ago. The disease has turned into a common cold which is nearly harmless. But the society has been totally hijacked by the moronic, self-serving, brutally evil leftists whose power depends on socially preserving the (totally false) idea that “the number of cases is an important quantity that must be minimized for the society’s well-being”. It is not important at all. The number of cases means absolutely nothing today because almost all the U.K. cases are just examples of a common cold that happens to pass as “Covid” through a test because this is how the test was idiotically designed. Everyone who tries to minimize the number of cases as we know them today is a dangerous deluded psychopath and must be treated on par with war criminals, otherwise whole nations will be greatly damaged. The damage has already been grave but we face the risk of many years (like the 40 years of Czechoslovak communism) when a similar totally destructive way of thinking preserves itself by illegitimate tools that totally contradict even the most elementary Western values.

“Cases” mean nothing, especially when the character of the disease that is detected by the tests becomes vastly less serious. They mean even less than the “CO2 emissions” and even that favorite quantity of the moronic fanatical leftists hasn’t ever been a good measure of anything we should care about. Stop this insanity and treat the people “fighting to lower the cases” as war criminals right now. Thank you very much.


Alex Epstein on Energizing Puerto Rico

The video covers Alex Epstein’s full Congressional testimony on energy in Puerto Rico — including Q&A.  For those who prefer reading text, I provide below a transcript of most of it, with light editing transposing a spoken presentation into a written one.

Thank you for the honor of testifying before this committee.  What I say today will shock many of you. My views on energy are indeed unconventional, but I hope you hear me out with an open mind since we share the same goal: which is a flourishing and prosperous Puerto Rico.

I want to make the case that the one thing that will most help the people of Puerto Rico lift themselves out of crushing poverty is the thing many of you believe should be eliminated; and that’s low-cost reliable fossil fuel energy.

For just a brief moment I’d like to ask all of you to close your eyes. I’d like you to imagine an ambitious young Puerto Rican woman. I’ll call her Mia. Mia studied hard at the Emilio Delgado high school in Corozal. Mia’s passion is artificial intelligence, and she dreams of working at a high-tech startup. Sadly, opportunities are sparse because many companies have fled Puerto Rico, and many more avoid it due to high-cost and unreliable energy. So Mia applies for a remote job at a Silicon Valley tech incubator. But during her interview the electricity suddenly cuts out. The screen goes blank; she hopes it comes back quickly, but hours go by with no internet as she waits in the sweltering heat with no AC. Think of Mia’s despair in those moments. Think of how much low-cost reliable energy could have helped her when she needed it most.

I know that every member of this committee shares the same goal I do: a healthy, prosperous and flourishing Puerto Rico. In order to help millions of talented and passionate people like Mia, we must look carefully at the full context of facts about Puerto Rico’s energy options. Here are three crucial facts that I almost never hear discussed about Puerto Rico, and I want to highlight them.

First, the percentage of Puerto Ricans currently living in poverty is 43 percent.

Second, the cost of energy in Puerto Rico versus the states is up to three times higher.

Third, the per capita income in Puerto Rico is $13,000.

Honorable members, does it strike you as fair that someone earning $13,000 per year should be paying three times what you and I pay for the energy that powers our homes? I don’t think that’s fair, and I’m guessing you don’t think so either.

So what’s the solution? While we’re told that solar and wind can provide low-cost reliable energy, nothing could be further from the truth, because solar and wind are unreliable: they don’t replace reliable power plants, they add to the cost of reliable power plants. The more wind and solar that grids use, the higher their electricity prices. German households have seen prices double in 20 years due to wasteful unreliable solar and wind infrastructure. Their electricity prices are three times ours, which are already too high due to solar and wind.

The only way for Puerto Rico to get low-cost reliable electricity anytime soon is using low-cost reliable fossil fuel energy sources like natural gas and coal, along with some massive regulatory reforms. I discuss in my written testimony actions such as scrapping the Jones Act. We owe it to the people of Puerto Rico to give them the full context: the benefits and drawbacks of all their options. This includes recognizing any real coal ash problems, but also recognizing that there are many solutions to coal ash used around the world that don’t require shutting down power plants. Giving Puerto Ricans the full context also includes recognizing that fossil fuels’ CO2 emissions do impact climate. But it also includes being precise, not hysterical, about that impact. As I explain in my written testimony, there is climate change, but not a climate crisis; and certainly not one that justifies condemning generations of Puerto Ricans to endless poverty by denying them low-cost reliable fossil fuel energy.

The stakes could not be higher. I think about Ellaria Davila who was breathing with the help of a mechanical ventilator. During prolonged blackouts her ventilator shut down and tragically she died. Her autopsy noted plainly that a ventilator “does not work without power.” Ladies and gentlemen: Nothing works without power, not the ventilators, not incubators, not farms nor schools. Not the millions of brave and passionate people who want to provide for their families and live lives of dignity and opportunity. You have it in your power today to help Puerto Ricans gain the power, the low-cost reliable power they need to escape crushing poverty.

I hope that any of you who are interested in this mission will join me on a fact-finding trip to Puerto Rico in the coming weeks. We will have an honest open discussion with Puerto Rican energy experts who are all too often left out of important policy discussions like this one. I would be honored to work with all of your offices, Democrat and Republican, to help the people of Puerto Rico flourish. I look forward to your questions and thank you again for the opportunity to share my perspective with you.

Q: Can you describe the benefits that affordable reliable energy has for communities in general?
A: Sure. Energy is the industry that powers every other industry; the lower-cost and more reliable energy is, the lower-cost and more reliable everything else is, and vice versa. I just want to stress that Puerto Rico’s energy situation is terrible, and one of the reasons I wanted to testify today is that nobody is talking about that. They’re talking about, How do we maintain the status quo? The status quo is terrible in Puerto Rico. They desperately need more low-cost reliable energy.

And just a further comment. It doesn’t seem that people here know the facts and the percentages I noted about Puerto Ricans’ situation. For instance, I’m really disappointed that Representative Ocasio-Cortez talks about shutting down the coal plant tomorrow. I don’t know if anyone knows what percentage of renewables the whole island has. It’s 2.5%.

This is so disappointing that we’re talking about this so unseriously, when we really need to highlight the value of low-cost reliable energy and really talk about why Puerto Rico needs much more of it.

Q: In your opinion how will the Biden energy policies impact Puerto Rico?
A: It seems they’re going to make the rest of the US like Puerto Rico. I’m in California, and we’re already seeing this. In my work, I had two projects disrupted last year by blackouts.

We’re also seeing it in Texas. I don’t mean to pick on Representative Ocasio-Cortez, but she tweeted that the infrastructure failures in Texas are quite literally what happens when you don’t pursue a Green New Deal. No, in fact plenty of places around the world can deal with hot and cold when they have enough reliable, resilient electricity. Texas defunded reliable, resilient electricity, including winterization, to pay tens of billions of dollars for unreliable solar and wind that don’t work when you need them the most. This energy policy is really the existential threat to talk about. It’s a threat to the US, and it threatens to make Puerto Rico into a truly and consistently third-world country.

Q: Mr. Epstein, in your testimony you noted that the most promising step that can be taken to lower emissions long term is the development of nuclear. Can you explain a little bit further why you believe nuclear energy is more effective as a good alternative energy choice?

A: Sure.  It is so wrong that when we talk about potential alternatives to fossil fuels, nuclear is ruled out. You saw with the proposal of the Green New Deal there was anti-nuclear insistence on renewables, which in practice means solar and wind. Renewable mandates usually exclude hydro as well.

This is totally the wrong approach: if you’re looking for low-carbon alternatives you have to be open to everything. And unfortunately the world is so anti-nuclear today that we’re shutting down record amounts of nuclear capacity this year, even though everyone claims to care about CO2 emissions.  In fact, nuclear provides extremely reliable on-demand power. Every grid in the world that works has on-demand power backing up solar and wind, since batteries at the necessary scale do not exist anywhere.

If you’re concerned about coal ash, first of all you need to do real scientific studies that are systematic, not just anecdotes and correlations. But if there’s a real problem, we know how to solve coal ash problems. There are lots of ways to do that, and if you don’t want to use coal in a given location, use natural gas or use nuclear. But the idea of mandating these unreliable solar and wind farms that make energy more expensive wherever they’re used, and then just manipulating data to ignore that fact: that policy is absolutely devastating for Puerto Rico and anywhere else it’s applied.

In my testimony I noted this is being encouraged around the world by the US. We’re telling India to do this, and Indonesia, and places in Africa, to try to eliminate fossil fuels when that’s what they need to make their lives better. I think that’s really shameful and I hope it stops.

Q: Mr. Epstein, you mentioned the need to consider the full context as we assess Puerto Rico’s energy generation. What are the factors we should consider to understand the full context of the consequences that would stem from closing the AES coal plant?

A: Well, in general we just need to recognize that fossil fuels are the only way for them to get low-cost reliable energy for the foreseeable future. They need far more of it, and there are very clear, well-documented ways of using fossil fuels in a clean and responsible way. So there may be a problem with this particular plant, and again, that needs to be scientifically studied so you can come up with a solution. It doesn’t mean we should shut it down; it means we should deal with the problems.

But more broadly, get rid of all the bad regulations and limitations that are preventing Puerto Rico from having low-cost reliable energy and flourishing.

Q: Are these things mutually exclusive? Can there be affordable and reliable energy today relying completely on renewable energy? 

A: I approach it a little bit differently, because I think the world just undervalues low-cost reliable energy. Again, this is fundamental to human flourishing, and it is something desperately needed around the world. Without low-cost reliable energy from fossil fuels, people here can’t claim to care about life expectancy and health. These energy sources have driven down the rate of extreme poverty in my lifetime, from over 40 percent of people living on less than two dollars a day to less than 10%. Access to energy is so important, yet billions of people still lack it. Four and a half billion people are living on less than ten dollars a day.

So my view is the world needs way more energy, and yes, we fortunately have modern technology that can produce it more and more cleanly. But if you just look at the side effects of fossil fuels and not at the benefits, you are condemning people to poverty, to suffering, condemning them to danger.

And I want to just remind everyone: A natural environment is not a good environment. Nature doesn’t give us a clean healthy environment, it gives us a very dirty and unhealthy environment. We need low-cost reliable energy to make the world a very livable place, and those of us who benefit from that in the US should really have sympathy for those in Puerto Rico, let alone the rest of the world that’s really really poor. And we should be doing nothing to impede them from using the most cost effective energy they can.

In conclusion, I would really welcome going on a fact-finding mission with many of you to look into the different energy realities. I think that we can see a way to move forward with cleaner coal, natural gas and nuclear instead of having this crazy dogma that we must rely on unreliable renewables. We should use the most cost-effective energy, and Puerto Rico needs far more of it not less.

Typical Arctic Ice Extents in June


[Figure: Arctic sea ice extents at day 181, 2021, MASIE vs. SII]

Previous posts reported that Arctic sea ice has persisted this year despite a wavy Polar Vortex this spring bringing cold down to mid-latitudes and warm air into Arctic regions.  Then in May, and now again in June, the sea ice extent matched or exceeded the 14-year average several times during the month, tracking alongside it until month end.  Surprisingly, SII (Sea Ice Index) showed much more ice the first week, similar extents mid-June, and then lost ice more rapidly the final week.  Yesterday both SII and MASIE day 181 extents were close to the same day in 2007.

Note that on the 14-year average, June loses ~2M km2 of ice extent, which 2021 matched, as did 2007.  Both 2020 and 2019 finished lower than average, by 500k and 400k km2 respectively.

Why is this important?  All the claims of global climate emergency depend on dangerously higher temperatures, lower sea ice, and rising sea levels.  The lack of additional warming is documented in the post Adios, Global Warming.

The lack of acceleration in sea levels along coastlines has also been discussed.  See USCS Warnings of Coastal Floodings.

Also, a longer term perspective is informative:

[Chart: post-glacial sea level rise]
The table below shows the distribution of Sea Ice across the Arctic Regions, on average, this year and 2007.

| Region | 2021 day 181 (km2) | Day 181 Average (km2) | 2021-Ave. | 2007 day 181 (km2) | 2021-2007 |
| --- | --- | --- | --- | --- | --- |
| (0) Northern_Hemisphere | 9644967 | 9741628 | -96661 | 9672969 | -28002 |
| (1) Beaufort_Sea | 999085 | 905769 | 93316 | 939209 | 59876 |
| (2) Chukchi_Sea | 760235 | 715065 | 45170 | 670088 | 90146 |
| (3) East_Siberian_Sea | 924474 | 1010406 | -85932 | 901963 | 22511 |
| (4) Laptev_Sea | 578894 | 703006 | -124112 | 658742 | -79848 |
| (5) Kara_Sea | 527080 | 545919 | -18839 | 657478 | -130398 |
| (6) Barents_Sea | 129619 | 123601 | 6018 | 130101 | -482 |
| (7) Greenland_Sea | 461815 | 501479 | -39664 | 548399 | -86584 |
| (8) Baffin_Bay_Gulf_of_St._Lawrence | 497237 | 504688 | -7451 | 450461 | 46777 |
| (9) Canadian_Archipelago | 761843 | 778224 | -16381 | 773611 | -11768 |
| (10) Hudson_Bay | 736119 | 728550 | 7569 | 718441 | 17678 |
| (11) Central_Arctic | 3239262 | 3205301 | 33960 | 3218999 | 20262 |
| (12) Bering_Sea | 15316 | 4566 | 10750 | 981 | 14336 |
| (13) Baltic_Sea | 0 | 3 | -3 | 0 | 0 |
| (14) Sea_of_Okhotsk | 12919 | 13765 | -847 | 2983 | 9936 |

The overall deficit to average appeared yesterday, the extent being 1% lower than average, one day earlier than usual.  The largest deficits to average are in the East Siberian and Laptev Seas, along with the Greenland Sea.  These are partly offset by surpluses elsewhere, mostly in the Beaufort, Chukchi, and Central Arctic seas.
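For anyone reproducing the table, the two difference columns are simple subtractions from the regional extents. A minimal sketch in Python, using three rows copied from the table above:

```python
# Regional extents in km2, copied from the table:
# (2021 day 181, day-181 average, 2007 day 181)
regions = {
    "Beaufort_Sea": (999085, 905769, 939209),
    "Laptev_Sea":   (578894, 703006, 658742),
    "Kara_Sea":     (527080, 545919, 657478),
}

for name, (y2021, avg, y2007) in regions.items():
    print(f"{name:13s} vs average: {y2021 - avg:+8d}   vs 2007: {y2021 - y2007:+8d}")
```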


A Look into Arizona Ballot Forensics


At Gateway Pundit is an article explaining the techniques for validating ballots used in the 2020 elections: Maricopa County Auditor Bob Hughes Shares How They Are Using High Tech Forensic Digital Cameras and OCR to Validate Ballots.  Excerpts in italics with my bolds.

Any massive vote fraud in Maricopa County, Arizona is going to be identified. The Democrats must be terrified.

One of the auditors working for the Arizona Senate, Bob Hughes, discussed the audit and the reasons why the Democrats are absolutely frightened of the Arizona audit and it being performed in other states:

The other thing that I think is interesting is they keep saying, ‘They don’t know what they are doing’. ‘They’re idiots’. ‘These people are ridiculous’. ‘They have no idea what they are doing’. ‘They’ve never done this before’.

This is the first time in the history of the United States, number one, that it’s ever been done, but more important it’s the first time it’s ever been needed. And it was done.

I can tell you that I could go over all the process and you’d all understand, but when a ballot gets created, think about this. It’s like your bill being paid from SRP. They go out and get your voter identification number and they find out what precinct you live in, what city, what county, where your school districts are. All that information has to be accessed to create the proper ballot exactly for you. Because you have to vote for the right candidates in a city mayoral election, council elections, JP elections, the legislative district, the congressional district. [In fact, in 2020 there were 667 different versions of Maricopa County ballots.]

Think of those as maps that overlay the Maricopa County area and it creates all these little sections, and all these people get a very different ballot. So if somebody did what we were told they might have done, which is gone out and just duplicate a bunch of ballots, or put the same ballot in many times, or any of these kinds of things, I knew there was a way to find that out.

And so what we did is this: the cameras are not only cameras. They’re digital cameras. Digital cameras that are forensic. They’re actually police forensic cameras. They’re very, very high-speed, high-definition digital cameras. They make a scanned ballot.

So we scan that ballot. We then use optical character recognition (OCR). We’re looking at what’s in place on that ballot. Based on who that ballot is. How many should be? Can there be this many?…

What I can tell you is you now will have the most authentic count of every legal authentic ballot you could possibly have.
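For readers curious what the scan-then-OCR step looks like in software, here is a minimal sketch using the open-source Tesseract engine via pytesseract. This is not the auditors’ actual tooling; the file name and the expected-contest lookup are hypothetical stand-ins for the precinct-specific ballot versions Hughes describes.

```python
from PIL import Image
import pytesseract  # open-source OCR engine, a stand-in for the audit tooling

# Hypothetical scanned ballot image.
text = pytesseract.image_to_string(Image.open("ballot_scan_0001.png"))

# Cross-check against the contests that this precinct's legitimate
# ballot version should contain (one of Maricopa's 667 versions).
expected_contests = {"Mayor", "City Council", "Justice of the Peace"}
missing = {c for c in expected_contests if c not in text}
print("ballot matches expected version" if not missing
      else f"anomaly, missing contests: {missing}")
```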

See Hughes’ speech in the video here.



Why Technocrats Deliver Catastrophes


Mark E. Jeftovic writes insightfully on the ways technology backfires when applied by bureaucrats in his article Why the Technocratic Mindset Produces Only Misery and Failure. H/T Tyler Durden at zerohedge. Excerpts in italics with my bolds.

Technocrats have the most fundamental aspect of reality backwards

Saw this article come across my news alert for “Transhumanism”. In it Dr. David Eagleman talks about how not only can we augment human senses with fantastic new abilities (like the ability to “see” heat and electromagnetic patterns), but how we’ll even be able to build machines that think too.

There is a line in his thinking that one can glean from the article. On one side of the line are enhancements and augmentations to the human experience which are startling and amazing and which will transform our societies; even radical life extension will be in the cards quite soon (for those who can afford it).

Where Eagleman crosses into technocratic thinking is when he veers into the idea of being able to build thinking machines. The logic is that because we’ll be able to increasingly bioengineer our own living bodies, it means we should also be able to bioengineer a mind into machines using the same principles.

I think this is wrong and it’s the same theoretical mistake that leads directly to technocratically inspired catastrophes.

Yes, we continue to build on technological advancements, but we also commit a lot of unforced errors that inflict incalculable misery on humanity. These errors may manifest as policy blunders, economic crises and worse. Most recently, for example, we seem to have gotten ourselves into a global pandemic because a bunch of technocrats funded some gain-of-function experiments in hopes of preempting the next pandemic. Do you see the dynamic here?

Over the years a lot of thinkers have pointed out that technocratic policy tracks, devised by centralized groups of experts within an elite managerial class, often bring about the very conditions they were impaneled to obviate.

• Raising minimum wages increases unemployment.
• Holding interest rates to zero creates economic instability and increases wealth inequality.
• Forcing green energy initiatives creates systems with lower energy efficiency and higher carbon footprints.
• Banning guns increases gun violence.
• Censoring “hate” speech fosters more hatred and polarization.

It’s almost as if the managerial class has no awareness of second-order effects. When those effects inexorably come to pass, they are often blamed on the very people who were counselling against the initial policy in the first place.

Thus, financial meltdowns are blamed on runaway free markets and capitalism gone wild. Global warming (if it truly plays out along prognosticated lines) is blamed on industries who are most rapidly transitioning toward greener energy anyway (like Bitcoin mining).

Climate change is another theme that exemplifies the technocratic dynamic: As a society we’re going to transition off of fossil fuels no matter what anybody thinks about the environment because we’re already past peak oil, and peak demand will probably flatline around 100M bpd and start coming down from there in a secular downtrend, for a variety of reasons (prolonged economic malaise and the ascent of green energy).

Yet the most viable pathway toward transitioning away from fossil fuels, nuclear (and in this I include Thorium), is currently relegated as problematic by technocrats and ideologues.


It all seems backwards and for a long time I’ve been positing a fundamental root cause of this backwardness. The premise is: We have the mind/matter equation completely backwards in the way we think about how the world works.

Conventional thought is that what we experience as consciousness is something that emanates from the brain. Like steam from a kettle. This is also the core assumption of AI. If we build something that resembles a brain, it’ll think. It’s a kind of Frankenstein approach that Eagleman alludes to in his article.

That won’t work, and AI will never be achieved as long as the mechanistic, material-reductionist worldview persists. Yet technocrats put a lot of faith in AI, and they think models derived from AI are, or will be, superior to anything we can figure out on our own because they were outputted by machines with bigger/faster hardware brains.

It is completely… wrong.

I think that what we experience as matter are energy patterns that emanate from an underlying, and conscious sub-strata of reality. This is basic quantum theory. Quantum theory can be problematic because it opens the door to all kinds of New Age Woo Woo, which may not even be entirely wrong at its core, but is prone to deeply flawed implementations (like anything, I guess).

People, and probably most living things, have a sense, an intuitive awareness of this sub-strata of reality. Our mythology and sacred texts are probably the stories of sometimes being more attuned to it and sometimes less so. The late British writer Colin Wilson wrote at length on the consciousness of the Egyptians of the upper kingdom, possibly over 7500 years BC. Their consciousness and language were pictorial, not linear. It may even be possible (my extrapolation, not his) that the demarcation point between the conscious awareness of individuals was blurred somewhat.

So what happened?

Into this awareness came religions. Organized structures that would begin to dictate the basis on which members of society were to comprehend and approach this Great Sub-Carrier. Priesthoods evolved – the first monopolies. Religions. Hierarchies. Rulers. Subjects.

One of the earliest forms of social deviance was heresy: approaching the Divine Sub-Carrier from a direction outside the religious structure. Can’t have that.

This dynamic is as old as humanity. It could even be argued that historical progress is the story of the public coming to realize that the monopoly thought structure they were in was flawed or obsolete and then society moving on to the next one. The elites of the day would endeavour to halt the progression or when that failed, co-opt whatever came next.

Then new elites would erect a new orthodoxy that placed them directly in the nexus of what was unknowable and what the rabble thought they needed to know in order to perform their primary function of ….servitude.

Today the great sub-carrier is best described by science, not religion. But again, the priesthood is saying that all knowledge of the sub-carrier should come through them. That’s Scientism. That’s Technocracy. Management by Experts.

The last two years of life on earth are a foretaste of a full blown technocracy. Follow The Science™, plebes.

Only our elites can fathom how to approach and extract knowledge from The Great Externality, but this time they’ve made things even worse because they have it exactly backwards. They think the Great Externality doesn’t even exist. It’s for flakes and Bible bangers. The technocratic priesthood holds that material reality is near completely understood and that our minds are side effects of chemical reactions in our brains.

They hold that if only we can crunch enough Big Data and calculate out all the models we’ll be, like God (who doesn’t exist), able to fix everything and eliminate all bad outcomes, for everybody, everywhere. We may even be able to eliminate death, and we could upload our consciousness (which is an illusion) into the cloud and live forever.

Because of this backwardation, we will always be careening from one catastrophe to the next, and most of them will be of our own making. We collectively suffer from an illusion that we are in control.

But we are not in control. We’re a pattern. A dance. A cycle. Waveforms. Vibrations. What we as humans do specifically well, which is our superpower and has led to our technological advancement which could conceivably continue on a trajectory that makes humanity an interstellar phenomenon, is adapt.

What technocrats can’t understand, or admit is that we can’t control what is going to happen. Either on an individual scale of people thinking in ways they’re not supposed to think, or geological, cultural, geopolitical or cosmic scales. We can’t get interest rates right, we can’t get everybody to agree on whether it’s “Gif” or “jif” and somehow we’re going to change the trajectory of the climate? Achieve immortality? Crank out a Singularity?

That is highly unlikely and in trying to preempt theoretical bad outcomes we typically bring about horrible actual outcomes.

The lab leak from the Wuhan Institute of Virology, if it occurred and it is looking increasingly likely that it did, was the result of gain-of-function studies on bat coronaviruses. They didn’t do it as a bioweapon. It’s not a global conspiracy to institute a Great Reset (all that talk is opportunism more than planning).

They were trying to figure out how to plan for a future global pandemic that may catch humanity off guard and cause incalculable damage. What did they accomplish? They unleashed a global pandemic that caught humanity off guard and caused incalculable damage. Soon to be compounded by global, de-facto compulsory inoculations with experimental vaccines that have a distinctly politicized impetus behind them.

That same dynamic applies to economics (it’s where the .COM crash and the Global Financial Crisis came from), to social policy (the Woke movement), and to climate. It is all the same technocratic mindset that doesn’t understand the order of reality (mind, then matter) but, even worse, thinks it knows it.

We’re stuck with that for a while because the technocratic mindset is incapable of introspection or entertaining the possibility of being wrong about anything. The only move it knows is to double down on failure.

The antidote to all this is massive decentralization on a global scale, which has the added benefit that decentralization by definition, is not something that gets decided from the top (it never is). It just happens, even in spite of the people in the centre of power who may feel something about their gravitas melting away.

That’s what has started to happen. A global opt-out. The Great Reject. As surely as the Reformation gave way to the Enlightenment despite the protestations of the Church, we’re headed into a world of networks and the sunset of nations. All the while the propagandists of the old order shriek that in this direction lies certain doom.

The Enlightenment arose from an increase in the level of abstraction: structurally, the universe changed from the Ptolemaic worldview (the Earth as the centre of all existence) to the heliocentric solar system.

Now we’re experiencing a similar shift away from static top-down hierarchical structures as the natural shape of civilization and toward shifting, impermanent, overlapping networks.

Footnote:  Another Example of Technocratic Adventurism

From American Thinker The Grave Perils of Genetic Editing.  Excerpts in italics with my bolds.

A company called Oxitec, based in the U.K., is piloting a program using gene-/information-modified mosquitos to eliminate the invasive female Aedes aegypti mosquitoes in the Florida Keys. The mosquitoes potentially spread diseases such as Dengue fever and Zika.

Dr. Nathan Rose, head of regulatory affairs at Oxitec, said mosquito-borne diseases are likely to worsen as a result of climate change. According to the CDC, in the ten-year span between 2010 and 2020, there were 71 cases of Dengue fever transmitted in Florida. In essence, the experiment is being conducted for fear of climate change causing a drastic increase in the incidence of Dengue fever. In the Fox article, Rose states that Oxitec will first experiment in Florida, collect data, then “go to the U.S. regulatory agencies to actually get a commercial registration to be able to release these mosquitoes more broadly within the United States.”

Don’t think the Florida Keys just opened their arms with a great big bear hug to this experiment. No, there was pushback and there were questions. In fact, Oxitec had been pushing this experiment to Key Haven and Key West for years, only to be rejected. Many other places have also declined this experiment. When it was conducted in Brazil, it initially seemed to work, but in the end the mutated mosquitoes transferred mutations to the general mosquito population. Thankfully, gene drive was not used in the Brazil experiment, for this type of gene manipulation cannot be reversed and can wipe out a species over time.

Evidently, Oxitec has created a second-generation “friendly mosquito” technology, where new male mosquitoes are programmed to kill only female mosquitoes, with males serving and passing on the modified genes to male offspring for generations. Yes, they are programmed to kill. Oxitec CEO Grey Frandsen announced in 2020 that Oxitec looked forward to working with the Florida Keys community to “demonstrate the effectiveness of our safe, sustainable technology in light of the growing challenges controlling this disease-spreading mosquito.”

Let’s hope the Florida mosquitoes experiment is truly a necessity and not some type of climate-change fear-mongering “sustainable” technology based on speculation.

Don’t Assume Global Warming Blunts Economic Growth

sun0623

In recent years, a strand of economic literature has argued that warming not only negatively affects the level of economic activity, but also the rate of income growth. PHOTO BY BLOOMBERG

Ross McKitrick explains in his Financial Post article Why climate change won’t hurt growth.  Excerpts in italics with my bolds.

There is no robust evidence that even the worst-case warming scenarios would cause overall economic losses

It has long been observed that global poverty tends to be concentrated in hot, tropical regions. But persistent poverty in African and South American countries has political and historical roots, especially their embrace of Soviet-backed communism in the 20th century. In places where economic reforms were adopted, like South Asia, growth took off, and those countries quickly converged with the West despite having tropical climates. So the connection to climate may be coincidental.

But in recent years, a strand of economic literature has argued that warming not only negatively affects the level of economic activity, but also the rate of income growth. This matters because when conducting an analysis over a 100-year time span, small changes in the growth rate can compound over a century and result in large total changes.
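
To see the compounding at work, here is a back-of-envelope sketch (the numbers are mine, purely illustrative, not from any of the studies): shaving just 0.2 points off annual growth leaves incomes roughly 18 per cent lower after a century.

```python
def compound(income: float, rate: float, years: int) -> float:
    """Income after `years` of constant annual growth at `rate`."""
    return income * (1.0 + rate) ** years

base = compound(100.0, 0.020, 100)  # 2.0% growth path -> ~724
hit  = compound(100.0, 0.018, 100)  # 1.8% growth path -> ~595

print(f"shortfall after a century: {1 - hit / base:.1%}")  # ~17.8%
```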

A 2012 study led by Melissa Dell of Harvard University presented evidence that warming had insignificant effects on income growth in rich countries, but in poor countries the effect was negative and statistically significant. Another team used this result in a policy model to argue that the “social cost of carbon” was at least 10 times higher than previously thought.

scc-working-group

This was followed up by several studies led by economists Marshall Burke of Stanford and Solomon Hsiang of Berkeley, who reported evidence that warming had significant negative effects on wealthy and poor countries alike. Suddenly a picture emerged that warming is much more harmful than we thought, so it should be full steam ahead on aggressive climate policy. Global policymakers have embraced this belief, in part at the urging of the United Nations Intergovernmental Panel on Climate Change’s (IPCC) 2018 special report on global warming of 1.5 C, which highlighted this research.

But other research tells a different story. One of the challenges in climate economics is that climate data are collected on a grid cell basis (organized in latitude-longitude boxes), while economic data are collected at the national level. To match them up, Dell’s group averaged the climate data up to the national level. There are different ways of doing the averaging, however, and the results are sensitive to the chosen method.
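
To illustrate how much the averaging choice can matter, consider a made-up example (my own numbers, not the study’s data): the same three grid cells yield three different “national” temperatures depending on the weights.

```python
import numpy as np

# Made-up example: one country, three climate grid cells.
temps = np.array([24.0, 27.5, 31.0])    # cell mean temperature, deg C
area  = np.array([0.5, 0.3, 0.2])       # cell share of national area
pop   = np.array([0.1, 0.3, 0.6])       # cell share of national population

print(temps.mean())                     # unweighted:    27.5 C
print(np.average(temps, weights=area))  # area-weighted: 26.45 C
print(np.average(temps, weights=pop))   # pop-weighted:  29.25 C
```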

Other teams have begun trying to build economic data sets at the local and regional level so the averaging step can be omitted. One group from Northern Arizona University used grid-cell-level economic data from around the world and found, like Dell, that warming temperatures have no effect on growth in rich countries, but a positive effect in poor countries up to an average temperature of about 17.5 C, which is above the sample average temperature of 14.4 C.

Then a team from Germany developed a regional economic database that lets them account for what economists call “country fixed effects,” namely, unobservable historical and institutional factors specific to each country that are unrelated to, in this case, the climate variables.

When they apply this method, the climate effects on growth and output vanish for rich and poor countries alike.

More recently, a group led by Richard Newell of Resources for the Future raised the issue that the econometric modelling can be done many different ways. Given the same data set, there are lots of decisions to make, such as how many lagged effects to include, whether to use linear or nonlinear equations and whether to use time trends. Altogether, they counted 800 different ways the same data could be analyzed.
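
As a toy sketch of how such counts arise (the menu below is hypothetical; the actual 800 combinations are enumerated in the Newell paper), each independent modelling choice multiplies the number of candidate regressions:

```python
from itertools import product

# Hypothetical menu of modelling choices, for illustration only.
choices = {
    "lags":    [0, 1, 2, 3, 4],          # lagged effects to include
    "form":    ["linear", "quadratic"],  # functional form
    "trend":   ["none", "linear"],       # time trends
    "weather": ["temp", "temp+precip"],  # weather variables
}

specs = list(product(*choices.values()))
print(len(specs))  # 5 * 2 * 2 * 2 = 40 specifications from four choices alone
```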

In order to determine whether the results depend on the choice of models, they obtained the data set used by the Burke team and applied the same country-level averaging method employed by Dell’s team. Then they ran a meta-analysis, estimating all the possible models and evaluating how well each one fit the data, in order to identify the best-performing models and reach their conclusions.

Dozens of different models fit the data about equally well, and they could not rule out that the best ones include no role at all for temperature in economic growth. There was some evidence that warming is good for growth up to 13.4 C, but the positive and negative effects were not statistically significant.

Across the entire range of temperatures in the sample there was no significant influence of climate on either output or growth.

Under the highest-warming scenario, the Burke team had projected a 49 per cent global GDP loss from climate change by 2100, but Newell found the model variant that fits their data best implied a slight global GDP gain. The best growth models as a group project an effect on GDP by 2100 ranging from -84 per cent to +359 per cent, with the central estimates very close to zero. In other words, the effects are too imprecise to say much of anything for certain.

Now we come up against the challenge that policymakers seem to find it easier to deal with gloomy certainty than optimistic uncertainty. In the blink of an eye, a handful of studies in a new research area had become the canonical truth, on which governments swung into a much more aggressive climate policy stance.

But as time has advanced, new data sets, and even reanalyses of the old data sets, have called those results into question and have shown that temperature (and precipitation) changes likely have insignificant effects on GDP and growth, and that the effects are as likely to be positive as negative. This does not mean there aren’t specific regions and specific industries facing potential losses, especially if the countries don’t adapt. But for the world as a whole, there is no robust evidence that even the worst-case warming scenarios would cause overall economic losses.

It now falls to advisory groups like the IPCC to tell this to world leaders, before they enact any more disastrous climate policies that will do all the harm (and more) that the evidence says climate change itself will not do.

gv020921dapr20210209064505

Footnote:  There are also economists pushing the notion of direct costs from global warming/climate change due to supposedly increasing health and prosperity impacts from extreme weather.  This is contrary to IPCC-approved studies by economist William Nordhaus.  See IPCC Freakonomics

maxresdefault

Georgia Ballot Review Case Going Forward

fulton-county-voter-fraud-696x252-1

AP reporter Kate Brumback writes at SFGATE Judge allows Georgia ballot review case to move forward.  Excerpts in italics with my bolds.

ATLANTA (AP) — A judge on Thursday allowed a lawsuit alleging fraud in Georgia’s most populous county during the November election and seeking a review of absentee ballots to move forward.

Originally filed in December, the lawsuit says there is evidence of fraudulent ballots and improper ballot counting in Fulton County. The county, county elections board and county courts clerk had filed motions to dismiss the lawsuit. They argued, among other things, that the lawsuit was barred by sovereign immunity, a principle that says state and local governments can only be sued if they agree to it.

After holding a hearing on those motions Monday, Henry County Superior Court Chief Judge Brian Amero, who was specially appointed to preside over the case, agreed. He ruled that the constitutional claims against those three entities are barred by sovereign immunity and dismissed them. But he also granted a request by the petitioners to add the individual members of the county election board as respondents in the lawsuit instead.

The suit was filed by nine Georgia voters and is spearheaded by Garland Favorito, a longtime critic of Georgia’s election systems. As part of the suit, they are seeking to inspect some 147,000 absentee ballots to determine whether there are illegitimate ballots among them.

Several election workers and volunteers have signed sworn statements saying they saw absentee ballots during the audit that weren’t creased from being mailed, appeared to be marked by a machine rather than by hand and were printed on different paper. The lawsuit also repeats a widely circulated claim of fraud based on security video that shows cases of ballots being pulled from under a skirted table and counted while observers and the news media weren’t present.

Fulton County officials have consistently defended the integrity of the election and have criticized the ballot review effort. The secretary of state’s office says it has investigated the claims and found no evidence of fraud. An independent monitor who observed Fulton County’s election operations as part of a consent agreement said he witnessed sloppy practices and systemic mismanagement but said there was nothing that should cast doubt on the county’s election results.

The ballots are kept under seal in the custody of the clerk of Fulton County Superior and Magistrate courts. Amero in April ordered the court clerk to release the scanned absentee ballot images. At a hearing last month, Amero ordered that the paper ballots themselves be unsealed so that the petitioners who filed the lawsuit can inspect and scan them.

He had set a meeting for May 28 with the parties to sort out the logistics of how that review and scanning of paper ballots would proceed. But that meeting was canceled so he could hear the motions to dismiss first.

VoterGA Comment:

“We are pleased that the court has ruled in our favor again for the fifth time. The ruling substitutes Defendants by replacing currently named government organizations with individual board members we named originally in our lawsuit. It also moots Don Samuels’ attempt to dismiss our case. This continues the string of victories we have including how we obtained the original protective order, conditional approval to inspect ballots, access to ballot images, and the order to unseal the ballots.”

screen-shot-2021-06-21-at-9.58.17-am


Tropics Lead Ocean Temps Return to Mean


The best context for understanding decadal temperature changes comes from the world’s sea surface temperatures (SST), for several reasons:

  • The ocean covers 71% of the globe and drives average temperatures;
  • SSTs have a constant water content (unlike air temperatures), so they give a better reading of heat content variations;
  • A major El Nino was the dominant climate feature in recent years.

HadSST is generally regarded as the best of the global SST data sets, and so the temperature story here comes from that source, the latest version being HadSST3.  More on what distinguishes HadSST3 from other SST products at the end.

The Current Context

The year-end report below showed 2020 rapidly cooling in all regions.  The anomalies have continued to drop sharply, well below the mean since 1995.  This global cooling was also evident in the UAH Land and Ocean air temperatures (see March 2021 Ocean Chill Deepens).

The chart below shows SST monthly anomalies as reported in HadSST3, starting in 2015 through May 2021. After three straight Spring 2020 months of cooling led by the Tropics and SH, the NH spiked in the summer, along with smaller bumps elsewhere.  Then temps everywhere dropped over the following six months, hitting bottom in February 2021.  All regions were well below the Global Mean since 2015, matching the cold of 2018 and lower than January 2015. Now the spring is bringing more temperate waters and a return to the mean anomaly since 2015.

Hadsst052021

A global cooling pattern is seen clearly in the Tropics since their peak in 2016, joined by the NH and SH cycling downward as well.

Note that higher temps in 2015 and 2016 were first of all due to a sharp rise in Tropical SST, beginning in March 2015, peaking in January 2016, and steadily declining back below its beginning level. Secondly, the Northern Hemisphere added three bumps on the shoulders of Tropical warming, with peaks in August of each year.  A fourth NH bump was lower and peaked in September 2018.  As noted above, a fifth peak in August 2019 and a sixth August 2020 exceeded the four previous upward bumps in NH.

In 2019 all regions had been converging to reach nearly the same value in April.  Then the NH rose exceptionally by almost 0.5C over the four summer months, in August 2019 exceeding previous summer peaks in NH since 2015.  In the four succeeding months, that warm NH pulse reversed sharply. Then again NH temps warmed to a 2020 summer peak, matching 2019.  This has now been reversed, with all regions pulling the Global anomaly downward sharply, tempered by warming in March and April, and with May a return to the global mean anomaly since 2015.

And as before, note that the global release of heat was not dramatic, due to the Southern Hemisphere offsetting the Northern one.  Note the May warming was strongest in the Tropics, though the anomaly is quite cool compared to 2016.

A longer view of SSTs

The graph below  is noisy, but the density is needed to see the seasonal patterns in the oceanic fluctuations.  Previous posts focused on the rise and fall of the last El Nino starting in 2015.  This post adds a longer view, encompassing the significant 1998 El Nino and since.  The color schemes are retained for Global, Tropics, NH and SH anomalies.  Despite the longer time frame, I have kept the monthly data (rather than yearly averages) because of interesting shifts between January and July.

Hadsst1995to052021

1995 is a reasonable (ENSO-neutral) starting point prior to the first El Nino.  The sharp Tropical rise peaking in 1998 is dominant in the record, starting Jan. ’97 to pull up SSTs uniformly before returning to the same level Jan. ’99.  For the next two years, the Tropics stayed down, and the world’s oceans held steady around 0.2C above the 1961-1990 average.

Then comes a steady rise over two years to a lesser peak in Jan. 2003, but again uniformly pulling all oceans up around 0.4C.  Something changes at this point, with more hemispheric divergence than before. Over the four years until Jan. 2007, the Tropics go through ups and downs, the NH a series of ups and the SH mostly downs.  As a result the Global average fluctuates around that same 0.4C, which also turns out to be the average for the entire record since 1995.

2007 stands out with a sharp drop in temperatures, so that Jan. ’08 matches the low of Jan. ’99, but starting from a lower high. The oceans all decline as well, until temps build, peaking in 2010.

Now again a different pattern appears.  The Tropics cool sharply to Jan. ’11, then rise steadily for four years to Jan. ’15, at which point the most recent major El Nino takes off.  But this time, in contrast to ’97–’99, the Northern Hemisphere produces peaks every summer, pulling up the Global average.  In fact, these NH peaks appear every July starting in 2003, growing stronger to produce three massive highs in 2014, ’15 and ’16.  NH July 2017 was only slightly lower, and a fifth NH peak still lower in Sept. 2018.

The highest summer NH peaks came in 2019 and 2020, only this time the Tropics and SH are offsetting rather than adding to the warming. (Note: these are high anomalies on top of the highest absolute temps in the NH.)  Since 2014 the SH has played a moderating role, offsetting the NH warming pulses. After September 2020, temps dropped off until February 2021, and now all regions are rising to bring the global anomaly above the mean since 1995.

What to make of all this? The patterns suggest that in addition to El Ninos in the Pacific driving the Tropic SSTs, something else is going on in the NH.  The obvious culprit is the North Atlantic, since I have seen this sort of pulsing before.  After reading some papers by David Dilley, I confirmed his observation of Atlantic pulses into the Arctic every 8 to 10 years.

But the peaks coming nearly every summer in HadSST require a different picture.  Let’s look at August, the hottest month in the North Atlantic from the Kaplan dataset.
AMO Aug and Dec 2021

The AMO Index is from Kaplan SST v2, the unaltered and not detrended dataset. By definition, the data are monthly average SSTs interpolated to a 5×5 grid over the North Atlantic, basically 0 to 70N. The graph shows August warming began after 1992 up to 1998, with a series of matching years since, including 2020.  Because the N. Atlantic has partnered with the Pacific ENSO recently, let’s take a closer look at some AMO years in the last two decades.
AMO decade 052021

This graph shows monthly AMO temps for some important years. The peak years were 1998, 2010 and 2016, with the latter emphasized as the most recent. The other years show lesser warming, with 2007 emphasized as the coolest in the last 20 years. Note the red 2018 line is at the bottom of all these tracks. The black line shows that 2020 began slightly warm, then set records for three months, then dropped below 2016 and 2017, peaked in August, and ended below 2016. Now in 2021, AMO is tracking the coldest years.
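
Since the AMO index as defined above is just an area average over the basin, it reduces to a few lines of code. Here is a minimal sketch (array shapes and names are my assumptions, not Kaplan’s actual processing):

```python
import numpy as np

# Minimal sketch of an undetrended AMO-style index: the area-weighted
# mean SST over North Atlantic cells (0-70N). `sst` is assumed to be a
# (months, nlat, nlon) array on the 5x5 grid with NaN over land;
# `lats` holds the latitude of each grid row.
def amo_index(sst: np.ndarray, lats: np.ndarray) -> np.ndarray:
    w = np.cos(np.deg2rad(lats))[None, :, None]  # cells shrink poleward
    w = np.broadcast_to(w, sst.shape)
    valid = ~np.isnan(sst)                       # keep sampled ocean cells
    num = np.nansum(sst * w, axis=(1, 2))
    den = np.sum(w * valid, axis=(1, 2))
    return num / den                             # one value per month
```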

Summary

The oceans are driving the warming this century.  SSTs took a step up with the 1998 El Nino and have stayed there, with help from the North Atlantic and, more recently, the Pacific northern “Blob.”  The ocean surfaces are releasing a lot of energy, warming the air, but eventually will have a cooling effect.  The decline after 1937 was rapid by comparison, so one wonders: How long can the oceans keep this up? If the pattern of recent years continues, NH SST anomalies may rise slightly in coming months, but once again ENSO, which has weakened, will probably determine the outcome.

Footnote: Why Rely on HadSST3

HadSST3 is distinguished from other SST products because HadCRU (Hadley Climatic Research Unit) does not engage in SST interpolation, i.e. infilling estimated anomalies into grid cells lacking sufficient sampling in a given month. From reading the documentation and from queries to the Met Office, this is their procedure.

HadSST3 imports data from gridcells containing ocean, excluding land cells. From past records, they have calculated daily and monthly average readings for each grid cell for the period 1961 to 1990. Those temperatures form the baseline from which anomalies are calculated.

In a given month, each gridcell with sufficient sampling is averaged for the month and then the baseline value for that cell and that month is subtracted, resulting in the monthly anomaly for that cell. All cells with monthly anomalies are averaged to produce global, hemispheric and tropical anomalies for the month, based on the cells in those locations. For example, Tropics averages include ocean grid cells lying between latitudes 20N and 20S.

Gridcells lacking sufficient sampling that month are left out of the averaging, and the uncertainty from such missing data is estimated. IMO that is more reasonable than inventing data to infill. And it seems that the Global Drifter Array displayed in the top image is providing more uniform coverage of the oceans than in the past.
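
Put together, the procedure described above reduces to a few lines. This is my paraphrase in code, under assumed array shapes, not Met Office source:

```python
import numpy as np

# `sst` holds one calendar month's gridded ocean readings, NaN where a
# cell lacked sufficient sampling; `baseline` holds each cell's 1961-1990
# average for the same calendar month.
def monthly_anomaly(sst: np.ndarray, baseline: np.ndarray) -> float:
    anom = sst - baseline        # per-cell anomaly vs its own baseline
    sampled = ~np.isnan(anom)    # under-sampled cells are left out,
    return anom[sampled].mean()  # not infilled with estimates

# A regional figure, e.g. the Tropics, applies the same averaging to the
# cells between 20N and 20S only.
```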

uss-pearl-harbor-deploys-global-drifter-buoys-in-pacific-ocean

USS Pearl Harbor deploys Global Drifter Buoys in Pacific Ocean