Get Covid Rapid Tests Out of Quebec Storage (200+ Scientists Ask)

Our son is working mightily to keep his HVAC enterprise alive during the contagion.  Last week a key employee, after visiting a pharmacy where a person tested positive, was told to go home and get tested (which took two days), and then to stay home to wait for the results (three more days).  A work week was lost and the business hobbled during that time.  So I wondered what happened to all those rapid covid tests that were purchased back in September.

The story came out in the Montreal Gazette, January 14, 2021:  Scientists publish open letter calling for Quebec to use rapid testing.  Excerpts in italics with my bolds.

A group of 213 scientists, professors, health-care workers and patients published an open letter to the Legault government Thursday calling on Quebec to roll out rapid COVID-19 tests to curb outbreaks more quickly and to step up its communications strategy.

“We have 1.2 million of those tests just sitting in a warehouse in Farnham,” Roxane Borgès Da Silva, a professor with the Université de Montréal’s school of public health, said in an interview Wednesday night. “We have reached a point in the evolution of the pandemic where the health system is at the breaking point. It is time that we use every tool at our disposal.”

In the letter, the signatories, from institutions including the Université de Montréal, McGill, Laval, UQAM and the Institut national de santé publique du Québec (INSPQ), note that the partial closure of businesses and restrictions on gatherings have failed to stem the increase in coronavirus cases, hospitalizations and deaths.

With a month-long curfew instituted Saturday, officials must take advantage of the expected decline in cases to counter the spread through better screening, which is “all the more important in a context where possible variants of the SARS-CoV-2 virus, which are more infectious, could spread in Quebec,” the letter reads.

Using rapid tests could help to identify positive cases more quickly, particularly those among asymptomatic patients and those who have been at so-called “super-spreader” events, the letter states.

“In this regard, we believe that the simple and widely available access throughout the territory to rapid tests … allowing a result to be obtained in a few minutes could be a game-changer, especially in places of propagation.”

Quebec has been hesitant to use the tests widely because it fears their lack of sensitivity could clear people of COVID-19 when they actually have the virus. But Da Silva said the tests are close to 90-per-cent accurate when used on patients who are in an extremely contagious phase, which is crucial to stopping the most dangerous transmitters. The tests could be used at workplaces, high schools and CHSLDs, or be made available at pharmacies and doctors’ offices to allow the public to get tested quickly, Da Silva said.

As well, the signatories argue the government needs to do a better job of transmitting data and information to the public to improve citizens’ confidence in the science and improve compliance with regulations. They suggest using improved communication campaigns that appeal to the general public and using new strategies that incorporate humour and music.

More from CBC (here): 

The tests, which return a result in as little as 15 minutes, have variously been called a “game-changer” (Ontario Premier Doug Ford) and “less safe” than the gold-standard PCR lab test (Quebec Health Minister Christian Dubé, to Le Devoir).

The chair of McGill University’s bioengineering department, Dr. David Juncker, leans to the Ford end of the spectrum — provided the tests are used effectively. Right now they are not, he said, and it’s to the detriment of the broader testing effort.

“The current testing system isn’t very effective in terms of contact, trace and isolate … it’s too slow, it’s too cumbersome, it has too many delays. That’s one of the reasons we’re failing in containing the spread of the pandemic,” said Juncker, an expert on diagnostic testing.

The main area of concern for the provincial government — also voiced by federal officials — is the rapid test’s lower accuracy, or sensitivity, and the risk of false negatives.

Those fears are overblown, in Juncker’s view, because the rapid tests can still help ferret out highly infectious people.

“If we just speak about diagnostic performance … the PCR test is the most effective one,” he said. “But if we think about what we want to use this for, as a public health tool that we want to use to contain and detect infectious individuals very quickly and isolate them very fast, that’s where rapid tests can be very helpful.”

Comment:  The need is to quickly identify people with enough viral load to infect others or to become sick themselves.  Rapid tests excel at this when applied to people with symptoms that might or might not be Covid19.  Officials have been obsessed with PCR tests, which are hyper-sensitive and show people as positive with too little viral load, or even from a trace of dead virus.  Those false positives generate lots of fear and clog the system with people unnecessarily.

Background from Previous Post On Non-Infectious Covid Positives

Daniel Payne writes at Just the News:  Growing research indicates many COVID-19 cases might not be infectious at all.  Excerpts in italics with my bolds.

Elevated ‘cycle thresholds’ may be detecting virus long after it is past the point of infection.

A growing body of research suggests that a significant number of confirmed COVID-19 infections in the U.S. — perhaps as many as 9 out of every 10 — may not be infectious at all, with much of the country’s testing equipment possibly picking up mere fragments of the disease rather than full-blown infections.

Yet a burgeoning line of scientific inquiry suggests that many confirmed infections of COVID-19 may actually be just residual traces of the virus itself, a contention that — if true — may suggest both that current high levels of positive tests are clinically insignificant and that the mitigation measures used to suppress them may be excessive.

Background from previous post: New Better and Faster Covid Test

Kevin Pham reports on a breakthrough in coronavirus testing. Excerpts in italics with my bolds.

Another new test for COVID-19 was recently authorized — and this one could be a game-changer.

The Abbott Diagnostics BinaxNOW antigen test is a new point-of-care test that reportedly costs only $5 to administer, delivers results in as little as 15 minutes, and requires no laboratory equipment to perform. That means it can be used in clinics far from commercial labs or without relying on a nearby hospital lab.

That last factor is key. There are other quick COVID-19 tests on the market, but they have all required lab equipment that can be expensive to maintain and operate, and costs can be prohibitive in places that need tests most.

This kind of test is reminiscent of the rapid flu tests that are ubiquitous in clinics. They’ll give providers tremendous flexibility in testing for the disease, with trained and licensed medical professionals, not just in clinics but in schools, workplaces, camps, or any number of other places.

So what’s new about this test? Most of the current tests detect viral RNA, the genetic material of SARS-CoV-2. This is a very accurate way of detecting the virus, but it requires lab equipment to break apart the virus and amplify the amount of genetic material to high enough levels for detection.

The BinaxNOW test detects antigens — proteins unique to the virus that are usually detectable whenever there is an active infection.

Abbott says it intends to produce 50 million tests per month starting in October. That’s far more than the number tested in July, when we were breaking new testing records on a daily basis with approximately 23 million tests recorded.

There’s a more important reason to be encouraged by this test coming available.  The viral load is not amplified by the test, so a positive is actually a person needing isolation and treatment.  As explained in a previous post below,  the PCR tests used up to now clutter up the record by showing as positive people with viral loads too low to be sick or to infect others.

Background from Previous Post The Truth About CV Tests

The people’s instincts are right, though they have been kept in the dark about this “pandemic” that isn’t.  Responsible citizens are starting to act out their outrage from being victimized by a medical-industrial complex (to update Eisenhower’s warning decades ago).  The truth is, governments are not justified to take away inalienable rights to life, liberty and the pursuit of happiness.  There are several layers of disinformation involved in scaring the public.  This post digs into the CV tests, and why the results don’t mean what the media and officials claim.

For months now, I have been updating the progress in Canada of the CV outbreak.  A previous post later on goes into the details of extracting data on tests, persons testing positive (termed “cases” without regard for illness symptoms) and deaths after testing positive.  Currently, the contagion looks like this.

The graph shows that deaths are less than 5 a day, compared to a daily death rate of 906 in Canada from all causes.  Also significant is the positivity ratio:  the % of persons testing positive out of all persons tested each day.  That % has been fairly steady for months now:  1% positive means 99% of people are not infected. And this is despite more than doubling the rate of testing.
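To make those two ratios concrete, here is a minimal back-of-the-envelope sketch in Python; the daily test and positive counts are illustrative placeholders rather than official figures, while the death rates are the ones cited above.

```python
# Back-of-the-envelope check of the two ratios discussed above.
# The test and positive counts are illustrative placeholders, not official data.

daily_tests = 70_000      # assumed number of tests per day (illustrative)
daily_positives = 700     # assumed number of positives per day (illustrative)

positivity = daily_positives / daily_tests
print(f"Positivity rate: {positivity:.1%}")   # ~1% positive, i.e. 99% of those tested are negative

covid_deaths_per_day = 5         # "less than 5 a day" per the text above
all_cause_deaths_per_day = 906   # Canadian deaths per day from all causes, per the text above
share = covid_deaths_per_day / all_cause_deaths_per_day
print(f"Covid share of all daily deaths: {share:.1%}")   # well under 1%
```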

But what does testing positive actually mean?  Herein lies more truth that has been hidden from the public for the sake of an agenda to control free movement and activity.  Background context comes from  Could Rapid Coronavirus Testing Help Life Return To Normal?, an interview at On Point with Dr. Michael Mina.  Excerpts in italics with my bolds. H/T Kip Hansen

A sign displays a new rapid coronavirus test on the new Abbott ID Now machine at a ProHEALTH center in Brooklyn on August 27, 2020 in New York City. (Spencer Platt/Getty Images)

Dr. Michael Mina:

COVID tests can actually be put onto a piece of paper, very much like a pregnancy test. In fact, it’s almost exactly like a pregnancy test. But instead of looking for the hormones that tell if somebody is pregnant, it looks for the virus proteins that are part of the SARS-CoV-2 virus. And it would be very simple: You’d either swab the front of your nose or you’d take some saliva from under your tongue, for example, and put it onto one of these paper strips, essentially. And if you see a line, it means you’re positive. And if you see no line, it means you are negative, at least for having a high viral load that could be transmissible to other people.

An antigen is one of the proteins in the virus. And so unlike the PCR test, which is what most people who have received a test to date have generally received, those types of tests look for the genome of the virus, its RNA. You could think of RNA the same way that humans have DNA; this virus has RNA. But instead of looking for RNA like the PCR test, these antigen tests look for pieces of the protein. It would be like if I wanted a test to tell me, you know, that somebody was an individual, it would actually look for features like their eyes or their nose. And in this case, it is looking for different parts of the virus, in general the spike protein or the nucleocapsid; these are two parts of the virus.

The reason that these antigen tests are going to be a little bit less sensitive to detect the virus molecules is because there’s no step that we call an amplification step. One of the things that makes the PCR test that looks for the virus RNA so powerful is that it can take just one molecule, which the sensor on the machine might not be able to detect readily, but then it amplifies that molecule millions and millions of times so that the sensor can see it. These antigen tests, because they’re so simple and so easy to use and just happen on a piece of paper, they don’t have that amplification step right now. And so they require a larger amount of virus in order to be able to detect it. And that’s why I like to think of these types of tests having their primary advantage to detect people with enough virus that they might be transmitting or transmissible to other people.”
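To illustrate the amplification step Dr. Mina describes, here is a simple sketch assuming idealized doubling each cycle and a purely illustrative detection limit. It shows why a sample with a high viral load registers after only a few cycles (a low Ct), while a trace amount needs many cycles (a high Ct), and why an antigen strip with no amplification step only flags high loads.

```python
# Idealized PCR amplification: each cycle roughly doubles the target molecules.
# The detection limit and starting copy numbers are illustrative only.

def copies_after_cycles(initial_copies: int, cycles: int) -> int:
    """Copies of target genetic material after n doubling cycles."""
    return initial_copies * 2 ** cycles

DETECTION_LIMIT = 1_000_000   # assumed copies needed before the sensor registers a signal

for start in (1, 1_000, 1_000_000):
    cycles = 0
    while copies_after_cycles(start, cycles) < DETECTION_LIMIT:
        cycles += 1
    print(f"Starting from {start:>9} copies -> detectable after about {cycles} cycles (Ct ~ {cycles})")

# A large starting load crosses the detection limit in few cycles (low Ct);
# a trace amount needs many cycles (high Ct).  An antigen test has no such
# amplification step, so it only flags samples already carrying a high load.
```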

The PCR test provides a simple yes/no answer to the question of whether a patient is infected.
Source: Covid Confusion On PCR Testing: Maybe Most Of Those Positives Are Negatives.

Similar PCR tests for other viruses nearly always offer some measure of the amount of virus. But yes/no isn’t good enough, Mina added; it’s the amount of virus that should dictate the infected patient’s next steps. “It’s really irresponsible, I think, to [ignore this],” Dr. Mina said of how contagious an infected patient may be.

“We’ve been using one type of data for everything,” Mina said, “for [diagnosing patients], for public health, and for policy decision-making.”

The PCR test amplifies genetic matter from the virus in cycles; the fewer cycles required, the greater the amount of virus, or viral load, in the sample. The greater the viral load, the more likely the patient is to be contagious.

The number of amplification cycles needed to find the virus, called the cycle threshold, is never included in the results sent to doctors and coronavirus patients, although if it were, it could give them an idea of how infectious the patients are.

One solution would be to adjust the cycle threshold used now to decide that a patient is infected. Most tests set the limit at 40, a few at 37. This means that you are positive for the coronavirus if the test process required up to 40 cycles, or 37, to detect the virus.

Any test with a cycle threshold above 35 is too sensitive, Juliet Morrison, a virologist at the University of California, Riverside, told the New York Times. “I’m shocked that people would think that 40 could represent a positive,” she said.

A more reasonable cutoff would be 30 to 35, she added. Dr. Mina said he would set the figure at 30, or even less.

Another solution, researchers agree, is even more widespread use of Rapid Diagnostic Tests (RDTs), which are much less sensitive and more likely to identify only patients with high levels of virus who are a transmission risk.

Comment:  In other words, when they analyzed the tests that also reported cycle threshold (CT), they found that 85 to 90 percent were above 30. According to Dr. Mina a CT of 37 is 100 times too sensitive (7 cycles too much, 2^7 = 128) and a CT of 40 is 1,000 times too sensitive (10 cycles too much, 2^10 = 1024). Based on their sample of tests that also reported CT, as few as 10 percent of people with positive PCR tests actually have an active COVID-19 infection. Which is a lot less than reported.
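Here is a minimal sketch of that arithmetic, assuming a cycle threshold of 30 marks the rough ceiling of an infectious viral load and using the ~10 percent infectious fraction discussed above; the daily positive count is an illustrative placeholder.

```python
# Sketch of the cycle-threshold arithmetic above.
# Assumes Ct = 30 marks roughly the upper limit of an infectious viral load.

INFECTIOUS_CT_CUTOFF = 30

def oversensitivity_factor(ct_cutoff_used: int) -> int:
    """How many times smaller a viral load a test using this cutoff can flag,
    relative to the assumed infectious-load cutoff of 30 cycles."""
    return 2 ** (ct_cutoff_used - INFECTIOUS_CT_CUTOFF)

for ct in (35, 37, 40):
    print(f"Ct cutoff {ct}: flags loads ~{oversensitivity_factor(ct)}x smaller than the infectious level")
# Ct 37 -> 2**7  = 128   (about 100 times too sensitive)
# Ct 40 -> 2**10 = 1024  (about 1,000 times too sensitive)

# If roughly 85-90% of reported positives sit above Ct 30, only ~10-15% of them
# would represent an actively infectious viral load.
reported_positives = 1_000       # illustrative daily count of PCR positives
infectious_fraction = 0.10       # estimate discussed above
print(f"Of {reported_positives} reported positives, perhaps "
      f"{int(reported_positives * infectious_fraction)} carry an infectious load")
```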

Here is a graph showing how this applies to Canada.

It is evident that increased testing has resulted in more positives, while the positivity rate is unchanged. Doubling the tests has doubled the positives, up from 300 a day to nearly 600 a day presently.  Note these are PCR results. And the discussion above suggests that the number of persons with an active infectious viral load is likely 10% of those reported positive: IOW up from 30 a day to 60 a day.  And in the graph below, the total of actual cases in Canada is likely on the order of 13,000 total from the last 7 months, an average of 62 cases a day.

 

Salla Finland Should Host 2032 Summer Olympics

The Big Issue reports Why a tiny town in Lapland is bidding to host the summer Olympics.  Excerpts in italics with my bolds.

The residents of the Finnish town Salla are making the most of climate change, saying they’d be the perfect host of the 2032 Olympics: “Warm heart, we have it. Warm place, coming soon.”

“Welcome to Salla, the coldest place in Finland,” says local mayor Erkki Parkkinen in a tongue-in-cheek video declaring its candidacy, as residents explain how by 2032 Salla would be the ideal host.

“In 12 years, ice will be gone and this will be a perfect lake,” says a man taking a dip in freezing water while eating an ice cream. “No more slippery ice, thanks global warming!” adds a skateboarder.

Salla may only have a population of around 3,000 people – who are greatly outnumbered by the reindeer in the area – but it has plenty of sporting facilities.

A map of the town has been created showing where activities happen. Today there is skating and curling on the frozen lake. By 2032, those will become the swimming and beach volleyball arenas.

Salla residents look forward to the impact climate change will bring

Like all other Olympic Games, branding is of prime importance. Salla’s logo has the yellow Olympic ring melting snow on a mountain. There’s also a mascot – Kesa the heat-exhausted reindeer.

Kesa, the heat-exhausted reindeer, is Salla’s Olympic mascot

Footnote:  Greta’s Fridays for Future appears at the end of the video clip.  So she should go all in for Salla 2032, since she is certain the planet has a fever with a 12-year deadline.

Actually, the notion of Salla 2032 is not that far-fetched, since July average temperatures are 15 degrees Celsius, along with long days of sunshine.  Montreal hosted the 1976 summer games, and we have ice hotels in the winter and nearly as much snow as Moscow.  The comedy is about the remoteness and facilities, not about the weather.  But there is hope for Salla: in Canada our average temperatures have risen by 1.7 C over the last 70 years, and it has been a great boon for our health and prosperity, not to mention record-setting crops from more CO2 in the air.  Why not for Finland as well?

Biden’s Hostile Takeover Triggers Poison Pills

A Poison Pill is a defensive technique prevalent in the corporate world to thwart a hostile takeover. It is a strategy used by the target company to avoid a hostile takeover completely, or at least to slow down the acquisition process. (Source: financemanagement.com)

Rupert Darwall explains the traps and pitfalls in the way of the Biden agenda in his Epoch Times article Fettering Biden’s Administrative State.  Excerpts in italics with my bolds.

Trump-era rules will constrain the new president’s activism

The administrative state will get a new lease on life under President Joe Biden, but America’s administrative state is far more constrained than that of many other countries. Britain, for example, wrote its net-zero climate target into law after only a 90-minute debate in the House of Commons, without any examination of what the cost might be. Arguably the European Union is an administrative state, where the unelected European Commission proposes legislation, enforces it, and even levies billion-euro fines on companies without so much as a court hearing.

By contrast, executive-agency rulemaking in the United States is more circumscribed. Agencies must show cause, respect precedent, and demonstrate that their rulemaking is properly grounded in the relevant statute and in a factual record sufficiently compelling to refute any suggestion that their action was “arbitrary or capricious.” They should expect controversial rules to be able to withstand challenges in the courts.

In 2016, the Supreme Court stayed the Environmental Protection Agency’s Clean Power Plan promulgated by the Obama administration to decarbonize the electrical grid. On the last full day of the Trump administration, the U.S. Court of Appeals for the District of Columbia vacated the Obama plan’s successor, the Affordable Clean Energy plan, in a 2-1 opinion. The majority ruled that the EPA’s interpretation of the Clean Air Act had been too narrow; the dissenting judge—a Trump appointee—opined that both plans relied erroneously on the wrong provision of the Act to regulate greenhouse-gas emissions. These rulings illustrate just how difficult the EPA will find crafting a new rule to fulfill Biden’s promise to decarbonize the grid by 2035.[See post: Latest Court Ruling re EPA and CO2]

The new administration is constrained not only by the courts but also by the late-term rulemaking of its predecessor. It could use the 1996 Congressional Review Act to nullify recently finalized federal regulations with a simple majority vote in each house of Congress. But Republicans can exact a political price. Last October, the Department of Labor finalized a financial factors rule. It requires managers of corporate pension plans to justify incorporation of environmental, social, and corporate governance (ESG) factors solely on the grounds of boosting risk-adjusted investment returns by reference to generally accepted investment theories.

Wall Street hated it when the rule was first proposed, but all it does is operationalize the requirement of the Employee Retirement Income Security Act (ERISA) of 1974 that plan managers perform their duties for the exclusive purpose of providing benefits to plan beneficiaries and defraying reasonable plan expenses. In reality, opponents of the rule oppose the exclusivity hardwired into ERISA that pension savings be invested with “eye single” to the interests of plan beneficiaries. A vote to nullify the rule would be a vote in favor of socializing retirees’ savings and deploying them for wider societal ends. For Republicans, it would be a debate worth having.

Similarly, congressional Republicans can gain politically by taking a stand opposing nullification of the EPA’s Jan. 3, 2021 transparency-in-science rule. This rule broadens and strengthens the agency’s 2018 transparency rule and aims to ensure that regulatory decisions are taken on the basis of robust, verifiable scientific studies. Polling shows that voters are more motivated to support environmental regulations when presented as protecting public health. This creates a market for studies linking pollution to public-health harms, however flimsy they might be. Environmental regulations mandating national standards on ozone and PM2.5 targeting fossil-fuel combustion are often based on epidemiological studies drawing on undisclosed data that can’t be re-analyzed to check for errors and sensitivity to assumptions.

A justification often made for this anti-scientific practice is safeguarding patient anonymity in such studies, something for which the new rule provides. Covering up is never a good look, however, and the spectacle of the self-proclaimed party of science arguing for secret science and against transparency would demonstrate how deeply politicized the science used to justify environmental regulation has become.

The Trump administration left the best till last. With just one week to go, on Jan. 13, the Federal Register published an EPA regulation that quickly became known as the banana peel rule. Section 111(b) of the Clean Air Act states that the EPA Administrator shall include a category of sources that “causes, or contributes significantly” to pollution anticipated to endanger public health or welfare. The new rule defines the level deemed “significant.” At the rule’s chosen level of 3 percent of U.S. stationary-source greenhouse-gas emissions, the only category deemed significant is electrical power generation—a category that accounts for 43 percent of such emissions.

Should the Biden administration ditch the 3 percent threshold and use the Clean Air Act to enmesh more sectors in greenhouse-gas targets, it will be compelled to develop an objective rationale for doing so. This is far from straightforward, hence the “banana peel” epithet. As the Trump rule notes, greenhouse gases “do not have the local, near-term ramifications found with other pollutants;” their impact is based on “cumulative global loading.” Directly or by inference, significance must therefore be linked to global emissions (U.S. power station emissions account for 3.6 percent of global emissions) and how effectively they are regulated at a global level. It would be irrational to regulate domestic emissions if there were little prospect of global emissions falling, too.

As the Obama administration realized after the collapse of the Copenhagen climate conference in 2009, when China—along with India, South Africa, and Brazil—vetoed a global climate treaty, Beijing holds the key to a credible global greenhouse-gas regime. The 2014 U.S.-China climate accord negotiated by Presidents Obama and Xi paved the way for the Paris Agreement on climate the following year. Xi’s statement at the U.N. last September that China would aim for “carbon neutrality” before 2060 is widely seen as a climate gamechanger.

Writing in a January 2021 Foreign Policy essay, Ted Nordhaus and Seaver Wang argue that China’s climate diplomacy is part of a bigger geopolitical play—Beijing’s desire to “counterbalance rising Western concerns about China’s belligerent posture in the South China Sea, its saber-rattling toward Taiwan, its human-rights crackdown in Hong Kong, its genocidal assault on the Uyghur minority in northwestern China, and much more.” It would be naïve not to recognize the geostrategic and political trade-offs in elevating China as climate savior. In a break with the routine formulation of climate change as existential threat trumping all else, Nordhaus and Wang warn that “a world that succeeds in addressing climate change will not necessarily be a more equitable, inclusive, or humane one.”

On his last full day as Secretary of State, Mike Pompeo declared China’s persecution of the Uyghurs a crime against humanity. His successor agrees. During his confirmation hearing, Tony Blinken said that he supported Pompeo’s genocide finding, and that China poses “the most significant challenge” of any nation-state to the United States. China is playing for higher stakes than the climate. This reality confronts the new administration with its greatest dilemma: “saving the planet” requires appeasing Beijing. How the dilemma is resolved could well come to define Joe Biden’s presidency.

2020 Update: US Coasts Not Flooding as Predicted

 

Previous Post Updated with 2020 Statistics

In 2018 climatists applied their considerable PR skills and budgets swamping the media with warnings targeting major coastal cities, designed to strike terror in anyone holding real estate in those places. Example headlines included:

Sea level rise could put thousands of homes in this SC county at risk, study says The State, South Carolina

Taxpayers in the Hamptons among the most exposed to rising seas Crain’s New York Business

Adapting to Climate Change Will Take More Than Just Seawalls and Levees Scientific American

The Biggest Threat Facing the City of Miami Smithsonian Magazine

What Does Maryland’s Gubernatorial Race Mean For Flood Management? The Real News Network

Study: Thousands of Palm Beach County homes impacted by sea-level rise WPTV, Florida

Sinking Land and Climate Change Are Worsening Tidal Floods on the Texas Coast Texas Observer

Sea Level Rise Will Threaten Thousands of California Homes Scientific American

300,000 coastal homes in US, worth $120 billion, at risk of chronic floods from rising seas USA Today

That last one captures the thrust of the UCS study Underwater: Rising Seas, Chronic Floods, and the Implications for US Coastal Real Estate (2018).

Sea levels are rising. Tides are inching higher. High-tide floods are becoming more frequent and reaching farther inland. And hundreds of US coastal communities will soon face chronic, disruptive flooding that directly affects people’s homes, lives, and properties.

Yet property values in most coastal real estate markets do not currently reflect this risk. And most homeowners, communities, and investors are not aware of the financial losses they may soon face.

This analysis looks at what’s at risk for US coastal real estate from sea level rise—and the challenges and choices we face now and in the decades to come.

The report and supporting documents gave detailed dire warnings state by state, and even down to counties and townships. As an example of the damage projections, here is a table estimating 2030 impacts:

State  Homes at Risk  Value at Risk  Property Tax at Risk  Population in At-Risk Homes
AL  3,542 $1,230,676,217 $5,918,124  4,367
CA  13,554 $10,312,366,952 $128,270,417  33,430
CT  2,540 $1,921,428,017 $29,273,072  5,690
DC  – $0 $0  –
DE  2,539 $127,620,700 $2,180,222  3,328
FL  20,999 $7,861,230,791 $101,267,251  32,341
GA  4,028 $1,379,638,946 $13,736,791  7,563
LA  26,336 $2,528,283,022 $20,251,201  63,773
MA  3,303 $2,018,914,670 $17,887,931  6,500
MD  8,381 $1,965,882,200 $16,808,488  13,808
ME  788 $330,580,830 $3,933,806  1,047
MS  918 $100,859,844 $1,392,059  1,932
NC  6,376 $1,449,186,258 $9,531,481  10,234
NH  1,034 $376,087,216 $5,129,494  1,659
NJ  26,651 $10,440,814,375 $162,755,196  35,773
NY  6,175 $3,646,706,494 $74,353,809  16,881
OR  677 $110,461,140 $990,850  1,277
PA  138 $18,199,572 $204,111  310
RI  419 $299,462,350 $3,842,996  793
SC  5,779 $2,882,357,415 $22,921,550  8,715
TX  5,505 $1,172,865,533 $19,453,940  9,802
VA  3,849 $838,437,710 $8,296,637  6,086
WA  3,691 $1,392,047,121 $13,440,420  7,320

The methodology, of course, is climate models all the way down. They explain:

Three sea level rise scenarios, developed by the National Oceanic and Atmospheric Administration (NOAA) and localized for this analysis, are included:

  • A high scenario that assumes a continued rise in global carbon emissions and an increasing loss of land ice; global average sea level is projected to rise about 2 feet by 2045 and about 6.5 feet by 2100.
  • An intermediate scenario that assumes global carbon emissions rise through the middle of the century then begin to decline, and ice sheets melt at rates in line with historical observations; global average sea level is projected to rise about 1 foot by 2035 and about 4 feet by 2100.
  • A low scenario that assumes nations successfully limit global warming to less than 2 degrees Celsius (the goal set by the Paris Climate Agreement) and ice loss is limited; global average sea level is projected to rise about 1.6 feet by 2100.
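For later comparison with the tide-gauge records, here is a small sketch converting those scenario endpoints into millimeters and implied average rates of rise. The only thing added is the feet-to-millimeter conversion; the scenario figures are the ones listed above.

```python
# Convert the NOAA scenario endpoints listed above into mm and average rates of rise.
FT_TO_MM = 304.8
scenarios_ft_by_2100 = {"high": 6.5, "intermediate": 4.0, "low": 1.6}

for name, feet in scenarios_ft_by_2100.items():
    rise_mm = feet * FT_TO_MM
    avg_rate_mm_per_yr = rise_mm / 100    # averaged over 2000-2100
    print(f"{name:>12} scenario: {rise_mm:.0f} mm by 2100, ~{avg_rate_mm_per_yr:.0f} mm/yr average")

# For scale, long-term trends at most US tide gauges run on the order of 2-3 mm/yr,
# which is the comparison made for the individual gauges below.
```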

Oh, and they did not forget the disclaimer:

Disclaimer
This research is intended to help individuals and communities appreciate when sea level rise may place existing coastal properties (aggregated by community) at risk of tidal flooding. It captures the current value and tax base contribution of those properties (also aggregated by community) and is not intended to project changes in those values, nor in the value of any specific property.

The projections herein are made to the best of our scientific knowledge and comport with our scientific and peer review standards. They are limited by a range of factors, including but not limited to the quality of property-level data, the resolution of coastal elevation models, the potential installment of defensive measures not captured by those models, and uncertainty around the future pace of sea level rise. More information on caveats and limitations can be found at http://www.ucsusa.org/underwater.

Neither the authors nor the Union of Concerned Scientists are responsible or liable for financial or reputational implications or damages to homeowners, insurers, investors, mortgage holders, municipalities, or any other entities. The content of this analysis should not be relied on to make business, real estate or other real world decisions without independent consultation with professional experts with relevant experience. The views expressed by individuals in the quoted text of this report do not represent an endorsement of the analysis or its results.

The need for a disclaimer becomes evident when looking into the details. The NOAA reference is GLOBAL AND REGIONAL SEA LEVEL RISE SCENARIOS FOR THE UNITED STATES NOAA Technical Report NOS CO-OPS 083

Since the text emphasizes four examples of their scenarios, let’s consider them here. First there is San Francisco, a city that sued oil companies over sea level rise. From tidesandcurrents comes this tidal gauge record:
It’s a solid, long-term record providing more than a century of measurements from 1900 through 2020.  The graph below compares the present observed trend with climate model projections out to 2100.

Since the record is set at zero in 2000, the difference in 21st century expectation is stark. Instead of the existing trend continuing out to around 20 cm, models project a 2.5 meter rise by 2100.
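A rough sketch of that comparison is below; the ~2 mm/yr observed trend is an assumption consistent with the ~20 cm figure above, the 2.5 m value is the model projection just cited, and actual station trends should be taken from the tidesandcurrents records.

```python
# Rough comparison of an observed tide-gauge trend with a model projection.
# The ~2 mm/yr trend is assumed here to match the ~20 cm-by-2100 figure above.

observed_trend_mm_per_yr = 2.0        # assumed long-term linear trend at the gauge
baseline_year, target_year = 2000, 2100

observed_rise_mm = observed_trend_mm_per_yr * (target_year - baseline_year)
model_projection_mm = 2500            # high-scenario projection cited in the post

print(f"Linear extrapolation of the gauge trend: {observed_rise_mm:.0f} mm by {target_year}")
print(f"Model projection:                        {model_projection_mm} mm by {target_year}")
print(f"Ratio (model / extrapolated trend):      {model_projection_mm / observed_rise_mm:.0f}x")
```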

New York City is represented by the Battery tidal gauge:


Again, a respectable record with good 20th century coverage.  And the models say:


The red line projects 2500 mm rise vs. 287 mm, almost a factor of 10 more.  The divergence is evident even in the first 20 years.

Florida comes in for a lot of attention, especially the keys, so here is Key West:


A similar pattern to NYC Battery gauge, and here is the projection:


The pattern is established: Instead of a rise of about 25 cm, the models project 250 cm.

Finally, probably the worst case, and already well-known to all is Galveston, Texas:


The water has been rising there for a long time, so maybe the models got this one close.

The gap is less than the others since the rising trend is much higher, but the projection is still nearly four times the past.  Galveston is at risk, all right, but we didn’t need this analysis to tell us that.

A previous post Unbelievable Climate Models goes into why they are running so hot and so extreme, and why they can not be trusted.

Footnote Regarding Alarms in Other Places

Recently there was a flap over future sea levels at Rhode Island, so I took a look at Newport RI, the best tidal gauge record there.  Same Story: Observed sea levels already well below projections that are 10 times the tidal gauge trend.

Another city focused upon urban flooding is Philadelphia.  As with other coastal settlements, claims of sea level rise from global warming are unfounded.

Philadelphia is a great example where a real concern will not be addressed by reducing CO2 emissions.  See Urban Flooding: The Philadelphia Story

Latest Court Ruling re EPA and CO2

There is the story of the court’s decision, and the back story told by one judge dissenting partly from the other two on the panel.  The overview comes from courthousenews DC Circuit Rejects Trump Rollback of Power Plant Emission Rules.  Excerpts in italics with my bolds.

Overview of Ruling on Affordable Clean Energy Rule

The federal appeals court’s 182-page opinion released Tuesday was unsigned, written by a mostly unanimous three-judge panel. U.S. Circuit Judge Justin Walker, a Trump appointee who joined the court just a month before the case was heard, penned only a partial dissent.

The panel found the outgoing president’s Affordable Clean Energy rule, adopted in 2019 as part of Trump’s effort to roll back what he considered anti-business regulations, is based on an “erroneous legal premise.” The ACE rule dropped all statewide emissions caps, giving state regulators greater autonomy and more time to reduce pollution.

The court held Tuesday that there is “no basis—grammatical, contextual, or otherwise—for the EPA’s assertion” about source-specific language in federal law that it claims limits its oversight of fossil fuel power sources.

While the ruling was welcomed by health and environmental groups, it only returns things to the status quo.  Litigation tied up Obama’s Clean Power Plan shortly after it was passed and it never took effect thanks to a Supreme Court stay in 2016.

The Trump effort to roll it back started in 2017 before culminating with the ACE rule in 2019. Now the ACE rule too will be bound up in legal purgatory, if not scrapped entirely by the incoming Biden administration.

Walker was joined on the panel by U.S. Circuit Judges Cornelia Pillard and Patricia Millett, both Obama appointees.  While the Trump appointee mostly concurred with his colleagues, Walker filed a partial dissent saying he took issue with both Obama and Trump’s regulatory efforts.

The Back Story–How We Got Here

Judge Walker wrote an interesting essay on the twists and turns with climate change, the EPA and CO2 emissions.  His statement is at the end of the court document (here).  Excerpts in italics with my bolds.

WALKER, Circuit Judge, concurring in part, concurring in the judgment in part, and dissenting in part: This case concerns two rules related to climate change. The EPA promulgated both rules under § 111 of the Clean Air Act.

A major milestone in climate regulation, the first rule set caps for carbon emissions. Those caps would have likely forced shifts in power generation from higher-polluting energy sources (such as coal-fired power plants) to lower-emitting sources (such as natural gas or renewable energy sources). That policy is called generation shifting.

Hardly any party in this case makes a serious and sustained argument that § 111 includes a clear statement unambiguously authorizing the EPA to consider off-site solutions like generation shifting. And because the rule implicates “decisions of vast economic and political significance,” Congress’s failure to clearly authorize the rule means the EPA lacked the authority to promulgate it.

The second rule repealed the first and partially replaced it with different regulations of coal-fired power plants. Dozens of parties have challenged both the repeal and the provisions replacing it.

In my view, the EPA was required to repeal the first rule and wrong to replace it with provisions promulgated under § 111. That’s because coal-fired power plants are already regulated under § 112, and § 111 excludes from its scope any power plants regulated under § 112. Thus, the EPA has no authority to regulate coal-fired power plants under§ 111.

Background Concerning EPA and Carbon Dioxide

In its clearest provisions, the Clean Air Act evinces a political consensus. For example, according to Massachusetts v. EPA, carbon dioxide is clearly a pollutant, and the Act’s § 202 unambiguously directs the EPA to curb pollution from new cars.

But for every carbon question answered in that case, many more were not even presented. For example, does the Clean Air Act force the electric-power industry to shift from fossil fuels to renewable resources? If so, by how much? And who will pay for it? Even if Congress could delegate those decisions, Massachusetts v. EPA does not say where in the Clean Air Act Congress clearly did so.

In 2009, Congress tried to supply that clarity through new legislation.

The House succeeded.
The President supported it.
But that effort stalled in the Senate.

Since climate change is real, man-made, and important, Congress’s failure to act was, to many, a disappointment. But the process worked as it was designed. In general, Senators from small states blocked legislation they viewed as adverse to their voters. And because small states have outsized influence in the Senate, no bill arrived on the President’s desk.

Nor have dozens of other climate-related bills introduced since then. So President Obama ordered the EPA to do what Congress wouldn’t. In 2015, after “years of unprecedented outreach and public engagement” — including 4.3 million public comments (about 4.25 million more than in Massachusetts v. EPA) — the EPA promulgated a rule aimed at “leading global efforts to address climate change.”

Entitled the Clean Power Plan, the EPA’s rule used the Clean Air Act’s § 111 to set limits for carbon emissions that would likely be impossible to achieve at individual coal-fired power plants because of costs, unavailable technologies, or a need to severely reduce usage. In that sense, the limits required generation shifting: shifting production from coal-fired power plants to facilities that use natural gas or renewable resources.

To be clear, the 2015 Rule did not expressly say, “Power plants must adopt off-site solutions.” But it did set strict emission limits in part by considering off-site solutions. And those emission limits would likely have been unachievable or too costly to meet if off-site solutions were off the table.

A political faction opposed generation shifting. It challenged the 2015 Rule in this Court, arguing that § 111 does not allow the EPA to consider off-site solutions when determining the best system of emission reduction. The faction included about twenty-four states, represented by many Senators who opposed the 2009 legislation. Conversely, a political faction of about eighteen states defended the rule. Many of their Senators had supported the stymied legislation.

At that litigation’s outset, our Court refused to stay the rule’s implementation. But in an unprecedented intervention, the Supreme Court did what this Court would not. And through its stay, the Supreme Court implied that the challengers would likely succeed on the case’s merits.

Taking the Supreme Court’s not-so-subtle hint, in 2019 President Trump’s EPA repealed the 2015 Rule and issued the Affordable Clean Energy Rule.

Like the rule it replaced, the 2019 Rule relies on the Clean Air Act’s § 111 to reduce carbon emissions. But unlike its predecessor, the 2019 Rule did not include generation shifting in its final determination of the best system of emission reduction.

A new faction then challenged the 2019 Rule. It looked a lot like the faction that had defended the 2015 Rule. Arrayed against that faction were many states and groups that had opposed the old rule. And so once again, politically diverse states and politically adverse special interest groups brought their political brawl into a judiciary designed to be apolitical.

In this latest round, the briefing’s word count exceeded a quarter of a million words. The oral argument lasted roughly nine hours. The case’s caption alone runs beyond a dozen pages. And yet, in all that analysis, hardly any of the dozens of petitioners or intervenors defending the 2015 Rule make a serious and sustained argument that § 111 includes a clear statement unambiguously authorizing the EPA to consider a system of emission reduction that includes off-site solutions or that § 111 otherwise satisfies the major-rules doctrine’s clear statement requirement. Neither does the EPA.

In light of that, I doubt § 111 authorizes the 2015 Rule — arguably one of the most consequential rules ever proposed by an administrative agency:
• It required a “more aggressive transformation in the domestic energy industry,” marking for President Obama a “major milestone for his presidency.”
• It aspired to reduce that industry’s carbon emissions by 32 percent — “equal to the annual emissions from more than 166 million cars.”
• Leaders of the environmental movement considered the rule “groundbreaking,” called its announcement “historic,” and labeled it a “critically important catalyst.”

The potential costs and benefits of the 2015 Rule are almost unfathomable. Industry analysts expected wholesale electricity’s cost to rise by $214 billion. The cost to replace shuttered capacity? Another $64 billion. (“A billion here, a billion there, and pretty soon you’re talking real money.”)

True, you can dismiss that research as industry-funded. But the EPA itself predicted its rule would cost billions of dollars and eliminate thousands of jobs.

On the benefits side of the ledger, the White House labeled the 2015 Rule a “Landmark,” and the President called it “the single most important step America has ever taken in the fight against global climate change.” With that in mind, calculating the rule’s benefits requires a sober appraisal of that fight’s high stakes. According to the rule’s advocates, victory over climate change will:

  • lower ocean levels;
  • preserve glaciers;
  • reduce asthma;
  • make hearts healthier;
  • slow tropical diseases;
  • abate hurricanes;
  • temper wildfires;
  • reduce droughts;
  • stop many floods;
  • rescue whole ecosystems; and
  • save from extinction up to “half the species on earth.”

These are, to put it mildly, serious issues. Lives are at stake. And even though it’s hard to put a dollar figure on the net value of what many understandably consider invaluable, the EPA tried: $36 billion, it said, give or take about a $10-billion margin of error.

So say what you will about the cost-benefit analysis behind generation shifting, it’s hardly a minor question.

Minor questions do not forestall consequences comparable to “the extinction event that wiped out the dinosaurs 65 million years ago.” Minor questions are not analogous to “Thermopylae, Agincourt, Trafalgar, Lexington and Concord, Dunkirk, Pearl Harbor, the Battle of the Bulge, Midway and Sept. 11.” Minor rules do not inspire “years of unprecedented outreach and public engagement.” Minor rules are not “the single most important step America has ever taken in the fight against global climate change.” Minor rules do not put thousands of men and women out of work. And minor rules do not calculate $10 billion in net benefits as their margin of error.

Rather, the question of how to make this “the moment when the rise of the oceans began to slow and our planet began to heal” — and who should pay for it — requires a “decision[] of vast economic and political significance.” That standard is not mine. It is the Supreme Court’s. And no cocktail of factors informing the major-rules doctrine can obscure its ultimate inquiry: Does the rule implicate a “decision[] of vast economic and political significance”?

Proponents of the 2015 Rule say it doesn’t. They have to. If it did, it’s invalid — because a clear statement is missing. And according to the Supreme Court, that is exactly what a major rule requires.

To be sure, if we frame a question broadly enough, Congress will have always answered it. Does the Clean Air Act direct the EPA to make our air cleaner? Clearly yes. Does it require at least some carbon reduction? According to Massachusetts v. EPA, again yes.

But how should the EPA reduce carbon emissions from power plants? And who should pay for it? To those major questions, the Clean Air Act’s answers are far from clear.

I admit the Supreme Court has proceeded with baby steps toward a standard for its major-rules doctrine. But “big things have small beginnings.” And even though its guidance has been neither sweeping nor precise, the Supreme Court has at least drawn this line in the sand: Either a statute clearly endorses a major rule, or there can be no major rule.

Moreover, if Congress merely allowed generation shifting (it didn’t), but did not clearly require it, I doubt doing so was constitutional. For example, imagine a Congress that says, “The EPA may choose to consider off-site solutions for its best system of emission reduction, but the EPA may choose not to consider off-site solutions.” In that instance, Congress has clearly delegated to the EPA its legislative power to determine whether generation shifting should be part of the best system of emission reduction — a “decision[] of vast economic and political significance.”

Such delegation might pass muster under a constitution amended by “moments” rather than the “reflection and choice” prescribed by Article V. But if ever there was an era when an agency’s good sense was alone enough to make its rules good law, that era is over.

Congress decides what major rules make good sense. The Constitution’s First Article begins, “All legislative Powers herein granted shall be vested in a Congress of the United States, which shall consist of a Senate and House of Representatives.” And every “law” must “pass[] the House of Representatives and the Senate” and “be presented to the President.” Thus, whatever multi-billion-dollar regulatory power the federal government might enjoy, it’s found on the open floor of an accountable Congress, not in the impenetrable halls of an administrative agency — even if that agency is an overflowing font of good sense.

Over time, the Supreme Court will further illuminate the nature of major questions and the limits of delegation. And under that case law, federal regulation will undoubtedly endure. So will federal regulators. Administrative agencies are constitutional, and they’re here to stay.

Beyond that, I leave it for others to predict what the Supreme Court’s emerging jurisprudence may imply for those agencies’ profiles. Here, regardless of deference and delegation doctrines, the regulation of coal-fired power plants under § 111 is invalid for a more mundane reason: A 1990 amendment to the Clean Air Act forbids it.

The Clean Air Act Amendments of 1990 prohibit the EPA from subjecting power plants to regulation under § 111 if they are already regulated under § 112. The 2015 Rule and the 2019 Rule rely on § 111 for the authority to regulate coal-fired power plants. Because the EPA already regulates those coal-fired power plants under § 112, the rules are invalid.

This case touches on some of administrative law’s most consequential, unresolved issues. What is the reach of Massachusetts v. EPA? What is the meaning of a major question? What are the limits of congressional delegation?

My comment:  I much appreciate Judge Walker’s reprise of the historical journey.  After earning my degree in organic chemistry, I am still offended that a bunch of  lawyers refer to CO2 as a “pollutant” as though it were an artificial chemical rather than the stuff of life.  And it annoys me that the American Lung Association fronted this legal attack, as though CO2 was causing breathing problems in addition to a bit of warming during our present ice age. And that list of ailments solved by reducing CO2 emissions rivals any snake oil poster ever printed.

Observers noted that this ruling produces a kind of limbo: Obama’s Clean Power Plan is out of order, and now Trump’s Affordable Clean Energy program is shot down.  Likely Biden will try to return to the CPP as though Trump never happened, but the same objections will still be raised.  Clearly Judge Walker sees the issue headed for the Supreme Court, as the stakes are too high for anyone else.  After their lack of courage on the 2020 election scandal, who knows what the Supremes will do.

Footnote: See post The Poisonous Tree of Climate Change

The roots of this poisonous tree are found in citing the famous Massachusetts v. E.P.A. (2007) case decided by a 5-4 opinion of Supreme Court justices (consensus rate: 56%). But let’s see in what context lies that reference and whether it is a quotation from a source or an issue addressed by the court. The majority opinion was written by Justice Stevens, with dissenting opinions from Chief Justice Roberts and Justice Scalia. All these documents are available at supreme.justia.com: Massachusetts v. EPA, 549 U.S. 497 (2007).  The linked post summarized the twisted logic that was applied.

 

End Covid19 Vaccine Obsession Now

Fig. 1. The four pillars of pandemic response to COVID-19 are:
1) contagion control or efforts to reduce spread of SARS-CoV-2,
2) early ambulatory or home treatment of COVID-19 syndrome to reduce hospitalization and death,
3) hospitalization as a safety net to prevent death in cases that require respiratory support or other invasive therapies,
4) natural and vaccination mediated immunity that converge to provide herd immunity and ultimate cessation of the viral pandemic.

Robert Clancy explains in his Quadrant article COVID-19: A realistic approach to community management. The author is Emeritus Professor of Pathology at the University of Newcastle Medical School and a member of the Australian Academy of Science’s COVID-19 Expert Database.  Excerpts in italics with my bolds.

Historically pandemics generate suspicion, speculation and emotion, before logic and empiric decisions determine optimal management. The current Covid19 pandemic is no exception. Twelve months on, there is an emerging consensus supporting an integration of a four-pillar plan: public health strategies; vaccination; early pre-hospital treatment; and hospital treatment. This position replaces an early confusion, with supporting data appearing on a near daily basis.

Public Health strategies are well understood and highly effective, forming the bedrock for disease control, while hospital management is a work in progress, but with progressive improvement in outcomes. Typical data for high risk subjects (>50 years with 1 or more co-morbidities) in the USA is currently 18-20% hospitalisation, with mortality around 1%. Less attention has been given to ongoing symptoms, with about 80% of hospitalised patients having profound fatigue and/or breathlessness 3-4 months after discharge. Many are unable to return to full time work 6 months after infection control.

The area of intense disagreement is community management combining prevention by vaccine and reduction of hospital admission, using pre-hospital treatment.

There is a global expectation that vaccines will dramatically change the current face of Covid19, while there is broad-based denial that any of the available (unpatented) drugs beneficially alter the natural history of infection. The expectation of a vaccine nirvana and the therapeutic nihilism are both incorrect, although each is promoted with a vigour rooted in socio-political conviction (and supported by the Pharma industry).

The conclusion based on logic and data, is that vaccines and early treatment strategies are both necessary for optimal disease control. As a result, a community plan has been formulated, aimed at keeping patients out of hospital. Experienced physicians have developed protocols based on evidence, with sequenced multi-drug regimens that support >80% reductions in admissions to hospital and death.

Implementation of this approach would effectively end the US, UK, Canada, and EU hospitalisation crises.

The objective of this brief review is to provide argument in support of these conclusions, based on an untangling of the pathobiology of Covid19 over the last 12 months; review of the available data on the three vaccines used in the western world; and current data supporting significant benefit of pre-hospital drug treatment.

The Vaccine

The idea that a vaccine would induce sterilising immunity and therefore prevent community spread by creating herd immunity has become the dominant political and medical story. Political, economic and social planning has been based on a sense of certainty that this will happen. This was never a likely outcome, as such success was asking more of the immune apparatus responsible for containment of a respiratory virus than had been observed. The first principle of vaccinology has always been that a vaccine is unlikely to give better protection than does the disease itself. First-cousin coronaviruses cause recurrent airways infections over many years, manifest clinically as “common colds”. Recurrent Covid19 infections have been documented during the first year of the pandemic. Mucosal immunological memory for coronaviruses is predictably poor.

The objective of any Covid19 vaccine is to limit virus replication within the mucosal compartment of the airways. This requires specific activation of the mucosal immune system, which differs from systemic immunity, geared to protect the internal spaces of the body. Blood antibody levels characterise the systemic immune response. These antibodies are very effective at neutralising virus that passes through the blood stream in its normal course of infection, such as the measles or mumps virus. These vaccines readily induce sterilising immunity.

The relevance of this immune machinery to Covid19 vaccines can be summarised:

  •  the systemic and mucosal immune compartments communicate poorly, with minimal mixing. Some mixing occurs in regions such as the nasal cavity and the alveolar space, as demonstrated by injected pneumococcal vaccines, where IgG antibody from blood can inhibit pneumococci in the nose and the gas exchange apparatus of the lungs and thus protect from ear infections and pneumonia. This may explain the high level of “PCR-ve Covid19 infections” (that is, apparent clinical infection by Covid19 virus, but with a negative laboratory test) in the mRNA trials, as discussed below.
  • mucosal immune responses to the virus are transient.
  • immune senescence at a mucosal level is marked.

Three vaccines have been released in the US and UK after limited observation and review, due to the dramatic circumstances of the pandemic. . . The outcome of these reviews suggests that little difference exists between the three vaccines.  To summarise the current position with vaccines:

  • Little protection against infection occurs, although protection against symptomatic disease is significant, but is likely to be far less than 90%. It remains to be demonstrated whether this translates into protection against admission to hospital and mortality, since this was not the case in 2 months of follow-up with the mRNA vaccines. The duration of protection and level of protection in high risk individuals over time need to be monitored. The likely outcome is that vaccines push the disease profile towards asymptomatic infection, rather than induce any discrete sterilising immune state. It is unhelpful and risky to attempt to choose between available vaccines until far more data becomes available.
  • Re-infection in vaccinated subjects appears to occur at a similar rate as it does for community non-vaccinated controls.
  • There is no realistic chance of herd immunity, given the high rate of asymptomatic infection in vaccinated individuals. This becomes more probable if the current intention of about 30% of the population (US figures) not to be vaccinated, irrespective of advice given, proves accurate. In Australia, every encouragement should support over 90% vaccination with whichever of the available vaccines can be obtained: dissension and argument over "false news" undermine this endeavour. In other words, though immunity with Covid19 vaccination appears to be neither complete nor durable, any chance of approaching "herd immunity" depends on a near-100% vaccination rate (an illustrative calculation follows this list). Time will give answers to the critical questions, and vaccines still in trials may be a better choice in the longer run. None of the "clever" delivery systems has yet proved to be an advantage over traditional (or 21st-century variations of) adjuvanted split vaccines (other than for those who own the patent).
  • It can be predicted that endemic spread of the virus throughout the population will occur. The observations in the UK trial of high levels of asymptomatic infection, and “Diagnostic Test Negative” infections in the Pfizer study, focus attention on confirming the dynamics of asymptomatic infection post vaccination, and the degree of transmission in the community, from that source. Data on these critical issues is limited.
  • There is a potential danger that as vaccination levels increase, but remain short of comprehensive cover, virus could spread with "hotspots" difficult to identify. Such spread could promote the emergence of resistant strains. The worst scenario would be an increase in mortality due to spread from unrecognised vaccinated subjects with asymptomatic infection to those without vaccination protection. On this, current data are scanty, indicating that about 20% of Covid19 infections are asymptomatic, but that the infectivity of these is reduced, perhaps 3-4 fold. Similar data in the post-vaccine world will be of central importance.
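
To see why imperfect protection against infection makes "herd immunity" so demanding, a minimal sketch follows. The R0 and vaccine-effectiveness values are assumptions chosen purely to show the arithmetic; they are not figures from the trials discussed above, and the sketch is illustrative rather than a model of the pandemic.

    # Illustrative only: vaccination coverage needed for herd immunity when
    # the vaccine gives imperfect protection against infection. R0 and VE
    # values below are assumptions, not trial results.

    def required_coverage(r0: float, ve_infection: float) -> float:
        """Coverage needed for herd immunity: (1 - 1/R0) / VE_infection."""
        threshold = 1 - 1 / r0           # classic herd-immunity threshold
        return threshold / ve_infection  # scaled up for imperfect protection

    for r0 in (2.5, 4.0):
        for ve in (0.9, 0.6, 0.4):
            cov = required_coverage(r0, ve)
            note = "unreachable (>100%)" if cov > 1 else f"{cov:.0%}"
            print(f"R0={r0}, VE against infection={ve:.0%}: coverage needed {note}")

With an R0 of 2.5 and 90% protection against infection, roughly two-thirds coverage would suffice; drop the protection against infection to 60% or below, or raise R0, and the required coverage reaches or exceeds 100%, which is the arithmetic behind the near-100% figure above.
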
Early Drug Treatment

It could be summarised that in a "Post-Truth World", those with Covid19 disease are denied safe, effective treatment which, if given early, can reduce admission to hospital and death. The main purpose of the comments that follow is to show that the data have moved on, and that science-based decisions can and must be made if lives are to be saved. Two drugs are effective: hydroxychloroquine (HCQ) and ivermectin (IVM), with the most effective trials also including a nutraceutical (zinc) and intracellular antibiotics. These antivirals have been available as antimicrobial drugs for many years. Their antiviral activity is due to intracellular processes that inhibit virus assembly: HCQ reduces acidity within cytoplasmic vesicles, and IVM blocks communication between the cytoplasm and the nucleus, while both have many sites of action that impair the inflammatory response.

The basic principle in treating viral infections is to treat early. This is well established across antiviral therapy, including acyclovir treatment for shingles and herpes simplex infections, and neuraminidase inhibitors in influenza. The same principle applies to treatment of Covid19. Treatment during the first, "virus-dominant" phase is directed at reducing the viral load within the mucosal compartment, while in the second phase treatment aims at inhibition of the damaging inflammatory response. Patients with significant second-phase disease are usually in hospital, and are treated with organ support, anticoagulation and anti-inflammatory drugs.

The database supporting the value of early treatment of Covid19 disease is so strong that it is hard to understand the current philosophy of "wait until you are sick enough, then go to hospital". (The question I put to naysayers is: "would you give, or not give, HCQ or IVM to your grandparent with early Covid19 disease in an Australian aged-care facility?") Suggested reasons for this unscientific denial include: ideologically immovable mindsets; a rapidly evolving pandemic where new data appear daily, making it hard to keep up with the flow; failure to understand the value of non-RCT data sets, which from a scientific and ethical viewpoint are appropriate to the circumstances of a pandemic; and a total focus on an anticipated Covid-free world following the release of vaccines.

The RCT mantra selectively used against HCQ and IVM is cynical, given the experience with Remdesivir. This antiviral agent was tried in the treatment of Covid19 and was shown in one RCT to reduce time in hospital by four days, with no reduction in mortality. On this scanty evidence it was rushed through the regulatory process. Although three additional RCTs failed to confirm even this slight benefit, it continues to be used at around A$4,000 a course, with many significant side effects.

A review of clinical studies in early (pre-hospital) disease, as at the end of November 2020, illustrates the data:

  • All 27 trials of HCQ showed protection (OR 0.37 (0.29-0.47)); 10 of these were RCTs (OR 0.71 (0.54-0.95)). (An odds ratio, or OR, of, say, 0.37 means "63% protection", and 0.71 would mean "29% protection". The figures in parentheses are the 95% confidence intervals: if the whole interval lies below 1.0, this is equivalent to at least a P value of <0.05. The P value is the probability of the result arising by chance; a value of 0.05 means a 1-in-20 chance, and that level is conventionally taken as a significant observation. A small worked example follows this list.)
  • 26 of 32 prophylaxis studies using HCQ showed protection (OR for 5 post-exposure studies: 0.61 (0.4-0.74))
  • IVM in 8 studies, half of which were RCTs, showed protection in early-treatment studies (OR 0.28 (0.13-0.59), P=0.004)
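
The shorthand used in the bullets above (reading an odds ratio of 0.37 as "63% protection", and a 95% confidence interval lying wholly below 1.0 as significant at roughly p < 0.05) can be made explicit with a small sketch. The numbers are those quoted above; note that treating an odds ratio as a relative risk is only an approximation, reasonable when the outcome is uncommon.

    # Sketch of the arithmetic behind the bullets above.
    # "Protection" here is simply (1 - OR) x 100; a 95% CI entirely
    # below 1.0 is read as significant (roughly p < 0.05).

    def protection_pct(odds_ratio: float) -> float:
        """Convert an odds ratio into the 'percent protection' quoted above."""
        return (1 - odds_ratio) * 100

    def significant(ci_low: float, ci_high: float) -> bool:
        """True when the whole 95% confidence interval sits below 1.0."""
        return ci_high < 1.0

    studies = {
        "HCQ, all 27 early-treatment trials": (0.37, 0.29, 0.47),
        "HCQ, 10 RCTs":                       (0.71, 0.54, 0.95),
        "IVM, 8 early-treatment studies":     (0.28, 0.13, 0.59),
    }

    for label, (odds_ratio, lo, hi) in studies.items():
        print(f"{label}: OR {odds_ratio} = {protection_pct(odds_ratio):.0f}% protection, "
              f"95% CI {lo}-{hi}, significant: {significant(lo, hi)}")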

As these clinical data continue to accumulate, regions around the world are adopting therapy with HCQ or IVM, with dramatic results compared to adjoining areas that have not adopted this therapy. This has been particularly marked in regions of Brazil.

The incidence and mortality of Covid19 in the US and Europe are of crushing proportions, with no end in sight. . . Planning on the basis that all this will change following the introduction of vaccines needs reassessment: early review of trial data, while showing short-term protection from significant symptomatic disease, must be tempered by evidence that infection is little reduced when asymptomatic and "PCR-ve Covid19" (a negative test for virus on a nasal swab) cases are counted. . . Uncertainties over the capacity of current vaccines to attain herd immunity, given continued asymptomatic infection, dictate that additional measures to reduce the impact of the pandemic must be put in place.

Two drugs used early in disease reduce admission to hospital and death, including in those considered high-risk subjects, and they go a significant way towards filling this need: HCQ and IVM. Both can be used as prophylactic or therapeutic medications. From uncertain beginnings, an impressive database has more recently accumulated that strongly supports the use of HCQ and/or IVM. Their use in concert with vaccines can no longer be denied; indeed this is the only science-based option.

Background from previous post Why Pandemic Responses Fail

As reported in the journal Reviews in Cardiovascular Medicine, Top US medics recommend ‘sequenced multidrug therapy’ including HCQ & Ivermectin, for early high-risk COVID-19 infections (source Palmer Foundation).  Bureaucratic public health officials obsessed with top-down, high-tech solutions have failed to provide citizens with the most important pillar: advice and the means to treat themselves and take charge of their own health care.  Excerpts in italics with my bolds.

The pandemic of SARS-CoV-2 (COVID-19) is advancing unabated across the world with each country and region developing distinct epidemiologic patterns in terms of frequency, hospitalization, and death. There are four pillars to an effective pandemic response:
1) contagion control,
2) early treatment,
3) hospitalization, and
4) vaccination to assist with herd immunity (Fig. 1).

Additionally, when feasible, prophylaxis could be viewed as an additional pillar since it works to reduce the spread as well as the incidence of acute illness.

Many countries have operationalized all four pillars including the second pillar of early home-based treatment with distributed medication kits of generic medications and supplements as shown in Table 1.

In the US, Canada, United Kingdom, Western European Union, Australia, and some South American Countries there have been three major areas of focus for pandemic response:
1) containment of the spread of infection (masking, social distancing, etc.),
2) late hospitalization and delayed treatments (remdesivir, convalescent plasma, antiviral antibodies), and
3) vaccine development (Bhimraj et al., 2020; COVID-19 Treatment Guidelines, 2020).

Thus the missing pillar of pandemic response is early home-based treatment (as seen in Fig. 1).

The current three-pronged approach has missed the predominant opportunity to reduce hospitalization and death given the practice of directing patients to self-isolation at home. Early sequential multidrug therapy (SMDT) is the only currently available method by which hospitalizations and possibly death could be reduced in the short term (McCullough et al., 2020a).

Innovative SMDT regimens for COVID-19 utilize principles learned from hospitalized patients as well as data from treated ambulatory patients.

For the ambulatory patient with recognized signs and symptoms of COVID-19 on the first day (Fig. 2), often with nasal real-time reverse transcription (RT-PCR) or oral antigen testing not yet performed, the following three therapeutic principles apply (Centers for Disease Control and Prevention, 2020):

1) combination anti-infective therapy to attenuate viral replication,
2) corticosteroids to modulate cytokine storm, and
3) antiplatelet agent/antithrombotic therapy to prevent and manage micro- or overt vascular thrombosis.

For patients with cardinal features of the syndrome (fever, viral malaise, nasal congestion, loss of taste and smell, dry cough, etc) with pending or suspected false negative testing, therapy is the same as those with confirmed COVID-19.

Fig. 3. Sequential multidrug treatment algorithm for ambulatory acute COVID-19-like and confirmed COVID-19 illness in patients in self-quarantine. Yr = year, BMI = body mass index, Dz = disease, DM = diabetes mellitus, CVD = cardiovascular disease, CKD = chronic kidney disease, HCQ = hydroxychloroquine, IVM = ivermectin, Mgt = management, Ox = oximetry; reproduced with permission from reference.
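
The overall shape of the approach, as described above, is a risk stratification step followed by the three therapeutic prongs. The sketch below is hypothetical and purely structural: the age and BMI cut-offs are placeholders, not the thresholds of the published Fig. 3 algorithm, and nothing here is medical advice.

    # Hypothetical, structural sketch only: risk stratification then the three
    # prongs named in the text. Cut-offs and drug choices are placeholders,
    # NOT the published algorithm, and not medical guidance.

    from dataclasses import dataclass

    @dataclass
    class Patient:
        age_yr: int
        bmi: float
        has_dm: bool = False    # diabetes mellitus
        has_cvd: bool = False   # cardiovascular disease
        has_ckd: bool = False   # chronic kidney disease

    def high_risk(p: Patient, age_cutoff: int = 50, bmi_cutoff: float = 30.0) -> bool:
        """Placeholder stratification on age, BMI and the listed comorbidities."""
        return (p.age_yr >= age_cutoff or p.bmi >= bmi_cutoff
                or p.has_dm or p.has_cvd or p.has_ckd)

    def smdt_plan(p: Patient) -> list[str]:
        """High-risk patients get the three prongs; others, monitoring and supportive care."""
        if not high_risk(p):
            return ["home oximetry and supportive care"]
        return [
            "combination anti-infective therapy (e.g. HCQ or IVM, per the discussion above)",
            "corticosteroid to modulate the inflammatory response",
            "antiplatelet/antithrombotic agent against micro- or overt thrombosis",
        ]

    print(smdt_plan(Patient(age_yr=72, bmi=31.0, has_dm=True)))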

Summary

The SARS-CoV-2 outbreak is a once in a hundred-year pandemic that has not been addressed by rapid establishment of infrastructure amenable to support the conduct of large, randomized trials in outpatients in the community setting.

The early flu-like stage of viral replication provides a therapeutic window of tremendous opportunity to potentially reduce the risk of more severe sequelae in high risk patients. Precious time is squandered with a “wait and see” approach in which there is no anti-viral treatment as the condition worsens, possibly resulting in unnecessary hospitalisation, morbidity, and death.

Once infected, the only means of preventing a hospitalization in a high-risk patient is to apply treatment before arrival of symptoms that prompt paramedic calls or emergency room visits. Given the current failure of government support for randomized clinical trials evaluating widely available, generic, inexpensive therapeutics, and the lack of instructive out-patient treatment guidelines (U.S., Canada, U.K., Western EU, Australia, some South American Countries), clinicians must act according to clinical judgement and in shared decision making with fully informed patients.

Early SMDT developed empirically based upon pathophysiology and evidence from randomized data and the treated natural history of COVID-19 has demonstrated safety and efficacy.

In newly diagnosed, high-risk, symptomatic patients with COVID-19, SMDT has a reasonable chance of therapeutic gain with an acceptable benefit-to-risk profile.

Until the pandemic closes with population-level herd immunity potentially augmented with vaccination, early ambulatory SMDT should be a standard practice in high risk and severely symptomatic acute COVID-19 patients beginning at the onset of illness.

Authors:

Peter A. McCullough, Paul E. Alexander, Robin Armstrong, Cristian Arvinte, Alan F. Bain, Richard P. Bartlett, Robert L. Berkowitz, Andrew C. Berry, Thomas J. Borody, Joseph H. Brewer, Adam M. Brufsky, Teryn Clarke, Roland Derwand, Alieta Eck, John Eck, Richard A. Eisner, George C. Fareed, Angelina Farella, Silvia N. S. Fonseca, Charles E. Geyer Jr., Russell S. Gonnering, Karladine E. Graves, Kenneth B. V. Gross, Sabine Hazan, Kristin S. Held, H. Thomas Hight, Stella Immanuel, Michael M. Jacobs, Joseph A. Ladapo, Lionel H. Lee, John Littell, Ivette Lozano, Harpal S. Mangat, Ben Marble, John E. McKinnon, Lee D. Merritt, Jane M. Orient, Ramin Oskoui, Donald C. Pompan, Brian C. Procter, Chad Prodromos, Juliana Cepelowicz Rajter, Jean-Jacques Rajter, C. Venkata S. Ram, Salete S. Rios , Harvey A. Risch, Michael J. A. Robb, Molly Rutherford, Martin Scholz, Marilyn M. Singleton, James A. Tumlin, Brian M. Tyson, Richard G. Urso, Kelly Victory, Elizabeth Lee Vliet, Craig M. Wax, Alexandre G. Wolkoff, Vicki Wooll, Vladimir Zelenko. Multifaceted highly targeted sequential multidrug treatment of early ambulatory high-risk SARS-CoV-2 infection (COVID-19). Reviews in Cardiovascular Medicine, 2020, 21(4): 517-530.

Energy: Third Rail of US Politics

This third rail, used to power trains, will electrocute anyone who comes into direct contact with it.

Wikipedia:  The third rail of a nation’s politics is a metaphor for any issue so controversial that it is “charged” and “untouchable” to the extent that any politician or public official who dares to broach the subject will invariably suffer politically. The metaphor comes from the high-voltage third rail in some electric railway systems.

On his first day in office Biden canceled the Keystone XL pipeline permit, and the backlash is immediate from the unions that supported him and now face a punishing loss of middle-class jobs.

“Insulting”- Labor Unions That Endorsed Biden Now Lashing Out At Him is an article at Gateway Pundit.  Excerpts in italics with my bolds.

Joe Biden has already made labor unions regret their support for him. 
He’s only been in office three days.

Several unions that eagerly endorsed President Joe Biden during the 2020 presidential election are now learning the hard way what it means to support Democrat policies.

During his first day in office, the newly-inaugurated president revoked the construction permit for the Keystone XL oil pipeline, thus destroying thousands of jobs.  And not just any jobs — but union jobs.

The Laborers' International Union of North America issued this statement:

“The Biden Administration’s decision to cancel the Keystone XL pipeline permit on day one of his presidency is both insulting and disappointing to the thousands of hard-working LIUNA members who will lose good-paying, middle-class family-supporting jobs,”

“By blocking this 100-percent union project, and pandering to environmental extremists, a thousand union jobs will immediately vanish and 10,000 additional jobs will be foregone.”

This comes after LIUNA bragged about pushing Biden “over the top” in 2020.

North America's Building Trades Unions said this:

“North America’s Building Trades Unions are deeply disappointed in the decision to cancel the Keystone XL permit on the President’s first official day in office. Environmental ideologues have now prevailed, and over a thousand union men and women have been terminated from employment on the project.

On a historic day that is filled with hope and optimism for so many Americans and people around the world, tens of thousands of workers are left to wonder what the future holds for them. In the midst of a pandemic that has claimed 400 thousand American lives and has wreaked havoc on the economic security and standard of living of tens of millions more, we must all stand in their shoes and acknowledge the uncertainty and anxiety this government action has caused.”

The United Association of Union Plumbers and Pipefitters released this statement about Biden canceling the Keystone XL pipeline permit:

“In revoking this permit, the Biden Administration has chosen to listen to the voices of fringe activists instead of union members and the American consumer on Day 1.”

Unions that backed Biden are finding out Biden works for radical Democrats, not labor unions.

 

WordPress Censorship?

A strange thing happened this morning. I wanted to improve some wording in a post from last night, and the edit function came up in the WordPress block editor. To my surprise, some of the text and image blocks were no longer visible, covered with a warning label: "This block contains unexpected or invalid content." The canceled text started with a reference to the last US election and a video image of Biden signing executive orders.

If I open the post in the classic editor, the text is available as I wrote it. And the post still appears without censorship. See: Climate Science Victim of Fake News

Footnote:  As suggested in comment below, here is a screenshot:

Hitting the “Attempt Block Recovery” button creates a blank block.

Climate Science Victim of Fake News

A recent article in the legacy media needed some editorial work in the public interest. Published at the Business Post, it began this way:

Climate science has long been the victim of ‘fake news’ obscuring uncomfortable truths. By pouncing on supposed uncertainties in climate science, big business interests and their supporters in the media divert attention away from the real climate emergency.

Now that is so misleading that a “False Alarm” label should be attached by fact checkers. In their absence, the next best thing is to rewrite to set the record straight and eliminate the falsehoods and hype. So let’s begin again.

Climate science has long been the victim of ‘false alarms’ obscuring the remarkable stability of our climate system. By exaggerating the dangers from extreme weather, entrenched environmental lobbies and ignorant media supporters frighten people for the sake of their tax-subsidized enterprises. (There, fixed.) To Continue:

A climate change awareness rally in Sydney in 2019. Picture: Don Arnold/Getty Images

Climate change is a popular crusade with catchy slogans and many social gatherings to celebrate solidarity. Actual scientific understanding of the climate is hard, lonely work collecting and analyzing data. And simplistic notions about “fighting climate change” are nonsense without rigorous cost and benefit analysis.

To the political classes and wider public still reeling from social and mass media censorship and warped “fact-checking”, and astounded that the world’s leading democracy could see its elections invalidated by a blizzard of lies and backroom vote counting, climate scientists might well say: Don’t be so naive.

Take Phil Jones, a quietly spoken climatologist at the University of East Anglia in England. In 2009, he was caught up in a whistleblower's leak of content from the university's email servers, which was later dubbed "climategate."

There are three threads in particular in the leaked documents which sent a shock wave through informed observers across the world. Perhaps the most obvious was the highly disturbing series of emails which show how Dr Jones and his colleagues had for years been discussing the devious tactics whereby they could avoid releasing their data to outsiders under freedom of information laws.

But the question which inevitably arises from this systematic refusal to release their data is – what is it that these scientists seem so anxious to hide? The second and most shocking revelation of the leaked documents is how they show the scientists trying to manipulate data through their tortuous computer programs, always to point in only the one desired direction – to lower past temperatures and to ‘adjust’ recent temperatures upwards, in order to convey the impression of an accelerated warming. This is what Mr McIntyre caught Dr Hansen doing with his GISS temperature record last year (after which Hansen was forced to revise his record), and two further shocking examples have now come to light from Australia and New Zealand.

The third shocking revelation of these documents is the ruthless way in which these academics have been determined to silence any expert questioning of the findings they have arrived at by such dubious methods – not just by refusing to disclose their basic data but by discrediting and freezing out any scientific journal which dares to publish their critics' work. It seems they are prepared to stop at nothing to stifle scientific debate in this way, not least by ensuring that no dissenting research should find its way into the pages of IPCC reports. (Source: excerpt from John Walker, former Laboratory Medical Director/Pathologist (1984-2011). See: Q&A Why So Many Climate Skeptics)

Background from previous post:

In 2021, there may well be a new deluge of hysterical claims from the usual suspects published at the usual venues comprising legacy and social media.

These outrageous appeals by alarmists in the face of contrary facts remind me of the story defining the term "chutzpah." A young man is convicted of killing his parents, and later appears before the judge for sentencing. Asked to give any last words, he replies: "Go easy on me, your Honor, I'm an orphan."

Fortunately, there is help for climate alarmists. They can join or start a chapter of Alarmists Anonymous. By following the Twelve Step Program, it is possible to recover and unite in service to the real world and humanity.

Step One: Fully concede (admit) to our innermost selves that we were addicted to climate fear mongering.

Step Two: Come to believe that a Power greater than ourselves causes weather and climate, restoring us to sanity.

Step Three: Make a decision to study and understand how the natural world works.

Step Four: Make a searching and fearless moral inventory of ourselves, our need to frighten others and how we have personally benefited by expressing alarms about the climate.

Step Five: Admit to God, to ourselves, and to another human being the exact nature of our exaggerations and false claims.

Step Six: Become ready to set aside these notions and actions we now recognize as objectionable and groundless.

Step Seven: Seek help to remove every single defect of character that produced fear in us and led us to make others afraid.

Step Eight: Make a list of all persons we have harmed and called “deniers”, and become willing to make amends to them all.

Step Nine: Apologize to people we have frightened or denigrated and explain the errors of our ways.

Step Ten: Continue to take personal inventory and when new illusions creep into our thinking, promptly renounce them.

Step Eleven: Dedicate ourselves to gain knowledge of natural climate factors and to deepen our understanding of nature’s powers and ways of working.

Step Twelve: Having awakened to our delusion of climate alarm, we try to carry this message to other addicts, and to practice these principles in all our affairs.

Summary:

With a New Year just beginning, let us hope that many climate alarmists take the opportunity to turn the page by resolving a return to sanity. It is not too late to get right with reality before the cooling comes in earnest.

This is your brain on climate alarm.  Just say No!