Feb. 2021 Polar Vortex Hits Okhotsk Ice


Update Feb. 19, 2021 to previous post

This update is to note a dramatic effect on Okhotsk Sea ice coincident with the Polar Vortex event that froze Texas and other states across the central US.  When Arctic air extends so far south due to the weak and wavy vortex, warmer air replaces the icy air in Arctic regions.  In this case, the deficits to sea ice extent appear mostly in the Sea of Okhotsk in the Pacific.

The graph below shows a sharp drop in ice extent the last three days.

A closer look at the regions shows that Okhotsk peaked at 1.1M km2 on day 37, and lost 217k km2 down to 0.9M km2 yesterday.  That loss, along with Bering's flat extent, makes up 70% of the present deficit to average.

Some comments from Dr. Judah Cohen, Feb. 15, from his AER blog Arctic Oscillation and Polar Vortex Analysis and Forecasts.  Excerpts in italics with my bolds.

I have been writing how the stratospheric PV disruption that has been so influential on our weather since mid-January has been unusual and perhaps even unique in the observational record, so I guess then it should be no surprise that its ending is also highly unusual. I was admittedly skeptical, but it does seem that the coupling between the stratospheric PV and the tropospheric circulation is about to come to an abrupt end.

The elevated polar cap geopotential height anomalies (PCHs) related to what I like to refer to as the third and final PV disruption at the end of January/early February quickly propagates to the surface and even amplifies, peaking this past weekend. And as I have argued, it is during spikes in PCH when severe winter is most likely across the NH mid-latitudes, as demonstrated in Cohen et al. (2018).

But rather than the typical gradual influence from the stratospheric PV disruption over many weeks, maybe akin to the drip, drip, drip of a leaky faucet, the entire signal dropped all at once like an anchor. This also likely contributed to the severity of the current Arctic outbreak in the Central US that is generational and even historical in its severity. But based on the forecast the PV gave all it had all at once, and the entire troposphere-stratosphere-troposphere coupling depicted in Figure ii is about to abruptly end in the next few days.

I am hesitant to bring analogs before 2000 but the extreme cold in Texas did remind me of another winter that brought historic Arctic outbreaks including cold to Texas – January 1977. It does appear that the downward influence from the stratospheric PV to the surface came to an abrupt end at the end of January 1977 . . . Relative to normal, January 1977 was the coldest month for both Eurasia and the US when stratosphere-troposphere coupling was active. But the relative cold did persist in both the Eastern US and northern Eurasia in February post the stratosphere-troposphere coupling. By March the cold weather in the Eastern US was over but persisted for northern Eurasia.

See also No, CO2 Doesn’t Drive the Polar Vortex

Background from Previous Post

In January, most of the Arctic Ocean basins are frozen over, and so the growth of ice extent slows down.  According to SII (Sea Ice Index), January on average adds 1.3M km2, and this January added 1.4M.  (Background is at Arctic Ice Year-End 2020.)  The few basins that can still grow ice this time of year tend to fluctuate, alternately waxing and waning, which appears as a see-saw pattern in these images.

Two weeks into February Arctic ice extents are growing faster than the 14-year average, such that they are approaching the mean.  The graph below shows the ice recovery since mid-January for 2021, the 14-year average and several recent years.

The graph shows a small deficit to average in mid-January, then slow 2021 growth for some days before picking up the pace in the later weeks.  Presently extents are slightly (1%) below average, close to 2019 and 2020 and higher than 2018.

February Ice Growth Despite See Saws in Atlantic and Pacific

As noted above, this time of year the Arctic adds ice on the fringes since the central basins are already frozen over.  The animation above shows Barents Sea on the right (Atlantic side) grew in the last two weeks by 175k km2 and is now 9% greater than the maximum last March.  Meanwhile on the left (Pacific side) Bering below and Okhotsk above wax and wane over this period. Okhotsk is seen growing 210k km2 the first week, then giving half of it back the second week.  Bering waffles up and down, ending slightly higher.

The table below presents ice extents in the Arctic regions for day 44 (Feb. 13) compared to the 14-year average and 2018.

Region | 2021 day 44 | Day 44 Average | 2021-Ave. | 2018 day 44 | 2021-2018   (extents in km2)
 (0) Northern_Hemisphere 14546503 14678564 -132061 14140166 406337
 (1) Beaufort_Sea 1070689 1070254 435 1070445 244
 (2) Chukchi_Sea 966006 965691 315 965971 35
 (3) East_Siberian_Sea 1087120 1087134 -14 1087120 0
 (4) Laptev_Sea 897827 897842 -15 897845 -18
 (5) Kara_Sea 934988 906346 28642 874714 60274
 (6) Barents_Sea 837458 563224 274235 465024 372434
 (7) Greenland_Sea 645918 610436 35482 529094 116824
 (8) Baffin_Bay_Gulf_of_St._Lawrence 1057623 1487547 -429924 1655681 -598058
 (9) Canadian_Archipelago 854597 853146 1451 853109 1489
 (10) Hudson_Bay 1260471 1260741 -270 1260838 -367
 (11) Central_Arctic 3206263 3211892 -5630 3117143 89120
 (12) Bering_Sea 559961 674196 -114235 319927 240034
 (13) Baltic_Sea 116090 94341 21749 76404 39686
 (14) Sea_of_Okhotsk 1027249 930357 96892 911105 116144
 (15) Yellow_Sea 9235 28237 -19002 33313 -24078
 (16) Cook_Inlet 223 11137 -10914 11029 -10806

The table shows that the Bering deficit to average is offset by the surplus in Okhotsk.  Baffin Bay shows the largest deficit, mostly offset by surpluses in Barents, Kara and Greenland Seas.
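To make those offsets concrete, here is a minimal sketch (a hypothetical calculation using only the day-44 values copied from the table above, in km2) that computes each region's anomaly against the 14-year average and sums the Pacific and Atlantic see-saws:

```python
# Day 44 (Feb. 13) extents and 14-year averages, in km2, copied from the table above.
regions = {
    "Bering_Sea":     (559961, 674196),
    "Sea_of_Okhotsk": (1027249, 930357),
    "Baffin_Bay":     (1057623, 1487547),
    "Barents_Sea":    (837458, 563224),
    "Kara_Sea":       (934988, 906346),
    "Greenland_Sea":  (645918, 610436),
}

# Anomaly = 2021 extent minus the 14-year average; positive is a surplus, negative a deficit.
anomaly = {name: extent - average for name, (extent, average) in regions.items()}
for name, value in anomaly.items():
    print(f"{name:15s} {value:+11,d} km2")

# Pacific see-saw: the Okhotsk surplus offsets most of the Bering deficit.
pacific = anomaly["Bering_Sea"] + anomaly["Sea_of_Okhotsk"]
print(f"Bering + Okhotsk net anomaly: {pacific:+,d} km2")

# Atlantic see-saw: Barents, Kara and Greenland surpluses offset most of the Baffin Bay deficit.
atlantic = anomaly["Barents_Sea"] + anomaly["Kara_Sea"] + anomaly["Greenland_Sea"]
print(f"Barents+Kara+Greenland {atlantic:+,d} km2 vs Baffin Bay {anomaly['Baffin_Bay']:+,d} km2")
```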

The polar bears have a Valentine's Day wish for Arctic Ice.


And Arctic Ice loves them back, returning every year so the bears can roam and hunt for seals.

Footnote:

Seesaw accurately describes Arctic ice in another sense:  The ice we see now is not the same ice we saw previously.  It is better to think of the Arctic as an ice blender than as an ice cap, explained in the post The Great Arctic Ice Exchange.

After Counting Mail-in Ballots, Senate Finds Trump Guilty

Babylon Bee has the special report In Mail-In Impeachment Vote, Senate Convicts Trump 8275 To 3.  Excerpts in italics with my bolds.

WASHINGTON, D.C.—In a historic move, the U.S. Senate decided to switch to voting by mail for Trump’s second impeachment trial. After all the votes were counted by an intern in a back room with no cameras, the Senate ruled to convict President Trump of incitement to violence by a vote of 8275 to 3.

“Our holy democracy has spoken,” said Senator Chuck Schumer. “Do not ask any questions or you are a blasphemer against the sacred sacredness of our vote. Everyone can go home now!”

A couple of troublemaking Senators attempted to overthrow the Constitution by bringing up the point that there are only 100 Senators, making it impossible to arrive at a tally of 8275 to 3, but they were quickly removed from the Senate Chambers and condemned for “attempting to suppress the votes of people of color.”

The Senate then moved on to other business, passing universal healthcare by a margin of 320,000 to 4.

Footnote:  SCOTUS Conference on Election Integrity

Humor aside, Election Integrity is up for conference at SCOTUS on Friday.  The petition to be discussed is the complaint by the Republican Party of Pennsylvania against Secretary of the Commonwealth Boockvar, the state's chief election officer, in a proceeding that began on Sept. 28, 2020.  The petition makes clear the intent is not to overturn any completed election, but to ensure future elections are conducted according to laws in force.  From scotusblog:

Republican Party of Pennsylvania v. Boockvar

Issue:  (1) Whether the Pennsylvania Supreme Court usurped the Pennsylvania General Assembly’s plenary authority to “direct [the] Manner” for appointing electors for president and vice president under Article II of the Constitution, as well as the assembly’s broad power to prescribe “[t]he Times, Places, and Manner” for congressional elections under Article I, when the court issued a ruling requiring the state to count absentee ballots that arrive up to three days after Election Day as long as they are not clearly postmarked after Election Day; and (2) whether that decision is preempted by federal statutes that establish a uniform nationwide federal Election Day.

The petition to be discussed is the December 15, 2020 brief from the petitioners Republican Party:

No. 20-542 REPLY BRIEF IN SUPPORT OF PETITION FOR A WRIT OF CERTIORARI

Respondents’ Oppositions only confirm what some Respondents told the Court just weeks ago: that the Court should grant review and resolve the important and recurring questions presented in this case. Pa. Dems. Br. 9, No. 20A54 (Oct. 5, 2020) (advocating for review because the questions presented are “of overwhelming importance for States and voters across the country”); Sec’y Br. 2-3, No. 20A54 (Oct. 5, 2020). Respondents uniformly fail to mention that after the Republican Party of Pennsylvania (RPP) filed its Petition but more than a month before Respondents filed their Oppositions, the Eighth Circuit created a split on the question whether the Electors Clause constrains state courts from altering election deadlines enacted by state legislatures. See Carson v. Simon, 978 F.3d 1051 (8th Cir. 2020). Instead, Respondents seek to obfuscate the matter with a welter of vehicle arguments turning on the fact that Pennsylvania has certified the results of the 2020 general election. In reality, however, this case is an ideal vehicle, in part precisely because it will not affect the outcome of this election.

Indeed, this Court has repeatedly emphasized the imperative of settling the governing rules in advance of the next election, in order to promote the public “[c]onfidence in the integrity of our electoral processes [that] is essential to the functioning of our participatory democracy.” Purcell v. Gonzalez, 549 U.S. 1, 4 (2006). This case presents a vital and unique opportunity to do precisely that. By resolving the important and recurring questions now, the Court can provide desperately needed guidance to state legislatures and courts across the country outside the context of a hotly disputed election and before the next election. The alternative is for the Court to leave legislatures and courts with a lack of advance guidance and clarity regarding the controlling law only to be drawn into answering these questions in future after-the-fact litigation over a contested election, with the accompanying time pressures and perceptions of partisan interest.

Note:  As reported in Gateway Pundit, legally required chain of custody for ballots was broken in every battleground state and in other states as well.

Democrats Were ONLY Able to “Win” in 2020 By Breaking Chain of Custody Laws in EVERY SWING STATE

President Trump was ahead in Pennsylvania by nearly 700,000 votes.
In Michigan Trump was ahead by over 300,000 votes.
In Wisconsin Trump was ahead by 120,000 votes.

Trump was also ahead in Georgia and Nevada.

And President Trump already trounced Joe Biden in Ohio, Florida, and Iowa — three states that ALWAYS go to the eventual presidential winner.

Then suddenly Pennsylvania, Michigan, and Wisconsin announced they would not be announcing their winner that night. This was an unprecedented and coordinated move in US history.

Then many crimes occurred to swing the election to Biden, but perhaps the greatest crime was the lack of dual controls and chain of custody records that ensure a fair and free election. At a high level, when ballots are transferred or changes are made in voting machines, these moves and changes should be done with two individuals present (dual control), one from each party, and the movements of ballots should be recorded.

So when states inserted drop boxes into the election, these changes first needed to be updated through the legislature, which they weren’t, and all movements from the time when the ballots were inserted into drop boxes needed to be recorded, which they weren’t.


Media Chose to Lie, Not Go Broke

Martin Gurri tells the story of how legacy print and TV news descended into deceit and rabble-rousing when faced with decline and eventual bankruptcy.  His article Slouching Toward Post-Journalism at City Journal is a thorough and probing analysis, of which only some excerpts are posted here, in italics with my bolds and images. The journey of the NY Times exemplifies how and why mass media went from informing to inflaming the public.

The New York Times and other elite media outlets have openly embraced advocacy over reporting.

Traditional newspapers never sold news; they sold an audience to advertisers. To a considerable degree, this commercial imperative determined the journalistic style, with its impersonal voice and pretense of objectivity. The aim was to herd the audience into a passive consumerist mass. Opinion, which divided readers, was treated like a volatile substance and fenced off from “factual” reporting.

The digital age exploded this business model. Advertisers fled to online platforms, never to return. For most newspapers, no alternative sources of revenue existed: as circulation plummets to the lowest numbers on record, more than 2,000 dailies have gone silent since the turn of the century. The survival of the rest remains an open question.

Led by the New York Times, a few prominent brand names moved to a model that sought to squeeze revenue from digital subscribers lured behind a paywall. This approach carried its own risks. The amount of information in the world was, for practical purposes, infinite. As supply vastly outstripped demand, the news now chased the reader, rather than the other way around. Today, nobody under 85 would look for news in a newspaper.

Under such circumstances, what commodity could be offered for sale?

During the 2016 presidential campaign, the Times stumbled onto a possible answer. It entailed a wrenching pivot from a journalism of fact to a “post-journalism” of opinion—a term coined, in his book of that title, by media scholar Andrey Mir. Rather than news, the paper began to sell what was, in effect, a creed, an agenda, to a congregation of like-minded souls. Post-journalism “mixes open ideological intentions with a hidden business necessity required for the media to survive,” Mir observes. The new business model required a new style of reporting. Its language aimed to commodify polarization and threat: journalists had to “scare the audience to make it donate.” At stake was survival in the digital storm.

The experiment proved controversial. It sparked a melodrama over standards at the Times, featuring a conflict between radical young reporters and befuddled middle-aged editors. In a crucible of proclamations, disputes, and meetings, the requirements of the newspaper as an institution collided with the post-journalistic call for an explicit struggle against injustice.

The old media had needed happy customers. The goal of post-journalism, according to Mir, is to “produce angry citizens.” The August 2016 article marked the point of no return in the spiritual journey of the New York Times from newspaper of record to Vatican of liberal political furor. While the impulse originated in partisan herd instinct, the discovery of a profit motive would make the change irrevocable. Rutenberg professed to find the new approach “uncomfortable” and, “by normal standards, untenable”—but the fault, he made clear, lay entirely with the “abnormal” Trump, whose toxic personality had contaminated journalism. He was the active principle in the headline “The Challenge Trump Poses to Objectivity.”

A cynic (or a conservative) might argue that objectivity in political reporting was more an empty boast than a professional standard and that the newspaper, in pandering to its audience, had long favored an urban agenda, liberal causes, and Democratic candidates. This interpretation misses the transformation in the depths that post-journalism involved. The flagship American newspaper had turned in a direction that came close to propaganda. The oppositional stance, as Mir has noted, cannot coexist with newsroom independence: writers and editors were soon to be punished for straying from the cause. The news agenda became narrower and more repetitive as journalists focused on a handful of partisan controversies—an effect that Mir labeled “discourse concentration.”  The New York Times, as a purveyor of information and a political institution, had cut itself loose from its own history.

[The Russia Collusion story] was one of the most extraordinary episodes in American politics—and the first sustained excursion into post-journalism by the American news media, led every step of the way by the New York Times.

Future media historians may hold the Trump-Russia story to be a laboratory-perfect specimen of discourse concentration. For nearly two years, it towered over the information landscape and devoured the attention of the media and the public. The total number of articles on the topic produced by the Times is difficult to measure, but a Google search suggests that it was more than 3,000—the equivalent, if accurate, of multiple articles per day for the period in question. This was journalism as if conducted under the impulse of an obsessive-compulsive personality. Virtually every report either implied or proclaimed culpability. Every day in the news marked the beginning of the Trumpian End Times.

The sum of all this sound and fury was . . . zero. The most intensively covered story in history turned out to be empty of content. Mueller’s investigation “did not identify evidence that any US persons conspired or coordinated” with the Russians. Mueller’s halting television appearance in July 2019 convinced even the most vehement partisans that he was not the knight to slay the dragon in the White House. After two years of media frenzy came an awkward moment. The New York Times had reorganized its newsroom to pursue this single story—yet, just as it had missed Trump’s coming, the paper failed to see that Trump would stay.

Yet what looked like journalistic failure was, in fact, an astonishing post-journalistic success. The intent of post-journalism was never to represent reality or inform the public but to arouse enough political fervor in readers that they wished to enter the paywall in support of the cause. This was ideology by the numbers—and the numbers were striking. Digital subscriptions to the New York Times, which had been stagnant, nearly doubled in the first year of Trump’s presidency. By August 2020, the paper had 6 million digital subscribers—six times the number on Election Day 2016 and the most in the world for any newspaper.

The Russian collusion story, though refuted objectively, had been validated subjectively, by the growth in the congregation of the paying faithful.

In throwing out the old textbook, post-journalism made transgression inevitable. In July 2019, Jonathan Weisman, who covered Congress for the Times and happened to be white, questioned on Twitter the legitimacy of leftist members of the House who happened to be black. Following criticism, Weisman deleted the offending tweets and apologized elaborately, but he was demoted nonetheless.

The dramatic confrontation had been triggered by Weisman’s tweets and the heretical headline but was really about the boundaries of expression—what was allowed and what was taboo—in a post-objective, post-journalistic time. On the contentious subjects of Trump and race, managers and reporters at the paper appeared to hold similar opinions. No one in the room defended Trump as a normal politician whose views deserved a hearing. No one questioned the notion that the United States, having elected Trump, was a fundamentally racist country. But as Baquet fielded long and pointed questions from his staff, it became clear that management and newsroom—which translated roughly to middle age and youth—held radically divergent visions of the post-journalism future.

Unlike management, the reporters were active on social media, where they had to face the most militant elements of the subscriber base. In this way, they represented the forces driving the information agenda. Baquet had disparaged Twitter and insisted that the Times would not be edited by social media. He was mistaken. The unrest in the newsroom had been propelled by outrage on the web, and the paper had quickly responded. Generational attitudes, displayed on social media, allowed no space for institutional loyalty. Baquet had demoted Weisman because of his inappropriate behavior—but the newsroom turned against him because he had picked a fight with the wrong enemy.

Two days after the town hall meeting, the New York Times inaugurated, in its magazine section, the “1619 Project”—an attempt, said Baquet, “to try to understand the forces that led to the election of Donald Trump.” Rather than dig deep into the “half of America” that had voted for the president, the newspaper chose to blame the events of 2016 on the country’s pervasive racism, not only here and now but everywhere and always.

The 1619 Project rode the social-justice ambitions of the newsroom to commodify racial polarization—and, not incidentally, to fill the void left by Robert Mueller’s failure to launch.

The project showed little interest in investigative reporting or any other form of old-school journalism. It produced no exposés of present-day injustice. Instead, it sold agenda-setting on a grand scale: the stated mission was to “reframe the country’s history by placing the consequences of slavery and the contributions of black Americans at the center of our national narrative.” The reportorial crunch implicit in this high-minded posture might be summarized as “All the news that’s fit to reframe history.”

The 1619 Project has come under fire for its extreme statements and many historical inaccuracies. Yet critics missed the point of the exercise, which was to stake out polarizing positions in the mode of post-truth: opinions could be transformed into facts if held passionately enough. The project became another post-journalistic triumph for the Times. Public school systems around the country have included the material in their curricula. Hannah-Jones received a Pulitzer Prize for her “sweeping, provocative, and personal essay”—possibly the first award offered for excellence in post-journalism. The focus on race propelled the Times to the vanguard of establishment opinion during the convulsions that followed the death of George Floyd under the knee of a white Minneapolis police officer in May 2020.

That episode replaced the Russia collusion story as the prime manufacturer of “angry citizens” and added an element of inflexibility to the usual rigors of post-journalism. Times coverage of antipolice protests was generally sympathetic to the protesters. Trump was, of course, vilified for “fanning the strife.” But the significant change came in the severe tightening of discourse: the reframing imperative now controlled the presentation of news. Reporting minimized the violence that attended the protests, for example, and sought to keep the two phenomena sharply segregated.

Less than two weeks after Floyd’s death, amid spreading lawlessness in many American cities, the paper posted an opinion piece by Republican senator Tom Cotton in its online op-ed section, titled “Time to Send in the Troops.” It called for “an overwhelming show of force” to pacify troubled urban areas. To many loyal to the New York Times, including staff, allowing Cotton his pitch smacked of treason. Led by young black reporters, the newsroom rebelled.

Once again, the mutiny began on Twitter. Many reporters had large followings; they could appeal directly to readers. In the way of social media, the most excited voices dominated among subscribers. As the base roared, the rebels moved to confront their employer.

The history-reframing mission is now in the hands of a deeply self-righteous group that has trouble discerning the many human stopping places between true and false, good and evil, objective and subjective. According to one poll, a majority of Americans shared the opinion that Cotton expressed in his op-ed. That had no bearing on the discussion. In the letter and the town hall meetings, the rebels wielded the word “truth” as if they owned it. By their lights, Cotton had lied, and the fact that the public approved of his lies was precisely what made his piece dangerous.

Revolutions tend to radicalization. The same is true of social media mobs: they grow ever more extreme until they explode.

But the New York Times is neither of these things—it’s a business, and post-journalism is now its business model. The demand for moral clarity, pressed by those who own the truth, must increasingly resemble a quest for radical conformism; but for nonideological reasons, the demand cannot afford to leave subscriber opinion too far behind. Radicalization must balance with the bottom line.

The final paradox of post-journalism is that the generation most likely to share the moralistic attitude of the newsroom rebels is the least likely to read a newspaper. Andrey Mir, who first defined the concept, sees post-journalism as a desperate gamble, doomed in the end by demographics. For newspapers and their multiple art forms developed over a 400-year history, Mir writes, the collision with the digital tsunami was never going to be a challenge to surmount but rather “an extinction-level event.”


Biden’s Bizarre Climate Charade

David Krayden explains in his Human Events article Joe Biden Thinks He’s Tackling Climate Change, but He’s Really Sacking the U.S. Economy.  Excerpt in italics with my bolds and images.

The Paris Accords and cancelling Keystone is just the beginning of life under the new climate regime.

President Biden’s vision is to “lead a clean energy revolution” that will free the United States from the “pollution” of carbon dioxide by 2035 and have “net-zero emissions” by 2050.

Of course, the President himself will likely not be around to see if the United States achieves either target, even if his insane plan survives successive administrations. Instead, he sits in his chair like a languorous old man assiduously reading his speaking notes from his desk, looking like he is under house arrest. Still, he is governing—or at least, appearing to do so—by executive order, and the sheer mass of those dictates is not just staggering but terrifying.

The new President had barely warmed his Oval Office seat when he announced that the U.S. would return to the Paris climate accord—a job-destroying bit of global authoritarianism that is not worth the diplomatic paper it is printed on, let alone the lavish parties staged while it was being negotiated. Then, he quickly produced an executive order to cancel the XL pipeline. With the flash of another one of those pens that Biden runs through on a daily basis, he canceled 10,000 jobs in the U.S., along with another 3,000 in Canada. And this in the midst of a pandemic that even Biden has called our “dark winter!” Even uber-environmentalist Canadian Prime Minister Justin Trudeau supports the XL pipeline, and promptly said so.

Has President Biden discovered the miracle fuel that is going to make petroleum obsolescent and put the oil industry out of business—even before his administration decides to do it for them? Is that what he was up to during all those months when he cowered in his basement instead of campaigning for the presidency? Clearly, the Biden administration has not thought this through beyond the talking points.

Whether the President chooses to acknowledge it or not, oil will continue to be the principal source of energy for American consumers for quite some time to come—at least until perpetual motion is discovered. That oil that the XL pipeline was supposed to transport from America’s closest ally—Canada—will now have to be brought in by rail, a potentially more dangerous and far less environmentally friendly method than a pipeline.

Fossil fuels remain the overwhelming source of all of America’s energy needs: petroleum and natural gas account for 69% of energy usage, coal 11%, and nuclear power 8%. Renewable energy accounts for 11%, and that includes the wood you burn in the fireplace or woodstove every winter. Solar and wind power account for only a fraction of that 11%.

So clearly, with all his activist policy around climate change, President Biden has America on track for a return trip to the Middle Ages.

And like they did in the Middle Ages, the President expects Americans to have blind faith in the climate change priests who will be integral to his administration. If you don’t think the climate change movement is a religion or at least a passable cult, just listen to how its adherents talk about environmental policy. When Democrats were trying to convince us that the California wildfires were somehow the result of climate change, and not just bad forestry management, House Speaker Nancy Pelosi, sounding more like a pagan devotee than the good Catholic she claims to be, exploded: “Mother earth is angry, she is telling us. Whether she’s telling us with hurricanes in the Gulf Coast, fires in the West, whatever it is, the climate crisis is real.”

So if climate change is the culprit for every Act of God, will President Biden’s plan for Americans to live in caves and shut off the heat actually work? Not without the cooperation of China, which emits 29% of greenhouse gases. Without addressing that reality, we’ll continue to spend untold trillions, lose the energy independence that we gained under former President Donald Trump, and sit in the dark, while China continues to play by its own rules—just as it has throughout the coronavirus pandemic.

What is so undemocratic about President Biden’s climate change plan is that it has been served up as an executive order, without debate, and without Congressional approval. What is so ominous about it is not its specificity—which sounds relatively harmless—but its vagueness and political potential. It’s a veritable environmental Enabling Act that can be used to justify any economic dictate, any security violation, or any foreign policy entanglement. Senate Majority Leader Chuck Schumer (D-NY) publicly advised Biden to “call a climate emergency … He can do many, many things under the emergency powers… that he could do without legislation.”

Even the President’s promise to replace the federal government’s gas-powered vehicles with electric-powered versions is contained in another executive order to “buy American.”

The Biden administration is lying about the economic opportunities embedded in green energy, and its decision to “tackle” climate change is a blatant attempt to appease the left-wing Democrats who see Biden as their puppet. In the process, as he is doing with so many of these executive orders, President Biden is destroying the American economy and naively trusting that brutal dictatorships like China will surrender before a bourgeois fetish like a greenhouse gas reduction target.

So much will be lost for nothing except America’s further prostration to China.

Feb. 2021 Arctic Ice Stays the Course

In January, most of the Arctic Ocean basins are frozen over, and so the growth of ice extent slows down.  According to SII (Sea Ice Index), January on average adds 1.3M km2, and this January added 1.4M.  (Background is at Arctic Ice Year-End 2020.)  The few basins that can still grow ice this time of year tend to fluctuate, alternately waxing and waning, which appears as a see-saw pattern in these images.

Two weeks into February Arctic ice extents are growing faster than the 14-year average, such that they are approaching the mean.  The graph below shows the ice recovery since mid-January for 2021, the 14-year average and several recent years.

The graph shows a small deficit to average in mid-January, then slow 2021 growth for some days before picking up the pace in the later weeks.  Presently extents are slightly (1%) below average, close to 2019 and 2020 and higher than 2018.

February Ice Growth Despite See Saws in Atlantic and Pacific

As noted above, this time of year the Arctic adds ice on the fringes since the central basins are already frozen over.  The animation above shows Barents Sea on the right (Atlantic side) grew in the last two weeks by 175k km2 and is now 9% greater than the maximum last March.  Meanwhile on the left (Pacific side) Bering below and Okhotsk above wax and wane over this period. Okhotsk is seen growing 210k km2 the first week, then giving half of it back the second week.  Bering waffles up and down, ending slightly higher.

The table below presents ice extents in the Arctic regions for day 44 (Feb. 13) compared to the 14-year average and 2018.

Region | 2021 day 44 | Day 44 Average | 2021-Ave. | 2018 day 44 | 2021-2018   (extents in km2)
 (0) Northern_Hemisphere 14546503 14678564 -132061 14140166 406337
 (1) Beaufort_Sea 1070689 1070254 435 1070445 244
 (2) Chukchi_Sea 966006 965691 315 965971 35
 (3) East_Siberian_Sea 1087120 1087134 -14 1087120 0
 (4) Laptev_Sea 897827 897842 -15 897845 -18
 (5) Kara_Sea 934988 906346 28642 874714 60274
 (6) Barents_Sea 837458 563224 274235 465024 372434
 (7) Greenland_Sea 645918 610436 35482 529094 116824
 (8) Baffin_Bay_Gulf_of_St._Lawrence 1057623 1487547 -429924 1655681 -598058
 (9) Canadian_Archipelago 854597 853146 1451 853109 1489
 (10) Hudson_Bay 1260471 1260741 -270 1260838 -367
 (11) Central_Arctic 3206263 3211892 -5630 3117143 89120
 (12) Bering_Sea 559961 674196 -114235 319927 240034
 (13) Baltic_Sea 116090 94341 21749 76404 39686
 (14) Sea_of_Okhotsk 1027249 930357 96892 911105 116144
 (15) Yellow_Sea 9235 28237 -19002 33313 -24078
 (16) Cook_Inlet 223 11137 -10914 11029 -10806

The table shows that the Bering deficit to average is offset by the surplus in Okhotsk.  Baffin Bay shows the largest deficit, mostly offset by surpluses in Barents, Kara and Greenland Seas.

The polar bears have a Valentine's Day wish for Arctic Ice.


And Arctic Ice loves them back, returning every year so the bears can roam and hunt for seals.

Footnote:

Seesaw accurately describes Arctic ice in another sense:  The ice we see now is not the same ice we saw previously.  It is better to think of the Arctic as an ice blender than as an ice cap, explained in the post The Great Arctic Ice Exchange.

IPCC Scenarios Ensure Unreal Climate Forecasts


Figure 5. CO2 emissions (a) and concentrations (b), anthropogenic radiative forcing (c), and global mean temperature change (d) for the three long-term extensions. As in Fig. 3, concentration, forcing, and temperature outcomes are calculated with a simple climate model (MAGICC version 6.8.01 BETA; Meinshausen et al., 2011a, b). Outcomes for the CMIP5 versions of the long-term extensions of RCP2.6 and RCP8.5 (Meinshausen et al., 2011c), as calculated with the same model, are shown for comparison.

Roger Pielke Jr. has a new paper at Science Direct Distorting the view of our climate future: The misuse and abuse of climate pathways and scenarios.  Excerpt in italics with my bolds.

Abstract

Climate science research and assessments under the umbrella of the Intergovernmental Panel on Climate Change (IPCC) have misused scenarios for more than a decade. Symptoms of misuse have included the treatment of an unrealistic, extreme scenario as the world’s most likely future in the absence of climate policy and the illogical comparison of climate projections across inconsistent global development trajectories.

Reasons why such misuse arose include (a) competing demands for scenarios from users in diverse academic disciplines that ultimately conflated exploratory and policy relevant pathways, (b) the evolving role of the IPCC – which extended its mandate in a way that creates an inter-relationship between literature assessment and literature coordination, (c) unforeseen consequences of employing a temporary approach to scenario development, (d) maintaining research practices that normalize careless use of scenarios, and (e) the inherent complexity and technicality of scenarios in model-based research and in support of policy.

Consequently, much of the climate research community is presently off-track from scientific coherence and policy-relevance.

Attempts to address scenario misuse within the community have thus far not worked. The result has been the widespread production of myopic or misleading perspectives on future climate change and climate policy. Until reform is implemented, we can expect the production of such perspectives to continue, threatening the overall credibility of the IPCC and associated climate research. However, because many aspects of climate change discourse are contingent on scenarios, there is considerable momentum that will make such a course correction difficult and contested – even as efforts to improve scenarios have informed research that will be included in the IPCC 6th Assessment.

Discussion of How Imaginary Scenarios Spoil Attempts to Envision Climate Futures

The article above is paywalled, but a previous post reprinted below goes into the background of the role of scenarios in climate modelling, and demonstrates the effects by referring to results from the most realistic model, INMCM5.

Roger Pielke Jr. explains that climate models projections are unreliable because they are based on scenarios no longer bounded by reality.  His article is The Unstoppable Momentum of Outdated Science.  Excerpts in italics with my bolds.

Much of climate research is focused on implausible scenarios of the future, but implementing a course correction will be difficult.

In 2020, climate research finds itself in a similar situation to that of breast cancer research in 2007. Evidence indicates the scenarios of the future to 2100 that are at the focus of much of climate research have already diverged from the real world and thus offer a poor basis for projecting policy-relevant variables like economic growth and carbon dioxide emissions. A course-correction is needed.

In a new paper of ours just out in Environmental Research Letters we perform the most rigorous evaluation to date of how key variables in climate scenarios compare with data from the real world (specifically, we look at population, economic growth, energy intensity of economic growth and carbon intensity of energy consumption). We also look at how these variables might evolve in the near-term to 2040.
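The four variables listed correspond to the factors of the Kaya identity, the standard decomposition behind emissions scenarios. The sketch below shows how the factors multiply out; the values are round, illustrative global figures of roughly the right magnitude, not numbers taken from the paper.

```python
# Kaya identity: CO2 emissions = population x (GDP per person) x (energy per GDP) x (CO2 per energy).
# Round, illustrative global values (roughly 2019 orders of magnitude), NOT figures from the paper.
population       = 7.7e9    # people
gdp_per_capita   = 11_300   # US$ of GDP per person per year
energy_intensity = 6.7e6    # joules of primary energy per US$ of GDP
carbon_intensity = 6.3e-8   # kg of CO2 emitted per joule of primary energy

emissions_kg = population * gdp_per_capita * energy_intensity * carbon_intensity
print(f"Implied fossil-fuel CO2 emissions: {emissions_kg / 1e12:.0f} Gt per year")  # roughly 37 Gt

# A scenario drifts above reality whenever it overstates the growth of any one factor,
# e.g. faster economic growth or a return to rapid coal expansion (higher carbon intensity).
```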

We find that the most commonly-used scenarios in climate research have already diverged significantly from the real world, and that divergence is going to only get larger in coming decades. You can see this visualized in the graph above, which shows carbon dioxide emissions from fossil fuels from 2005, when many scenarios begin, to 2045. The graph shows emissions trajectories projected by the most commonly used climate scenarios (called SSP5-8.5 and RCP8.5, with labels on the right vertical axis), along with other scenario trajectories. Actual emissions to date (dark purple curve) and those of near-term energy outlooks (labeled as EIA, BP and ExxonMobil) all can be found at the very low end of the scenario range, and far below the most commonly used scenarios.

Our paper goes into the technical details, but in short, an important reason for the lower-than-projected carbon dioxide emissions is that economic growth has been slower than expected across the scenarios, and rather than seeing coal use expand dramatically around the world, it has actually declined in many regions.

It is even conceivable, if not likely, that in 2019 the world has passed “peak carbon dioxide emissions.” Crucially, the projections in the figure above are pre-Covid19, which means that actual emissions 2020 to 2045 will be even less than was projected in 2019.

While it is excellent news that the broader community is beginning to realize that scenarios are increasingly outdated, voluminous amounts of research have been and continue to be produced based on the outdated scenarios. For instance, O’Neill and colleagues find that “many studies” use scenarios that are “unlikely.” In fact, in their literature review such “unlikely” scenarios comprise more than 20% of all scenario applications from 2014 to 2019. They also call for “re-examining the assumptions underlying” the high-end emissions scenarios that are favored in physical climate research, impact studies and economic and policy analyses.

Make no mistake. The momentum of outdated science is powerful. Recognizing a considerable amount of climate science to be outdated is, in the words of the late Steve Rayner, “uncomfortable knowledge” — that knowledge which challenges widely-held preconceptions. According to Rayner, in such a context we should expect to see reactions to uncomfortable knowledge that include:

  • denial (that scenarios are off track),
  • dismissal (the scenarios are off track, but it doesn’t matter),
  • diversion (the scenarios are off track, but saying so advances the agenda of those opposed to action) and,
  • displacement (the scenarios are off track but there are perhaps compensating errors elsewhere within scenario assumptions).

Such responses reinforce the momentum of outdated science and make it more difficult to implement a much needed course correction.

Responding to climate change is critically important. So too is upholding the integrity of the science which helps to inform those responses. Identification of a growing divergence between scenarios and the real-world should be seen as an opportunity — to improve both science and policy related to climate — but also to develop new ways for science to be more nimble in getting back on track when research is found to be outdated.

[A previous post is reprinted below since it demonstrates how the scenarios drive forecasting by CMIP6 models, including the example of the best performant model: INMCM5]

Background from Previous Post: Best Climate Model: Mild Warming Forecasted

Links are provided at the end to previous posts describing climate models 4 and 5 from the Institute of Numerical Mathematics in Moscow, Russia.  Now we have forecasts for the 21st Century published for INM-CM5 at Izvestiya, Atmospheric and Oceanic Physics, volume 56, pages 218–228 (July 7, 2020). The article is Simulation of Possible Future Climate Changes in the 21st Century in the INM-CM5 Climate Model by E. M. Volodin & A. S. Gritsun.  Excerpts are in italics with my bolds, along with a contextual comment.

Abstract

Climate changes in 2015–2100 have been simulated with the use of the INM-CM5 climate model following four scenarios: SSP1-2.6, SSP2-4.5, and SSP5-8.5 (single model runs) and SSP3-7.0 (an ensemble of five model runs). Changes in the global mean temperature and spatial distribution of temperature and precipitation are analyzed. The global warming predicted by the INM-CM5 model in the scenarios considered is smaller than that in other CMIP6 models. It is shown that the temperature in the hottest summer month can rise more quickly than the seasonal mean temperature in Russia. An analysis of a change in Arctic sea ice shows no complete Arctic summer ice melting in the 21st century under any model scenario. Changes in the meridional stream function in atmosphere and ocean are studied.

Overview

The climate is understood as the totality of statistical characteristics of the instantaneous states of the atmosphere, ocean, and other climate system components averaged over a long time period.

Therefore, we restrict ourselves to an analysis of some of the most important climate parameters, such as average temperature and precipitation. A more detailed analysis of individual aspects of climate change, such as changes in extreme weather and climate situations, will be the subject of another work. This study is not aimed at a full comparison with the results of other climate models, where calculations follow the same scenarios, since the results of other models have not yet been published in peer reviewed journals by the time of this writing.

The INM-CM5 climate model [1, 2] is used for the numerical experiments. It differs from the previous version, INMCM4, which was also used for experiments on reproducing climate change in the 21st century [3], in the following:

  • an aerosol block has been added to the model, which allows inputting anthropogenic emissions of aerosols and their precursors;
  • the concentrations and optical properties of aerosols are calculated, but not specified, like in the previous version;
  • the parametrizations of cloud formation and condensation are changed in the atmospheric block;
  • the upper boundary in the atmospheric block is raised from 30 to 60 km;
  • the horizontal resolution in the ocean block is doubled along each coordinate; and,
  • the software related to adaptation to massively parallel computers is improved, which allows the effective use of a larger number of compute cores.

The model resolution in the atmospheric and aerosol blocks is 2° × 1.5° in longitude and latitude and 73 levels and, in the ocean, 0.5° × 0.25° and 40 levels. The calculations were performed at supercomputers of the Joint Supercomputer Center, Russian Academy of Sciences, and Moscow State University, with the use of 360 to 720 cores. The model calculated 6–10 years per 24 h in the above configuration.
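For a rough sense of the computational cost those figures imply, the quoted core counts and throughput can be converted to core-hours per simulated model year. The text does not say which core count pairs with which throughput, so this back-of-the-envelope sketch only brackets the range:

```python
# Bracket the cost per simulated model year from the figures quoted above:
# 360-720 cores, and 6-10 simulated years per 24 hours of wall-clock time.
low  = 360 * 24 / 10   # fewest cores at the fastest quoted rate
high = 720 * 24 / 6    # most cores at the slowest quoted rate
print(f"Roughly {low:.0f} to {high:.0f} core-hours per simulated year")  # ~860 to ~2900
```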

Four scenarios were used to model the future climate: SSP1-2.6, SSP2-4.5, SSP3-7.0, and SSP5-8.5. The scenarios are described in [4]. The figure after the abbreviation SSP (Shared Socioeconomic Pathway) is the number of the mankind development path (see the values in [4]). The number after the dash means the radiation forcing (W m–2) in 2100 compared to the preindustrial level. Thus, the SSP1-2.6 scenario is the most moderate and assumes rapid actions which sharply limit and then almost completely stop anthropogenic emissions. Within this scenario, greenhouse gas concentrations are maximal in the middle of the 21st century and then slightly decrease by the end of the century. The SSP5-8.5 scenario is the warmest and implies the fastest climate change. The scenarios are recommended for use in the project on comparing CMIP6 (Coupled Model Intercomparison Project, Phase 6, [5]) climate models.  Each scenario includes the time series of:

  • carbon dioxide, methane, nitrous oxide, and ozone concentrations;
  • emissions of anthropogenic aerosols and their precursors;
  • the concentration of volcanic sulfate aerosol; and
  • the solar constant. 

One model experiment was carried out for each of the above scenarios. It began at the beginning of 2015 and ended at the end of 2100. The initial state was taken from the so-called historical experiment with the same model, where climate changes were simulated for 1850–2014, and all impacts on the climate system were set according to observations. The results of the ensemble of historical experiments with the model under consideration are given in [6, 7]. For the SSP3-7.0 scenario, five model runs were performed, differing in the initial data taken from different historical experiments. The ensemble of numerical experiments is required to increase the statistical confidence of conclusions about climate changes.

[My Contextual Comment inserted Prior to Consideration of Results]

Firstly, the INM-CM5 historical experiment can be read in detail by following a linked post (see Resources at the end), but this graphic summarizes the model hindcasting of past temperatures (GMT) compared to HadCRUTv4.

Figure 1. The 5-year mean GMST (K) anomaly with respect to 1850–1899 for HadCRUTv4 (thick solid black); model mean (thick solid red). Dashed thin lines represent data from individual model runs: 1 – purple, 2 – dark blue, 3 – blue, 4 – green, 5 – yellow, 6 – orange, 7 – magenta. In this and the next figures numbers on the time axis indicate the first year of the 5-year mean.

Secondly, the scenarios are important to understand since they stipulate data inputs the model must accept as conditions for producing forecasts according to a particular scenario (set of assumptions).  The document with complete details referenced as [4] is The Scenario Model Intercomparison Project (ScenarioMIP) for CMIP6.

All the details are written there but one diagram suggests the implications for the results described below.

Figure 5. CO2 emissions (a) and concentrations (b), anthropogenic radiative forcing (c), and global mean temperature change (d) for the three long-term extensions. As in Fig. 3, concentration, forcing, and temperature outcomes are calculated with a simple climate model (MAGICC version 6.8.01 BETA; Meinshausen et al., 2011a, b). Outcomes for the CMIP5 versions of the long-term extensions of RCP2.6 and RCP8.5 (Meinshausen et al., 2011c), as calculated with the same model, are shown for comparison.

As shown, SSP1-2.6 is virtually the same scenario as the former RCP2.6, while SSP5-8.5 is virtually the same as RCP8.5, the wildly improbable scenario (impossible according to some analysts).  Note that FF CO2 emissions are assumed to quadruple in the next 80 years, with atmospheric CO2 rising from 400 to 1000 ppm (+150%).  Bear these suppositions in mind when considering the INMCM5 forecasts below.
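To make those suppositions explicit, the sketch below applies the scenario-naming convention described earlier (pathway number before the dash, 2100 radiative forcing in W/m2 after it) and checks the quoted CO2 increase. The parsing helper is purely illustrative, not part of any CMIP tooling.

```python
# Scenario labels follow the convention described above: "SSP<pathway>-<2100 forcing in W/m2>".
def parse_ssp(label):
    pathway, forcing = label.replace("SSP", "").split("-")
    return int(pathway), float(forcing)

for label in ["SSP1-2.6", "SSP2-4.5", "SSP3-7.0", "SSP5-8.5"]:
    pathway, forcing = parse_ssp(label)
    print(f"{label}: development pathway {pathway}, {forcing} W/m2 of forcing in 2100")

# The high-end supposition quoted above: atmospheric CO2 rising from 400 to 1000 ppm.
print(f"CO2 increase: {(1000 - 400) / 400:.0%}")  # 150%
```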

Results [Continuing From Volodin and Gritsun]

Fig. 1. Changes in the global average surface temperature (K) with respect to the pre-industrial level in experiments according to the SSP1-2.6 (triangles), SSP2-4.5 (squares), SSP3-7.0 (crosses), and SSP5-8.5 (circles) scenarios.

Let us describe some simulation results of climate change in the 21st century. Figure 1 shows the change in the globally averaged surface air temperature with respect to the data of the corresponding historical experiment for 1850–1899. In the warmest SSP5-8.5 scenario (circles), the temperature rises by more than 4° by the end of the 21st century. In the SSP3-7.0 scenario (crosses), different members of the ensemble show warming by 3.4°–3.6°. In the SSP2-4.5 scenario (squares), the temperature increases by about 2.4°. According to the SSP1-2.6 scenario (triangles), the maximal warming by ~1.7° occurs in the middle of the 21st century, and the temperature exceeds the preindustrial temperature by 1.4° by the end of the century.

[My comment: Note that the vertical scale starts with +1.0C as was seen in the historical experiment. Thus an anomaly of 1.4C by 2100 is an increase of only 0.4C, while the SSP2-4.5 result adds 1.4C to the present]. 
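The arithmetic behind that comment, spelled out using only the figures quoted above (the ~1.0C already realized at the start of the runs per the historical experiment, and the 2100 anomalies relative to pre-industrial):

```python
# 2100 anomalies are quoted relative to pre-industrial (1850-1899); the historical run already
# shows about 1.0 C of warming at the start of the scenarios, so warming still to come is the difference.
already_realized = 1.0                                # C above pre-industrial at the start of the runs
anomaly_2100 = {"SSP1-2.6": 1.4, "SSP2-4.5": 2.4}     # C above pre-industrial, quoted above

for scenario, anomaly in anomaly_2100.items():
    further = anomaly - already_realized
    print(f"{scenario}: {anomaly:.1f} C above pre-industrial = {further:.1f} C of further warming")
```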

The results for other CMIP6 models have not yet been published in peer-reviewed journals. However, according to the preliminary analysis (see, e.g., https://cmip6workshop19.sciencesconf.org/data/Session1_PosterSlides.pdf, p.29), the INM-CM5 model shows the lowest temperature increase among the CMIP6 models considered for all the scenarios due to the minimal equilibrium sensitivity to the CO2 concentration doubling, which is ~2.1° for the current model version, like for the previous version, despite new condensation and cloud formation blocks. [For more on CMIP6 comparisons see post Climate Models: Good, Bad and Ugly]

Fig. 2. Differences between the annual average surface air temperatures (K) in 2071–2100 and 1981–2010 for the (a) SSP5-8.5 and (b) SSP1-2.6 scenarios.

The changes in the surface air temperature are similar for all scenarios; therefore, we analyze the difference between temperatures in 2071–2100 and 1981–2010 under the SSP5-8.5 and SSP1-2.6 scenarios (Fig. 2). The warming is maximal in the Arctic; it reaches 10° and 3°, respectively. Other features mainly correspond to CMIP5 data [8], including the INMCM4 model, which participates in the comparison. The warming on the continents of the Northern Hemisphere is about 2 times higher than the mean, and the warming in the Southern Hemisphere is noticeably less than in the Northern Hemisphere. The land surface is getting warmer than the ocean surface in all the scenarios except SSP1-2.6, because the greenhouse effect is expected to weaken in the second half of the 21st century in this scenario, and the higher heat capacity of the ocean prevents it from cooling as quickly as the land.

The changes in precipitation in December–February and June–August for the SSP3-7.0 scenario averaged over five members of the ensemble are shown in Fig. 4. All members of the ensemble show an increase in precipitation in the winter in a significant part of middle and high latitudes. In summer, the border between the increase and decrease in precipitation in Eurasia passes mainly around or to the north of 60°. In southern and central Europe, all members of the ensemble show a decrease in precipitation. Precipitation also increases in the region of the summer Asian monsoon, over the equatorial Pacific, due to a decrease in the upwelling and an increase in ocean surface temperature (OST). The distribution of changes in precipitation mainly corresponds to that given in [6, Fig. 12.22] for all CMIP5 models.

The change in the Arctic sea ice area in September, when the ocean ice cover is minimal over the year, is of interest. Figure 5 shows the sea ice area in September 2015–2019 to be 4–6 million km2 in all experiments, which corresponds to the estimate from observations in [11]. The Arctic sea ice does not completely melt in any of the experiments and under any scenario. However, according to [8, Figs. 12.28 and 12.31], many models participating in CMIP6, where the Arctic ice area is similar to that observed at the beginning of the 21st century, show the complete absence of ice by the end of the 21st century, especially under the RCP8.5 scenario, which is similar to SSP5-8.5.

The reason for these differences is the lower equilibrium sensitivity of the INM-CM5 model.

Note that the scatter of data between experiments under different scenarios in the first half of the 21st century is approximately the same as between different members of the ensemble under the SSP3-7.0 scenario and becomes larger only after 2070. The sea ice area values are sorted in accordance with the radiative forcing of the scenarios only after 2090. This indicates the large contribution of natural climate variability into the Arctic ice area. In the SSP1-2.6 experiment, the Arctic ice area at the end of the 21st century approximately corresponds to its area at the beginning of the experiment.

Climate changes can be also traced in the ocean circulation. Figure 6 shows the change in the 5-year averaged intensity of the Atlantic meridional circulation, defined as the maximum of the meridional streamfunction at 32° N. All experiments show a decrease in the intensity of meridional circulation in the 21st century and natural fluctuations against this decrease. The decrease is about 4.5–5 Sv for the SSP5-8.5 scenario, which is close to values obtained in the CMIP5 models [8, Fig. 12.35] under the RCP8.5 scenario. Under milder scenarios, the weakening of the meridional circulation is less pronounced. The reason for this weakening of the meridional circulation in the Atlantic, as far as we know, is not yet fully understood.
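For readers unfamiliar with the metric: the meridional streamfunction is conventionally the northward velocity integrated zonally across the basin and vertically from the bottom, and the AMOC intensity quoted here is its maximum over depth at 32° N, expressed in sverdrups (1 Sv = 10^6 m3/s). A sketch of the standard definition, not taken from the paper:

```latex
\psi(y,z) \;=\; \int_{x_W(y)}^{x_E(y)} \int_{-H}^{z} v(x,y,z')\,\mathrm{d}z'\,\mathrm{d}x ,
\qquad
\mathrm{AMOC} \;=\; \max_{z}\, \psi(32^{\circ}\mathrm{N},\, z)
```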

Conclusion

Numerical experiments have been carried out to reproduce climate changes in the 21st century according to four scenarios of the CMIP6 program [4, 5], including an ensemble of five experiments under the SSP3-7.0 scenario. The changes in the global mean surface temperature are analyzed. It is shown that the global warming predicted by the INM-CM5 model is the lowest among the currently published CMIP6 model data. The geographical distribution of changes in the temperature and precipitation is considered. According to the model, the temperature in the warmest summer month will increase faster than the summer average temperature in Russia.

None of the experiments show the complete melting of the Arctic ice cover by the end of the 21st century. Some changes in the ocean dynamics, including the flow velocity and the meridional stream function, are analyzed. The changes in the Hadley and Ferrel circulation in the atmosphere are considered.

Resources:

Climate Models: Good, Bad and Ugly

2018 Update: Best Climate Model INMCM5

Temperatures According to Climate Models

Government of the elite, by the elite and for the elite.

It wasn’t supposed to be this way.  Along with removing Lincoln’s name from schools (San Francisco), the left is turning his words upside down.

As Frieda Vizel explains at Tablet, the threat in our time is the enslavement of ordinary citizens through the imposition of a political orthodoxy favored by some elite people and wannabes.  Her article is Conformity to the Good Is Not Good.  Excerpts in italics with my bolds and added images.

These days, stringent orthodoxies seem to be the only thing on tap

These days, sadly, stringent orthodoxies seem to be the only thing on tap. For some, the dogma requires blind faith in partisan politics, even as our elected officials stoop ever lower in their conduct. For some, the dogma calls for believing our intellectual betters know best, even as their decisions—to continue and champion lockdowns as the best way to fight COVID-19, for example, even as studies show that lockdowns don’t work—run contrary to observable reality and create an endless stream of problems. Question these dogmas, question the fiery loyalty to the tribe of like-minded believers, and you meet an opposition just as vicious and close-minded as anything I’d experienced when I informed my former friends and neighbors that I was leaving the fold.

To an extent, that shouldn’t come as much of a surprise. Most people are more or less conformists. That’s how we organize as communities, as an interconnected social species. As blogger and entrepreneur Paul Graham has noted, for this interconnectedness to work, human beings must largely not only obey rules but, frequently, enforce them when they see others deviating from the norm; only a small minority swim upstream, and those could as easily end up compromising the collective as leading it to new great heights.

If my own experience taught me anything, however, it’s that Graham is only partly right: Personalities matter, but social context may matter more. How conformist we are depends on our place in society, and how much we stand to lose or gain by conforming. Oftentimes, we don’t so much conform as stay away from the can of worms that will be too much trouble if opened. As I learned all too well in my former life, in a system like that, where people are afraid to speak because they don’t want to face the consequences, it’s the bullies that thrive.

When the most cherished value is adherence to an agreed-upon, not-to-be-questioned set of beliefs, when you walk around feeling—as most Americans currently feel—that you’re not free to speak your mind, you can expect little but stasis, or, worse, a descent into a stifling social order of repression and coercion. I’ve seen this happen in my own community. Dear reader, we do not want this to be the case in America.

I get, of course, that most people who conform to a system, even one they feel is needlessly harsh about demanding obedience, do so because they believe they’re doing something good. Sure, you may think, a bunch of people on my side, in my tribe, are much too zealous, but they mean well so I’m just going to go ahead and play for the team and not worry too much about what I can or can’t say in public. Maybe I believe that the right is too hospitable to radical groups that recycle dangerous old racial theories, but I’m conservative and don’t want to give liberals the satisfaction. Maybe I think that political correctness has gone way too far, but I’m a liberal and don’t want to say anything that would give conservatives pleasure. That sort of self-censorship is precisely how communities grow more and more airless.

After I left my community, and after the psychological wounds of seeing basically everyone I ever knew and loved turn on me healed sufficiently, one thing that gave me great strength and comfort was knowing that no matter how hurt I still felt, I was at least free to think, to speak my mind, to make decisions based on my own judgments and ideas rather than follow the herd. This, to me, is what America is very much about, and I’ve come to know it enough to realize that this freedom is neither free nor easy. It depends on our collective commitment to standing up to any and all attempts to require groupthink and conformity at all costs. Now, then, is the time for all those well-meaning people who told me I was courageous for leaving a fundamentalist sect to in turn be brave and leave their own narrow-minded silos. It’s time for us all to be brave.

Advance Briefing for Glasgow COP 2021

 

Presently the next climate Conference of Parties is scheduled for Glasgow this November, Covid allowing.  (People used to say “God willing”, or “Weather permitting”, but nowadays it’s a virus in charge.)  Actually, climate hysteria is like a seasonal sickness.  Each year a contagion of anxiety and fear is created by disinformation going viral in both legacy and social media in the run up to the autumnal COP (postponed last year due to pandemic travel restrictions).  Now that climatists have put themselves at the controls of the formidable US federal government, we can expect the public will be hugely hosed with alarms over the next few months.  Before the distress signals go full tilt, individuals need to inoculate themselves against the false claims, in order to build some herd immunity against the nonsense the media will promulgate. This post is offered as a means to that end.

Media Climate Hype is a Cover Up

Back in 2015 in the run up to Paris COP, French mathematicians published a thorough critique of the raison d’etre of the whole crusade. They said:

Fighting Global Warming is Absurd, Costly and Pointless.

  • Absurd because of no reliable evidence that anything unusual is happening in our climate.
  • Costly because trillions of dollars are wasted on immature, inefficient technologies that serve only to make cheap, reliable energy expensive and intermittent.
  • Pointless because we do not control the weather anyway.

The prestigious Société de Calcul Mathématique (Society for Mathematical Calculation) issued a detailed 195-page White Paper presenting a blistering point-by-point critique of the key dogmas of global warming. The synopsis with links to the entire document is at COP Briefing for Realists

Even without attending to their documentation, you can tell they are right because all the media climate hype is concentrated against those three points.

Finding: Nothing unusual is happening with our weather and climate.
Hype: Every metric or weather event is “unprecedented,” or “worse than we thought.”

Finding: Proposed solutions will cost many trillions of dollars for little effect or benefit.
Hype: Zero carbon will lead the world to do the right thing.  Anyway, the planet must be saved at any cost.

Finding: Nature operates without caring what humans do or think.
Hype: Any destructive natural event is blamed on humans burning fossil fuels.

How the Media Throws Up Flak to Defend False Suppositions

The Absurd Media:  Climate is Dangerous Today, Yesterday It was Ideal.

Billions of dollars have been spent researching any and all negative effects from a warming world: Everything from Acne to Zika virus.  A recent Climate Report repeats the usual litany of calamities to be feared and avoided by submitting to IPCC demands. The evidence does not support these claims. An example:

 It is scientifically established that human activities produce GHG emissions, which accumulate in the atmosphere and the oceans, resulting in warming of Earth’s surface and the oceans, acidification of the oceans, increased variability of climate, with a higher incidence of extreme weather events, and other changes in the climate.

Moreover, leading experts believe that there is already more than enough excess heat in the climate system to do severe damage and that 2C of warming would have very significant adverse effects, including resulting in multi-meter sea level rise.

Experts have observed an increased incidence of climate-related extreme weather events, including increased frequency and intensity of extreme heat and heavy precipitation events and more severe droughts and associated heatwaves. Experts have also observed an increased incidence of large forest fires; and reduced snowpack affecting water resources in the western U.S. The most recent National Climate Assessment projects these climate impacts will continue to worsen in the future as global temperatures increase.

Alarming Weather and Wildfires

But: Weather is not more extreme.


And Wildfires were worse in the past.
But: Sea Level Rise is not accelerating.


Litany of Changes

Seven of the ten hottest years on record have occurred within the last decade; wildfires are at an all-time high, while Arctic Sea ice is rapidly diminishing.

We are seeing one-in-a-thousand-year floods with astonishing frequency.

When it rains really hard, it’s harder than ever.

We’re seeing glaciers melting, sea level rising.

The length and the intensity of heatwaves has gone up dramatically.

Plants and trees are flowering earlier in the year. Birds are moving polewards.

We’re seeing more intense storms.

But: Arctic Ice has not declined since 2007.

But: All of these are within the range of past variability.

In fact our climate is remarkably stable, compared to the range of daily temperatures during a year where I live.

And many aspects follow quasi-60 year cycles.

The Impractical Media:  Money is No Object in Saving the Planet.

Here it is blithely assumed that the court can rule the seas to stop rising, heat waves to cease, and Arctic ice to grow (though why we would want that is debatable).  All this will be achieved by leaving fossil fuels in the ground and powering civilization with windmills and solar panels.  While admitting that our way of life depends on fossil fuels, they ignore the inadequacy of renewable energy sources at their present immaturity.

 

An Example:
The choice between incurring manageable costs now and the incalculable, perhaps even irreparable, burden Youth Plaintiffs and Affected Children will face if Defendants fail to rapidly transition to a non-fossil fuel economy is clear. While the full costs of the climate damages that would result from maintaining a fossil fuel-based economy may be incalculable, there is already ample evidence concerning the lower bound of such costs, and with these minimum estimates, it is already clear that the cost of transitioning to a low/no carbon economy are far less than the benefits of such a transition. No rational calculus could come to an alternative conclusion. Defendants must act with all deliberate speed and immediately cease the subsidization of fossil fuels and any new fossil fuel projects, and implement policies to rapidly transition the U.S. economy away from fossil fuels.

But: CO2’s relation to temperature is inconsistent.

But: The planet is greener because of rising CO2.

But: Modern nations (G20) depend on fossil fuels for nearly 90% of their energy.

But: Renewables are not ready for prime time.

People need to know that adding renewables to an electrical grid presents both technical and economic challenges.  Experience shows that once intermittent power exceeds about 10% of the baseload, the reliability of supply becomes precarious.  South Australia is demonstrating this with a series of blackouts when the grid cannot be balanced.  Germany reached a higher percentage by dumping its excess renewable generation onto neighboring countries until the EU finally woke up and stopped it. Texas got up to 29% by dumping onto neighboring states, and some states like Georgia are having problems.

But more dangerous is the way renewables destroy the economics of electrical power.  Seasoned energy analyst Gail Tverberg writes:

In fact, I have come to the rather astounding conclusion that even if wind turbines and solar PV could be built at zero cost, it would not make sense to continue to add them to the electric grid in the absence of very much better and cheaper electricity storage than we have today. There are too many costs outside building the devices themselves. It is these secondary costs that are problematic. Also, the presence of intermittent electricity disrupts competitive prices, leading to electricity prices that are far too low for other electricity providers, including those providing electricity using nuclear or natural gas. The tiny contribution of wind and solar to grid electricity cannot make up for the loss of more traditional electricity sources due to low prices.

These issues are discussed in more detail in the post Climateers Tilting at Windmills

The Irrational Media:  Whatever Happens in Nature is Our Fault.

An Example:

Other potential examples include agricultural losses. Whether or not insurance reimburses farmers for their crops, there can be food shortages that lead to higher food prices (that will be borne by consumers, that is, Youth Plaintiffs and Affected Children). There is a further risk that as our climate and land use pattern changes, disease vectors may also move (e.g., diseases formerly only in tropical climates move northward).[36] This could lead to material increases in public health costs.

But: Actual climate zones are local and regional in scope, and they show little boundary change.

But: Ice cores show that it was warmer in the past, not due to humans.

The hype is produced by computer programs designed to frighten and distract children and the uninformed.  For example, there was mention above of “multi-meter” sea level rise.  It is all done with computer models; below, for instance, is San Francisco.  More at USCS Warnings of Coastal Floodings

In addition, there is no mention that GCM projections are running about twice as hot as observations.

Omitted is the fact that GCMs correctly replicate tropospheric temperature observations only when CO2 warming is turned off.

Figure 5. Simplification of IPCC AR5 shown above in Fig. 4. The colored lines represent the range of results for the models and observations. The trends here represent trends at different levels of the tropical atmosphere from the surface up to 50,000 ft. The gray lines are the bounds for the range of observations, the blue for the range of IPCC model results without extra GHGs, and the red for IPCC model results with extra GHGs. The key point displayed is the lack of overlap between the GHG model results (red) and the observations (gray). The non-GHG model runs (blue) overlap the observations almost completely.

In the effort to proclaim scientific certainty, neither the media nor IPCC discuss the lack of warming since the 1998 El Nino, despite two additional El Ninos in 2010 and 2016.

Further they exclude comparisons between fossil fuel consumption and temperature changes. The legal methodology for discerning causation regarding work environments or medicine side effects insists that the correlation be strong and consistent over time, and there be no confounding additional factors. As long as there is another equally or more likely explanation for a set of facts, the claimed causation is unproven. Such is the null hypothesis in legal terms: Things happen for many reasons unless you can prove one reason is dominant.

Finally, advocates and IPCC are picking on the wrong molecule. The climate is controlled not by CO2 but by H2O. Oceans make climate through the massive movement of energy involved in water’s phase changes from solid to liquid to gas and back again. From those heat transfers come all that we call weather and climate: Clouds, Snow, Rain, Winds, and Storms.

Esteemed climate scientist Richard Lindzen ended a very fine recent presentation with this description of the climate system:

I haven’t spent much time on the details of the science, but there is one thing that should spark skepticism in any intelligent reader. The system we are looking at consists in two turbulent fluids interacting with each other. They are on a rotating planet that is differentially heated by the sun. A vital constituent of the atmospheric component is water in the liquid, solid and vapor phases, and the changes in phase have vast energetic ramifications. The energy budget of this system involves the absorption and reemission of about 200 watts per square meter. Doubling CO2 involves a 2% perturbation to this budget. So do minor changes in clouds and other features, and such changes are common. In this complex multifactor system, what is the likelihood of the climate (which, itself, consists in many variables and not just globally averaged temperature anomaly) is controlled by this 2% perturbation in a single variable? Believing this is pretty close to believing in magic. Instead, you are told that it is believing in ‘science.’ Such a claim should be a tip-off that something is amiss. After all, science is a mode of inquiry rather than a belief structure.

Summary:  From this we learn three things:

Climate warms and cools without any help from humans.

Warming is good and cooling is bad.

The hypothetical warming from CO2 would be a good thing.

 

Green Cotton Candy Climate Science

With Biden replacing the skeptical Trump presidency, we are seeing how extensive is the mass addiction to climate pseudoscience.  As children, many of us experienced cotton candy at a county fair, circus midway or amusement park.  This science is like that:  a sugar high from a fluffy, vaporous cloud lacking any calories of substance.  How strange now to see elite adults addicted to this green stuff; people like politicians, financiers, judges and captains of industry, in addition to the children and teachers who have swallowed green junk science for decades.

Richard Schulman explains in his Founders Broadsheet article The false science behind the Biden green program.  Excerpts in italics with my bolds and some added images.

Little further warming will occur at present levels of atmospheric CO2, even if present levels were to double from 400 ppm to 800 ppm.

Senator Ed Markey (D-MA) and Congresswoman Alexandria Ocasio-Cortez (D-NY) in 2019. Markey is the Senate author of the Green New Deal, the inspiration behind the Biden administration’s present green climate initiatives. (GreenNewDeal Presser)

The green program behind President Biden’s post-inaugural flood of executive orders is unscientific and deficient in economic and geostrategic common sense. The major media have endeavored to conceal that truth by means of three decades of spurious climate scare stories.

The claim that humanity faces an imminent global warming catastrophe because of its use of fossil fuels is junk science.

The central lie pushed by major media and the UN’s Intergovernmental Panel on Climate Change (IPCC) is that increases in atmospheric CO2 from fossil-fuel use will keep boosting global temperatures. This supposedly will be further multiplied catastrophically by positive feedback: as oceans warm, atmospheric water vapor, another greenhouse gas, will increase. This will foster more warming, which will further warm the oceans, generating more atmospheric water vapor, and so on. The result of these alleged positive feedback mechanisms for CO2 and H2O, the narratives claim, will be that Earth ends up being a hot lifeless planet like Venus.

The feedbacks are negative

If such positive feedbacks existed, Earth already would have turned into another Venus. The reason this hasn’t happened is because negative feedback mechanisms operate for both CO2 and H2O. These the IPCC and media have studiously avoided discussing before the public:

The truth of the matter is that increases in atmospheric CO2 have rapidly diminishing power as a greenhouse gas. Little further warming will occur at present levels of atmospheric CO2, even if present levels were to double from 400 ppm to 800 ppm.

This has been explained recently in papers by W. A. van Wijngaarden and W. Happer (“Dependence of Earth’s Thermal Radiation on Five Most Abundant Greenhouse Gases”), William Happer (“Radiation Transfer”), and Howard Hayden (“CO2 and Climate: A Tutorial”).

William Happer is Professor Emeritus in Physics at Princeton University. The Happer paper, “Radiation Transfer,” was issued as one of nine “Red Team” type papers by the Executive Office of the President of the United States during the last days of the Trump administration. They were subsequently denied that White House imprimatur by Trump Science Advisor, Kelvin Droegemeier. Climate scientist Roy Spencer, another of the “Red Team” authors, believes Droegemeier did this “for political purposes and professional gain.” Droegemeier served on the National Science Board for 12 years during the George W. Bush and Obama administrations. From 2012 to 2016, he served as Obama’s Vice Chairman of the NSB.

Fears of CO2 Are Overblown

Present atmospheric CO2 levels are around 410 parts per million. The global warming catastrophists animating the Biden administration believe that further increases will doom the planet. Biden’s climate envoy John Kerry, the latest in a long line of failed prophets of doom, gives the planet just nine more years unless drastic action is taken against fossil fuel consumption.

Hayden writes that: “[T]he warming that would be caused by the next 400 ppmv (parts per million by volume) of CO2 would be about one-tenth as much as caused by the first 400 ppmv. Again, this is likely an overestimate.”

In “Radiation Transfer,” Happer writes:

At the mean distance of Earth from the Sun, sunlight carries an energy flux of about 1,360 Watts per square meter (Wm-2) …. [F]or cloud-free temperate latitudes, doubling the concentration of carbon dioxide would decrease thermal radiation to space by about 3 Wm-2.

Seasonal warming much greater than doubled CO2 warming

By way of comparison, Happer writes, because of the Earth’s elliptical orbit, which modestly modifies the distance between Sun and Earth during its annual orbit, “there is 91 Wm-2 change [in] flux from summer to winter.” In other words, the extra heating of Earth by a doubling of CO2 from present values would produce just 3.3% of the heat difference regularly observed on Earth between January and July. “Great efforts are needed to concoct a ‘scientific’ argument that 3 Wm-2 is worth worrying about,” Happer concludes, adding that

Any doubling of carbon dioxide concentrations will produce the same 3 Wm-2 decrease of flux to space whether we consider doubling the pre-industrial value of 280 parts per million (ppm) to 560 ppm, which could happen by about the year 2100 at the current rate of increase around 2 ppm/year. Doubling the current 410 ppm atmospheric concentration to 820 ppm would take about two centuries.
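The “same forcing per doubling” point reflects the roughly logarithmic dependence of CO2 forcing on concentration. The sketch below uses the common simplified expression ΔF ≈ 5.35 ln(C/C0) W/m² (Myhre et al. 1998), which yields about 3.7 W/m² per doubling globally; that coefficient is an assumption here, distinct from Happer’s line-by-line value of about 3 W/m² for cloud-free temperate latitudes.

```python
import math

def co2_forcing(c_new, c_ref, alpha=5.35):
    """Simplified logarithmic CO2 radiative forcing in W/m^2 (Myhre et al. 1998)."""
    return alpha * math.log(c_new / c_ref)

# Any doubling gives the same forcing, whichever baseline you start from:
print(co2_forcing(560, 280))   # ~3.7 W/m^2 for 280 -> 560 ppm
print(co2_forcing(820, 410))   # ~3.7 W/m^2 for 410 -> 820 ppm

# Happer's comparison with the seasonal swing in incoming flux:
doubling = 3.0                 # W/m^2, his cloud-free temperate-latitude value
seasonal = 91.0                # W/m^2, summer-to-winter flux change
print(f"{100 * doubling / seasonal:.1f}% of the seasonal change")  # ~3.3%
```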

The physics of diminishing CO2 greenhouse behavior

The reason that CO2 exhibits diminishing returns as a greenhouse gas (GHG), and did so even before present atmospheric levels were reached, is that to be a GHG, an atmospheric gas must prevent infrared radiation from escaping into space. CO2 does this only in a very narrow band of infrared frequencies (the 14 to 16 micrometer band). This band tends towards saturation quickly, at much lower atmospheric levels of CO2 than at present. Meanwhile, radiation across the rest of the much larger spectrum of infrared frequencies escapes into space, thereby maintaining Earth at an equilibrium temperature.

This is one of the important negative feedback mechanisms that prevents Earth from becoming a Venus.
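To see why a nearly opaque absorption band responds so weakly to more absorber, consider a toy Beer-Lambert calculation: the fraction of band radiation absorbed is 1 − exp(−τ), with optical depth τ proportional to concentration. This is only an illustration of the saturation idea, not a line-by-line radiative transfer result; the optical depths below are made up.

```python
import math

def absorbed_fraction(tau):
    """Beer-Lambert: fraction of radiation in the band absorbed at optical depth tau."""
    return 1.0 - math.exp(-tau)

# Toy numbers: once the band is optically thick (tau >> 1), doubling the
# absorber adds almost nothing to the absorbed fraction.
for tau in (0.5, 1, 5, 10, 20):
    gain = absorbed_fraction(2 * tau) - absorbed_fraction(tau)
    print(f"tau={tau:>4}: absorbed {absorbed_fraction(tau):.4f}, "
          f"extra from doubling {gain:.6f}")
```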

Why isn’t this obvious to the climate catastrophists backing the UN’s IPCC? There are two approaches to this question: one, scientific; the other sociological. The scientific rationale, if there is a respectable rationale to be made, is described by Hayden as follows:

[W]e know from satellite measurements that the temperature rise since about 1979 has been almost 0.6º C, far above that caused directly by CO2. Climate scientists are fully aware of these numbers. They know that increasing CO2 concentration—by itself—has little effect on temperature even if the amount doubles. The claim is that the warming is amplified by the increase in the H2O greenhouse effect.

If this claim is wrong, any warming beyond the little that can be attributed to increased CO2 must be of natural origin (or changes in land use, not fossil fuels).

H2O greenhouse behavior also diminishes, Hayden adds:

In case you are wondering why the earth did not bootstrap itself into boiling temperatures during the Eemian Interglacial, the Holocene Climate Optimum, the Minoan Warm Period, the Roman Warm Period, the Medieval Warm Period, or thousands of other warmings, the answer is that the climate is not controlled by positive feedback—where hot weather begets even hotter weather—but by negative feedback—where, as things get hotter, they shed more heat. For the last half-billion years, this negative feedback system has kept the temperature of the surface of the earth within a few percent of its present 288 K.

Water vapor also is subject to negative feedback. Increased atmospheric levels of it condense as clouds, rain, snow, or ice. Nor does it escape the same saturation effect to which CO2 is subject. Wijngaarden and Happer write:

[A]t current concentrations, the forcings [greenhouse gas behaviors] from all greenhouse gases are saturated. The saturations of the abundant greenhouse gases H2O and CO2 are so extreme that the per-molecule forcing is attenuated by four orders of magnitude with respect to the optically thin values [values at lower concentrations]. Saturation also suppresses the forcing power per molecule for the less abundant greenhouse gases O3, N2O and CH4 from their optically thin values, but far less than for H2O and CO2.

The cheapening of climate science programs

A compelling sociological explanation for the perspectives of the CO2 catastrophists is provided by climate scientist Judith Curry. In an interview with Christopher Balkaran of the Strong and Free podcast channel, she explains what happened to the field of climate studies between when she began to study climate at graduate school in the late seventies and early eighties and the present.

Curry: Climate change wasn’t a really big issue at that point. At the time, it was all about geophysical fluid dynamics, trying to understand the circulations of the atmosphere and the ocean, radiative transfer, cloud physics. It was very physics based. I would hear in the media about people talking about, Oh, the ice age is coming, or doom and gloom from CO2 emissions, but nobody was really paying attention to all that very much in terms of what I would say the mainstream field until the late 1980s, really.

[D]efenders of the IPCC started pushing the idea that anybody who doubts us or challenges us, they are in the pay of big oil. After that, it became much more difficult to really challenge all that. And certainly by the turn of the century, anybody who was questioning the hockey stick or any of these other things were slammed as deniers and ostracized. And then after Climategate in 2010, the consensus enforcers became very militant. So it’s a combination of politics, and some mediocre scientists trying to protect their careers. [T]hey saw this whole thing as a way for career advancement, and it gives them a seat at the big table and political power….

I was old school at the University of Chicago with geophysical fluid dynamics and all this really hard stuff…. There’s very few universities that have good programs in climate dynamics at this point…. Climate dynamics is still there, but it’s far from dominant…. [T]here are all these new degree programs spinning up in climate, that are far away from the geo-physical roots. These new programs combine policy with a little bit of science and economics and whatever… [T]he science part of it basically gets minimized. And that’s where all the students are running to… leaving a talent dearth of people with the good mathematical physical mindset and wanting to enter into the more challenging fields.

Conclusion: Retribution in 2022?

In conclusion, the Biden administration is basing its entire economic and geopolitical policies on bad CO2 science.

The policies are harmful to working people here and abroad. The policies abandon US fossil fuel strengths to the benefit of a hostile China, Russia, and Iran. They damage the US economy. As Sen. John Barrasso (R-Wyo.) pointed out in a USA Today op-ed,

A federal leasing ban would kill an estimated 62,000 jobs in New Mexico, nearly 120,000 in Texas and more than 33,000 in my home state of Wyoming next year alone, according to the American Petroleum Institute. It will also eliminate hundreds of millions of dollars in revenue that these states depend on for public schools, roads, water projects and other essential services. In response to the Keystone cancellation, TC Energy has announced 1,000 layoffs, and the potential union jobs lost could be 10 times higher.

If there’s something approximating a fair election in 2022, Democrats will pay a price for their unscientific climate policy, so contrary to the national interest, in the House of Representatives and Senate.

Rapid Covid Tests Saving the Day

Finally, the promise of rapid Covid tests is paying off because public officials put them to use.  An article at Epoch Times explains: Calls Grow for US to Rely on Rapid Tests to Fight Pandemic  Excerpts in italics with my bolds.

When a Halloween party sparked a COVID-19 outbreak at North Carolina Agricultural and Technical State University, school officials conducted rapid screening on more than 1,000 students in a week, including many who didn’t have symptoms.

Although such asymptomatic screening isn’t approved by regulators and the 15-minute tests aren’t as sensitive as the genetic one that can take days to yield results, the testing director at the historically black college credits the approach with quickly containing the infections and allowing the campus to remain open.

“Within the span of a week, we had crushed the spread. If we had had to stick with the PCR test, we would have been dead in the water,” said Dr. Robert Doolittle, referring to the polymerase chain reaction test that is considered the gold standard by many doctors and Food and Drug Administration regulators.

Compounding the problem is that an estimated 40% of people infected don’t develop symptoms. It’s among these silent spreaders that Mina says rapid tests have the clear advantage over lab tests. With its medical precision, he argues that the PCR test continues to detect COVID-19 in many people who have already fought off the virus and are no longer contagious. The rapid test, while less sensitive, is better at quickly catching the virus during the critical early days of infection when it can spread explosively through communities.

“This isn’t a clinical test — it’s a public health screening test,” Mina said.

The case for widescale rapid testing is getting a boost from universities and school systems that have used the approach to stay open through the latest waves of the pandemic. And proponents point to apparent success stories like the small European nation of Slovakia, which saw infections drop after screening two-thirds of its roughly 5 million people with the tests.

And from the Jerusalem Post, a report Israel is following the Slovakian approach: Rapid COVID-19 tests reduce morbidity in Slovakia by 60% in one week.  Excerpts in italics with my bolds.

Quidel’s Sofia kits for rapid coronavirus testing. (photo credit: SOFIA ISRAEL)

The Slovakian model for combating the spread of the virus has proven itself time and time again • pandemic could be completely eradicated worldwide within six weeks

Perhaps the most widely used coronavirus rapid test kit is the Sofia kit developed by American company Quidel. In the past couple of months, Israel has been receiving shipments of the rapid test kits and deploying them throughout the country, while connecting them to systems used by the Health Ministry.

“Rapid and large-scale tests have proven to be efficient in dealing with the coronavirus pandemic around the globe, at least until the long-anticipated vaccine reaches the entire population, and probably after as well,” a statement by Sofia Israel said.

The 300 kits of Sofia which were purchased by Israel have been distributed to retirement homes, medical clinics, health funds and IDF bases, as well as to Israel Police and Prison Services facilities. Sofia Israel has noted that this means Israel would be able to adopt the Slovakian model of conducting large-scale rapid tests successfully.

“The importance of receiving immediate results is critical in managing the outbreak, and overpowers the benefits of slow and more accurate lab tests that require several days for providing results,” a Sofia Israel press release said.

Each Sofia kit can administer 40 tests per hour and provide results within 15 minutes. In a day, 960 Israelis can be tested using one kit only. The 300 kits already distributed, working 24/7, have the capacity to test 288,000 Israelis every day.
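As a quick check of the throughput arithmetic quoted above (round-the-clock operation is the article’s own assumption):

```python
tests_per_hour_per_kit = 40
hours_per_day = 24
kits = 300

per_kit_per_day = tests_per_hour_per_kit * hours_per_day   # 960 tests per kit per day
total_per_day = per_kit_per_day * kits                     # 288,000 tests per day across 300 kits
print(per_kit_per_day, total_per_day)
```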

Background from previous post: Covid Rapid Tests Finally Out from Quebec Storage (200 Scientists Ask)

Our son is working mightily to keep his HVAC enterprise alive during the contagion.  Last week a key employee, after visiting a pharmacy where a person had tested positive, was told to go home and get tested (which took two days), and then to stay home to wait for the results (3 more days). So a work week was lost and the business hobbled during that time. That made me wonder what happened to all those rapid Covid tests that were purchased back in September.

The story came out in the Montreal Gazette January 14, 2021 Scientists publish open letter calling for Quebec to use rapid testing.  Excerpts in italics with my bolds.

“We have 1.2 million of those tests just sitting in a warehouse in Farnham,” Université de Montréal professor Roxane Borgès Da Silva says.

A group of 213 scientists, professors, health-care workers and patients published an open letter to the Legault government Thursday calling on Quebec to roll out rapid COVID-19 tests to curb outbreaks more quickly and to step up its communications strategy.

“We have 1.2 million of those tests just sitting in a warehouse in Farnham,” Roxane Borgès Da Silva, a professor with the Université de Montréal’s school of public health, said in an interview Wednesday night. “We have reached a point in the evolution of the pandemic where the health system is at the breaking point. It is time that we use every tool at our disposal.”

In the letter, the signatories from institutions including the Université de Montréal, McGill, Laval, UQAM and the Institut national de santé publique du Québec (INSPQ), note the partial closure of businesses and restrictions on gatherings have failed to stem the increase in coronavirus cases, hospitalizations and deaths.

With a month-long curfew instituted Saturday, officials must take advantage of the expected decline in cases to counter the spread through better screening, which is “all the more important in a context where possible variants of the SARS-CoV-2 virus, which are more infectious, could spread in Quebec,” the letter reads.

Using rapid tests could help to identify positive cases more quickly, particularly those among asymptomatic patients and those who have been at so-called “super-spreader” events, the letter states.

“In this regard, we believe that the simple and widely available access throughout the territory to rapid tests … allowing a result to be obtained in a few minutes could be a game-changer, especially in places of propagation.”

Quebec has been hesitant to use the tests widely because it fears their lack of sensitivity could clear people for COVID-19 when they actually have the virus. But Da Silva said the tests are close to 90-per-cent accurate when used on patients who are in an extremely contagious phase, which is crucial to stopping the most dangerous transmitters. The tests could be used at workplaces, high schools and CHSLDs, or be made available at pharmacies and doctors’ offices to allow the public to get tested quickly, Da Silva said.

As well, the signatories argue the government needs to do a better job of transmitting data and information to the public to improve citizens’ confidence in the science and improve compliance with regulations. They suggest using improved communication campaigns that appeal to the general public and using new strategies that incorporate humour and music.

More from CBC (here): 

The tests, which return a result in as little as 15 minutes, have variously been called a “game-changer” (Ontario Premier Doug Ford) and “less safe” than the gold-standard PCR lab test (Quebec Health Minister Christian Dubé, to Le Devoir).

The chair of McGill University’s bioengineering department, Dr. David Juncker, leans to the Ford end of the spectrum — provided the tests are used effectively. Right now they are not, he said, and it’s to the detriment of the broader testing effort.

“The current testing system isn’t very effective in terms of contact, trace and isolate … it’s too slow, it’s too cumbersome, it has too many delays. That’s one of the reasons we’re failing in containing the spread of the pandemic,” said Juncker, an expert on diagnostic testing.

The main area of concern for the provincial government — also voiced by federal officials — is the rapid test’s lower accuracy, or sensitivity, and the risk of false negatives.

Those fears are overblown, in Juncker’s view, because the rapid tests can still help ferret out highly infectious people.

“If we just speak about diagnostic performance … the PCR test is the most effective one,” he said. “But if we think about what we want to use this for, as a public health tool that we want to use to contain and detect infectious individuals very quickly and isolate them very fast, that’s where rapid tests can be very helpful.”

My Comment:  The need is to quickly identify people with enough viral load to infect others, and to become sick themselves.  Rapid tests excel at this when applied to people with symptoms that might or might not be Covid-19.  Officials have been obsessed with PCR tests, which are hyper-sensitive and show people as positive with too little viral load, or even from a trace of dead virus.  Those false positives generate lots of fear and clog the system with people unnecessarily.

Background from Previous Post On Non-Infectious Covid Positives

Daniel Payne writes at Just the News Growing research indicates many COVID-19 cases might not be infectious at all. Excerpts in italics with my bolds.

Elevated ‘cycle thresholds’ may be detecting virus long after it is past the point of infection.

A growing body of research suggests that a significant number of confirmed COVID-19 infections in the U.S. — perhaps as many as 9 out of every 10 — may not be infectious at all, with much of the country’s testing equipment possibly picking up mere fragments of the disease rather than full-blown infections.

Yet a burgeoning line of scientific inquiry suggests that many confirmed infections of COVID-19 may actually be just residual traces of the virus itself, a contention that — if true — may suggest both that current high levels of positive viruses are clinically insignificant and that the mitigation measures used to suppress them may be excessive.

Background from previous post: New Better and Faster Covid Test

Kevin Pham reports on a breakthrough in coronavirus testing. Excerpts in italics with my bolds.

Another new test for COVID-19 was recently authorized — and this one could be a game-changer.

The Abbott Diagnostics BinaxNOW antigen test is a new point-of-care test that reportedly costs only $5 to administer, delivers results in as little as 15 minutes, and requires no laboratory equipment to perform. That means it can be used in clinics far from commercial labs or without relying on a nearby hospital lab.

That last factor is key. There are other quick COVID-19 tests on the market, but they have all required lab equipment that can be expensive to maintain and operate, and costs can be prohibitive in places that need tests most.

This kind of test is reminiscent of rapid flu tests that are ubiquitous in clinics. They’ll give providers tremendous flexibility in testing for the disease in not just clinics, but with trained and licensed medical professionals, in schools, workplaces, camps, or any other number of places.

So what’s new about this test? Most of the current tests detect viral RNA, the genetic material of SARS-CoV-2. This is a very accurate way of detecting the virus, but it requires lab equipment to break apart the virus and amplify the amount of genetic material to high enough levels for detection.

The BinaxNOW test detects antigens — proteins unique to the virus that are usually detectable whenever there is an active infection.

Abbott says it intends to produce 50 million tests per month starting in October. That’s far more than the number tested in July, when we were breaking new testing records on a daily basis with approximately 23 million tests recorded.

There’s a more important reason to be encouraged by this test coming available.  The viral load is not amplified by the test, so a positive is actually a person needing isolation and treatment.  As explained in a previous post below,  the PCR tests used up to now clutter up the record by showing as positive people with viral loads too low to be sick or to infect others.

Background from Previous Post The Truth About CV Tests

The peoples’ instincts are right, though they have been kept in the dark about this “pandemic” that isn’t.  Responsible citizens are starting to act out their outrage from being victimized by a medical-industrial complex (to update Eisenhower’s warning decades ago).  The truth is, governments are not justified to take away inalienable rights to life, liberty and the pursuit of happiness.  There are several layers of disinformation involved in scaring the public.  This post digs into the CV tests, and why the results don’t mean what the media and officials claim.

For months now, I have been updating the progress in Canada of the CV outbreak.  A previous post later on goes into the details of extracting data on tests, persons testing positive (termed “cases” without regard for illness symptoms) and deaths after testing positive.  Currently, the contagion looks like this.

The graph shows that deaths are less than 5 a day, compared to a daily death rate of 906 in Canada from all causes.  Also significant is the positivity ratio:  the % of persons testing positive out of all persons tested each day.  That % has been fairly steady for months now:  1% positive means 99% of people are not infected. And this is despite more than doubling the rate of testing.

But what does testing positive actually mean?  Herein lies more truth that has been hidden from the public for the sake of an agenda to control free movement and activity.  Background context comes from  Could Rapid Coronavirus Testing Help Life Return To Normal?, an interview at On Point with Dr. Michael Mina.  Excerpts in italics with my bolds. H/T Kip Hansen

A sign displays a new rapid coronavirus test on the new Abbott ID Now machine at a ProHEALTH center in Brooklyn on August 27, 2020 in New York City. (Spencer Platt/Getty Images)

Dr. Michael Mina:

COVID tests can actually be put onto a piece of paper, very much like a pregnancy test. In fact, it’s almost exactly like a pregnancy test. But instead of looking for the hormones that tell if somebody is pregnant, it looks for the virus proteins that are part of SARS-CoV-2. And it would be very simple: You’d either swab the front of your nose or you’d take some saliva from under your tongue, for example, and put it onto one of these paper strips, essentially. And if you see a line, it means you’re positive. And if you see no line, it means you are negative, at least for having a high viral load that could be transmissible to other people.

An antigen is one of the proteins in the virus. And so it is unlike the PCR test, which is what most people who have received a test to date have generally received. Those types of tests look for the genome of the virus, its RNA, and you could think of RNA the same way that humans have DNA. This virus has RNA. But instead of looking for RNA like the PCR test, these antigen tests look for pieces of the protein. It would be like if I wanted a test to tell me, you know, that somebody was an individual, it would actually look for features like their eyes or their nose. And in this case, it is looking for different parts of the virus. In general, the spike protein or the nucleocapsid, these are two parts of the virus.

The reason that these antigen tests are going to be a little bit less sensitive to detect the virus molecules is because there’s no step that we call an amplification step. One of the things that makes the PCR test that looks for the virus RNA so powerful is that it can take just one molecule, which the sensor on the machine might not be able to detect readily, but then it amplifies that molecule millions and millions of times so that the sensor can see it. These antigen tests, because they’re so simple and so easy to use and just happen on a piece of paper, they don’t have that amplification step right now. And so they require a larger amount of virus in order to be able to detect it. And that’s why I like to think of these types of tests having their primary advantage to detect people with enough virus that they might be transmitting or transmissible to other people.”
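Dr. Mina’s point about the missing amplification step is easy to quantify: PCR roughly doubles the target genetic material on every cycle, so one starting molecule becomes about 2^n copies after n cycles, which is why PCR can detect a lone fragment while a paper antigen strip needs a substantial viral load. A minimal sketch, assuming idealized doubling (real PCR efficiency is somewhat below 2 per cycle):

```python
def pcr_copies(start_molecules, cycles, efficiency=2.0):
    """Copies of target genetic material after a number of PCR cycles,
    assuming ideal doubling (efficiency = 2) on every cycle."""
    return start_molecules * efficiency ** cycles

print(f"{pcr_copies(1, 30):,.0f}")   # ~1.1 billion copies from a single molecule after 30 cycles
print(f"{pcr_copies(1, 40):,.0f}")   # ~1.1 trillion copies after 40 cycles
```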

The PCR test provides a simple yes/no answer to the question of whether a patient is infected.
Source: Covid Confusion On PCR Testing: Maybe Most Of Those Positives Are Negatives.

Similar PCR tests for other viruses nearly always offer some measure of the amount of virus. But yes/no isn’t good enough, Mina added; it’s the amount of virus that should dictate the infected patient’s next steps. “It’s really irresponsible, I think, to [ignore this],” Dr. Mina said of how contagious an infected patient may be.

“We’ve been using one type of data for everything,” Mina said, “for [diagnosing patients], for public health, and for policy decision-making.”

The PCR test amplifies genetic matter from the virus in cycles; the fewer cycles required, the greater the amount of virus, or viral load, in the sample. The greater the viral load, the more likely the patient is to be contagious.

This number of amplification cycles needed to find the virus, called the cycle threshold, is never included in the results sent to doctors and coronavirus patients, although if it was, it could give them an idea of how infectious the patients are.

One solution would be to adjust the cycle threshold used now to decide that a patient is infected. Most tests set the limit at 40, a few at 37. This means that you are positive for the coronavirus if the test process required up to 40 cycles, or 37, to detect the virus.

Any test with a cycle threshold above 35 is too sensitive, Juliet Morrison, a virologist at the University of California, Riverside told the New York Times. “I’m shocked that people would think that 40 could represent a positive,” she said.

A more reasonable cutoff would be 30 to 35, she added. Dr. Mina said he would set the figure at 30, or even less.

Another solution, researchers agree, is to make even more widespread use of Rapid Diagnostic Tests (RDTs), which are much less sensitive and more likely to identify only patients with high levels of virus who are a transmission risk.

Comment:  In other words, when they analyzed the tests that also reported cycle threshold (CT), they found that 85 to 90 percent were above 30. According to Dr. Mina a CT of 37 is 100 times too sensitive (7 cycles too much, 2^7 = 128) and a CT of 40 is 1,000 times too sensitive (10 cycles too much, 2^10 = 1024). Based on their sample of tests that also reported CT, as few as 10 percent of people with positive PCR tests actually have an active COVID-19 infection. Which is a lot less than reported.
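A minimal sketch of that arithmetic, using the CT cutoffs and the 85–90 percent share quoted above (the 600-per-day figure anticipates the Canadian numbers discussed below):

```python
def fold_oversensitive(ct_used, ct_reasonable=30):
    """Each extra amplification cycle doubles sensitivity, so the excess
    sensitivity is 2 ** (extra cycles)."""
    return 2 ** (ct_used - ct_reasonable)

print(fold_oversensitive(37))    # 2**7  = 128   (~100 times too sensitive)
print(fold_oversensitive(40))    # 2**10 = 1024  (~1,000 times too sensitive)

# If 85-90% of positives only appear above CT 30, then roughly 10-15% of
# reported positives reflect an active, likely infectious viral load.
reported_positives_per_day = 600
likely_infectious = reported_positives_per_day * 0.10
print(likely_infectious)         # ~60 per day, the estimate applied to Canada below
```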

Here is a graph showing how this applies to Canada.

It is evident that increased testing has resulted in more positives, while the positivity rate is unchanged. Doubling the tests has doubled the positives, up from 300 a day to nearly 600 a day presently.  Note these are PCR results. And the discussion above suggests that the number of persons with an active infectious viral load is likely 10% of those reported positive: IOW up from 30 a day to 60 a day.  And in the graph below, the total of actual cases in Canada is likely on the order of 13,000 total from the last 7 months, an average of 62 cases a day.