February Arctic Ice Jumps Over 15 Wadhams a Month Early

For Arctic ice extent, the bar is set at 15M km2. The highest daily average over the last 18 years occurs on day 61, at 15.08M km2, before the extent begins to descend. Most years clear 15M km2, but in recent years 2017, 2018, 2019 and 2021 failed to clear that bar. Now, on February 11, 2024 (day 42), Arctic ice extent has already leaped over the bar, some 20 days early.

All years including averages are from MASIE, except for SII 2024.

The graph shows how rapidly the Arctic froze this year, reaching 14.4M km2 extent already on January 24.  The extent then waffled around that level, until suddenly a Hockey Stick shape appeared as 600k km2 of ice was added in just the last four days. That puts 2024 roughly 400k km2 above average, and well above many other years, including 2006.  SII is also lagging, about 400k km2 lower.
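
For readers who want to check such threshold crossings themselves, here is a minimal sketch that scans a daily extent series for the first day each year at or above 15M km2. The file name and column labels (year, day, extent_km2) are placeholders for however the MASIE daily extents are exported, not the actual NSIDC layout.

```python
# Find the first day of year on which Arctic extent clears 15M km2 (15 "Wadhams").
# Assumes a CSV with columns: year, day, extent_km2 (placeholder layout,
# not necessarily the actual MASIE export format).
import pandas as pd

THRESHOLD = 15_000_000  # km2

df = pd.read_csv("masie_daily_extent.csv")  # hypothetical file name

first_crossing = (
    df[df["extent_km2"] >= THRESHOLD]
      .groupby("year")["day"]
      .min()
)

for year, day in first_crossing.items():
    print(f"{year}: first cleared 15M km2 on day {day}")

# Years absent from the result never cleared the bar (e.g. 2017-2019 and 2021 per the text).
```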

The table shows the regional distribution of ice extent for 2024 (day 42) compared to the day-45 average and to 2006 on day 45.

Region  2024 (day 42)  Day 45 Average  2024 minus Ave.  2006 (day 45)  2024 minus 2006  (extents in km2)
 (0) Northern_Hemisphere 15040629 14687838 352791 14419407 621223
 (1) Beaufort_Sea 1070983 1070317 667 1069711 1273
 (2) Chukchi_Sea 966006 965761 245 966006 0
 (3) East_Siberian_Sea 1087137 1087131 6 1087103 35
 (4) Laptev_Sea 897845 897837 8 897773 71
 (5) Kara_Sea 934647 908486 26161 932726 1920
 (6) Barents_Sea 662793 582078 80715 530801 131992
 (7) Greenland_Sea 825638 622774 202864 579677 245961
 (8) Baffin_Bay_Gulf_of_St._Lawrence 1340370 1456370 -115999 1227497 112873
 (9) Canadian_Archipelago 854860 853383 1478 852715 2145
 (10) Hudson_Bay 1260903 1260579 325 1257433 3470
 (11) Central_Arctic 3233243 3208074 25168 3198987 34255
 (12) Bering_Sea 631508 700745 -69237 889518 -258010
 (13) Baltic_Sea 136308 90991 45317 79904 56404
 (14) Sea_of_Okhotsk 1101713 923694 178019 759197 342516

The Pacific basins show a moderate 69k km2 deficit in the Bering Sea offset by a large 178k km2 surplus in Okhotsk.  Baffin Bay is down 116k km2, more than offset by a Greenland Sea surplus of over 200k km2 and Barents up 81k km2.
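
To make those offsets explicit, the short calculation below nets out the regional anomalies quoted above, using the 2024 minus Average values copied from the table (all in km2).

```python
# Net out the Pacific-side and Atlantic-side anomalies quoted above,
# using the "2024 minus Ave." values from the table (km2).
pacific = {"Bering_Sea": -69_237, "Sea_of_Okhotsk": 178_019}
atlantic = {"Baffin_Bay": -115_999, "Greenland_Sea": 202_864, "Barents_Sea": 80_715}

print("Pacific net anomaly: ", sum(pacific.values()), "km2")   # ~ +108,800
print("Atlantic net anomaly:", sum(atlantic.values()), "km2")  # ~ +167,600
```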

These results fly in the face of those claiming for years that Arctic ice is in a “death spiral.”  More sober and clear-eyed observers have called out the alarmists for their exaggerations.  A recent example comes from Allan Alsup Jensen at Nordic Institute of Product Sustainability, Environmental Chemistry and Toxicology, Denmark.  His December 2023 paper is Time Trend of the Arctic Sea Ice Extent.  Excerpts in italics with my bolds and added images.

Since 2007 no significant decline has been observed

Abstract

The NSIDC website, IPCC’s reports and some scientific papers have announced that the Arctic Sea ice extent, when it is lowest in September, has in recent years declined dramatically, and that in a few decades the sea ice is supposed to disappear completely in summer. In that way new and shorter shipping routes will open up north of the continents.

The facts are that the Arctic Sea ice extent, measured by satellites since 1978, shows annual variations, and it declined considerably from 1997 to 2007. However, before that time period, from 1978 to 1996, the downward trend was minimal, and in the last 17 years, from 2007 to 2023, the downward trend has also been about zero. Therefore, there is no indication that we should expect the Arctic Sea summer ice to disappear completely, as predicted, in one or two decades.

Regarding the extent of the summer (February) sea ice in the Antarctic, the downward trend during the years 1979-2021 was very small, but in 2022 and 2023 a considerable decline was observed, and a decline was also clearly observed for the whole period of 2007-2023. That was in contradiction to what happened in the Arctic. The pattern of the annual levels was not the same for the Arctic and Antarctic, indicating different drivers in the North and the South.

Figure 4: The minimum extent of the Antarctic sea ice in February, 1979-2023 (data from NSIDC.org)

These data show that there is no apparent correlation between the variable extents of the Arctic and Antarctic sea ice and the gradually increasing CO2 concentrations in the atmosphere, as proposed by NSIDC, IPCC and others, even for these areas of cold climate.

Postscript Feb. 14

Some seek to deny the current plateau in Arctic Sea Ice by saying that the extent measure captures only the surface, while volume would be a truer metric.  That is true in theory, but in practice obtaining accurate and consistent data on sea ice thickness is a challenge yet to be met.  As you can imagine, detecting a depth dimension from satellites is fraught with errors, especially for drift ice that is not land-anchored, moving around and sometimes piling up from winds.  The scientific effort to measure volume has a short history and several uncertainties to overcome before it can be trusted.

Unfortunately for those wanting an ice-free Arctic (well, no more than 1 Wadham, they say), the volume record so far shows the same plateau:

“Satellite derived sea ice thickness (CryoSat 2, AWI algorithm v2.6) shows an anomaly thickness pattern very similar to that from PIOMAS, but CS2 shows negative anomalies propagating north of the Canadian Archipelago into the central Arctic while PIOMAS has neutral conditions there. A positive thickness anomaly around Wrangel Island is spatially more extensive in CS2. January 2024 adds another month to the record of CS2 data which now spans 13 years. Neither CS2 nor PIOMAS show any discernible trend over that time period underlining the importance of internal variability at decadal timescales.”  Source: Polar Science Center
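
A minimal sketch of the kind of trend check described in that quote: fit an ordinary least-squares line to a monthly anomaly series spanning the 13-year CryoSat-2 era and ask whether the slope is distinguishable from zero. The input file is a placeholder; neither the AWI nor the PIOMAS data are reproduced here.

```python
# Fit a linear trend to a monthly anomaly series and test whether the slope
# is distinguishable from zero (illustrative; supply your own CS2 or PIOMAS data).
import numpy as np
from scipy import stats

# Placeholder: monthly thickness (or volume) anomalies, Jan 2011 - Jan 2024.
anomalies = np.loadtxt("thickness_anomalies.txt")   # hypothetical file
months = np.arange(len(anomalies))

fit = stats.linregress(months, anomalies)
decadal_trend = fit.slope * 120  # slope per month times 120 months

print(f"Trend: {decadal_trend:+.3f} per decade, p-value {fit.pvalue:.3f}")
# A large p-value, with a trend small next to the month-to-month swings,
# is what "no discernible trend" looks like in this framing.
```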

 

UAH January 2024: Ocean Warm, Land Cooling

The post below updates the UAH record of air temperatures over land and ocean. Each month and year exposes again the growing disconnect between the real world and the Zero Carbon zealots.  It is as though the anti-hydrocarbon bandwagon hopes to drown out the data contradicting their justification for the Great Energy Transition.  Yes, there has been warming from an El Nino buildup coincidental with North Atlantic warming, but there is no basis to blame it on CO2.

As an overview, consider how recent rapid cooling completely overcame the warming from the last three El Ninos (1998, 2010 and 2016).  The UAH record shows that the effects of the last one were gone as of April 2021, again in November 2021, and in February and June 2022.  At year end 2022, and continuing into 2023, the global temperature anomaly matched or went lower than the average since 1995, an ENSO-neutral year. (The UAH baseline is now 1991-2020.)  Now we have an unusual El Nino warming spike of uncertain cause, but unrelated to steadily rising CO2.

For reference I added an overlay of CO2 annual concentrations as measured at Mauna Loa.  While temperatures fluctuated up and down, ending flat, CO2 went up steadily by ~60 ppm, a 15% increase.

Furthermore, going back to previous warmings prior to the satellite record shows that the entire rise of 0.8C since 1947 is due to oceanic activity, not human activity.

[Animation: GMT warming events since 1947]

The animation is an update of a previous analysis from Dr. Murry Salby.  These graphs use Hadcrut4 and include the 2016 El Nino warming event.  The exhibit shows that since 1947 GMT warmed by 0.8C, from 13.9C to 14.7C, as estimated by Hadcrut4.  This resulted from three natural warming events involving ocean cycles. The most recent rise, 2013-16, lifted temperatures by 0.2C.  Previously the 1997-98 El Nino produced a plateau increase of 0.4C.  Before that, a rise from 1977-81 added 0.2C to start the warming since 1947.

Importantly, the theory of human-caused global warming asserts that increasing CO2 in the atmosphere changes the baseline and causes systemic warming in our climate.  On the contrary, all of the warming since 1947 was episodic, coming from three brief events associated with oceanic cycles. And now in 2023 we are seeing an amazing episode with a temperature spike driven by ocean air warming in all regions, with some cooling the last two months. 

Update August 3, 2021

Chris Schoeneveld has produced a similar graph to the animation above, with a temperature series combining HadCRUT4 and UAH6. H/T WUWT


 


See Also Worst Threat: Greenhouse Gas or Quiet Sun?

January 2024 El Nino Spikes Higher While Land Cools 


With apologies to Paul Revere, this post is on the lookout for cooler weather with an eye on both the Land and the Sea.  While you will hear a lot about 2020-21 temperatures matching 2016 as the highest ever, that spin ignores how fast the cooling set in.  The UAH data analyzed below shows that warming from the last El Nino had fully dissipated with chilly temperatures in all regions. After a warming blip in 2022, land and ocean temps dropped again with 2023 starting below the mean since 1995.  Spring and Summer 2023 saw a series of warmings, continuing into October, but with cooling since. 

UAH has updated their tlt (lower troposphere temperatures) dataset for January 2024. Posts on their reading of ocean air temps this month preceded updated records from HadSST4.  I last posted on SSTs using HadSST4 in Ocean Warming Spike Recedes December 2023.  This month also has a separate graph of land air temps, because the comparisons and contrasts are interesting as we contemplate possible cooling in coming months and years.

Sometimes air temps over land diverge from ocean air changes.  November 2023 was notable for a dichotomy between Ocean and Land air temperatures in the UAH dataset. Remarkably, a new high for Ocean air temps appeared with warming in all regions, while Land air temps dropped with cooling in all regions.  As a result the combined Global Ocean and Land anomaly remained little changed. Now again in January 2024, ocean temps went higher, driven by El Nino and NH, while all land regions cooled except for the Tropics.

Note:  UAH has shifted their baseline from 1981-2010 to 1991-2020 beginning with January 2021.  In the charts below, the trends and fluctuations remain the same but the anomaly values change with the baseline reference shift.

Presently sea surface temperatures (SST) are the best available indicator of heat content gained or lost from earth’s climate system.  Enthalpy is the thermodynamic term for total heat content in a system, and humidity differences in air parcels affect enthalpy.  Measuring water temperature directly avoids distorted impressions from air measurements.  In addition, ocean covers 71% of the planet surface and thus dominates surface temperature estimates.  Eventually we will likely have reliable means of recording water temperatures at depth.

Recently, Dr. Ole Humlum reported from his research that air temperatures lag 2-3 months behind changes in SST.  Thus cooling oceans portend cooling land air temperatures to follow.  He also observed that changes in CO2 atmospheric concentrations lag behind SST by 11-12 months.  This latter point is addressed in a previous post Who to Blame for Rising CO2?
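
As an illustration of how such a lag can be estimated (this is not Dr. Humlum's actual procedure), the sketch below slides one monthly series against another and reports the lag with the highest correlation. The two input files are placeholders for SST anomalies and air-temperature anomalies.

```python
# Estimate the lag (in months) at which one anomaly series best tracks another.
# Illustrative only; the input files below are placeholders for real monthly data.
import numpy as np

def best_lag(leader, follower, max_lag=24):
    """Return (lag, correlation) maximizing corr(leader[t], follower[t + lag])."""
    results = []
    for lag in range(max_lag + 1):
        a = leader[: len(leader) - lag] if lag else leader
        b = follower[lag:]
        n = min(len(a), len(b))
        results.append((lag, np.corrcoef(a[:n], b[:n])[0, 1]))
    return max(results, key=lambda r: r[1])

sst = np.loadtxt("sst_anomalies.txt")   # hypothetical monthly SST anomalies
air = np.loadtxt("air_anomalies.txt")   # hypothetical monthly air-temp anomalies

lag, corr = best_lag(sst, air)
print(f"Air temps track SST best at a lag of {lag} months (r = {corr:.2f})")
```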

After a change in priorities, updates are now exclusive to HadSST4.  For comparison we can also look at lower troposphere temperatures (TLT) from UAHv6 which are now posted for January.  The temperature record is derived from microwave sounding units (MSU) on board satellites like the one pictured above. Recently there was a change in UAH processing of satellite drift corrections, including dropping one platform which can no longer be corrected. The graphs below are taken from the revised and current dataset.

The UAH dataset includes temperature results for air above the oceans, and thus should be most comparable to the SSTs. There is the additional feature that ocean air temps avoid Urban Heat Islands (UHI).  The graph below shows monthly anomalies for ocean air temps since January 2015.

Note that 2020 was warmed mainly by a spike in February in all regions, and secondarily by an October spike in NH alone. In 2021, SH and the Tropics both pulled the Global anomaly down to a new low in April. Then SH and Tropics upward spikes, along with NH warming, brought Global temps to a peak in October.  That warmth was gone as November 2021 ocean temps plummeted everywhere. After an upward bump in 01/2022, temps reversed and plunged downward in June.  After an upward spike in July, ocean air everywhere cooled in August and again in September.

After sharp cooling everywhere in January 2023, all regions were in negative territory. Note the Tropics matched the lowest value, but have since spiked sharply upward by +1.7C, with the largest increases from April to July, continuing on to a new high in January 2024. NH also spiked upward to a new high, while the Global ocean rise was more modest due to slight SH cooling.

Land Air Temperatures Tracking in Seesaw Pattern

We sometimes overlook that in climate temperature records, while the oceans are measured directly with SSTs, land temps are measured only indirectly.  The land temperature records at surface stations sample air temps at 2 meters above ground.  UAH gives tlt anomalies for air over land separately from ocean air temps.  The graph updated for January is below.

Here we have fresh evidence of the greater volatility of Land temperatures, along with extraordinary departures by SH land.  Land temps are dominated by NH, with a 2021 spike in January, then dropping before rising in the summer to peak in October 2021. As with the ocean air temps, all that was erased in November with a sharp cooling everywhere.  After a summer 2022 NH spike, land temps dropped everywhere, and in January, further cooling in SH and Tropics was offset by an uptick in NH.

Remarkably, in 2023 the SH land air anomaly shot up 2.1C, from -0.6C in January to +1.5C in September, then dropped sharply, now down to 0.6C in January 2024.  NH land temps have also dropped 0.3C, down to 1.0C, resulting in Global land temps cooling to 0.9C, matching the peak in Feb. 2016. Land in the Tropics was unchanged in January, down slightly from its October peak.

The Bigger Picture UAH Global Since 1980

The chart shows monthly Global anomalies starting 01/1980 to present.  The average monthly anomaly is -0.05C for this period of more than four decades.  The graph shows the 1998 El Nino, after which the mean resumed, and again after the smaller 2010 event. The 2016 El Nino matched the 1998 peak, and in addition the NH after-effects lasted longer, followed by the NH warming of 2019-20.  An upward bump in 2021 was reversed, with temps having returned close to the mean as of 2/2022.  March and April brought warmer Global temps, later reversed.
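
For anyone wanting to reproduce that mean-anomaly figure, the sketch below pulls the UAH v6 lower-troposphere text file and averages the Globe column from January 1980 onward. The URL shown is the commonly cited location at the time of writing and may change; the parsing is deliberately defensive because the file ends with trailer rows that repeat the header.

```python
# Compute the mean global TLT anomaly since Jan 1980 from the UAH v6 text file.
# The URL below is the commonly cited location at the time of writing; adjust if it moves.
import urllib.request

URL = "https://www.nsstc.uah.edu/data/msu/v6.0/tlt/uahncdc_lt_6.0.txt"

with urllib.request.urlopen(URL) as resp:
    lines = resp.read().decode().splitlines()

anoms = []
for line in lines:
    parts = line.split()
    # Keep only data rows: "year month globe ..." with a plausible 4-digit year.
    if len(parts) >= 3 and parts[0].isdigit() and 1978 <= int(parts[0]) <= 2100:
        year, globe = int(parts[0]), float(parts[2])
        if year >= 1980:
            anoms.append(globe)

print(f"Months: {len(anoms)}, mean global anomaly since 1980: {sum(anoms)/len(anoms):+.2f} C")
```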

With the sharp drops in Nov., Dec. and January 2023 temps, there was no increase over 1980. Now in 2023 the buildup to the October/November peak exceeded the sharp April peak of the El Nino 1998 event. It also surpassed the February peak in 2016.  December and January are down slightly, but where it goes from here, up or down further, remains to be seen.

The graph is reminiscent of another chart showing the abrupt ejection of humid air from the Hunga Tonga eruption.

TLTs include mixing above the oceans and probably some influence from nearby, more volatile land temps.  Clearly NH and Global land temps have been dropping in a seesaw pattern, nearly 1C lower than the 2016 peak.  Since the ocean has 1000 times the heat capacity of the atmosphere, that cooling is a significant driving force.  TLT measures started the recent cooling later than SSTs from HadSST3, but are now showing the same pattern. Despite the three El Ninos, their warming did not persist prior to 2023, and without them it would probably have cooled since 1995.  Of course, the future has not yet been written.

 

Covid19 mRNA Vaccines Fiasco 2024 Update

A thorough examination of the trials and tribulations of the mRNA shots proclaimed and sometimes forced upon people comes from a dream team of experts in the field.  According to Liberty Counsel, the authors of this research paper are highly qualified experts, including “biologist and nutritional epidemiologist M. Nathaniel Mead; research scientist Stephanie Seneff, Ph.D.; biostatistician and epidemiologist Russ Wolfinger, Ph.D.; immunologist and biochemist Dr. Jessica Rose; biostatistician and epidemiologist Kris Denhaerynck, Ph.D.; Vaccine Safety Research Foundation Executive Director Steve Kirsch; and cardiologist, internist, and epidemiologist Dr. Peter McCullough.”

The peer reviewed paper is at Cureus COVID-19 mRNA Vaccines: Lessons Learned from the Registrational Trials and Global Vaccination Campaign.  Excerpts in italics with my bolds and added images.

Abstract

Our understanding of COVID-19 vaccinations and their impact on health and mortality has evolved substantially since the first vaccine rollouts. Published reports from the original randomized phase 3 trials concluded that the COVID-19 mRNA vaccines could greatly reduce COVID-19 symptoms. In the interim, problems with the methods, execution, and reporting of these pivotal trials have emerged. Re-analysis of the Pfizer trial data identified statistically significant increases in serious adverse events (SAEs) in the vaccine group. Numerous SAEs were identified following the Emergency Use Authorization (EUA), including death, cancer, cardiac events, and various autoimmune, hematological, reproductive, and neurological disorders.

Furthermore, these products never underwent adequate safety and toxicological testing in accordance with previously established scientific standards. Among the other major topics addressed in this narrative review are:

♦   the published analyses of serious harms to humans;
♦   quality control issues and process-related impurities;
♦   mechanisms underlying adverse events (AEs);
♦   the immunologic basis for vaccine inefficacy; and
♦   concerning mortality trends based on the registrational trial data.

The risk-benefit imbalance substantiated by the evidence to date contraindicates further booster injections and suggests that, at a minimum, the mRNA injections should be removed from the childhood immunization program until proper safety and toxicological studies are conducted. Federal agency approval of the COVID-19 mRNA vaccines on a blanket-coverage population-wide basis had no support from an honest assessment of all relevant registrational data and commensurate consideration of risks versus benefits.

Given the extensive, well-documented SAEs and unacceptably high harm-to-reward ratio, we urge governments to endorse a global moratorium on the modified mRNA products until all relevant questions pertaining to causality, residual DNA, and aberrant protein production are answered.

Background

Political and financial incentives may have played a key role in undermining the scientific evaluation process leading up to the EUA. Lalani and colleagues documented the major investments made by the US government well before authorization [18]. Even prior to the pandemic, the US National Institutes of Health invested $116 million (35%) in mRNA vaccine technology, the Biomedical Advanced Research and Development Authority (BARDA) had invested $148 million (44%), while the Department of Defense (DOD) contributed $72 million (21%) to mRNA vaccine development. BARDA and the DOD also collaborated closely in the co-development of Moderna’s mRNA vaccine, dedicating over $18 billion, which included guaranteed vaccine purchases [18]. This entailed pre-purchasing hundreds of millions of mRNA vaccine doses, alongside direct financial support for the clinical trials and the expansion of Moderna’s manufacturing capabilities. The public funding provided for developing these products through Operation Warp Speed surpassed investments in any prior public initiative [19]. Once the pandemic began, $29.2 billion (92% of which came from US public funds) was dedicated to the purchase of COVID-19 mRNA products; another $2.2 billion (7%) was channelled into supporting clinical trials, and $108 million (less than 1%) was allocated for manufacturing and basic research [18]. This profuse spending of taxpayer dollars continued throughout the pandemic: BARDA spent another $40 billion in 2021 alone [20].

Using US taxpayer money to purchase so many doses in advance would suggest that, prior to the EUA process, US federal agencies were strongly biased toward successful outcomes for the registrational trials. Moreover, it is reasonable to surmise that such extensive vested interests could have influenced the decision to prematurely halt the registrational trials. Unblinding essentially nullified the “placebo-controlled” element of the trials, eliminating the control group and thus undermining the ability to objectively assess the mRNA vaccines’ safety profile and potential serious AEs (SAEs). Thus, while the accelerated authorization showcased the government’s dedication to provide these novel products, it also raised concerns among many experts regarding risk-benefit issues and effectively eliminated the opportunity to learn about the potential long-range harms of the mRNA inoculations. The political pressures to rapidly deliver a solution may have compromised the thoroughness and integrity of the scientific evaluation process while downplaying and obfuscating scientific concerns about the potential risks associated with mRNA technology.

Concerns about inadequate safety testing extend beyond the usual regulatory approval standards and practices. Although we employ the terms “vaccine” and “vaccination” throughout this paper, the COVID-19 mRNA products are also accurately termed gene therapy products (GTPs) because, in essence, this was a case of GTP technology being applied to vaccination [21]. European regulations mandate the inclusion of an antigen in vaccines, but these immunogenic proteins are not intrinsic to the mRNA vaccines [22]. The GTP vaccine platform has been studied for over 30 years as an experimental cancer treatment, with the terms gene therapy and mRNA vaccination often used interchangeably [23]. This is due to the mRNA products’ specific mode of action: synthetic mRNA strands, encapsulated within a protective lipid nanoparticle (LNP) vehicle, are translated within the cells into a specific protein that subsequently stimulates the immune system against a specific disease. Another accurate label would be prodrugs because these products stimulate the recipient’s body to manufacture the target protein [24]. As there were no specific regulations at the time of the rapid approval process, regulatory agencies quickly “adapted” the products, generalized the definition of “vaccine” to accommodate them, and then authorized them for EUA for the first time ever against a viral disease. However, the rationale for regulating these products as vaccines and excluding them from regulatory oversight as GTPs lacks both scientific and ethical justification [21]. (Note: Throughout this review, the terms vaccines and vaccinations will be used interchangeably with injections, inoculations, biologicals, or simply, products.)

Due to the GTPs’ reclassification as vaccines, none of their components have been thoroughly evaluated for safety. The main concern, in a nutshell, is that the COVID-19 mRNA products may transform body cells into viral protein factories that have no off-switch (i.e., no built-in mechanism to stop or regulate such proliferation), with the spike protein (S-protein) being generated for prolonged periods, causing chronic, systemic inflammation and immune dysfunction [25,26]. This S-protein is the common denominator between the coronavirus and the vaccine, which helps to explain the frequent overlap in AEs generated by both the infection and the inoculation [25]. The vaccine-induced S-protein is more immunogenic than its viral counterpart; and yet, the increased antibody production is also associated with more severe immunopathology and other adverse effects [27]. The Pfizer and Moderna mRNA products contain mRNA with two modified codons that result in a version of the S-protein that is stabilized in its prefusion state [28]. This nucleoside-modified messenger RNA technology is intended to extend the synthetic mRNA’s persistence in the body. When the S-protein enters the bloodstream and disseminates systemically, it may become a contributing factor to diverse AEs in susceptible individuals [25].

Revisiting the registrational trials

Although randomized controlled trials (RCTs) are viewed as the gold standard for testing the safety and efficacy of medical products (due to minimizing bias), trials of limited scope can readily obscure the true safety and efficacy issues with respect to different segments of the population. In this case, the trials excluded key sub-groups, notably children, pregnant women, frail elderly persons, and immuno-compromised individuals, as well as those with cancer, autoimmune disease, and other chronic inflammatory conditions [45]. Whereas the founding trials did not recruit individuals with comorbidities, vaccine recipients in the rollouts showed the actual presence of these underlying conditions. Rather than assess these well-known safety and comorbid risk concerns, the focus was narrowly placed on the potential for inflammatory lung injury as had been seen in COVID-19 patients and, many years earlier, in immunized animal models infected with SARS-CoV [46]. We are now beginning to recognize the folly of this narrow safety focus, as millions of severe and life-threatening events associated with the COVID-19 vaccines continue to be documented in the medical literature [47-51].

What did the pivotal trials reveal about overall (all-cause) mortality? After carefully analyzing the ACM for the Pfizer and Moderna trials, Benn and colleagues found 61 deaths total (31 in vaccine, 30 in placebo) and a mortality RR of 1.03 (0.63-1.71), comparing the vaccinated to placebo [52]. These findings can be interpreted as “no significant difference” or no gold-standard evidence showing these mRNA vaccines reduce mortality. The lack of significant differences in deaths between the study arms is noteworthy. The true mortality impact remains unknown in this context, and this fact alone is relevant, as it would be preferable to take a vaccine with good trial evidence of reduced mortality than to take a vaccine where trial evidence does not show convincing evidence of improved survival [53]. Similarly, a subsequent analysis of the Pfizer trial data concluded that mortality rates were comparable between vaccinated and placebo groups during the initial 20-week period of the randomized trial [54]. The fact that the mRNA vaccinations did not lead to a reduction in overall mortality implies that, if the injections were indeed averting deaths specifically attributable to COVID-19, any such reduction might be offset by an increase in mortality stemming from other causes, such as SAEs.

For the Pfizer and Moderna registrational trials, Benn et al. also reported a non-significant 45% increase in cardiovascular deaths (RR=1.45; 95%CI 0.67-3.13) in the vaccine arms of the trials [52]. This outcome was consistent with numerous reports of COVID-19 vaccine-related cardiovascular pathology among both young and old segments of the population [57-63]. None of the mortality estimates from the trials are statistically significant. Nevertheless, the upward trends for both ACM and cardiovascular deaths are concerning. If the Pfizer trial had not been prematurely discontinued, and assuming death rates remain the same in both arms as observed in the first six months, the ACM difference would reach the standard threshold for statistical significance (p < 0.05) at approximately 2.8 years (34 months). The p-value is 0.065 at 2.5 years and 0.053 at 2.75 years (see Appendix 1). These calculations were independently confirmed by Masterjohn [64].
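
The logic of that extrapolation can be sketched as follows. This is an illustrative reconstruction, not the authors' or Masterjohn's actual calculation: the per-arm death counts and arm size are placeholders, deaths are assumed to accumulate linearly at the six-month rates, and a two-sided Fisher exact test is applied at each follow-up time, so the printed p-values will not match the figures quoted above.

```python
# Illustrative extrapolation: scale six-month death counts linearly with follow-up
# time and ask when the arm difference would cross p < 0.05 (two-sided Fisher exact).
# The counts and arm size below are placeholders, NOT the actual trial figures.
from scipy.stats import fisher_exact

n_per_arm = 22_000        # placeholder: approximate participants per arm
deaths_vax_6mo = 21       # placeholder: deaths in the vaccine arm at six months
deaths_pbo_6mo = 17       # placeholder: deaths in the placebo arm at six months

for years in (1.0, 1.5, 2.0, 2.5, 2.75, 3.0):
    scale = years / 0.5                       # multiples of the six-month window
    dv = round(deaths_vax_6mo * scale)
    dp = round(deaths_pbo_6mo * scale)
    table = [[dv, n_per_arm - dv], [dp, n_per_arm - dp]]
    _, p = fisher_exact(table)
    print(f"{years:4.2f} years: {dv} vs {dp} deaths, two-sided p = {p:.3f}")
```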

Absolute risk and the “number needed to vaccinate (NNV)”

It is imperative to carefully weigh all potential risks associated with the COVID-19 mRNA products. Should substantial harms be linked to their use, the perceived “reward” conveyed by the NNV would necessitate a re-appraisal. For example, assuming an NNV of 119 and an IFR of 0.23% (both conservative estimates), approximately 52,000 vaccinations would be needed to prevent one COVID-19-related death. Thus, for the BNT162b2 injection, a generous estimate would be two lives saved from COVID-19 for every 100,000 courses of the biological. Given the evidence of trial misconduct and data integrity problems (see next section), we conjecture that this estimate is an “upper bound”, and therefore the true benefit is likely to be much lower. Regarding potential harms, assuming 30% false-positive reports and a moderate under-reporting factor of 21, we calculate a risk of 27 deaths per 100,000 doses of BNT162b2. Thus, applying these reasonable, conservative assumptions, the estimated harms of the COVID-19 mRNA vaccines greatly outweigh the rewards: for every life saved, there were nearly 14 times more deaths caused by the modified mRNA injections (for details, see Appendix 2).
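
The arithmetic behind those figures can be checked directly; the snippet below simply restates the numbers given in the passage.

```python
# Check the risk-benefit arithmetic quoted above (numbers taken from the passage).
nnv = 119            # number needed to vaccinate to prevent one case (conservative)
ifr = 0.0023         # infection fatality rate, 0.23%

vaccinations_per_death_prevented = nnv / ifr
lives_saved_per_100k = 100_000 / vaccinations_per_death_prevented
deaths_per_100k_doses = 27   # estimated harm figure stated in the text

print(f"Vaccinations per COVID-19 death prevented: {vaccinations_per_death_prevented:,.0f}")  # ~52,000
print(f"Lives saved per 100,000 courses: {lives_saved_per_100k:.1f}")                         # ~1.9
print(f"Harm-to-benefit ratio: {deaths_per_100k_doses / lives_saved_per_100k:.1f}")           # ~14
```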

Underreporting of harms and data integrity issues

When Pfizer’s Six-Month Interim Report of Adverse Events (C4591001) revealed a total death count of 38 [35], the number seemed unexpectedly low for a clinical trial involving 44,060 participants amidst a pandemic. To investigate, Michels and colleagues estimated the anticipated deaths based on US mortality rates in 2020, presuming comparability across participating countries [54]. With 132 trial sites in the US and 80% of subjects, they estimated that 222 deaths should have occurred between July 27, 2020, and March 13, 2021, making the observed 38 deaths only 17% of the projected number. Most of the trial sites had fewer deaths than anticipated, possibly attributed to a considerable percentage of “Lost to Follow-up” subjects (4.2% of randomized subjects), including 395 unique subjects within the study period. While some sites recorded negligible losses, others exhibited substantial figures, up to 5% of the site’s subjects [54]. These numbers likely contributed to the seemingly low overall death count and should have prompted increased efforts to locate these individuals. Losing track of nearly 400 study participants in the follow-up observation period could have substantially compromised the validity and generalizability of the results. The missing data can produce biased estimates, leading to invalid conclusions. This could result in a distortion of vaccine efficacy and underestimation of SAEs (including deaths), thus misrepresenting the safety profile of the mRNA products. In short, Pfizer’s failure to minimize participant attrition seriously undermined the accuracy and reliability of the six-month study’s conclusions.

These concerns are further compounded by revelations concerning substandard research practices and inadequate data management in the pivotal trials. A whistleblower report by a former employee of the contract research organization responsible for enrolling patients in Pfizer’s pivotal trial raises significant questions regarding data integrity and the safety of trial participants [85]. Among the trial conduct issues documented were failure to report protocol deviations, improper storage of vaccines, mislabeling of laboratory specimens, and lack of timely follow-up for patients experiencing AEs, possibly leading to underreporting. In terms of regulatory oversight, the FDA inspected only nine out of the 153 study sites involved in the Pfizer trial [86].

Finally, an unblinding of participants occurred early in the trial, potentially on a wide scale across different study sites. Participants were not presented with clear information regarding potential AEs in both trial protocols and consent forms [87]. Some parts of the consent form were misleading and merely intended to elicit participation that might not otherwise have occurred if the volunteers had been made aware that what was promised in theory or “on paper” was unlikely to happen in reality [87]. As a result, participants were not being granted truly informed consent; the potential injuries and AEs most likely to be caused by the vaccinations were never openly stated.

This lack of informed consent carried over into the real-world setting following the EUA. For example, not publicly disclosing the Pfizer trial’s exclusion of pregnant women is arguably among the CDC’s most egregious oversights when asserting the safety of COVID-19 vaccine administration during pregnancy [1]. The Nuremberg Code established patients’ rights to voluntary informed consent in the aftermath of World War II [88]. US courts consistently support informed consent as a fundamental right for patients’ autonomy [89]. Informed consent procedures must provide clear distinctions between risks that are frequently observed, risks that occur rarely, and the more obvious risk of lack of effectiveness or waning immunity, which is separate from the risk of SAEs. Whether in a clinical trial or free-living real-world setting, informed consent is essential to providing a clear understanding of the potential risks associated with receiving a genetic vaccine. Throughout the pandemic, healthcare workers were duty-bound to provide clear risk-benefit information to patients. In practice, however, informed consent was non-existent, as information sheets were blank [90], and vaccinees were never informed of potential risks beforehand.

Shifting narratives, illusions of protection

The best evidence for the failure of the COVID-19 mRNA vaccine’s ability to confer protection against COVID-19 comes from two large cohort studies of employees within the Cleveland Clinic Health System (CCHS) after the bivalent mRNA boosters became available [99,100]. In the first study (n=51,017), COVID-19 occurred in 4,424 (8.7%) during the 26-week observation period [99]. In terms of preventing infections by the three prevailing Omicron subvariants, the vaccine effectiveness was 29%, 20%, and a non-significant 4%, respectively [99]. No protection was provided when the XBB lineages were dominant. Notably, the risk of “breakthrough” infection was significantly higher among those who received the earlier vaccine, and a higher frequency of vaccinations resulted in a greater risk of COVID-19 [100]. In a second CCHS cohort study (n= 48,344), adults who were “not up-to-date” by the CDC definition had a 23% lower incidence of COVID-19 than those “up-to-date” with their vaccinations [100]. These findings are further reinforced by multiple real-world studies showing rapidly waning protection against Omicron infection after the boosters [101]. The vaccine effectiveness against laboratory-confirmed Omicron infection and symptomatic disease rapidly wanes within three months of the primary vaccination cycle and booster dose [97].

In a recent study of nearly five million adults, those who had a SARS-CoV-2 infection within 21 days post injection showed an eight-fold increased risk of ischemic stroke (OR=8.00, 95%CI 4.18-15.31) and a five-fold increased risk of hemorrhagic stroke when compared to vaccinees without concurrent infection (OR=5.23, 95%CI 1.11-24.64) [121]. The risk was highest for those receiving the mRNA-1273 injections. Thus, SARS-CoV-2 infection close to the time of vaccination produced a strong association with early incidence of ischemic and hemorrhagic strokes [121]. Again, with a hybrid immunity approach, the potential harms may greatly outweigh the rewards.

Natural immunity carries none of these risks and is more than sufficient against the mild virulence of Omicron subvariants. Much evidence now indicates that natural immunity confers robust, durable, and high-level protection against COVID-19 severe illness [122-126]. A large United Kingdom (UK) study of over 30,000 healthcare workers, having a prior history of SARS-CoV-2 infection, showed an 84% reduced risk of reinfection, with a median protective period of seven months [125]. In a large observational study in Israel, previously infected individuals who remained unvaccinated were 6-13 times less likely to contract the virus compared to those who were vaccinated [122]. Among 32,000 individuals within the same healthcare system, vaccinated individuals had a 27-fold higher risk of developing symptomatic COVID-19 and an eight-fold higher risk of hospitalization compared to their unvaccinated counterparts [122].

After recovering from COVID-19, the body harbors long-lived memory immune cells, indicating an enduring capacity to respond to new infections, potentially lasting many years [127]. Mounting evidence suggests that the training of antibodies and induction of T-cell memory resulting from repeated natural infection with Omicron can augment the mitigation of future infections [128,129]. In a recent cohort study, children who had experienced prior infection showed long-lasting protection against reinfection with SARS-CoV-2 for a minimum of 18 months [130]. Such children between the ages of five and 11 years demonstrated no decline in protection during the entire study, while those aged 12-18 experienced a mild yet measurable decline in protection over time [130]. For these younger generations in particular, natural immunity is more than sufficient and of course vastly safer than the mRNA inoculations.

Analyses of serious harms to humans

For both the Pfizer and Moderna trials combined, there were about 125 SAEs per 100,000 vaccine recipients, which translates into one SAE for every 800 vaccinees [50]. Because the trials avoided the most frail as participants, one would expect to see even higher proportions of SAEs in the population-wide rollouts. Remarkably, the Pfizer trial exhibited a 36% higher risk of SAEs in the vaccine group compared to the placebo, with a risk difference of 18.0 (95%CI 1.2-34.9) per 10,000 vaccinated; risk ratio 1.36 (95%CI 1.02-1.83). These findings stand in sharp contrast with the FDA’s initial claim that SAEs reported by the two pivotal trials were “balanced between treatment groups” [15,50]. The discrepancy may be partly explained by the fact that the FDA was focusing only on individual participant data, and yet many of those individuals were experiencing multiple SAEs. Instead of analyzing individuals, Fraiman et al. focused on total SAEs to take into account the multiple, concurrent events [50]. When the SAEs were viewed collectively, the risks in the vaccine group were substantially elevated beyond those previously determined by the FDA.
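
For orientation, the stated rates convert into "one per N" form as follows; the excess-SAE-per-vaccinated figure is my restatement of the 18.0 per 10,000 risk difference, not a number taken from the paper.

```python
# Convert the SAE figures quoted above into "one per N" form.
sae_per_100k = 125                      # combined Pfizer + Moderna trials
risk_difference_per_10k = 18.0          # excess SAEs in the Pfizer vaccine arm

print(f"One SAE per {100_000 / sae_per_100k:.0f} vaccinees")           # 1 per 800
print(f"One EXCESS SAE per {10_000 / risk_difference_per_10k:.0f} vaccinated "
      f"(restatement of the 18.0 per 10,000 risk difference)")          # ~1 per 556
```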

Analyses of two large drug safety reporting systems in the US and Europe revealed over 7.8 million AEs reported by approximately 1.6 million individuals following COVID-19 vaccination [47]. When compared to individuals aged 18-64 years, the older age groups exhibited a higher frequency of death, hospitalizations, and life-threatening reactions, with RR estimates ranging from 1.49 (99%CI 1.44-1.55) to 8.61 (99%CI 8.02-9.23). Signals were identified for myocardial infarction, pulmonary embolism, cardio-respiratory arrest, cerebral infarction, and cerebral hemorrhage associated with both mRNA vaccines. These signals, along with ischemic strokes, were confirmed by a large disproportionality analysis [48]. In an independent risk-benefit analysis, BNT162b2 produced 25 times more SAEs than the number of severe COVID-19 cases prevented [51].

Finally, autopsy studies have provided additional evidence of serious harms. In a comprehensive systematic review with full independent adjudication, 74% of autopsy findings (240 out of 325 cases), were judged to have been caused by the COVID-19 mRNA products [139]. The mean time from injection to death was 14.3 days, and the vast majority of deaths had the cardiovascular system as the single fatal organ system injury to the body.

These findings help explain the wide range of well-documented COVID-19 vaccine-induced toxicities that impact the nervous, gastrointestinal, hepatic, renal, hematological, immune, and reproductive systems [25,144,145]. Post-mortem examinations are critical for identifying potential SAEs of the mRNA inoculations. However, as clinics and hospital administrations have a large vested interest in the COVID-19 vaccines’ distribution, the common administrative practice of discouraging autopsies and postponing autopsy reports only serves to undermine comprehensive risk assessment, perpetuate public misconceptions regarding safety, and weaken public health policymaking [145].

Conclusions

Based on the research presented in this narrative review, the global COVID-19 vaccination campaign should be regarded as a grave medical error. Medical errors represent a substantial threat to personal and public safety and have long constituted a leading cause of death [288-290]. Misguided political and regulatory decisions were made at the highest levels and may have been heavily influenced by financial incentives. Government agencies should have considered all reasonable treatment alternatives and deflected pressures from the medical-pharmaceutical industry rather than allowing population-wide distribution of experimental genetic vaccines.

Had the FDA recognized the nearly four-fold increase in cardiac SAEs (including deaths) subsequently identified in the Pfizer trial’s vaccine group [54], it is doubtful that the EUA would have transpired in December 2020. An in-depth investigation of the COVID-19 vaccine’s long-term safety profile is now urgently needed. Despite the many striking revelations discussed in this review, most developed countries continue to advocate the ongoing adoption of COVID-19 mRNA boosters for the entire eligible population. US federal agencies still emphasize the safety of the vaccines in reducing severe illness and deaths caused by the coronavirus, despite the absence of any randomized, double-blind, placebo-controlled trials to support such claims. This reflects a bewildering disconnect between evidence-based scientific thinking and public health policy.

Careful, objective evaluation of COVID-19 mRNA product safety is crucial for upholding ethical standards and evidence-informed decision-making. Our narrative review concerning the registrational trials and the EUA’s aftermath offers evidence-informed insights into how these genetic vaccines were able to enter the market. In the context of the two pivotal trials, safety was never assessed in a manner commensurate with previously established scientific standards either for vaccines or for GTPs, the more accurate classification of these products. Many key trial findings were either misreported or omitted entirely from published reports. The usual safety testing protocols and toxicology requirements were bypassed by the FDA and vaccine manufacturers, and the premature termination of both trials obviated any unbiased assessment of potential SAEs due to an insufficient timeframe for proper trial evaluation.

It was only after the EUA that the serious biological consequences of rushing the trials became evident, with numerous cardiovascular, neurological, reproductive, hematological, malignant, and autoimmune SAEs identified and published in the peer-reviewed medical literature. Moreover, the COVID-19 mRNA vaccines produced via Process 1 and evaluated in the trials were not the same products eventually distributed worldwide; all of the COVID-19 mRNA products released to the public were produced via Process 2 and have been shown to have varying degrees of DNA contamination. The failure of regulatory authorities to heretofore disclose process-related impurities (e.g., SV40) has further increased concerns regarding safety and quality control oversight of mRNA vaccine manufacturing processes.

Since early 2021, excess deaths, cardiac events, strokes, and other SAEs have often been wrongly ascribed to COVID-19 rather than to the COVID-19 mRNA vaccinations. Misattribution of SAEs to COVID-19 often may be due to the amplification of adverse effects when mRNA injections are followed by SARS-CoV-2 subvariant infection. Injuries from the mRNA products overlap with both PACS and severe acute COVID-19 illness, often obscuring the vaccines’ etiologic contributions. Multiple booster injections appear to cause immune dysfunction, thereby paradoxically contributing to heightened susceptibility to COVID-19 infections with successive doses. For the vast majority of adults under the age of 50, the perceived benefits of the mRNA boosters are profoundly outweighed by their potential disabling and life-threatening harms. Potential harms to older adults appear to be excessive as well.

Given the well-documented SAEs and unacceptable harm-to-reward ratio, we urge governments to endorse and enforce a global moratorium on these modified mRNA products until all relevant questions pertaining to causality, residual DNA, and aberrant protein production are answered.

 

Net Zero Not Only Inhuman, It’s Also Ecocidal

Roger Palmer speaks quietly, but with the force of knowledge and logic on the subject of global warming/climate change.  Two expressions of his perspective are presented here: firstly a brief video and transcript, and secondly excerpts from his 2024 paper. Transcript in italics with my bolds and added images.  H/T Raymond Inauen

1. Trust Climate History, Not Hysteria

I’m Roger Palmer, a retired engineer living in Victoria, British Columbia. Today I want to talk about climate change hysteria. The popular press is overflowing with sensational but scary headlines: the hottest day on record, sea levels are rising, climate catastrophe. It’s never been like this before, climate change is an existential threat, we are declaring a climate emergency, it’s man’s fault.

These hysterical messages are reinforced at disruptions organized by career demonstrators and professional protesters. Politicians are falling over themselves to agree with these claims and position themselves as the only viable saviors of mankind who are able to stop the climate from changing. You can’t get elected if you are perceived as being soft on climate change.

The authors of all this spurious noise unfortunately do not have a good understanding of science or the historical paleoclimatic record. These people are so arrogant and self-centered that they believe that man can control the solar system and somehow cancel the naturally occurring climate cycles, so that the earth’s climate stays just the way they want it.

Let’s start the discussion by outlining the difference between weather and climate. When a person speaks about weather, they are referring to how the atmosphere is behaving over the short term, hours or days, and usually over a small area. The term climate refers to the statistics of weather over a defined large region over a long period of time, decades or more. The atmospheric characteristics being described include temperature, winds, moisture, clouds and precipitation.

But it is the temperature that most people seem to focus on. In the 1970s the concern was about global cooling, but it has now shifted to global warming. An example of a weather statement is: “It will be cooler and windy in downtown Ottawa tomorrow.” An example of a climate statement is: “North America will be warmer over the next two decades.”

Reliable equipment for measuring temperature has been available since the early 1800s, but unfortunately the number and placement of temperature recording stations has changed considerably over time, so it is often difficult to get a complete and consistent record of a specific area’s temperature history. Temperatures for the period preceding the 19th century must be inferred by analyzing ice cores, tree growth rings, sediments and corals. Ice cores, typically from Greenland, Antarctica or the Arctic, are the most commonly used proxies, and it is possible to infer temperatures from thousands or millions of years ago. It is also possible to use ice cores to estimate the historical composition of the atmosphere.

Although surface temperature is what humans actually feel on a day-to-day basis, that data can be contaminated by urban heat islands. So it is sometimes more meaningful to talk about the temperature of the troposphere, which is the lowest layer of the earth’s atmosphere, about 20 kilometers thick, and is where all the weather takes place: the clouds, precipitation, storms, winds, etc.  Temperatures in the troposphere can be directly measured by balloon-borne radiosondes or inferred from satellite radiometry.

Geological records show that the earth’s average temperature has varied cyclically for many millions of years. Sometimes it has been much hotter than today and sometimes much cooler. This graph estimates variations in temperature during the last 500 million years. The earth is approximately four and a half billion years old; predecessors of man have been on earth for about two and a half million years; and modern Homo sapiens have been around for about two hundred thousand years.

Here is what the earth’s temperature has been doing over the past five hundred thousand years, and here is the temperature record for more recent times: the last 11,000 years, otherwise known as the Holocene.

The earth would be a much cooler place if it did not have an atmosphere. The atmosphere contains a number of gases that warm the earth by what is called the greenhouse effect: solar radiation from the sun passes easily through these gases to the earth’s surface, but outgoing infrared radiation from the surface is partially blocked from radiating off into space by the same gases. Further details of this mechanism are given in the references.
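
The "much cooler place" claim is the standard radiative-balance calculation, sketched below: with the sun's output and the earth's albedo but no greenhouse gases, the effective radiating temperature comes out near 255 K, roughly 33C colder than the observed global mean of about 288 K.

```python
# Standard no-atmosphere radiative balance: T_eff = (S * (1 - albedo) / (4 * sigma)) ** 0.25
SIGMA = 5.67e-8      # Stefan-Boltzmann constant, W m^-2 K^-4
S = 1361.0           # solar constant, W m^-2
ALBEDO = 0.30        # planetary albedo

t_eff = (S * (1 - ALBEDO) / (4 * SIGMA)) ** 0.25
print(f"Effective temperature without greenhouse warming: {t_eff:.0f} K "
      f"({t_eff - 273.15:.0f} C), vs. roughly 288 K (15 C) observed")
```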

There are several different greenhouse gases but everyone seems to focus on just one of them: carbon dioxide known as CO2. The concentration of carbon dioxide in the atmosphere is sometimes thought to be the driver of the earth’s temperature, but the geological record shows that there has been no correlation.

The absolutely dominant greenhouse gas is water vapor. The earth’s glaciers and ice caps have grown and shrunk cyclically over time. The earth recently exited the Little Ice Age and is currently warming just as in previous cycles. There is definitely a new ice age coming, but none of us will live to experience it. We are currently in an interglacial, which is a period between ice ages.

As shown by the earlier graphs, the earth’s climate is not being driven by changes in the CO2 level. Indeed, changes in the atmospheric CO2 concentration are probably a result of changes in the earth’s temperature, as oceans and land masses release stored CO2 in response to long-term temperature changes. As the glaciers and ice caps cyclically build and recede, there are corresponding changes in sea level. The sea level has cyclically varied from today’s levels by as much as plus or minus 200 meters, and these fluctuations are expected to continue for thousands of years to come.

So what is causing these long-term cyclical changes in the earth’s average temperature? A recently posted YouTube series entitled Paleoclimatology, parts one through three, gives an in-depth analysis of the factors at work. Here is a summary of just some of the main factors:

♦  Continental drift as a result of plate tectonics has caused very long-term climate changes, as the ocean’s heat-carrying currents have been forced to take different paths;

♦  Milankovitch cycles due to changes in the earth’s tilt, precession and orbital eccentricity and cyclical changes in the solar system’s orbital alignments have demonstrably produced corresponding changes in the earth’s climate over both the long term and the short term;

♦  Cyclical changes in the sun’s total radiated power, and cyclical changes in the sun’s output spectral distribution, especially the ultraviolet component;

♦  Variations in the earth’s magnetic field resulting in changes in the magnitude and position of the earth’s magnetosphere which shields us from incoming cosmic particles and the solar wind

♦  Variations in upper level bacteria which serve as nucleation sites for clouds and precipitation

♦  Changes in the earth’s average cloud cover as a result of changes in many of the factors just mentioned

♦  Changes in the earth’s upper atmospheric wind currents that distribute heat energy around the planet.

Note that carbon dioxide concentration is not a significant cause of these natural cyclical changes. CO2 has some effect on long-term climate changes, but it is not the dominant determinant of global temperature. Then why are the agitators and politicians so obsessed with it, and why do they arbitrarily blame man-made CO2 emissions from the combustion of fossil fuels as threatening disruptions to their climate nirvana?

Perhaps there’s a hidden agenda. Current proposals to decarbonize the earth by eliminating fossil fuels will have a minor effect on climate, but will cause extraordinary economic harm. Maybe the true goal of the protesters is to destroy capitalism in the western world.

CO2 is a colorless, odorless gas. Atmospheric CO2 levels have been much higher in the past and were sometimes much lower. CO2 is not a pollutant; it is essential to life. If the atmospheric CO2 concentration were to drop below 150 parts per million, the earth’s vegetation would not be able to survive and the earth would become a barren wasteland.  There have been proposals to use large-scale geoengineering to alter the earth’s climate, such as by surrounding the earth with orbiting reflective particles or mirrors. But such schemes are fraught with political as well as technical dangers.

The Intergovernmental Panel on Climate Change, known as the IPCC, is often identified as the final authority when it comes to questions about the earth’s climate.  However, the IPCC does not conduct research; it merely reviews papers in the field. And the IPCC should not be considered unbiased, because when it was created by the United Nations it was specifically charged to investigate how mankind is causing the earth’s climate to change.

In other words, the conclusion had already been reached that man was to blame before any investigations were performed. The IPCC is a political animal; nothing is published before it has been approved by the representatives of all the participating countries to make sure that it aligns with their governments’ objectives and policies. The IPCC has published numerous forecasts of ever-increasing global temperatures driven by rising atmospheric CO2 concentrations. But these are based on incomplete and inaccurate computer models, and they have all drastically overestimated the forthcoming temperature rise.

These computer models ignore or inadequately account for many factors, including clouds and solar variations. It is claimed that 97% of scientists agree that man-made emissions of CO2 are having significant negative effects on the earth’s climate. However, consensus is not a valid way to conduct scientific research. Groupthink is a major problem in this field. Remember that Galileo was able to prove that the earth orbited the sun rather than the other way around, but public opinion and the church forced him to recant his findings. Consensus overruled scientific evidence, just as it appears to be doing today.

The earth is getting warmer, and it will continue to do so until the temperature trend reverses sometime in the future and we head into the next ice age. Mankind needs to recognize that we are observers of naturally occurring climate cycles. There is very little that we can do to stop, change or influence these cycles. The best thing that man can do is learn to adapt to these natural cycles, and stop wasting our money and damaging our economy on futile and inefficient schemes to reduce man’s CO2 emissions in an attempt to thwart what are perfectly natural cyclical changes of the earth’s climate.

Learn to live with these changes. Mankind has to adapt. Have a nice day and enjoy the warmth while we have it. Here are links to references providing more details on many of these points.

2. Net Zero is Both Suicidal and Ecocidal

Source: Roger Palmer publication  Understanding Climate Change.  Excerpts in italics with my bolds and added images.

Net Zero

As mentioned above, many governments have decided to pursue the goal of becoming “Net Zero” by 2050 (or possibly later). This means that they want all CO2 emitted by man’s activities either to be eliminated or somehow compensated for by 2050 in the belief that this will slow the current rise in global temperatures, and limit the rise to 1.5°C above pre-industrial levels.

As discussed in previous sections, CO2 concentration is not the primary driver of global temperature, and indeed, rising CO2 levels might actually be a result of warming due to entirely natural factors. Despite the dubious scientific justification, politicians and special-interest groups have embraced the “Net Zero” battle cry, and are falling over themselves with announcements, proclamations, and protests as they attempt to destroy the world’s economy.

The concept of Net Zero is that any continuing emissions of CO2 need to be “offset” by actions to remove the same amount of CO2 from the atmosphere. These “offsets” could be the planting of trees that absorb CO2, or they could involve operating actual equipment that removes CO2 from the atmosphere, and then sequesters it in a safe storage facility (this is called CCS, which stands for Carbon Capture and Sequestration). A marketplace has now developed whereby “carbon credits” are bought and sold, and some rather flimsy schemes have been created.

As an example of how ludicrous this churning process is, consider the DRAX power plant in the U.K. This plant was built in 1974 and burned coal to generate electricity in a conventional steam turbine system. Starting in 2013, it was converted to burn compressed wood pellets. The pellets are manufactured in Canada and shipped to the UK from the port of Prince Rupert, BC. They were originally supposed to use scrap wood left over from existing logging operations, but demand eventually required that trees be grown specifically to feed the process. It was claimed that the entire process (growing trees, converting the wood to pellets, transporting them between continents, and then burning them in a thermal power plant) was “sustainable”, because new trees were planted to replace those that were cut down!

Direct Carbon Capture (DCC)

There are several companies developing technology and equipment for actually extracting (“capturing”) CO2 from the air. The CO2 is then stored (“sequestered”) either as a gas or converted to some other form. The justification for doing this is that governments and agencies mistakenly believe that CO2 emissions from human activities are causing the world to warm, and that not only must these emissions stop, but some of the CO2 must be removed in order to lower the concentration in the atmosphere, thereby supposedly preventing future temperature rises.

The processes used for DCC are complex, and require large amounts of energy to operate. It is claimed that the energy will come from “sustainable” sources (hydro, solar, wind, nuclear), so the whole process will help a country reach the goal of “net zero”. Funding for these projects effectively comes from selling “carbon credits”, because governments have inadvisably placed a dollar value on CO2.  If these proposed projects go ahead, the scale and costs involved will be enormous. And remember, lowering the CO2 concentration in the atmosphere by 1 ppm will only potentially reduce the temperature by between 9 and 15 thousandths of a degree C!
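For readers who want to check the order of magnitude of that last claim, here is a minimal back-of-envelope sketch. It relies on the widely used logarithmic forcing approximation (ΔF ≈ 5.35 ln(C/C0) W/m²) together with an assumed sensitivity parameter; the baseline concentration and the sensitivity range are my assumptions, not figures from the source.

```python
import math

# Back-of-envelope estimate of the temperature effect of removing 1 ppm of CO2.
# Assumptions (not from the source text):
#   - logarithmic forcing approximation: dF = 5.35 * ln(C / C0)  [W/m^2]
#   - climate sensitivity parameter lambda, in K per (W/m^2)
C0 = 420.0          # assumed current CO2 concentration, ppm
C1 = C0 - 1.0       # concentration after removing 1 ppm

dF = 5.35 * math.log(C1 / C0)   # change in radiative forcing, W/m^2 (negative)

for lam in (0.7, 1.2):          # assumed sensitivity range, K per W/m^2
    dT = lam * dF
    print(f"lambda = {lam:.1f} K/(W/m^2): dT = {dT * 1000:.1f} thousandths of a degree C")
```

With these assumed values, the cooling from removing 1 ppm comes out at roughly 9 to 15 thousandths of a degree, the same order of magnitude as quoted above.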

Energy and Transportation

As part of the charge toward the Holy Grail of “Net Zero”, the entire transportation infrastructure is being forced to dispense with the burning of fossil fuels. Governments apply so-called “Carbon Taxes” on the sale of hydrocarbon fuels, and the tax rates are methodically being increased as time goes by, in an effort to get users to switch to another type of energy.

Oil has been a major energy source for well over a century. It has a high energy density (i.e., a small, lightweight amount of the substance can release a large amount of energy). A few decades ago there was worldwide concern that we were running out of these fuels and had only a limited supply, but new exploration and extraction techniques, combined with more efficient energy use, have allayed those concerns.

Fossil fuels are converted to energy by the process of combustion. Almost 40% of the material’s potential energy is extracted in modern gasoline or diesel engines, and almost 55% in modern combined-cycle gas-fired power plants. The remaining energy is turned into waste heat. In building heating applications, the fossil fuel is burned to directly create heat: this process can have efficiencies of over 95%. All of these combustion processes generate CO2, and this is the main focus of politicians, scientists, and environmentalists, despite evidence (as outlined earlier) that climate change is not being primarily driven by increases in CO2 concentration.

Wind turbines and solar cells have received most of the publicity in recent years as large arrays of these devices have been installed around the world. The biggest problem is the intermittent nature of their output. To compensate for this, excess generating capacity has to be installed, and very large energy storage devices (batteries, pumped water, etc) have to be included to ensure a reliable source of supply. If electricity is produced by techniques (such as hydro, solar, wind, or nuclear) that do not emit any greenhouse gases, there is strong political motivation to convert existing consumers of fossil fuels to use electricity as their energy source. Transportation has been a major user of fossil fuels, and the sector is highly visible to the public, so there is considerable pressure to electrify it.

Fossil fuels are an ideal way to power mobile devices (especially road vehicles, aircraft, and ships): the energy density (kWh per kg) is very high, and it is easy to refuel quickly as required. There has been much development in electrical technology for road vehicles, but the major problem has been the availability of electrical energy storage devices (primarily batteries) that are small and light enough to fit into the vehicle, and that have sufficient capacity to provide decent range between charges. The energy density (kWh per kg) of modern Li-ion batteries is about 2% that of gasoline or diesel fuel. Some electric cars have met with market success, but battery technology needs a major increase in energy density before electric vehicles are viable for mainstream applications, and then the problem will be one of installing enough charging infrastructure to allow for unimpeded travel without drivers suffering from “range anxiety”.
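The roughly 2% figure can be sanity-checked with rounded, commonly cited ballpark energy densities. The values and drivetrain efficiencies below are my assumptions for illustration, not data from the source.

```python
# Rough comparison of onboard energy per kilogram (assumed ballpark figures).
GASOLINE_KWH_PER_KG = 12.5   # assumed thermal energy density of gasoline
LIION_KWH_PER_KG    = 0.25   # assumed pack-level density of a modern Li-ion battery

ICE_EFFICIENCY = 0.35        # assumed fraction of fuel energy reaching the wheels
EV_EFFICIENCY  = 0.90        # assumed battery-to-wheels efficiency

raw_ratio    = LIION_KWH_PER_KG / GASOLINE_KWH_PER_KG
usable_ratio = (LIION_KWH_PER_KG * EV_EFFICIENCY) / (GASOLINE_KWH_PER_KG * ICE_EFFICIENCY)

print(f"Raw energy density ratio (battery/gasoline): {raw_ratio:.1%}")    # about 2%
print(f"Usable energy per kg ratio at the wheels:    {usable_ratio:.1%}") # about 5%
```

Even after crediting the electric drivetrain’s higher efficiency, the usable energy per kilogram remains far lower, which is the range problem described above.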

Ships, highway trucks and airliners pose their own problems, and are unlikely to be weaned off fossil fuels for some time to come. These applications need energy storage devices with much higher density (both by volume and by weight) than batteries, so the use of hydrogen (produced by electrolysis of water) and fuel cells is being vigorously pursued. Hydrogen can also be burned directly in modified jet engines or even reciprocating engines, but hydrogen has storage issues that need to be addressed.

Hydrogen’s energy density (kWh per kg) is quite high, but it occupies a large volume, so it must be stored at very high pressures if storage tanks are to be kept to a reasonable size. Hydrogen can also be stored in liquid form, but the extremely low cryogenic temperatures required (-253°C) present significant challenges.
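To see why hydrogen storage is a volume problem rather than a weight problem, here is a small sketch using commonly quoted ballpark densities. The figures and the 500 kWh example budget are assumptions for illustration, not values from the source.

```python
# Why hydrogen storage is a volume problem (assumed ballpark figures).
H2_KWH_PER_KG      = 33.3   # assumed lower heating value of hydrogen, kWh/kg
H2_DENSITY_700BAR  = 42.0   # assumed density of H2 gas at 700 bar, kg/m^3
GASOLINE_KWH_PER_L = 8.9    # assumed volumetric energy density of gasoline

energy_needed_kwh = 500.0                               # arbitrary example energy budget
h2_mass_kg  = energy_needed_kwh / H2_KWH_PER_KG         # roughly 15 kg of hydrogen
h2_volume_l = h2_mass_kg / H2_DENSITY_700BAR * 1000     # litres at 700 bar
gasoline_l  = energy_needed_kwh / GASOLINE_KWH_PER_L    # litres of gasoline, same heat

print(f"{h2_mass_kg:.0f} kg H2 -> {h2_volume_l:.0f} L tank at 700 bar "
      f"vs {gasoline_l:.0f} L of gasoline for the same heat content")
```

On these assumptions, the hydrogen weighs little but needs a tank several times the volume of an equivalent gasoline tank, even at 700 bar.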

If it were possible to convert all power generation, heating, and transportation applications to non-fossil-fuel technology, the total amount of man-made CO2 emissions could be reduced by over 50%, but this would have a negligible effect on global temperature. It would of course still be necessary to extract oil and natural gas from the ground for the manufacture of synthetic materials, plastics, asphalt, lubricants, and pharmaceuticals.

Summary and Conclusions

The material reviewed so far in this paper confirms that a large number of factors affect the earth’s climate. Many of these are poorly understood, and some factors probably haven’t even been discovered yet. A number of conclusions can be drawn from the information presented in this document:

a) Climate change is a naturally-occurring, cyclic phenomenon, and it has been going on for millions of years.
b) Climate change is primarily driven by changes in the solar energy impinging on the earth. The dominant factors are variations in the sun (total output power, spectral distribution, sunspot cycles), Milankovitch cycles, and variations in ocean currents (ENSO, PDO, and AMO). Other factors include varying cosmic particle influx and high-altitude bacteria, which cause changes in cloud cover.
c) The primary greenhouse gas is water vapour. The effect of atmospheric CO2 on global temperature change is much less. Because of the non-linear effect of CO2 concentration, increases beyond the current level will have a decreasing effect on the earth’s climate.
d) Man-made CO2 does have a minor effect on global temperature changes, but it is not the dominant factor. A reduction of man-made CO2 emissions would have a negligible effect on global temperature.
e) Man’s understanding of the various climate-influencing factors is very limited.
f) Climate models are not effective at forecasting future long-term global temperatures.
g) There is very little that mankind can do to affect global temperature change. It does not make sense to introduce regulations that will have a negative impact on Western economies in a pointless attempt to change the natural rate of global climate change.
h) Mankind will have to learn to adapt to future climate changes. If mankind is still around in a few thousand years, they will then have to adapt to global cooling and glaciations!

Any legislative efforts to limit man-made carbon dioxide emissions at the local, regional, provincial, or federal levels may be well-intended, but are ultimately futile, and potentially dangerous. These efforts will harm the economy, waste resources, and not significantly affect the naturally-occurring cyclic climatic changes.

 

 

2024 Arctic Ice Seesaw

In January, most of the Arctic ocean basins are frozen over, so the growth of ice extent slows down.  According to MASIE, January on average adds 1.2M km2, and this month it added 1.1M km2. However, 2024 started above average and quickly grew to 14M km2 (14 Wadhams), before slowing down and ending January slightly above the 18-year average.  The few basins that can still grow ice this time of year tend to fluctuate, alternately waxing and waning, which appears as a seesaw pattern in these images.

On the left is the Pacific seesaw, with Bering below and Okhotsk above.  This year Okhotsk added ice steadily, slowing at the end, while Bering waffled up and down mid-month before gaining ice at the end. The Atlantic seesaw is Barents top center and Baffin on the right below Greenland.  Barents grew ice steadily until mid-January, then gave almost all of it back by the end. Baffin added ice slowly all month, then accelerated in the last two weeks.

While the seesaws tilt back and forth on the margins, the bulk of the Arctic is frozen solid. And with limited places where more extent can be added, the pace of overall growth has slowed. Note that at 14.4M km2, Arctic ice extent now has about six weeks to clear the 15M km2 bar by mid-March.

The graph shows the 18-year average gain for January is 1.2M km2.  2024 started with a 275k km2 surplus of ice extent and ended slightly above average, while other recent years were lower.  SII showed lower extents most of the month, with a 253k km2 deficit to MASIE at the end.

Region 2024031 Day 31 Ave. 2024-Ave. 2018031 2024-2018
 (0) Northern_Hemisphere 14396470 14360118 36352 13792271 604199
 (1) Beaufort_Sea 1070983 1070351 632 1070445 538
 (2) Chukchi_Sea 966006 965973 34 965971 35
 (3) East_Siberian_Sea 1087137 1087059 78 1087120 18
 (4) Laptev_Sea 897845 897823 22 897845 0
 (5) Kara_Sea 894933 918701 -23769 895363 -430
 (6) Barents_Sea 473076 569199 -96123 481947 -8872
 (7) Greenland_Sea 711708 607586 104122 501411 210297
 (8) Baffin_Bay_Gulf_of_St._Lawrence 1272213 1331684 -59471 1406903 -134690
 (9) Canadian_Archipelago 854860 853430 1430 853109 1752
 (10) Hudson_Bay 1260903 1260770 133 1260838 66
 (11) Central_Arctic 3214505 3210272 4233 3184817 29688
 (12) Bering_Sea 665225 647841 17384 382207 283018
 (13) Baltic_Sea 71817 62350 9467 41714 30103
 (14) Sea_of_Okhotsk 910937 818756 92181 704398 206539

The table shows regional ice extents in km2.  The few deficits are in Baffin Bay and Barents, offset by sizeable surpluses in Greenland and Okhotsk seas.  Everywhere else is close to maximum for the year.
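For readers who want to reproduce tables like this, the approach is simple: take the MASIE regional daily extents, build a day-of-year average over the baseline years, and difference the current year against it. Below is a minimal sketch; the file name, column labels, and baseline period are my assumptions about the NSIDC CSV layout, so check them against the actual download.

```python
import pandas as pd

# Minimal sketch: one region's extent on a given day vs. a multi-year day-of-year average.
# File name and column names are assumptions; adjust to the actual MASIE regional CSV.
df = pd.read_csv("masie_regional_daily_extent_sqkm.csv")   # hypothetical file name
df["year"] = df["yyyyddd"] // 1000
df["doy"] = df["yyyyddd"] % 1000

region = "(0) Northern_Hemisphere"   # assumed column label
day = 31

baseline = df[df["year"].between(2006, 2023) & (df["doy"] == day)][region].mean()
current = df[(df["year"] == 2024) & (df["doy"] == day)][region].iloc[0]

print(f"Day {day}: 2024 = {current:,.0f} km2, 18-yr ave = {baseline:,.0f} km2, "
      f"diff = {current - baseline:,.0f} km2")
```

Looping the same calculation over all fifteen regions reproduces the layout of the table above.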

The polar bears have a Valentine’s Day wish for Arctic ice.

welovearcticicefinal

And Arctic Ice loves them back, returning every year so the bears can roam and hunt for seals.

Footnote:

Seesaw accurately describes Arctic ice in another sense:  The ice we see now is not the same ice we saw previously.  It is better to think of the Arctic as an ice blender than as an ice cap, explained in the post The Great Arctic Ice Exchange.

Green Electrical Shocks in 2024

These days electrical grid managers are shocked and sounding alarms, not about climate itself, but about dangerous energy policies by ignorant politicians rendering the grid unstable and supply unpredictable.  For example, today’s Just The News article Analysts: ‘Irrational’ policies drive coal plant shutdowns, incentivize overbuilding wind farms.  Excerpts in italics with my bolds.

The South Dakota Public Utilities Commission told the utility that the “premature closure of these [coal] plants adds to the uncertainty of electrical generation resource adequacy in the upper Midwest.” Some energy experts call the government’s policies “irrational.”

Despite ongoing warnings that the electricity grid of the United States is becoming increasingly unstable, a major utility is moving forward with the elimination of two major coal-fired power plants in the upper Midwest. Energy analysts say the instability is a byproduct of the shutdown of reliable generation sources.

In its latest assessment, the North American Electric Reliability Corporation, a grid watchdog, warned that the MISO region is under some of the highest risks for resource inadequacy, which means that during peak demand periods, rolling blackouts are a possibility. Xcel Energy, according to the Energy News Network, is even looking at variable rates to encourage customers to conserve energy and use it during off-peak periods.

Michigan, Minnesota, Illinois and Nebraska have all set goals to decarbonize their grid by 2040 or 2050, which will mean eliminating coal-fired power plants entirely. These goals are on top of federal green energy mandates.

The Public is Blissfully Unaware–That Needs to Change

Some years ago, a weekly news program aired in the Netherlands on the subject of Green Electrical Shocks. It employed images and humor to reveal electrical realities to an audience burdened with misconceptions.  The video clip is below with English subtitles. For those who prefer reading, I provide substantial excerpts from the program with my bolds.

How many of you have Green Electricity? I will estimate 69%
And how much nationally? Oh, 69%!
So we are very average, and in a good way, because the climate is very important.

Let me ask: Green electricity comes from . . .?
Yes, electricity produced from windmills and solar panels.
Nearly 2/3 of the Dutch are using it. That’s the image.

Well I have green news and bad news.
The green news: Well done!
The bad news: It is all one big lie.
Time for the Green Electrical Shocks.

Shock #1: The green electricity from your socket is not green.
When I switched to green electricity I was very proud.
I thought, Yes, well done! The climate is getting warmer, but not any more thanks to me.

Well, that turned out to be untrue.
All producers deliver to one communal grid. Green and grey electricity all mix.
The electricity you use is always a mix of various sources.
OK. It actually makes sense not to have separate green and grey cables for every house.
So it means that of all electricity, 69% is produced in a sustainable way. But then:


Shock #2: Green Electricity is mostly fake.
Most of the green electricity we think we use comes from abroad.
You may think: So what. Green is green.

But that electricity doesn’t come from abroad, it stays abroad.
If you have green electricity at home, it may mean nothing more than that your supplier has bought “green electricity certificates”.

In Europe green electricity gets an official certificate,
Instead of selling on the electricity, they sell on those certificates.
Norway, with its hydro power, has a surplus of certificates.
Dutch suppliers buy them on a massive scale, while the electricity stays in Norway.

The idea was: if countries can sell those certificates, they can make money by producing more green electricity.
But the Norwegians don’t produce more green electricity.
But they do sell certificates.

The Dutch suppliers wave those certificates around and say: Look! Our grey electricity is green.
Only one country has produced green electricity: Norway.
But two countries take the credit.
Norway, because they produce green electricity, and the Netherlands because, on paper, we have green electricity. Get it? That’s a nice deal.

More and more countries sell those certificates. Italy is now the top supplier.
We buy fake green electricity from Italy, like some kind of Karma ham.

Now, let’s look again at the green electricity we all think we use.
So the real picture isn’t 69%. If you cancel the certificates, only 21% of electricity is really green.
Nowadays you can even order it separately if you don’t want to be part of that Norway certificates scam.
You may think: 21% green is still quite a lot. But it is time for:

Shock #3: Not all energy is electricity.
If you talk about the climate, you shouldn’t just consider electricity but all energy.
When you look at all energy, like factories, cars, trains, gas fires, then the share of consumer electricity is virtually nothing.
If you include everything in your calculation, it turns out that only 6% of all the energy we use in the Netherlands is green. It is a comedy, but wait:

Trees converted into pellets by means of petroleum-powered machinery.

Shock #4: Most green energy doesn’t come from sun or wind, like you might think.
Even the 6%, our last green hope, is fake. According to the CBS (Statistics Netherlands) we are using more sun and wind energy, but most of the green energy is produced by the burning of biomass.
Ah, more than half of the 6% green energy is biomass.

Ridiculous. What is biomass really? It is organic materials that we encounter every day.
Like the content of a compost heap. How about maize leaves or hay?
The idea behind burning organic materials is that it will grow up again.
So CO2 is released when you burn it, but it will be absorbed again by new trees.

However, there is one problem. The forest grows very slowly and our power plants burn very fast.
This is the fatal flaw in the thinking about biomass. Power plants burn trees too fast, so my solution: slow fire. Disadvantage: it doesn’t exist. So this is our next shock.

Shock #5: Biomass isn’t all that sustainable.
It’s getting worse. There aren’t enough trees in the Netherlands for biomass.
We can’t do it on our own. We don’t have enough wood, so we get it from America.

In the USA, forests are cut at a high rate. Trees are shredded and compressed into pellets.
These are shipped to the Netherlands and end up in the ovens of the coal plants.
It’s a disaster for the American forests, according to environmental groups.

So we transport American forests on diesel ships to Europe.
Then throw them in the oven because it officially counts as green energy.
Only because the CO2 released this way doesn’t count for our total emissions.

In reality biomass emits more CO2 than natural gas and coal.
These are laws of nature, no matter what European laws say.
At the bottom line, how much sustainable energy do we really have in the Netherlands?
Well, the only real green energy, from windmills, solar panels, etc., is only 2.2% of all the energy we use.

In Conclusion
So the fact that 2/3 of the audience, and of all Dutch people, use green electricity means absolutely nothing. It’s only 2.2%, and crazier still, the government says it should be at 14% by 2020.
They promised: to us, to Europe, to planet Earth: 14 instead of 2.2.

Instead of making a serious attempt to save the climate, they are only working on accounting tricks, like buying pieces of paper in Norway and burning American forests.
They are only saving the climate on paper.
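The broadcast’s chain of numbers is easy to check with simple arithmetic. The four input shares below are the ones quoted in the transcript; the derived figures and the framing of the 14% target are my own reconstruction, not calculations from the program.

```python
# Arithmetic check of the shares quoted in the broadcast.
claimed_green_electricity = 0.69   # share of electricity sold as "green"
real_green_electricity    = 0.21   # after discarding foreign certificates
green_of_all_energy       = 0.06   # green share of total national energy use
wind_solar_of_all_energy  = 0.022  # wind/solar etc. share of total energy use

certificate_gap = claimed_green_electricity - real_green_electricity
biomass_share_of_green = (green_of_all_energy - wind_solar_of_all_energy) / green_of_all_energy

print(f"Certificates account for {certificate_gap:.0%} of the claimed green electricity share")
print(f"Biomass and similar sources make up {biomass_share_of_green:.0%} of the 6% green energy")
print(f"Gap to the 14% target: {0.14 - wind_solar_of_all_energy:.1%} of total energy use")
```

On the quoted figures, certificates explain 48 percentage points of the claimed share, and biomass makes up about 63% of the “green” energy, consistent with the program’s “more than half”.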

Summary Comment

As the stool above shows, the climate change package sits on three premises. The first is the science bit, consisting of an unproven claim that observed warming is caused by humans burning fossil fuels. The second rests on impact studies from billions of research dollars spent uncovering any and all possible negatives from warming. And the third is climate policies showing how governments can “fight climate change.”

It is refreshing to see more and more articles by people reasoning about climate change/global warming and expressing rational positions. Increasingly, analysts are unbundling the package, questioning the science and also pointing out positives from CO2 and warming.  And as the Dutch telecast shows, ineffective government policies are also fair game.

More on flawed climate policies at Reasoning About Climate

Observed vs. Imagined Sea Levels 2023 Update


Such beach decorations exhibit the fervent belief of activists that sea levels are rising fast and will flood the coastlines if we don’t stop burning fossil fuels.  As we will see below, there is a concerted effort to promote this notion, empowered with slick imaging tools to frighten the gullible.  Of course there are frequent media releases sounding the alarms.  For example, some already published in 2024:

How rising sea levels will affect our coastal cities and towns Phys.org

Sea-level rise is here to stay and gathering pace, but the rate of future increase remains uncertain. It largely depends on what happens in Antarctica over the coming decades. This in turn depends on land and sea temperatures around the southern continent, which are directly linked to our efforts to limit global warming to 1.5°C in line with the Paris Agreement. With over 250 million people now living on land less than 2 m above sea level, most in Asia, it is imperative we do everything we can to limit future sea-level rise.

Sea level rise could cost Europe billions in economic losses, study finds CBS News

Some regions of Europe could see “devastating” economic losses in the coming decades due to the rising oceans, researchers say. A new study found that under the worst-case scenario for emissions and sea level rise, the European Union and United Kingdom could lose 872 billion Euros (about $950 billion) by the end of this century, with many regions within them suffering GDP losses between 10% and 21%.

The study, published Thursday in the journal Scientific Reports, analyzed the economic impacts of sea level rise for 271 European regions. Researchers conducted their analysis based on estimates of high greenhouse gas emissions, which drive global temperature increases, a process that causes sea levels to rise.

Climate scientists project sea levels around New York City will rise by 1ft in 2030s alongside tropical storms and hotter temperatures  Daily Mail

The New York City Panel on Climate Change (NPCC) released sea level projections that annual precipitation could increase by up to 10 percent over the years and warming would rise between two and 4.7 degrees Fahrenheit. The projection is part of a final report set to be released this spring. The estimates are based on carbon emissions and greenhouse gas emissions that cause ice caps to melt and increase precipitation, which leads to rising sea levels.

The rest of this post provides a tour of seven US cities demonstrating how the sea level scare machine promotes fear among people living in, or invested in, coastal properties.  In each case there are warnings published in legacy print and TV media, visual simulations powered by computers and desktop publishing, and a comparison of imaginary vs. observed sea level trends, updated with 2023 tidal gauge reports.

[Note: Some readers may be confused by the imagined sea level projections shown in red.  These come from models that include IPCC suppositions in estimating sea level rise in various localities.  For example, from the UCS (Union of Concerned Scientists):

Three sea level rise scenarios, developed by the National Oceanic and Atmospheric Administration (NOAA) and localized for this analysis, are included:

    • A high scenario that assumes a continued rise in global carbon emissions and an increasing loss of land ice; global average sea level is projected to rise about 2 feet by 2045 and about 6.5 feet by 2100.
    • An intermediate scenario that assumes global carbon emissions rise through the middle of the century then begin to decline, and ice sheets melt at rates in line with historical observations; global average sea level is projected to rise about 1 foot by 2035 and about 4 feet by 2100.
    • A low scenario that assumes nations successfully limit global warming to less than 2 degrees Celsius (the goal set by the Paris Climate Agreement) and ice loss is limited; global average sea level is projected to rise about 1.6 feet by 2100.

The charts below also reflect sea level forecasts by state agencies like the California Coastal Commission]
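The imaginary-vs-observed comparisons that follow are straightforward to reproduce: extend a tidal gauge’s linear trend forward from the year 2000 and overlay an accelerating scenario curve. The sketch below uses an assumed gauge trend and gives the scenario an illustrative quadratic shape reaching the high-scenario 6.5 feet by 2100; neither is taken from NOAA’s or the UCS’s actual models.

```python
# Minimal sketch: linear tidal-gauge extrapolation vs. an accelerating scenario curve.
MM_PER_INCH = 25.4

def observed_rise_inches(year, trend_mm_per_yr=2.8, base_year=2000):
    """Linear extrapolation of a tidal-gauge trend (the trend value is an assumption)."""
    return (year - base_year) * trend_mm_per_yr / MM_PER_INCH

def scenario_rise_inches(year, total_by_2100_in=78.0, base_year=2000):
    """Illustrative accelerating (quadratic) curve reaching an assumed 6.5 ft by 2100."""
    frac = (year - base_year) / (2100 - base_year)
    return total_by_2100_in * frac ** 2

for yr in (2023, 2050, 2100):
    print(f"{yr}: observed trend about {observed_rise_inches(yr):4.1f} in, "
          f"scenario about {scenario_rise_inches(yr):5.1f} in")
```

The point of the city charts below is simply this side-by-side: a few inches of linear rise from the gauges versus several feet from the scenario curves.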

Prime US Cities on the “Endangered” List
Newport, R.I.

Examples of Media Warnings

Bangor Daily News:  In Maine’s ‘City of Ships,’ climate change’s coastal threat is already here

Bath, the 8,500-resident “City of Ships,” is among the places in Maine facing the greatest risks from increased coastal flooding because so much of it is low-lying. The rising sea level in Bath threatens businesses along Commercial and Washington streets and other parts of the downtown, according to an analysis by Climate Central, a nonprofit science and journalism organization.

Water levels reached their highest in the city during a record-breaking storm in 1978 at a little more than 4 feet over pre-2000 average high tides, and Climate Central’s sea level team found there’s a 1-in-4 chance of a 5-foot flood within 30 years. That level could submerge homes and three miles of road, cutting off communities that live on peninsulas, and inundate sites that manage wastewater and hazardous waste along with several museums.

UConn Today:  Should We Stay or Should We Go? Shoreline Homes and Rising Sea Levels in Connecticut

As global temperatures rise, so does the sea level. Experts predict it could rise as much as 20 inches by 2050, putting coastal communities, including those in Connecticut, in jeopardy.

One possible solution is a retreat from the shoreline, in which coastal homes are removed to take them out of imminent danger. This solution comes with many complications, including reductions in tax revenue for towns and potentially diminished real estate values for surrounding properties. Additionally, it can be difficult to get people to volunteer to relocate their homes.

Computer Simulations of the Future

Newport Obs Imaged

Imaginary vs. Observed Sea Level Trends (2023 Update)

Boston, Mass.

Example of Media Warnings

From WBUR Radio Boston:  Rising Sea Levels Threaten MBTA’s Blue Line

Could it be the end of the Blue Line as we know it? The Blue Line, which features a mile-long tunnel that travels underwater, and connects the North Shore with Boston’s downtown, is at risk as sea levels rise along Boston’s coast. To understand the threat sea-level rise poses to the Blue Line, and what that means for the rest of the city, we’re joined by WBUR reporter Simón Ríos and Julie Wormser, Deputy Director at the Mystic River Watershed Association.

As sea levels continue to rise, the Blue Line and the whole MBTA system face an existential threat. The MBTA is also facing a serious financial crunch, still reeling from the pandemic, as we attempt to fully reopen the city and the region. Joining us to discuss is MBTA General Manager Steve Poftak.

Computer Simulations of the Future

Boston Obs Imaged2

Imaginary vs. Observed Sea Level Trends (2023 Update)

New York City

Example of Media Warnings

From Quartz: Sea level rise will flood the neighborhood around the UN building with two degrees warming

Right now, of every US city, New York City has the highest population living inside a floodplain. By 2100, seas could rise around the city by as much as six feet. Extreme rainfall is also predicted to rise, with roughly 1½ times more major precipitation events per year by the 2080s, according to a 2015 report by a group of scientists known as the New York City Panel on Climate Change.

But a two-degree warming scenario, which the world is on track to hit, could lock in dramatic sea level rise—possibly as much as 15 feet.

Computer Simulations of the Future

NYC Obs Imaged

Imaginary vs. Observed Sea Level Trends (2023 Update)

Philadelphia, PA.

Example of Media Warnings

From NBC Philadelphia:  Climate Change Studies Show Philly Underwater

NBC10 is looking at data and reading studies on climate change to showcase the impact. There are studies that show if the sea levels continue to rise at this rate, parts of Amtrak and Philadelphia International Airport could be underwater in 100 years.

Computer Simulations of the Future

Philly Obs Imaged

Imaginary vs. Observed Sea Level Trends (2023 Update)

Miami, Florida

Examples of Media Warnings

From WLRN Miami: Miles Of Florida Roads Face ‘Major Problem’ From Sea Rise. Is State Moving Fast Enough?

One 2018 Department of Transportation study has already found that a two-foot rise, expected by mid-century, would imperil a little more than five percent — 250-plus miles — of the state’s most high-traffic highways. That may not sound like a lot, but protecting those highways alone could easily cost several billion dollars. A Cat 5 hurricane could be far worse, with a fifth of the system vulnerable to flooding. The impact to seaports, airports and railroads — likely to also be significant and expensive — is only now under analysis.

From Washington Post:  Before condo collapse, rising seas long pressured Miami coastal properties

Investigators are just beginning to try to unravel what caused the Champlain Towers South to collapse into a heap of rubble, leaving at least 159 people missing as of Friday. Experts on sea-level rise and climate change caution that it is too soon to speculate whether rising seas helped destabilize the oceanfront structure. The 40-year-old building was relatively new compared with others on its stretch of beach in the town of Surfside.

But it is already clear that South Florida has been on the front lines of sea-level rise and that the effects of climate change on the infrastructure of the region — from septic systems to aquifers to shoreline erosion — will be a management problem for years.

Computer Simulations of the Future

Florida Obs Imaged

Imaginary vs. Observed Sea Level Trends (2023 Update)

Houston, Texas

Example of Media Warnings

From Undark:  A $26-Billion Plan to Save the Houston Area From Rising Seas

As the sea rises, the land is also sinking: In the last century, the Texas coast sank about 2 feet into the sea, partly due to excessive groundwater pumping. Computer models now suggest that climate change will further lift sea levels somewhere between 1 and 6 feet over the next 50 years. Meanwhile, the Texas coastal population is projected to climb from 7 to 9 million people by 2050.

Protecting Galveston Bay is no simple task. The bay is sheltered from the open ocean by two low, sandy strips of land — Galveston Island and Bolivar Peninsula — separated by the narrow passage of Bolivar Roads. When a sufficiently big storm approaches, water begins to rush through that gap and over the island and peninsula, surging into the bay.

Computer Simulations of the Future

Galv Obs Imaged

Imaginary vs. Observed Sea Level Trends (2023 Update)

San Francisco, Cal.

Example of Media Warnings

From San Francisco Chronicle:  Special Report: SF Bay Sea Level Rise–Hayward

Sea level rise is fueled by higher global temperatures that trigger two forces: Warmer water expands oceans while the increased temperatures hasten the melting of glaciers on Antarctica and Greenland and add yet more water to the oceans.

The California Ocean Protection Council, a branch of state government, forecasts a 1-in-7 chance that the average daily tides in the bay will rise 2 or more feet by 2070. This would cause portions of the marshes and bay trail in Hayward to be underwater during high tides. Add another 2 feet, on the higher end of the council’s projections for 2100, and they’d be permanently submerged. Highway 92 would flood during major storms. So would the streets leading into the power plant.

From San Francisco Chronicle Special Report: SF Bay Sea Level Rise–Mission Creek

Along San Francisco’s Mission Creek, sea level rise unsettles the waters.  Each section of this narrow channel must be tailored differently to meet an uncertain future. Do nothing, and the combination of heavy storms with less than a foot of sea level rise could send Mission Creek spilling over its banks in a half-dozen places, putting nearby housing in peril and closing the two bridges that cross the channel.

Whatever the response, we won’t know for decades if the city’s efforts can keep pace with the impact of global climatic forces that no local government can control.

Though Mission Creek is unique, the larger dilemma is one that affects all nine Bay Area counties.

Computer Simulations of the Future

SF Obs Imaged

Imaginary vs. Observed Sea Level Trends (2023 Update)

 

Summary: This is a relentless, high-tech communications machine raising all kinds of scary future possibilities, based upon climate model projections and the unfounded theory of CO2-driven global warming/climate change.  The graphs above are centered on the year 2000, so that the 21st-century added sea level rise is projected from that year forward.  In addition, we now have observations at tidal gauges for the first 23 years, nearly 1/4 of the total period expected.  The gauges in each city are the ones with the longest continuous service record, and wherever possible the locations shown in the simulations are not far from the tidal gauge.  For example, NYC’s best gauge is at the Battery, and Fulton St. is also near Manhattan’s southern tip.

Already the imaginary rises are diverging greatly from observations, yet the chorus of alarm goes on.  In fact, the added rise to 2100 extrapolated from the tidal gauges ranges from 6 to 9.5 inches, except for Galveston projecting 20.6 inches. Meanwhile, the models imagine rises of 69 to 108 inches. Clearly coastal settlements must adapt to evolving conditions, but they also need reasonable rather than fearful forecasts for planning purposes.
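Those “added rise to 2100” figures come from projecting each gauge’s linear trend across the 2000–2100 century. A minimal sketch of that conversion is below; the two trend values are placeholders chosen for illustration, not the published statistics for any particular gauge.

```python
# Convert a tidal-gauge linear trend (mm/yr) into inches of rise over 2000-2100.
MM_PER_INCH = 25.4
YEARS = 2100 - 2000

# Placeholder trends in mm/yr; substitute each gauge's published value.
example_trends = {"typical East Coast gauge": 2.4, "Galveston (with land subsidence)": 5.2}

for gauge, trend_mm_per_yr in example_trends.items():
    rise_inches = trend_mm_per_yr * YEARS / MM_PER_INCH
    print(f"{gauge}: {trend_mm_per_yr} mm/yr -> {rise_inches:.1f} inches by 2100")
```

A gauge rising at a couple of millimetres per year yields well under a foot by 2100, which is why the observed trend lines sit so far below the scenario curves in the charts above.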

Footnote:  The problem of urban flooding is discussed in some depth at a previous post Urban Flooding: The Philadelphia Story

Background on the current sea level campaign is at USCS Warnings of Coastal Floodings

And as always, an historical perspective is important:

post-glacial_sea_level