Climate Model Upgraded: INMCM5 Under the Hood

2018 Update: Best Climate Model INMCM5, October 22, 2018 (follow link for the latest info on INMCM5)

A previous analysis, Temperatures According to Climate Models, showed that only one of 42 CMIP5 models was close to hindcasting past temperature fluctuations. That model was INMCM4, which also projected an unalarming 1.4 °C of warming by the end of the century, in contrast to the other models, which are programmed to produce future warming at five times the past rate.

In a recent comment thread, someone asked what has been done recently with that model, given that it appears to be “best of breed.” So I went looking and this post summarizes further work to produce a new, hopefully improved version by the modelers at the Institute of Numerical Mathematics of the Russian Academy of Sciences.

Institute of Numerical Mathematics, Russian Academy of Sciences, Moscow, Russia

Earlier this year came this publication Simulation of the present-day climate with the climate model INMCM5 by E.M. Volodin et al. Excerpts below with my bolds.

In this paper we present the fifth generation of the INMCM climate model that is being developed at the Institute of Numerical Mathematics of the Russian Academy of Sciences (INMCM5). The most important changes with respect to the previous version (INMCM4) were made in the atmospheric component of the model. Its vertical resolution was increased to resolve the upper stratosphere and the lower mesosphere. A more sophisticated parameterization of condensation and cloudiness formation was introduced as well. An aerosol module was incorporated into the model. The upgraded oceanic component has a modified dynamical core optimized for better implementation on parallel computers and has two times higher resolution in both horizontal directions.

Analysis of the present-day climatology of the INMCM5 (based on the data of the historical run for 1979–2005) shows moderate improvements in reproduction of basic circulation characteristics with respect to the previous version. Biases in the near-surface temperature and precipitation are slightly reduced compared with INMCM4, as well as biases in oceanic temperature, salinity and sea surface height. The most notable improvement over INMCM4 is the capability of the new model to reproduce the equatorial stratospheric quasi-biennial oscillation and the statistics of sudden stratospheric warmings.

 

The family of INMCM climate models, as most climate system models, consists of two main blocks: the atmosphere general circulation model, and the ocean general circulation model. The atmospheric part is based on the standard set of hydrothermodynamic equations with hydrostatic approximation written in advective form. The model prognostic variables are wind horizontal components, temperature, specific humidity and surface pressure.

Atmosphere Module

The INMCM5 borrows most of its atmospheric parameterizations from the previous version. One of the few notable changes is the new parameterization of clouds and large-scale condensation. In the INMCM5, cloud area and cloud water are computed prognostically according to Tiedtke (1993). That includes the formation of large-scale cloudiness as well as clouds in the atmospheric boundary layer and clouds of deep convection. Decrease of cloudiness due to mixing with the unsaturated environment and precipitation formation are also taken into account. Evaporation of precipitation is implemented according to Kessler (1969).

In the INMCM5 the atmospheric model is complemented by the interactive aerosol block, which is absent in the INMCM4. Concentrations of coarse and fine sea salt, coarse and fine mineral dust, SO2, sulfate aerosol, hydrophilic and hydrophobic black and organic carbon are all calculated prognostically.

Ocean Module

The oceanic module of the INMCM5 uses generalized spherical coordinates. The model “South Pole” coincides with the geographical one, while the model “North Pole” is located in Siberia, outside the ocean area, to avoid numerical problems near the pole. A vertical sigma-coordinate is used. The finite-difference equations are written on the Arakawa C-grid. The differential and finite-difference equations, as well as the methods of solving them, can be found in Zalesny et al. (2010).

The INMCM5 uses explicit schemes for advection, while the INMCM4 used schemes based on splitting along coordinates. Also, an iterative method for solving the linear shallow-water equation systems is used in the INMCM5 rather than the direct method used in the INMCM4. These two changes were made to improve the model's parallel scalability. The horizontal resolution of the ocean part of the INMCM5 is 0.5 × 0.25° in longitude and latitude (compared to the INMCM4’s 1 × 0.5°).
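The stated resolutions imply a fourfold increase in horizontal ocean grid cells. A quick sanity check, treating the grids as plain lon-lat meshes (an assumption for illustration only: the actual INMCM grids use generalized spherical coordinates with a displaced North Pole, so the real counts differ):

```python
def grid_points(dlon, dlat):
    """Number of horizontal cells in a simple lon-lat grid with the given spacing (degrees)."""
    return int(360 / dlon) * int(180 / dlat)

inmcm4 = grid_points(1.0, 0.5)    # 360 * 360 = 129,600 cells
inmcm5 = grid_points(0.5, 0.25)   # 720 * 720 = 518,400 cells
print(inmcm5 / inmcm4)            # 4.0: halving both spacings quadruples the cell count
```

Quadrupling the cell count (on top of a shorter time step that finer grids typically require) is a large jump in computational cost, which is why the parallel-scalability changes above matter.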

Both the INMCM4 and the INMCM5 have 40 levels in the vertical. The parallel implementation of the ocean model can be found in Terekhov et al. (2011). The oceanic block includes vertical mixing and isopycnal diffusion parameterizations (Zalesny et al. 2010). Sea ice dynamics and thermodynamics are parameterized according to Iakovlev (2009). Elastic-viscous-plastic rheology and a single ice thickness gradation are assumed. The time step in the oceanic block of the INMCM5 is 15 min.
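The vertical sigma-coordinate mentioned above is terrain-following: level depths scale with the local ocean depth. A minimal sketch of that mapping; the level values here are illustrative, not the model's actual 40 levels:

```python
def sigma_to_depth(sigmas, bottom_depth):
    """Map terrain-following sigma levels (0 at surface, 1 at bottom) to depths in meters."""
    return [s * bottom_depth for s in sigmas]

sigmas = [0.0, 0.1, 0.5, 1.0]           # illustrative levels only
print(sigma_to_depth(sigmas, 200.0))    # shelf column:  [0.0, 20.0, 100.0, 200.0]
print(sigma_to_depth(sigmas, 4000.0))   # deep column:   [0.0, 400.0, 2000.0, 4000.0]
```

The same level set thus resolves shallow shelves finely while stretching over the deep ocean, which is the usual trade-off of sigma coordinates.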

Note the size of the human emissions next to the red arrow.

Carbon Cycle Module

The climate model INMCM5 has a carbon cycle module (Volodin 2007), in which atmospheric CO2 concentration and carbon in vegetation, soil and ocean are calculated. In soil, a single carbon pool is considered. In the ocean, the only prognostic variable in the carbon cycle is total inorganic carbon. The biological pump is prescribed. The model calculates methane emission from wetlands and has a simplified methane cycle (Volodin 2008). Parameterizations of some electrical phenomena, including calculation of the ionospheric potential and flash intensity (Mareev and Volodin 2014), are also included in the model.

Surface Temperatures

When compared to the INMCM4 surface temperature climatology, the INMCM5 shows several improvements. The negative bias over continents is reduced, mainly because of the increase in daily minimum temperature over land, which was achieved by tuning the surface flux parameterization. In addition, the positive summer bias over southern Europe and the eastern USA, typical of many climate models (Mueller and Seneviratne 2014), is almost absent in the INMCM5. A possible reason for this bias in many models is a shortage of soil water and suppressed evaporation, leading to overestimation of the surface temperature. In the INMCM5 this problem was addressed by increasing the minimum leaf resistance for some vegetation types.

Nevertheless, some problems migrate from one model version to the next: negative bias over most of the subtropical and tropical oceans, and positive bias over the Atlantic to the east of the USA and Canada. The root mean square (RMS) error of annual mean near-surface temperature was reduced from 2.48 K in the INMCM4 to 1.85 K in the INMCM5.
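RMS figures like these come from comparing gridded model and observed fields, usually weighting each cell by area (cosine of latitude). A hedged sketch of that metric, not the authors' code, with toy placeholder arrays:

```python
import numpy as np

def rms_error(model, obs, lats_deg):
    """Latitude-weighted RMS difference of two (lat, lon) fields, in the fields' units."""
    w = np.cos(np.deg2rad(lats_deg))[:, None] * np.ones_like(model)  # area weights per cell
    sq = (model - obs) ** 2
    return float(np.sqrt((sq * w).sum() / w.sum()))

lats = np.array([-60.0, 0.0, 60.0])
obs = np.zeros((3, 4))
model = obs + np.array([[2.0], [1.0], [2.0]])   # toy bias, larger at high latitudes
print(round(rms_error(model, obs, lats), 3))    # 1.581: high-latitude errors are down-weighted
```

The weighting matters: an unweighted RMS of the same toy field would be larger, because the 2 K errors sit at latitudes with less area.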

Precipitation

In mid-latitudes, the positive precipitation bias over the ocean prevails in winter while negative bias occurs in summer. Compared to the INMCM4, the biases over the western Indian Ocean, Indonesia, the eastern tropical Pacific and the tropical Atlantic are reduced. A possible reason for this is the better reproduction of the tropical sea surface temperature (SST) in the INMCM5 due to the increase of the spatial resolution in the oceanic block, as well as the new condensation scheme. The RMS annual mean model bias for precipitation is 1.35 mm/day for the INMCM5 compared to 1.60 mm/day for the INMCM4.

Cloud Radiation Forcing

Cloud radiation forcing (CRF) at the top of the atmosphere is one of the most important climate model characteristics, as errors in CRF frequently lead to an incorrect surface temperature.
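CRF is conventionally diagnosed as the difference between all-sky and clear-sky fluxes at the top of the atmosphere. A minimal sketch with made-up fluxes, using one common sign convention (shortwave CRF negative for cooling, longwave positive for warming); this is an illustration of the diagnostic, not INMCM5's code:

```python
def shortwave_crf(sw_clear_absorbed, sw_all_absorbed):
    """Negative values: clouds reflect sunlight, so less shortwave is absorbed."""
    return sw_all_absorbed - sw_clear_absorbed

def longwave_crf(olr_clear, olr_all):
    """Positive values: clouds trap outgoing longwave radiation."""
    return olr_clear - olr_all

sw = shortwave_crf(290.0, 245.0)   # -45 W/m^2, near the observed ~-47 in the table below
lw = longwave_crf(265.0, 240.0)    # +25 W/m^2, near the observed ~26
print(sw, lw, sw + lw)             # net CRF ~ -20 W/m^2: clouds cool on balance
```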

In the high latitudes, model errors in shortwave CRF are small. The model underestimates longwave CRF in the subtropics but overestimates it in the high latitudes. Errors in longwave CRF in the tropics tend to partially compensate errors in shortwave CRF. Both errors have a positive sign near 60 S, leading to a warm bias in the surface temperature there. As a result, there is some underestimation of the absolute value of the net CRF at almost all latitudes except the tropics. Additional experiments with a tuned conversion of cloud water (ice) to precipitation (for upper-level cloudiness) showed that the model bias in net CRF could be reduced, but the RMS bias in surface temperature would increase in that case.

 

A table from another paper compares the climate parameters simulated by successive INMCM versions with observations.

| Climate Parameter | Observations | INMCM3 | INMCM4 | INMCM5 |
|---|---|---|---|---|
| Incoming solar radiation at TOA | 341.3 [26] | 341.7 | 341.8 | 341.4 |
| Outgoing solar radiation at TOA | 96–100 [26] | 97.5 ± 0.1 | 96.2 ± 0.1 | 98.5 ± 0.2 |
| Outgoing longwave radiation at TOA | 236–242 [26] | 240.8 ± 0.1 | 244.6 ± 0.1 | 241.6 ± 0.2 |
| Solar radiation absorbed by surface | 154–166 [26] | 166.7 ± 0.2 | 166.7 ± 0.2 | 169.0 ± 0.3 |
| Solar radiation reflected by surface | 22–26 [26] | 29.4 ± 0.1 | 30.6 ± 0.1 | 30.8 ± 0.1 |
| Longwave radiation balance at surface | –54 to –58 [26] | –52.1 ± 0.1 | –49.5 ± 0.1 | –63.0 ± 0.2 |
| Solar radiation reflected by atmosphere | 74–78 [26] | 68.1 ± 0.1 | 66.7 ± 0.1 | 67.8 ± 0.1 |
| Solar radiation absorbed by atmosphere | 74–91 [26] | 77.4 ± 0.1 | 78.9 ± 0.1 | 81.9 ± 0.1 |
| Direct heat flux from surface | 15–25 [26] | 27.6 ± 0.2 | 28.2 ± 0.2 | 18.8 ± 0.1 |
| Latent heat flux from surface | 70–85 [26] | 86.3 ± 0.3 | 90.5 ± 0.3 | 86.1 ± 0.3 |
| Cloud amount, % | 64–75 [27] | 64.2 ± 0.1 | 63.3 ± 0.1 | 69 ± 0.2 |
| Solar radiation cloud forcing at TOA | –47 [26] | –42.3 ± 0.1 | –40.3 ± 0.1 | –40.4 ± 0.1 |
| Longwave radiation cloud forcing at TOA | 26 [26] | 22.3 ± 0.1 | 21.2 ± 0.1 | 24.6 ± 0.1 |
| Near-surface air temperature, °C | 14.0 ± 0.2 [26] | 13.0 ± 0.1 | 13.7 ± 0.1 | 13.8 ± 0.1 |
| Precipitation, mm/day | 2.5–2.8 [23] | 2.97 ± 0.01 | 3.13 ± 0.01 | 2.97 ± 0.01 |
| River water inflow to the World Ocean, 10^3 km^3/year | 29–40 [28] | 21.6 ± 0.1 | 31.8 ± 0.1 | 40.0 ± 0.3 |
| Snow coverage in Feb., mil. km^2 | 46 ± 2 [29] | 37.6 ± 1.8 | 39.9 ± 1.5 | 39.4 ± 1.5 |
| Permafrost area, mil. km^2 | 10.7–22.8 [30] | 8.2 ± 0.6 | 16.1 ± 0.4 | 5.0 ± 0.5 |
| Land area prone to seasonal freezing in NH, mil. km^2 | 54.4 ± 0.7 [31] | 46.1 ± 1.1 | 48.3 ± 1.1 | 51.6 ± 1.0 |
| Sea ice area in NH in March, mil. km^2 | 13.9 ± 0.4 [32] | 12.9 ± 0.3 | 14.4 ± 0.3 | 14.5 ± 0.3 |
| Sea ice area in NH in Sept., mil. km^2 | 5.3 ± 0.6 [32] | 4.5 ± 0.5 | 4.5 ± 0.5 | 6.1 ± 0.5 |

Heat flux units are W/m^2; other units are given with the corresponding parameter name. Where possible, ± shows the standard deviation of the annual mean value. Source: Simulation of Modern Climate with the New Version of the INM RAS Climate Model. (Bracketed numbers refer to sources for the observations.)
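A quick arithmetic check on the table: incoming solar minus reflected solar minus outgoing longwave gives each model version's top-of-atmosphere residual (the imbalance taken up by the system, plus any model drift). Using the central values from the table:

```python
# (incoming solar, outgoing/reflected solar, outgoing longwave) at TOA, W/m^2
rows = {
    "INMCM3": (341.7, 97.5, 240.8),
    "INMCM4": (341.8, 96.2, 244.6),
    "INMCM5": (341.4, 98.5, 241.6),
}
for name, (incoming, reflected, olr) in rows.items():
    residual = incoming - reflected - olr
    print(f"{name}: TOA residual = {residual:+.1f} W/m^2")
# INMCM3: +3.4, INMCM4: +1.0, INMCM5: +1.3
```

The newer versions sit near a ~1 W/m^2 residual, the order of magnitude usually quoted for the real planetary imbalance, while INMCM3's +3.4 W/m^2 is noticeably larger.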

Ocean Temperature and Salinity

The model biases in potential temperature and salinity averaged over longitude with respect to WOA09 (Antonov et al. 2010) are shown in Fig. 12. Positive bias in the Southern Ocean penetrates from the surface down to 300 m, while negative bias in the tropics can be seen even in the 100–1000 m layer.

Nevertheless, the zonal mean temperature error at any level from the surface to the bottom is small. This was not the case for the INMCM4, where one could see a negative temperature bias of up to 2–3 K from 1.5 km to the bottom at nearly all latitudes, and a 2–3 K positive bias at levels of 700–1000 m. The reason for this improvement is the introduction of a higher background coefficient for vertical diffusion at great depth (3000 m and below) than at intermediate depths (300–500 m). The positive temperature bias at 45–65 N at all depths could probably be explained by shortcomings in the representation of deep convection [similar errors can be seen for most of the CMIP5 models (Flato et al. 2013, their Fig. 9.13)].

Another feature common to many present-day climate models (and to the INMCM5 as well) is a negative bias in southern tropical ocean salinity from the surface to 500 m. It can be explained by overestimation of precipitation at the southern branch of the Intertropical Convergence Zone. Meridional heat flux in the ocean (Fig. 13) is not far from available estimates (Trenberth and Caron 2001). It looks similar to that of the INMCM4, but the maximum northward transport in the Atlantic in the INMCM5 is about 0.1–0.2 × 10^15 W higher than in the INMCM4, probably because of the increased horizontal resolution in the oceanic block.

Sea Ice

In the Arctic, the model sea ice area is just slightly overestimated. The overestimation of Arctic sea ice area is connected with a negative bias in the surface temperature. At the same time, a connection between the sea ice area error and the positive salinity bias is not evident, because ice formation is almost compensated by ice melting, and the net salinity source from this pair of processes is not large. The amplitude and phase of the sea ice annual cycle are reproduced correctly by the model. In the Antarctic, sea ice area is underestimated by a factor of 1.5 in all seasons, apparently due to the positive temperature bias. Note that the correct simulation of sea ice area dynamics in both hemispheres simultaneously is a difficult task for climate modeling.

The analysis of the model time series of SST anomalies shows that the El Niño event frequency is approximately the same in the model and the data, but the model El Niños happen too regularly. The atmospheric response to El Niño events is also underestimated in the model by a factor of 1.5 with respect to the reanalysis data.
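The event counting behind such statistics can be sketched simply. A common operational definition (an assumption here; the paper does not state its criterion) counts an El Niño when the Niño 3.4 SST anomaly stays at or above +0.5 °C for five consecutive months:

```python
def count_el_nino_events(anoms, threshold=0.5, min_run=5):
    """Count warm events: runs of at least `min_run` consecutive anomalies >= threshold."""
    events, run = 0, 0
    for a in anoms:
        run = run + 1 if a >= threshold else 0
        if run == min_run:   # count each qualifying run exactly once
            events += 1
    return events

# Toy monthly anomaly series (deg C), not real data
series = [0.1, 0.6, 0.7, 0.8, 0.9, 1.1, 0.2, -0.4, 0.6, 0.6, 0.3]
print(count_el_nino_events(series))  # 1: only the first warm spell persists for 5 months
```

Applied to long model and observed series, counts and inter-event spacing give the frequency and regularity statistics the paper compares.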

Conclusion

Based on the CMIP5 model INMCM4, the next version of the Institute of Numerical Mathematics RAS climate model was developed (INMCM5). The most important changes include new parameterizations of large-scale condensation (cloud fraction and cloud water are now prognostic variables), and increased vertical resolution in the atmosphere (73 vertical levels instead of 21, with the top model level raised from 30 to 60 km). In the oceanic block, horizontal resolution was increased by a factor of 2 in both directions.

The climate model was supplemented by the aerosol block. The model received a new parallel code with improved computational efficiency and scalability. With the new version of the climate model we performed a test run (80 years) to simulate the present-day Earth climate. The model mean state was compared with the available datasets. The structures of the surface temperature and precipitation biases in the INMCM5 are typical of present climate models. Nevertheless, the RMS errors in surface temperature and precipitation, as well as in zonal mean temperature and zonal wind, are reduced in the INMCM5 with respect to its previous version, the INMCM4.

The model is capable of reproducing the equatorial stratospheric QBO and SSWs. The model biases for sea surface height and surface salinity are reduced in the new version as well, probably due to the increased spatial resolution in the oceanic block. The bias in ocean potential temperature at depths below 700 m in the INMCM5 is also reduced with respect to the INMCM4, likely because of the tuned background vertical diffusion coefficient.

Model sea ice area is reproduced well enough in the Arctic but is underestimated in the Antarctic (as a result of the overestimated surface temperature). The RMS error in surface salinity is reduced almost everywhere compared to the previous model, except in the Arctic (where the positive bias becomes larger). As a final remark, one can conclude that the INMCM5 is substantially better in almost all respects than its previous version, and we plan to use this model as a core component in the coming CMIP6 experiment.

Summary

On the one hand, this model example shows that the intent is simple: to represent dynamically the energy balance of our planetary climate system. On the other hand, the model description shows how many parameters are involved and how complex the interacting processes are. The attempt to simulate the operations of the climate system is a monumental task with many outstanding challenges, and this latest version is another step in an iterative development.

Note: Regarding the influence of rising CO2 on the energy balance: global warming advocates estimate a CO2 perturbation of 4 W/m^2. In the climate parameters table above, observations of the radiation fluxes have a 2 W/m^2 error range at best, and in several cases are observed in ranges of 10 to 15 W/m^2.
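To make that comparison concrete, here are the widths of several observed ranges from the climate parameters table, set against the cited 4 W/m^2 CO2 perturbation:

```python
# Observed (low, high) bounds in W/m^2, taken from the table's observation column
ranges = {
    "Outgoing solar radiation at TOA": (96, 100),
    "Outgoing longwave radiation at TOA": (236, 242),
    "Solar radiation absorbed by surface": (154, 166),
    "Latent heat flux from surface": (70, 85),
}
co2_forcing = 4.0  # W/m^2, the figure cited in the note above
for name, (lo, hi) in ranges.items():
    print(f"{name}: observed range spans {hi - lo} W/m^2 vs {co2_forcing} W/m^2 CO2 signal")
```

Several of the observational spreads (6, 12, 15 W/m^2) are wider than the perturbation being sought, which is the note's point about signal versus measurement uncertainty.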

We do not yet have access to the time series temperature outputs from INMCM5 to compare with observations or with other CMIP6 models.  Presumably that will happen in the future.

Early Schematic: Flows and Feedbacks for Climate Models

Warming from CO2 Unlikely

Figure 5. Simplification of IPCC AR5 shown above in Fig. 4. The colored lines represent the range of results for the models and observations. The trends here represent trends at different levels of the tropical atmosphere from the surface up to 50,000 ft. The gray lines are the bounds for the range of observations, the blue for the range of IPCC model results without extra GHGs, and the red for IPCC model results with extra GHGs. The key point displayed is the lack of overlap between the GHG model results (red) and the observations (gray). The non-GHG model runs (blue) overlap the observations almost completely.

A recent post at Friends of Science alerted me to an important proof against the CO2 global warming claim. It was included in John Christy’s testimony 29 Mar 2017 at the House Committee on Science, Space and Technology. The text below is from that document which can be accessed here. (My bolds)

Main Point: IPCC Assessment Reports show that the IPCC climate models performed best versus observations when they did not include extra GHGs and this result can be demonstrated with a statistical model as well.

(5)  A simple statistical model that passed the same “scientific-method” test

The IPCC climate models performed best versus observations when they did not include extra GHGs, and this result can be demonstrated with a statistical model as well. I was coauthor of a report which produced such an analysis (Wallace, J., J. Christy, and J. D’Aleo, “On the existence of a ‘Tropical Hot Spot’ & the validity of the EPA’s CO2 Endangerment Finding – Abridged Research Report,” August 2016, available here).

In this report we examine annual estimates from many sources of global and tropical deep-layer temperatures since 1959 and since 1979, utilizing explanatory variables that did not include rising CO2 concentrations. We applied the model to estimates of global and tropical temperature from the satellite and balloon sources, individually, shown in Fig. 2 above. The explanatory variables are those that have been known for decades, such as indices of the El Niño-Southern Oscillation (ENSO), volcanic activity, and solar activity (e.g. see Christy and McNider, 1994, “Satellite greenhouse signal,” Nature, 367, 27 Jan). [One of the ENSO explanatory variables was the accumulated MEI (Multivariate ENSO Index, see https://www.esrl.noaa.gov/psd/enso/mei/), in which the index was summed through time to provide an indication of its accumulated impact. This “accumulated MEI” was shown to be a potential factor in global temperatures by Spencer and Braswell, 2014 (“The role of ENSO in global ocean temperature changes during 1955-2011 simulated with a 1D climate model,” Asia-Pac. J. Atmos. Sci. 50(2), 229-237, DOI:10.1007/s13143-014-001-z). Interestingly, later work has shown that this “accumulated MEI” has virtually the same impact as the accumulated solar index, both of which generally paralleled the rise in temperatures through the 1980s and 1990s and the slowdown in the 21st century. Thus our report would have the same conclusion with or without the “accumulated MEI.”]
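The approach described, regressing temperature on natural explanatory variables including an accumulated (running-sum) MEI, can be sketched with synthetic data. This is an illustrative reconstruction under stated assumptions, not the report's actual model, data, or coefficients:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 120                                        # months of synthetic data
mei = rng.standard_normal(n)                   # stand-in ENSO index
volcanic = np.zeros(n)
volcanic[40:52] = 1.0                          # stand-in eruption pulse
solar = np.sin(np.linspace(0, 2 * np.pi, n))   # stand-in solar cycle
acc_mei = np.cumsum(mei)                       # the "accumulated MEI" construction

# Synthetic "temperature" built from those factors plus noise
temp = 0.05 * acc_mei - 0.3 * volcanic + 0.1 * solar + 0.05 * rng.standard_normal(n)

# Ordinary least squares on the natural factors only (no CO2 term)
X = np.column_stack([np.ones(n), acc_mei, volcanic, solar])
coef, *_ = np.linalg.lstsq(X, temp, rcond=None)
fitted = X @ coef
r2 = 1 - np.sum((temp - fitted) ** 2) / np.sum((temp - temp.mean()) ** 2)
print(f"R^2 = {r2:.2f}")   # high here because the synthetic series was built from these factors
```

The cumulative-sum step is the key trick: it lets an oscillating index carry a slow, trend-like influence, which is how the report's "accumulated MEI" can track multi-decadal warming.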

The basic result of this report is that the temperature trend of several datasets since 1979 can be explained by variations in the components that naturally affect the climate, just as the IPCC inadvertently indicated in Fig. 5 above. The advantage of the simple statistical treatment is that complicated processes such as clouds, ocean-atmosphere interaction, aerosols, etc., are implicitly incorporated through the statistical relationships discovered in the actual data. Climate models attempt to calculate these highly non-linear processes from imperfect parameterizations (estimates), whereas the statistical model directly accounts for them, since the bulk atmospheric temperature is the response variable these processes impact. It is true that the statistical model does not know what each sub-process is or how each might interact with other processes. But it also must be made clear: it is an understatement to say that no IPCC climate model accurately incorporates all of the non-linear processes that affect the system. I simply point out that because the model is constrained by the ultimate response variable (bulk temperature), these highly complex processes are included.

The fact that this statistical model explains 75-90 percent of the real annual temperature variability, depending on dataset, using these influences (ENSO, volcanoes, solar) is an indication the statistical model is useful. In addition, the trends produced from this statistical model are not statistically different from the actual data (i.e. passing the “scientific-method” trend test which assumes the natural factors are not influenced by increasing GHGs). This result promotes the conclusion that this approach achieves greater scientific (and policy) utility than results from elaborate climate models which on average fail to reproduce the real world’s global average bulk temperature trend since 1979.

The over-warming of the atmosphere by the IPCC models relates to a problem the IPCC AR5 encountered elsewhere. In trying to determine the climate sensitivity, which is how sensitive the global temperature is relative to increases in GHGs, the IPCC authors chose not to give a best estimate. [A high climate sensitivity is a foundational component of the last Administration’s Social Cost of Carbon.] The reason? … climate models were showing about twice the sensitivity to GHGs than calculations based on real, empirical data. I would encourage this committee, and our government in general, to consider empirical data, not climate model output, when dealing with environmental regulations.

Summary

Planning requires assumptions, because no one has knowledge of the future, only informed opinions. Christy makes the case that our assumptions should be based on empirical data rather than on models driven by theoretical assumptions. When the CO2 sensitivity assumption is removed from climate models, they come much closer to observed temperature measurements. Statistical analysis shows that at least 75% of observed warming comes from factors other than CO2. That analysis also ties the warming to the accumulated effects of oceanic circulations, principally the ENSO index.

Putting Climate Models in Their Place

A previous post Chameleon Climate Models described the general issue of whether a model belongs on the bookshelf (theoretically useful) or whether it passes real world filters of relevance, thus qualifying as useful for policy considerations.

Following an interesting discussion on her blog, Dr. Judith Curry has written an important essay on the usefulness and limitations of climate models.

The paper was developed to respond to a request from a group of lawyers wondering how to regard claims based upon climate model outputs. The document is entitled Climate Models (here) and is a great informative read for anyone. Some excerpts that struck me:

Climate model development has followed a pathway mostly driven by scientific curiosity and computational limitations. GCMs were originally designed as a tool to help understand how the climate system works. GCMs are used by researchers to represent aspects of climate that are extremely difficult to observe, experiment with theories in a new way by enabling hitherto infeasible calculations, understand a complex system of equations that would otherwise be impenetrable, and explore the climate system to identify unexpected outcomes. As such, GCMs are an important element of climate research.

Climate models are useful tools for conducting scientific research to understand the climate system. However, the above points support the conclusion that current GCM climate models are not fit for the purpose of attributing the causes of 20th century warming or for predicting global or regional climate change on timescales of decades to centuries, with any high level of confidence. By extension, GCMs are not fit for the purpose of justifying political policies to fundamentally alter world social, economic and energy systems. It is this application of climate model results that fuels the vociferousness of the debate surrounding climate models.

Evolution of state-of-the-art Climate Models from the mid 70s to the mid 00s. From IPCC (2007)


The actual equations used in the GCM computer codes are only approximations of the physical processes that occur in the climate system. While some of these approximations are highly accurate, others are unavoidably crude. This is because the real processes they represent are either poorly understood or too complex to include in the model given the constraints of the computer system. Of the processes that are most important for climate change, parameterizations related to clouds and precipitation remain the most challenging, and are the greatest source of disagreement among different GCMs.

There are literally thousands of different choices made in the construction of a climate model (e.g. resolution, complexity of the submodels, parameterizations). Each different set of choices produces a different model having different sensitivities. Further, different modeling groups have different focal interests, e.g. long paleoclimate simulations, details of ocean circulations, nuances of the interactions between aerosol particles and clouds, the carbon cycle. These different interests focus their limited computational resources on a particular aspect of simulating the climate system, at the expense of others.


Overview of the structure of a state-of-the-art climate model. See Climate Models Explained by R.G. Brown

Human-caused warming depends not only on how much CO2 is added to the atmosphere, but also on how ‘sensitive’ the climate is to the increased CO2. Climate sensitivity is defined as the global surface warming that occurs when the concentration of carbon dioxide in the atmosphere doubles. If climate sensitivity is high, then we can expect substantial warming in the coming century as emissions continue to increase. If climate sensitivity is low, then future warming will be substantially lower.
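That definition can be turned into arithmetic with the standard simplified forcing formula F = 5.35 ln(C/C0) W/m^2 (the Myhre et al. approximation, an external assumption here, not something taken from Curry's essay):

```python
import math

def co2_forcing(c_ppm, c0_ppm=280.0):
    """Radiative forcing (W/m^2) from CO2 at c_ppm relative to a preindustrial baseline."""
    return 5.35 * math.log(c_ppm / c0_ppm)

f2x = co2_forcing(560.0)          # forcing from doubled CO2, ~3.7 W/m^2
for ecs in (1.5, 3.0, 4.5):       # low, central, high sensitivity (deg C per doubling)
    lam = ecs / f2x               # implied equilibrium response per unit forcing
    warming = lam * co2_forcing(420.0)
    print(f"ECS {ecs} C -> {warming:.2f} C equilibrium warming at 420 ppm CO2")
```

The spread of the printed values, roughly a factor of three for the same CO2 level, is exactly why the low-versus-high sensitivity question dominates projections of future warming.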

In GCMs, the equilibrium climate sensitivity is an ‘emergent property’ that is not directly calibrated or tuned. While there has been some narrowing of the range of modeled climate sensitivities over time, models still can be made to yield a wide range of sensitivities by altering model parameterizations. Model versions can be rejected or not, subject to the modelers’ own preconceptions, expectations and biases about the outcome of the equilibrium climate sensitivity calculation.

Further, the discrepancy between observational and climate model-based estimates of climate sensitivity is substantial and of significant importance to policymakers. Equilibrium climate sensitivity, and the level of uncertainty in its value, is a key input into the economic models that drive cost-benefit analyses and estimates of the social cost of carbon.

Variations in climate can be caused by external forcing, such as solar variations, volcanic eruptions or changes in atmospheric composition such as an increase in CO2. Climate can also change owing to internal processes within the climate system (internal variability). The best-known example of internal climate variability is El Nino/La Nina. Modes of decadal to centennial to millennial internal variability arise from the slow circulations in the oceans. As such, the ocean serves as a ‘flywheel’ on the climate system, storing and releasing heat on long timescales and acting to stabilize the climate. As a result of the time lags and storage of heat in the ocean, the climate system is never in equilibrium.
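The flywheel behavior can be illustrated with a one-box energy balance model, dT/dt = (F - lambda*T)/C: ocean heat capacity C delays the approach to the equilibrium F/lambda. The parameter values below are illustrative assumptions, not tuned to any dataset:

```python
def one_box(forcing_wm2, years, heat_capacity=8.0, feedback=1.2, dt=0.1):
    """Euler-step dT/dt = (F - feedback*T)/C.
    heat_capacity in W*yr/m^2/K, feedback in W/m^2/K; returns T (K) at each year-end."""
    t, temps = 0.0, []
    steps_per_year = round(1 / dt)
    for _ in range(years):
        for _ in range(steps_per_year):
            t += dt * (forcing_wm2 - feedback * t) / heat_capacity
        temps.append(t)
    return temps

temps = one_box(3.7, 50)   # step forcing comparable to doubled CO2
# Warming lags the forcing: only about half the equilibrium response (3.7/1.2 K)
# is realized after 5 years, and full equilibrium takes decades.
print(temps[4], temps[-1])
```

The time constant here is C/lambda, about 6.7 years with these toy numbers; real ocean heat uptake involves multiple, much longer timescales, which is the essay's point about the system never reaching equilibrium.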

The combination of uncertainty in the transient climate response (sensitivity) and the uncertainties in the magnitude and phasing of the major modes in natural internal variability preclude an unambiguous separation of externally forced climate variations from natural internal climate variability. If the climate sensitivity is on the low end of the range of estimates, and natural internal variability is on the strong side of the distribution of climate models, different conclusions are drawn about the relative importance of human causes to the 20th century warming.

Figure 5.1. Comparative dynamics of the World Fuel Consumption (WFC) and Global Surface Air Temperature Anomaly (ΔT), 1861-2000. The thin dashed line represents annual ΔT, the bold line—its 13-year smoothing, and the line constructed from rectangles—WFC (in millions of tons of nominal fuel) (Klyashtorin and Lyubushin, 2003). Source: Frolov et al. 2009

Anthropogenic (human-caused) climate change is a theory in which the basic mechanism is well understood, but whose potential magnitude is highly uncertain. What does the preceding analysis imply for IPCC’s ‘extremely likely’ attribution of anthropogenically caused warming since 1950? Climate models infer that all of the warming since 1950 can be attributed to humans. However, there have been large magnitude variations in global/hemispheric climate on timescales of 30 years, which are the same duration as the late 20th century warming. The IPCC does not have convincing explanations for previous 30 year periods in the 20th century, notably the warming 1910-1945 and the grand hiatus 1945-1975. Further, there is a secular warming trend at least since 1800 (and possibly as long as 400 years) that cannot be explained by CO2, and is only partly explained by volcanic eruptions.

Summary

There is growing evidence that climate models are running too hot and that climate sensitivity to CO2 is on the lower end of the range provided by the IPCC. Nevertheless, these lower values of climate sensitivity are not accounted for in IPCC climate model projections of temperature at the end of the 21st century or in estimates of the impact on temperatures of reducing CO2 emissions.

The climate modeling community has been focused on the response of the climate to increased human caused emissions, and the policy community accepts (either explicitly or implicitly) the results of the 21st century GCM simulations as actual predictions. Hence we don’t have a good understanding of the relative climate impacts of the above (natural factors) or their potential impacts on the evolution of the 21st century climate.

Footnote:

There is a series of posts here which apply reality filters to assess climate models. The first was Temperatures According to Climate Models, where both hindcasting and forecasting were seen to be flawed.

Others in the Series are:

Sea Level Rise: Just the Facts

Data vs. Models #1: Arctic Warming

Data vs. Models #2: Droughts and Floods

Data vs. Models #3: Disasters

Data vs. Models #4: Climates Changing

Climate Medicine

Climates Don’t Start Wars, People Do


Beware getting sucked into any model, climate or otherwise.

Climate Whack-A-Mole

The Joys of Playing Climate Whack-A-Mole

Dealing with alarmist claims is like playing whack-a-mole. Every time you beat down one bogeyman, another pops up in another field, and later the first one returns, needing to be confronted again. I have been playing Climate Whack-A-Mole for a while, and the posts listed in the Footnote above supply some hammers.

The alarmist methodology is repetitive; only the subject changes. First, create a computer model purporting to be a physical or statistical representation of the real world. Then play with the parameters until the fears are supported by the model outputs. Disregard or discount divergences from empirical observations. This pattern is described in more detail at Chameleon Climate Models.

Alarmist Heads in the Clouds

A new study from Scripps at UC San Diego claims proof of greenhouse gas warming by means of changes to the clouds. The paper is behind a paywall, so the reasoning is not accessible, but the media releases will ensure wide repetition.

From the news release July 11, 2016 (here)
Clouds Are Moving Higher, Subtropical Dry Zones Expanding, According to Satellite Analysis
Scripps-led study confirms computerized climate simulations projecting effects of global warming

Inconsistent satellite imaging of clouds over the decades has been a hindrance to improving scientists’ understanding. Records of cloudiness from satellites originally designed to monitor weather are prone to spurious trends related to changes in satellite orbit, instrument calibration, degradation of sensors over time, and other factors.

When the researchers removed such artifacts from the record, the data exhibited large-scale patterns of cloud change between the 1980s and 2000s that are consistent with climate model predictions for that time period, including poleward retreat of mid-latitude storm tracks, expansion of subtropical dry zones, and increasing height of the highest cloud tops. These cloud changes enhance absorption of solar radiation by the earth and reduce emission of thermal radiation to space. This exacerbates global warming caused by increasing greenhouse gas concentrations.

The researchers drew from several independent corrected satellite records in their analysis. They concluded that the behavior of clouds they observed is consistent with a human-caused increase in greenhouse gas concentrations and a planet-wide recovery from two major volcanic eruptions, the 1982 El Chichón eruption in Mexico and the 1991 eruption of Mt. Pinatubo in the Philippines. Aerosols ejected from those eruptions had a net cooling effect on the planet for several years after they took place.

Barring another volcanic event of this sort, the scientists expect the cloud trends to continue in the future as the planet continues to warm due to increasing greenhouse gas concentrations. (My bolds)

Another Example of Lop-sided Myopia and Confirmation Bias

The report above violates basic physics, resulting in a gross distortion. Two points are critical. When it comes to clouds, the greenhouse gas that matters is H2O, not CO2. Any IR effects are 96% due to the presence of water vapor and the droplets in the clouds.

And even more importantly, as Dr. Salby illustrated (here), the net effect from clouds is cooling, not warming.

Net cloud forcing is then −15 W m−2. It represents radiative cooling of the Earth-atmosphere system. This is four times as great as the additional warming of the Earth’s surface that would be introduced by a doubling of CO2. Latent heat transfer to the atmosphere (Fig. 1.32) is 90 W m−2. It is an order of magnitude greater. Consequently, the direct radiative effect of increased CO2 would be overshadowed by even a small adjustment of convection (Sec. 8.7).
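The arithmetic in the quoted passage can be checked directly. Note one assumption not stated in the quote: the radiative forcing for doubled CO2 is taken here as the commonly cited ~3.7 W/m².

```python
# Quick check of the quoted cloud-forcing figures.
# Assumption (not from the quote): forcing for doubled CO2 is ~3.7 W/m^2,
# the commonly cited value.

net_cloud_forcing = -15.0   # W/m^2, net cooling from clouds (quoted)
co2_doubling_forcing = 3.7  # W/m^2, assumed canonical value
latent_heat_flux = 90.0     # W/m^2, latent heat transfer (quoted)

ratio_clouds_to_co2 = abs(net_cloud_forcing) / co2_doubling_forcing
ratio_latent_to_co2 = latent_heat_flux / co2_doubling_forcing

print(round(ratio_clouds_to_co2, 1))  # 4.1, matching "four times as great"
print(round(ratio_latent_to_co2))     # 24, "an order of magnitude greater"
```

The ratios come out near 4 and near 24, consistent with the "four times as great" and "order of magnitude greater" claims in the excerpt.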

Convective clouds forming over the Amazon in a blanket smoke. Credit: Prof. Ilan Koren

This is confirmed by other researchers: I. Koren, G. Dagan, and O. Altaratz, From aerosol-limited to invigoration of warm convective clouds, Science, 2014, 344(6188), here.

They then looked at another source of data: that of the Clouds’ and Earth’s Radiant Energy System (CERES) satellite instruments which measure fluxes of reflected and emitted radiation from Earth to space, to help scientists understand how the climate varies over time. When analyzed together with the aerosol loading over the same area at the same time, the outcome, says Koren, was a “textbook demonstration of the invigoration effect” of added aerosols on clouds. In other words, the radiation data fit the unique signature of clouds that were growing higher and larger. Such clouds show a strong increase in cooling due to the reflected short waves, but that effect is partly cancelled out by the enhanced, trapped, long-wave radiation coming from underneath. (My bold)

More info on clouds is here: Climate Partly Cloudy

Summary

Once again, atmospheric physics is willfully distorted in order to get a headline and burnish credentials in support of man-made climate change. They promote a myopic and lop-sided picture to frighten a public mostly ill-equipped to see through their mumbo-jumbo.

Chameleon Climate Models

Paul Pfleiderer has done a public service in calling attention to
The Misuse of Theoretical Models in Finance and Economics (here)
h/t to William Briggs for noticing and linking

He coins the term “Chameleon” for the abuse of models, and explains in the abstract of his article:

In this essay I discuss how theoretical models in finance and economics are used in ways that make them “chameleons” and how chameleons devalue the intellectual currency and muddy policy debates. A model becomes a chameleon when it is built on assumptions with dubious connections to the real world but nevertheless has conclusions that are uncritically (or not critically enough) applied to understanding our economy. I discuss how chameleons are created and nurtured by the mistaken notion that one should not judge a model by its assumptions, by the unfounded argument that models should have equal standing until definitive empirical tests are conducted, and by misplaced appeals to “as-if” arguments, mathematical elegance, subtlety, references to assumptions that are “standard in the literature,” and the need for tractability.

Chameleon Climate Models

Pfleiderer is writing about his specialty, financial models, and more particularly banking systems, and gives several examples of how serious the problem is. As we shall see below, climate models are an order of magnitude more complicated, and are abused in the same way, only more flagrantly.

As the analogy suggests, a chameleon model changes color when it is moved to a different context. When politicians and activists refer to climate models, they assert the model outputs as “Predictions”. The media is rife with examples, but here is one from Climate Concern UK:

Some predicted Future Effects of Climate Change

  • Increased average temperatures: the IPCC (Intergovernmental Panel on Climate Change) predicts a global rise of between 1.1ºC and 6.4ºC by 2100 depending on some scientific uncertainties and the extent to which the world decreases or increases greenhouse gas emissions.
  • 50% less rainfall in the tropics. Severe water shortages within 25 years – potentially affecting 5 billion people. Widespread crop failures.
  • 50% more river volume by 2100 in northern countries.
  • Desertification and burning down of vast areas of agricultural land and forests.
  • Continuing spread of malaria and other diseases, including from a much increased insect population in UK. Respiratory illnesses due to poor air quality with higher temperatures.
  • Extinction of large numbers of animal and plant species.
  • Sea level rise: due to both warmer water (greater volume) and melting ice. The IPCC predicts between 28cm and 43cm by 2100, with consequent high storm wave heights, threatening to displace up to 200 million people. At worst, if emissions this century were to set in place future melting of both the Greenland and West Antarctic ice caps, sea level would eventually rise approx 12m.

Now that alarming list of predictions is a claim to forecast what will be the future of the actual world as we know it.

Now for the switcheroo. When climate models are referenced by scientists or agencies likely to be held legally accountable for making claims, the model output is transformed into “Projections.” The difference is more than semantics:
Prediction: What will actually happen in the future.
Projection: What will possibly happen in the future.

In other words, the climate model has gone from the bookshelf world (possibilities) to the world of actualities and of policy decision-making.  The step of applying reality filters to the climate models (verification) is skipped in order to score political and public relations points.

The ultimate proof of this is the existence of legal disclaimers exempting the modellers from accountability. One example is from ClimateData.US

Disclaimer NASA NEX-DCP30 Terms of Use

The maps are based on NASA’s NEX-DCP30 dataset that are provided to assist the science community in conducting studies of climate change impacts at local to regional scales, and to enhance public understanding of possible future climate patterns and climate impacts at the scale of individual neighborhoods and communities. The maps presented here are visual representations only and are not to be used for decision-making. The NEX-DCP30 dataset upon which these maps are derived is intended for use in scientific research only, and use of this dataset or visualizations for other purposes, such as commercial applications, and engineering or design studies is not recommended without consultation with a qualified expert. (my bold)

Conclusion:

Whereas some theoretical models can be immensely useful in developing intuitions, in essence a theoretical model is nothing more than an argument that a set of conclusions follows from a given set of assumptions. Being logically correct may earn a place for a theoretical model on the bookshelf, but when a theoretical model is taken off the shelf and applied to the real world, it is important to question whether the model’s assumptions are in accord with what we know about the world. Is the story behind the model one that captures what is important or is it a fiction that has little connection to what we see in practice? Have important factors been omitted? Are economic agents assumed to be doing things that we have serious doubts they are able to do? These questions and others like them allow us to filter out models that are ill suited to give us genuine insights. To be taken seriously models should pass through the real world filter.

Chameleons are models that are offered up as saying something significant about the real world even though they do not pass through the filter. When the assumptions of a chameleon are challenged, various defenses are made (e.g., one shouldn’t judge a model by its assumptions, any model has equal standing with all other models until the proper empirical tests have been run, etc.). In many cases the chameleon will change colors as necessary, taking on the colors of a bookshelf model when challenged, but reverting back to the colors of a model that claims to apply to the real world when not challenged.

A model becomes a chameleon when it is built on assumptions with dubious connections to the real world but nevertheless has conclusions that are uncritically (or not critically enough) applied to understanding our economy. Chameleons are not just mischievous they can be harmful − especially when used to inform policy and other decision making − and they devalue the intellectual currency.

Thank you Dr. Pfleiderer for showing us how the sleight-of-hand occurs in economic considerations. The same abuse prevails in the world of climate science.

Paul Pfleiderer, Stanford University Faculty
C.O.G. Miller Distinguished Professor of Finance
Senior Associate Dean for Academic Affairs
Professor of Law (by courtesy), School of Law

Data vs. Models #4: Climates Changing

Köppen climate zones as they appear in the 21st Century.

Every day there are reports like this:

An annual breach of 2 degrees could happen as soon as 2030, according to climate model simulations, although there’s always the chance that climate models are slightly underestimating or overestimating how close we are to that date. Writing with fellow meteorologist Jeff Masters for Weather Underground, Bob Henson said the current spike means “we are now hurtling at a frightening pace toward the globally agreed maximum of 2.0°C warming over pre-industrial levels.”

That abstract, mathematically averaged world, the subject of so much media space and alarm, has almost nothing to do with the world where any of us live. Because nothing on our planet moves in unison.

Start with the hemispheres:

Notice that the global temperature tracks the seasons of the NH. The reason is simple: the NH has twice as much land as the Southern Hemisphere (SH), and oceans, with their greater heat capacity, do not change temperature as much as land does. So the nearly 4 °C annual swing in the temperature of the Earth follows the seasons of the NH. This is especially interesting because the Earth actually receives the most solar energy in January: perihelion, the point in the orbit where the Earth is closest to the sun, currently falls in January.

Using round numbers, the Northern Hemisphere (NH) half of the total surface combines 20% land with 30% ocean, while the SH comprises 9% land with 41% ocean. With the oceans having huge heat capacities relative to the land, the NH has much more volatility in temperatures than does the SH. But more importantly, the trends in multi-decadal warming and cooling also differ.
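The land/ocean asymmetry can be illustrated with a toy calculation using the surface fractions above. The per-surface seasonal amplitudes below are invented illustrative numbers, not measurements; the point is only that area-weighting makes the land-heavy NH swing harder.

```python
# Toy sketch: why the global-mean temperature follows NH seasons.
# Surface fractions are the round numbers from the text; the seasonal
# swing per surface type is an illustrative assumption (land responding
# about 5x more strongly than ocean), not a measured value.

land_swing, ocean_swing = 10.0, 2.0  # deg C, assumed illustrative amplitudes

# Fractions of the total Earth surface (from the text, round numbers)
nh_land, nh_ocean = 0.20, 0.30
sh_land, sh_ocean = 0.09, 0.41

# Area-weighted seasonal amplitude of each hemisphere's mean temperature
nh_amplitude = (nh_land * land_swing + nh_ocean * ocean_swing) / (nh_land + nh_ocean)
sh_amplitude = (sh_land * land_swing + sh_ocean * ocean_swing) / (sh_land + sh_ocean)

print(nh_amplitude, sh_amplitude)  # NH swing clearly larger than SH
```

With these assumed amplitudes the NH comes out around 5.2 °C versus about 3.4 °C for the SH, so the hemisphere with more land dominates the annual cycle of the global mean.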

Climates Are Found Down in the Weeds

The top-down global view needs to be supplemented with a bottom-up appreciation of the diversity of climates and their changes.

The ancient Greeks were the first to classify climate zones. From their travels and sea-faring experiences, they called the equatorial regions Torrid, due to the heat and humidity. The mid-latitudes were considered Temperate, including their home Mediterranean Sea. Further North and South, they knew places were Frigid.

Based on empirical observations, Köppen (1900) established a climate classification system which uses monthly temperature and precipitation to define boundaries of different climate types around the world. Since its inception, this system has been further developed (e.g. Köppen and Geiger, 1930; Stern et al., 2000) and widely used by geographers and climatologists around the world.

Köppen and Climate Change

The focus is on differentiating vegetation regimes, which result primarily from variations in temperature and precipitation over the seasons of the year. Now we have an interesting study that considers shifts in Köppen climate zones over time in order to identify changes in climate as practical and local/regional realities.

The paper is: Using the Köppen classification to quantify climate variation and change: An example for 1901–2010
By Deliang Chen and Hans Weiteng Chen
Department of Earth Sciences, University of Gothenburg, Sweden

Hans Chen has built an excellent interactive website (here): The purpose of this website is to share information about the Köppen climate classification, and provide data and high-resolution figures from the paper Chen and Chen, 2013: Using the Köppen classification to quantify climate variation and change: An example for 1901–2010 (pdf)

The Köppen climate classification consists of five major groups and a number of sub-types under each major group, as listed in Table 1. While all the major groups except B are determined by temperature only, all the sub-types except the two sub-types under E are decided based on the combined criteria relating to seasonal temperature and precipitation. Therefore, the classification scheme as a whole represents different climate regimes of various temperature and precipitation combinations.

Main characteristics of the Köppen climate major groups and sub-types:

Major group          Sub-types
A: Tropical          Tropical rain forest: Af; Tropical monsoon: Am; Tropical wet and dry savanna: Aw, As
B: Dry               Desert (arid): BWh, BWk; Steppe (semi-arid): BSh, BSk
C: Mild temperate    Mediterranean: Csa, Csb, Csc; Humid subtropical: Cfa, Cwa; Oceanic: Cfb, Cfc, Cwb, Cwc
D: Snow              Humid: Dfa, Dwa, Dfb, Dwb, Dsa, Dsb; Subarctic: Dfc, Dwc, Dfd, Dwd, Dsc, Dsd
E: Polar             Tundra: ET; Ice cap: EF

Temporal Changes in Climate Zones

This study used a global gridded dataset with monthly mean temperature and precipitation, covering 1901–2010, which was produced and documented by Kenji Matsuura and Cort J. Willmott from Department of Geography, University of Delaware. Station data were compiled from different sources, including Global Historical Climatology Network version 2 (GHCN2) and the Global Surface Summary of Day (GSOD).The data and associated documentations can be found at http://climate.geog.udel.edu/climate/html_pages/Global2011/

In the maps below, the Köppen classification was applied on temperature and precipitation averaged over shorter time scales, from interannual to decadal and 30 year. The 30 year averages were calculated with an overlap of 20 years between each sub-period, while the interannual and decadal averages did not have overlapping years. Black regions indicate areas where the major Köppen type has changed at least once during 1901–2010 for a given time scale. Thus, the black regions are likely to be sensitive to climate variations, while the colored regions identify spatially stable regions.
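The 30-year windowing with a 20-year overlap, and the flagging of grid cells whose major type changed at least once, can be sketched as follows. The per-window classification step is represented abstractly by type labels; this is an illustration of the scheme described above, not the authors' code.

```python
# Sketch of the windowing scheme described above: 30-year windows over
# 1901-2010 whose start advances by 10 years (so consecutive windows
# overlap by 20 years), plus a flag for grid cells whose major climate
# type changed at least once across windows.

def thirty_year_windows(first=1901, last=2010, length=30, step=10):
    """Yield (start, end) years; consecutive windows overlap by length - step."""
    start = first
    while start + length - 1 <= last:
        yield (start, start + length - 1)
        start += step

def type_changed(window_types):
    """True if the major type changed at least once across the windows."""
    return len(set(window_types)) > 1

windows = list(thirty_year_windows())
print(windows[0], windows[-1])             # (1901, 1930) ... (1981, 2010)
print(type_changed(["C"] * 9))             # stable cell -> False
print(type_changed(["C", "C", "B", "B"]))  # shifted toward arid -> True
```

Cells where `type_changed` is True correspond to the black regions in the maps; the colored regions are the spatially stable ones.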

Figure 2 from Chen and Chen (2013): maps of regions (in black) where the major Köppen type changed at least once during 1901–2010, shown for interannual, decadal, and 30-year time scales.

Major group   Interannual (%)   Interdecadal (%)   30-year (%)
A                  45.5              89.0              94.2
B                  45.1              85.2              91.8
C                  35.3              77.4              87.3
D                  30.0              83.3              91.0
E                  78.2              92.8              96.2

The table and images show that most places have had at least one entire year with temperatures and/or precipitation atypical for that climate. It is much more unusual for abnormal weather to persist for ten years running. At 30 years and more, the zones are quite stable, such that there is little movement at the boundaries with neighboring zones.

Over time, there is variety in zonal changes, albeit within a small range of overall variation:

Chen and Chen Conclusions

By using a global gridded temperature and precipitation data over the period of 1901–2010, we reached the following conclusions:

  • Over the whole period (1901–2010), the mean climate distributions have a comparable pattern and portion with previous estimates. The five major groups A, B, C, D, E take up 19.4%, 28.4%, 14.6%, 22.1%, and 15.5% of the total land area on Earth respectively. Since the relative changes of the areas covered by the five major groups are all small on the 30 year time scale, the agreement indicates that the climate dataset used overall is of comparable quality with those used in other studies.
  • On the interannual, interdecadal, and 30 year time scales, the climate type for a given grid may shift from one type to another and the spatial stability decreases towards shorter time scales. While the spatially stable climate regions identified are useful for conservation and other purposes, the instable regions mark the transition zones which deserve special attention since they may have implications for ecosystems and dynamics of the climate system.
  • On the 30 year time scale, the dominating changes in the climate types over the whole period are that the arid regions occupied by group B (mainly type BWh) have expanded and the regions dominated by arctic climate (EF) have shrunk along with the global warming and regional precipitation changes.

Summary: The Myth of “Global” Climate Change

Climate is a term describing a local or regional pattern of weather. There is a widely accepted system of classifying climates, based largely on distinctive seasonal variations in temperature and precipitation. Depending on how precisely you apply the criteria, there can be from 6 to 13 distinct zones in South Africa alone, or 8 to 11 zones in Hawaii.

Each climate over time experiences shifts toward warming or cooling, and wetter or drier periods. One example: Fully a third of US stations showed cooling since 1950 while the others warmed.  It is nonsense to average all of that and call it “Global Warming” because the net is slightly positive.  Only in the fevered imaginations of CO2 activists do all of these diverse places move together in a single march toward global warming.
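As a toy numeric illustration of that averaging (all numbers invented for illustration): a network where a third of stations cool slightly while the rest warm still yields a small positive average, even though individual stations move in opposite directions.

```python
# Toy illustration (invented numbers): a third of stations cooling
# slightly while two thirds warm yields a small positive network
# average, masking the divergent local behavior.

cooling = [-0.2] * 100   # deg C per decade, 1/3 of stations (invented)
warming = [0.15] * 200   # deg C per decade, 2/3 of stations (invented)
trends = cooling + warming

network_mean = sum(trends) / len(trends)
print(round(network_mean, 4))  # small positive net despite mixed signs
```

The network mean comes out around +0.03 even though a hundred stations cooled, which is the sense in which a "global" average can hide regional diversity.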

Data vs. Models #3: Disasters

Addendum at end on Wildfires

Looking Through Alarmist Glasses

In the aftermath of COP21 in Paris, the Irish Times said this:

Scientists who closely monitored the talks in Paris said it was not the agreement that humanity really needed. By itself, it will not save the planet. The great ice sheets remain imperiled, the oceans are still rising, forests and reefs are under stress, people are dying by tens of thousands in heatwaves and floods, and the agriculture system that feeds 7 billion human beings is still at risk.

That list of calamities looks familiar from insurance policies where they would be defined as “Acts of God.” Before we caught CO2 fever, everyone accepted that natural disasters happened, unpredictably and beyond human control. Now of course, we have computer models to project scenarios where all such suffering will increase and it will be our fault.

For example, from an alarmist US.gov website we are told:

Human-induced climate change has already increased the number and strength of some of these extreme events. Over the last 50 years, much of the U.S. has seen increases in prolonged periods of excessively high temperatures, heavy downpours, and in some regions, severe floods and droughts.

By late this century, models, on average, project an increase in the number of the strongest (Category 4 and 5) hurricanes. Models also project greater rainfall rates in hurricanes in a warmer climate, with increases of about 20% averaged near the center of hurricanes.

Looking Without Alarmist Glasses

But looking at the data without a warmist bias leads to a different conclusion.

The trends in normalized disaster impacts show large differences between regions and weather event categories. Despite these variations, our overall conclusion is that the increasing exposure of people and economic assets is the major cause of increasing trends in disaster impacts. This holds for long-term trends in economic losses as well as the number of people affected.

From this recent study:  On the relation between weather-related disaster impacts, vulnerability and climate change, by Hans Visser, Arthur C. Petersen, Willem Ligtvoet 2014 (open source access here)

Data and Analysis

All the analyses in this article are based on the EM-DAT emergency database. This database is open source and maintained by the World Health Organization (WHO) and the Centre for Research on the Epidemiology of Disasters (CRED) at the University of Louvain, Belgium (Guha-Sapir et al. 2012).

The EM-DAT database contains disaster events from 1900 onwards, presented on a country basis. . .We aggregated country information on disasters to three economic regions: OECD countries, BRIICS countries (Brazil, Russia, India, Indonesia, China and South Africa) and the remaining countries, denoted hereafter as Rest of World (RoW) countries. OECD countries can be seen as the developed countries, BRIICS countries as upcoming economies and RoW as the developing countries.

The EM-DAT database provides three disaster impact indicators for each disaster event: economic losses, the number of people affected and the number of people killed. . .The data show large differences across disaster indicators and regions: economic losses are largest in the OECD countries, the number of people affected is largest in the BRIICS countries and the number of people killed is largest in the RoW countries.

Fig. 3 Economic losses normalized for wealth (upper panel) and the number of people affected normalized for population size (lower panel). Sample period is 1980–2010. Solid lines are IRW trends for the corresponding data.
The general idea behind normalization is that if we want to detect a climate signal in disaster losses, the role of changes in wealth and population should be ruled out; however, this is complicated by the fact that changes in vulnerability may also play a role. . .(After extensive research), we conclude that quantitative information on time-varying vulnerability patterns is lacking. More qualitatively, we judge that a stable vulnerability V_t, as derived in this study, is not in contrast with estimates in the literature.
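The normalization idea in the excerpt can be sketched as follows. This is a minimal sketch of the general approach (re-expressing each year at a reference year's exposure level), not the exact procedure of Visser et al. (2014), and the numbers are invented.

```python
# Minimal sketch of disaster-loss "normalization": scale each year's
# economic losses by wealth (and, analogously, affected counts by
# population) relative to a reference year, so that growth in exposure
# is removed before looking for a climate signal. General idea only,
# not the exact method of Visser et al. (2014); numbers invented.

def normalize(series, exposure, ref_year):
    """Return the series re-expressed at the reference year's exposure level."""
    ref = exposure[ref_year]
    return {yr: series[yr] * ref / exposure[yr] for yr in series}

losses = {1980: 10.0, 2010: 40.0}    # billions (invented)
gdp = {1980: 1000.0, 2010: 4000.0}   # billions (invented)

print(normalize(losses, gdp, 1980))
# raw losses quadrupled, but so did wealth: normalized losses are flat
```

This is why flat normalized trends are consistent with rising raw losses: the increase is carried by exposure, not by the weather.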

Climate drivers

Historic trend estimates for weather and climate variables and phenomena are presented in IPCC-SREX (2012, see their table 3-1). The categories ‘winds’, ‘tropical cyclones’ and ‘extratropical cyclones’ coincide with the ‘meteorological events’ category in the CRED database. In the same way, the ‘floods’ category coincides with the CRED ‘hydrological events’ category. The IPCC trend estimates hold for large spatial scales (trends for smaller regions or individual countries could be quite different).

The IPCC table shows that little evidence is found for historic trends in meteorological and hydrological events. Furthermore, Table 1 shows that these two events are the main drivers for (1) economic losses (all regions), (2) the number of people affected (all regions) and (3) the number of people killed (BRIICS countries only). Thus, trends in normalized data and climate drivers are consistent across these impact indicators and regions.

Summary

People who are proclaiming that disasters rise with fossil fuel emissions are flying in the face of the facts, and in denial of IPCC scientists.

Trends in normalized data show constant, stabilized patterns in most cases, a result consistent with findings reported in Bouwer (2011a) and references therein, Neumayer and Barthel (2011) and IPCC-SREX (2012).

The absence of trends in normalized disaster burden indicators appears to be largely consistent with the absence of trends in extreme weather events.

For more on attributing x-weather to climate change see: X-Weathermen Are Back

Addendum on Wildfires

Within all the coverage of the Fort McMurray Alberta wildfire, there have also been lazy journalists linking the event to fossil fuel-driven global warming, with a special delight of this being located near the oil sands.  The best call to reason has come from A Chemist in Langley, who argues for defensible science against mindless activism.  Of course, he has taken some heat for being so rational.

Here is what he said about the data and the models regarding boreal forest wildfires:

Well the climate models indicate that in the long-term (by the 2091-2100 fire regimes) climate change, if it continues unabated, should result in increased number and severity of fires in the boreal forest. However, what the data says is that right now this signal is not yet evident. While some increases may be occurring in the sub-arctic boreal forests of northern Alaska, similar effects are not yet evident in the southern boreal forests around Fort McMurray.

My final word is for the activists who are seeking to take advantage of Albertans’ misfortunes to advance their political agendas. Not only have you shown yourselves to be callous and insensitive at a time where you could have been civilized and sensitive but you cannot even comfort yourself by hiding under the cloak of truth since, as I have shown above, the data does not support your case.

Data vs. Models #2: Droughts and Floods

This post compares observations with models’ projections regarding variable precipitation across the globe.

There have been many media reports that global warming produces more droughts and more flooding. That is, the models claim that dry places will get drier and wet places will get wetter because of warmer weather. And of course, the models predict future warming because CO2 continues to rise, and the model programmers believe only warming, never cooling, can be the result.

Now we have a recent data-rich study of global precipitation patterns and the facts on the ground lead the authors to a different conclusion.

Stations experiencing low, moderate and heavy annual precipitation did not show very different precipitation trends. This indicates deserts or jungles are neither expanding nor shrinking due to changes in precipitation patterns. It is therefore reasonable to conclude that some caution is warranted about claiming that large changes to global precipitation have occurred during the last 150 years.

The paper (here) is:

Changes in Annual Precipitation over the Earth’s Land Mass excluding Antarctica from the 18th century to 2013, W. A. van Wijngaarden, Journal of Hydrology (2015)

Study Scope

Fig. 1. Locations of stations examined in this study. Red dots show the 776 stations having 100–149 years of data, green dots the 184 stations having 150–199 years of data and blue dots the 24 stations having more than 200 years of data.

This study examined the percentage change of nearly 1000 stations each having monthly totals of daily precipitation measurements for over a century. The data extended from 1700 to 2013, although most stations only had observations available beginning after 1850. The percentage change in precipitation relative to that occurring during 1961–90 was plotted for various countries as well as the continents excluding Antarctica. 

There are year-to-year as well as decadal fluctuations of precipitation that are undoubtedly influenced by effects such as the El Nino Southern Oscillation (ENSO) (Davey et al., 2014) and the North Atlantic Oscillation (NAO) (Lopez-Moreno et al., 2011). However, most trends over a prolonged period of a century or longer are consistent with little precipitation change. Similarly, data plotted for a number of countries and/or regions thereof that each have a substantial number of stations show few statistically significant trends.

Fig. 8. Effect of total precipitation on percentage precipitation change relative to 1961–90 for stations having total annual precipitation (a) less than 500 mm, (b) 500 to 1000 mm, (c) more than 1000 mm. The red curve is the moving 5 year average while the blue curve shows the number of stations. Considering only years having at least 10 stations reporting data, the trends in units of % per century are: (a) 1.4 ± 2.8 during 1854–2013, (b) 0.9 ± 1.1 during 1774–2013 and (c) 2.4 ± 1.2 during 1832–2013.

Fig. 8 compares the percentage precipitation change for dry stations (total precipitation <500 mm), stations experiencing moderate rainfall (between 500 and 1000 mm) and wet stations (total precipitation >1000 mm). There is no dramatic difference. Hence, one cannot conclude that dry areas are becoming drier nor wet areas wetter.

Summary

The percentage annual precipitation change relative to 1961–90 was plotted for 6 continents; as well as for stations at different latitudes and those experiencing low, moderate and high annual precipitation totals. The trends for precipitation change together with their 95% confidence intervals were found for various periods of time. Most trends exhibited no clear precipitation change. The global changes in precipitation over the Earth’s land mass excluding Antarctica relative to 1961–90 were estimated to be:

Period        % per Century
1850–1900     1.2 ± 1.7
1900–2000     2.6 ± 2.5
1950–2000     5.4 ± 8.1

A change of 1% per century corresponds to a precipitation change of 0.09 mm/year or 9 mm/century.
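A trend "in % per century" relative to a 1961–90 baseline, with a 95% confidence interval, can be sketched as below. This is plain-Python least squares with a normal-approximation interval; the paper's exact statistical treatment may differ.

```python
# Sketch of the trend computation reported above: annual precipitation
# is expressed as a percentage anomaly relative to the 1961-90 mean, a
# least-squares line is fitted, and the slope is reported in % per
# century with an approximate 95% confidence interval (normal
# approximation). The paper's exact treatment may differ.

def trend_percent_per_century(years, precip_mm):
    """years must include 1961-90 so the baseline can be computed."""
    base = [p for y, p in zip(years, precip_mm) if 1961 <= y <= 1990]
    baseline = sum(base) / len(base)
    anom = [100.0 * (p - baseline) / baseline for p in precip_mm]  # % anomaly

    n = len(years)
    ybar = sum(years) / n
    abar = sum(anom) / n
    sxx = sum((y - ybar) ** 2 for y in years)
    slope = sum((y - ybar) * (a - abar) for y, a in zip(years, anom)) / sxx

    # Residual-based standard error of the slope
    resid = [a - abar - slope * (y - ybar) for y, a in zip(years, anom)]
    se = (sum(r * r for r in resid) / (n - 2) / sxx) ** 0.5
    return 100.0 * slope, 100.0 * 1.96 * se  # % per century, 95% CI half-width

years = list(range(1901, 2011))
print(trend_percent_per_century(years, [1000.0] * len(years)))  # -> (0.0, 0.0)
```

A perfectly flat record gives a zero trend with a zero interval; a record rising 0.1 mm/year from a ~1000 mm base gives roughly 1% per century, the scale of the changes reported in the table above.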

As a background for how precipitation is distributed around the world, see the post: Here Comes the Rain Again. Along with temperatures, precipitation is the other main determinant of climates, properly understood as distinctive local and regional patterns of weather.  As the above study shows, climate change from precipitation change is vanishingly small.

Data vs. Models #1 was Arctic Warming.