Climate Change Is Not a Problem, Unless We Make It One

12-2-2020 | SGT Report | 2,707 words
 

by Martin Capages Jr., PhD, PE, Watts Up With That:



As long as humans have been on Earth, they have been adapting to changes in regional climates. A regional climate is the average of the weather over a relatively long period of time, usually 30+ years, at a particular location on the planet. The natural periodicity of prolonged regional weather variations has been documented in various ways by humans for eons. For a comparison of human civilization in the northern hemisphere to Greenland ice core temperatures for the last 18,000 years, see here. Some of the means of documenting changes in long-term weather patterns, i.e., climate change, include crude prehistoric cave drawings of animals and plants, paintings of frozen rivers (see Figure 1, of ice skating on the River Thames in 1684), and archaeological digs. There are also written records of climatic conditions from as early as 5,000 years ago, perhaps even earlier. Ice, subsea, peat, and lake bed cores are also used; for a more detailed discussion of the methods, see here and the links therein.




Figure 1. Ice skating on the River Thames in London in January 1684, during the Little Ice Age. Museum of London, link.


Most geologists agree that we are currently in an extended ice age. Technically, we are in an “icehouse” condition (see here). When ice caps exist at one or both poles year-round for an extended period of time, the Earth is said to be in an icehouse. Global temperature may decrease further if solar activity remains at its current low level (see here). But geologists deal in massive time increments of thousands, millions, even billions of years. The general public makes its observations in decades, perhaps a generation, and maybe even in a century, but not much more than that. Such a myopic view of the Earth’s climate can be misleading.


CLIMATE SCIENCE


Climate science is a combination of many scientific specialties, such as geology, geophysics, astrophysics, meteorology, and ecology, to name a few of the larger branches. Some of these scientists are working to develop computer models of the climate using atmospheric physics, chemistry, actual data, proxy data, empirical variables, and assumed constants. The models include statistical tools to present the results as projections of measurable parameters, one of which is the global mean temperature. These projections are presented in time increments that mean something to the public. Dr. Judith Curry has written a good overview of computer climate modeling that can be downloaded here.


To gain an understanding of the regional climates that preceded humankind, we have to get creative. That means using proxies to determine the average temperature, and perhaps the conditions for life, in earlier years. The two most cited proxies are ice cores and tree rings, but there are other, lesser-known proxies. In addition, we can make reasonable assumptions about the prehistoric past from observations of regional geology. For example, glacier movements are revealed by the scars and strange debris fields that are left with each glacial expansion and retreat. Great boulders are left in the middle of grassy plains as glaciers melt. Gravel placed by high-velocity meltwater rivers can reveal the dynamics involved and perhaps even provide a timetable for the events. These points are made to illustrate the importance of the geological perspective in understanding why the climate changes. It is, after all, the physical record.


Many scientists, across many disciplines, have made understanding these worldly, and sometimes otherworldly, events their career goal. Some of these scientists have developed hypotheses that they defend with great vigor, which is, of course, understandable. There is peer admiration, public recognition, and research funding available when one’s hypotheses prove to be correct. But there is a danger in pushing any hypothesis beyond its limits. And that may be the case with the proponents of the singular CO2-driven global warming hypothesis.


THE DISAGREEMENT


Instead of following the more traditional method of analyzing data acquired through research, noting some phenomenon, developing a hypothesis that might explain the phenomenon, and then publishing the research and conclusions for the scrutiny of peers in that particular field, the CO2 warming proponents appear to have started with a hypothesis. The hypothesis was that “humankind’s accelerated use of fossil fuels had led to an increase in average global temperature by adding more CO2 to the atmosphere and enhancing the greenhouse gas effect.” This is easily seen in the stated objective of the United Nations Framework Convention on Climate Change (UNFCCC):



“UNFCCC’s ultimate objective is to achieve the stabilization of greenhouse gas concentrations in the atmosphere at a level that would prevent dangerous interference with the climate system.” (link)



In other words, they assumed that stabilizing the atmospheric greenhouse gas concentration would prevent climate change; they did not prove this assertion first. The previous hypothesis had been that aerosols would cause a cooling of the average global temperature and lead to a new massive glacial advance or “Ice Age.” The media sometimes call a major glacial advance an “ice age,” but we are already in an ice age and have been for millions of years. Some say the new-ice-age predictions of the 1970s were in the minority and erroneous, and claim there was no consensus on global cooling (link). Others say there was a consensus (link). Then the impact of chlorofluorocarbons (CFCs) on the ozone layer became the new major focus. A damaged ozone layer could increase the ultraviolet radiation reaching the surface and lead to more cancer, animal blindness, and plant withering (link).


Consensus among scientists means nothing. Proposing that a consensus exists by distilling published papers means absolutely nothing. Getting scientists together for an open discussion, presenting one’s hypothesis, showing the proof, then having a robust debate followed by an open show of hands may be a better way to define a scientific consensus, but even that could be biased by the quality of the presentations and the presenters involved.


ROOT CAUSES FOR DISAGREEMENT


Research funding has always been the result of patronage, both private and governmental. An individual researcher must have some sort of sustenance to survive. If successful in the research, that scientist will attract more funding than the competition in the same field. The attraction for the funders of that successful research may be the public prestige received, or there may be a purely economic or even military advantage for the patron, politician, or governmental entity. Most research is performed by academia. Many, if not most, of the governmental agencies funding research are pressured by political entities to fund research that supports a political agenda. Government funding injects politics into scientific research and can make research outcome-oriented. Today, there is little research based on scientific curiosity. Most research is agenda-driven and based on the biases of the funding source, and the biggest source is the government. That has led to the climate change debacle the world now faces.


The actual climate change that will occur will be revealed at the pace that nature allows. Unfortunately, adapting to these changes takes time and resources. Understanding the causes of climate change may lead to decisions to mitigate the change or to adapt to it in advance. The underlying assumption is that the projected climate change will have a negative impact on humans or even end humankind. So, the research has been directed at mathematical models of the climate, centered on producing projections of global average temperature over time and comparing temperature to CO2 concentrations. These projections are actually of the positive or negative deviation of the temperature above or below a selected historical baseline. While this is a valid and well-accepted manner of displaying projections, the selection of the historical baseline can distort the public’s perception of the change.


These dynamic mathematical models must use the power of digital computer programming to produce temperature projections in a reasonable time frame. Many constants and variables are fed into the models. The equations, input constants, and variables can all be “tweaked” to generate projections until the projections can hindcast the majority of the historical record with some accuracy. Typically, data samples are not absolute but introduce a range around some point of reference. This requires the introduction of probability and statistics to represent a range of values. Temperature varies with latitude and elevation, so temperature anomalies must be computed at as many places around the Earth as possible, and then the anomalies are averaged. Each projection consists of bands of departures from a specific reference point; the plots are not absolute temperature versus time but the “temperature anomaly” above and below the chosen baseline. Matching history, however, requires controls and record keeping on the tweaks to the constants, variables, and the equations themselves.
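The anomaly arithmetic described above can be sketched in a few lines. This is a minimal illustration, not any agency's actual method: the temperature values and baseline window are hypothetical, and a real analysis grids and area-weights thousands of stations before averaging.

```python
# Sketch of a "temperature anomaly": each value's departure from the
# mean over a chosen baseline period. Values below are hypothetical.
def anomaly_series(temps, baseline_start, baseline_end):
    """Return each temperature's departure from the baseline-period mean."""
    baseline = temps[baseline_start:baseline_end]
    base_mean = sum(baseline) / len(baseline)
    return [t - base_mean for t in temps]

# Hypothetical annual mean temperatures (deg C) for one location;
# the first three years serve as the baseline period:
temps = [14.1, 14.0, 14.2, 14.3, 14.5, 14.6]
anomalies = anomaly_series(temps, 0, 3)
print([round(a, 2) for a in anomalies])
```

Note that shifting the baseline window shifts every anomaly up or down by a constant, which is exactly why the choice of baseline changes how dramatic a plotted anomaly curve looks without changing the underlying temperatures.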



Figure 2. The upper graph shows the IPCC (Intergovernmental Panel on Climate Change) projections of temperature (red and blue lines) without any man-made CO2, using natural forces only. The lower graph shows projections (again in red and blue) including man-made CO2. The black line in both graphs shows the observations. The very fine blue and yellow lines are the individual model runs that are averaged to make the blue and red lines. Source: IPCC WG1, AR5, FAQ 10.1, page 895, link.


In Figure 2 we see the result. The IPCC, the Intergovernmental Panel on Climate Change, uses models from the Coupled Model Intercomparison Project (CMIP3 in 2010 and CMIP5 in 2014). The computed uncertainty in these estimates of global temperature change since 1860 is shown in blue and yellow. As the graphs show, the uncertainty range is larger than the deviation since 1860; the lower bound in 2000 overlaps the upper bound in 1860 in the lower graph. Since 2000, the observations have been fairly flat, as shown by the black line. In the upper graph, which is supposed to show only natural influences on climate change, the projections are flat except for large volcanic eruptions, which decrease global temperatures. Do the authors want us to believe that none of the global warming of the past 150 years is natural? Did they assume this, or do they know it? It is unclear. For a fuller discussion of Figure 2, see here.


The data itself must be distilled down. To develop a projection of the results and keep it clear of bias, probabilistic techniques such as Monte Carlo methods are employed. These are computational algorithms that rely on repeated random sampling to obtain numerical results. The underlying concept is to use randomness to solve problems that might be deterministic in principle. Many climate change scientists have relied on Monte Carlo methods in the probability density function analysis of radiative forcing. Unfortunately, the actual data set adjustments and model “tweaking” have raised concerns about possible bias in the projections.
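The core idea of Monte Carlo sampling can be shown with a toy example. This sketch uses the commonly cited simplified logarithmic expression for CO2 radiative forcing; the parameter range sampled for the coefficient is an illustrative assumption, not an actual input distribution from any climate model.

```python
# Monte Carlo sketch: repeated random sampling propagates an uncertain
# input into a distribution of outputs.
import math
import random

random.seed(42)  # fixed seed for reproducible draws

def forcing(co2_ppm, co2_base=280.0, alpha=5.35):
    """Simplified logarithmic CO2 radiative forcing, in W/m^2."""
    return alpha * math.log(co2_ppm / co2_base)

# Treat the coefficient alpha as uncertain and draw it from an assumed
# uniform range, collecting the resulting spread of forcing estimates:
samples = [forcing(410.0, alpha=random.uniform(4.8, 5.9))
           for _ in range(100_000)]
mean = sum(samples) / len(samples)
print(f"mean forcing ~ {mean:.2f} W/m^2")
```

The output is not a single number but a distribution, which is how a probability density function for a quantity like radiative forcing is built up from uncertain inputs.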


Furthermore, the equations used in the millions of lines of software code may contain errors. Computer simulations provide a means to test hypotheses but do not provide “proof.” That is why computer projections must never be considered “settled science” or confused with observations. It is dangerous to do so. (Curry 2017).


 CARBON, CARBON DIOXIDE AND “BIG OIL”


The problem we have today is the divisive manner used by the scientists who are proponents of the “CO2 control knob” for global mean atmospheric temperature. Their computer models yield results that show a significant increase in the average global temperature of 1.1 to 4.2 degrees C (see Figure 1, here) by the year 2100. That could be a problem, perhaps, if it actually occurs. While the actual effect of a 4-degree temperature rise is unknown, it is assumed that it would be a bad thing, and that assumption is widely believed. The “CO2 control knob” proponents (see here for an example), henceforth called “Alarmists,” have declared that a doubling of the level of CO2 in the atmosphere could cause a global temperature increase of 4.5 degrees C (link) by the end of the 21st century, 80 years from now. They have recommended reducing, or even eliminating, the use of fossil fuels, which they believe is the primary cause of the rise in atmospheric CO2 from around 300 parts per million (ppm) at the beginning of the Industrial Age to today’s level of over 400 ppm.
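The arithmetic behind such sensitivity claims can be sketched with the commonly cited logarithmic CO2-temperature relationship, in which warming scales with the number of doublings of concentration. The function below is an illustration of that back-of-the-envelope relationship only, using the article's own figures of 300 ppm, 400 ppm, and 4.5 degrees C per doubling; it is not what the climate models actually compute.

```python
# Back-of-the-envelope warming implied by a logarithmic CO2-temperature
# relationship: delta_T = sensitivity * log2(C / C0).
import math

def warming(co2_now, co2_base, sensitivity_per_doubling):
    """Warming (deg C) implied by the assumed logarithmic relationship."""
    doublings = math.log2(co2_now / co2_base)
    return sensitivity_per_doubling * doublings

# Article's numbers: ~300 ppm at the start of the Industrial Age,
# ~400 ppm today, and a claimed 4.5 C per doubling:
print(round(warming(400.0, 300.0, 4.5), 2))
```

On these assumptions, the 300-to-400 ppm rise is less than half a doubling, so even the high-end 4.5 C per-doubling figure implies well under 2 degrees C of warming for the CO2 increase seen so far.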


Fossil fuels have always been referred to pejoratively in the media and associated with “Big Oil,” another pejorative term. The truth is that the use of fossil fuels has exponentially improved the ability of humans to flourish, and Big Oil has been the means for that flourishing to take place. Big Oil has done some wasteful and selfish things and deserves some criticism. But Big Oil is not an evil entity; it is a business, a business of large and smaller corporations with shareholders, executives, and employees, just like the Silicon Valley technical giants. Even the real Big Oil, the Organization of Petroleum Exporting Countries or OPEC, performs in the manner of a large corporation. The problem with Big Oil is that it has never been able to “stick up for itself.” It has needed “outsiders” to voluntarily join the battle on its behalf. Luckily, a few outsiders have decided to do that; however, it may be too late to change public perception of the fossil fuel energy industry (Epstein 2014). On the other hand, Silicon Valley has no such handicap as yet, but some negativism is building with respect to privacy concerns and the monopolistic behavior of the Tech Giants.


THE UNITED NATIONS EFFECT


The United Nations has exploited the negative view of fossil fuels to enhance its role and power in global affairs. Others have supported the CO2 argument to enhance their opportunistic investments in alternative energy sources, with the exception of nuclear and hydroelectric power. Hydroelectric is a non-carbon, reliable renewable, while nuclear is non-carbon and near-renewable due to its availability and energy density. These two alternative sources have been opposed by anti-humanity environmental extremists. These combined negative forces have generated very slick UN proposals for policy makers that are based on the singular premise that the global temperature is increasing at an alarming rate, that the root cause is the increase in atmospheric CO2 due to the use of fossil fuels, and that the entire world should participate in reducing human-caused CO2 emissions to zero.


But what if the temperature increase is not due to increased human generated CO2 levels? What if the computer models projecting an increasing global average temperature are wrong? Are all the computer models based on the same general hypothesis? If so, are they just tweaking constants and variables to match the history? And, what exactly does a 1 degree or even 3 degree C temperature rise mean?


RESEARCH ANSWERS


We need to get the answers to these questions. Who can provide them? There are many scientists and engineers who are knowledgeable in the physics and chemical processes that set the boundaries for climate science. Many of these scientists are retired members of academia with years of experience in research; others are retired from large corporations that have their own research organizations. There are also scientists and engineers who have performed advanced research in government facilities, including military research. Current climate research is being performed at public and private universities, at corporations, and in government laboratories. In the United States alone, the GAO estimates the government spent over $107 billion on climate research from 1993 to 2014 (link). By far, most of the funding originates with governments. The government-academia research complex and its revolving door have coopted research. Projects that fit social agendas are approved while more practical research languishes. Private research is denigrated by government-supported researchers.


Read More @ WattsUpWithThat.com




