Characterization of different paleoproxies of temperature
1. Introduction
Climate is the average pattern of weather for a particular region, characterized by weather statistics (temperature, pressure, humidity, wind speed and direction, etc.) averaged over time intervals generally of 30 years or longer.
Climatology is a branch of the atmospheric sciences concerned with the study of the climates of the Earth and with analyzing the causes and practical consequences of climatic changes. Until the 19th century climatology was closely linked with meteorology. The concept of climate appeared first in ancient Greece. Aristotle (384–322 BC) wrote “Meteorologica” – the first scientific book about atmospheric (meteorological and climatic) phenomena. The understanding of climate by Aristotle and Hippocrates (460–377 BC) remained very influential until well into the 18th century. The medieval Chinese scientist Shen Kuo (1031–1095) was probably the first person to assert that climate can change over the course of time. The Enlightenment in Europe gave a new impulse to the development of both weather and climate research. The invention of meteorological devices – the thermometer (Galileo in 1603), the mercury barometer (Torricelli in 1643), the aneroid barometer (Leibniz in 1700) – opened a new era in meteorology. Appreciable milestones in both meteorology and climatology were reached in the 19th century. In 1817 A. Humboldt (1769–1859) – one of the pioneers of scientific climatology – constructed the first map of global annual isotherms using data from 57 weather stations. In 1848 H.W. Dove (1803–1879) constructed maps of the isotherms of January and July. The first global maps of isobars and prevailing winds were constructed by Buchan in 1869. Francis Galton (1822–1911) coined the term anticyclone. Moreover, in 1896 S. Arrhenius (1859–1927) claimed that fossil fuel combustion may eventually result in global warming, and he proposed a relation between atmospheric carbon dioxide concentrations and global temperature. These, among numerous other findings, laid the foundation for modern climatology.
Global warming (GW) became the most interesting problem of climatology in the second half of the 20th century. By the end of the 1980s it was finally acknowledged that global climate is warmer than during any period since 1880. Climatic modeling, including the greenhouse effect theory, started to develop intensively, and the Intergovernmental Panel on Climate Change (IPCC) was founded by the United Nations Environment Programme and the World Meteorological Organization. This organization aims at assessing the scientific information on the risk of human-induced climate change and at predicting the impact of the greenhouse effect according to existing climate models. The problem of global warming has also moved from the realm of scientific debates into global and local political spheres. What is the physical mechanism of GW? Does it result only from anthropogenic activity (especially the burning of fossil fuels), or do some other natural climatic phenomena contribute to the global temperature increase, too? What are the magnitude and pattern of the warming? Answers to these questions can provide us with valuable information about potential climate changes in future decades and, hence, are of crucial importance for all human activity. However, in order to answer the above questions, we need detailed information about past climatic variability and its causes. Unfortunately, substantial gaps exist in our knowledge concerning the dynamics of climate variability. The available instrumental (meteorological) records are sparse and irregularly distributed, and they usually cover no more than the last 100–150 years. Paleoclimatic proxy records (reconstructions from natural archives) are irreplaceable tools for filling the gaps in our knowledge about long-term climatic changes. However, the paleodata are less accurate, and their reliability quite often raises serious doubts. On the other hand, modern climate models include numerous parameters, some of which are not defined adequately enough. Therefore an integrated analysis using different approaches is necessary to obtain a clearer picture of the global warming.
2. Global warming in the context of instrumental data
The average temperature of the Earth, measured by surface weather-station thermometers, has increased appreciably during the last century. The IPCC reported that the global mean surface temperature has risen by 0.74°C ± 0.18°C when estimated by a linear trend over the last 100 years (Figure 1A).
According to a currently widely held view, the temperature rise is: (a) mainly a result of anthropogenic emissions of greenhouse gases (CO2, CH4, N2O, halocarbons) and (b) extremely high and unprecedented in a historical context (see e.g. ). It is, however, evident that the instrumentally recorded temperature data are not representative enough to support solid conclusions. Even during the last few decades the weather-station network has covered less than 90% of the land, i.e. no more than 25% of the Earth’s surface (see Figure 1B). Moreover, the spatial coverage of these data generally becomes sparser going back in time. That is why the uncertainty of the global annual temperature increases from less than 0.1°C at the end of the 20th century to about 0.15°C at the end of the 19th century.
Satellite measurements of atmospheric temperature, started at the end of the 1970s, have much denser spatial coverage: a satellite passes over most points on the Earth twice per day. Satellite-borne microwave sounders – the microwave sounding units (MSU) – estimate the temperature of thick layers of the atmosphere by measuring the microwave thermal emission (radiances) of oxygen molecules from a complex of emission lines near 60 GHz (wavelength ≅ 5 mm). By making measurements at different frequencies near 60 GHz, different atmospheric layers can be sampled. Atmospheric temperatures are then calculated from the measured radiances by means of various mathematical procedures. Two groups of scientists – the Remote Sensing Systems (RSS) group and the University of Alabama in Huntsville (UAH) group – have analyzed the data produced by the NASA (National Aeronautics and Space Administration) TIROS (Television Infrared Observation Satellite) series and obtained two versions of temperature changes in the lower troposphere, i.e. at heights less than 8 km (maximum sensitivity around 2–3 km), since the end of 1978. Figure 2 shows the RSS and two UAH satellite series (versions 2007 and 2012) together with the surface thermometric data.
It should be noted, however, that it is not a trivial task to tie readings from different satellites together. Satellite measurements are limited by the following constraints: (a) offsets in calibration between satellites; (b) drifts in satellite calibration; (c) orbital decay and drift and the associated long-term changes in the local time of day at which measurements are made at a particular location. Therefore it is not easy to evaluate long-term trends in the satellite microwave-based data correctly, and the estimates often vary. The initial versions (before 2005) of the UAH record (Figure 2D) had a weak trend of 0.03–0.05°C/decade. In 2005 the trend value changed to 0.12°C/decade. The current version (Figure 2C) has a trend of about 0.14°C/decade, which is very close to that in the RSS series (Figure 2B). In spite of some past disagreement, both UAH and RSS records show that the temperature rise in the lower troposphere is unlikely to exceed that at the surface, contrary to the predictions of greenhouse warming theory. Actually, physical theory and the GCMs predict that the troposphere will warm faster than the surface as the greenhouse effect takes place (see IPCC, 2007). The effect is most pronounced in the tropical troposphere, which is therefore the most appropriate part of the atmosphere for detecting the greenhouse fingerprint. IPCC GCMs forecast a tropical tropospheric greenhouse warming increasing with altitude and reaching its maximum at ca 10 km (see Fig. 9.1 of ). Comparison of the model-predicted and satellite-derived temperatures has brought rather controversial results that created a decades-long debate which still continues. For example, Douglass et al.  examined tropospheric temperature trends using 22 GCMs and arrived at the conclusion that the model results and observed temperature trends are in disagreement over the largest part of the tropical troposphere, being separated by more than twice the uncertainty of the model mean. Santer et al. , who used updated and corrected observational datasets, found, however, that the observations are within the confidence intervals of the models. Fu et al. , in turn, examined the GCM-predicted and observed trends in the difference between the tropical upper-tropospheric and lower-middle-tropospheric temperatures for 1979–2010 and showed that the models significantly exaggerate the trend value.
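The trend figures quoted above (in °C/decade) are obtained by fitting a straight line to a temperature-anomaly series. Below is a minimal sketch of such an estimate; the synthetic monthly series is a placeholder standing in for the real MSU data, not an RSS or UAH record.

```python
import numpy as np

def decadal_trend(years, anomalies):
    """Ordinary least-squares trend of a temperature-anomaly series,
    returned in degrees C per decade."""
    slope_per_year, _intercept = np.polyfit(years, anomalies, 1)
    return 10.0 * slope_per_year

# Synthetic monthly series, 1979-2012, built with a 0.14 C/decade trend
# plus noise (placeholder data, for illustration only)
rng = np.random.default_rng(0)
t = 1979.0 + np.arange(34 * 12) / 12.0
anom = 0.014 * (t - t[0]) + 0.1 * rng.standard_normal(t.size)

print(f"estimated trend = {decadal_trend(t, anom):+.3f} C/decade")
```

On real series the choice of start and end dates, and the corrections for the calibration and orbital effects listed above, matter far more than the fitting step itself.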
3. Global warming in the context of paleodata
Paleoclimatology is the science that studies the climatic history of the Earth using proxy records. It reconstructs and studies climatic variations prior to the beginning of the instrumental era by means of a variety of proxy sources. The main data used by paleoclimatology include: tree rings (width and density), tree height growth, concentrations of stable isotopes (18O, 13C, D) in natural archives (ice, corals and tree rings), borehole temperature measurements and contemporary written historical records (weather diaries, annals etc.). Detailed descriptions of these paleoindicators can be found in [3, 11, 13, 43, 78]. Here we briefly describe some of the main features of the available proxy records (Table 1).
| Proxy variable | Maximum time span | Spatial limitations | Maximum R, inter-annual | Maximum R, inter-decadal | General shortcomings |
|---|---|---|---|---|---|
| Tree-ring width | 7–8 millennia | Extratropical (>30°) part of the globe or high-elevation areas | 0.50 | 0.80 | Standardization methods hamper interpretation of multi-decadal and longer variability. Divergence problem. |
| Tree-ring density | 1.5–2 millennia | Extratropical (>30°) part of the globe or high-elevation areas | 0.79 | 0.92 | Standardization methods hamper interpretation of multi-decadal and longer variability. Divergence problem. |
| Stable isotopes in tree rings | 1–2 millennia | Extratropical (>30°) part of the globe or high-elevation areas | 0.68 | 0.81 | A complicated measurement procedure |
| Tree-height increment | 1263 years | Northern Fennoscandia | 0.61 | 0.72 | A novel approach not yet profoundly examined |
| Stable isotopes in ice | 750 000 years | High-latitude (>60°) and high-elevation ice caps | 0.30 | 0.50 | A complicated measurement procedure. Problems with dating deep layers. Problems with calibration against instrumental data. |
| Stable isotopes in corals | 1 000 000 years | Tropical (±30°) oceans | 0.41 | 0.66 | Problems with interpretation of multi-decadal and longer variability (changes in water depth, nutrient supply). |
| Contemporary written historical records | 1372 years | Europe, China, Japan, Korea, Russia, Egypt | 0.90 | 0.93 | Problems with interpretation of variability longer than a human lifespan |
| Borehole temperature | 20 000 years (usually ~500 years) | Mid-latitude (30–60° N) part of the Northern Hemisphere, southern (>0° S) part of Africa, extracontinental Australia | – | – | Reproduces only long-term (century-scale and longer) variability. |
| Ice-core melt layers | 600 years | High-latitude (>60°) and high-elevation ice caps where summer temperature reaches positive values | 0.18 | 0.74 | Problems with calibration against instrumental data. Might not reproduce the full range of temperature variability |
Some proxy characteristics are summarized in Table 1: (a) the maximum duration of reconstruction currently achieved; (b) the regions of the world where reconstructions are potentially available; (c) the largest coefficient of correlation (R) between annually resolved proxy and instrumental data; (d) the largest coefficient of correlation between decadally resolved proxy and instrumental data; (e) general shortcomings of the proxy. The coefficients of correlation were calculated, in general, over the last 80–100 years.
Tree rings are one of the most widely used climate proxies because they can be absolutely dated to the calendar year by means of the dendrochronological cross-dating method. Progress in the methodology, theory and application of dendroclimatology in the last decades of the 20th century [23, 27, 34] helped this science become popular, and its methods have since become key tools in the reconstruction of past temperatures in many parts of the world [11, 30, 37, 55, 57, 92]. It should also be noted that tree-ring data are generally collected from territories that are remote from areas of human activity and are less subject to local anthropogenic impacts such as urbanization and changes in land use.
Since different paleoindicators reflect actual temperature changes in different ways, multiproxies – time series that combine proxy sets of various types – are often used in paleoclimatology [60–61].
Based on both instrumental and paleoclimatic data, the IPCC  claimed that the average Northern Hemisphere temperature during the second half of the 20th century was higher than during any other 50-year period in the last 500 years with a probability >0.9, and was the highest in at least the past 1300 years with a probability >0.66. Let us examine this suggestion in detail using seven reconstructions of the Northern Hemisphere temperature during the last 1000–2000 years (Figure 3): the tree-ring proxies after Briffa  (NHB) and Esper et al.  (NHE), the non-tree-ring proxy after Loehle  (NHL) and the multi-proxies after Jones et al.  (NHJ), Mann et al.  (NHM), Crowley and Lowery  (NHC), and Moberg et al.  (NHMb).
The millennial climate proxies evidently demonstrate different histories of temperature variability over the past 1000–2000 years (Figure 3). The power spectra of the temperature reconstructions demonstrate the differences even more clearly. Figure 4 shows the global wavelet spectra of the seven proxy records concerned, calculated using the Morlet basis for the years before 1900, i.e. for a time interval prior to a possible strong anthropogenic impact. Overall linear trends were subtracted from each of the time series beforehand. One can see from Figure 4 that millennium-scale variations are present only in the spectra of series NHMb and NHL. Variations with periods less than 225 yrs are quite weak in the spectra of reconstructions NHM, NHJ and NHC.
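The global wavelet spectrum used here is simply the Morlet wavelet power averaged over time at each period. Below is a minimal sketch of the procedure described above (linear detrending followed by a Morlet-basis transform), assuming annual resolution; the synthetic series with a buried 225-yr cycle is a placeholder, not one of the cited reconstructions.

```python
import numpy as np

def global_wavelet_spectrum(x, dt, periods, w0=6.0):
    """Time-averaged Morlet wavelet power at each Fourier period.
    x must be detrended beforehand; dt is the sampling step."""
    x = np.asarray(x, dtype=float)
    n = x.size
    t = (np.arange(n) - n // 2) * dt           # symmetric time grid for the wavelet
    gws = np.empty(len(periods))
    for i, p in enumerate(periods):
        # Scale corresponding to Fourier period p (Torrence & Compo relation)
        s = p * (w0 + np.sqrt(2.0 + w0**2)) / (4.0 * np.pi)
        psi = (np.pi ** -0.25) * np.exp(1j * w0 * t / s) * np.exp(-0.5 * (t / s) ** 2)
        psi *= np.sqrt(dt / s)                 # keep power comparable across scales
        coeff = np.convolve(x, np.conj(psi)[::-1], mode="same")
        gws[i] = np.mean(np.abs(coeff) ** 2)   # average power over time
    return gws

# Example: annual series AD 1000-1899 with a 225-yr cycle buried in noise
rng = np.random.default_rng(1)
years = np.arange(1000, 1900)
series = 0.2 * np.sin(2 * np.pi * years / 225.0) + 0.1 * rng.standard_normal(years.size)
series -= np.polyval(np.polyfit(years, series, 1), years)   # remove the linear trend

periods = np.arange(20.0, 500.0, 5.0)
gws = global_wavelet_spectrum(series, dt=1.0, periods=periods)
print("peak period =", periods[np.argmax(gws)], "years")
```

Edge effects suppress power at the longest periods, which is why detrending before the transform matters for series only 1000–2000 years long.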
A visual inspection and the wavelet analysis make it possible to divide the temperature reconstructions conditionally into the following groups:
Reconstructions NHM, NHJ and NHC (Figure 3A, B) show an obvious linear decline of mean temperature (up to 0.15–0.30°C) over the pre-industrial era (AD 1000–1880) and a sharp rise thereafter (the so-called hockey-stick shape). The temperature in the middle of the 20th century is obviously the highest of the entire millennium. According to , these proxies show that the Earth in the 20th century was warmer than in any other time period of the last millennium. We combine the NHM, NHJ and NHC records into a group depicting a temperature pattern with a sharply rising shape – a hockey stick (IHS).
Reconstructions NHE and NHB (Figure 3C, D) do not show a linear trend. Instead, their long-term variability is dominated by multi-centennial cyclicities. The 20th century was warm, but the warming is not so anomalous. Here we combine the NHE and NHB temperature reconstructions into a group depicting multi-centennial variability (MCV).
Reconstructions NHMb and NHL (Figure 3E, F) also show some linear trend in AD 1000–1880, but the temperature increase during the 20th century is not abrupt. The warming of the 20th century is comparable with that during the Medieval Warm Period (AD 800–1100), i.e. it is not unusual. Variations with periods longer than 1000 years obviously prevail in the spectra of these records. We further refer to the NHMb and NHL proxies as the millennial-variability (MV) reconstructions.
It has been shown that the disagreement between the different temperature patterns cannot be fully explained by differences either in standardization techniques or in geographical coverage (see, for example, ). Thus, the paleoseries of the IHS, MCV and MV types can be regarded as three different clusters among the seven temperature reconstructions. Bürger  analyzed ten temperature reconstructions by means of a more sophisticated technique and concluded that they form five clusters, all of which are significantly incoherent with each other.
We have plotted two geothermal reconstructions of the global surface temperature (Figure 5) – the series by Rutherford and Mann  and the series by Beltrami  – together with the instrumental data .
Ogurtsov and Lindholm  showed that the borehole-based reconstructions demonstrate a history of past temperature changes that conflicts with the IHS-type proxies: the IHS-type proxies have a downward linear trend during the pre-anthropogenic era (AD 1500–1880), while the corresponding trend in the global borehole temperature is evidently upward. The agreement between the borehole data and the MV/MCV proxies is better. The geothermal reconstructions show that the 20th century (AD 1900–2000) was the warmest period of the last 500 years. The most recent borehole-based reconstruction of Huang et al. , spanning the last 20 000 years, shows that the average global temperature 4.5–9.0 ka before present was higher than it is today. However, the uncertainty of temperature reconstruction in such a remote past is quite large.
The discourse about the disagreement between global paleoproxies is important because of the problem of the credibility and confidence of the available temperature reconstructions, which became especially heated after the works of McIntyre and McKitrick [62-65]. These authors applied the data-transformation methods used by Mann et al.  to the same source data and obtained a Northern Hemisphere temperature index rather different from the NHM record (Figure 6). The index of McIntyre and McKitrick  shows that the 20th century was warm over the last 500 years, but that the early 15th century was even warmer.
McIntyre and McKitrick [62, 63] stimulated the scientific community and generated a polemical discussion [41, 56, 64, 65, 92, 93]. The debate continues, but it can reasonably be concluded that the appreciable differences between global temperature reconstructions obtained by means of identical initial data indicate that, despite evident successes, the methods and approaches of paleoclimatology still leave considerable space for subjectivity. Taking into account the noted problems, a question arises: what actual features of climate can be captured by paleoreconstructions? That is to say, if the obtained temperature reconstructions have significant dissimilarities, what are their common features? Bürger  arrived at the conclusion that the available reconstructions differ so much that there is no way to draw meaningful conclusions from them. Ogurtsov et al. [73, 76] showed, however, that in spite of the differences, the reconstructions of the Northern Hemisphere temperature have at least two apparent common features: (a) the presence of a roughly regular century-scale rhythm with a period of 50–130 yrs through the last 1000 years; (b) a noticeable temperature rise during the last century. In spite of the differences between the IHS, MCV and MV reconstructions, they agree that the 20th century was warm. However, it is difficult to make any decisive conclusion about the actual extent of the 20th-century temperature anomaly within the last 1000 years, particularly if we take into account the possible influence of divergence, or reduced sensitivity to temperature changes.
The divergence problem is a well-known anomalous reduction in the sensitivity (ARS) of tree growth to changing temperature, which has been detected in many dendrochronological records over the last decades of the 20th century [9-10, 26, 31]. An evident underestimation of recent warming in tree-ring based reconstructions is illustrated in Figure 7. Possible explanations of the divergence include:
(a) a decrease in stratospheric ozone concentration, which increases the flux of ultraviolet radiation reaching the ground and correspondingly lowers tree productivity;
(b) a decrease of surface solar irradiance over 1961–1990 (“global dimming”), which reduced the sunlight available for tree growth;
(c) a nonlinear growth response, which reduces tree-ring width under high temperatures.
In spite of these possible explanations, the divergence problem has still not been solved, and modern temperature reconstructions in dendrochronology usually do not cover the most recent time interval well. Therefore they may not capture the sharp rise in temperature during the last two decades. Wilson et al.  made a new temperature reconstruction for the Northern Hemisphere that utilizes fifteen tree-ring based proxy series expressing no divergence effects over the last decades. Based on this divergence-free time series, Ogurtsov et al.  corrected eight millennial-scale proxy reconstructions of the Northern Hemisphere temperature for the ARS effect and reanalyzed them. This study concluded that, neglecting the reconstruction , in the extratropical part of the Northern Hemisphere the time interval 1988–2008 was the warmest two decades of the last 1000 years with a probability of more than 0.70. The unusual level of current temperature over the areas least disturbed by local anthropogenic impact might prove that over the last two decades the climatic system was perturbed by an additional global-scale forcing factor which did not operate in the past. It has been noted, however, that the procedure of correction for the anomalous reduction in sensitivity includes some rather arbitrary assumptions (e.g. the prolongation of a linear trend over 2000–2008) and needs further improvement.
Summarizing the results of the analysis and reanalysis of the available long-term temperature proxies, it can be confidently concluded that the 20th century was warm, i.e. that the global temperature averaged over the last 100 years was actually higher than the global temperature averaged over the last 1000 years.
4. Modern climate modeling — Advantages and limitations
Climate models are mathematical descriptions of the Earth's climate system. They use sets of mathematical equations and numerical methods to reproduce the interactions of the atmosphere, the mixed and deep ocean, the land surface and the cryosphere. Climate modeling serves a variety of purposes, from studies of the dynamics of the climate system to the prediction of future climate. Climate models range in complexity from simple, one-equation analytic models to state-of-the-art General Circulation Models (GCMs) simulating the physics, chemistry and biology of all parts of the Earth’s climatic system.
Energy-balance models of the globally averaged climate are the simplest. In the framework of the energy-balance approach, changes in the climate system are estimated from an analysis of the change in the Earth's heat storage. The basis for these models was introduced by Budyko  and Sellers . In its most simplified form, an energy-balance model provides globally averaged values of the computed variables.
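As an illustration of the energy-balance approach, here is a minimal sketch of a zero-dimensional model in the Budyko-Sellers spirit. The constants below (solar constant, planetary albedo, Budyko's linearized outgoing long-wave radiation A + B·T, mixed-layer heat capacity) are common textbook values, not parameters taken from the cited works.

```python
# Zero-dimensional energy-balance model:
#   C dT/dt = (S0/4) * (1 - albedo) - (A + B*T) + F
# with the outgoing long-wave radiation linearized as A + B*T (T in deg C).
S0 = 1361.0      # solar constant, W m^-2
ALBEDO = 0.30    # planetary albedo
A, B = 203.3, 2.09   # W m^-2 and W m^-2 per deg C (classic Budyko-type values)
C = 2.0e8        # effective heat capacity, J m^-2 K^-1 (~50 m ocean mixed layer)

def equilibrium_temperature(forcing_wm2=0.0, years=200, dt_days=1.0):
    """Step the model to equilibrium with explicit Euler; returns T in deg C."""
    dt = dt_days * 86400.0
    T = 15.0
    for _ in range(int(years * 365 / dt_days)):
        net_flux = S0 / 4.0 * (1.0 - ALBEDO) - (A + B * T) + forcing_wm2
        T += dt * net_flux / C
    return T

T0 = equilibrium_temperature(0.0)
T1 = equilibrium_temperature(3.7)   # ~3.7 W m^-2, canonical forcing of doubled CO2
print(f"equilibrium T = {T0:.2f} C; warming for 3.7 W m^-2 = {T1 - T0:.2f} C")
```

With these constants the implied climate sensitivity is 1/B ≈ 0.48 K×W⁻¹×m², near the lower end of the λc range discussed below; the point of the sketch is how directly the answer depends on the assumed parameters.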
The more complicated radiative-convective models take into account the variation of temperature with altitude. This approach makes it possible to study the role of clouds, water vapor and the stratosphere. The first radiative-convective model was introduced by Manabe and Wetherald .
GCMs are three-dimensional models with boundary conditions at the underlying surface. They try to simulate incoming and outgoing radiation, the temporal and spatial variation of the wind field, the generation of clouds and transfer of water vapor, the formation of sea ice, atmosphere-ocean coupling, the redistribution of heat in the oceans etc. GCMs have a spatial resolution comparable to that of the global synoptic network. The most prominent use of GCMs in recent years has been to forecast temperature changes resulting from increases in atmospheric concentrations of greenhouse gases . GCMs are the most complex and sophisticated models, including at least tens of equations.
It should be noted that while working with climatic models one needs to know a lot of model parameters, the number of which increases along with the complexity of the model. However, many such parameters are not known with adequate precision. This concerns even the climate sensitivity. The climate sensitivity (usually in K×W⁻¹×m²) is a measure of the climate system’s response to a constant radiative forcing. Radiative forcing (usually in W×m⁻²), in turn, is a measure of the perturbation brought by some factor to the radiative balance of the global Earth-atmosphere system. Positive forcing tends to warm the surface, while negative forcing tends to cool it. Current estimates of the climate sensitivity, λc, range from 0.07 K×W⁻¹×m²  to 2.54 K×W⁻¹×m² . IPCC  gives the value 0.53–1.23 K×W⁻¹×m² (see also Table 1 from ). Knowledge about the radiative forcings which likely influenced terrestrial climate over the last 150 years – (a) anthropogenic greenhouse gas (CO2, CH4, N2O) emissions; (b) anthropogenic aerosol emissions; (c) anthropogenic changes in albedo (land use, black soot on snow); (d) volcanic aerosol emissions; (e) total solar irradiance (TSI) variations – is still not satisfactory. According to , the level of scientific understanding is high only for industrial greenhouse gases. The level of understanding of all the other possible climate drivers is either medium or low. For example, the forcing caused by human-made changes in land-surface properties since pre-agricultural times is quite unclear: a possible value lies between 0.0 and –0.4 W×m⁻² according to  and between 0.5 W×m⁻² and –0.6 W×m⁻² according to . As a result, the estimate of the total net anthropogenic forcing since AD 1750 ranges from 0.6 W×m⁻² to 2.4 W×m⁻² . Our knowledge about natural forcings is also rather limited. For example, different long-term reconstructions of TSI prior to the instrumental period (before 1978) show quite different pictures. According to , the average TSI increased by ca 5 W×m⁻² from the beginning of the 19th century till the end of the 20th century, while the reconstruction of  shows only a 1.5 W×m⁻² rise during the same time interval (see also Figure 6 from ). These values lead to a corresponding radiative forcing of 0.26–0.88 W×m⁻². Furthermore, considerable uncertainty remains over the magnitude of the influence of volcanic eruptions on the climate system. Even in the case of the Mt. Pinatubo explosion (1991), which was directly observed and investigated using all the possibilities of modern science, the estimates of its climatic forcing differ from 2.25 W×m⁻²  to 4.7 W×m⁻² . Knowledge about the structure of feedbacks is also incomplete . Thus, it is obvious that the parameters of many modern climate models have huge uncertainties. That is why some skeptics even believe that such models follow the old maxim of “garbage in, garbage out” – the principle in computer science meaning that if the input data are incorrect, erroneous results will be obtained even if the algorithms are correct. It is interesting to note that the wide spread in the main input data and model parameters actually makes it possible to fit the calculated temperature to the measured one in rather different ways. The numerical experiment of  shows that an arbitrary choice of radiative forcings, without justification of the choice criteria, makes it possible to explain the global warming of the 20th century to a great extent without the hypothesis of the greenhouse effect.
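The near order-of-magnitude spread implied by these uncertainties follows directly from multiplying the quoted ranges, ΔT = λc × ΔF. A back-of-envelope check, using only the figures given above (the same arithmetic reappears in the concluding section):

```python
# Spread in implied equilibrium warming, delta_T = lambda_c * delta_F,
# with lambda_c = 0.53-1.23 K W^-1 m^2 (IPCC range quoted above) and the
# forcing ranges quoted above for the two drivers.
LAMBDA_LO, LAMBDA_HI = 0.53, 1.23          # climate sensitivity, K W^-1 m^2

forcings = {
    "net anthropogenic since 1750": (0.6, 2.4),    # W m^-2
    "TSI since the early 1800s":    (0.26, 0.88),  # W m^-2
}

for name, (f_lo, f_hi) in forcings.items():
    dT_lo = LAMBDA_LO * f_lo     # most conservative combination
    dT_hi = LAMBDA_HI * f_hi     # most generous combination
    print(f"{name}: {dT_lo:.2f}-{dT_hi:.2f} C")
# -> 0.32-2.95 C (anthropogenic) and 0.14-1.08 C (solar),
#    i.e. lower and upper limits differing by almost an order of magnitude.
```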
Ogurtsov  tested the total (direct and indirect) contribution of solar activity to the global warming using a one-dimensional energy-balance climate model. This work takes into account the fact that the Sun can affect the Earth’s climate not only directly, via changes in solar luminosity, but probably also indirectly, via the modulation of the galactic cosmic ray (GCR) flux. A correlation between changes in the globally averaged low (<3.2 km in altitude) cloud cover anomaly and changes in GCR intensity was indeed demonstrated by Marsh and Svensmark [58, 59] and Palle et al. . Data on low cloudiness obtained in the framework of the International Satellite Cloud Climatology Project (ISCCP) are plotted in Figure 8 together with data on the GCR flux measured by the neutron monitor in Kiel.
One can see an appreciable positive correlation between GCR and low cloudiness through 1983–2001. Ogurtsov  calculated the hypothetical cloud radiative forcing since the end of the 19th century based on: (a) the linear relationship between low cloudiness and GCR over 1983–1994 proposed by Marsh and Svensmark ; (b) the long-term reconstruction of GCR intensity obtained by Mursula et al. ; (c) estimates of cloud radiative forcing made using the data of the Earth Radiation Budget Experiment . Using this forcing, the TSI reconstruction after Hoyt and Schatten  (Figure 9B) and a simple one-dimensional (4 latitudinal belts) energy-balance model, Ogurtsov  calculated the mean temperature of the Northern Hemisphere over 1886–1999 (Figure 9C). It is evident that the joint effect of changes in (a) the solar luminosity and (b) low cloudiness may lead to an increase in the hemispheric temperature in the 20th century by about 0.35°C. Thus, in the framework of the model of , the warming of the Northern Hemisphere before the 1980s can be fully accounted for by changes in solar-cosmic factors, with the greenhouse effect fully neglected.
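The chain of reasoning in steps (a)–(c) can be caricatured in a few lines. The sketch below is purely illustrative: both coefficients are invented placeholders, not values from Marsh and Svensmark, Mursula et al. or the Earth Radiation Budget Experiment.

```python
import numpy as np

# Toy illustration of the chain: GCR anomaly -> low-cloud change -> forcing.
K_CLOUD = 0.1    # assumed: % (absolute) low-cloud change per 1 % GCR change
F_CLOUD = -0.6   # assumed: W m^-2 of net forcing per 1 % low-cloud change

def hypothetical_cloud_forcing(gcr_anomaly_percent):
    """Forcing implied by a GCR-driven low-cloud change. A *decline* in GCR
    reduces low cloud and, since low clouds cool, yields a positive forcing."""
    low_cloud_change = K_CLOUD * np.asarray(gcr_anomaly_percent, dtype=float)
    return F_CLOUD * low_cloud_change

# Example: a hypothetical 10 % decline of GCR intensity over the 20th century
print(f"implied forcing: {hypothetical_cloud_forcing(-10.0):+.2f} W m^-2")
```

The point is structural: the computed forcing is a product of two poorly constrained coefficients, so modest changes in either assumption rescale the whole solar-cosmic contribution.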
However, the decrease of the correlation between GCR and clouds at the end of the 1990s and the change of its sign after 2002 make any conclusion about a possible cosmic ray-cloud link rather disputable [29, 88]. The calculation of  is thus only a computing exercise, which shows that an arbitrary manipulation of the input model parameters, none of which is known precisely and reliably enough, can bring quite curious results. This result, as well as many other points (see e.g. Lindzen ), testifies that despite large successes in climate modeling, current climate simulations still seem to be only a fitting of the calculated results to the actually observed temperatures. A fairly good fit can be achieved using various combinations of input data. Such investigations are very important, since they are necessary for studying possible scenarios of climatic changes in the past as well as in the future. However, at present it is not easy to estimate (even approximately) the probability of each specific scenario. We can only note that the solar contribution to the sharp temperature increase during the last 3–4 decades is either minor  or negligible . The rise of temperature after 1980 was not simulated by the model of  (see Figure 9C). This testifies that the observed rapid rise in global mean temperature after the beginning of the 1970s is difficult to reproduce in any model if the greenhouse gas forcing is not taken into account.
5. Possible scenarios of climate change in the 21st century
The projection of forthcoming climatic changes is one of the main tasks of climatology. There are two approaches to understanding the future evolution of climate: (a) mathematical and (b) physical. In the mathematical approach we take the observed climate record and try to extrapolate it properly into the future. In the physical approach we first attempt to understand the most important climate processes and describe them with a detailed model; the obtained model is then used to predict the response of future climate to different forcings. Attempts at predicting the future warming of the globe started in the 1970s–1980s. These physical forecasts were made assuming a predominantly greenhouse character of GW and were based on prognoses of future CO2 concentrations. The concentration of carbon dioxide was predicted quite correctly: e.g. Legasov et al.  forecasted a CO2 value of 375–385 ppm in the year 2000, while Bolin et al.  estimated a corresponding value of 360–380 ppm. However, the predictions of global temperature were not equally successful (Table 2).
| Source | Forecast formula | Predicted temperature in 2000 (T, °C) |
|---|---|---|
| Budyko, 1972  | T2000 = T1900 + 1.2 | 0.83–1.00 |
| Kellogg, 1978  | T2000 = T1900 + 1.2 | 0.83–1.00 |
| Budyko, 1982  | T2000 = T(1880–1975) + 0.6 | 0.35–0.49 |
| The impact of atmospheric carbon dioxide increasing on climate, 1982  | T2000 = T1900 + (1.0–2.0) | 0.63–1.80 |
| Budyko and Izrael, 1991  | T2000 = T1970 + 0.9 | 0.72–0.91 |
Real temperatures in the years 1900, 1970 and 2000 were determined by means of the data of CRU (http://www.cru.uea.ac.uk/cru/info/warming/) and NCDC (ftp://ftp.ncdc.noaa.gov/pub/data/anomalies/monthly.land_ocean.90S.90N.df_1901-2000mean.dat). The majority of the predictions (Table 2) clearly overestimated the warming at the end of the 20th century, since the real instrumental temperature in 2000 reached 0.27°C (CRU) and 0.39°C (NCDC). This is also true for the detailed prognosis made by the group headed by Hansen . These researchers considered three possible scenarios based on rising concentrations of greenhouse gases (CO2, CH4, N2O, CFC11, CFC12) till 2020. Scenario A assumes a continued exponential increase in greenhouse gases, scenario B assumes a reduced linear growth, and scenario C assumes a rapid reduction of emissions such that the net climate forcing ceases to increase after the year 2000 (see Figure 10A). Hansen et al.  calculated the respective variations in global temperature for these scenarios (Figure 10B). Even the minimum scenario of  considerably (up to 0.1°C) overestimates the actually measured temperature during 2005–2010. Naturally, our knowledge of the climatic system has appreciably increased since the end of the 1980s. Nevertheless, the evident failure of the early GW predictions seems rather indicative. It proves that a more or less reliable prediction of the future climate evolution is a very complicated task. Insufficient knowledge about possible and potential forcing factors deteriorates the reliability of climatic modeling and, hence, reduces the opportunities of the physical methods of forecasting. In addition, discrepant knowledge about the history of climate prevents us from a more or less reliable extrapolation of temperature into the future. Consequently, these shortcomings limit the applicability of the mathematical methods. In conclusion, based on the available climatic and paleoclimatic data, two possible scenarios of the global temperature change in the 21st century are considered below. These scenarios in turn are based on two extreme scenarios of the climate evolution during the last 100 years.
GW is unique and unprecedented. It is connected mainly with an extra global-scale forcing factor which did not operate in the past. In this case the anthropogenic greenhouse effect is the main additional contributor, and the temperature history of the last 1000 years is compatible with the IHS reconstructions. Global warming appears almost entirely as a result of the industrial activity of mankind, and the past of the climatic system does not play an appreciable part (the natural factors being neglected). The corresponding projections, which are based on various assumptions about the future evolution of industrial emissions of greenhouse gases, result in a temperature rise over the 21st century of between 1.8 and 4.0°C .
GW is not unique in a historical context, and it is mainly the result of natural (terrestrial, solar, cosmophysical) climatic cycles, while the contribution of the greenhouse effect is of minor importance. In that case the temperature history of the last 1000 years is compatible with the MV and MCV reconstructions. Climate in the 20th century was driven by the same dynamic system as during the entire last millennium (the greenhouse effect being neglected). This in turn means that the present state of climate is a natural result of its past, and the paleoclimatic data (MV and MCV proxies) can be used as a source of information for forecasts. For this purpose we interpolated the paleorecords NHL and NHMb by decades and made a prognosis of the mean decadal temperature over the first part of the 21st century by means of a nonlinear forecast technique. The nonlinear prediction was made using the method of analogs, which is based on the reconstruction of the trajectory of the predicted series' dynamic system in a pseudo-phase space and is a modification of the method used in .
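A sketch of the analog method under simple assumptions is given below: delay-coordinate embedding into a pseudo-phase space, a single nearest neighbor as the analog, and an illustrative embedding dimension and lag (these settings, and the toy quasi-periodic series, are not those of the cited study).

```python
import numpy as np

def analog_forecast(x, dim=4, lag=1, steps=1):
    """Nonlinear forecast by the method of analogs: embed the series in a
    pseudo-phase space of delay vectors, find the past state nearest to the
    present one, and read off the value that followed it."""
    x = np.asarray(x, dtype=float)
    # Delay-coordinate embedding: row t is the state (x[t], ..., x[t+(dim-1)*lag])
    n_states = x.size - (dim - 1) * lag
    states = np.column_stack([x[i * lag: i * lag + n_states] for i in range(dim)])
    predictions = []
    for _ in range(steps):
        current = states[-1]
        # Search only analogs whose successors are known (exclude the last state)
        dists = np.linalg.norm(states[:-1] - current, axis=1)
        j = np.argmin(dists)
        nxt = states[j + 1, -1]          # value that followed the best analog
        predictions.append(nxt)
        # Extend the series and the state trajectory with the predicted value
        x = np.append(x, nxt)
        new_state = np.array([x[x.size - 1 - (dim - 1 - i) * lag] for i in range(dim)])
        states = np.vstack([states, new_state])
    return predictions

# Example on a toy quasi-periodic "decadal" record
t = np.arange(100)
decadal = np.sin(2 * np.pi * t / 22.0) + 0.3 * np.sin(2 * np.pi * t / 7.0)
print(analog_forecast(decadal, dim=4, lag=1, steps=5))
```

The method assumes that the underlying dynamic system is stationary, which is exactly the premise of this scenario: the forecast fails by construction if an extra forcing factor appears that did not operate in the past.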
If the warming of the last 100 years is a result of natural climatic variability, i.e. the climate of the past century is governed by the same dynamic system as during the previous one to two millennia, the mean temperature of the Northern Hemisphere in the first part of the 21st century will unlikely be higher than the modern value (Figure 11). Ogurtsov and Lindholm  made the same forecast.
If the GW is a result of a variety of different processes of both natural and anthropogenic origin, none of which can be neglected (greenhouse gas emission, solar activity change, natural climatic variation, regional and local anthropogenic impact), the situation is most complicated. In that case it is challenging to make even a qualitative estimate of the changes of global climate through the current century, because of the significant uncertainty in our knowledge of the relative contributions of the specified factors.
6. Conclusion and prospects for further research
The analysis of the available information on the temperature of the Earth, including both instrumental temperature measurements and proxy paleodata, leads to the conclusion that the 20th century was warm, i.e. the average temperature of the Earth through AD 1900–2000 was undoubtedly higher than the average temperature through AD 1000–2000. The forcing factors which presumably cause the global warming are: (a) anthropogenic changes in the atmospheric concentration of greenhouse gases and aerosols, (b) changes in solar activity, (c) internal oscillations in the climatic system, (d) changes in volcanic activity, and (e) anthropogenic changes in land-surface properties. Greenhouse forcing is often reported as the major contributor to GW. Climatic modeling gives evidence in favor of this assumption, since it is very hard to simulate the abrupt rise of temperature starting in the 1970s if the greenhouse gas influence is neglected. Nevertheless, greenhouse skeptics propose a few ideas to explain the phenomenon. Pokrovski  suggests that the sharp recent warming is mainly a result of a strong 60–70-year natural temperature cycle connected with the circulation of oceanic water. The hypothesis sounds plausible, but it should be noted that the interval of instrumental measurement is not long enough to establish both the phase and the amplitude of this cycle accurately. Analyses of paleodata confirm the presence of the century-scale cyclicity in Northern Hemisphere temperature , but its peak-to-trough amplitude unlikely exceeds 0.3°C, and the second half of the 20th century seems to be a declining phase of this variation (see Fig. 4 of ). Another idea was considered by Bashkirtsev and Mashnich , who suggest taking into consideration the globally averaged satellite cloud observations made in the framework of the ISCCP. Bashkirtsev and Mashnich  propose that the change in the Earth’s albedo, caused by the downward trend in the multi-decadal record of cloudiness, is sufficient to produce the observed temperature variations at the end of the 20th century. Their oversimplified estimate showed that the decrease in global cloud area during 1987–2000, which usually is not considered by climatic models, could cause an increase in the background solar radiation flux of up to 10 W×m⁻². Moreover, Palle et al. [80, 81] claimed that a change in the Earth’s reflectance, tightly related to the cloudiness, has resulted in an appreciable increase in the solar radiation incident at the Earth's surface at the end of the 20th century. The phenomenon is often called global brightening. The more detailed estimates of , based on the data of both satellite and ground-based astronomical and actinometrical observations, show that the respective radiative forcing reached 2–7 W×m⁻² during 1985–2000. Calculations of  show that despite the short period of action this forcing factor should result in a corresponding very sharp rise of global temperature. It should be noted, however, that the quality of the ISCCP cloud data is doubtful , and particularly the long-term trends of cloudiness established by means of satellite measurements are highly disputable . That is why the question about changes in the Earth’s albedo during the last decades is still open.
The possible failure of the models to predict the tropospheric warming, particularly in the tropics, seems to be the most serious challenge to the greenhouse theory now. Disagreement between model predictions and observational data is repeatedly used by greenhouse skeptics as evidence of the poor quality of climate models. Discussions created by this controversy often go beyond purely scientific debates and concern some philosophical issues, e.g.: if a model prediction disagrees with the experimental results, which one is most likely wrong – the model or the experiment ? Solution of the problems concerning the tropical tropospheric warming is thus a crucial point for understanding the origin of the GW.
Paleoclimatic data are unlikely to improve our understanding of the CO2-temperature relationship appreciably. Indeed, research on Antarctic ice-core paleorecords, covering the last 9–420 ka, reveals a pattern of strong temperature and CO2 rises at roughly 100 000-year intervals. But during these great temperature transitions the CO2 rise has almost always come 400–5000 years after (not before) the temperature increase [20, 69]. This link most likely appears because warmer temperatures have facilitated the release of the gas from the oceans. Therefore the ice-core paleodata give evidence that temperature controls the concentration of carbon dioxide in the atmosphere, and not vice versa. Greenhouse warming supporters do not deny this conclusion but emphasize that this cause-effect relationship took place in the past, while in the case of the contemporary warming the external climate forcing by anthropogenic CO2 emissions leads the climate variations. Thus the application of the CO2-climate relation deduced from the past to the recent global warming is not fully substantiated . Moreover, a study by Shakun et al. , who examined 80 proxy records from around the globe 20–10 kyr ago (the last glacial-interglacial transition), showed that the temperature rise happened first in the Southern Hemisphere, while in the Northern Hemisphere the CO2 increase came first. Shakun et al.  arrived at the conclusion that about 90% of the global warming occurred after the CO2 increase.
Based on the totality of the available paleodata we can infer that global temperature during the last 2–3 decades was:
certainly the highest over the last 500 years;
probably the highest over the last 1000 years. However, it is not possible to conclude reliably that the last 20–30 years were the warmest period of the entire millennium, because the existing reconstructions of temperature during the last 600–1000 years depict different and even discrepant patterns. This disagreement can hardly be explained by the uncertainties inherent in the proxies. It should also be noted that none of the temperature histories based on the paleodata can be considered satisfactorily precise and reliable, due to e.g. the insufficient coverage of the Earth’s surface by individual paleorecords.
Moreover, it is difficult to estimate the contribution of any individual factor potentially responsible for the GW – industrial emission of greenhouse gases, varying activity of the Sun, regional anthropogenic impact and natural climatic cycles – due to insufficient knowledge of the corresponding radiative forcings and climate sensitivity. Actually, the estimate of the total net anthropogenic forcing since 1750 made by IPCC  gives a value of 0.6–2.4 W×m⁻². Our estimate of the direct solar forcing, caused by the change in luminosity since the beginning of the 19th century, gives a value of 0.26–0.88 W×m⁻². If we use the assessment of climate sensitivity  λc = 0.53–1.23 K×W⁻¹×m², we obtain a corresponding warming of 0.32–2.95°C due to the anthropogenic factor and of 0.14–1.08°C due to the change in TSI. The difference between the lower and upper limits reaches almost an order of magnitude. In addition, there are other studies indicating that the Sun can affect terrestrial climate indirectly, e.g.: (a) via a connection between the cloud cover and the galactic cosmic ray intensity [58, 59] and (b) via a connection between the galactic and solar cosmic ray intensity and the aerosol content . However, these hypotheses have not been reliably proven, and thus it is impossible to obtain quantitative estimates of the corresponding forcings.
Temperature reconstructions of the MV and MCV types show that natural cycles of longer timescales and larger amplitudes can also be an important factor of the GW. But it is very difficult to determine the actual role of the internal variability of the climatic system in the GW because of the disagreement between different proxies and their limited precision and reliability.
Summarizing all stated above, we can conclude that the origin of the rise of global temperature should be considered as not well known, due to a lack of adequate knowledge about many of the factors that may be responsible for this phenomenon. Consequently, it is very difficult to predict the climatic change of the 21st century even approximately. The available information allows only specifying two possible scenarios of the evolution of global temperature during this century:
If global warming is almost entirely a result of the industrial greenhouse effect, the average temperature of the globe in the 21st century will continue to increase significantly, in agreement with the projections of the IPCC.
If global warming is only slightly connected with anthropogenic activity and is primarily a result of natural climatic variability (a less probable scenario), then the average temperature of the Northern Hemisphere will not increase, at least during the first half of the 21st century.
Undoubtedly, many other climatic scenarios are possible as well. However, it seems unlikely that the problem of the origin of the modern increase of global temperature will be solved before the middle of the current century. Substantial improvement of both climate modeling and experimental monitoring of the current state of the atmosphere is of great importance for definitely establishing the origin and character of the GW. Further progress in paleoclimatology can also help to solve the problem.