
The Eta Model: Design, Use, and Added Value

By Fedor Mesinger, Katarina Veljovic, Sin Chan Chou, Jorge Gomes and André Lyra

Submitted: November 3rd, 2015. Reviewed: July 19th, 2016. Published: October 5th, 2016.

DOI: 10.5772/64956


Abstract

The design of the Eta model goes back to the early 1970s, when its original dynamical core was designed following the philosophy of Akio Arakawa of emulating important properties of the atmospheric governing equations. The core’s later major features were invented and implemented in the mid-1980s. Once a comprehensive physics package was added, the model became operational as a regional NWP model in the United States in 1993. Its use for regional climate projections followed later, for the South American region, as did a regional reanalysis over the North American region. A summary of the model’s dynamical core is given, followed by one of its physics package. Results of experiments revealing the model’s ability to generate added value even at large scales when run as a regional climate model (RCM) are summarized. The Eta model is applied seamlessly on various climate scales, from subseasonal and seasonal to multidecadal, and from a coarse 40 km up to a high 5 km resolution. Examples of applications to various socioeconomic sectors are given, such as hydropower management, crop yield forecasts, environmental and forest conservation, urban area management, and assessment of natural disaster risks. The Eta RCM capability to reproduce extreme climatic values is pointed out.

Keywords

  • Eta model
  • eta vertical coordinate
  • regional climate models
  • topography in climate models
  • added value by RCMs
  • horizontal diffusion

1. Introduction: model design

The origins of the Eta model go back as far as 1973, when the dry-version code of its “ancestor” model, referred to at the time as the limited area primitive equation model (LAPEM), was written at the University of Belgrade. This first code already had an Arakawa, or Arakawa-Lorenz, conserving vertical advection scheme, and a lateral boundary condition (LBC) scheme that has stayed in place until the present.

Over the years, many development steps followed, along with a few model name changes, until the name “Eta” became generally accepted, referring to the model’s unique vertical coordinate. A number of other features of the model’s dynamical core deserve to be noted; they can be summarized as follows:

  • For the gravity wave terms, on the model’s E grid, a forward-backward scheme is used (a minimal one-dimensional sketch of the scheme is given after this list) that

    1. avoids the time computational mode of the leapfrog scheme and is neutral with time steps twice those of leapfrog [1];

    2. is modified to enable propagation of a height-point perturbation to its nearest-neighbor height points, thereby suppressing the space computational mode of the semistaggered grid [2, 3].

  • Also for other terms, including various physics calls, split explicit time differencing is used. This makes the model very efficient since long time steps are used for subroutines that do not require short steps for reasons of stability.

  • A horizontal advection scheme is used that conserves energy and C-grid enstrophy, on the B/E grid, in space differencing, for two-dimensional nondivergent flow. The scheme is that of Janjić [4], obtained by transforming the Arakawa and Lamb [5] C-grid scheme so as to use the B/E grid velocity components.

  • Conservation of energy is enforced in transformations between the kinetic and potential energy, in space differencing [6].

  • A nonhydrostatic version is available via a switch in the code. If chosen, the model uses the scheme of Janjić [7]. Instead of solving a prognostic equation for the vertical velocity component, the scheme approximates the vertical acceleration using the finite difference of the hydrostatic vertical velocities of two consecutive time steps.

  • The eta vertical coordinate [8] (its definition is recalled below) ensures a hydrostatically consistent calculation of the pressure-gradient (“second”) term of the pressure-gradient force (PGF) irrespective of the steepness of the terrain, because the eta coordinate surfaces are very nearly horizontal. The topography discretization with the coordinate was upgraded some years ago by the introduction of so-called sloping steps (e.g., [9]), which successfully address the problem Gallus and Klemp [10] identified for the step-topography discretization in flow over bell-shaped topography [11, 12].

  • Van Leer-type finite-volume vertical advection of the dynamic variables (v, T) is used, following the scheme of [13].
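To make the forward-backward idea referenced in the first item above concrete, here is a minimal one-dimensional linearized shallow-water sketch in Python. It is a schematic illustration only, not the Eta code: the E-grid staggering and the modification of [2, 3] are omitted, and all parameter values are illustrative.

```python
import numpy as np

# Minimal 1D linearized shallow-water integration with the
# forward-backward scheme: the height field is stepped forward first,
# and the velocity update then uses the just-updated height.
# Schematic only: no E-grid staggering; parameters are illustrative.

g, H = 9.81, 8000.0                  # gravity (m s-2), equivalent depth (m)
nx, dx = 200, 100.0e3                # number of points, grid spacing (m)
c = (g * H) ** 0.5                   # gravity-wave phase speed
dt = 1.8 * dx / c                    # beyond the leapfrog limit dx/c, but
                                     # within the forward-backward limit 2*dx/c

x = np.arange(nx) * dx
h = np.exp(-((x - x.mean()) / (10 * dx)) ** 2)   # initial height bump (m)
u = np.zeros(nx)                                 # initial velocity

def ddx(f):
    """Centered difference on a periodic domain."""
    return (np.roll(f, -1) - np.roll(f, 1)) / (2.0 * dx)

for _ in range(500):
    h -= dt * H * ddx(u)             # forward step for the height
    u -= dt * g * ddx(h)             # "backward" step: uses the new h

print("max |h| after 500 steps:", float(np.abs(h).max()))  # stays bounded
```

Using the pre-update h in the velocity step would turn this into a plain forward scheme, which is unstable for gravity waves; the forward-backward ordering is what buys neutrality at twice the leapfrog time step.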

One advantage of the eta coordinate over the usual terrain-following (sigma) coordinate is that the vertical sides of model cells are very nearly equal. Thus, for a finite-volume discretization, which ensures conservation by keeping track of the inflow to and outflow from model cells, multiplications of horizontal fluxes by the areas of the vertical sides of cells can be skipped. This is done in the Eta, so that, in view of its Arakawa-type horizontal advection and finite-volume vertical advection, it is very nearly a finite-volume dynamical core model. In fact, given its conservation of the second-order integral quantities of enstrophy and kinetic energy, not done in standard finite-volume cores, its dynamical core can be referred to as a finite-volume “plus” dynamical core.
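For reference, the eta coordinate of [8], mentioned in the list above, is commonly written as

\[
\eta = \frac{p - p_T}{p_S - p_T}\,\eta_S,
\qquad
\eta_S = \frac{p_{\mathrm{rf}}(z_S) - p_T}{p_{\mathrm{rf}}(0) - p_T},
\]

where \(p_T\) is the pressure at the model top, \(p_S\) the surface pressure, \(z_S\) the model terrain elevation, and \(p_{\mathrm{rf}}(z)\) a reference pressure profile. Since \(\eta_S\) decreases with terrain elevation, coordinate surfaces remain quasi-horizontal over mountains, which is what makes the PGF calculation hydrostatically consistent over steep terrain.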

The physics package of the model includes the Noah surface scheme over land and sea ice, with roughness length following [14], and Monin-Obukhov similarity with the Paulson functions inside the surface layer. In addition, a wind-direction-dependent form drag scheme is used to account for the subgrid scale topography [15]. Over water, a surface flux scheme is used with a molecular sublayer, roughness length after Charnock, and a scheme derived from the Mellor-Yamada level 2 scheme inside the surface layer. Mellor-Yamada 2.5 turbulence is used above the surface layer, with refinements that include addressing its realizability problem following [16–19]. A choice of the Betts-Miller-Janjić [9, 20] or Kain-Fritsch convection scheme is available; a momentum transport scheme [9, 21] is present within the model’s Kain-Fritsch scheme. A choice of the scheme of [22] or that of [23] is available for cloud microphysics. The GFDL radiation schemes [24, 25] are used but, at the time of this writing, are being replaced by the RRTMG scheme of [26], subsequently upgraded by [27]; for the impact in the clear-sky case, see Ref. [28].

The model’s lateral boundary condition (LBC) scheme makes an effort to respect the mathematical nature of the problem, prescribing prognostic variables along the outermost boundary grid points only, with one less variable prescribed at outflow than at inflow points [1]. At outflow points, the velocity component tangential to the boundary is extrapolated from the inside of the domain. The experiments of [29] strongly indicate that a result is thereby obtained that is generally in better agreement with observations than when using the Davies [30] boundary relaxation scheme, used by just about all other limited area models.
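The flavor of this boundary treatment can be illustrated with a toy one-dimensional advection problem, sketched below in Python: the driver value is prescribed only at the inflow point, while at the outflow point the value is carried out of the domain from the interior. The driver signal and all parameters are hypothetical; the actual Eta boundary code is, of course, more involved.

```python
import numpy as np

# Toy 1D upstream advection illustrating the Eta-style LBC idea:
# prescribe the prognostic variable at the inflow boundary only, and
# extrapolate from the interior at the outflow boundary.
# All values are illustrative; this is not the Eta boundary code.

nx, dx, dt, U = 50, 1.0, 0.4, 1.0     # U > 0: inflow at left, outflow at right
phi = np.zeros(nx)

def driver_value(t):
    """Hypothetical driver-model value supplied at the inflow point."""
    return np.sin(0.2 * t)

t = 0.0
for _ in range(200):
    new = phi.copy()
    # upstream (donor-cell) advection in the interior
    new[1:-1] = phi[1:-1] - U * dt / dx * (phi[1:-1] - phi[:-2])
    new[0] = driver_value(t + dt)     # inflow: prescribed from the driver
    new[-1] = new[-2]                 # outflow: extrapolated from inside
    phi, t = new, t + dt
```

Forcing a driver value at the outflow point as well, as relaxation schemes effectively do, would over-specify the problem, which is the mathematical point made above.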

2. Eta coordinate treatment of topography

After about 2000, quite a few papers stated that the eta coordinate, used by the Eta model, “has difficulties in representing flow over mesoscale topography” and/or that it “appears to be ill suited for high-resolution prediction models” (e.g., [31], among others). What prompted these doubts was a poor result of an experimental NCEP Eta model in forecasting a downslope windstorm in the lee of the Utah Wasatch Range, and even more the results of Gallus and Klemp [10]. In simulating an idealized case of flow over a bell-shaped topography, or “Witch of Agnesi,” using an eta nonhydrostatic code, Gallus and Klemp [10] had the flow separate downstream of the mountain instead of properly descending along the lee slope.

To address the problem, the discretization using the eta coordinate was refined by assuming sloping bottoms of the model’s lowest velocity cells whenever the four neighboring elevation values permit the definition of the direction of the slope. This is the case when one of the four elevation values is higher than the remaining three, or when two neighboring values are equal and higher than the remaining two. The slope is defined inside the model layer below that of the velocity point. For an illustration in the simplest 2D case, see Figure 2 of Ref. [9].

The refinement considerably improved the simulation of flow over the bell-shaped topography (Figure 3 of [9]), but not as much as the artificial enforcement of zero vorticity along the step corners by Gallus and Klemp, a modification they found to work better than some others they tried. Recently, however, an oversight was noticed in the sloping-steps Eta code: its horizontal diffusion had not been made aware of the existence of the slopes. While this oversight was being addressed, the horizontal diffusion scheme was in addition made unconditionally stable and monotonic. This seemed desirable because, in very high resolution runs, apparently too high values of the scheme’s nonlinear diffusion coefficients were resulting from high values of the velocity deformation. Following these refinements, the simulation of the Gallus-Klemp experiment shown in the right-hand plot of Figure 1 was obtained. The result that Gallus and Klemp obtained with their artificial code modification is shown in the left-hand plot. The right-hand plot shows velocities along the lee slope even slightly greater than those of the left-hand plot.
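The stability and monotonicity issue can be illustrated with a simple sketch. In an explicit five-point diffusion update, the nondimensional coefficient μ = K Δt/Δx² must not exceed 1/4 in two dimensions for the update to remain a convex combination of neighboring values, hence stable and free of new extrema. One simple way of guaranteeing this, shown below in Python, is to clip the deformation-dependent coefficient; the actual Eta scheme is more elaborate, and the constants here are illustrative.

```python
import numpy as np

# Deformation-dependent ("Smagorinsky-type") horizontal diffusion with
# the nondimensional coefficient clipped at 1/4, so that the explicit
# five-point update is a convex combination of neighbors: stable and
# monotone. Illustrative sketch, not the Eta E-grid scheme.

def diffuse_step(phi, u, v, dx, dt, c_smag=0.2):
    # deformation magnitude from centered differences (interior points)
    ux = (u[1:-1, 2:] - u[1:-1, :-2]) / (2 * dx)
    vy = (v[2:, 1:-1] - v[:-2, 1:-1]) / (2 * dx)
    uy = (u[2:, 1:-1] - u[:-2, 1:-1]) / (2 * dx)
    vx = (v[1:-1, 2:] - v[1:-1, :-2]) / (2 * dx)
    deform = np.hypot(ux - vy, uy + vx)

    K = (c_smag * dx) ** 2 * deform            # nonlinear coefficient
    mu = np.minimum(K * dt / dx ** 2, 0.25)    # the clip: mu <= 1/4

    lap = (phi[1:-1, 2:] + phi[1:-1, :-2]
           + phi[2:, 1:-1] + phi[:-2, 1:-1] - 4.0 * phi[1:-1, 1:-1])
    out = phi.copy()
    out[1:-1, 1:-1] += mu * lap                # convex-combination update
    return out
```

With 0 ≤ μ ≤ 1/4, the updated center value is a weighted average of itself and its four neighbors with nonnegative weights, so no new extrema can appear regardless of how large the deformation-driven coefficient becomes.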

Figure 1.

Simulation of the Gallus-Klemp experiment with the Eta code allowing for velocities at slopes in the horizontal diffusion scheme, right-hand plot. The left-hand plot reproduces plot (c) of Figure 6 of Ref. [10]. Contours show wind speed in m s−1, with 10 m s−1 prescribed at the lateral boundaries of the domain; see [10] for additional detail.

In summary, with the eta coordinate discretization using the sloping-steps refinement, and with the model’s Arakawa-style schemes, arbitrarily steep topography can be used with no significant noise generation and no major disadvantages that we are aware of. One could consider it a downside of the coordinate that the vertical resolution of the boundary layer becomes poor over high topography. However, model layers getting very thin over high topography with terrain-following systems might not be desirable either. To illustrate the point made about steep topography, Figure 2 reproduces a plot from Figure 8 of [9], showing a vertical cross section generated using the nonhydrostatic option in a run of a severe downslope windstorm in the lee of the Andes. Cliffs of well over a kilometer and up to about 2 km are seen in the model’s topography, with no visible noise in the isotherms depicted. The plot is generated using the NCAR Graphics Package, with the values of individual grid cells shown and no contour smoothing.

Figure 2.

West-east cross section of the topography and temperature (K) at 33 h of an 8 km resolution Eta forecast initialized at 0000 UTC 10 July 2006. The case run is one of an intense zonda windstorm in the lee of the Andes, with heating in the area of the station San Juan, at a distance of about 340 km from the origin of the plot, of more than 9 K compared to the temperature at 24 h. For additional detail see [9].

3. Value added at large scales

One issue the Eta model has been used to address is the possibility of achieving added value at large scales inside the domain of a regional climate model (RCM). The usual attitude of those running RCMs is that RCMs should improve on small scales while keeping the large scales as close as possible to those of the global driver model. To this end, nudging of the RCM’s large scales toward those of the driver global model is frequently applied. Opinions have even been advanced that RCMs are unable to improve on large scales. Veljovic et al. [32] have, on the other hand, argued that attempting to improve on large scales as well when running an RCM is a worthwhile objective, and have presented results indicating that this indeed is possible. If one can achieve added value at large scales inside an RCM, small scales will obviously benefit too, perhaps quite considerably. Thus, Figure 1 of [32] shows an example of quite an extraordinary improvement achieved using an RCM, which could hardly have been possible without added value at large scales.

A caveat is, however, needed regarding tests for which authors have made statements on the added value achieved at different scales while having driven their RCMs by various reanalysis fields. One should notice that it is only via experiments that use GCM results for LBCs that one can fairly test the ability of an RCM to add value at large scales. It is crucial to note that the purpose of RCMs is not to reproduce something we know, but to improve on climate integrations that are projecting changes into a future we do not know. When driving an RCM with reanalysis LBCs, the RCM has no opportunity to improve on the large scales of climate projection driver data, even though it might be able to do so. As pointed out by Veljovic et al. [32], and restated in [9], this is because such climate projection driver data have not been made a part of the experiment. Thus, a proper assessment of the optimal domain size, and of the ability of the RCM to add value at different scales, is not possible.

One should note that we are not saying that, in experiments with RCMs driven by a reanalysis, one cannot improve on a field or a feature that might be considered large scale, such as perhaps a precipitation pattern, compared to the reanalysis used. But this is not a test of the ability of the RCM to add value at large scales, because the global fields against which such a test needs to be made, climate projection fields, were not used to drive the RCM.

It is suggested that, in order to achieve added value at large scales, four requirements must be observed. First, one needs to run the RCM on a relatively large domain; note that domain size is very inexpensive compared to resolution. Second, one should use an LBC scheme that does not ignore the basic mathematical properties of the problem at hand; that means enforcing the driver model values along the outside RCM boundary only, and not all of them at the outflow parts of the boundary. Third, one must not use large-scale or spectral nudging. Fourth and finally, one must use an RCM with a dynamical core not inferior in quality to that of the driver global model.

Following the original effort of Veljovic et al. [32], additional experiments that support the four requirements listed above have been presented by Mesinger and Veljovic [11, 12]. For an overall test, the most crucial issue is obviously demonstrating that achieving added value at large scales is indeed possible when the requirements listed are observed. To this end, Figure 3 shows the verification results for an experiment in which a 21-member Eta model ensemble was driven by ECMWF 32-day ensemble members. The verification scores chosen are for the wind at 250 hPa, on the assumption that the placement accuracy of the wind in the upper troposphere, in particular of the jet stream, reflects the model skill in forecasting atmospheric large scales better, or at least more directly, than the various spectral analysis methods a number of authors have used. Therefore, the placement of wind speeds greater than a chosen value, selected as 45 m s−1, has been assessed using the verification approach customary in quantitative precipitation verification, with ECMWF analyses taken as “observed.” The measure used is the one usually referred to as the equitable threat score (ETS) or Gilbert skill score (e.g., [33]). To minimize the impact of “hedging,” e.g., of obtaining a higher score if the forecast area is greater than the observed one, the “bias-adjusted” ETS scores, ETSa [34], were calculated. Results for the driver ECMWF ensemble (red) and the Eta ensemble (blue) are shown in the upper panel of the figure. More traditional RMS differences between the ensemble members and the analyses are shown in the lower panel.
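For reference, the standard ETS is computed from the 2 × 2 contingency table of the binary event, here exceedance of the 45 m s−1 threshold on matching forecast and analysis grids. The Python sketch below implements the standard score only; the bias adjustment of [34] is an additional step not reproduced here.

```python
import numpy as np

# Equitable threat score (Gilbert skill score) for a binary event,
# here 250 hPa wind speed exceeding a threshold, on matching
# forecast/analysis grids. Standard ETS only; the bias adjustment
# of [34] is a further step not implemented in this sketch.

def ets(forecast, analysis, threshold=45.0):
    f = np.asarray(forecast) > threshold
    o = np.asarray(analysis) > threshold
    hits = np.sum(f & o)
    false_alarms = np.sum(f & ~o)
    misses = np.sum(~f & o)
    n = f.size
    hits_random = (hits + false_alarms) * (hits + misses) / n
    denom = hits + false_alarms + misses - hits_random
    return (hits - hits_random) / denom if denom > 0 else 0.0
```

The score is 1 for a perfect forecast and approaches 0 for a random one, which is why near-zero ETSa values toward the end of the 32-day range in Figure 3 are read as loss of skill.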

Figure 3.

Bias-adjusted ETS scores of 250 hPa wind speeds greater than 45 m s−1, upper panel, and RMS wind difference, in m s−1, lower panel, of the driver ECMWF ensemble members (red) and Eta members (blue), both with respect to ECMWF analyses. Initial time is 0000 UTC 4 October 2012.

Compared to that of the earlier experiment of Veljovic et al. [32], the driver ECMWF ensemble of the experiment of Figure 3 was of increased resolution, about 32 km for the first 10 days and 63 km thereafter. The resolution of the Eta ensemble was unchanged, about 31 km. Thus, during the first 10 days, the resolution of the two ensembles of Figure 3 was about the same. Yet a clear advantage of the Eta ensemble is seen in the ETSa scores during about days 3–9 of the experiment, supported by the lower RMS differences during that time. The differences in ETSa scores following day 10 are smaller, with the Eta scores being somewhat higher most of the time when the two are visibly different during about the first half of that period, and the opposite result near the end. The RMS differences following day 10 convey about the same message. One might consider the scores of earlier times, when the model skill is still synoptically useful, to be more meaningful than those near the end of the experiment, in particular once the ETSa becomes close to its random value of 0.

Along with the earlier results cited, the scores of Figure 3, we feel, amount to a fairly large body of evidence showing that a nested limited area model, or an RCM, can achieve added value over its driver model at large scales as well. If it does so more often than not, then using large-scale or spectral nudging will be nudging in the wrong direction. One should also note that the boundary condition scheme of Ref. [30], or relaxation boundary conditions, forcing the RCM values in a boundary band of points to approach those of the driver model, also represents a way of nudging the RCM toward the large scales of the driver model. Thus, if used, it reduces the ability of the RCM to improve on large scales. In summary, if one finds that spectral or large-scale nudging is needed for an acceptable result, we feel it is a good idea to try to find out why.

4. Regional reanalysis

Starting with Kalnay et al. [35], it has become customary to analyze all reasonably available data using an unchanged model and data assimilation system, the idea being that changes in the model and data assimilation systems are thereby taken out of the resulting analyses as much as possible, so that, as much as feasible, only climate and climate change information impacts the analyses. In the early 2000s, presumably for the first time, a regional reanalysis effort was undertaken at NCEP, running the then-operational Eta model and data assimilation systems. The paper presenting the project design and selected results for the original 25-year period of 1979–2004 is Mesinger et al. [36]. The reanalysis, named the North American Regional Reanalysis (NARR), is continued by the NCEP/Climate Prediction Center in near-real time and is used for many application purposes.

5. From weather to climate

The Eta model has been applied to produce weather forecasts over South America at the Centre for Weather Forecasts and Climate Studies (CPTEC) of INPE since December 1996 [37]. Given the steep slopes of the Andes Cordillera, the eta coordinate was demonstrated to enable a realistic reproduction of the major features of the summer circulation over South America [38].

The first attempts to extend the integration range of the Eta model were made over South America by Chou et al. [39, 40], using two different land surface schemes, the simple bucket model and the SSiB scheme. The evaluations of one-month simulations produced by the Eta model showed improvement over the driver CPTEC Atmospheric General Circulation Model (AGCM). Chou et al. [40] demonstrated that the model responds most strongly to changes in the lateral boundary conditions, in the second place to changes in the sea surface temperature, and finally to the land surface conditions, such as soil moisture and soil temperature.

Seasonal forecasts with the Eta model over South America [41] started in 2002, applying a persisted sea surface temperature anomaly as the model’s lower boundary condition. The model was driven by the CPTEC AGCM through the lateral boundary conditions, with an integration range of 4.5 months. The evaluations showed that the model reproduced the seasonal precipitation variability over the continent, and suggested some potential use of the higher-frequency variability forecast by the model. The Eta seasonal forecasts showed a clear advantage over the driver CPTEC AGCM forecasts, in particular for seasonal precipitation [42]. Some systematic errors of the seasonal precipitation forecasts were identified in the construction of the model seasonal climatology [43], such as an underestimate of 850 hPa specific humidity in the central part of the domain during the summer season. Bustamante et al. [43] also showed the interannual variability of seasonal precipitation predicted by the Eta model to be in agreement with observations. Resende [44] likewise obtained an underestimate of 700 hPa specific humidity over most of South America in Eta simulations driven by the Climate Forecast System Reanalysis (CFSR) [45] at the lateral boundary conditions, in both winter and summer seasons. This underestimate of seasonal specific humidity may be one of the sources of error in seasonal precipitation, which is generally underestimated [46].

The relevance of the boundary conditions used to drive the regional climate model is shown by Pilotto et al. [47], who compared Eta seasonal precipitation forecasts driven by the CPTEC AGCM, with persisted sea surface temperature anomaly, against forecasts driven by the CPTEC Coupled Ocean-Atmosphere GCM (OAGCM), using predicted sea surface temperature. The model was set up on a domain covering the southern Atlantic Ocean and the eastern part of South America. The improvements in the precipitation forecasts over the equatorial oceans by the Eta forecasts driven by the OAGCM were substantial, reducing the excessive precipitation of the Intertropical Convergence Zones over the Atlantic and Pacific Oceans.

A first attempt to apply Eta RCM forecasts to crop yield productivity in Brazil was made by Vieira [48], using a physically based crop model driven by Eta seasonal forecasts. Although the evaluation of these forecasts showed some level of error in various corn-producing areas [49], the forecasts captured crop productivity reasonably well, particularly in the months of maximum productivity.

Most of the energy production in Brazil is based on hydropower; accurate precipitation forecasts are therefore essential for the correct management of energy production, distribution, and transmission. An example of the application of Eta RCM seasonal forecasts over major Brazilian river basins can be found in the project mentioned by [50], which demonstrated that the Eta RCM has smaller systematic errors in total precipitation during the rainy season of the Sao Francisco River Basin than the driver CPTEC AGCM seasonal precipitation forecasts. These forecasts were run in ensemble mode, perturbing parameters of the Betts-Miller-Janjić convection scheme of the Eta model. Another example of the application of the seasonal forecasts in the energy sector is the evaluation of the Eta seasonal forecasts in capturing the onset of the rainy period over the Parana River, one of the major rivers for power production in Brazil [51]. These forecasts showed a large spread in determining the onset of the rainy season. A potential application in the energy sector is forecasting at the subseasonal scale, as shown in [52], driven by the CPTEC Coupled Ocean-Atmosphere General Circulation Model (OAGCM) and using a lagged ensemble of 20 members constructed from twice-daily forecasts during 10 consecutive days.
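As an illustration of how such a lagged ensemble is assembled, the snippet below generates the 20 initialization times, two per day over 10 consecutive days; the start date is hypothetical, and only the construction pattern is the point.

```python
from datetime import datetime, timedelta

# The 20 lagged-ensemble members described above: twice-daily forecast
# initializations over 10 consecutive days. The start date below is
# hypothetical; only the construction pattern is the point.

first_init = datetime(2020, 1, 1, 0)                 # hypothetical date
init_times = [first_init + timedelta(hours=12 * k) for k in range(20)]
# -> 2020-01-01 00 UTC, 2020-01-01 12 UTC, ..., 2020-01-10 12 UTC
```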

6. Climate change

The Eta model was adapted to run on seasonal to multidecadal ranges [53] in order to develop capacity for studies of climate change. Some of the changes consisted of allowing seasonal variations of vegetation greenness, reading sea surface temperature off coupled ocean-atmosphere global climate models for an arbitrarily long decadal range, synchronizing with the OAGCM calendar, and updating equivalent CO2 concentrations according to the projected future emission scenarios. The model was initially driven by the HadAM3P global atmospheric model, and its present-climate simulation was evaluated against climatology [53]. The model reproduced reasonably well South American summer climatological features such as the South Atlantic Convergence Zone and the upper-level circulation.

To support impact, vulnerability, and adaptation (IVA) studies for the Brazilian Second National Communication to the UNFCCC, the Eta model was set up at 40 km horizontal resolution, nested in four physics perturbation members of the HadCM3 simulations [54] under the A1B emission scenario [55]. The model was run continuously from 1960 to 1990. The time slice between 1961 and 1990 was considered the reference climate period, while three time slices, 2011–2040, 2041–2070, and 2071–2099, were produced as future climate periods. To verify the model’s capability to reproduce large-scale climatic patterns in long-term integrations, Chou et al. [54] demonstrated that the seasonal mean upper-level winds simulated in the 30-year continuous runs agreed with the reanalysis winds (Figure 4). The Bolivian Anticyclone, a major summer feature over South America, was correctly positioned.

For the Brazilian Third National Communication, the strategy for constructing the Eta ensemble considered two greenhouse gas emission scenarios, RCP4.5 and RCP8.5 [56], and at least two global climate models, HadGEM2-ES and MIROC5. The runs were divided into four time slices: 1961–2005, the reference climate period, and 2006–2040, 2041–2070, and 2071–2100, the future climate periods [57, 58]. The horizontal resolution was increased to 20 km. From the Second to the Third National Communication, the Eta model was upgraded following [9]. The major changes were the modification of the vertical advection, making the model fully finite-volume, and the refinement of the eta coordinate discretization, which allowed sloping sides of topography cells. This version showed improvement over the previous one, in particular in capturing the downslope windstorms in the lee of the Andes [59], the zonda winds, a foehn type of wind.

Figure 4.

Mean 200 hPa streamlines from ERA40 reanalysis (left column) and Eta run nested in HadCM3 simulations (right column) for DJF (top row) and JJA (bottom row). Areas shaded in orange refer to wind speeds in m s−1.

The spread of the Eta ensemble runs resulting from the use of different global models at the lateral boundary conditions in [57] was shown to be larger than the spread resulting from the use of perturbed members of the same global model in [53]. The spread produced by the use of different global model drivers and different emission scenarios [57] attempts to capture the range of lower and upper limits of the projected changes; the constructed ensemble thus tries to envelop the uncertainties associated with the construction of the climate change projections.

The output from the Eta model downscaling of global climate change projections has been applied to support studies in various socioeconomic sectors. Studies of possible impacts of climate change have been produced for Brazilian hydropower availability [60], for water resources in small river basins and at river springs [61], for the Amazon biome and tropical forest conservation in general [62, 63], and for coffee crops [64, 65]. Indices of vulnerability and susceptibility to climate change have been designed, for example in [66], to help construct adaptation measures.

IVA studies generally require higher-resolution datasets because the problems faced are of local rather than global or continental scale. In addition, a major issue of climate change is the change in extreme values, and at coarse resolution the values are smoother. The increase in horizontal resolution, easily provided by an RCM, can bring the frequency distribution of a variable closer to the observed one. The evaluations of trends of extreme values in the metropolitan areas of São Paulo [67] and Rio de Janeiro [68] under the A1B scenario used the Eta model output at 40 km. The increase in resolution to 20 km [69, 70] has shown improvement over the 40 km runs, especially in the extreme values. The mean values of the Eta climate change projections at 20 km resolution are very similar to those of the coarse 40 km resolution runs; the gain due to resolution is detected in the extreme values, which agree better with observations, as demonstrated by Chou et al. [69].

Figure 5.

Frequency distribution of temperature (°C) for two cities in Brazil, São Paulo and Guarulhos; and frequency distribution of precipitation (y-axis in log) (mm day−1) for stations around the Metropolitan Region of São Paulo (RMSP, the grey contoured area). The distributions are for the period 1961 to 1990. The black lines refer to observational data; the purple lines, the 20 km Eta simulations; and the blue lines, the 5 km Eta simulations.

An additional horizontal resolution increase may lead to grid sizes smaller than 10 km, and to scales at which nonhydrostatic motions become important. The advantage of having the switch between hydrostatic and nonhydrostatic modes is clear in long-term integrations, in which resolution is decreased in favor of increasing the integration range. This further increase in horizontal resolution is suitable for helping measure the resilience of a city to changes in the climate. The evaluation of the 5 km Eta in nonhydrostatic mode set up over Southeast Brazil [71, 72] showed the advantage of the higher resolution in reproducing the frequency distributions of temperature and precipitation in metropolitan areas. Figure 5 compares the frequency distributions of temperature and precipitation for stations located in the metropolitan area of São Paulo obtained by Eta runs set up at 5 km and at 20 km resolution. The 5 km distribution approaches the observed distribution more closely than the 20 km one does. At this resolution, the topography and coastline are better described, which favors the representation of the winds and temperature and, consequently, precipitation.
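A comparison of the kind shown in Figure 5 can be sketched as follows; the three series below are synthetic placeholders standing in for station observations and for model output at the nearest grid points, since the actual data are of course not reproduced here.

```python
import numpy as np
import matplotlib.pyplot as plt

# Frequency-distribution comparison in the spirit of Figure 5:
# daily precipitation histograms with a logarithmic frequency axis.
# The three series are synthetic placeholders, not the actual data.

rng = np.random.default_rng(0)
obs = rng.gamma(0.5, 8.0, 10000)      # placeholder station observations
eta5 = rng.gamma(0.5, 7.5, 10000)     # placeholder 5 km Eta output
eta20 = rng.gamma(0.6, 6.0, 10000)    # placeholder 20 km Eta output

bins = np.arange(0.0, 101.0, 5.0)     # mm/day bins
for series, label in [(obs, "observed"), (eta5, "Eta 5 km"), (eta20, "Eta 20 km")]:
    freq, _ = np.histogram(series, bins=bins)
    plt.step(bins[:-1], freq, where="post", label=label)
plt.yscale("log")
plt.xlabel("precipitation (mm day$^{-1}$)")
plt.ylabel("frequency")
plt.legend()
plt.show()
```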

Over southeastern Europe, the model has been used by Kržič et al. [73] and Djurdjević [74]. The investigations made are similar to some of those summarized above for regions of Brazil. The Eta model used, however, differs in having its radiation scheme replaced by that of [75], and in being coupled to the Princeton Ocean Model (POM). One should note that in these references the acronym EBU, for “Eta Belgrade University,” is used for the Eta version employed.

7. Work in progress and plans

Along with various applications, we are pursuing further development of the model in several directions. In view of the constant increase in the power of computing resources, one of these efforts is aimed at improving the performance of the model when run at high horizontal resolutions. Thus, we have been making extensive experiments at 1 km resolution over a domain including the very complex coastal topography of the Brazilian states of Rio de Janeiro and São Paulo. Our plans for addressing the performance at these and higher resolutions include further upgrades of the model’s already upgraded Mellor-Yamada 2.5 turbulence scheme. Our codes include some features of the Mellor-Yamada-Nakanishi-Niino scheme (MYNN, e.g., [76], and references therein), and additional work in this area is planned.

Another direction of our model refinement efforts is motivated by the need to enable improved environmental applications at a time of diverse influences on climate from changes in atmospheric constituents, such as aerosols and greenhouse gases in addition to carbon dioxide. Work on coupling these with the now experimentally implemented RRTMG radiation scheme is in progress.

To enable use of the model in a global setup, work is in progress on implementing the current Eta code mapped on a cubed sphere, following in the footsteps of the earlier work of Rančić et al. [77] and Zhang and Rančić [78]. Once this work becomes sufficiently mature, it should also provide an ideal framework for the use of the Eta as a nested model on one face of the cube, with resolution higher than on the remaining five faces. This should enable the “lateral boundary conditions” of the “nested” model to be defined much more consistently than can be done when nesting a limited area model in an “alien” global model.

Acknowledgments

The work of Sin Chan Chou, Jorge Gomes, and André Lyra reported on here was funded by Grant Nos. 400792/2012-5 and 308035/2013-5 of the National Council for Scientific and Technological Development (CNPq), Brazil. The work of Fedor Mesinger was partially funded by the former of these two grants and by Grant No. F-147 of the Serbian Academy of Sciences and Arts, and that of Katarina Veljovic by the Ministry of Science and Technological Development of the Republic of Serbia under Grant No. 176013.
