The Materially Finite Global Economy Metered in a Unified Physical Currency

In this book, The Biosphere, researchers from all regions of the world report findings that explore the origins, evolution, ecosystems and resource utilization patterns of the biosphere. Some describe the complexities and challenges that humanity faces in its efforts to establish a new partnership with nature in places designated as biosphere reserves by UNESCO under its Man and the Biosphere (MAB) Programme. At the dawn of the 21st century humanity is ever more conscious of the adverse consequences it has brought about: global climate change and biodiversity loss. We are at a critical moment of reflection and action to work out a new compact with the biosphere that sustains our own wellbeing and that of our planetary companions. This book is a modest attempt to enrich and enable that special moment in human history.

Most people, academics, and the media do not respond in a way that acknowledges we are talking about the world we share and on which we live. This chapter addresses problems with the concept, explains why it has been overlooked, attempts to show clearly the critical need for awareness, and introduces a new method for monitoring and accounting for what people do to their surroundings, logically and scientifically cumulative from individual actions to global consequences. The method is raw and only newly tested in case studies, but it has the advantage of being objectively testable and removed from the subjective influences imposed by public and private stakeholders in the manipulation of financial accounts. Our modern life is mesmerized by manipulated money, and we need to grasp physical reality directly. This new approach estimates that in 2011, 450 exajoules of fossil-fuel energy were "released" for "consumption" against a context of 26,000 exajoules of "unavailable" coulombic forces.
If an alien visitor observed our planet from space, and saw the marvelous constructs of infrastructure and cities, vast farmlands and other developments, but was unable to zoom in or understand how people and organizations deal with each other in social contracts of language, documents and money, a fair idea could still be deduced of where on Earth was conducive to quality of life, and where was unpleasant because of natural and man-made problems.
This chapter describes a study of why monitoring GDP through financial accounting is increasingly misleading, and what attempts have been made to use non-money approaches to understanding economic activity. It then introduces a radical and raw new approach. The initiative is described and conducted as a first step toward providing a scientific numeraire that indicates how individuals, groups and regions impact their local surroundings and, in total, the biosphere.
The problem of monitoring and managing "Spaceship Earth" (Boulding, 1966) in money terms is becoming ridiculous. Clean-up costs of environmental disasters are additions to GDP. Financial crashes are addressed by financial experts recommending more money: trillions in the US and hundreds of billions in other default-prone nations. In the cacophony of reporting, advice-giving, mounting austerity measures and protests after the 2008 crash, there seems to be no mention that economists have for two centuries fallen back on the obvious truth that money is supposed to represent real goods and services. It is not supposed to be an end in itself.
Primitive peoples built up a view of their world founded on food and shelter for survival, and dealt in transactions of sharing and then barter. Tokens such as simple shells became a wonderful invention for efficient trading. This evolved into money, leading to centrally administered stamped metal, recognizable to the general public and conducive to storage, transportation, and small and large market transactions. Almost as soon as it was invented, money started to distort the world view of some: officially, through wanton increases in the money supply that undermined its power to represent reality, and unofficially through counterfeiting, robbery and every means of cheating possible.

When our world was small: managing the household
The first conscious attempt to monitor and explain what happened with people's activities was by Quesnay, publishing in 1756 an "Economic Table" of transactions between farms and cottage industries in northern France. The term "economy" was then as novel and as strange as "biosphere" seems now to the public. The term derived from the two Greek words for "household" and "manage", and scientists dealing with our "biosphere" would serve themselves and the public well by raising awareness that the biosphere is the home of 7 billion people and requires management.
Right from the outset of economic study three centuries ago, Nature was taken for granted: the soil and water were too obvious to discuss. Attention was concentrated on the goods and commodities produced and the services rendered. Quesnay recorded goods and services in francs and plotted the complex transactions between farm produce and the simple outputs of utensils, tools and cloth from the villages.
The first grand attempt to explain the wellbeing of the world, as it was understood at that time, was set out in "The Wealth of Nations" (Smith, 1776), observing not the whole world, but those parts considered important to Europeans of that era. Smith legitimized the accounting of real goods and services in money terms with the image of two hunters agreeing that if the trouble taken in getting a beaver was equivalent to getting two deer, then in money terms a beaver is worth 2 deer. What did not need to be spelt out was that money is so wonderful because it avoids the unpleasantness of equating one deer with half a beaver. The British Empire expanded all over what we now call the biosphere, and monitored and managed its territories in money terms. The power of money led to exponential developments in resource utilization, financing and leveraging mines and commercial plantations, along with intricate infrastructure in the New World, Africa, and Asia.
Modern students of the biosphere must be aghast that the author of the founding canon on resource utilization (Ricardo, 1817) offered the throwaway line that, of course, "water and virgin land are free" because they are abundant. Yet this was written some 70 years before the Oklahoma Land Rush, dramatic evidence that virgin land was indeed treated as free. The sense of superiority that grew out of the industrial revolution engendered a ubiquitous assumption that still-undeveloped regions of the world were desperate for development. As the mills, factories and foundries of Europe reached the limits of local raw materials and encountered the downsides of pollution, smokestack industries were enthusiastically welcomed in the New World, where there was plenty of coal and other resources, and virgin land where dumping pollution seemed not to matter.
Economic development theory presumed a linear path towards a better life, technology replacing unpleasant labor, and benefits so obvious that they seemed universal goals. The downside of resource depletion always seemed to be covered by substitutes in other places or through different technologies. The downside of polluting air, water and soils could be confidently addressed, usually by shifting the culprit emitters to farther locations where development and new employment were welcome. The cleansing of London of its smog was hailed as a triumph, usually without recognition that the causes had moved to Los Angeles, and now to Beijing and a dozen cities in China.

The materially finite planet
The first checks on the blind belief in technology, and concerns for the broader environment, only appeared in the 1960s with the clear evidence of farm pollutants in "Silent Spring" (Carson, 1962). This was followed by philosophical reflection on the first photos from the Apollo space program of the Earth as a closed, materially finite system in a vacuum, with only energy from the sun and gravity as inputs, set out in the essay "The Economics of the Coming Spaceship Earth" (Boulding, 1966). A heightened anxiety about global resource depletion and pollution was stimulated by "The Limits to Growth" (Meadows et al., 1973).
The new environmental awareness called into question the economists' assertion of "rational man", which presumed that myriad actors in the marketplace purchasing, producing and investing produced a superior outcome to Big Brother central planning. In the context of the Cold War, western countries exalted entrepreneurship and faith in new technologies and leapfrog innovations that would overcome the immediate problems of resource depletion and dirty industries. There was plenty of evidence to support this faith. Thus each call for sober assessment of global resources and the capacity to absorb pollutants was met with various degrees of disdain and ridicule, likening nay-sayers to Chicken Little panicking that "the sky is falling" and rebutting depletion concerns with the observation that "the Stone Age did not end because they ran out of stones". In a widely noted public wager, business optimist Julian Simon challenged conservationist Paul Ehrlich on the rise or fall of metal prices over the decade from 1980. Through new technologies and more exploration driven by demand, the prices fell.
The thrust of the warnings in The Limits to Growth seemed to be weakened by periods of economic boom. At publication in 1973 the world seemed exhausted and choked, and China took no part in the world economy, preoccupied with its debilitating Cultural Revolution. China changed quickly from 1980 to enter and drive a new level of intensity in development. The 1980 US presidential election saw candidate Reagan refuting incumbent President Carter's urging to choose smaller cars for less environmental impact, and the ensuing rout underscored to politicians around the world the danger of preaching conservation and humility as opposed to bravely dominating the environment.

Science and economics as divergent cultures
The sciences and the humanities have been criticized as diverging to the point that communication breaks down (Snow, 1959). The observation was made that in a gathering of people well educated in the arts, mention of the Second Law of Thermodynamics would likely be met with puzzlement. There has also been a bifurcation between science and economics. The early economists emulated Newtonian mechanics in their modeling of market "forces", parabolic trajectories of demand, and business cycles. As the discipline of economics burgeoned, and enjoyed power as it guided and supported the policies of governments, science developed into myriad specialties. The polymaths made way for narrowly focused specialists delivering wonderful new discoveries that were not generally understood by the public or by economists, but that resulted in general progress. Economists impressed by Newton used gravity analogies in their modeling of economic processes, and assessed projections of investment with rates of return precise to two decimal places, even as, in physics, Heisenberg was revealing the uncertainty principle in even the most concrete of experiments.
Economists' forecasts of good news and skewed accounting seem to win over the caution of scientists, and thus assessments of the need to respect ecologies and the diverse biomes, especially those pointing to the material finiteness and interrelatedness of activities on the planet, rarely win endorsement over glamour projects promising more to consumers. In government, economic advisors are more influential than science advisors, and have more room to wriggle in hinting at potential short-term gains. The 2010 Oscar-winning documentary, Inside Job, objectively and correctly highlights how influential economists from Ivy League universities unabashedly endorsed reckless deregulation of Wall Street financial services for their personal remuneration. Economics is used to forecast and to influence future regulations, and is not subject to the rigorous tests of objective repeatability that occur in science.
Accountants, not scientists, tend to lead big business. Early this century Enron's core business was selling natural gas, a product whose inventories and flows should be easy to confirm. With a Harvard-trained financial expert in charge, engineers were belittled for their doggedness and inability to be creative. A culture of ignoring science and presenting financial reports of miraculous results engendered such an aura of respect that simple checking on the natural gas (that is, methane, or CH4, a very simple molecule that cannot be doctored or substituted for increased profits) dared not be suggested, not by prudent employees enjoying promises of bonuses, nor by the accounting firm enjoying high billings, nor even by government regulators. The charade was untenable, and the "market correction" that economists assure us will come into play when needed was severe, leading to a record bankruptcy, layoffs, suicide, and jailings.
We have reached a stage where scientists are unable to articulate convincingly even what seem indisputably important facts, such as the factors influencing climate change and the desecration of primary forests and grasslands. There are rumblings on Capitol Hill in Washington that the EPA is a "job killer" and should be disbanded. In all human history so far, the biosphere we are now trying to understand and protect has served us well and has endured volcanic eruptions, solar storms, and other giant natural disasters. A generation is growing up in polluted cities, blithely accustomed to serious air pollution and undrinkable tap water. The threat of a slow early death seems not nearly as vital as having a job and maximizing short-term enjoyments. As long as there are doubts about what impacts human activity has on our surroundings, and especially while the planet seems big enough to process and refresh even very harmful emissions, scientists will remain the playthings of politicians.

Measuring our world without relying on money
Early civilizations measured wealth in weights of grain and gold. Money soon took over, and Smith's inquiry into the wealth of nations led, 150 years later, to the formal estimation of national income; the concept was so powerful that National Accounts and GDP soon became the dominant check on a nation's wellbeing. Criticisms of GDP thinking were met with tinkering at the edges for "green" GDP, "quality of life" measures and "human development" indices.
The oil crises of the 1970s prompted ventures into using energy as a measure of value, not only directly, but also as the "energy cost" of all goods and services. The aluminium industry lends itself to energy analysis simply because it is so energy intensive. As an example, at the height of energy analysis popularity, one case study showed that 1.9 tons of zinc concentrate required 18.97 gigajoules to be processed into 1 ton of zinc ingot (Middleton & Paterson, 1977). Yet the approach was overdone, and there were many economic activities and commodities where energy input seemed only a minor factor.
One valiant crusader attempting to circumvent the biases caused by money accounting invented the term "emergy", variously described as derived from "energy memory" or "embodied energy" (Odum, 1989). Odum tried to explain economic activity through a language of various energy-related symbols he had created. Given the desire of many well-meaning people to find antidotes to measuring wellbeing in dollars, his quest had a serious following, but it has now largely been discredited.
The giant step to link economics back to science was made by a respected economist, arguing that the economy was modeled as if it were a perpetual motion machine, and specifically that the modeling took no cognizance of the Second Law of Thermodynamics (Georgescu-Roegen, 1971). Published at a time of growing concern about the downsides of economic development, the idea was received with interest and taken up by a new generation of environmentally aware economists.
The biosphere came to prominence with the work of a Georgescu-Roegen protégé, Herman Daly. Daly took the foundational model of macroeconomics, the "circular flow of the economy", and showed that the loops of goods and services in the market ignored Nature as a source of inputs and as a receiver and recycler of high-entropy unwanted outputs to the air, water and soils. Daly's two-part diagram shows the conventional circular flow as a small loop nested in a very large environment, so generous and forgiving as to be taken for granted, as was the case two centuries ago. The second part of the diagram shows the circular flow at its current size, burning billions of tons of coal and other fuels, drawing down reserves of vital inputs, and highlighting the new awareness that, on the output side, the economy is choking in constrained space for emissions and wastes of many kinds (Daly, 1977). But even his tenure as Chief Economist of the Environment Division of the World Bank did little to convince World Bank bankers and pro-growth, pro-consumption economists that economies should not blindly strive for higher GDP. A third wave after Georgescu-Roegen attempted to estimate the value of the world's ecosystems and came to the conclusion that the annual service value of the biosphere was about double world GDP (Costanza et al., 1997). This famous article in the prestigious journal Nature was a milestone in bringing the concept and term "biosphere" to a wider public.
As environmental consciousness gained ground, the analytic tool known variously as Substance Flow Analysis and Material Balance came to be usefully applied. Based on the fact that matter/energy cannot be created or destroyed, the material and energy in major industrial processes could be tracked from in situ reserves through to final product, together with the inventorying of previously ignored or hidden wastes that must exist to make up the total of what went in. It comes as a surprise to many lay people that a lump of coal, solid and heavy, is primarily made up of carbon atoms of atomic mass 12, but on combustion each carbon combines with two atoms of oxygen, each of atomic mass 16, so that the invisible, formerly perceived as benign, carbon dioxide floating in the air has a molecular mass of 44.
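The mass bookkeeping behind that surprise can be checked with a few lines of arithmetic. The sketch below idealizes coal as pure carbon (real coal also contains ash, moisture and other elements, so this slightly overstates the ratio):

```python
# Mass balance for combustion: C + O2 -> CO2 (coal idealized as pure carbon).
M_C = 12.011           # molar mass of carbon, g/mol
M_O = 15.999           # molar mass of oxygen, g/mol
M_CO2 = M_C + 2 * M_O  # ~44 g/mol: each carbon atom picks up two oxygen atoms

coal_tonnes = 1.0
# Output mass scales by the ratio of molar masses.
co2_tonnes = coal_tonnes * M_CO2 / M_C

print(f"{coal_tonnes:.0f} t of carbon burns to {co2_tonnes:.2f} t of CO2")
```

Each tonne of carbon burned thus yields roughly 3.7 tonnes of invisible carbon dioxide, the extra mass coming from atmospheric oxygen.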
An advance in tidying up energy analysis came with the slow uptake of the concept of exergy. In 1956, in Central Europe, the Slovenian Zoran Rant made the simple observation that energy cannot be created or destroyed (First Law of Thermodynamics), so in economic activities we should not say energy is produced or consumed. He coined the term "exergy", meaning useful energy that, in being used, is destroyed. His first step was taken up and promoted by the Pole Jan Szargut, but for decades of the Cold War his publishing, mainly in Polish, went unnoticed in the West. Chinese students, with only narrow options to study abroad, discovered Szargut's work in the Soviet Union and brought it back to China, finding it fitted well with the material inventories used in Chinese accounting under the planned economy (Chen & Chen, 2007). In the West, the Stanford University Global Climate and Energy Project has produced excellent, detailed charts of global exergy flows, accumulation and destruction (Section 8.3 and Figure 10). Szargut's monumental research not only quantified the exergy in fuels but also the "embodied exergy" in anthropogenic products.
6. De-engineering science for a fresh look at the big picture

Purpose-driven science needs to be de- and re-engineered for broader applications
Much of what is now good and useful science was discovered or evolved for a specific purpose. Carnot encountered the phenomenon now called entropy while trying to improve the efficiency of steam engines that lift water. Gibbs borrowed the notion to explain why not all reactions are exothermic. The solving of internal combustion engine "knock" by adding lead to gasoline was hailed as a great advance until the downsides of leaded gasoline were encountered. Military urgency has been a great driver of new technologies, with the invention of the nuclear bomb as an extreme example. Radar would not have had a budget for research and development but for the Battle of Britain. When science has been driven toward a specific purpose, there may be benefits, in subsequent decades as applications broaden, in revisiting the inventions so that they fit better into that broader context.

The energy concept hides more than it reveals
James Joule was a brewer who wanted to improve efficiency by using the new electric motors instead of steam engines in his brewery. From this purpose of comparing two technologies he evolved the concept of the mechanical equivalent of heat, leading to popularization of the term and concept of energy. Joule's later experiments in finessing his idea are amazing; he settled on an equivalence factor between rotating a paddle (mechanical work) and the resulting rise in temperature that agrees with the modern tested value to two decimal places. Yet I dare to suggest that in understanding economies, the global economy, and the biosphere, the concept of energy hides more than it reveals. It certainly confuses many people in positions of power and leads to wrong decisions. First, as mentioned above, energy cannot be produced or consumed. When energy is transformed, it may be in part dissipated, but by definition the energy transformed is equivalent in value to its original state. While all energy is quantified in joules, very different forms of energy are being confused. Economists, business people and politicians use the term "energy" to mean electricity, fossil fuels, and other dynamic forces including wind, rivers, geothermal heat and tides. These forces are different by nature (Urone, 2004).

Forces
The four forces, of totally different characteristics, are:
1. gravity;
2. the electromagnetic force, in magnets, the heat and light of fire, sunshine, electricity, and "chemical energy";
3. the nuclear force, which holds protons and neutrons in a nucleus to define which element an atom is; and
4. the weak force, only encountered in extreme subatomic experiments.
If an intelligent alien was observing the Earth from a distance, and noting the natural and anthropogenic activities in the biosphere, the four forces would be distinguishable. Energy analysis would be an unnecessary complication. Many of the forces are vital to life and to humans, but we cannot change them, and certainly do not pay for them.
i. Start with the Earth's orbiting momentum (giving us seasons) and rotational momentum (for day and night and heat distribution). To be seriously comprehensive in inventorying the energy of the Earth, we need to acknowledge that the Earth's angular momentum in orbiting the sun is 2.66 × 10^40 kg·m²/s and the angular momentum of its rotation on its own axis is 7.074 × 10^33 kg·m²/s. Whatever the Big Bang was that imparted those forces, they have evolved to give us what we have got. The point of stating what many might disdainfully dismiss as superfluous information is that it provides an exercise in acknowledging which energy on Earth we can control or influence (and which is a cost), and which is natural and free.
ii. Gravitational force is big in our imagination because it is so obvious and ubiquitous. It is common in our economic modeling, but only as a model. We do not pay for gravity.
iii. The force that attracts the protons and neutrons in the nuclei of any atom is so powerful that until the atom was "split" to end World War 2, and then for peaceful electricity generation, each element had to be taken as a "given". To understand the biosphere and the economic analysis we might apply to it, it is best to begin by taking atomic structure for granted and putting the nuclear force to one side (i.e. for a simple first step, view any conventional economic processes but not those using nuclear power).
iv. It is the electromagnetic force (emf) which is and can be the agent of intelligent or stupid change in our biosphere. We can light fires and boil water; we can move about, digest, breathe, see and make noises. It is emf that holds an electron in orbital around its proton partner. This is chemistry. This is about fuels, low carbon, what tastes good, and whether you feel well.
A lot of emf is beyond our control, starting with sunlight. The emf in photons distils and lifts on the order of 1.4 trillion tons of water from the sea in each 24-hour rotation to start the water cycle, which is then abetted by the Earth's rotation and the sun's play on air (winds). But in the animals we descended from, and in primitive communities 100,000 years before recorded history, hominids were taste-testing, moving about, and exercising some mastery over their emf. It is anthropogenic emf that we should understand and monitor in the study of economic activity in the biosphere (Coulter, 2002).
There is hopeless confusion in economics from not arriving at this revelation. For example, we do not pay for hydropower itself. The emf from the sun lifts millions of tons of water up to the head of river valleys, and gravity attracts it back to sea level. The economics is in the cost of building and maintaining the hydropower facility, and that can be quantified in physics and chemistry without accounting in money.

A false notion
The father of the scientific method, Francis Bacon, identified four problems with reasoning, of which the most dangerous was the misuse of terms: "names of things which exist, but yet confused and ill-defined, and hastily and irregularly derived from realities" (Bacon, 1620). The notion of "releasing" "chemical energy" from carbohydrate or hydrocarbon fuels, pivotal in any comment on the state of the biosphere, is just such a brain-crippling notion.
As indicated in 6.1 above, it must have been exciting when scientists discovered how to measure the energy "released" when fuel is combusted. It can now be derived from tables, based on previous work, listing the bond energies of compounds. Totalling the values for the bonds in the reactants and in the products and calculating the difference provides an estimate of the energy released. That makes sense to ordinary people, even an economist. Here is the starting total. Here is the final total. The difference is what people call energy (mainly heat). This is a very useful calculation. But there is something strange: the bond energies in the resulting products are greater than in the original reactants. The atoms in the products are locked together more tightly than before, and the increased bonding is equal to the amount of energy released. In specific-purpose-driven science, all that mattered at first was the resulting number for available energy, and it could be calculated accurately. An engineer looks at a ton of coal and says it can produce 24 gigajoules of heat.
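The bookkeeping described above can be sketched numerically for methane combustion. The bond enthalpies below are representative mean values of the kind found in standard tables (an assumption; exact figures vary slightly between sources):

```python
# Heat released by methane combustion, CH4 + 2 O2 -> CO2 + 2 H2O,
# estimated from mean bond enthalpies (kJ/mol, representative table values).
BOND = {"C-H": 413, "O=O": 495, "C=O": 799, "O-H": 463}

reactant_bonds = 4 * BOND["C-H"] + 2 * BOND["O=O"]  # bonds broken
product_bonds = 2 * BOND["C=O"] + 4 * BOND["O-H"]   # bonds formed

released = product_bonds - reactant_bonds  # positive: products bound tighter
print(f"reactant bond total: {reactant_bonds} kJ/mol")
print(f"product bond total:  {product_bonds} kJ/mol")
print(f"heat released: ~{released} kJ/mol")
```

The result, about 808 kJ/mol, is close to the measured heat of combustion of methane (about 802 kJ/mol), and it exhibits exactly the strangeness the text describes: the heat released equals the increase in total bond energy.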
That is very useful in comparing coal with other sources from which energy may be harnessed, including hydropower, wind power, tidal power and direct solar power. But now that environmentalists are forced to look at surroundings and the bigger picture, culminating in the materially finite biosphere, it is relevant to ask where the 24 gigajoules of heat energy in coal come from, especially when the calculations of chemistry seem, surprisingly, the opposite of common sense.

De-engineering energy calculations
Coal and gasoline have complex chemistry, so the energy released from the simplest hydrocarbon, methane, is noted at the molar level and at the level of a single molecule. This is set out in Table 1.
The above data for a mole of methane are readily available from standard textbooks (Atkins & Jones, 2000) and chemistry tables. The data are obtained from calorimetric experiments. In applying chemistry-table data to practical everyday problems, a mole of methane (6.02214 × 10^23 molecules of CH4) ought to be converted to a volume in cubic meters and its layman's name of natural gas. It is the frequent failure to convert that exacerbates the divergence of the two cultures, science and economics. There are conversion tables for common fuels. It is not normally practical to experimentally investigate the bond energies and energy release for one molecule of a fuel, so calculations simply divide by Avogadro's number and convert joules to eV. For one molecule of methane combined with two molecules of oxygen, the bond energies total about 27 eV, and the product carbon dioxide and two water molecules total about 35 eV. The 8 eV of energy, in the form of heat, are "released" because they are not needed. It is pertinent to ask where the released eV of energy came from, and why. This question has not been asked by chemists solely concerned with delivering energy for practical purposes such as heating and driving a car. But from the point of view of understanding the environment, and indeed the biosphere, this apparent mystery ought to be investigated.
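The mole-to-molecule conversion works out as follows, using the measured molar heat of methane combustion (about 802 kJ/mol) and standard physical constants:

```python
# Convert the molar heat of methane combustion to eV per molecule.
AVOGADRO = 6.02214e23     # molecules per mole
J_PER_EV = 1.602177e-19   # joules per electron-volt

delta_h_j_per_mol = 802e3  # measured heat of CH4 combustion, J/mol
ev_per_molecule = delta_h_j_per_mol / AVOGADRO / J_PER_EV

print(f"~{ev_per_molecule:.1f} eV released per methane molecule")
```

The result is about 8.3 eV per molecule, matching the roughly 8 eV gap between the ~27 eV of reactant bonds and ~35 eV of product bonds quoted above.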
The answer seems to be that the conventional chemists' calculations deal only with the bond energies between atoms in a molecule that are traded. If we look at a simple diatomic molecule, taking O2 as an example, the standard test result is 494 kJ per mole required to separate it into separate atoms. Thus the matrix of the coulombic forces between two sets of 8 electron-proton pairs is calculated to be that which requires 5.1 eV to separate. In terms of the Second Law of Thermodynamics, and the "available"-"unavailable" nexus that distinguishes entropy from energy, the release of 5.1 eV (available) must come from a far larger base, which afterwards is totally "unavailable" and in economic terms is waste, useless, or just pollutant. The two atoms each have 8 proton-electron pairs, and it is merely a small residual charge of these that forms the molecular bond. In previous study, before introduction to Bader's Atoms in Molecules approach (Bader, 1990), the oxygen molecule was depicted as a complex set of 16 proton-electron pairs centered on the two nuclei, with the valence electrons of one nucleus far outside the bond length of 0.121 nanometers, overlapping the other nucleus and attracting its protons "like spiders making love", as borrowed from a doctoral dissertation (Coulter, 1993) and shown in Figure 1. The phenomenon of entropy inexorably increasing, as observed when any two solids, liquids or gases that start off with different energies (heat, molecular motion) are brought together, can be characterized as a simple "evening out". Hot and cold even out. High and low pressures even out. The simplest model is the concept of two springs unevenly set, naturally evening out, as shown in Figure 2.

Fig. 2. The core concept of "available" energy evening out (Marshall, 1978).

Appreciation of this phenomenon, and its quantification, is not a trivial point, and has been classed as "one of the most important problems in physical science" (Marshall, 1978). Taking the hydrogen ionization energy as 13.6 eV, and for oxygen and carbon a coulombic force of attraction per proton-electron pair equivalent to 13 eV for the inner 1s shell and 10 eV for the 2s valence shell, the total energy of the atoms in the reactants is 464 eV and, after release of 8 eV, 456 eV in the products. The ratio has evened out from 27.46/464 to 35.86/456, or from 0.059 to 0.079. From this perspective, the bond energies used in calculating energy released are only a small fraction of the total energy (coulombic forces) within an atom, and this factor must be taken into consideration when we look at the energy equations of reactants and products.
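The arithmetic behind those ratios can be reproduced directly. The per-pair values (13.6, 13 and 10 eV) are the chapter's rough working assumptions, not measured coulombic totals:

```python
# Reproduce the rough "evening out" ratio for CH4 + 2 O2 -> CO2 + 2 H2O.
H_PAIR = 13.6    # hydrogen ionization energy, eV
INNER = 13.0     # assumed eV per 1s electron-proton pair (C, O)
VALENCE = 10.0   # assumed eV per valence-shell pair (C, O)

carbon = 2 * INNER + 4 * VALENCE    # 66 eV for the carbon atom
oxygen = 2 * INNER + 6 * VALENCE    # 86 eV per oxygen atom
hydrogens = 4 * H_PAIR              # 54.4 eV for methane's four H atoms

total_reactants = carbon + hydrogens + 4 * oxygen  # ~464 eV (CH4 + 2 O2)
released = 8.4                                     # eV of heat released
total_products = total_reactants - released        # ~456 eV

bonds_reactants, bonds_products = 27.46, 35.86     # eV, from bond tables
print(f"before: {bonds_reactants / total_reactants:.3f}")  # ~0.059
print(f"after:  {bonds_products / total_products:.3f}")    # ~0.079
```

The bond energies are thus under a tenth of the estimated coulombic total, and the ratio shifts upward as available energy is released, which is the "evening out" the text describes.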
The question then focuses on: from what energy is the released energy evening out? It seems fair to conclude that fuels, hydrocarbons and carbohydrates, do have "available" energy drawn from a total amount of coulombic forces within the molecule, and we can estimate the before-and-after ratio of bond energies to that total in the reactants and see how it fares in the products. At the time of the initial study, the evolving field in physical chemistry pioneered as Atoms in Molecules (Bader, 1990) was unfolding in Canada and not known to this project. In the hydrogen atom, the total emf charge existing in the atom in its ground state is the coulombic force between the electron and proton, tabulated as 13.6 eV. It seems that in conventional chemistry it is too difficult, and not interesting, to estimate the aggregate of coulombic forces in more complex atoms and molecules. For the purpose of suggesting an indicative ratio as available energy evens out to become unavailable, a very rough estimate was made for the other two atoms involved in the combustion of methane. For carbon and oxygen, it was assumed that the two electrons in the 1s inner shell were each attracted to their proton partners with an energy equivalent to 13 eV, and the electrons in the next shell were assigned an estimated value of 10 eV.
Subsequently Bader's Atoms in Molecules (AIM) approach became better known, amid much controversy. For the purpose of understanding energy released from fuels in the context of entropy change, from a low ratio naturally shifting to a higher ratio in an exothermic reaction, Bader's AIM proves insightful. Previously chemists resorted to telling the public that the shape and form of atoms and molecules was essentially unknowable, the Schrödinger equation for any atom other than single-proton hydrogen being too complex in its wave-particle orbitals to picture. Bader radically changed that by working from the observable electronic charge density. Bader's AIM can contour molecules to show their shape, as in Fig. 3, which shows a carbon monoxide molecule with the oxygen atom nesting against the carbon atom.
Bader's pioneering work has proven so useful in applications that its main uses now are in determining the shape of complex molecules, allowing two new fields of science to emerge and prosper: materials engineering, where scientists design the crystal structure of compounds, and drug design, where molecules are built to "dock" on anomalous biological molecules in animals and humans and "fix" them, an approach particularly successful against rogue enzymes.

Fig. 3. The radical approach of Bader's book (1990), and of subsequent publications by him and his students, allows 3D solid contoured modeling of atoms in molecules. This is a simple carbon monoxide molecule, showing its "shape" according to electron charge density. The blue shape on the left is the electron charge density of a single oxygen atom when bonded to a single carbon atom (the blue mushroom shape on the right).
The application of simply scaling up small molecules to macroscopically observable objects has not been pursued, but it offers insights: appreciating molecules and atoms in ionic crystal lattices, then zooming out to observable matter and, conceptually, to large volumes, ultimately to representative key chemical reactions in the biosphere.

Fig. 4. The side profile of a cube of sides 1 nanometer, with a conventional space-filling representation of a methane molecule together with a Bader 2D contour model of the electronic density in an oxygen molecule. The methane model is imaginary, but based on the known C-H bond length of 0.109 nanometers the molecule is about the right size to fit into the 1-nanometer-sided cube. The oxygen molecule's outer contour defines 98% of the charge, and Bader lists its dimensions accurately as length 0.42 and width 0.32 nanometers.
In the study (Coulter, 1993) a total of nine theoretical models of atoms in molecules were postulated, scaled up from a single molecule of methane combusting with two oxygen molecules, then a single glucose molecule reacting with six oxygen molecules, through to models of a cubic meter of atoms-in-molecules inventory, up to a cubic kilometer that in one year used 237 tons of carbohydrate and 1000 tons of coal. Six actual case studies of real human activity were conducted, beginning with brick making using grass as fuel in 100 cubic meters of space, up to a village in Henan, China, in 1990, monitoring a population of 1300 people within an imaginary cubic kilometer (100 meters of soil depth and 900 meters of air) consuming 237 tons of carbohydrate and, in their brick kiln and for heating and cooking, 1000 tons of coal. In this simple economic activity, in fact, not much change occurred. The change that did occur was an increase in entropy signified by the change in the ratio of available to unavailable energy, from 1.455228 to 1.455255. The inertness was mainly due to the 1 billion tons of clay in the cubic kilometer that did not change.
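The inertness can be put in perspective with a one-line calculation on the reported ratios: a full year of economic activity moves the ratio by only about two parts in a hundred thousand.

```python
# Scale of the entropy signal in the Henan village case study: a year of
# brick making and daily life shifts the available/unavailable ratio only
# minutely, because most of the cubic kilometer (about a billion tons of
# clay) is chemically inert.

ratio_before = 1.455228   # reported ratio at the start of the year
ratio_after = 1.455255    # reported ratio at the end of the year

relative_change = (ratio_after - ratio_before) / ratio_before
print(f"{relative_change:.1e}")   # 1.9e-05
```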

Fitting cubic kilometers into the biosphere

7.1 The concept of the biosphere
It is hard to come to grips with the shape of the biosphere. It can be likened to an apple skin, and it is hard to imagine our world as we know it, with tall mountains and deep oceans and mines, all taking place between the outer and inner layers of an apple skin. This study took the biosphere to include the highest mountains, the deepest wells and the altitudes at which airliners fly. To be practical, some of the thinning higher atmosphere with its oxygen was excluded. The limits above and below sea level were set at 12 km. On a model of the Earth the size of a basketball, a biosphere 12 km above and below sea level would be thinner than half a millimeter. Moreover, its spherical shape defies physical modeling: made of glass or plastic it would be too fragile and easily broken. There is an extra difficulty in that the biosphere is not a true sphere. But for practical purposes we can take the given average radius of 6371 km and, subtracting the volume of the sphere 12 km below that radius from the sphere 12 km above it, the volume of the biosphere is 12,241,590,428 cubic kilometers.
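The shell volume follows directly from the stated radius and limits; a minimal check:

```python
import math

# Volume of the biosphere modeled as a spherical shell extending 12 km
# above and 12 km below mean sea level, on a mean Earth radius of 6371 km.

R = 6371.0   # km, mean Earth radius
D = 12.0     # km, half-thickness of the biosphere shell

volume = (4.0 / 3.0) * math.pi * ((R + D) ** 3 - (R - D) ** 3)   # km^3
print(f"{volume:.4e}")   # 1.2242e+10
```

The result, roughly 12.24 billion cubic kilometers, agrees with the figure quoted in the text to within rounding.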

The biosphere as a square prism
The pertinent characteristic to model for an understanding of economic development in the biosphere is its material finiteness. To make a model as a square prism, preserving the proportion of depth to surface area, a glass "aquarium" of depth 24 mm would have to extend some 22.5 meters in length and breadth. Thus even without the spherical characteristic of the biosphere, its dimensions are hard to grasp. This difficulty is exacerbated when attempting to track entropy at the level of Bader's atoms in molecules within cubic nanometers, and then scale up to observable processes and economic activity culminating in the global economy. One approach tested was to keep the y axis on a linear scale, from +12 km through sea level to -12 km, but to track the x axis (that is, one side of the "aquarium") on a log10 scale.
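The aquarium dimensions can be derived from the chapter's volume figure; a short sketch:

```python
import math

# The biosphere as a square prism ("aquarium"): fixed depth of 24 km and a
# square footprint holding the same volume as the 12-km spherical shell.

VOLUME_KM3 = 12_241_590_428   # km^3, biosphere volume as given in the text
DEPTH_KM = 24.0               # km, +12 km to -12 km

side_km = math.sqrt(VOLUME_KM3 / DEPTH_KM)
print(round(side_km))         # 22585

# Glass scale model: 24 km of depth becomes 24 mm, a 1e-6 scale factor,
# so the side shrinks from km to meters by the overall factor 1e-3.
model_side_m = side_km * 1e-3
print(round(model_side_m, 1))
```

The 22,585 km side matches the X dimension annotated in Fig. 6, and the scale model comes out at about 22.6 m, consistent with the "22.5 meters" quoted above.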
Against these large numbers, the matters vital to us, biomass (all forests and vegetation, and animals on land, in sea and in air) and fossil fuels, hardly show up on the biosphere profile. It is not reasonable to place them on a log scale, and at a plain linear scale they are difficult to see. Moreover, they are diversely scattered above and below the Earth's surface, and their chemical interactions can only be tracked by selecting representative processes in representative locations. This simple inventory, in cubic kilometers, tons, and kilometers of the side of a cube, is set out in Table 4.
When scaling chemical processes from the molecular level to the macroscopic, as was conducted on an iron ore blast furnace, it was found useful to model the biosphere as a cross-section profile with a log10 scale on both the horizontal and vertical axes. Beginning with the focus on one cubic nanometer at the center, it takes 13 orders of magnitude to reach the altitude or depth of 12 km, 15 orders of magnitude to extend from the seashore through the 6,775 km of land, and 16 orders of magnitude to represent the 15,809 km that is one side of the volume of the oceans. As will be seen below in tracking Fe2O3 through to refined iron, concentrating on the situation of one minimum set of atoms in an iron ore lattice and then lifting the scale ten times, repeatedly, can lead to aspects of reality otherwise unappreciated. Specifically, the phenomenon of emf can be tracked, and why it is the key to economic production costs can be understood.

Fig. 5. A profile of the "aquarium" model of the biosphere, with the horizontal scale in log10 stretching from a seashore over 6,775 km of land (left) and 15,809 km of sea (right). As representative key elements, it is extraordinary, given the diversity of the biosphere and its biodiversity, that land, sea and air roughly equate to solids, liquids and gases, that silicates and iron oxides make up 70% of the Earth's crust, that the sea is 96.5% H2O, and that air is 99% N2 and O2.

Table 4. (Columns: cubic kilometers, tons, side of cube in km.)
In Figure 6, the main units of the side of a cube are shown: a nanometer, centimeter, meter and kilometer, and then, beyond that, extrapolation to wide open spaces up to and beyond the highest mountain (8.848 x 10^12 nanometers) and the deepest sea (1.091 x 10^13 nanometers). This would seem limited in application except for the introduction of the concept of a "nanometer scanner".
Fig. 6. Log10 scale matrix of the profile of the biosphere's volume, from the outer perimeter (Y = 24 km, X = 22,585 km) down to 1 km, 1 meter, 1 cc and 1 nanometer. Note the annotations of km at 10^12, meter at 10^9 and cc at 10^7, in units of nanometers.
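The orders of magnitude quoted above can be verified by converting the landmark distances to nanometers (1 km = 10^12 nm); a quick sketch:

```python
import math

# Orders of magnitude on the nanometer-based log10 profile of the biosphere.
# Distances are the landmarks cited in the text, converted via 1 km = 1e12 nm.

NM_PER_KM = 1e12

landmarks_km = {
    "highest mountain": 8.848,
    "deepest sea": 10.91,
    "biosphere half-thickness": 12.0,
    "land profile side": 6775.0,
    "ocean profile side": 15809.0,
}

for name, km in landmarks_km.items():
    nm = km * NM_PER_KM
    order = math.floor(math.log10(nm))   # whole orders of magnitude from 1 nm
    print(f"{name}: {nm:.3e} nm, {order} orders of magnitude")
```

The 12-km limit lands at 13 orders, the land profile at 15 and the ocean profile at 16, as stated in the text.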

The nanometer scanner
In modern computer war games, the player looks ahead at objects in focus with peripheral objects unclear. Should something to the side attract interest and the player turn toward it, the item of interest becomes clear and can be zoomed in on. This concept similarly allows analysts to study one key cubic nanometer of matter and energy in detail and then move out to broader resolutions, retaining some intimacy with what is going on in fine detail and able to stop and check wherever needed. Figure 7 shows a screen spanning 3 orders of magnitude. Should attention shift to one side, one of the 10-nanometer squares would enlarge to show 1 nanometer.
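A minimal sketch of the scanner's zoom levels, where each outward step widens the field of view by one order of magnitude; the function name is illustrative, not from the study:

```python
# Conceptual sketch of the "nanometer scanner": analysis starts at one cubic
# nanometer (level 0) and steps outward one order of magnitude at a time,
# so any level can be zoomed back into for fine detail.

def field_of_view_nm(zoom_steps):
    """Side length, in nanometers, of the cube viewed after n outward steps."""
    return 10 ** zoom_steps

# About 13 outward steps reach the 12-km half-thickness of the biosphere
# (1 km = 1e12 nm), consistent with the log10 profile of Fig. 6.
print(field_of_view_nm(0), field_of_view_nm(13))
```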

Iron refining case study

8.1 Iron ore at the molecular level
Several millennia ago, it is supposed, primitive people discovered that a certain rock beside a hot fire oozed metal which was useful for tools. The basic technical process of iron refining (requiring oxygen forced onto the ore to heat it, now known to reach a temperature of 1600 °C) is at least a thousand years old. With the advent of modern techniques in chemistry, the lattice crystal of Fe2O3 is now known and can be represented accurately. If Bader's Atoms in Molecules theory (Bader, 1990) can be applied, the shape of the atoms in the lattice can be plotted with a high degree of accuracy, bringing intelligent understanding to the bonding of atoms and what it takes to separate them. What previously was just blind, fierce heating, causing atoms to oscillate wildly until they were released from the bond, can now be reconsidered as a challenge to accurately target atoms with a mild head-on jolt that causes the iron to release from the oxygen. This is analogous to intelligently potting billiard balls with a cue, as opposed to violently and blindly shaking the table in the hope that the balls may fall in. It would be a major advance in industrial refining; insight into the possibilities provoked a senior iron industry executive to exclaim, "That would be the holy grail!" (Coulter, 2007).

A schema of a set of iron and oxygen atoms nesting in an iron ore face can be presented (Pivovarov, 1997) and shown to scale within a 1-nanometer-sided cube. The object is to dislodge the oxygen atoms so that the iron is refined. In current technologies, carbon atoms from coke or calcium carbonate in the blast furnace process hit the oxygen atoms at random angles and, given the intensity of action due to the fierce heat, dislodge them, bonding with a much higher bond strength than that which previously existed between the iron and oxygen atoms. The oxygen atom is "married off" and carried away, and eventually only iron atoms are left, as molten pig iron.
This is shown in Figure 8.

Fig. 8. Iron atoms (gray) are bonded to oxygen atoms (white) in the iron ore Fe2O3. A carbon atom (dark gray) strikes an oxygen atom (white) in the upper diagram and, because of its high speed (and momentum), dislodges the oxygen atom, bonding it as a carbon monoxide molecule with a bond energy more than double that of Fe-O. The iron ore matrix is modeled after Pivovarov, 1997. The perimeter square denotes sides of 1 nanometer, and the atom diameters are to scale. Bader's theory is set to improve the details of understanding and lead to more efficient technologies.

Entropy of a blast furnace
The iron industry in China provides perhaps the best single example of how global industry functions and can be quantified in science without having to resort to accounting in money.
In 2010 China produced about 600 million tons of crude steel, about 40% of global output. China's blast furnaces range from very large to small county-level technologies, and several typical medium-sized operations were studied. In an application of Substance Flow Analysis to the refining of iron in a blast furnace in North China, a survey showed that the matter input, including 1.75 tons of iron ore, 0.75 tons of coke and 0.10 tons of methane gas, totaled 4.99 tons, and that through mass-balance accounting all the outputs were recognized. Producing 1 ton of pig iron also produces 750 kg of carbon dioxide. The entropy increased from 17,879,562 units to 18,365,986 units (2.7%). Entropy calculations were based on molar entropy tables (Atkins & Jones, 2002).
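The reported totals allow the 2.7% figure to be checked, and the structure of the calculation (moles times molar entropy, summed over species) can be sketched; the species-level data themselves are in Table 5.

```python
# Entropy bookkeeping for the blast furnace Substance Flow Analysis.
# Total entropy on each side of the mass balance is the sum over species of
# (moles x molar entropy); only the chapter's reported totals are used here
# to check the stated increase.

def total_entropy(species):
    """Sum of moles * molar entropy over (moles, molar_entropy) pairs."""
    return sum(n * s for n, s in species)

entropy_in = 17_879_562    # reported total entropy units, all inputs
entropy_out = 18_365_986   # reported total entropy units, all outputs

increase = (entropy_out - entropy_in) / entropy_in
print(f"{increase:.1%}")   # 2.7%
```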
A table of tonnages and another of moles and molar entropy (together in Table 5) were produced, and a volumetric schema of inputs and outputs was also deduced (Figure 9).
Table 5. The Substance Flow Analysis of an iron refining process, showing that to refine 1 ton of pig iron requires a mass balance in and out of 4.99 tons of matter, and that the entropy increased from 17,879,562 units to 18,365,986.

Fig. 9. A "black box" approach to measuring entropy into and out of a blast furnace.
Assumptions: all data is for 1 tonne of iron produced in 1 minute. Matter is measured in moles by dividing weight by molecular weight. Total entropy means molar entropy (in J/(K mol)) times the number of moles.
Blast furnace chemistry is complex and varies; it requires about 13 steps involving coke and limestone and a designated range of temperatures. Economics focuses first on the iron and steel output for sale, and secondly on the costs of the inputs. With global output around 1.5 billion tons in 2011, in a closed, materially finite biosphere, it is imperative to acknowledge that, down at the basic level of a cubic nanometer, while a few atoms of iron are being separated into a nearly pure iron lattice, the release of electromagnetic forces out of fossil fuel occurs by shifting the ratio of bond energies to total atomic charges to a higher number, making those charges unavailable for future use. For one methane molecule, that ratio was estimated to change from 0.059 to 0.079. At the macroscopic level of refining one ton of pig iron, the same phenomenon, measured in an entirely different way, results in an estimated entropy increase of 2.7%. For an entire village making bricks over one year, the ratio of bond energies in molecules to their total base charge was calculated to increase from 1.455228 to 1.455255.

Global extrapolation
Whether at the scale of a nanometer or a kilometer, human manipulation of emf results in an increasing ratio of unavailable coulombic forces. Entropy increases. At the global scale, problems have so far been localized. The Earth is still a big place for human activity, and defining the 24-km skin as the biosphere, we have almost 5 billion cubic kilometers of solid "land" (silicates and other rocks), of which we have barely, literally, "scratched the surface". Underground are 200 zettajoules of methane clathrate (Hermann, 2006), formed during the accretion of the planet, that have not been tapped except at one small facility in Russia.
The concept of the biosphere adds both a vertical dimension of space and a characteristic of material finiteness to what has previously been conceived of as a "flat earth", even by the most modern economists, strategic analysts and world leaders. From primitive hunters, through the peasants of Quesnay's pre-industrial-revolution France, right up to modern national planners in various countries, there is a tendency to conceive of "land use" in two-dimensional maps. The maps on the walls of the world's great planners, in business, diplomacy and the military, are usually flat Mercator projections. Trade requires transport, and the camels of the Silk Road and the galleons of Europe's western seaboard were the media of world trade for many centuries; no one thought of anything but moving across the surface of the Earth. There was a gradual cognizance of the wealth in fertile soils, then water wells, and finally mines and oil wells. But the further from the surface, the more difficult economic exploitation became, and recent troubles in deep-sea oil wells demonstrate the limits of current technologies. The vertical dimension above the surface was not considered until air travel, and then air pollution, became imprinted on our consciousness.
What we thought we already knew, and was so obvious as to be ignored, now requires a studious and inquisitive revisit in the world of the biosphere. The points below may seem puerile and obvious, but there is a purpose in enunciating them:
i. On Earth, 2D space is a trap for incoming sunlight, rain (if any) and air. This is obvious for farmland, and two-storied farm fields would be a problem. The trap feature is something of value.
ii. Land, sea and air have radically different viscosities, porosities and visible transparencies. People live, and most economic processes are centered, on the land surface; even air travel, fishing and mining activities are centered on the land surface for their economic transactions.
iii. Land, namely rock, is the antithesis of fluids. It is "solid as a rock". Only when weathered and crumbled and mixed with organic matter is it soil. We don't like quicksand.
iv. Water flows. On land it flows attracted by gravity. In the sea it flows driven by temperature and the spinning of the Earth, and manifests the moon's gravity in tides.
v. Air is invisible, and has a density of only about 1.2 kg per cubic meter. Only under significant pressure changes driven by temperature (weather) can we feel the wind, and occasionally very strong winds. It constantly delivers us oxygen, carries emissions, including our own, away, and delivers carbon dioxide to plants.
In the era since 2008, when hundreds of billions of dollars can be conjured, not even by printing bills but by a click on a computer screen, diluting what participants in markets think goods and services are worth, it is a reasonable check on reality to survey the physical world using science. Until 1993, China published National Accounts in tonnages of all major commodities. This provides a perspective from which exergy produced and consumed (destroyed by being used up) can be calculated. While financial National Accounts boast a GDP grossing all goods and services produced without acknowledging the source from which they were drawn, the exergy accounts for China in 1993 show that the per capita output was 9.88 gigajoules, but that the per capita input of exergy (used up) was 53.3 gigajoules (Chen and Chen, 2006).
One contribution to bridging the gap between surreal financial reporting and the physical world is the emerging discipline of environmental informatics. In the past several decades, new technologies have produced advanced sensors to monitor a vast range of physical conditions, from simple temperature and moisture to many pollutants in air, water and soil, and, through wireless technologies and powerful computing, to integrate the information into a real-time picture of a farm, river, lake, city's air quality, or a whole region. In some fields this approach has gone global.
Environmental informatics can be applied to the refining of iron, from an iron ore molecule to the global emission situation. Reconciling the simple facts with Figure 6 is challenging and insightful. If the central cubic nanometer is solid iron ore crystal at the face, then, moving to the next cell and zooming out to a space of 1000 cubic nanometers in a blast furnace, with high-speed gas, mainly carbon atoms, hitting the crystal face and ricocheting off with an oxygen atom, we can conduct an imaginary track through centimeters and meters to hundreds of meters away from the chimney. It is theoretically plausible to zoom out to the biosphere and observe CO in the atmosphere, as shown in Figure 10. While it would be too much to track each cubic nanometer, should any challenge be made to the veracity of the observation, links in the chain of tracking can be traced, zoomed in on and discussed.
Scientists may feel uncomfortable at this brash shorthand, but it has more objectivity than many assumptions made by economists. The bare truth is that microeconomics does not add up to macroeconomics, even in theory. In microeconomics, individuals and enterprises conduct their activities based on their market perceptions, and markets may be sensitive enough to correct in hours, or may take decades. To some extent, some wars are market corrections, adjusting for conflict over resources, often land or oil, or even just water. There is no logical connection by which grassroots microeconomics can be aggregated into a nation's GDP; how around 200 GDPs are added to estimate global GDP is a work of rough fiction, and gives little insight into wellbeing or into the positive and negative contributions to the state of our planet.
When energy accountants conclude that the global burning of fossil fuels (liquids, natural gas and coal) in 2011 was 450 exajoules, two observations can be made on the basis of this chapter. First, the aggregate can be checked and verified as a reasonable number, and can be disaggregated and argued about, with discrepancies identified and either corrected or explained. The disaggregation can zoom down to where the number of atoms must balance, at least in theory and approximately in practice. This exercise is much more a scientific method than estimates of global GDP, and more meaningful than global accounts of energy usage in money terms. The second observation is that that energy was not just "released" from nowhere in order to drive generators and transport and blast furnaces and heating. It is a numerator begging us to identify its denominator. In the case of methane, as a representative fossil fuel, the 8 eV of energy released came from a denominator of 464 eV, so that the ratio increased from 0.059 to 0.079. In the case of burning 100 kg of methane to refine a ton of iron, using a different approach to calculation, entropy increased by 2.7%. A different approach again was taken in looking at a village making bricks over a year, where change was measured against the backdrop of a whole cubic kilometer of the biosphere.
Whatever the methodology, some attempt should be made to quantify the increase in entropy at the global scale, thus providing some guidance on how to alleviate degradation of the biosphere. Since the 450 exajoules of fossil fuels consumed in 2011 consist of liquids, gas and coal, the ratios at the molecular level would be one useful indicator. Despite the complexities of liquid fossil fuels and of lignite and anthracite coals, their carbon content is all in the vicinity of 70-90%; more exact calculations can be made, but the ratio is roughly that of the representative methane combustion model. In rough terms, total energy from fuels annually is 0.5 zettajoules "released" from 26 zettajoules locked up in the coulombic forces of proton-electron pairs in carbon, hydrogen and oxygen atoms.

Fig. 10. Composite satellite image of the world showing heavy CO from China drifting across Korea and Japan, and even to the US. The very dark pixels over Tibet are simply an absence of readings, because the data is taken at an altitude equivalent to an air pressure of 850 hPa.
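The 26-zettajoule denominator can be reproduced by scaling the representative methane model up to the global release; a sketch under that assumption:

```python
# Global "denominator" consistency check: if fossil fuels behave roughly like
# the representative methane model (8 eV released out of a 464 eV coulombic
# base), then releasing 450 exajoules in 2011 implies the base energy below.
# This is the chapter's scaling assumption, not an independent measurement.

released_zj = 0.45               # 450 exajoules, expressed in zettajoules
ev_released, ev_total = 8.0, 464.0

base_zj = released_zj * (ev_total / ev_released)
print(round(base_zj))            # 26
```

So the "released" 0.5 zettajoules and the 26-zettajoule base are two ends of the same ratio, about 0.017, at vastly different scales.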
In this frame of mind, it is insightful to review the global exergy chart published by the Stanford University Global Climate and Energy Project (Exergy Flow Chart, 2009). Dispelling the confusion around the term "energy", exergy can be rigorously quantified as 1) stored, 2) flowing, or 3) destroyed (used up and "unavailable" for further use). On the chart, the storage and flow of fossil fuels can be summarized as in Table 6.
The Stanford University Global Climate and Energy Project exergy chart extends over two A3 pages and has extremely well researched detail integrated into a comprehensive system covering the entire biosphere. It begins with an input flow of 162 x 10^15 joules per second. Its figures (2008) can be related to the 450 exajoules of energy "released" in 2011. For insight into the impact on the biosphere, we should present the result as energy "released" (or exergy flow) against the context of 26 zettajoules in carbon, hydrogen and oxygen atoms now locked up and "unavailable".

Anthropogenic electromagnetic forces
Of the four kinds of forces (section 6.3), the nuclear force holding protons and neutrons together in a nucleus to give an element its identity is not of relevance: we cannot economically turn lead (82 protons) into gold (79 protons). The last 60 years of development in harnessing the nuclear force for generating electricity are an added complexity and are not discussed in this initial study. The force of gravity is ubiquitous and, though it plays a crucial part in economic processes, is not a cost and cannot be paid for. If we wish to work against gravity, we must harness electromagnetic forces to change the status, such as lifting using electricity or gasoline engines. The fourth force recognized in physics (the weak force, encountered only in particle physics) is not relevant to this chapter.
Thus it becomes clear how confusing are the pronouncements on the topic of "energy" by economists, business people, public administrators up to and including world leaders, and, yes, some scientists. The concept of exergy is far more precise and, apart from the nuclear power industry, relates entirely to electromagnetic forces. Moreover, to address economic management issues, natural exergy flows that are truly free should not be included in costs. Hydropower, wind turbines and tidal power generators all have costs in the manufacture of facilities and in servicing and parts maintenance, but not in the flow of natural exergy.
The Szargut approach (Szargut, 1988) to exergy analysis introduces with clarity the concept of "embodied exergy" in matter, which has as its starting point a reference environment.

The question is posed: how much exergy is required to transform matter from an initial condition to a subsequent condition? As an example, iron ore lying in situ is assigned a value of 13.54 kJ/mol, possibly the exergy cost of realizing it is there. Transformed into pure iron, the exergy cost embodied in the final product is 374.8 kJ/mol (Ranz-Villarino, Valero & Cebellero, 1998, Table 1).
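For a macroscopic feel, the molar figure can be converted to per-ton terms; the molar mass of iron (55.85 g/mol) is a standard value not given in the chapter.

```python
# Converting Szargut-style embodied exergy from molar to per-ton terms.
# The 13.54 and 374.8 kJ/mol figures are from the cited table; the molar
# mass of iron is a standard value.

EXERGY_ORE_KJ_MOL = 13.54    # iron ore in situ
EXERGY_IRON_KJ_MOL = 374.8   # embodied in refined iron
MOLAR_MASS_FE = 55.85        # g/mol

# kJ/mol divided by g/mol gives kJ/g, which equals GJ per metric ton.
gj_per_ton = EXERGY_IRON_KJ_MOL / MOLAR_MASS_FE
print(round(gj_per_ton, 1))  # 6.7
```

On this basis, each ton of refined iron embodies roughly 6.7 GJ of exergy, some 28 times the in-situ value of the ore.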

Useful trends
The trend of economic and financial analysis has not resulted in the market correction needed. Analysts do not know where to look for answers, and therefore recommend remedies still stuck in the old language of solving global economic and environmental problems with more money: stimulus packages, quantitative easing and bailouts. There is an imagined linear trend of improved livelihood, and any recession is treated as unexpected and temporary, at worst a hint of a double dip before recovery takes hold again. This approach is destined for catastrophic correction.
The real and actual trend occurring in the biosphere is an inexorable increase in entropy. Put another way, humans are destroying exergy at a rate much higher than the flows we can harness. Realization of this trend immediately directs thinking toward exploring how the flows and stores we do not yet have the technology to harness might be tapped. Clearly they are not easy, but some of them have not even been considered.
The term "energy" confuses analysis of how to manage our biosphere. We have amounts of energy from the sun, from the orbiting, spinning Earth, and from its core that are unimaginably large. We need to study, first, what is useful energy, and second, how much of that is free (e.g. wind) and what we have to do to harness that which is not free. In short, economic cost can be linked to anthropogenic exergy, as is addressed by the Stanford University Global Climate and Energy Project. The trend exposed in dollar terms, that the value of ecological services per year is about double global GDP (Costanza et al., 1997), can be explored in terms of science without relying on money, which is a refreshing relief when governments are severely watering down currencies. In this chapter, case studies were described that began at the scale of the simplest hydrocarbon molecule (methane) in a cubic nanometer and scaled up through various studies to address global fuel exergy flow and destruction. The key trend that needs to be explored and developed into strong science is that anthropogenic exergy, or what we popularly think of as energy we can harness, must come from somewhere, and after it is used (destroyed) it leaves a base of unusable energy. In lay terms, coulombic bonds in some molecules easily release "energy" (they can explode, or at least react spontaneously, even if slowly, as in the sugars we eat). The resultant product molecules are more "locked up", with, for example, the coulombic bond between a single oxygen and carbon atom roughly twice as strong as that between two oxygen atoms. That is why CO is toxic and O2 is the breath of life: there is no way we can breathe CO hoping the oxygen atom will split off and match up with another oxygen atom. What we perceive conventionally as usable energy has been addressed in this chapter as a numerator that has a scientifically calculable denominator.
All our actions drive the ratio of the pair to a higher number (entropy inexorably increases). The ratio is presented as an estimate or at least intimated as a number yet to be understood in Table 7.
Table 7. A summary of seven scales of exergy destruction (columns: case study, unit of measure), starting from the combustion of one molecule of the simple fuel methane, up to the global fossil fuels estimated to be "consumed" in 2011. At different scales different units are used, and different methods of quantification are applied, but the universal point is that there is exergy destruction (mainly heat released) as a numerator over a denominator which in the past has not been acknowledged or even conceptualized. The denominator for global fossil fuels is simply an estimate commensurate with computations at lower scales.

Acknowledgement
The direction of Professor Shi Lei, Environmental Science and Engineering Department, Tsinghua University, especially in iron and aluminium Substance Flow Analysis has been an inspiration in linking chemistry at the molecular level to macroscales.