1. Introduction
Energy management is considered a task of strategic importance in contemporary society. It is well known that the most successful economies on the planet are those that can transform and use large quantities of energy. In this chapter we discuss the role of energy, with specific attention to the processes that happen at the micro and nanoscale, the scales at which modern ICT devices are built and operated.
2. Energy and its transformation
We start our journey toward the role of energy in ICT devices by addressing the most fundamental question: what is energy? According to Richard Feynman, "it is important to realize that in Physics today, we have no knowledge of what energy is".
In the following table we list a few examples of how much power is required to perform some common tasks.
Table 1. Order of magnitude of the power (in watts) required to perform some common activities, spanning from $10^8$ W down to $10^{-21}$ W.
From our everyday experience we know that energy is used for many different tasks and comes in many different forms. We know that it is energy because we can use it to perform work, like moving a car or a train. Thus energy is a property of physical systems that can be used to perform work, and it usually comes stored inside physical objects like a hot gas or a tank of gasoline. Thinking about it, we can ask questions like: how can we make the energy contained in a litre of gasoline push a car forward, or how can we use the heat produced by burning coal to make a train run?
Questions like these were at the very base of the activities performed in the early 1700s by the first inventors of the so-called thermal machines: people like Thomas Newcomen (1664-1729), who built the first practical steam engine for pumping water, and James Watt (1736-1819), who a few decades later proposed an improved version of the same machine. It is thanks to the work of scientists like Sadi Carnot (1796-1832), and subsequently Émile Clapeyron (1799-1864), Rudolf Clausius (1822-1888) and William Thomson (Lord Kelvin) (1824-1907), that studies on the efficiency of these machines, aimed at transforming heat (just a form of energy) into work, brought us the notion of entropy and the laws of thermodynamics.
These laws do not tell us much about what energy is, but they are very good at ruling what we can and what we cannot do with energy. Let's briefly review them.
The first law expresses the conservation of energy: energy can change form, but it can be neither created nor destroyed. It was initially formulated by Julius Robert von Mayer (1814-1878) and subsequently refined by James Prescott Joule (1818-1889) and Hermann Ludwig Ferdinand von Helmholtz (1821-1894). The second law rules how energy can be transformed, and comes in two classical, equivalent formulations.
Clausius formulation: “No process is possible whose sole result is the transfer of heat from a body of lower temperature to a body of higher temperature”.
Kelvin formulation: “No process is possible in which the sole result is the absorption of heat from a reservoir and its complete conversion into work”.
An important consequence of the second law is that there is a limit to the efficiency of a thermal machine. This limit was discovered by Sadi Carnot in 1824, when he was only 28, in the publication entitled "Réflexions sur la puissance motrice du feu" ("Reflections on the Motive Power of Fire"): no machine operating between a hot reservoir at temperature $T_1$ and a cold reservoir at temperature $T_2 < T_1$ can have an efficiency larger than $\eta = 1 - T_2/T_1$.
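As a quick numerical illustration (a sketch of the standard formula above, not code from the original chapter), the following snippet evaluates Carnot's limit for a pair of reservoir temperatures:

```python
# Carnot limit: no engine between a hot and a cold reservoir can beat
# the efficiency 1 - T_cold / T_hot (temperatures in kelvin).

def carnot_efficiency(t_hot: float, t_cold: float) -> float:
    """Maximum efficiency of a thermal machine between two reservoirs."""
    if not (t_hot > t_cold > 0):
        raise ValueError("require t_hot > t_cold > 0 (kelvin)")
    return 1.0 - t_cold / t_hot

# A steam engine with a boiler at ~450 K rejecting heat at ~300 K:
print(f"Carnot limit: {carnot_efficiency(450.0, 300.0):.2%}")  # ~33%
```

No matter how carefully the machine is built, two thirds of the heat drawn from such a boiler must be discarded to the cold reservoir.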
A few years after the work of Carnot, Clausius used this result to introduce a quantity that is useful in describing how much heat can be changed into work during a transformation. He proposed the name "entropy" for this quantity. The idea is the following: if you want to operate a thermal machine you have to find a cyclic transformation during which heat is changed into work. The cycle is necessary because you want to operate the machine continuously and not just once. Clausius proved a theorem stating that during a cyclic transformation, if you perform the transformation carefully enough not to lose any energy in other ways (e.g. through friction), then the sum of the heat exchanged with the environment divided by the temperature at which the exchange occurs is zero:

$$\oint \frac{\delta Q}{T} = 0.$$
Since this integral vanishes for every such cycle, the integral of $\delta Q/T$ between two states does not depend on the path that you take: you can start and end at the same state along any route. This is equivalent to saying that there exists a state function $S$, the entropy, defined by

$$S_B - S_A = \int_A^B \frac{\delta Q}{T}$$

(or, in differential form, $dS = \delta Q/T$). A transformation performed "carefully enough", i.e. slowly and without losses, is also called a reversible transformation.
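To make the state-function property concrete, here is a small numerical check (my own illustration, assuming one mole of a monatomic ideal gas and quasi-static paths): it integrates $\delta Q/T$ along two different paths connecting the same initial and final states and obtains the same value.

```python
# Along a quasi-static path of an ideal gas, dQ = n*Cv*dT + P*dV with
# P = n*R*T/V, hence dQ/T = n*Cv*dT/T + n*R*dV/V. We integrate dQ/T
# along two different paths between the same endpoints.

import numpy as np

R = 8.314          # gas constant, J/(mol K)
CV = 1.5 * R       # molar heat capacity of a monatomic ideal gas
N_MOL = 1.0        # amount of gas, mol

def dS_isochoric(t_a, t_b, steps=100_000):
    """Integral of dQ/T while heating at constant volume."""
    t = np.linspace(t_a, t_b, steps)
    return np.trapz(N_MOL * CV / t, t)

def dS_isothermal(v_a, v_b, t, steps=100_000):
    """Integral of dQ/T along an isothermal expansion (dQ = P dV)."""
    v = np.linspace(v_a, v_b, steps)
    return np.trapz(N_MOL * R / v, v)

T1, T2 = 300.0, 400.0      # kelvin
V1, V2 = 0.010, 0.025      # m^3

# Path A: isochoric heating at V1, then isothermal expansion at T2.
path_a = dS_isochoric(T1, T2) + dS_isothermal(V1, V2, T2)
# Path B: isothermal expansion at T1, then isochoric heating at V2.
path_b = dS_isothermal(V1, V2, T1) + dS_isochoric(T1, T2)

print(path_a, path_b)   # equal within numerical error: S is a state function
```

Both paths give the same entropy change, as the definition of a state function demands.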
For a generic transformation, which may contain irreversible steps, Clausius' result becomes an inequality:

$$S_B - S_A \ge \int_A^B \frac{\delta Q}{T}.$$

In the particular case in which we consider a transformation without any heat exchange, the second term is zero and the final entropy is always larger than the initial one. A typical example is the so-called adiabatic expansion of a gas. If we consider an infinitesimal transformation we have:

$$dS \ge \frac{\delta Q}{T},$$
where the equal sign holds during a reversible transformation only. The previous equation is sometimes considered a concise formulation of the second principle of thermodynamics. If I put a physical system at temperature $T_1$ in contact with a heat reservoir at temperature $T_2 > T_1$, some heat is transferred from the reservoir to the system. Accordingly, the integral is positive and the entropy of the system increases. The reverse phenomenon, in which heat is transferred from the colder system to the hotter reservoir, does not happen (second principle), and thus we conclude that during a spontaneous transformation (i.e. without external work) the entropy always increases. We can make the entropy of our system decrease (as in a refrigerator), but we have to add work from outside.
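A back-of-the-envelope example of the last statements (a standard textbook estimate, with arbitrarily chosen numbers): when heat flows spontaneously from hot to cold, the entropy gained by the cold body exceeds the entropy lost by the hot one.

```python
# Heat dQ leaks from a reservoir at T2 to a system at T1 < T2: the system
# gains dQ/T1, the reservoir loses dQ/T2, and the total change is positive.

dQ = 100.0              # joules transferred
T1, T2 = 300.0, 400.0   # kelvin, with T2 > T1

dS_system    = +dQ / T1   # entropy gained by the colder system
dS_reservoir = -dQ / T2   # entropy lost by the hotter reservoir
dS_total     = dS_system + dS_reservoir

print(f"dS_total = {dS_total:.4f} J/K > 0")   # ~ +0.0833 J/K
```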
Back to the Clausius inequality, it is useful to interpret the quantity $T\,\Delta S$ as the portion of the energy exchanged during the transformation that remains bound in the form of heat and cannot be converted into useful work.
In summary, we can state that the change in entropy is a measure of a system's thermal energy per unit temperature that is unavailable for doing useful work.
3. The microscopic perspective
Notwithstanding the above explanation, available since the late 1800s, entropy generally remained an obscure quantity whose physical sense was (and somehow still is) difficult to grasp. It was the work of Ludwig Boltzmann (1844-1906) that shed some light on the microscopic interpretation of the second law (and thus of entropy) as a consequence of the tendency of a physical system to attain the equilibrium condition, identified with the most probable state among all the possible states it can be in. The ideal world of Boltzmann is made of physical systems constituted by many small parts, represented by small colliding spheres.

Consider a gas of particles enclosed in a container, one wall of which is a movable set attached to a spring. If all the particles have the same velocity and move coherently in the same direction against the movable set, their ordered motion pushes the set.

This will compress the spring up to an extent fixed by the balance between the particles' kinetic energy and the elastic energy stored in the spring.
This is a simple transformation of kinetic energy into potential energy. We can always recover the potential energy stored in the compressed spring and use it to perform useful work.
This example helped us understand how energy and entropy are connected to the microscopic properties of physical systems. In the simple case of an ideal gas, the system energy is nothing else than the sum of the kinetic energies of the single particles. We can say that the energy is associated with "how much" the particles move. On the other hand, we have seen that there is also a "quality" of the motion of the particles that is relevant for the entropy. We can say that the entropy is associated with "the way" the particles move. This concept of "way of moving" was made clear by Boltzmann at the end of the 1800s, when he proposed for the entropy the following definition:
$$S = k_B \ln W,$$

where $k_B$ is the Boltzmann constant and $W$ is the number of microscopic configurations (microstates) of the particles that are compatible with the given macroscopic state of the system.
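The definition lends itself to a compact numerical sketch (my own, using the standard formulas): the snippet below evaluates $S = k_B \ln W$ for $W$ equiprobable microstates and the Gibbs generalization $S = -k_B \sum_i p_i \ln p_i$, checking that the two coincide for a uniform distribution while a more "ordered" (peaked) distribution has lower entropy.

```python
# Boltzmann entropy for equiprobable microstates vs. the Gibbs
# generalization for an arbitrary probability distribution.

import math

K_B = 1.380649e-23  # Boltzmann constant, J/K

def boltzmann_entropy(w: int) -> float:
    """Entropy of a macrostate with w equiprobable microstates."""
    return K_B * math.log(w)

def gibbs_entropy(probs) -> float:
    """Gibbs entropy -k_B * sum p_i ln p_i for a normalized distribution."""
    return -K_B * sum(p * math.log(p) for p in probs if p > 0)

W = 1000
uniform = [1.0 / W] * W
print(boltzmann_entropy(W), gibbs_entropy(uniform))  # equal for uniform p_i

# A peaked ("more ordered") distribution has lower entropy:
peaked = [0.9] + [0.1 / (W - 1)] * (W - 1)
print(gibbs_entropy(peaked) < boltzmann_entropy(W))  # True
```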
We have seen above that during a spontaneous transformation the entropy of the system increases. This can happen without any change in the energy of the system itself.
4. What does irreversible mean?
When we introduced the entropy change, we specified that it is defined in terms of heat transfer once we perform a "slow" transformation.
What happens if we do not go "slow"? Well, as we have seen before, in this case we are performing an irreversible transformation, and the entropy increases by more than the heat exchanged divided by the temperature.
In this regard it is interesting to inspect in more detail the example of the movable set in contact with the gas that we introduced before. When the system represented by the particle gas + the movable set is at equilibrium, the movable set is not only acted on by the collisions of the particles but is also damped by the very same source. To see this effect we can consider two simple cases.
We compress the spring to some extent and then release the compression, leaving the set free to oscillate. After a few oscillations we observe that the oscillation amplitude decreases as a consequence of what we call friction (a viscous damping force) due to the presence of the gas. The decrease ceases when the oscillation amplitude reaches a certain equilibrium value, and after that it remains constant on average (see the following figure). Some energy has been dissipated into heat.

We now start with the movable set at rest and leave it free. After a few seconds we will see that the set starts to move with increasing oscillation amplitude, which soon reaches an equilibrium condition at the very same value (on average) as in the first case (see the following figure).

In both cases the two different roles of damping force and pushing force have been played by the gas. This fact leads one to think that there must be a connection between the process of dissipating energy (a typical irreversible, i.e. non-equilibrium, process) and the process of fluctuating at equilibrium with the gas.
5. A bridge toward non-equilibrium: fluctuation-dissipation relation
In order to unveil such a link we need a more formal description of the dynamics of the movable set. This problem was addressed and solved by Albert Einstein (1879-1955) in his 1905 discussion of Brownian motion, and subsequently by Paul Langevin (1872-1946), who proposed the following equation:

$$m\ddot{x} = -kx - \gamma\dot{x} + \xi(t).$$
As before, $x$ is the position of the movable set, $m$ its mass and $k$ the elastic constant of the spring; $\gamma$ is the coefficient of the viscous damping force and $\xi(t)$ is a random force that accounts for the incessant collisions of the gas particles. The random force has zero mean, $\langle \xi(t) \rangle = 0$, and is uncorrelated at different times:

$$\langle \xi(t)\,\xi(t') \rangle = 2D\,\delta(t - t'),$$
where $\langle \cdot \rangle$ indicates an average over the statistical ensemble.
Now, as we noticed before, since the gas is responsible at the same time for the fluctuating part of the dynamics (i.e. the random force $\xi(t)$) and for the dissipative part (i.e. the viscous damping $\gamma$), the two cannot be independent. Their quantitative connection was first established by Harry Nyquist in 1928:

$$D = \gamma\, k_B T,$$
and represents a formulation of the so-called Fluctuation-Dissipation Theorem (FDT) [1,2]. There exist different formulations of the FDT. As an example, we mention that it can be generalized to account for a different kind of dissipative force, of the internal-friction type, where the damping coefficient becomes frequency dependent; in that case the spectrum of the random force becomes frequency dependent as well.
Why is the FDT important? It is important because it represents an ideal bridge that connects the equilibrium properties of our thermodynamic system (represented by the amplitude and character of the fluctuations) with its non-equilibrium properties (represented here by the dissipative phenomena due to the presence of friction). Thus there are basically two ways of using the FDT: it can be used to predict the characteristics of the fluctuations, i.e. the noise intrinsic to the system, from the known characteristics of its dissipative properties, or it can be used to predict what kind of dissipation we should expect if we know the equilibrium fluctuation properties. Its importance, however, goes beyond this practical utility: it shows how the dissipative properties, meaning the capacity to produce entropy, are intrinsically connected to the equilibrium fluctuations.
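A short simulation makes the bridge tangible. The sketch below (my own, in dimensionless units with $k_B T = 1$; the chapter states the result, not this code) integrates the Langevin equation above with the Euler-Maruyama scheme, using the FDT-consistent noise strength $2\gamma k_B T$, and checks that the movable set settles at the fluctuation amplitudes predicted by equipartition:

```python
# Euler-Maruyama integration of m*x'' = -k*x - gamma*x' + xi(t) with
# <xi(t) xi(t')> = 2*gamma*kBT*delta(t-t'). At equilibrium, equipartition
# predicts <x^2> = kBT/k and <v^2> = kBT/m, whatever the damping gamma.

import numpy as np

rng = np.random.default_rng(0)

m, k, gamma, kBT = 1.0, 1.0, 0.5, 1.0   # dimensionless parameters
dt, n_steps = 1e-3, 500_000

x, v = 0.0, 0.0
sum_x2 = sum_v2 = 0.0
noise_amp = np.sqrt(2.0 * gamma * kBT * dt) / m   # FDT-consistent noise

for _ in range(n_steps):
    v += dt * (-k * x - gamma * v) / m + noise_amp * rng.standard_normal()
    x += dt * v
    sum_x2 += x * x
    sum_v2 += v * v

print(f"<x^2> = {sum_x2 / n_steps:.3f}  (equipartition: kBT/k = {kBT / k:.3f})")
print(f"<v^2> = {sum_v2 / n_steps:.3f}  (equipartition: kBT/m = {kBT / m:.3f})")
```

Changing $\gamma$ alters how fast equilibrium is reached, but not the equilibrium fluctuation amplitudes: this is exactly the fluctuation-dissipation link described above.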
6. Energy transformations for small systems
How does the description of the energy transformation processes presented so far change when we deal with small systems? To answer this question we start by considering an important aspect of the description of physical systems: the condition of being an isolated system, i.e. a system that exchanges neither energy nor matter with its surroundings.
Before answering the question about the energy transformations in small systems, we should be more precise in defining what a small system is.
One possibility is to do what we did just before, when we dealt with the movable set in contact with the gas of N particles. Here N is of the order of Avogadro's number $N_A$. Overall, our system is composed of 3N+1 degrees of freedom (dof): 3 for each of the N particles and 1 for the position coordinate $x$ of the movable set.
To summarize our approach: we have transformed a description based on the 3N+1 dof of an isolated, energy-conserving system into a stochastic description of a single relevant dof, the position $x$ of the movable set, in which the action of the remaining 3N dof is summarized by two effective forces, the viscous damping and the random force. In the fully microscopic picture the total energy is conserved, and nothing is ever truly dissipated: dissipation appears to be an illusion.
On the other hand, we know that if we perform a real experiment with our movable set, we do observe a decrease in the oscillation amplitude of the set until it stops, and then it starts to fluctuate around the equilibrium position. This is not an illusion: the potential energy initially stored in the spring has been dissipated due to the presence of the gas particles. How does this fit with what we just said about the dissipation being an illusion? The answer is that the total energy (the kinetic energy of the gas particles + the potential energy initially stored in the spring) is conserved because the (not small) system is isolated. What happened is that the potential energy of the movable set has been progressively transformed into additional kinetic energy of the N particles, which now have a slightly larger average velocity (the temperature of our gas slightly increased). Thus during the dynamics the energy is transferred from a few dof to many other dof. This is what we called energy dissipation before, and it now appears to be nothing more than energy re-distribution. We have seen earlier that dissipative effects during a transformation are associated with an increase of entropy. Indeed, this energy redistribution process is an aspect of the tendency of the system to reach maximum entropy (while conserving energy). This is what we have called a spontaneous transformation: the increase of the entropy up to the point where no further energy redistribution takes place, i.e. thermal equilibrium.
Is this the end of the story? Actually it is not. There is a quite subtle aspect associated with the conservation of energy, known as the Poincaré recurrence theorem. It states that in a system that conserves energy the dynamics evolves in such a way that, after a sufficiently long time, it returns to a state arbitrarily close to the initial state. The time that we have to wait in order to have this recurrence is called the Poincaré recurrence time. In simple words, this means not only that the dissipation of energy is an illusion, because the energy is simply re-distributed among all the dof, but also that this redistribution is not final (i.e. in the long run the equilibrium does not exist). If we wait long enough we will see that after some time the energy will flow back to its initial distribution, and our movable set will get its potential energy back (with the gas particles becoming slightly colder). This is quite surprising indeed, because it implies that in this way we can reverse the entropy increase typical of processes with friction and thus violate the second principle. Although this may appear to be a paradox, the answer was already included in the description of entropy proposed by Boltzmann, specifically in its intrinsically probabilistic character. The decrease of entropy for a system composed of many dof is not impossible: it is simply extremely improbable. It goes like this: for any finite observation time the dynamical system most probably evolves in a direction where the entropy increases because, according to Boltzmann, this is the most probable evolution. However, if we wait long enough, also the less probable outcome will be realized and the second principle will be violated; for a macroscopic number of dof, though, the recurrence time is so astronomically long that such a violation is never observed in practice.
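The redistribution picture, and the flavor of Poincaré recurrence, can be illustrated with a toy model (sketched below under assumptions of my own choosing): a chain of unit masses coupled by unit springs with fixed walls, all the initial energy placed in a single dof, integrated with the energy-conserving velocity-Verlet scheme. The energy of the first dof quickly spreads over the others while the total energy stays constant; being harmonic and small, the chain is quasi-periodic, so part of the energy keeps flowing back, a finite-size caricature of the recurrence discussed above (genuine thermalization would require nonlinear couplings, as in the Fermi-Pasta-Ulam problem).

```python
# Energy "dissipation" as redistribution: a harmonic chain of N unit
# masses with nearest-neighbour unit springs and fixed walls. All the
# initial energy sits in dof 0; velocity Verlet conserves the total
# energy while dof 0 shares its energy with the other dof, with partial
# revivals typical of a small quasi-periodic system.

import numpy as np

N, dt, n_steps = 16, 0.02, 200_000
x = np.zeros(N)          # displacements
v = np.zeros(N)
v[0] = 1.0               # all energy (0.5) starts in dof 0

def forces(x):
    """Nearest-neighbour spring forces, walls clamped at zero."""
    xp = np.concatenate(([0.0], x, [0.0]))
    return xp[:-2] - 2.0 * xp[1:-1] + xp[2:]

f = forces(x)
e0_trace = []
for step in range(n_steps):
    v += 0.5 * dt * f            # velocity Verlet, half kick
    x += dt * v                  # drift
    f = forces(x)
    v += 0.5 * dt * f            # half kick
    if step % 1000 == 0:
        e0_trace.append(0.5 * v[0] ** 2)   # kinetic energy of dof 0

bonds = np.diff(np.concatenate(([0.0], x, [0.0])))
total_e = 0.5 * np.sum(v**2) + 0.5 * np.sum(bonds**2)
print(f"total energy = {total_e:.4f} (started at 0.5, conserved)")
print(f"dof-0 kinetic energy: first sample {e0_trace[0]:.3f}, "
      f"late-time mean {np.mean(e0_trace[-100:]):.3f}, "
      f"largest revival {max(e0_trace[1:]):.3f}")
```

The late-time average of the dof-0 energy is a small fraction of the total, exactly the "redistribution among many dof" described above, while the occasional revivals hint at why, for very few dof, equilibrium is never final.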
References
1. Nyquist H. (1928). "Thermal Agitation of Electric Charge in Conductors". Physical Review 32: 110-113.
2. Callen H. B., Welton T. A. (1951). "Irreversibility and Generalized Noise". Physical Review 83: 34.
3. Landau L. D., Lifshitz E. M. Physique Statistique. Cours de physique théorique, Vol. 5. Mir.
4. Sethna J. P. (2008). Statistical Mechanics: Entropy, Order Parameters and Complexity. Oxford University Press, Sec. 6.4.
5. Marini Bettolo Marconi U., Puglisi A., Rondoni L., Vulpiani A. (2008). "Fluctuation-Dissipation: Response Theory in Statistical Physics". Physics Reports 461 (4-6): 111-195.
6. Cencini M., Palatella L., Pigolotti S., Vulpiani A. (2007). "Macroscopic equations for the adiabatic piston". Physical Review E 76 (5).
Notes
- “It is important to realize that in Physics today, we have no knowledge of what energy is”. The Feynman Lectures on Physics Volume I, 4-1.
- Feynman's assertion that the notion of energy has never been very clear is attested by the fact that the key publication of Helmholtz, considered the father of the conservation of energy, is entitled "Über die Erhaltung der Kraft", i.e. "On the Conservation of Force", where "Kraft" (literally "strength") stood for what we now call energy. On the other hand, kinetic energy was for a long time called "vis viva", Latin for "living force".
- L. Boltzmann was a strong supporter of the atomistic view of matter, an idea that was not taken for granted (nor very popular) at his time. This did not help the often depressed Boltzmann, who in 1906, during one of his crises, took his own life.
- Here we assume that all the microstates are equiprobable. The extension to the more general case of microstates with different probabilities was proposed by Josiah Willard Gibbs (1839-1903).
- There is a famous experiment performed by Joule that shows that the free expansion of a gas is a process that can happen without exchange of energy.
- Thermodynamics at the small scale is a kind of oxymoron. According to J. P. Sethna, thermodynamics is the theory that emerges from statistical mechanics in the limit of large systems (J. P. Sethna, Statistical Mechanics: Entropy, Order Parameters and Complexity, 6.4, Oxford Univ. Press, 2008).
- Here "slow" means slow compared to the time it takes for the system to relax to equilibrium.
- Nyquist established this relation while he was studying the voltage fluctuations across an electrical resistor. The random electromagnetic force arising across a resistance at finite temperature is a function of the value of the resistance itself.