
Entropy

Entropy is a measure of disorder. This innocuous-looking definition indicates neither that entropy is connected closely to time nor that the road to an adequate understanding of entropy might sometimes wind its way through an area full of wrong paths and dangerous traps. To show the connection between entropy and time, and to dispel confusion, this entry provides a short tour of the history of the concept of entropy.

The concept of entropy was introduced in thermodynamics, the physical science that studies energy and how it is transformed. In the first half of the 19th century, the technological issue that drove thermodynamics forward was the problem of how to optimize the efficiency of steam engines. Efficiency is the ratio of the net work produced by a cycle of movements in a steam engine to the heat supplied to the engine during such a cycle. The higher the efficiency of a steam engine, the better it converts the supplied heat into work. Which steam engine is the most efficient? In 1824 the French engineer and officer Sadi Carnot solved this problem by inventing an ideal steam engine, now called the Carnot machine. Its workings are completely reversible: A cycle in a Carnot machine can be made to go backward by an infinitesimal change in it, because all forces that cause the cycle are only infinitesimally removed from balance. Someone viewing a motion picture of the working machine cannot decide whether the picture is being shown forward or backward. So it is impossible for the observer to distinguish objectively between different directions of time in Carnot machine cycles.
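In symbols (a standard textbook formulation, added here for concreteness rather than quoted from the entry): if a cycle absorbs heat Q_H from the hot reservoir and delivers net work W, its efficiency is

\eta = \frac{W}{Q_H}, \qquad \eta_{\mathrm{Carnot}} = 1 - \frac{T_C}{T_H},

where T_C and T_H are the absolute temperatures of the cold and hot reservoirs. Carnot's result is that no engine working between the same two temperatures can be more efficient than the reversible Carnot machine.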

This result clearly contradicts our everyday experience: We know that heat is transferred only from a body at a given temperature to a body at a lower temperature. The German physicist Rudolf Clausius generalized this best-confirmed observation and stated the second law of thermodynamics: A transfer of heat from a body at a given temperature to a body at a higher temperature never occurs spontaneously, that is, without any other change in the state of the bodies at the end of the process. If the second law were falsified, it would be possible to construct a Carnot machine that could use heat transferred from the body at lower temperature to the body at higher temperature for the production of work. The latter body would give up and take in the same amount of heat, so that it would be in the same state at the end of the cycle as it was at the beginning. If the temperature of the colder body were kept constant, we would have succeeded in constructing a machine that is, for all practical purposes, a perpetuum mobile: Simply by cooling down its environment, it could completely transform thermal energy, of which there exists a practically unlimited amount in the environment, into work.

In 1865, Clausius introduced the term entropy (from the Greek verb entrepein, “to turn”) to denote a mathematical function of the state of a physical system like a Carnot machine or an irreversibly working steam engine. The difference in entropy between two states of such a system is the integral, taken over a transformation from the first to the second state, of the ratio of the net heat absorbed by the system from its environment to the temperature of the environment. If the transformation of a thermodynamically isolated system (i.e., a system that does not exchange heat with its environment) is reversible, the difference in entropy is equal to zero; if the transformation of such a system is irreversible, the difference in entropy is greater than zero. Clausius could thus rephrase the second law of thermodynamics: In a thermodynamically isolated system, the entropy can never decrease with time. In case of an irreversible transformation, an observer can decide whether some state has occurred earlier or later than another state in the history of a system. One just has to calculate the difference in entropy between these states. After an isolated system has reached its state of maximum entropy, an irreversible transformation of energy cannot occur in it anymore. Clausius applied this result to our universe as an isolated system in which entropy tends to a maximum, too, and he called its final state “heat death of the universe”: a state in which the whole of nature will be in thermodynamic balance.
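Written out as a formula (a standard rendering of the definition just given, not quoted from the entry), the entropy difference between two states 1 and 2 is

\Delta S = S_2 - S_1 = \int_1^2 \frac{\delta Q}{T},

where \delta Q is the heat absorbed by the system during an infinitesimal step of the transformation and T is the temperature at which it is absorbed. For a thermodynamically isolated system, \Delta S = 0 if the transformation is reversible and \Delta S > 0 if it is irreversible, which is Clausius's form of the second law.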

Ludwig Boltzmann, one of the most important physicists of the second half of the 19th century, redefined Clausius's concept of entropy in statistical terms. Without going into the mathematical technicalities, statistical interpretation means roughly that thermodynamic concepts (like that of temperature as degree of heat in a system) are explained from an atomistic point of view (e.g., temperature as a measure of the average kinetic energy of atoms in a system). Entropy then becomes a function of the number of atomic microstates of a system that can realize its present thermodynamic macrostate. Whereas the description of the microstate of a system contains information about the states of all atoms of the system, a description of its macrostate contains just the information that characterizes its overall thermodynamic properties.
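Boltzmann's definition is usually written as (the formula is standard, though not spelled out in the entry)

S = k_B \ln W,

where W is the number of microstates that can realize the given macrostate and k_B is Boltzmann's constant. The more microstates are compatible with a macrostate, the larger its entropy.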

The statistical interpretation of entropy can be illustrated by means of a thought experiment. Imagine that you have a box containing 1 million tiny red marbles and 1 million tiny blue marbles in your hands.

  • At the beginning of your experiment, the red marbles are separated perfectly from the blue ones: All red marbles are in the right half of the box, and all blue marbles are in the left half. If this macrostate of the box is not to be altered, then the only changes in the position of a marble (and, thus, in the microstate of the box) that are allowed are those that let all red marbles stay in the right half of the box and all blue marbles in its left half.
  • Now you shake the box violently. The longer you do so, the more the red and blue marbles will be mixed. The probability that a marble chosen randomly from, for example, the right half of the box is blue will approach the probability that it is red, namely 50%; before the shaking, the probability was 100% for red. The macrostate of an equal distribution of blue and red marbles over the halves of the box can be realized by many more microstates than the first macrostate: Any marble in, again, the right half of the box may interchange its position not only with any other marble in this half but also with any marble of the same color from the left half. The more microstates can realize one and the same macrostate of the system, the more disordered the system is and the higher its entropy (a short simulation sketch after this list illustrates the drift toward the mixed macrostate).
  • The probability that, by shaking the box anew, the first macrostate will reappear is negligibly low. You might have to shake longer than the heat death of the universe will allow you. If you again start your experiment with another macrostate, your shaking will almost certainly transform this initial macrostate once more into increasingly probable macrostates, that is, into macrostates that can be realized by more and more microstates. Yet it is not completely impossible that your shaking will result in a less probable macrostate. The thermodynamic reason why such a decrease of entropy is possible is that the box is not isolated: It absorbs the energy of your shaking.
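The following short Python sketch is an illustration added here (it is not part of the original entry): “shaking” is modeled as swapping randomly chosen marbles between the two halves of the box, with 1,000 marbles per color instead of 1 million so that the run stays fast.

import random

# Illustrative sketch (not from the original entry): "shaking" the box is
# modeled as random swaps of marbles between its two halves.
N = 1_000                 # marbles per color (the entry imagines 1 million)
right = ["red"] * N       # initial macrostate: all red marbles on the right
left = ["blue"] * N       # ... and all blue marbles on the left

def shake(swaps):
    """Exchange randomly chosen marbles between the halves `swaps` times."""
    for _ in range(swaps):
        i, j = random.randrange(N), random.randrange(N)
        right[i], left[j] = left[j], right[i]

total = 0
for extra in (100, 1_000, 10_000):
    shake(extra)
    total += extra
    p_blue = right.count("blue") / N
    print(f"after {total:>6} swaps: P(blue | right half) ≈ {p_blue:.2f}")

In a typical run the printed probability climbs from 0 toward roughly 0.5 and then stays there: once the marbles are mixed, further random swaps almost never restore the original, far less probable macrostate.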

Since the turn of the 20th century, the statistical concept of entropy has been used in many sciences other than thermodynamics to quantify the degree of disorder of a system. The most important of these applications is Claude E. Shannon’s theory of information, which laid the foundation of modern communication engineering.
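For comparison (the formula is added here; the entry only names the connection), Shannon's information entropy of a source that emits symbols with probabilities p_1, ..., p_n is

H = -\sum_{i=1}^{n} p_i \log_2 p_i

bits per symbol. It is largest when all symbols are equally probable, in formal analogy to the thermodynamic entropy being largest for the most thoroughly mixed macrostate.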

The concept of entropy attracted much interest also beyond science. The foremost reason for this is that the second law of thermodynamics not only implies the existence of an objective direction of time in nature but also has a pessimistic consequence for the view that history is a process of unlimited progress. In the long run, the heat death of the universe sets an unsurpassable limit to the transformation of energy into work, even in a thermodynamically open system like the earth, which is currently fed with energy from the sun.

Thomas Pynchon’s short story “Entropy” (1958–1959) is an intriguing literary document of the nonscientific reception of the entropy concept. Pynchon’s narrative puts together, on a few pages, many of its facets: its thermodynamic sense, its statistical definition, its connection to information theory, its cultural interpretation as a natural boundary of human progress, and its psychologically disastrous misunderstanding by individuals who short-circuit their experiences with the general tendency of the universe toward heat death. Pynchon effectively contrasts a party that, in danger of ending in a state of total disorder, is prevented from reaching it by the host’s constant input of energy, with the ordered everyday life of an isolated couple that has resigned itself to considering any event a depressing symptom of the universe’s growing entropy. Thermodynamics cannot tell us which ethics we should choose.

Stefan Artmann

See also Cosmogony; Information; Logical Depth; Maxwell’s Demon; Time, Cosmic; Time, End of; Universe, Contracting or Expanding

Further Readings

Atkins, P. W. (1984). The second law. New York: Scientific American.

Balian, R. (2003). Entropy, a protean concept. In J. Dalibard, B. Duplantier, & V. Rivasseau (Eds.), Poincaré Seminar 2003 (pp. 119–144). Basel, Switzerland: Birkhäuser.

Van Ness, H. C. (1983). Understanding thermodynamics. New York: Dover.

