A short history of energy and entropy:
Aristotle coined the word “energy” (Greek energeia, roughly “at work”). Two millennia later, a scientific understanding of “at work” came into being in the study of thermodynamics. Along the way the word “energy” got buffeted about, until its scientific meaning today is, in many ways, the opposite of its colloquial meaning.
[Table comparing the COLLOQUIAL USE and the SCIENCE USE of “energy”; most of its cells have not survived. The surviving entries describe one use as “more like negative entropy” and the other as “related to mass by logical equivalence.”]
That thing we call energy in day-to-day usage is not the energy of science. Not even close!
Consider claims like this one: “…there is a sea of quantum energy between us and extending from the body,” one which “moves faster than the speed of light.” Or this one, about Therapeutic Touch (TT): TT practitioners “can palpably sense an energy field that extends some 10 cm beyond the surface of the skin. Treatment consists of manually smoothing the field.”
This “energy” has no relationship to modern science beyond a little plagiarism of language. It retains the air of importance we hear in Aristotle’s meaning, but even Aristotle’s primitive understanding got lost.
Science is what results when human beings engage “perceptions” capable of “seeing” in ways that have become effective only in the past few centuries. “Quantum,” “field,” “energy,” and the surprising place of the speed of light in modern physics are insights of these new perceptions. The uses of those words in the purple-bathed prose above belong to pre-scientific insight, conveying meaning that would hardly tax the understanding of Aristotle’s contemporaries. Their modern meaning, however, requires a little hard work, a bit of persistent puzzle solving.
How does the living organism avoid decay? The obvious answer is: By eating, drinking, breathing, and (in the case of plants) assimilating. The technical term is metabolism. The Greek word (metaballein) means change or exchange. Exchange of what? Originally the underlying idea is, no doubt, exchange of material. (E.g. the German word for metabolism is Stoffwechsel.) That the exchange of material should be the essential thing is absurd. Any atom of nitrogen, oxygen, sulfur, etc., is as good as any other of its kind; what could be gained by exchanging them? For a while in the past our curiosity was silenced by being told that we feed upon energy. In some very advanced country (I don't remember whether it was Germany or the U.S.A. or both) you could find menu cards in restaurants indicating, in addition to the price, the energy content of every dish. Needless to say, this is just as absurd. For an adult organism the energy content is as stationary as the material content. Since, surely, any calorie is worth as much as any other calorie, one cannot see how a mere exchange could help.

What then is that precious something contained in our food which keeps us from death? That is easily answered. Every process, event, happening—call it what you will: in a word, everything that is going on in Nature means an increase in entropy of the part of the world where it is going on. Thus a living organism continually increases its entropy—or as you might say, produces positive entropy—and thus tends to approach the dangerous state of maximum entropy, which is death. It can only keep aloof from it, i.e. alive, by continually drawing from its environment negative entropy—which is something very positive as we shall immediately see. What an organism feeds upon is negative entropy. Or, to put it less paradoxically, the essential thing in metabolism is that the organism succeeds in freeing itself from all the entropy it cannot help but produce while alive.

(Erwin Schrödinger, What is Life?, 1944)
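One compact way to restate the point (a modern sketch, not Schrödinger's own notation; the symbols sigma and Phi are chosen here just for this illustration) is an entropy balance for the organism:

    \frac{dS}{dt} = \sigma + \Phi, \qquad \sigma \ge 0

Here sigma is the entropy the organism inevitably produces internally, and Phi is the entropy carried across its boundary by food, waste, and heat. Staying in a steady, living state (dS/dt near zero) requires Phi to be about negative sigma: the organism must export entropy, that is, import "negative entropy," at least as fast as it produces it.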
James Watson says that Schrödinger's What is Life? inspired him to start the quest that led to his Nobel Prize. Perhaps others might start a similar quest, one that leads them to "seeing" the deep absurdities in the purpled prose above. (It would be hard to invent sillier phrases masquerading as spawn of modern science.) The deep truths, the subtle insights, that Schrödinger directs our gaze toward reveal much about how we, as intelligent living organisms, relate to the world around us.
So drop your lottery tickets, and discover what the statistical world is really trying to tell us...

STATISTICAL MECHANICS & THERMODYNAMICS

Puzzles #6 and #19 give a sense of the predictability buried in randomness. Historically, the second law of thermodynamics came first: Sadi Carnot laid its foundations in 1824, and Rudolf Clausius gave it its modern form around 1850, coining the word "entropy" in 1865. The first law, the conservation of energy, took shape in the 1840s through the work of Mayer, Joule, and Helmholtz. The second law says entropy cannot spontaneously decrease: the predictability of things (like the transfer of energy or of momentum) can only get worse without outside influence. The first law says that the quantity of a very abstract entity, the energy of science, stays exactly the same no matter what happens, as long as everything affected is taken into account.
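In standard textbook notation (not symbols used elsewhere on this page), for a system that absorbs heat Q from its surroundings and does work W on them:

    \Delta U = Q - W                       % first law: energy is exactly conserved
    \Delta S_{\mathrm{isolated}} \ge 0     % second law: the entropy of an isolated system cannot decrease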
If things are at different temperatures, energy can flow in a somewhat predictable direction (from higher to lower temperature), and that predictability is what makes some of the thermal energy available for doing work. (This available energy is called free energy because it is free to do work.)
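Carnot's classic result makes this quantitative (standard notation, stated here for reference): heat Q_h drawn from a hot reservoir at absolute temperature T_h, with waste heat rejected to a cold reservoir at T_c, can yield at most

    W_{\max} = Q_h \left( 1 - \frac{T_c}{T_h} \right)

so as the two temperatures approach each other, the work available from the flow shrinks toward zero.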
Gamblers become gulls of the casinos and lotteries because they don't really understand statistics. Before the late 19th century, statistics was a mystery to just about everybody. Thermodynamics unlocked a lot of the mysteries.

"Life is two locked boxes, each containing the other's key." (Piet Hein)
Entropy is a measure of the probability that something is in some state. That "state" is usually something of interest to an engineer designing a "heat engine": a steam engine or Diesel engine or gasoline engine, an engine that derives mechanical energy from a flow of heat. The "something" is the steam or the burned fuel.

Thermal energy is random energy, like the randomness of the device that chooses the numbers that win the state lottery. Mechanical energy, on the other hand, is organized energy, like the kinetic energy of all the individual atoms of iron in a sledge hammer. If you know the motion of one atom in a moving sledge hammer, you can state with high probability the motions of all the others. The hammer does work on what it hits, and that work is a transfer of mechanical energy. The sledge hammer is like a lottery in which the number-selecting machine is completely rigged and all the money goes exactly where the operator wants.

If lottery numbers were picked by a set of "loaded" dice, and you knew what that loading was, you could make the lottery work for you. Statistically speaking, that is. Getting mechanical energy out of a steam engine is more like getting money out of a lottery that uses loaded dice, not a rigged selecting machine. If you know the motion of one molecule of steam in the cylinder of a steam engine, you have very little idea of the motions of the other molecules. Their motions are highly random, and you can describe them only in terms of statistics. You can get a little mechanical energy out of the steam engine by letting the steam expand against the piston, but you can't begin to get all of the kinetic energy tied up in the random motions of the steam molecules. You must take a course in engineering thermodynamics to learn how to maximize what you do get. (For example, when everything related to your heat engine is at the same temperature, getting work out of the engine is like getting money out of a fairly run lottery: it happens only by the most remote of odds.) Thermodynamics will teach you the statistical nature of heat, temperature, and entropy.
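The sledgehammer-versus-steam contrast can be made concrete with a small numerical sketch (the particle counts, velocities, and function names below are invented for this illustration; they are not from the page's puzzles). Both collections carry about the same total kinetic energy, but only the organized, bulk (center-of-mass) part is readily available as work:

    import random

    def kinetic_energies(velocities, mass=1.0):
        """Return (total KE, center-of-mass KE) for a set of 1-D velocities."""
        n = len(velocities)
        total = 0.5 * mass * sum(v * v for v in velocities)
        v_com = sum(velocities) / n                 # velocity of the bulk (center of mass)
        com = 0.5 * (mass * n) * v_com * v_com      # KE of that organized bulk motion
        return total, com

    random.seed(1)
    n = 100_000

    # "Sledgehammer": every atom moves together with the same velocity.
    hammer = [5.0] * n

    # "Steam": roughly the same energy budget, but the velocities are random
    # with zero mean -- disorganized, thermal motion.
    steam = [random.gauss(0.0, 5.0) for _ in range(n)]

    for name, velocities in (("sledgehammer", hammer), ("steam", steam)):
        total, com = kinetic_energies(velocities)
        print(f"{name:12s} total KE = {total:12.0f}   organized (bulk) KE = {com:10.1f}")

Run as written, the sledgehammer's bulk kinetic energy equals its total, while the steam's bulk kinetic energy is a tiny fraction of the same total; the rest is tied up in random thermal motion.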
[Image: lower entropy]
Statistics is about predicting outcomes.
Heat is energy transferred from one place to another place at a different temperature, and it is transferred solely because of that temperature difference; the transfer happens through statistical exchanges of energy between particles. Temperature is a parameter in a statistical distribution of energy, and the greater the temperature difference, the greater the amount of heat transferred. Entropy was originally a measure of the part of the total thermal energy rendered unavailable for doing work because those exchanges are so random. (Free energy is the total energy minus the energy which is unavailable for doing work.)
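In symbols (the Helmholtz form, one standard way of writing what the parenthesis says; the notation is not this page's own):

    F = U - T S

where U is the total internal energy, T the absolute temperature, and S the entropy; the product TS is, roughly, the part of the energy locked up in random thermal motion and unavailable for work.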
Later, Ludwig Boltzmann demonstrated that the entropy of something in some state is proportional to the logarithm of the probability of its being in that state:
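The relation that sentence leads up to (the formula itself evidently did not survive the page conversion) is Boltzmann's:

    S = k \log W

where W is the number of microscopic arrangements consistent with the state (Boltzmann's "thermodynamic probability") and k is Boltzmann's constant.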
Information is a key concept here, too. Information selects from alternatives. A mailing address selects which mailbox the envelope must end up in. The more mailboxes, the more numbers in the address. That's why the cost of the hardware in Puzzle #5 increases (roughly) as the logarithm of the number. (One costs $.20; ten costs $.40; one hundred costs $.60; one thousand costs $.80; etc.)
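A tiny sketch of that scaling (the $0.20-per-digit figure is the page's own example; the function names are invented here):

    import math

    def address_digits(n_mailboxes: int) -> int:
        """How many decimal digits the largest mailbox number needs."""
        return len(str(n_mailboxes))

    def hardware_cost(n_mailboxes: int) -> float:
        """The page's rough cost model: about $0.20 per address digit."""
        return 0.20 * address_digits(n_mailboxes)

    for n in (1, 10, 100, 1_000, 10_000):
        bits = math.log2(n)   # information needed to single out one mailbox among n
        print(f"{n:>6} mailboxes: {address_digits(n)} digits, "
              f"{bits:5.1f} bits, cost ${hardware_cost(n):.2f}")

The printed costs reproduce the $.20, $.40, $.60, $.80 sequence and predict $1.00 for ten thousand mailboxes; the bits column is the same logarithmic growth measured in base 2.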
More generally, living organisms use information to help select from alternatives. We then take action, and are successful to the extent that we anticipate the outcome of the action. "Order" is a kind of predictability. At the level of the molecules in our cells, it's the predictability of chemical reactions, the predictability of osmotic diffusion through membranes, the predictability of heat transfer. These are statistical matters, and that makes them matters of modern "insight," insight just like that which goes pervasively and persistently unseen in the gambling casino. (And then there's that "sea of quantum energy" that "moves faster than the speed of light.")
Information is about selecting from alternatives so that we can predict the outcomes of the actions we take.
January 26, 2000
The next step in this look at entropy is to examine dot patterns. Dot patterns can be highly ordered, completely random, and everything in between. We have several patterns of the "in betweens" that give a picture (literally) of information content. They also give a picture of how ordering lets us predict some things about that part of the world mapped by the dot patterns. It's the kind of prediction that we see reflected in the stock market, the gambling casino, and the molecular events in and about steam engines and living cells. Entropy is a measure of "the odds" of happenings at the molecular and particle level.
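Until those pages are ready, here is a rough sketch (an invented example, not the page's own materials) of how a pattern's order shows up as predictability: in a regular grid every dot's nearest neighbor sits at the same distance, so the spacing distribution carries almost no surprise, while in a random scatter the spacings spread out.

    import math
    import random

    def shannon_entropy(counts):
        """Entropy, in bits, of a histogram of counts."""
        total = sum(counts)
        h = 0.0
        for c in counts:
            if c:
                p = c / total
                h -= p * math.log2(p)
        return h

    def nearest_neighbor_distances(dots):
        """Distance from each dot to its nearest neighbor (brute force)."""
        dists = []
        for i, (x1, y1) in enumerate(dots):
            best = min((x1 - x2) ** 2 + (y1 - y2) ** 2
                       for j, (x2, y2) in enumerate(dots) if j != i)
            dists.append(math.sqrt(best))
        return dists

    def spacing_entropy(dots, bins=20):
        """Entropy of the nearest-neighbor spacing histogram."""
        d = nearest_neighbor_distances(dots)
        lo, hi = min(d), max(d)
        width = max((hi - lo) / bins, 1e-9)   # guard against pure float jitter
        counts = [0] * bins
        for v in d:
            counts[min(int((v - lo) / width), bins - 1)] += 1
        return shannon_entropy(counts)

    random.seed(0)
    side = 20                                  # 20 x 20 = 400 dots in the unit square
    ordered = [((i + 0.5) / side, (j + 0.5) / side)
               for i in range(side) for j in range(side)]
    scattered = [(random.random(), random.random()) for _ in range(side * side)]

    print("ordered grid  :", round(spacing_entropy(ordered), 2), "bits of spacing surprise")
    print("random scatter:", round(spacing_entropy(scattered), 2), "bits of spacing surprise")

The ordered grid comes out at essentially zero bits; the random scatter comes out at several bits. The more ordered the pattern, the better one dot's surroundings predict another's.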
It'll take a while to prepare our old materials for the Web. In the meantime, please tell us how these materials might better help you.