Re: How do you measure entropy?
Date: Wed Oct 28 08:59:06 1998
Posted By: Georg Hager, Grad student, Theoretical Particle Physics
Area of science: Physics
ID: 907721055.Ph
Message:
Greetings!
Entropy is likely to be one of the most misunderstood and
misused terms from physics. So I will first try to explain
what entropy is, in simple terms. Unfortunately I will not
be able to leave mathematics completely out of the game.
Entropy is, roughly speaking, a measure of our missing knowledge,
or ignorance, about a system. There are numerous ways to
translate this statement into a more rigorous, mathematical
form that can be useful in practice, and I will give
two of those possibilities here. The first one is actually
the most fundamental one, and all other definitions can be
shown to be compatible with it.
- The entropy of a system can be defined as the minimum number of
yes/no questions one must ask to gain complete knowledge about the
system. For example, take a chessboard and throw a coin onto it.
Another person (who didn't see the throw) has to ask you six
well-chosen questions, each halving the remaining possibilities,
in order to know on which tile the coin has landed.
There is a formula that tells you this number; into it you insert
the probabilities of all possible outcomes of the experiment
(sixty-four probabilities of 1/64 each, in our example, if the coin
can occupy any tile with equal probability).
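To make this concrete: the formula in question is the
information-theoretic (Shannon) entropy H = -sum_i p_i log2(p_i),
measured in bits, i.e. in yes/no questions. Here is a minimal Python
sketch of it applied to the chessboard example (the helper function
and the second distribution are just my own illustration):

import math

def entropy_bits(probabilities):
    """Shannon entropy in bits: H = -sum(p * log2(p)) over all outcomes."""
    return -sum(p * math.log2(p) for p in probabilities if p > 0)

# 64 tiles, each with probability 1/64: six yes/no questions are needed.
print(entropy_bits([1.0 / 64] * 64))   # -> 6.0

# If the coin could only land on the four corner tiles (my own variation),
# two questions would suffice.
print(entropy_bits([1.0 / 4] * 4))     # -> 2.0
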
This specific definition of entropy comes from information theory,
but it was not the first one to be conceived. In fact,
physicists discovered entropy first, which must be regarded
as one of their greatest achievements. The following definition
gives you the physicist's view about entropy, in the language
of thermodynamics. This can be shown to be equivalent to the first
definition, apart from numerical factors and an arbitrary
choice of the point of zero entropy.
- The entropy of a thermodynamic system (like a gas at a certain
temperature) is defined as the logarithm of the number of states
the system can take while having a fixed energy. In the example
of the gas, imagine the gas inside a box of volume V.
Each molecule can be in many different places inside that box,
and a lot of rearrangements can be made among the molecules without
altering the overall energy. Now take the same gas, at the
same temperature, but in a volume that is just V/2. Now
there is much less space left for each molecule, and there
are far fewer possibilities for rearrangement. In other words,
our knowledge of the system has increased. Now reduce the volume
further and further, and at zero volume we will know exactly
where each molecule is, namely at one certain point. To reduce
entropy we could just as well lower the temperature, because
`knowledge' covers not only the positions but also the
velocities of all the molecules. At absolute zero all molecular
motion will have stopped (apart from quantum effects, which I
will leave out of the game here). At the point where we know
exactly where each molecule is and how fast it is moving, entropy
is at its minimum.
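To put that picture into formulas: with W the number of states
available at fixed energy and k_B Boltzmann's constant, the
thermodynamic entropy is

    S = k_B \ln W .

For an ideal gas of N molecules kept at the same temperature, a
standard textbook result is that halving the volume from V to V/2
changes this by

    \Delta S = N k_B \ln\frac{V/2}{V} = -N k_B \ln 2 ,

i.e. the entropy drops by k_B ln 2 per molecule: exactly one yes/no
question (`left half or right half?') per molecule that we no longer
need to ask, which is the promised connection to the first definition.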
The second definition is not very well suited for measuring
entropy, but it can be translated into a more usable
formula that yields changes in the entropy
of a system, which is all one is usually interested in:
The change in entropy is equal to the amount of heat energy
transported into a system divided by its temperature. This
is valid only for small changes, as temperature may
of course change when the system is heated. So in order
to get the overall change in entropy for a finite amount
of heat energy transferred you would have to sum up all
the small contributions `along the way'.
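Written as a formula, dS = dQ/T for a small amount of heat dQ added
at absolute temperature T, and the total change is the sum (integral)
of dQ/T along the way. As a small worked example with numbers of my
own choosing (assuming a constant heat capacity, which is a good
approximation for water), here is a Python sketch that performs that
summation for about 1 kg of water heated from 20 to 100 degrees
Celsius and compares it with the exact result C ln(T2/T1):

import math

def entropy_change(C, T1, T2, steps=100000):
    """Sum the small contributions dS = dQ/T = C*dT/T `along the way'."""
    dT = (T2 - T1) / steps
    total = 0.0
    T = T1
    for _ in range(steps):
        total += C * dT / (T + 0.5 * dT)   # evaluate each small step at its midpoint
        T += dT
    return total

# Example values (my own illustration): about 1 kg of water with
# heat capacity C ~ 4186 J/K, heated from 20 C to 100 C.
C = 4186.0
T1, T2 = 293.15, 373.15            # temperatures must be absolute (kelvin)
print(entropy_change(C, T1, T2))   # numerical sum, about 1010 J/K
print(C * math.log(T2 / T1))       # exact result C*ln(T2/T1), also about 1010 J/K
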
Hope that helps,
Georg.