# General Chemistry/Thermodynamics/Entropy

Entropy is a measure of the disorder in a system.

## An Example

First, let's examine a non-chemistry example. Say you have one card from a deck of playing cards. It can have one of 52 possible values. Now you pick up four more cards. If the deck were shuffled randomly, the odds of you holding, say, a "straight flush" are 0.00154%. The odds of having a "high card", which is essentially a "nothing" hand, are 50.12%. In the case of a straight flush, the cards you are holding are highly organized. They must be in a specific, exact pattern (like 10 9 8 7 6 of the same suit). To have a high card, no two cards may share the same rank, the five cards must not be in sequence, and they must not all be the same suit.

There is a very low probability of having a straight flush because the cards are in a very orderly state. There is a large probability of having "nothing" because the cards are in a random, disorderly state. In our card-playing system, straight flushes have **low entropy** because they are so orderly. "Nothing" hands have **high entropy** because they are disorderly and random. Furthermore, it takes minimal effort to toss all the cards in the air and randomize their order. Increasing the entropy of the system comes naturally. On the other hand, sorting the cards in order takes time and effort. It does not happen randomly. Decreasing the entropy of the system is unnatural and takes effort, or energy.
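The probabilities quoted above can be checked by counting hands. This is a small sketch (not part of the original text) that counts straight flushes and "high card" hands among all possible 5-card hands:

```python
from math import comb

total_hands = comb(52, 5)  # 2,598,960 possible 5-card hands

# Straight flushes: 10 possible high cards (5-high "wheel" through ace-high)
# in each of 4 suits. This count includes the 4 royal flushes.
straight_flushes = 10 * 4

# "High card" hands: choose 5 distinct ranks that do not form a straight,
# then assign suits that are not all identical.
high_card = (comb(13, 5) - 10) * (4**5 - 4)

print(f"P(straight flush) = {straight_flushes / total_hands:.5%}")
print(f"P(high card)      = {high_card / total_hands:.2%}")
```

Running this gives approximately 0.00154% and 50.12%, matching the figures above: the orderly hand is almost a million times rarer than the disorderly one.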

### In Chemistry

The entropy of a chemical system is a measure of its disorder or chaos. More precisely, it is a measure of the *dispersion of energy*. A solid has low entropy (low chaos, orderly) because its molecules are locked into a rigid structure, so their energy is not dispersed freely. A gas has high entropy (high chaos, disorderly) because its molecules are free to move about randomly. The energy of the system is dispersed over a much larger volume, with an enormous number of possible positions for each molecule.

As temperature decreases, so does entropy. Theoretically, at **absolute zero** (0 K, or −273.15 °C), the entropy of the system would be zero. This is because a perfect crystal at absolute zero would have its energy not dispersed at all, a statement known as the Third Law of Thermodynamics.

As you will soon learn, the Second Law of Thermodynamics tells us that the entropy of the universe always increases. Think about it. If you have built a house of cards, the entropy of the system is low. A house of cards is very orderly, with each card having a very specific location. The house of cards will undoubtedly collapse. The resulting pile of cards is very disorderly. The cards can be in any position and still be a random pile of cards. The entropy has increased spontaneously. Houses of cards will spontaneously collapse, but they never spontaneously build themselves up. This is because high entropy is natural and low entropy is unnatural.

## Entropy Changes

To calculate the entropy change of a chemical reaction exactly, you need specific numbers. As a guideline, though, you can estimate the sign of the entropy change with some basic rules:

- Melting and boiling *increase* entropy
- Freezing and condensing *decrease* entropy
- Dissolving a solute *increases* entropy
- Forming precipitates *decreases* entropy

If you do happen to know the absolute entropy of substances in a reaction (by looking it up in a chart), you can calculate the change in entropy. Entropy is symbolized with *S*. The change in entropy is Δ*S*. As with enthalpy, the degree symbol (°) represents STP. The change in entropy is the absolute entropy of the products minus the absolute entropy of the reactants: ΔS° = ΣS°(products) − ΣS°(reactants).
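The products-minus-reactants calculation can be sketched in code. The standard molar entropy values below are approximate textbook numbers for the Haber process (N₂ + 3 H₂ → 2 NH₃), chosen here only for illustration, not taken from this text:

```python
# Standard molar entropies in J/(mol·K); approximate tabulated values.
S0 = {"N2(g)": 191.6, "H2(g)": 130.7, "NH3(g)": 192.5}

# Haber process: N2(g) + 3 H2(g) -> 2 NH3(g)
reactants = {"N2(g)": 1, "H2(g)": 3}
products = {"NH3(g)": 2}


def delta_S(products, reactants, S0):
    """Entropy change: mole-weighted products minus mole-weighted reactants."""
    return (sum(n * S0[s] for s, n in products.items())
            - sum(n * S0[s] for s, n in reactants.items()))


dS = delta_S(products, reactants, S0)
print(f"dS = {dS:.1f} J/(mol.K)")  # negative: 4 mol of gas become 2 mol
```

The result is negative, which agrees with the guideline rules above: the reaction converts four moles of gas into two, so disorder decreases.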

## See also

- Entropy for beginners – a Wikibook that provides a mathematical explanation.