Entropy (S) is the thermodynamic measure of randomness throughout a system (often simplified as "disorder"). Entropy can also be described as thermal energy that is not available to do work, since energy becomes more evenly distributed as the system becomes more disordered. Entropy is particularly important when describing how energy is used and transferred within a system. An exact value of entropy is impossible to measure; however, through relationships derived by Josiah Willard Gibbs and James Clerk Maxwell, the change in entropy between one state and another can be calculated from measurable functions such as temperature and pressure. That value in turn gives insight into how chemical reactions are favored and, most importantly, allows for the calculation of the Gibbs free energy (ΔG = ΔH − TΔS).
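As a sketch of how the relation ΔG = ΔH − TΔS decides favorability, the short Python snippet below uses purely illustrative numbers (ΔH and ΔS here are hypothetical, not measured values):

```python
def gibbs_free_energy(delta_h, temperature, delta_s):
    """Return ΔG (J/mol) given ΔH (J/mol), T (K), and ΔS (J/(mol*K))."""
    return delta_h - temperature * delta_s

# Illustrative values only: an endothermic process (ΔH > 0) with an
# entropy gain (ΔS > 0) becomes spontaneous (ΔG < 0) at high enough T.
dH = 10000.0   # J/mol (hypothetical)
dS = 30.0      # J/(mol*K) (hypothetical)
for T in (300.0, 333.3, 360.0):
    dG = gibbs_free_energy(dH, T, dS)
    label = "spontaneous" if dG < 0 else "non-spontaneous"
    print(f"T = {T:6.1f} K  ->  ΔG = {dG:8.1f} J/mol  ({label})")
```

With these numbers the sign of ΔG flips near T = ΔH/ΔS ≈ 333 K, showing how the entropy term −TΔS can overcome an unfavorable enthalpy as temperature rises.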
Using the statistical mechanics of the gas phase, entropy can be estimated with Boltzmann's formula, S = k ln W, where k is Boltzmann's constant, 1.381 × 10−23 J/K. Boltzmann's constant is related to the gas constant by R = kNA (so k = R/NA). W stands for the number of ways that the atoms or molecules in the sample can be arranged while still having the same total energy.
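A minimal sketch of Boltzmann's formula, deriving k from R and NA exactly as described above:

```python
import math

# Boltzmann's formula S = k ln W, with k obtained from R = k * N_A.
R = 8.314        # gas constant, J/(mol*K)
N_A = 6.022e23   # Avogadro's number, 1/mol
k = R / N_A      # Boltzmann's constant, ~1.381e-23 J/K

def boltzmann_entropy(W):
    """Entropy (J/K) of a system with W equally probable arrangements."""
    return k * math.log(W)

print(f"k = {k:.4e} J/K")
# A perfectly ordered system (W = 1) has zero entropy;
# more possible arrangements mean more entropy.
print(f"S(W = 1)     = {boltzmann_entropy(1):.3e} J/K")
print(f"S(W = 10**6) = {boltzmann_entropy(10**6):.3e} J/K")
```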
It is important to note that entropy, like temperature and volume, is a state function: the change in entropy is independent of the path used to get from the original state to the final state. Additionally, the overall change in entropy of the universe is positive, meaning that the universe is continuously moving to a state of higher disorder.
A simple example where entropy increases is ice melting to water. The structure of ice is a well-ordered, crystalline system. When energy is put into the system in the form of heat, the molecules begin to move more rapidly and no longer have the neatly ordered structure of ice; thus, their distribution throughout space is more "random". Another example where entropy increases is when a reaction produces more moles of products than reactants in the same phase.
The favorability of intramolecular reactions over intermolecular reactions can be explained entropically. In an intermolecular coupling, two molecules come together to form one, increasing the order in the system and decreasing the entropy. In an intramolecular reaction, one molecule is present at the start and one at the end, so the entropy of the system does not change in the unfavorable way seen in intermolecular reactions.
Entropy can further be divided into thermal disorder, in which entropy increases as heat is added to the system, and positional disorder, in which entropy increases as the volume of the system is increased.
Entropy is also of particular interest in biochemistry, as one of the unofficial definitions of life is an aggregate of molecules that works to decrease entropy in a certain localized area or volume. Additionally, entropy helps describe many phenomena found in biochemical systems, which are described next.
Entropy in Biochemical Interactions
Entropy is a measure of the unavailable energy in a closed thermodynamic system and is usually considered a measure of the system's disorder; it is a property of the system's state. It varies directly with any reversible heat transfer in the system and inversely with the temperature of the system.
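The direct and inverse variation described above is the classical relation ΔS = q_rev/T. A short sketch with illustrative numbers:

```python
# The relation ΔS = q_rev / T: entropy change varies directly with the
# reversible heat transferred and inversely with the temperature.

def entropy_change(q_rev, temperature):
    """ΔS (J/K) for reversible heat q_rev (J) transferred at T (K)."""
    return q_rev / temperature

# The same 1000 J of heat yields a larger entropy change at lower T.
for T in (250.0, 300.0, 400.0):
    dS = entropy_change(1000.0, T)
    print(f"1000 J at T = {T:5.1f} K  ->  ΔS = {dS:.2f} J/K")
```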
Entropy can be a strong driving force in nature. For example, it plays a large part in the behavior of hydrophobic substances in water; a common example is lipids in solution. The lack of polarity in long hydrocarbon chains tends to "force" water molecules to align themselves in an orderly pattern around the nonpolar part of the molecule. This orderly pattern decreases entropy, as it prevents the water from freely associating with other water molecules via hydrogen bonding. Because an increase in entropy makes the Gibbs free energy more negative, favoring a spontaneous reaction, avoiding this decrease in entropy serves as the driving force for lipids to associate with one another instead of with water. Lipids coalesce to reduce the amount of water surrounding their molecules, thereby increasing entropy. This phenomenon drives the formation of essential evolutionary components of life, such as the lipid bilayer membranes of eukaryotic cells.
The same entropic tendency of hydrophobic molecules to avoid water can be found in the active sites of enzymes. Many enzymes have a high concentration of hydrophobic residues in their active sites. The binding of a substrate to such an enzyme drives ordered water molecules out of the active site, relieving the entropic cost of hydrating those hydrophobic residues.
An understanding of entropy's role in chemistry can be put to use in the lab. For example, ammonium sulfate can be added in high concentration to an aqueous solution containing one or more proteins. Proteins are much larger than ammonium sulfate ions, so the concentrated charges of the newly dissolved salt ions attract water molecules, which form hydration shells around them. To form these shells around the ammonium sulfate ions, water molecules must be drawn from the hydration shells around the proteins, whose shells are generally less tightly bound since not all of the protein surface is charged, nor are its charges as concentrated. As water molecules are stripped from the proteins, a limit is reached where the proteins become insoluble in water and precipitate out for isolation and further study.
However, entropy can also play a negative role in biochemistry.
For example, heat denaturation of a protein is a case where entropy plays a role. In a folded protein, conformational entropy is low because of its tightly packed structure. As the protein unfolds (denatures), its conformational entropy increases, but the newly exposed hydrophobic regions become surrounded by ordered water, decreasing the entropy of the water. The two effects can roughly offset, yet the protein is still denatured.
An oil spill at sea follows the same argument: the oil coalesces rather than dispersing, minimizing the amount of ordered water at the oil–water interface.
Entropy during phase changes
As mentioned earlier, entropy is the measure of disorder, and this is also the case when it comes to phase changes.
This can be thought of in a simpler manner. For example, in the solid form of H2O, the molecules sit in a rigid, ordered crystal structure. As the temperature is raised, the rigid crystal structure begins to loosen; the hydrogen bonds holding the molecules to one another weaken, and eventually melting occurs: the ice turns into liquid water. There was an increase in the disorder of the system, which went from a rigid crystal structure to a collection of freely moving molecules. The entropy change accompanying this phase change is, in the case of melting, called the entropy of fusion.
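At the melting point, the entropy of fusion follows from ΔS_fus = ΔH_fus / T_m. A short sketch using the commonly tabulated value ΔH_fus ≈ 6.01 kJ/mol for ice:

```python
# Entropy of fusion of ice: ΔS_fus = ΔH_fus / T_m.
dH_fus = 6010.0   # J/mol, molar enthalpy of fusion of ice (approximate)
T_m = 273.15      # K, melting point of ice

dS_fus = dH_fus / T_m
print(f"ΔS_fus ≈ {dS_fus:.1f} J/(mol*K)")   # ≈ 22.0 J/(mol*K)
```

The positive sign confirms that melting increases the entropy of the system, consistent with the disorder argument above.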