Entropy, Free Energy and Thermodynamic Equilibrium
Chemical reactions are carried out by mixing the reactants and regulating external conditions such as temperature and pressure. Two basic questions arise:
- Is it possible for the reaction to occur at the selected conditions?
- If the reaction proceeds, what determines the ratio of products and reactants at equilibrium?
Both questions are answered by chemical thermodynamics:
- Thermodynamics can tell us whether a proposed reaction is spontaneous (possible) under particular conditions even before the actual experiment.
- Thermodynamics can also predict the ratio of products and reactants at equilibrium provided that the reaction is spontaneous.
Note: Thermodynamics cannot, however, tell us how fast a reaction will proceed. Reaction rates are studied by the field of Chemical Kinetics.
After many years of observation, scientists concluded that the characteristic common to all spontaneous processes (processes that occur in a definite direction without outside intervention) is an increase in the property called entropy (S). An example of a physical spontaneous process is shown below in Fig. I.1: a ball rolls spontaneously down a hill but never spontaneously rolls back up the hill.
An example of a chemical spontaneous process is the reaction of iron with oxygen (rusting of iron). The forward reaction is a spontaneous process (natural process that may take years to occur) but the product iron oxide in rust does not spontaneously change back to iron metal and oxygen.
4 Fe(s) + 3 O₂(g) → 2 Fe₂O₃(s)
Note:
- Processes that are spontaneous in one direction are non-spontaneous in the reverse direction.
- The total energy in the above examples of spontaneous processes remains constant, so the direction of the processes cannot be attributed to energy changes. The First Law of Thermodynamics, which states that the energy of the universe is constant, therefore cannot explain why spontaneous (natural) processes occur. The driving force behind spontaneous (natural) processes is the change in the entropy (ΔS) of the universe.
The 2nd Law of Thermodynamics states that in a spontaneous process the entropy of the universe increases: ΔSuniverse = ΔSsys + ΔSsurr > 0
How is entropy defined?
A precise, quantitative definition of entropy was proposed by the Austrian physicist Ludwig Boltzmann in the late 19th century. According to this definition entropy is related to probability:
If a system has several states available to it, the one that can be achieved in the greatest number of ways (has the largest number of microstates) is the one most likely to occur. The state with the greatest probability has the highest entropy.
S = kB · lnΩ    (1)
Where,
kB is Boltzmann’s constant (R/NA)
Ω is the number of microstates corresponding to a given state (including both position and energy)
Note: The above definition of entropy is not useful in a practical sense for the typical types of samples used by chemists because those samples contain so many components (for example, 1 mole of gas contains 6.022 × 10²³ individual particles).
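As a rough numerical illustration (the numbers and helper function below are an assumed sketch, not part of the original text), equation (1) can be evaluated directly whenever Ω is known, and for macroscopic samples ln Ω must be handled indirectly because Ω itself is astronomically large:

```python
import math

k_B = 1.380649e-23  # Boltzmann's constant in J/K (k_B = R / N_A)

def boltzmann_entropy(omega: float) -> float:
    """Entropy S = k_B * ln(Omega) for a state with Omega microstates."""
    return k_B * math.log(omega)

# Toy case: the 4-molecule, 2-container example discussed below has
# 2**4 = 16 microstates in total.
print(boltzmann_entropy(16))   # about 3.8e-23 J/K

# For a macroscopic sample, Omega is far too large to write down, so
# ln(Omega) is computed indirectly. Example: Omega = 2**N for
# N = 6.022e23 independent two-position particles, so ln(Omega) = N * ln 2.
N = 6.022e23
print(k_B * N * math.log(2))   # about 5.76 J/K, i.e. R * ln 2
```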
Let us examine how entropy S and entropy changes (ΔS) – entropy changes can be thought of as changes in disorder – can explain the occurrence of spontaneous processes. As an example, consider four “tagged” gas molecules (labelled 1, 2, 3 and 4) that are initially confined to the left container, with the right container under vacuum (Fig. I.2). The stopcock is opened and the gas molecules are allowed to equilibrate without any intervention. If enough time elapses, half of the molecules will be in each container.
In the process described above and shown in Fig. I.2 the following are observed:
- There is no energy change (the total energy remains constant)
- The degree of disorder (entropy) increases after the stopcock is opened and the molecules become uniformly distributed
How can we explain that in such a spontaneous process – a process in which an increase in entropy is observed – the molecules tend to become uniformly distributed between the two containers?
Molecular statistics answers this question by considering the probabilities of the possible arrangements of the molecules in the two containers. Each of these arrangements defines a macrostate. The five possible arrangements (macrostates) of the four molecules in the two containers are:
- All 4 molecules in the left container
- Three molecules in the left container and 1 in the right
- Two molecules in the left container and two in the right container
- Three molecules in the right container and 1 in the left
- All 4 molecules in the right container
There is also a number of ways (configurations), called microstates, in which each of the above arrangements can be achieved.
For example, from Fig. I.2 the following can be observed:
There is only one configuration in which all four molecules are in the left container (1 microstate, Microstate #1).
There are four configurations in which three molecules are in the left container and one molecule is in the right container (4 microstates, Microstates #2 to #5 inclusive).
There are six configurations in which two molecules are in the left container and two in the right container (6 microstates, Microstates #6 to #11 inclusive).
There are four configurations in which three molecules are in the right container and one molecule is in the left container (4 microstates, Microstates #12 to #15 inclusive).
There is only one configuration in which all four molecules are in the right container (1 microstate, Microstate #16).
The above described arrangements (macrostates) and configurations (microstates) of the molecules are summarized in Table I.1 below.
The probability that one particular molecule is in the left container at a given time is ½. A second specific molecule may be either in the left or in the right container, so the probability that both are in the left is ½ × ½ = ¼. The probability that all 4 molecules are in the left container is: ½ × ½ × ½ × ½ = 1/16
Continuing this argument for N = 6.022 × 10²³ molecules (roughly 1 mole), the probability that all of them will be on the left is equal to:
½ × ½ × … × ½ = (½)^N ≈ 0 (where N = 6.022 × 10²³)
This is a very small probability almost equal to zero.
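To get a feel for just how small this probability is, here is a minimal Python check; the use of base-10 logarithms is only a convenience to avoid floating-point underflow:

```python
import math

N = 6.022e23      # number of molecules, about 1 mole
p_left = 0.5      # probability that one given molecule is in the left container

# (1/2)**N underflows to 0.0 in floating point, so work with logarithms:
log10_p = N * math.log10(p_left)
print(log10_p)    # about -1.81e23

# The probability that ALL molecules are on the left is therefore roughly
# 10**(-1.8e23): a decimal point followed by about 1.8e23 zeros before the
# first nonzero digit, i.e. indistinguishable from zero in practice.
```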
From Table I.1 it becomes apparent that:
The greater the number of microstates that correspond to a given macrostate, the greater the probability of that macrostate.
For example, the arrangement (macrostate) with 2 molecules in the left container and 2 in the right has the greatest number of possible configurations (microstates) of the molecules: 6 microstates. The probability that this macrostate occurs is therefore the highest, equal to 6/16 = 3/8. Indeed, this is the macrostate observed when the four molecules are allowed to move freely between the two containers.
The arrangements (macrostates) with the second highest probability are 3 molecules in the left and 1 molecule in the right container, or 3 molecules in the right and 1 molecule in the left container. The number of possible configurations of the molecules is 4 in each case, and the corresponding probability is 4/16 = ¼.
Therefore, a gas placed in one end of a container will spontaneously expand to fill the entire container evenly, because for a large number of gas molecules there is a huge number of microstates corresponding to an (approximately) equal number of molecules in both halves.
The consequences of this principle are dramatic for the large numbers of molecules found in chemical systems (as shown above) because of the following:
- There is a huge number of particles (statistical predictions are always more accurate for larger samples)
- The process proceeds spontaneously (no external intervention is needed)
Table I.1

| Macrostate | Configurations (microstates) | Probability | Microstate # (Fig. I.2) |
| --- | --- | --- | --- |
| 4 molecules in the left container | 1 | 1/16 | 1 |
| 3 molecules in the left and 1 molecule in the right container | 4 | 4/16 = 1/4 | 2, 3, 4, 5 |
| 2 molecules in the left and 2 molecules in the right container | 6 | 6/16 = 3/8 | 6, 7, 8, 9, 10, 11 |
| 3 molecules in the right and 1 molecule in the left container | 4 | 4/16 = 1/4 | 12, 13, 14, 15 |
| 4 molecules in the right container | 1 | 1/16 | 16 |
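The counts in Table I.1 are simply binomial coefficients C(4, k), so they are easy to reproduce. The following is a minimal Python sketch (written for this post, not part of the original example) that rebuilds the table and shows how overwhelmingly the near-even split dominates as the number of molecules grows:

```python
from math import comb

def microstate_table(n_molecules: int) -> None:
    """Print microstate counts and probabilities for each macrostate
    (k molecules in the left container, n - k in the right)."""
    total = 2 ** n_molecules                   # all microstates are equally likely
    for k in range(n_molecules, -1, -1):
        ways = comb(n_molecules, k)            # C(n, k) configurations
        print(f"{k} left / {n_molecules - k} right: "
              f"{ways} microstates, probability {ways}/{total}")

microstate_table(4)
# 4 left / 0 right: 1 microstates, probability 1/16
# 3 left / 1 right: 4 microstates, probability 4/16
# 2 left / 2 right: 6 microstates, probability 6/16
# 1 left / 3 right: 4 microstates, probability 4/16
# 0 left / 4 right: 1 microstates, probability 1/16

# For larger N the distribution concentrates around the even split:
# nearly all microstates have close to N/2 molecules on each side.
N = 100
p_near_half = sum(comb(N, k) for k in range(40, 61)) / 2 ** N
print(p_near_half)              # about 0.96 (40-60 molecules on the left)
print(comb(N, 0) / 2 ** N)      # about 7.9e-31 (all 100 on the right)
```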
How is entropy associated with chemical processes?
Entropy changes, ΔS – not S itself – are associated with changes of state (from solid to liquid, from liquid to gas, and so on). Since a change of state – for example from solid to liquid at a substance’s melting point – is a reversible process, we can calculate the change in entropy for this process by using the equation:
ΔS = qrev / T = ΔH / T (at constant temperature T and pressure P)    (2)
Where:
ΔS is the change in entropy that occurs during the change of state
qrev = ΔH is the heat required for the reversible process to occur (for example, the heat required to melt 1 mole of solid at its melting point; ΔH is the enthalpy change of fusion)
T is the temperature at which the change of state occurs (melting point, boiling point)
Equation (2) is a very important relationship since it relates entropy changes (ΔS) to macroscopic quantities such as heat and temperature, which are relatively easy to measure. The definition of entropy given by equation (1) is based on probability, while that given by equation (2) is based on thermodynamic properties.
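As a concrete illustration of equation (2), here is a minimal Python sketch for the melting of ice; the enthalpy of fusion used (6.01 kJ/mol) is a standard literature value quoted here only for illustration:

```python
# Entropy change for melting 1 mol of ice at its normal melting point,
# using equation (2): dS = q_rev / T = dH_fus / T (constant T and P).
dH_fus = 6.01e3   # enthalpy of fusion of water in J/mol (literature value)
T_melt = 273.15   # normal melting point of ice in K

dS_fus = dH_fus / T_melt
print(f"dS_fus = {dS_fus:.1f} J/(mol K)")   # about 22.0 J/(mol K)
```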
Which processes are called reversible? Which processes are called irreversible?
A reversible process is one in which the system changes in such a way that the system and surroundings can be restored to their original states by exactly reversing the process.
An example of a reversible process is heat transfer between two bodies whose temperature difference is changed in infinitesimal steps, each of which can be undone by reversing the temperature difference.
An irreversible process cannot be undone by reversing the change of the system. Spontaneous processes are irreversible.
An example of an irreversible process is the free expansion of a gas into a vacuum (Fig. I.2).
Solved examples on entropy changes are given in the post entitled “Entropy changes ΔS and Thermodynamic Equilibrium – Solved Examples”.
Relevant Posts
Free energy, entropy and thermodynamic equilibrium