Entropy

  • Entropy is a measure of randomness or disorder within a system. The greater the disorder, the higher the entropy.

  • Chemistry uses statistical entropy, which is based on probabilities: it considers the number of ways that particles can be arranged in a system.

  • In chemical reactions, entropy can either increase or decrease. Reactions that increase disorder, for instance those that cause gas molecules to spread out, usually have a positive entropy change.

  • Entropy has the symbol ‘S’ and the units joules per mole per kelvin (J mol⁻¹ K⁻¹).

  • Standard entropy (S°) values are used as a reference; they refer to the absolute entropy of one mole of a substance at 298 K and 1 atm pressure.

  • The second law of thermodynamics states that the total entropy of a system and its surroundings always increases for a spontaneous process.

  • Entropy change (∆S) for a reaction is calculated from the standard entropies of the products and reactants: ∆S = ΣS°(products) − ΣS°(reactants). A worked example follows this list.

  • When a solid melts or a liquid evaporates, entropy increases. Dissolving a substance in a solvent also generally leads to an increase in entropy.

  • Gibbs free energy (∆G) is the energy available to do useful work. It combines enthalpy and entropy into one value. The relationship between the three is given by the equation ∆G = ∆H − T∆S, where T is the temperature in kelvin. A worked calculation follows this list.

  • A reaction is thermodynamically feasible when the Gibbs free energy change is negative (∆G < 0).
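
To make the ∆S calculation concrete, here is a minimal Python sketch applying ∆S = ΣS°(products) − ΣS°(reactants) to the Haber process, N2(g) + 3H2(g) → 2NH3(g). The standard entropy values are approximate literature figures used purely for illustration; in an exam, use the figures given in your data booklet.

```python
# Worked example: entropy change for N2(g) + 3H2(g) -> 2NH3(g)
# Standard entropies S° in J K^-1 mol^-1 (approximate literature values,
# quoted only for illustration; check your data booklet).
standard_entropy = {
    "N2": 191.6,
    "H2": 130.7,
    "NH3": 192.5,
}

# Stoichiometric coefficients: negative for reactants, positive for products.
coefficients = {"N2": -1, "H2": -3, "NH3": 2}

# ∆S = ΣS°(products) − ΣS°(reactants)
delta_S = sum(n * standard_entropy[species] for species, n in coefficients.items())

print(f"∆S = {delta_S:.1f} J K^-1 mol^-1")  # ≈ -198.7: four moles of gas become two
```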

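Building on the same approximate, illustrative figures, this sketch evaluates ∆G = ∆H − T∆S at 298 K and applies the ∆G < 0 feasibility test. Note the unit conversion: ∆H is usually quoted in kJ mol⁻¹ while ∆S is in J K⁻¹ mol⁻¹, so one of them must be converted before they are combined.

```python
# Worked example: Gibbs free energy change for N2(g) + 3H2(g) -> 2NH3(g)
delta_H = -92.0   # kJ mol^-1 (approximate literature value, for illustration only)
delta_S = -198.7  # J K^-1 mol^-1 (from the entropy sketch above)
T = 298.0         # temperature in K

# Convert ∆S to kJ K^-1 mol^-1 so its units match ∆H before combining.
delta_G = delta_H - T * (delta_S / 1000.0)

print(f"∆G = {delta_G:.1f} kJ mol^-1")  # ≈ -32.8 kJ mol^-1
print("Feasible at 298 K" if delta_G < 0 else "Not feasible at 298 K")
```

Because ∆H and ∆S are both negative here, setting ∆G = 0 and rearranging gives T = ∆H/∆S ≈ 460 K, above which this particular reaction is no longer feasible; the T∆S term is what makes this kind of temperature-dependence argument possible.
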
Remember, understanding entropy, its calculation, and its implications in thermodynamics can be the key to answering many advanced chemistry problems. Practical examples help illustrate these abstract concepts, so try to think in terms of real chemical reactions and processes.