Quantum mechanics

We have been doing classical calculations. It is time to make the transition to a statistical theory that is compatible with quantum mechanics.

    "Ludwig Boltzmann [right], who spent much of his life studying statistical mechanics, died in 1906, by his own hand. Paul Ehrenfest, carrying on the work, died similarly in 1933. Now it is our turn to study statistical mechanics. Perhaps it will be wise to approach the subject cautiously."

    - David L. Goodstein, States of Matter




Live dangerously: Read Chapter 12! (Starting with 12.3)

Energy levels

To start with, the only result of quantum mechanics that we will use is that the allowed energies of a system are discrete. We call these discrete energies "energy levels".

Imagine that we have an isolated system with a simple set of energy levels:

  • The energy levels are evenly spaced.
  • We'll label the energy levels $E_0=0$, $E_1=1\epsilon$, $E_2=2\epsilon$, ..., $E_j=j\epsilon$.
  • There are $n+1$ energy levels, so the highest numbered level is $n$.
  • Any "particle" in the system may occupy any energy level. (The particles in a system might be atoms, or electrons, or photons, or phonons, or....)
  • Any energy level may contain any number of particles.
  • The number of particles in level $j$ is $N_j$.
  • The total number of particles is $N$. At first, we'll contemplate only closed systems that contain a constant number of particles, such that: $$\sum_{j=0}^n N_j=N.$$
  • Let's assume for simplicity that how the particles are arranged in the energy levels does not itself change the energy levels. The total energy of such a system is the internal energy: $$U=\sum_{j=0}^n N_jE_j.$$
  • At equilibrium we expect that the internal energy remains constant, but the arrangement of particles might vary over all possible arrangements compatible with the total internal energy. (Think: elastic collisions in an ideal gas...)
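As a concrete check of these definitions, a macrostate can be stored as a list of occupation numbers $N_j$; the totals $N$ and $U$ then follow directly. A minimal sketch (the occupation numbers below are just an illustrative example, not taken from the text):

```python
# Occupation numbers N_j for evenly spaced levels E_j = j*eps.
# The specific numbers here are an illustrative assumption.
eps = 1.0                   # level spacing (arbitrary units)
occupations = [3, 1, 1, 1]  # N_0 = 3, N_1 = 1, N_2 = 1, N_3 = 1

N = sum(occupations)                                       # total particle number
U = sum(Nj * j * eps for j, Nj in enumerate(occupations))  # internal energy

print(N, U)  # 6 particles sharing a total energy of 6*eps
```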

    Macrostates and microstates

    The concepts of "microstates" and "macrostates" are as important to statistical mechanics as the concept of a "system" was in thermodynamics.

    • A microstate is a full specification of which particles are in which states. [Picture of a particular microstate]
    • A macrostate is a particular histogram of the number of particles in each energy level. [Picture of a histogram].

      We can also describe a macrostate as an enumeration of the number of particles $N_j$ there are in each energy level $j$, e.g.
      $$\{ N_0, N_1, N_2, N_3...N_n \}$$
      for $n+1$ different energy levels.

      [Notice $n$ is not # of kilomoles here!]

    • Let's label each macrostate with an index $k$. There may be several, or very many, microstates compatible with a given macrostate.
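    One way to see the microstate/macrostate distinction concretely is brute force: assign each distinguishable particle to an energy level, and group the assignments by their histogram. A small sketch (the 2-particle, 3-level system is just an illustration):

```python
from collections import Counter
from itertools import product

n_levels = 3  # levels j = 0, 1, 2 (illustrative)
N = 2         # two distinguishable particles

# Each microstate is a tuple (level of particle 1, level of particle 2).
microstates_per_macrostate = Counter()
for assignment in product(range(n_levels), repeat=N):
    # The macrostate is the histogram (N_0, N_1, N_2) of this assignment.
    histogram = tuple(assignment.count(j) for j in range(n_levels))
    microstates_per_macrostate[histogram] += 1

for macro, w in sorted(microstates_per_macrostate.items()):
    print(macro, w)
```

    Macrostates with both particles in the same level have only one compatible microstate; macrostates with the particles in different levels have two, because swapping the (distinguishable) particles gives a new microstate with the same histogram.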

    The fundamental assumption of statistical thermodynamics is that all possible microstates of an isolated system which are compatible with the thermodynamic specification of the state (e.g., a particular internal energy) are equally likely.

    Making sense of table 12.2

    The question is: for a particular internal energy, which macrostate is the system most likely to be found in?

    We can attack this problem by counting the number of microstates, $w_k$, compatible with each possible histogram $k$...

    Further lingo... The total number of microstates for a particular $U$: $$\Omega=\sum_k w_k.$$ The probability of macrostate $k$: $$P_k=w_k/\Omega.$$

    Now, do problem 12.6.

    In the statement of the problem, it looks like there are only 4 energy levels (they list 0, 1$\epsilon$, 2$\epsilon$, 3$\epsilon$...). Assume that the energy levels just keep going: 0, 1$\epsilon$, 2$\epsilon$, 3$\epsilon$, 4$\epsilon$, 5$\epsilon$, 6$\epsilon$, 7$\epsilon$, 8$\epsilon$, ....
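    A sketch of the kind of counting involved, assuming (as an illustration, not taken from the problem statement) $N = 6$ distinguishable particles sharing $U = 6\epsilon$, with the evenly spaced levels above. It enumerates every microstate by brute force, groups them into macrostates, and cross-checks each $w_k$ against the standard multinomial count $N!/\prod_j N_j!$:

```python
from collections import Counter
from itertools import product
from math import factorial

N, U = 6, 6  # illustrative assumption: 6 particles, total energy 6*eps

# Brute force: each distinguishable particle occupies some level 0..U
# (a single particle can hold at most all of the energy).
w = Counter()  # macrostate histogram -> number of compatible microstates w_k
for assignment in product(range(U + 1), repeat=N):
    if sum(assignment) == U:
        histogram = tuple(assignment.count(j) for j in range(U + 1))
        w[histogram] += 1

Omega = sum(w.values())   # total number of microstates at this U
best = max(w, key=w.get)  # most probable macrostate

# Cross-check against the multinomial count N!/prod(N_j!):
for macro, count in w.items():
    denom = 1
    for Nj in macro:
        denom *= factorial(Nj)
    assert count == factorial(N) // denom

print("Omega =", Omega)                    # 462
print("most probable macrostate:", best)   # (3, 1, 1, 1, 0, 0, 0)
print("P_best =", w[best] / Omega)         # 120/462
```

    Note that the most probable macrostate, $\{3,1,1,1,0,\ldots\}$, spreads its particles over several levels, while the extreme macrostates (all particles in one level, or all energy on one particle) have very few compatible microstates.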

    Review 12.6... Take home messages:

    • Method of counting with 1 particle in each of 4 levels:
      • 4 choices for first level, then 3 choices for second level, then 2 choices for third level, then only 1 for the remaining level
      • 4·3·2·1 = 4! = 24 arrangements
    • Macrostate with highest $w_k$ is most probable.
    • Most probable macrostate is the one with the particles most spread out among energy levels.
    • With larger numbers of particles, the probability falls off rapidly away from the most probable macrostate.
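    The 4! counting in the first bullet can be checked directly with the multinomial count $N!/\prod_j N_j!$ for distinguishable particles: the macrostate with one particle in each of four levels has $4! = 24$ microstates, while a macrostate with all four particles in a single level has only 1. A small sketch:

```python
from math import factorial

def multiplicity(occupations):
    """Microstates compatible with a macrostate of distinguishable
    particles: w = N! / prod(N_j!)."""
    N = sum(occupations)
    denom = 1
    for Nj in occupations:
        denom *= factorial(Nj)
    return factorial(N) // denom

print(multiplicity([1, 1, 1, 1]))  # 4! = 24: particles most spread out
print(multiplicity([4, 0, 0, 0]))  # 1: all particles in one level
```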

    A preview of the statistical mechanics meaning of entropy from Richard Feynman:

    So we now have to talk about what we mean by disorder and what we mean by order. …

    Suppose we divide the space into little volume elements. If we have black and white molecules, how many ways could we distribute them among the volume elements so that white is on one side and black is on the other? On the other hand, how many ways could we distribute them with no restriction on which goes where? Clearly, there are many more ways to arrange them in the latter case.

    We measure “disorder” by the number of ways that the insides can be arranged, so that from the outside it looks the same. The logarithm of that number of ways is the entropy. The number of ways in the separated case is less, so the entropy is less, or the “disorder” is less.
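    Feynman's counting can be made concrete with an assumed toy system: say 4 white and 4 black molecules distributed over 8 volume elements, one molecule per element, with molecules of the same color indistinguishable. Confined (whites on the left, blacks on the right) there is only 1 arrangement; unrestricted, there are $\binom{8}{4} = 70$; and the entropy, up to constants, is the logarithm of the number of arrangements:

```python
from math import comb, log

cells = 8   # volume elements (illustrative assumption)
whites = 4  # white molecules; the remaining cells hold black ones

ways_unrestricted = comb(cells, whites)  # whites in any cells: C(8,4) = 70
ways_separated = 1                       # whites confined to the left half

# Entropy ~ log(number of ways); the separated case has lower entropy.
S_mixed = log(ways_unrestricted)
S_separated = log(ways_separated)  # log(1) = 0

print(ways_unrestricted, S_mixed > S_separated)
```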

    Image credits

    Michael Diderich, Charles Cowley, Theresa Knott