Statistical mechanics is the extrapolation of quantitative properties from systems of large numbers of particles (on the order of \(10^{23}\) particles). The general approach is to specify a system in terms of its Hamiltonian, and then to apply a corresponding treatment of the system as a statistical ensemble defined by a specific set of macroscopic properties. Statistical mechanics thus allows us to model macroscopic systems with large numbers of constituents quantitatively, by averaging over microscopic arrangements and describing systems in terms of macroscopic properties. Thermodynamics is a subset of this more general set of tools.
Perhaps the most intuitive thermodynamic function is the internal energy of a system. For a general system, the internal energy is defined as a function of the entropy, volume, and number of particles. Taking the total differential of this then gives the relation
As discussed above, intensive properties are those that characterize not the total system but properties inherent to any (adequately sized) subsystem contained within. As such, they are generally derivatives of state functions with respect to extensive properties of the system.
Extensive properties are those that describe the collected system in its entirety. That is, they are not properties inherent to any subsystem contained within. Rather, they are physical properties that describe the system's overall extent and content. These properties generally include \(U\) (internal energy), \(S\) (entropy), \(V\) (volume), and \(N\) (the number of particles), as well as other definable properties of this nature.
Some other common properties that may be defined for thermodynamic systems are expansion coefficients, compressibilities, and specific heats. Generally, these are specified with respect to some other state variable that does not appear explicitly in the defining relation itself. This can be read as a signal that the named variable is being held constant, but more precisely it specifies that, in the relevant context, we are treating the state function as a function of the referenced variables (i.e. the one being differentiated with respect to and the one held fixed). Defining relations for some of these properties are specified below for convenience.
Specific heats describe the differential relationship between the temperature and the heat absorbed by (or, equivalently, the entropy of) the system.
Compressibilities describe how the volume of a system changes under changes in pressure.
Coefficients of volume expansion describe how volume changes under changes in temperature.
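As a concrete check of these definitions, the sketch below uses `sympy` to evaluate the expansion coefficient \(\alpha = \frac{1}{V}\left(\frac{\partial V}{\partial T}\right)_P\) and the isothermal compressibility \(\kappa_T = -\frac{1}{V}\left(\frac{\partial V}{\partial P}\right)_T\) for the ideal gas \(PV = NkT\); the symbol names are my own choices for illustration.

```python
import sympy as sp

# Symbols: temperature T, pressure P, particle number N, Boltzmann constant k.
T, P, N, k = sp.symbols('T P N k', positive=True)

# Ideal-gas equation of state solved for volume: V = N k T / P.
V = N * k * T / P

# Coefficient of volume expansion: alpha = (1/V) (dV/dT) at constant P.
alpha = sp.simplify(sp.diff(V, T) / V)

# Isothermal compressibility: kappa_T = -(1/V) (dV/dP) at constant T.
kappa_T = sp.simplify(-sp.diff(V, P) / V)

print(alpha)    # 1/T for the ideal gas
print(kappa_T)  # 1/P for the ideal gas
```

The familiar ideal-gas results \(\alpha = 1/T\) and \(\kappa_T = 1/P\) drop out directly, illustrating how the "held constant" variable in each definition is simply the one not differentiated.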
The first law of thermodynamics is a statement of energy conservation. It essentially states that the change in internal energy of a system depends only on the work done on the system and the heat absorbed by the system.
The second law defines the direction of time: it states that the sum of the entropies of initially isolated systems is less than or equal to the entropy of the total system after the systems are allowed to interact over time. Equivalently, the total entropy of an isolated system never decreases.
The third law specifies that the entropy of any system approaches a constant value at zero temperature. Nernst's postulate further states that this constant value is exactly zero, but this has been shown empirically to fail in certain systems, such as supercooled glasses, which retain a residual entropy.
A statistical ensemble, or just ensemble, is a particular approach for applying the methods of statistical mechanics. Effectively, ensembles are distributions in phase space, defined by a set of variables in a particular system completely specified by its Hamiltonian.
To that end, ensembles are predetermined strategies for applying the methods of thermodynamics, and statistical mechanics at large, to systems characterized by a certain set of extensive or intensive properties. They can be thought of as the way in which we may distribute a certain set of properties across all the potential arrangements in phase space that satisfy those properties.
The micro-canonical ensemble can be used to model isolated or completely contained systems. That is, systems whose total energy, number of particles, and volume can, at any instant, be completely specified.
Suppose you are given some macroscopic system, specified by its Hamiltonian, and know the system's energy. From this relation for the energy, and the expectation that it is well defined for all potential arrangements of the system, we may calculate the number of accessible states \(N_{\Gamma_a}\) in the relevant phase space \(\Gamma\) for any total internal energy \(E\).
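A minimal sketch of this counting, using an assumed toy model rather than a real Hamiltonian: take \(N\) two-level units, each with energy \(0\) or \(\epsilon\), so a total energy \(E = n\epsilon\) fixes the number of excited units and the accessible states are counted by a binomial coefficient, from which \(S = k_B \ln \Omega\) follows.

```python
from math import comb, log

# Toy micro-canonical model (an assumption for illustration): N two-level
# units, each in its ground state (energy 0) or excited (energy eps).
# A total energy E = n*eps is realized by any arrangement with exactly
# n excited units, so the accessible states are counted combinatorially.
def accessible_states(N, n):
    """Number of microstates with exactly n of N units excited."""
    return comb(N, n)

def entropy(N, n, kB=1.0):
    """Boltzmann entropy S = k_B ln(Omega), in units where k_B = 1."""
    return kB * log(accessible_states(N, n))

print(accessible_states(100, 30))   # number of accessible microstates
print(entropy(100, 30))             # corresponding entropy
```

The point is only the structure of the calculation: specify the energy, count the arrangements consistent with it, and take the logarithm.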
Since the entropy of the system is the only direct result of this treatment, we may only perform thermodynamics via the entropy in the micro-canonical ensemble. Nevertheless, it is powerful for isolated systems. Relevant relations involving the entropy \(S\) are specified below for convenience.
In the quantum mechanical formalism, our micro-canonical ensemble is described as a density matrix of the form given below:
The canonical ensemble is an approach suited for application to systems distinguishable from their surroundings, but which interact with their surrounding 'environment'. That is, this treatment may be applied to systems with a definite number of particles, volume, and temperature, which may still exchange energy within a larger system. Thus, the energy of the subsystem under analysis is not entirely well-defined at an arbitrary time, and it is necessary to specify, or depend on, an intensive property of the system when defining it; hence the relevance of temperature.
The general approach to modeling systems in the canonical ensemble is to solve for the particular partition function \(Z\) of the system, from which we may perform statistical mechanics. If one wishes to model such a system thermodynamically, the general approach then is to derive the Helmholtz free energy from \(Z\) and derive other thermodynamical properties from there.
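This workflow can be sketched numerically for an assumed two-level system (the energies and units below, with \(k_B = 1\), are illustrative): build \(Z\), obtain the Helmholtz free energy \(F = -\frac{1}{\beta}\ln Z\), and compute ensemble averages as Boltzmann-weighted sums.

```python
import numpy as np

# Canonical-ensemble sketch for an assumed discrete spectrum (k_B = 1).
def partition_function(energies, beta):
    """Z = sum_i exp(-beta * E_i)."""
    return np.sum(np.exp(-beta * np.asarray(energies)))

def helmholtz_free_energy(energies, beta):
    """F = -(1/beta) * ln(Z)."""
    return -np.log(partition_function(energies, beta)) / beta

def average_energy(energies, beta):
    """<E> = sum_i E_i exp(-beta * E_i) / Z."""
    E = np.asarray(energies)
    weights = np.exp(-beta * E)
    return np.sum(E * weights) / np.sum(weights)

levels = [0.0, 1.0]   # toy two-level system with gap eps = 1
beta = 2.0            # inverse temperature
print(helmholtz_free_energy(levels, beta))
print(average_energy(levels, beta))
```

A useful sanity check is that the Boltzmann-weighted average reproduces \(\bar{E} = -\partial \ln Z / \partial \beta\), which can be verified by finite differences on `partition_function`.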
From the partition function, we may define the average value of energy (denoted by a bar above the operator) via
Since the Helmholtz free energy is the function of state relevant to treatments in the canonical ensemble, relevant relations are specified below.
In the quantum mechanical formalism, our canonical ensemble is described by a density matrix of the form given below:
The grand canonical ensemble is an approach to statistical mechanics apt for subsystems which further exchange particles with their surrounding environment, but are chemically indistinguishable from the elements of that environment. Essentially, we replace dependence on the number of particles \(N\) with dependence on an inherent chemical property of these particles, namely their chemical potential \(\mu\).
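A minimal sketch of this replacement, for an assumed toy system of a single fermionic orbital at energy \(\epsilon\) (with \(k_B = 1\)): the orbital holds 0 or 1 particles, so the grand partition function has exactly two terms, and the mean occupation follows from \(\langle n \rangle = \frac{1}{\beta}\frac{\partial \ln \mathcal{Z}}{\partial \mu}\).

```python
import numpy as np

# Grand-canonical sketch for a single fermionic orbital of energy eps
# (an illustrative toy model; k_B = 1). Occupation is 0 or 1, so the
# grand partition function is a two-term sum over particle number.
def grand_partition(eps, mu, beta):
    """Z_gc = 1 + exp(-beta * (eps - mu))."""
    return 1.0 + np.exp(-beta * (eps - mu))

def mean_occupation(eps, mu, beta):
    """<n> = (1/beta) d ln(Z_gc)/d mu, which evaluates analytically to
    the Fermi-Dirac form 1 / (exp(beta * (eps - mu)) + 1)."""
    return 1.0 / (np.exp(beta * (eps - mu)) + 1.0)

print(grand_partition(1.0, 0.0, 1.0))
print(mean_occupation(1.0, 0.0, 1.0))
```

Note how the particle number is no longer specified; instead \(\mu\) controls how readily the subsystem draws particles from its environment.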
The general approach in the grand canonical ensemble is to solve for the particular partition function \(\mathcal{Z}\) of the system given by the relation above. From here the grand potential of the system is easily found via
From the grand-canonical partition function, we may define the average value of an arbitrary operator via
The grand potential \(\Omega\) is the function of state relevant to the grand-canonical ensemble, and as such, relevant relations are specified below.
Several probability distributions are particularly relevant for statistical mechanics. These generally include: the Maxwell-Boltzmann Distribution, relevant for classical thermal applications; the Fermi-Dirac Distribution, applicable to Fermionic quantum systems; and the Bose-Einstein Distribution, related to Bosonic quantum systems.
The Maxwell-Boltzmann distribution describes the thermal distribution of the velocities of classical gases. Note that it only applies in thermal equilibrium and to gases that are not exhibiting some macroscopic flow.
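A quick Monte-Carlo sketch of this distribution (the parameter values are illustrative, with \(k_B = 1\)): each Cartesian velocity component of an equilibrium ideal gas is an independent Gaussian of variance \(k_B T / m\), so sampled speeds should reproduce the Maxwell-Boltzmann mean speed \(\langle v \rangle = \sqrt{8 k_B T / (\pi m)}\).

```python
import numpy as np

# Monte-Carlo check of the Maxwell-Boltzmann speed distribution
# (illustrative values; units chosen so that k_B = 1).
rng = np.random.default_rng(0)
m, T, kB = 2.0, 1.5, 1.0

# Draw 3D velocity vectors for many particles; each component is an
# independent Gaussian with variance k_B * T / m.
v = rng.normal(0.0, np.sqrt(kB * T / m), size=(1_000_000, 3))
speeds = np.linalg.norm(v, axis=1)

# Theoretical mean speed: <v> = sqrt(8 k_B T / (pi m)).
mean_theory = np.sqrt(8 * kB * T / (np.pi * m))
print(speeds.mean(), mean_theory)
```

The sampled mean converges to the theoretical value, consistent with the equilibrium, no-flow assumption noted above.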
Fermi-Dirac distributions are used to describe the probability of energy states being occupied in thermal Fermionic systems. Fermions obey the Pauli exclusion principle, as they have half-integer spin, meaning no two identical fermions may occupy the same quantum state.
In the context of Fermi-Dirac statistics, we often come across integrals of the form:
Bose-Einstein distributions are used to describe the occupation of states of thermal Bosonic systems, which, in contrast to Fermions, can have several quanta occupying the same state.
Moments may be calculated via
The cumulant generating function may be evaluated as
Gaussian integral:
Stirling's Approximation (Large N):
Geometric Series:
Binomial Expansion:
Volume of n-ball with radius R: