# Molecular Simulation/Statistical properties

Statistical thermodynamics describes the physical properties of matter in terms of probability distributions.

## Probability Distributions

A probability distribution is a function that gives the likelihood of each possible outcome. In statistical mechanics, this generally means the probability of a system being in a particular state.

### Gaussian Distribution

A Gaussian distribution, also called a normal distribution, is a function with a bell-shaped curve. Any random variable with the probability density

${\displaystyle p(x)={\frac {1}{\sqrt {2\pi \sigma ^{2}}}}\;e^{-{\frac {(x-\alpha )^{2}}{2\sigma ^{2}}}}}$

is called a normal variable with mean ${\displaystyle \alpha }$ and variance ${\displaystyle \sigma ^{2}}$ [1].
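The density above can be evaluated directly. The following is a minimal Python sketch (the function name and the integration interval are illustrative choices, not part of the text) that evaluates the normal density and numerically checks that it integrates to 1:

```python
import math

def gaussian_pdf(x, mean=0.0, sigma=1.0):
    """Normal probability density with the given mean and variance sigma**2."""
    return math.exp(-(x - mean) ** 2 / (2 * sigma ** 2)) / math.sqrt(2 * math.pi * sigma ** 2)

# Crude rectangle-rule check that the density integrates to 1 over [-8, 8],
# where essentially all of the probability mass lies.
dx = 0.01
total = sum(gaussian_pdf(-8 + i * dx) for i in range(1601)) * dx
print(round(total, 4))  # ≈ 1.0
```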

The Boltzmann distribution is one such probability distribution; it gives the probability of a system occupying a state as a function of that state's energy ${\displaystyle E}$ and the temperature of the system ${\displaystyle T}$.

${\displaystyle p_{i}={\frac {e^{-{\varepsilon }_{i}/kT}}{\sum _{j=1}^{M}{e^{-{\varepsilon }_{j}/kT}}}}}$
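This normalization can be sketched in a few lines of Python. The energy levels below are made-up values chosen to be on the order of ${\displaystyle kT}$ at 300 K; only the formula itself comes from the text:

```python
import math

k_B = 1.380649e-23  # Boltzmann constant, J/K

def boltzmann_probabilities(energies, temperature):
    """Return p_i = exp(-E_i/kT) / sum_j exp(-E_j/kT) for each energy level."""
    weights = [math.exp(-e / (k_B * temperature)) for e in energies]
    z = sum(weights)  # the partition function (the denominator)
    return [w / z for w in weights]

# Illustrative, made-up level spacing of roughly kT at 300 K:
levels = [0.0, 4.14e-21, 8.28e-21]  # joules
probs = boltzmann_probabilities(levels, 300.0)
print([round(p, 3) for p in probs])
```

Lower-energy states receive exponentially larger weights, and the probabilities sum to 1 by construction.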

As shown below, the physical properties of a system can be calculated using the Boltzmann distribution.

### Conformational Distribution

Within solids, liquids, and gases, atoms can take on different orientations and arrangements. Conformational changes within a molecule occur by rotations around bonds. For example, gauche and eclipsed conformations, trans and cis conformations, and every arrangement in between all fall under the distribution of conformations.

### Calculating Physical Properties

Macroscopic properties are the average over the arrangements a system takes on over time. Assuming all possible energy levels are continuous, we can employ classical mechanics to calculate physical properties. This assumption is only valid when particles are heavy and forces are relatively soft.

## Averages

The ensemble average in statistical mechanics refers to the mean of all possible states of a system, as given by its probability distribution. It is dependent on the ensemble chosen (e.g., canonical, microcanonical, etc.).

The expectation value ${\displaystyle \langle M\rangle }$ gives the average value obtained when a physical property of the system is measured. Depending on the type of distribution, it may not be the most probable value, but it is the probability-weighted value that we expect to measure. In a classical system, the expectation value is an integral over all possible configurations, taken over the interval ${\displaystyle [x_{i},x_{f}]}$.

${\displaystyle \langle M\rangle ={\frac {\int _{x_{i}}^{x_{f}}M(x)\exp \left({\frac {-{\mathcal {V}}(x)}{k_{B}T}}\right)\,dx}{\int _{x_{i}}^{x_{f}}\exp \left({\frac {-{\mathcal {V}}(x)}{k_{B}T}}\right)\,dx}}}$

where ${\displaystyle {M}(x)}$ represents the property being calculated, ${\displaystyle {\mathcal {V}}(x)}$ represents the potential energy, ${\displaystyle k_{B}}$ represents the Boltzmann constant, and ${\displaystyle {T}}$ represents the temperature of the system.
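This one-dimensional average can be evaluated numerically. The sketch below (the function names and the choice of a harmonic potential in reduced units are assumptions for illustration) computes ${\displaystyle \langle x^{2}\rangle }$ for ${\displaystyle {\mathcal {V}}(x)=x^{2}/2}$ with ${\displaystyle k_{B}T=1}$, where equipartition predicts ${\displaystyle \langle x^{2}\rangle =k_{B}T}$:

```python
import math

def expectation(M, V, kT=1.0, x_min=-10.0, x_max=10.0, n=20001):
    """Boltzmann-weighted average of M(x) on [x_min, x_max], rectangle rule."""
    dx = (x_max - x_min) / (n - 1)
    num = den = 0.0
    for i in range(n):
        x = x_min + i * dx
        w = math.exp(-V(x) / kT)   # Boltzmann weight of configuration x
        num += M(x) * w * dx       # numerator integrand M(x) exp(-V/kT)
        den += w * dx              # denominator: the configurational integral
    return num / den

# Harmonic potential V(x) = x**2 / 2 in reduced units (k_B*T = 1):
mean_sq = expectation(lambda x: x * x, lambda x: 0.5 * x * x)
print(round(mean_sq, 4))  # ≈ 1.0, as equipartition predicts
```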

In classical statistical mechanics, the ensemble average is the normalized Boltzmann-weighted integral over all of phase space.

${\displaystyle \langle M\rangle ={\frac {\int _{-\infty }^{\infty }\int _{-\infty }^{\infty }\int _{-\infty }^{\infty }\int _{0}^{L}\int _{0}^{L}\int _{0}^{L}M(x,y,z,p_{x},p_{y},p_{z})\,e^{\frac {-{\mathcal {H}}(x,y,z,p_{x},p_{y},p_{z})}{k_{B}T}}\,dx\,dy\,dz\,dp_{x}\,dp_{y}\,dp_{z}}{\int _{-\infty }^{\infty }\int _{-\infty }^{\infty }\int _{-\infty }^{\infty }\int _{0}^{L}\int _{0}^{L}\int _{0}^{L}e^{\frac {-{\mathcal {H}}(x,y,z,p_{x},p_{y},p_{z})}{k_{B}T}}\,dx\,dy\,dz\,dp_{x}\,dp_{y}\,dp_{z}}}}$

where ${\displaystyle {\mathcal {H}}}$ is the Hamiltonian of the system. This expression can be used to find many physical properties, such as the average energy of single-particle and many-body systems, by integrating over the spatial coordinates (${\displaystyle dx\,dy\,dz}$) and over the Maxwell distribution of momenta (${\displaystyle dp_{x}\,dp_{y}\,dp_{z}}$).
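The momentum part of this average can be sampled directly, since under the Maxwell distribution each momentum component is Gaussian with variance ${\displaystyle mk_{B}T}$. The sketch below, in reduced units (an assumption for illustration, with ${\displaystyle m=k_{B}T=1}$), estimates the average kinetic energy of a free particle by Monte Carlo sampling, which equipartition predicts to be ${\displaystyle {\tfrac {3}{2}}k_{B}T}$:

```python
import math
import random

random.seed(0)

m, kT = 1.0, 1.0           # reduced units
sigma = math.sqrt(m * kT)  # each momentum component is Gaussian, variance m*k_B*T

# Sample momenta from the Maxwell distribution and average the kinetic energy.
n = 200000
avg_ke = sum(
    (random.gauss(0, sigma) ** 2
     + random.gauss(0, sigma) ** 2
     + random.gauss(0, sigma) ** 2) / (2 * m)
    for _ in range(n)
) / n
print(avg_ke)  # close to 1.5, i.e. (3/2) k_B T
```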

In a quantum mechanical system, the expectation value is the Boltzmann-weighted sum over discrete energy levels.

${\displaystyle \langle M\rangle ={\frac {\sum _{i}g_{i}M_{i}e^{\frac {-E_{i}}{k_{B}T}}}{\sum _{j}g_{j}e^{\frac {-E_{j}}{k_{B}T}}}}}$
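The sum above translates directly into code. The following sketch (the two-level system and its numbers are illustrative assumptions) evaluates the degeneracy-weighted thermal average in reduced units:

```python
import math

def thermal_average(values, energies, degeneracies, kT):
    """<M> = sum_i g_i M_i exp(-E_i/kT) / sum_j g_j exp(-E_j/kT)."""
    weights = [g * math.exp(-e / kT) for g, e in zip(degeneracies, energies)]
    return sum(m * w for m, w in zip(values, weights)) / sum(weights)

# Illustrative two-level system: a ground state (g=1, E=0, M=0) and a
# doubly degenerate excited state (g=2, E=1, M=1), with k_B*T = 1.
avg = thermal_average([0.0, 1.0], [0.0, 1.0], [1, 2], kT=1.0)
print(round(avg, 4))
```

At low temperature the average tends to the ground-state value; at high temperature the degeneracies dominate the weighting.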

### Example: Conformationally Averaged Dipole Moment

Conformational averaging with the Boltzmann distribution gives a way of finding the average dipole moment, as described in the previous section.

The dipole moment of a molecule changes with its conformation because it is the vector sum of all the bond moments. For example, in 1,2-dichloroethane the trans conformation is non-polar while the cis conformations are polar. Since the molecular dipole is a vector quantity, the conformationally averaged dipole moment is taken as the square root of the Boltzmann-weighted average of the squared dipole moments of the individual conformations.

${\displaystyle \mu _{average}={\sqrt {\mu ^{2}}}={\sqrt {\frac {\sum _{i}g_{i}\mu _{i}^{2}e^{\frac {-\nu _{i}}{k_{B}T}}}{\sum _{j}g_{j}e^{\frac {-\nu _{j}}{k_{B}T}}}}}}$

${\displaystyle {\sqrt {\mu ^{2}}}={\sqrt {\frac {\mu _{trans}^{2}e^{\frac {-\nu _{trans}}{k_{B}T}}+\mu _{cis+}^{2}e^{\frac {-\nu _{cis}}{k_{B}T}}+\mu _{cis-}^{2}e^{\frac {-\nu _{cis}}{k_{B}T}}}{e^{\frac {-\nu _{trans}}{k_{B}T}}+e^{\frac {-\nu _{cis}}{k_{B}T}}+e^{\frac {-\nu _{cis}}{k_{B}T}}}}}}$

Using this formula, the conformationally averaged dipole moment can be calculated for a molecule like 1,2-dichloroethane, where the trans conformation has no dipole moment and the cis conformations do.
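A numerical sketch of this calculation follows. The dipole moment (3.0 D) and the energy gap (4.0 kJ/mol) assigned to the cis conformers are illustrative placeholders, not experimental values for 1,2-dichloroethane; only the structure of the average comes from the formula above:

```python
import math

R = 8.314   # gas constant, J/(mol K) -- energies below are per mole
T = 298.0   # kelvin

# Illustrative (not experimental) numbers: the trans conformer is non-polar,
# and the two cis conformers each carry an assumed 3.0 D dipole and lie an
# assumed 4.0 kJ/mol above trans.
mu_trans, mu_cis = 0.0, 3.0   # debye
e_trans, e_cis = 0.0, 4000.0  # J/mol

w_trans = math.exp(-e_trans / (R * T))
w_cis = math.exp(-e_cis / (R * T))
# Two equivalent cis conformers (cis+ and cis-) contribute identically.
mu_avg = math.sqrt(
    (mu_trans**2 * w_trans + 2 * mu_cis**2 * w_cis) / (w_trans + 2 * w_cis)
)
print(round(mu_avg, 3))  # root-mean-square dipole in debye
```

The result lies between the trans and cis values, weighted toward the lower-energy trans conformer.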

## Variance

Variance is a value that indicates how widely spread a set of data is relative to its average. If the variance (or standard deviation) is low, the data lie close to the expected value. The variance of a random variable ${\displaystyle {X}}$ is the expected value of the square of the deviation from the mean, where ${\displaystyle \operatorname {E} }$ represents the expectation and ${\displaystyle \mu }$ represents the mean. Variance is denoted ${\displaystyle \operatorname {Var} (X)}$ or ${\displaystyle \sigma ^{2}}$; the standard deviation ${\displaystyle \sigma }$ is its square root.

${\displaystyle \operatorname {Var} (X)=\operatorname {E} \left[(X-\mu )^{2}\right]}$
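The definition can be applied directly to a small data set. The sketch below (the sample values are arbitrary, chosen for illustration) computes the population variance from the mean:

```python
# Population variance computed directly from the definition
# Var(X) = E[(X - mu)**2], for an arbitrary illustrative data set.
data = [2.0, 4.0, 4.0, 4.0, 5.0, 5.0, 7.0, 9.0]
mu = sum(data) / len(data)
var = sum((x - mu) ** 2 for x in data) / len(data)
print(mu, var)  # mean 5.0, variance 4.0
```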

## References

1. Rozanov, Y. A. (2015) *Probability Theory: A Concise Course*, Dover Publications, USA
2. McQuarrie, D. A. (2000) *Statistical Mechanics*, University Science Books, California