# Artificial Neural Networks/Hopfield Networks

## Hopfield Networks

Hopfield networks are among the oldest and simplest neural networks. A Hopfield network is a recurrent network of fully interconnected units characterized by a network energy function. The activation function of a binary Hopfield network is the signum of a biased weighted sum:

${\displaystyle y_{i}=\operatorname {sgn}(\zeta _{i}-\theta _{i}),\qquad \zeta _{i}=\sum _{j}w_{ij}y_{j}}$

where ${\displaystyle w_{ij}}$ is the weight of the connection from unit j to unit i and ${\displaystyle \theta _{i}}$ is the threshold of unit i.

Hopfield networks are frequently binary-valued (each unit taking the value +1 or −1), although continuous variants do exist. Binary networks are useful for classification and clustering purposes.
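The update rule above can be sketched in code. This is a minimal illustration assuming bipolar units (±1), a symmetric weight matrix `W`, and a threshold vector `theta`; all names are illustrative, not part of any standard API:

```python
import numpy as np

def update_unit(W, theta, y, i):
    """Asynchronously update unit i using y_i = sgn(zeta_i - theta_i),
    where zeta_i = sum_j w_ij * y_j is the weighted input to unit i."""
    zeta = W[i] @ y
    y[i] = 1 if zeta - theta[i] >= 0 else -1  # ties map to +1 by convention
    return y

# Example: two mutually excitatory units pull each other to the same state.
W = np.array([[0.0, 1.0],
              [1.0, 0.0]])   # symmetric weights, zero self-connections
theta = np.zeros(2)
y = np.array([-1, 1])
update_unit(W, theta, y, 0)  # unit 0 flips to +1 to agree with unit 1
```

Units are conventionally updated one at a time (asynchronously); the weight matrix is symmetric with a zero diagonal, which is what guarantees the energy behavior described below.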

## Energy Function

The energy function for the network is given as:

${\displaystyle E=-{\frac {1}{2}}\sum _{i}\sum _{j}w_{ij}y_{i}y_{j}}$

Here, ${\displaystyle y_{i}}$ and ${\displaystyle y_{j}}$ are the outputs of the ith and jth units. As the network runs, updating one unit at a time, the energy decreases until it reaches a local minimum. Such a minimum is known as an attractor of the network. Because a Hopfield network minimizes its own energy as it evolves, mathematical minimization or optimization problems can be solved automatically by the network if the problem can be formulated in terms of the network energy.
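The energy-descent property can be checked numerically. The sketch below (names illustrative) builds a random symmetric weight matrix with zero diagonal and verifies that asynchronous sign updates never increase E:

```python
import numpy as np

def energy(W, y):
    """Network energy E = -1/2 * sum_i sum_j w_ij * y_i * y_j."""
    return -0.5 * y @ W @ y

rng = np.random.default_rng(0)
n = 8
A = rng.normal(size=(n, n))
W = (A + A.T) / 2         # symmetric weights ...
np.fill_diagonal(W, 0.0)  # ... with no self-connections

y = rng.choice([-1.0, 1.0], size=n)
e_start = energy(W, y)
for _ in range(100):
    i = rng.integers(n)
    e_before = energy(W, y)
    y[i] = 1.0 if W[i] @ y >= 0.0 else -1.0  # asynchronous sign update
    assert energy(W, y) <= e_before + 1e-12  # energy never increases
```

Symmetry and the zero diagonal are what make this hold: the field seen by unit i does not depend on its own state, so setting each unit to the sign of its field can only lower (or preserve) E.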

## Associative Memory

Hopfield networks can be used as associative (content-addressable) memories for data storage purposes. Each attractor represents a different stored data pattern, and presenting a noisy or partial version of a pattern drives the network toward the corresponding attractor, retrieving the stored pattern. The number of distinct patterns p that can be reliably stored in such a network is given approximately as:

${\displaystyle p_{\max }\approx 0.15n}$

where n is the number of neurons in the network.
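As an illustrative sketch of storage and recall: the text above does not specify a learning rule, so this example assumes the standard Hebbian prescription ${\displaystyle w_{ij}={\tfrac {1}{n}}\sum _{p}x_{i}^{p}x_{j}^{p}}$; all function names are hypothetical.

```python
import numpy as np

def store(patterns):
    """Hebbian storage: w_ij = (1/n) * sum over patterns of x_i * x_j."""
    X = np.asarray(patterns, dtype=float)
    n = X.shape[1]
    W = X.T @ X / n
    np.fill_diagonal(W, 0.0)  # no self-connections
    return W

def recall(W, cue, sweeps=5):
    """Sequentially update every unit for a few sweeps over the network."""
    y = np.asarray(cue, dtype=float).copy()
    for _ in range(sweeps):
        for i in range(len(y)):
            y[i] = 1.0 if W[i] @ y >= 0.0 else -1.0
    return y

# Store two orthogonal 16-unit patterns and recall one from a corrupted cue.
n = 16
x1 = np.ones(n)
x2 = np.array([1.0, -1.0] * (n // 2))
W = store([x1, x2])

cue = x1.copy()
cue[0] = -cue[0]  # corrupt two bits of the stored pattern
cue[3] = -cue[3]
restored = recall(W, cue)  # the network settles back to x1
```

With n = 16 units, the capacity estimate above gives roughly 0.15 × 16 ≈ 2 patterns, so storing two here stays within the reliable-recall regime.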