# Artificial Neural Networks/Hebbian Learning

## Hebbian Learning

Hebbian learning is one of the oldest learning algorithms, and is based in large part on the dynamics of biological systems. A synapse between two neurons is strengthened when the neurons on either side of the synapse (input and output) have highly correlated outputs. In essence, when an input neuron fires, if it frequently leads to the firing of the output neuron, the synapse is strengthened. By analogy, in an artificial system the tap weight between two sequential neurons is increased when their outputs are highly correlated.

## Mathematical Formulation

Mathematically, we can describe Hebbian learning as:

$$w_{ij}[n+1] = w_{ij}[n] + \eta\, x_{i}[n]\, x_{j}[n]$$

Here, $\eta$ is a learning rate coefficient, $w_{ij}$ is the tap weight connecting the $i$th and $j$th elements, and $x_{i}[n]$ and $x_{j}[n]$ are the outputs of those elements at step $n$. The weight grows whenever the two outputs are simultaneously large, which is how the rule captures correlation.
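As a minimal sketch, the update rule above can be written in NumPy, applying one Hebbian step to a whole weight matrix at once. The function name and learning rate below are illustrative, not part of any standard API:

```python
import numpy as np

def hebbian_update(w, x, eta=0.1):
    """One Hebbian step: w_ij <- w_ij + eta * x_i * x_j.

    The outer product computes eta * x_i * x_j for every pair (i, j),
    so correlated outputs strengthen their connecting weight.
    """
    return w + eta * np.outer(x, x)

# Example: two elements that fire together reinforce their connection.
w = np.zeros((2, 2))       # initial tap weights
x = np.array([1.0, 1.0])   # both elements output 1 at this step
w = hebbian_update(w, x, eta=0.5)
# every w_ij is now 0.5, since x_i * x_j = 1 for all pairs
```

Note that this plain rule only ever increases weights when outputs are positively correlated; repeated updates let the weights grow without bound, which is one reason practical variants add normalization or decay.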

## Plausibility

The Hebbian learning algorithm is performed locally, and doesn't take into account the overall system input-output characteristic. This locality makes it a plausible theory for biological learning, and also makes Hebbian learning processes well suited to VLSI hardware implementations, where local signals are easier to obtain than global ones.