Artificial Neural Networks/Hebbian Learning

Hebbian Learning

Hebbian learning is one of the oldest learning algorithms, and is based in large part on the dynamics of biological systems. A synapse between two neurons is strengthened when the neurons on either side of the synapse (input and output) have highly correlated outputs: when the firing of the input neuron frequently leads to the firing of the output neuron, the synapse is strengthened. Carrying the analogy over to an artificial system, the tap weight between two sequential neurons is increased when their outputs are highly correlated.

Mathematical Formulation

Mathematically, we can describe Hebbian learning as:

w_{ij}[n+1] = w_{ij}[n] + \eta \, x_i[n] \, x_j[n]

Here, η is a learning rate coefficient, and x_i[n] and x_j[n] are, respectively, the outputs of the ith and jth elements at time step n.
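To make the update concrete, the following is a minimal sketch in Python. The function name hebbian_update, the learning rate of 0.1, and the toy firing pattern are illustrative assumptions rather than part of the text; only the update rule itself comes from the formula above.

```python
import numpy as np

def hebbian_update(w, x_i, x_j, eta=0.1):
    # One Hebbian step: the weight change is proportional to the
    # product (correlation) of the two units' outputs at this step.
    # eta=0.1 is an assumed learning rate for illustration.
    return w + eta * x_i * x_j

# Toy run (assumed scenario): an output unit that fires whenever the
# input unit does, i.e. perfectly correlated outputs.
rng = np.random.default_rng(0)
w = 0.0
for n in range(20):
    x_i = float(rng.integers(0, 2))  # input unit's output at step n (0 or 1)
    x_j = x_i                        # output unit fires with the input
    w = hebbian_update(w, x_i, x_j)

print(w)  # the tap weight has grown once per correlated firing
```

Each step on which both units fire adds η · 1 · 1 = 0.1 to the weight, so the tap weight grows with the correlation between the two units, exactly as the rule prescribes.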

Plausibility

The Hebbian learning algorithm is performed locally: each weight update depends only on the outputs of the two neurons adjacent to the synapse, not on the overall input-output characteristic of the system. This locality makes it a plausible theory for biological learning, and also makes Hebbian learning well suited to VLSI hardware implementations, where local signals are easier to obtain than global ones.