Differentiable plasticity: training plastic neural networks with backpropagation
Goal
To build networks that are plastic: able to learn quickly and efficiently from experience, inspired by synaptic plasticity. The aim is to bridge the gap with biological agents, which learn quickly from prior experience and can master environments with changing features.
Plasticity offers an alternative to meta-learning: synaptic plasticity strengthens and weakens connections between neurons based on neural activity, i.e. whether they fire together.
Plasticity has traditionally been explored with evolutionary algorithms; differentiable plasticity allows such plasticity rules to be learned via backpropagation.
Key Idea
We include an additional plastic component for each connection, alongside a fixed component. The fixed part contains the regular neuronal weights $w_{i,j}$; the plastic part is a Hebbian trace $\mathrm{Hebb}_{i,j}(t)$ scaled by a learned plasticity coefficient $\alpha_{i,j}$:

$$x_j(t) = \sigma\Big(\sum_{i \in \mathrm{inputs}} \big[\,w_{i,j} + \alpha_{i,j}\,\mathrm{Hebb}_{i,j}(t)\,\big]\,x_i(t-1)\Big),$$

where $\sigma$ is a nonlinearity (e.g. tanh) and $x_i(t-1)$ is the activity of input neuron $i$ at the previous step.
The Hebbian trace is updated based on Hebbian dynamics, as a running average of the product of pre- and post-synaptic activity:

$$\mathrm{Hebb}_{i,j}(t+1) = \eta\,x_i(t-1)\,x_j(t) + (1-\eta)\,\mathrm{Hebb}_{i,j}(t),$$

where $\eta$ is the learning rate of plasticity.
The Hebbian trace is initialized to zero at the beginning of each episode and is purely a within-lifetime quantity; the structural parameters $w_{i,j}$, $\alpha_{i,j}$, and $\eta$ are optimized by backpropagation across episodes.
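A minimal PyTorch sketch of a single plastic layer may make this concrete. The names (`PlasticLayer`), the layer size, the initialization scales, and the tanh nonlinearity are illustrative assumptions, not the paper's released code:

```python
import torch

class PlasticLayer(torch.nn.Module):
    # Sketch of a differentiable-plasticity layer: each connection has a
    # fixed weight w and a plasticity coefficient alpha; eta is a shared
    # plastic learning rate. All three are trained by backpropagation.
    def __init__(self, size):
        super().__init__()
        self.w = torch.nn.Parameter(0.01 * torch.randn(size, size))
        self.alpha = torch.nn.Parameter(0.01 * torch.randn(size, size))
        self.eta = torch.nn.Parameter(torch.tensor(0.01))

    def forward(self, x_prev, hebb):
        # Effective weight = fixed part + plastic part (Hebbian trace)
        x = torch.tanh(x_prev @ (self.w + self.alpha * hebb))
        # Hebbian dynamics: running average of pre/post co-activity
        hebb = self.eta * torch.outer(x_prev, x) + (1 - self.eta) * hebb
        return x, hebb

layer = PlasticLayer(size=50)
hebb = torch.zeros(50, 50)    # trace reset to zero at episode start
x = torch.randn(50)
for _ in range(20):           # one episode: trace evolves within lifetime
    x, hebb = layer(x, hebb)
x.pow(2).sum().backward()     # gradients flow to w, alpha, and eta
```

Note that the trace `hebb` is carried through the loop but never stored as a parameter: it is recomputed every episode, while gradients from the episode's loss flow through it into the structural parameters.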