Adriano Barra
Neural Computation (2023) 35 (5): 930–957.
Published: 18 April 2023
Abstract
Hebb's learning traces its origins to Pavlov's classical conditioning; however, while the former has been extensively modeled over the past decades (e.g., by the Hopfield model and countless variations on the theme), the latter has remained largely unaddressed by modeling so far. Furthermore, a mathematical bridge connecting these two pillars is entirely lacking. The main difficulty toward this goal lies in the intrinsically different scales of the information involved: Pavlov's theory concerns correlations between concepts that are (dynamically) stored in the synaptic matrix, as exemplified by the celebrated experiment starring a dog and a ringing bell; conversely, Hebb's theory concerns correlations between pairs of neurons, as summarized by the famous statement that neurons that fire together wire together. In this letter, we rely on stochastic process theory to prove that as long as the timescales of neurons and synapses are kept widely separated, Pavlov's mechanism spontaneously takes place and ultimately gives rise to synaptic weights that recover the Hebbian kernel.
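The timescale-separation argument can be illustrated numerically. Below is a minimal sketch, not the authors' code: fast Glauber dynamics for binary neurons driven by two jointly presented stimuli (the "bell" and "food" of Pavlov's experiment), coupled to a slow synaptic relaxation toward the instantaneous pairwise correlations. All names and parameter values (N, beta, eps, the pattern construction) are illustrative assumptions; the point is only that, with eps << 1, the slow synapses average the fast neural correlations and end up aligned with a Hebbian kernel built from the conditioned patterns.

```python
# Minimal sketch (illustrative, not the authors' implementation):
# two-timescale dynamics in which slow synapses integrate fast neural
# correlations and converge toward a Hebbian kernel J_ij ~ <sigma_i sigma_j>.
import numpy as np

rng = np.random.default_rng(0)

N = 20           # number of neurons (illustrative)
beta = 2.0       # inverse temperature of the fast Glauber dynamics
eps = 1e-3       # synaptic rate; eps << 1 enforces the timescale split
steps = 200_000

# Two binary "concepts" (bell and food) presented jointly, as in
# Pavlov's conditioning experiment.
xi_bell = rng.choice([-1, 1], size=N)
xi_food = rng.choice([-1, 1], size=N)

sigma = rng.choice([-1, 1], size=N)   # fast neural state
J = np.zeros((N, N))                  # slow synaptic matrix, initially empty

for t in range(steps):
    # Fast dynamics: one Glauber spin flip per step, with an external
    # field that presents both stimuli together (conditioning phase).
    h = J @ sigma + xi_bell + xi_food
    i = rng.integers(N)
    p_up = 1.0 / (1.0 + np.exp(-2.0 * beta * h[i]))
    sigma[i] = 1 if rng.random() < p_up else -1

    # Slow dynamics: synapses relax toward the instantaneous correlation
    # sigma_i * sigma_j; since eps << 1, J effectively sees the time
    # average of the fast dynamics.
    J += eps * (np.outer(sigma, sigma) - J)
    np.fill_diagonal(J, 0.0)

# After conditioning, J should align with the Hebbian kernel built from
# the jointly presented patterns (including the cross terms that let one
# concept retrieve the other).
hebb = (np.outer(xi_bell, xi_bell) + np.outer(xi_food, xi_food)
        + np.outer(xi_bell, xi_food) + np.outer(xi_food, xi_bell)) / N
np.fill_diagonal(hebb, 0.0)
cos = np.sum(J * hebb) / (np.linalg.norm(J) * np.linalg.norm(hebb))
print(f"cosine similarity between learned J and Hebbian kernel: {cos:.3f}")
```

The cross terms xi_bell * xi_food in the comparison kernel encode the Pavlovian association itself: after joint presentation, a field along the bell pattern alone pulls the network toward the food pattern as well, which is the sense in which the slow synaptic dynamics recover Hebb's rule from Pavlov's protocol.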