Physiological experiments have shown that the dendrites of biological neurons can nonlinearly process distributed synaptic inputs. However, it is unclear how properties of a dendritic tree, such as its branched morphology or the repetition of synaptic inputs across branches, shape neural computation beyond this apparent nonlinearity. Here we use a simple model in which the dendrite is implemented as a sequence of thresholded linear units. We manipulate the architecture of this model to investigate the effects of binary branching constraints and of repeated synaptic inputs on neural computation. We find that models with these manipulations can perform well on machine learning tasks such as Fashion-MNIST and Extended MNIST. Performance on these tasks is limited by binary tree branching and dendritic asymmetry and is improved by repeating synaptic inputs to different dendritic branches. These computational experiments extend neuroscience theory on how different dendritic properties might determine a neuron's computation of clearly defined tasks.
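The model described above — a dendrite as a binary tree of thresholded linear units, with synaptic inputs at the leaves and a somatic output at the root — could be sketched roughly as follows. This is a minimal illustration, not the paper's implementation: the choice of ReLU as the threshold nonlinearity, the pairwise grouping of inputs, and the random weights are all assumptions made for the sketch.

```python
import numpy as np

def relu(x):
    """Thresholded linear activation (assumed here; the paper may use a leaky variant)."""
    return np.maximum(0.0, x)

def binary_dendrite_forward(x, weights, biases):
    """Forward pass through a balanced binary dendritic tree.

    x: input vector whose length is a power of two (one synapse per leaf).
    weights, biases: one (2,)-weight vector and one bias per internal
    subunit, stored level by level from the leaves toward the soma.
    """
    layer = x
    idx = 0
    while layer.size > 1:
        nxt = np.empty(layer.size // 2)
        for j in range(layer.size // 2):
            # Each dendritic subunit is a thresholded linear unit
            # over its two children, enforcing binary branching.
            nxt[j] = relu(weights[idx] @ layer[2 * j : 2 * j + 2] + biases[idx])
            idx += 1
        layer = nxt
    return layer[0]  # somatic output

rng = np.random.default_rng(0)
n_leaves = 8                    # e.g., 8 synaptic inputs at the leaves
n_subunits = n_leaves - 1       # internal nodes of a balanced binary tree
weights = rng.normal(size=(n_subunits, 2))
biases = rng.normal(size=n_subunits)
x = rng.normal(size=n_leaves)
y = binary_dendrite_forward(x, weights, biases)
print(y >= 0.0)  # → True: the final subunit's ReLU output is nonnegative
```

Repeating synaptic inputs to different branches, as the abstract describes, would correspond to assigning the same input features to multiple leaves before the forward pass.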
Issue: June 2021
Published online: May 13 2021
Might a Single Neuron Solve Interesting Machine Learning Problems Through Successive Computations on Its Dendritic Tree?
In Special Collection: CogNet
Ilenna Simone Jones, Department of Neuroscience, University of Pennsylvania, Philadelphia, PA 19104, U.S.A. ilennaj@pennmedicine.upenn.edu
Konrad Paul Kording, Departments of Neuroscience and Bioengineering, University of Pennsylvania, Philadelphia, PA 19104, U.S.A. kording@upenn.edu
Received:
August 27 2020
Accepted:
January 21 2021
Online ISSN: 1530-888X
Print ISSN: 0899-7667
© 2021 Massachusetts Institute of Technology
Neural Computation (2021) 33 (6): 1554–1571.
Citation
Ilenna Simone Jones, Konrad Paul Kording; Might a Single Neuron Solve Interesting Machine Learning Problems Through Successive Computations on Its Dendritic Tree? Neural Comput 2021; 33 (6): 1554–1571. doi: https://doi.org/10.1162/neco_a_01390