Daya Gupta
Neural Computation (2023) 35 (5): 763–806.
Published: 18 April 2023
Abstract
Machine learning tools, particularly artificial neural networks (ANN), have become ubiquitous in many scientific disciplines. Machine learning-based techniques flourish not only because of expanding computational power and the increasing availability of labeled data sets but also because of increasingly powerful training algorithms and refined ANN topologies. Some of these topologies were originally motivated by neuronal network architectures found in the brain, such as convolutional ANN. Later topologies departed from the biological substrate and were developed independently, because the biological processing units are not well understood or are not transferable to in silico architectures. In neuroscience, the advent of multichannel recordings has made it possible to record the activity of many neurons simultaneously and to characterize complex network activity in biological neural networks (BNN). This creates a unique opportunity to compare large neuronal network topologies, processing, and learning strategies with those developed in state-of-the-art ANN. The aim of this review is to introduce basic concepts of modern ANN, their training algorithms, and their biological counterparts. The selection of these modern ANN is necessarily biased (e.g., spiking neural networks are excluded) but should suffice for a concise overview.