Spiking neural networks (SNNs), considered the third generation of neural networks (Maass, 1997), communicate through sequences of spikes: discrete events that occur at points in time, as depicted in Figure 1. SNNs have been applied widely, including to brain-machine interfaces (Mashford, Yepes, Kiral-Kornek, Tang, & Harrer, 2017), machine control and navigation systems (Tang & Michmizos, 2018), speech recognition (Dominguez-Morales et al., 2018), event detection (Osswald, Ieng, Benosman, & Indiveri, 2017), forecasting (Lisitsa & Zhilenkov, 2017), fast signal processing (Simeone, 2018), decision making (Wei, Bu, & Dai, 2017), and classification problems (Dora, Subramanian, Suresh, & Sundararajan, 2016). They have received increasing attention as powerful computational platforms that can be implemented in software or hardware. Table 1 summarizes the differences between SNNs and ANNs in terms of neurons, topology, and features. A spiking neuron has a structure similar to that of an ANN neuron but different behavior, and various spiking neuron models exist.
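To make the notion of spike-based communication concrete, the sketch below encodes a scalar value as a spike train using simple Bernoulli rate coding, one common encoding scheme (the function name, parameters, and rates are illustrative assumptions, not drawn from the works cited above):

```python
import numpy as np

def rate_encode(value, n_steps=100, max_rate=0.5, seed=0):
    """Encode a scalar in [0, 1] as a binary spike train.

    At each discrete time step, a spike (1) is emitted with
    probability value * max_rate; otherwise the step is silent (0).
    The scalar is thus represented by the firing *rate* of the train.
    """
    rng = np.random.default_rng(seed)
    return (rng.random(n_steps) < value * max_rate).astype(int)

# A larger input value yields a denser spike train (~40 spikes
# expected here out of 100 steps for value=0.8, max_rate=0.5).
train = rate_encode(0.8)
```

This contrasts with an ANN, where the same scalar would be passed directly to an activation function rather than unrolled into events over time.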

Figure 1: A sequence of spikes: discrete events occurring at points in time.

Table 1:

|  | Spiking Neural Network | Artificial Neural Network |
| --- | --- | --- |
| Neuron | Spiking neuron (e.g., integrate-and-fire, Hodgkin-Huxley, Izhikevich) | Artificial neuron (e.g., sigmoid, ReLU, tanh) |
| Information representation | Spike trains | Scalars |
| Computation mode | Differential equations | Activation function |
| Topology | LSM, Hopfield network, RSNN, SCNN | RNN, CNN, LSTM, DBN, DNC |
| Features | Real-time, low-power, online learning, hardware friendly, biologically plausible, fast and massively parallel data processing | Online learning, computation-intensive, moderate parallelization of computations |
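The "differential equations" computation mode in Table 1 can be illustrated with a leaky integrate-and-fire neuron, the simplest of the spiking models listed above. The sketch below is a minimal simulation using Euler integration; the function name and all parameter values are illustrative assumptions:

```python
import numpy as np

def lif_simulate(input_current, dt=1.0, tau=20.0, v_rest=0.0,
                 v_reset=0.0, v_thresh=1.0, r=1.0):
    """Leaky integrate-and-fire neuron, Euler-integrated.

    Membrane dynamics: tau * dV/dt = -(V - v_rest) + R * I(t).
    Whenever V crosses v_thresh, a spike time is recorded and V
    is reset to v_reset.
    """
    v = v_rest
    spike_times = []
    for t, i_t in enumerate(input_current):
        v += dt * (-(v - v_rest) + r * i_t) / tau   # Euler step
        if v >= v_thresh:
            spike_times.append(t)                    # emit spike
            v = v_reset                              # reset potential
    return spike_times

# A constant suprathreshold current drives a regular spike train.
spike_times = lif_simulate(np.full(100, 1.5))
```

Unlike an ANN neuron, whose output is a single scalar computed by an activation function, the output here is a sequence of spike times produced by integrating a differential equation over time.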

