Detailed Information

Cited 4 times in Web of Science; cited 6 times in Scopus

Multisample Online Learning for Probabilistic Spiking Neural Networks

Authors
Jang, Hyeryung; Simeone, Osvaldo
Issue Date
May-2022
Publisher
IEEE
Keywords
Neuromorphic computing; probabilistic models; spiking neural networks (SNNs); variational learning
Citation
IEEE Transactions on Neural Networks and Learning Systems, v.33, no.5, pp 2034 - 2044
Pages
11
Indexed
SCIE
SCOPUS
Journal Title
IEEE Transactions on Neural Networks and Learning Systems
Volume
33
Number
5
Start Page
2034
End Page
2044
URI
https://scholarworks.dongguk.edu/handle/sw.dongguk/3851
DOI
10.1109/TNNLS.2022.3144296
ISSN
2162-237X
2162-2388
Abstract
Spiking neural networks (SNNs) capture some of the efficiency of biological brains for inference and learning via the dynamic, online, and event-driven processing of binary time series. Most existing learning algorithms for SNNs are based on deterministic neuronal models, such as leaky integrate-and-fire, and rely on heuristic approximations of backpropagation through time that enforce constraints such as locality. In contrast, probabilistic SNN models can be trained directly via principled online, local update rules that have proven to be particularly effective for resource-constrained systems. This article investigates another advantage of probabilistic SNNs, namely, their capacity to generate independent outputs when queried over the same input. It is shown that the multiple generated output samples can be used during inference to robustify decisions and to quantify uncertainty, a feature that deterministic SNN models cannot provide. Furthermore, they can be leveraged for training in order to obtain more accurate statistical estimates of the log-loss training criterion and its gradient. Specifically, this article introduces an online learning rule based on generalized expectation-maximization (GEM) that follows a three-factor form with global learning signals and is referred to as GEM-SNN. Experimental results on structured output memorization and classification on a standard neuromorphic dataset demonstrate significant improvements in terms of log-likelihood, accuracy, and calibration when increasing the number of samples used for inference and training.
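The multi-sample inference idea in the abstract can be illustrated with a minimal sketch: a single GLM-style probabilistic neuron fires at each time step with probability given by a sigmoid of its input drive, so querying the same input several times yields independent output spike counts whose spread serves as a rough uncertainty proxy. This is an illustrative toy, not the paper's GEM-SNN method; the neuron model, weights, threshold, and aggregation rule below are all assumptions made for the example.

```python
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def run_probabilistic_snn(x, w, b):
    """One stochastic forward pass of a toy probabilistic spiking neuron.

    At each time step the neuron emits a spike with probability
    sigmoid(w . x_t + b) (a Bernoulli/GLM neuron). Returns the total
    number of output spikes over the input spike train.
    """
    spikes = 0
    for x_t in x:
        p = sigmoid(w @ x_t + b)
        spikes += int(rng.random() < p)
    return spikes

# Toy input: binary spike trains on two input channels over 20 time steps.
x = (rng.random((20, 2)) < 0.4).astype(float)
w = np.array([1.5, -0.5])  # assumed weights, for illustration only
b = -0.5

# Multi-sample inference: query the SAME input K times and aggregate.
K = 100
counts = np.array([run_probabilistic_snn(x, w, b) for _ in range(K)])

decision = counts.mean() > 5   # e.g. a spike-count threshold rule
uncertainty = counts.std()     # spread across samples as a confidence proxy
```

A deterministic neuron would return the same spike count on every query, so `counts.std()` would be zero and carry no information; the variability across independent samples is exactly what the probabilistic model adds.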
Files in This Item
There are no files associated with this item.
Appears in
Collections
College of Advanced Convergence Engineering > Department of Computer Science and Artificial Intelligence > 1. Journal Articles


Items in ScholarWorks are protected by copyright, with all rights reserved, unless otherwise indicated.

Related Researcher


Jang, Hye Ryung
College of Advanced Convergence Engineering (Department of Computer Science and Artificial Intelligence)
