TY - JOUR
T1 - An Interclass Margin Maximization Learning Algorithm for Evolving Spiking Neural Network
AU - Dora, Shirin
AU - Sundaram, Suresh
AU - Sundararajan, Narasimhan
PY - 2019/3/1
Y1 - 2019/3/1
N2 - This paper presents a new learning algorithm developed for a three-layer spiking neural network for pattern classification problems. The learning algorithm maximizes the interclass margin and is referred to as the two-stage margin maximization spiking neural network (TMM-SNN). In the structure learning stage, the learning algorithm completely evolves the hidden layer neurons in the first epoch. Further, TMM-SNN updates the weights of the hidden neurons for multiple epochs using the newly developed normalized membrane potential learning rule such that the interclass margins (based on the response of hidden neurons) are maximized. The normalized membrane potential learning rule considers both the local information in the spike train generated by a presynaptic neuron and the existing knowledge (synaptic weights) stored in the network to update the synaptic weights. After the first stage, the number of hidden neurons and their parameters are not updated. In the output weights learning stage, TMM-SNN updates the weights of the output layer neurons for multiple epochs to maximize the interclass margins (based on the response of output neurons). The performance of TMM-SNN is evaluated using ten benchmark data sets from the UCI machine learning repository. A statistical performance comparison of TMM-SNN with other existing learning algorithms for SNNs is conducted using the nonparametric Friedman test followed by a pairwise comparison using Fisher's least significant difference method. The results clearly indicate that TMM-SNN achieves better generalization performance than other algorithms.
KW - Classification
KW - multilayer network
KW - spiking neural networks
UR - https://pure.ulster.ac.uk/en/publications/an-interclass-margin-maximization-learning-algorithm-for-evolving
UR - http://www.scopus.com/inward/record.url?scp=85041003621&partnerID=8YFLogxK
U2 - 10.1109/TCYB.2018.2791282
DO - 10.1109/TCYB.2018.2791282
M3 - Article
C2 - 29994611
AN - SCOPUS:85041003621
SN - 2168-2267
VL - 49
SP - 989
EP - 999
JO - IEEE Transactions on Cybernetics
JF - IEEE Transactions on Cybernetics
IS - 3
M1 - 8267508
ER -