SWAT: A Spiking Neural Network Training Algorithm for Classification Problems

Research output: Contribution to journal › Article › peer-review

158 Citations (Scopus)
431 Downloads (Pure)


This paper presents a synaptic weight association training (SWAT) algorithm for spiking neural networks (SNNs). SWAT merges the Bienenstock–Cooper–Munro (BCM) learning rule with spike timing dependent plasticity (STDP). The STDP/BCM rule yields a unimodal weight distribution where the height of the plasticity window associated with STDP is modulated, causing stability after a period of training. The SNN uses a single training neuron in the training phase, where data associated with all classes is passed to this neuron. The rule then maps weights to the classifying output neurons to reflect similarities in the data across the classes. The SNN also includes both excitatory and inhibitory facilitating synapses which create a frequency routing capability, allowing the information presented to the network to be routed to different hidden layer neurons. A variable neuron threshold level simulates the refractory period. SWAT is initially benchmarked against the nonlinearly separable Iris and Wisconsin Breast Cancer datasets. Results presented show that the proposed training algorithm exhibits a convergence accuracy of 95.5% and 96.2% for the Iris and Wisconsin training sets, respectively, and 95.3% and 96.7% for the testing sets. Noise experiments show that SWAT has good generalization capability. SWAT is also benchmarked using an isolated digit automatic speech recognition (ASR) system where a subset of the TI46 speech corpus is used. Results show that with SWAT as the classifier, the ASR system provides an accuracy of 98.875% for training and 95.25% for testing.
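The abstract describes an STDP rule whose plasticity-window height is modulated by a BCM-style term so that weights stabilize after training. The sketch below is only an illustration of that general idea, not the paper's exact equations: it uses a standard pair-based STDP window and the classic BCM modulation phi = c(c - theta), where `c` is a postsynaptic activity measure and `theta` a sliding activity threshold. All constants and function names are assumed for demonstration.

```python
import math

# Illustrative sketch (not the paper's exact formulation): pair-based STDP
# whose window height is scaled by a BCM-style term. Constants are assumed.
A_PLUS, A_MINUS = 0.01, 0.012     # baseline window heights (assumed)
TAU_PLUS, TAU_MINUS = 20.0, 20.0  # STDP time constants in ms (assumed)

def bcm_modulation(c, theta):
    """BCM term phi = c*(c - theta): positive when postsynaptic activity c
    exceeds the sliding threshold theta, negative when it falls below."""
    return c * (c - theta)

def stdp_bcm_dw(delta_t, c, theta):
    """Weight change for a pre/post spike pair separated by delta_t (ms).

    delta_t = t_post - t_pre: non-negative means the presynaptic spike
    preceded the postsynaptic spike (causal side of the window).
    """
    phi = bcm_modulation(c, theta)
    if delta_t >= 0:
        return phi * A_PLUS * math.exp(-delta_t / TAU_PLUS)
    return phi * -A_MINUS * math.exp(delta_t / TAU_MINUS)

# The same causal pairing potentiates when activity is above threshold and
# depresses when it is below, which is what drives weights toward a stable
# unimodal distribution in BCM-modulated STDP schemes.
print(stdp_bcm_dw(5.0, c=1.2, theta=1.0))  # positive: potentiation
print(stdp_bcm_dw(5.0, c=0.8, theta=1.0))  # negative: depression
```

Because phi changes sign at the sliding threshold, the modulation acts as a homeostatic brake on runaway potentiation, which is the stability property the abstract attributes to the merged STDP/BCM rule.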
Original language: English
Pages (from-to): 1817-1830
Journal: IEEE Transactions on Neural Networks
Issue number: 11
Publication status: Published (in print/issue) - Nov 2010

Bibliographical note

©2010 IEEE. Personal use of this material is permitted. However, permission to reprint/republish this material for advertising or promotional purposes or for creating new collective works for resale or redistribution to servers or lists, or to reuse any copyrighted component of this work in other works must be obtained from the IEEE.


Index Terms
  • Automatic speech recognition
  • Bienenstock–Cooper–Munro
  • Dynamic synapses
  • Spike timing dependent plasticity
  • Spiking neural networks

