Deep learning and deep knowledge representation in Spiking Neural Networks for Brain-Computer Interfaces

Kaushalya Kumarasinghe, Nikola Kasabov, Denise Taylor

Research output: Contribution to journal › Article › peer-review

27 Citations (Scopus)


Objective: This paper argues that Brain-Inspired Spiking Neural Network (BI-SNN) architectures can learn and reveal deep-in-time-space functional and structural patterns from spatio-temporal data. These patterns can be represented as deep knowledge, in a particular case in the form of deep spatio-temporal rules. This is a promising direction for building a new type of Brain-Computer Interface called a Brain-Inspired Brain-Computer Interface (BI-BCI). A theoretical framework and its experimental validation for deep knowledge extraction and representation using SNN are presented. Results: The proposed methodology was applied in a case study to extract deep knowledge of the functional and structural organisation of the brain's neural network during the execution of a grasp-and-lift task. The BI-BCI successfully extracted the neural trajectories that represent the dorsal and ventral visual information-processing streams, as well as their connections to the motor cortex. Deep spatio-temporal rules on the functional and structural interaction of distinct brain areas were then used for event prediction in the BI-BCI. Significance: The computational framework can be used to unveil topological patterns of the brain, and such knowledge can be effectively used to advance the state of the art in BCI.
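To make the notion of a spiking neural network concrete for readers unfamiliar with the field, the following is a minimal sketch of a leaky integrate-and-fire (LIF) neuron, the basic computational unit in SNN architectures of the kind the paper builds on. All parameter values and function names here are illustrative assumptions, not taken from the paper or from NeuCube.

```python
# Illustrative sketch only: a minimal leaky integrate-and-fire (LIF)
# neuron. Parameters (v_rest, v_thresh, tau, dt) are hypothetical and
# not drawn from the paper's methodology.

def simulate_lif(input_current, v_rest=0.0, v_thresh=1.0,
                 tau=10.0, dt=1.0):
    """Return the time steps at which the neuron spikes."""
    v = v_rest
    spikes = []
    for t, i_in in enumerate(input_current):
        # Leak toward the resting potential, then integrate the input.
        v += dt * (-(v - v_rest) / tau + i_in)
        if v >= v_thresh:        # threshold crossed: emit a spike
            spikes.append(t)
            v = v_rest           # reset the membrane potential
    return spikes

# A constant supra-threshold drive yields a regular spike train,
# whereas no input yields no spikes.
spike_times = simulate_lif([0.15] * 100)
```

In an SNN-based BCI pipeline, continuous EEG signals are first encoded into such spike trains, and learning then operates on the timing of spikes rather than on real-valued activations.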

Original language: English
Pages (from-to): 169-185
Number of pages: 17
Journal: Neural Networks
Early online date: 20 Sep 2019
Publication status: Published (in print/issue) - 5 Jan 2020


  • Deep learning
  • NeuCube
  • Knowledge representation
  • Spiking Neural Networks
  • Electroencephalography
  • Brain-Computer Interface


