Abstract
Reservoir computing is a framework that uses the nonlinear
internal dynamics of a recurrent neural network to perform complex
nonlinear transformations of the input. This enables reservoirs to
carry out a variety of tasks involving the processing of time-dependent or
sequential signals. Reservoirs are particularly suited to tasks that
require memory or the handling of temporal sequences, common in areas
such as speech recognition, time series prediction, and signal processing.
Learning is restricted to the output layer and can be thought of as
“reading out” or “selecting from” the states of the reservoir. With all but
the output weights fixed, reservoirs avoid the costly and difficult training
associated with deep neural networks. However, while the reservoir
computing framework shows considerable promise in efficiency and
capability, it can be unreliable: existing studies show that small changes
in hyperparameters can markedly affect a network’s performance. Here
we studied the role of network topology in reservoir computing across
three conceptually different tasks: working memory, perceptual
decision making, and chaotic time-series prediction. We implemented
three network topologies (ring, lattice, and random)
and tested reservoir performance on each task. We then used
tools from algebraic topology, specifically directed simplicial cliques, to study deeper
connections between network topology and function, comparing performance
across topologies and linking our findings to existing reservoir research.
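To make the “read out” idea concrete, below is a minimal echo-state-network sketch in the spirit of the abstract: the recurrent and input weights stay fixed, and only a linear readout is fitted. All names, the toy task, and the hyperparameter values are illustrative assumptions, not the paper's actual implementation.

```python
# Minimal echo state network: fixed reservoir, trained linear readout.
# Illustrative sketch only; sizes and scalings are assumed values.
import numpy as np

rng = np.random.default_rng(0)

N, n_in = 200, 1                                  # reservoir size, input dim
W_in = rng.uniform(-0.5, 0.5, (N, n_in))          # fixed input weights
W = rng.normal(0.0, 1.0, (N, N))                  # fixed recurrent weights
W *= 0.9 / np.max(np.abs(np.linalg.eigvals(W)))   # spectral radius < 1

def run_reservoir(u):
    """Drive the fixed nonlinear dynamics with input sequence u (T x n_in)."""
    x = np.zeros(N)
    states = []
    for u_t in u:
        x = np.tanh(W @ x + W_in @ u_t)           # no learning happens here
        states.append(x.copy())
    return np.array(states)                       # T x N state matrix

# Toy task: one-step-ahead prediction of a sine wave.
T = 1000
u = np.sin(0.1 * np.arange(T)).reshape(-1, 1)
X = run_reservoir(u[:-1])                         # reservoir states
y = u[1:]                                         # next-step targets

# Train only the readout via ridge regression: the "selecting from" step.
lam = 1e-6
W_out = np.linalg.solve(X.T @ X + lam * np.eye(N), X.T @ y)
pred = X @ W_out
print("train MSE:", np.mean((pred - y) ** 2))
```

The key design point is that the only solve is a linear least-squares problem over the readout weights, which is why reservoir training is cheap compared with backpropagating through a deep recurrent network.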
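As a companion sketch, the snippet below builds illustrative versions of the three topologies named in the abstract and counts directed 2-simplices (transitive triangles), the smallest nontrivial directed simplicial clique. The construction details (degree, wiring probability, treating the lattice as a one-dimensional ring lattice) are assumptions for illustration, not the paper's exact settings.

```python
# Illustrative ring / lattice / random topologies and a count of directed
# 2-simplices: ordered triples (a, b, c) with edges a->b, b->c, a->c.
import numpy as np

rng = np.random.default_rng(1)
N = 100

def ring(n):
    """Each node feeds its clockwise neighbour."""
    A = np.zeros((n, n), dtype=int)
    for i in range(n):
        A[i, (i + 1) % n] = 1
    return A

def lattice(n, k=2):
    """1-D ring lattice: each node feeds its k nearest clockwise neighbours."""
    A = np.zeros((n, n), dtype=int)
    for i in range(n):
        for d in range(1, k + 1):
            A[i, (i + d) % n] = 1
    return A

def random_graph(n, p=0.05):
    """Erdos-Renyi style directed wiring with probability p, no self-loops."""
    A = (rng.random((n, n)) < p).astype(int)
    np.fill_diagonal(A, 0)
    return A

def directed_2_simplices(A):
    """(A @ A)[a, c] counts paths a->b->c; masking by A[a, c] keeps only
    those closed by the edge a->c, i.e. transitive triangles."""
    return int((A * (A @ A)).sum())

for name, A in [("ring", ring(N)), ("lattice", lattice(N)),
                ("random", random_graph(N))]:
    print(name, directed_2_simplices(A))
```

On these toy constructions the pure ring contains no directed 2-simplices at all, while the ring lattice and random graph do, which is the kind of structural contrast that directed-clique analysis is meant to expose.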
Original language | English |
---|---|
Title of host publication | UK Workshop on Computational Intelligence |
Number of pages | 17 |
Publication status | Accepted/In press - 24 Jul 2024 |
Event | 23rd Annual UK Workshop on Computational Intelligence 2024, Belfast, 2 Sept 2024 → 4 Sept 2024 (https://computing.ulster.ac.uk/ZhengLab/UKCI2024/about.html) |
Workshop
Workshop | 23rd Annual UK Workshop on Computational Intelligence 2024 |
---|---|
City | Belfast |
Period | 2/09/24 → 4/09/24 |
Internet address | https://computing.ulster.ac.uk/ZhengLab/UKCI2024/about.html |
Keywords
- Reservoir Computing
- Topology
- Algebraic Topology
- Directed Simplicial Cliques
- Network structure
- Spectral Graph Theory
- Machine Learning
- Computational Intelligence