Liquid state machine (LSM), a bio-inspired computing model consisting of an input layer sparsely connected to a randomly interlinked reservoir, or "liquid," of spiking neurons followed by a readout layer, finds utility in applications ranging from robot control and sequence generation to action, speech, and image recognition. A plethora of recent efforts have focused on mimicking certain characteristics of biological systems to enhance the performance of modern artificial neural networks. It has been shown that biological neurons are more likely to be connected to other neurons in close proximity and tend to be disconnected from neurons that are spatially far apart. Inspired by this, we propose a group of locally connected neuron reservoirs, or an ensemble of liquids, for LSMs. We analyze how segmenting a single large liquid into an ensemble of multiple smaller liquids affects the latency and accuracy of an LSM.
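To make the ensemble-of-liquids idea concrete, here is a minimal NumPy sketch that segments one large reservoir into several smaller, independently connected liquids and concatenates their spike responses into a single liquid state for a readout. All sizes, connection probabilities, and the simplified leaky integrate-and-fire dynamics are illustrative assumptions, not the exact setup of the paper.

```python
# Illustrative sketch: an ensemble of small, locally connected liquids.
# Sizes, sparsity, and dynamics are assumptions, not the authors' settings.
import numpy as np

rng = np.random.default_rng(0)

def make_liquid(n, p_local=0.2):
    """Recurrent weights for one small liquid: random local connectivity."""
    w = rng.normal(0.0, 0.5, size=(n, n)) * (rng.random((n, n)) < p_local)
    np.fill_diagonal(w, 0.0)
    return w

def run_liquid(w, w_in, spikes_in, v_th=1.0, leak=0.9):
    """Simple leaky integrate-and-fire dynamics; returns spike counts per neuron."""
    n = w.shape[0]
    v = np.zeros(n)          # membrane potentials
    s = np.zeros(n)          # spikes from the previous step
    counts = np.zeros(n)
    for x_t in spikes_in:    # spikes_in: (T, n_in) binary array
        v = leak * v + w_in @ x_t + w @ s
        s = (v >= v_th).astype(float)
        v[s > 0] = 0.0       # reset neurons that fired
        counts += s
    return counts

# Segment one 500-neuron liquid into an ensemble of 5 liquids of 100 neurons each.
n_in, n_total, n_liquids = 20, 500, 5
n_each = n_total // n_liquids
liquids = [make_liquid(n_each) for _ in range(n_liquids)]
w_ins = [rng.normal(0.0, 0.3, size=(n_each, n_in)) * (rng.random((n_each, n_in)) < 0.1)
         for _ in range(n_liquids)]

# Toy input spike train, just to exercise the pipeline.
spikes_in = (rng.random((100, n_in)) < 0.05).astype(float)

# The liquid state is the concatenation of every small liquid's response;
# a linear readout (e.g. trained by ridge regression) would consume this vector.
state = np.concatenate([run_liquid(w, w_in, spikes_in)
                        for w, w_in in zip(liquids, w_ins)])
print(state.shape)   # (500,)
```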
The Liquid State Machine (LSM)
"On the Effect of Heterogeneity on the Dynamics and Performance of Dyna" by Alireza Goudarzi
Keywords: Natural computation, Evolutionary computation, Neural networks, Computer science, Complex systems, Liquid state machine, Random Boolean networks.

The high cost of processor fabrication plants and the approach of physical limits have started a new wave of research into alternative computing paradigms. As an alternative to top-down manufactured silicon-based computers, research into computing directly with natural and physical systems has recently gained a great deal of interest. A branch of this research promotes the idea that any physical system with sufficiently complex dynamics is able to perform computation. The power of networks in representing complex interactions between many parts makes them a suitable choice for modeling physical systems. Many studies have used networks with a homogeneous structure to describe computational circuits. However, physical systems are inherently heterogeneous.
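As a toy illustration of the homogeneous network models mentioned above, the sketch below builds a small random Boolean network in which every node has the same in-degree K and a randomly chosen update rule; this uniform structure is exactly what the passage contrasts with heterogeneous physical systems. The code and its parameters are illustrative, not taken from the source.

```python
# Homogeneous random Boolean network: every node has in-degree K and a random
# truth table. Parameters are illustrative assumptions.
import numpy as np

rng = np.random.default_rng(1)
N, K = 16, 2                                    # network size and uniform in-degree

inputs = np.array([rng.choice(N, size=K, replace=False) for _ in range(N)])
rules = rng.integers(0, 2, size=(N, 2 ** K))    # one random Boolean rule per node

def step(state):
    """Synchronously update every node from the current values of its K inputs."""
    idx = np.array([int("".join(map(str, state[inputs[i]])), 2) for i in range(N)])
    return rules[np.arange(N), idx]

state = rng.integers(0, 2, size=N)
for _ in range(5):
    state = step(state)
print(state)
```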
Structured Liquids in Liquid State Machines
Reservoir computing is a framework for computation derived from recurrent neural network theory that maps input signals into higher-dimensional computational spaces through the dynamics of a fixed, nonlinear system called a reservoir. The concept stems from the use of recurrent connections within neural networks to create a complex dynamical system. A large variety of nonlinear dynamical systems can serve as a reservoir that performs computations. In recent years, semiconductor lasers have attracted considerable interest because such computation can be fast and energy-efficient compared to electronic components. Recent advances in both AI and quantum information theory have also given rise to the concept of quantum neural networks.
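The recipe described here can be sketched with a simple echo state network: the recurrent reservoir is fixed and random, the input is expanded into the high-dimensional reservoir state, and only a linear readout is trained (ridge regression in this sketch). The network size, spectral radius, and the toy next-sample prediction task are assumptions made purely for illustration.

```python
# Minimal echo state network sketch of reservoir computing: fixed random
# reservoir, trained linear readout. Hyperparameters are illustrative.
import numpy as np

rng = np.random.default_rng(0)
n_in, n_res = 1, 200

w_in = rng.uniform(-0.5, 0.5, size=(n_res, n_in))
w = rng.normal(0.0, 1.0, size=(n_res, n_res))
w *= 0.9 / np.max(np.abs(np.linalg.eigvals(w)))   # scale spectral radius below 1

def collect_states(u):
    """Drive the fixed reservoir with an input sequence u of shape (T, n_in)."""
    x = np.zeros(n_res)
    states = []
    for u_t in u:
        x = np.tanh(w_in @ u_t + w @ x)            # fixed nonlinear dynamics
        states.append(x)
    return np.array(states)

# Toy task: predict the next input sample of a sine wave.
t = np.arange(0, 60, 0.1)
u = np.sin(t).reshape(-1, 1)
X = collect_states(u[:-1])
y = u[1:]

# Ridge-regression readout: the only trained part of the system.
lam = 1e-6
w_out = np.linalg.solve(X.T @ X + lam * np.eye(n_res), X.T @ y)
print(np.mean((X @ w_out - y) ** 2))
```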
Brian is a free, open-source simulator for spiking neural networks. While additional functions have been added to pull and subsequently access delayed neuron variables, this is not done for the pre- and postsynaptic variables associated with weight-update models. What about coding a spiking neural network with an automatic differentiation framework? In SNNs there is a time axis, and the network sees data unfold over time; instead of continuous activation functions, neurons emit spikes when their pre-activation crosses a threshold. Pre-activation values constantly fade if the neurons are not excited enough.
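A minimal sketch of this idea, assuming PyTorch as the automatic differentiation framework: a leaky integrate-and-fire layer whose membrane potential fades each step, emits a spike when it crosses the threshold, and is trained through the non-differentiable spike with a fast-sigmoid surrogate gradient. The layer sizes, surrogate slope, and toy spike-rate objective are illustrative choices, not from the source.

```python
# Leaky integrate-and-fire layer trained with autodiff via a surrogate gradient.
# Framework choice (PyTorch) and all parameters are assumptions for illustration.
import torch

class SurrogateSpike(torch.autograd.Function):
    @staticmethod
    def forward(ctx, v):
        ctx.save_for_backward(v)
        return (v > 0).float()                          # spike when above threshold

    @staticmethod
    def backward(ctx, grad_out):
        (v,) = ctx.saved_tensors
        return grad_out / (1.0 + 10.0 * v.abs()) ** 2   # fast-sigmoid surrogate

def lif_layer(x, w, beta=0.9, v_th=1.0):
    """x: (T, batch, n_in) input currents; returns (T, batch, n_out) spikes."""
    v = torch.zeros(x.shape[1], w.shape[1])
    spikes = []
    for x_t in x:
        v = beta * v + x_t @ w                          # leaky integration: v fades
        s = SurrogateSpike.apply(v - v_th)
        v = v - s * v_th                                # soft reset after a spike
        spikes.append(s)
    return torch.stack(spikes)

# Toy usage: train the weights so the total spike count matches a target rate.
w = torch.nn.Parameter(0.5 * torch.randn(20, 5))
opt = torch.optim.SGD([w], lr=0.1)
x = (torch.rand(50, 8, 20) < 0.1).float()               # 50 steps, batch 8, 20 inputs
target_rate = torch.full((8, 5), 5.0)

for _ in range(20):
    opt.zero_grad()
    rate = lif_layer(x, w).sum(dim=0)                   # spikes summed over time
    loss = ((rate - target_rate) ** 2).mean()
    loss.backward()
    opt.step()
print(loss.item())
```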