A new neural machine code to program reservoir computers
Reservoir computing, a promising framework for computation based on recurrent neural networks (RNNs), maps input data into a high-dimensional space of computations. It keeps some parameters of the artificial neural network (ANN) fixed, typically the recurrent connection weights, while training only the others, typically a linear readout. This framework can improve performance and reduce the amount of training data that machine learning algorithms require.
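The fixed-reservoir, trained-readout recipe described above can be sketched with a minimal echo state network, the most common form of reservoir computer. This is an illustrative sketch of the standard technique, not the method from the paper; all parameter values (reservoir size, leak rate, spectral radius, the sine-wave task) are arbitrary choices for the example.

```python
import numpy as np

rng = np.random.default_rng(0)

n_res = 200   # reservoir size (arbitrary for this sketch)
leak = 0.5    # leaking rate of the reservoir units

# Fixed random weights: in reservoir computing these are never trained.
W_in = rng.uniform(-0.5, 0.5, (n_res, 1))        # input weights
W = rng.normal(0.0, 1.0, (n_res, n_res))         # recurrent weights
W *= 0.9 / np.max(np.abs(np.linalg.eigvals(W)))  # scale spectral radius below 1

# Toy task: one-step-ahead prediction of a sine wave.
t = np.arange(0.0, 60.0, 0.1)
u = np.sin(t)

# Drive the reservoir with the input and collect its high-dimensional states.
X = np.zeros((len(u) - 1, n_res))
x = np.zeros(n_res)
for k in range(len(u) - 1):
    x = (1 - leak) * x + leak * np.tanh(W_in[:, 0] * u[k] + W @ x)
    X[k] = x
y = u[1:]  # target: the next input value

# Only the linear readout is trained, here by ridge regression,
# discarding an initial washout period while the reservoir settles.
washout = 100
A, b = X[washout:], y[washout:]
w_out = np.linalg.solve(A.T @ A + 1e-6 * np.eye(n_res), A.T @ b)

pred = X @ w_out
err = np.sqrt(np.mean((pred[washout:] - y[washout:]) ** 2))
print(f"one-step prediction RMSE: {err:.4f}")
```

The key point the sketch illustrates is that `W_in` and `W` are drawn once at random and never updated; all learning happens in the single linear solve for `w_out`, which is what makes reservoir computers cheap to train.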
RNNs use recurrent connections among their processing units to process data sequentially, which allows them to make accurate predictions across a wide variety of tasks. However, optimizing their performance by identifying the parameters most relevant to each task can be difficult and time-consuming.
Jason Kim and Dani S. Bassett, two researchers at the University of Pennsylvania, have recently developed an alternative method to design and program RNN-based reservoir computers, inspired by how programming languages operate on computer hardware. Their approach, published in Nature Machine Intelligence, can identify the parameters appropriate for a given network and then program its computations to optimize its performance on target problems.
Source:
https://techxplore.com/news/2023-07-neural-machine-code-reservoir.html