Abstract
In this study, we develop a recurrent neural network-induced Gaussian process (RNNGP) to model sequence data. We derive the equivalence between infinitely wide neural networks and Gaussian processes (GPs) for a relaxed recurrent neural network (RNN) with untied weights, and we compute the covariance function of the RNNGP with an analytical iteration formula that follows the RNN recursion under an error-function-based activation. We then use the RNNGP to perform Bayesian inference on vanilla RNNs for several problems: Modified National Institute of Standards and Technology (MNIST) digit identification, Mackey–Glass time-series forecasting, and lithium-ion battery state-of-health estimation. The results demonstrate the flexibility of the RNNGP in modeling sequence data. Furthermore, the RNNGP predictions typically outperform those of the original RNNs and of standard GPs, indicating its efficiency as a data-driven model. Moreover, the RNNGP quantifies the uncertainty in its predictions, which points to its significant potential for uncertainty quantification analyses.
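The abstract describes two computational steps: iterating a covariance function through the RNN time steps using a closed-form expectation for the error-function activation, and then performing exact GP posterior inference with that kernel. The following is a minimal sketch of both steps, assuming a vanilla RNN of the form h_t = erf(W h_{t-1} + U x_t + b) with i.i.d. Gaussian weights of variances sw2, su2, sb2 (hypothetical hyperparameter names) and equal-length input sequences; the closed-form erf expectation is Williams' (1997) arcsine result, and the function names are illustrative, not the paper's code.

```python
import numpy as np

def erf_cov(k12, k11, k22):
    """Closed-form E[erf(u) erf(v)] for (u, v) ~ N(0, [[k11, k12], [k12, k22]])
    (Williams, 1997); this is what makes the iteration analytical."""
    return (2.0 / np.pi) * np.arcsin(
        2.0 * k12 / np.sqrt((1.0 + 2.0 * k11) * (1.0 + 2.0 * k22))
    )

def rnngp_kernel(X1, X2, sw2=1.0, su2=1.0, sb2=0.1):
    """Iterate the pre-activation covariance of an infinite-width,
    untied-weight RNN over the time steps of two sequences.
    X1, X2: (T, d) arrays; sw2/su2/sb2 are recurrent-, input-, and
    bias-weight variances (assumed names, not from the paper)."""
    d = X1.shape[1]
    k11 = k22 = k12 = 0.0  # zero initial hidden state
    for t in range(X1.shape[0]):
        # Linear (input + bias) contribution at step t.
        l11 = su2 * (X1[t] @ X1[t]) / d + sb2
        l22 = su2 * (X2[t] @ X2[t]) / d + sb2
        l12 = su2 * (X1[t] @ X2[t]) / d + sb2
        # Recurrent contribution: expected covariance of the erf
        # activations of the previous step's Gaussian pre-activations.
        k11, k22, k12 = (
            l11 + sw2 * erf_cov(k11, k11, k11),
            l22 + sw2 * erf_cov(k22, k22, k22),
            l12 + sw2 * erf_cov(k12, k11, k22),
        )
    return k12

def gp_predict(X_train, y_train, X_test, noise=1e-4):
    """Exact GP posterior under the RNNGP kernel: the Bayesian
    inference step described in the abstract."""
    K = np.array([[rnngp_kernel(a, b) for b in X_train] for a in X_train])
    Ks = np.array([[rnngp_kernel(s, b) for b in X_train] for s in X_test])
    kss = np.array([rnngp_kernel(s, s) for s in X_test])
    L = np.linalg.cholesky(K + noise * np.eye(len(X_train)))
    alpha = np.linalg.solve(L.T, np.linalg.solve(L, y_train))
    mean = Ks @ alpha                    # posterior predictive mean
    v = np.linalg.solve(L, Ks.T)
    var = kss - np.sum(v**2, axis=0)     # predictive variance
    return mean, var
```

The predictive variance returned by `gp_predict` is what supports the uncertainty quantification claim: because the model is a GP rather than a point-estimate RNN, every prediction comes with a calibrated error bar at no extra modeling cost.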
| Original language | English |
| --- | --- |
| Pages (from-to) | 75-84 |
| Number of pages | 10 |
| Journal | Neurocomputing |
| Volume | 509 |
| DOIs | |
| Publication status | Published - 2022 Oct 14 |
Bibliographical note
Funding Information: This work was supported by the National Research Foundation of Korea (NRF) grant funded by the Korea government (Ministry of Science and ICT) (NRF-2017R1E1A1A0-3070161 and NRF-20151009350) and the Fundamental Research Funds for the Central Universities (202213038).
Publisher Copyright:
© 2022
All Science Journal Classification (ASJC) codes
- Computer Science Applications
- Cognitive Neuroscience
- Artificial Intelligence