One-Dimensional Wave Equation Simulation Using Recurrent Neural Networks
Keywords:
Wave Equation, Recurrent Neural Networks, Finite Difference Method, LSTM, GRU

Abstract
This paper presents an application of Recurrent Neural Networks to the one-dimensional Wave Equation.
Neural networks are now widely used thanks to advances in computer hardware, which allow large amounts of data to be processed in parallel. Over the years, new and improved neural network
architectures have been developed, such as the Recurrent Neural Network, used mainly in time-series analysis.
In this study, the 1-D wave equation is solved with the Finite-Difference Time-Domain (FDTD) method under
Neumann boundary conditions, and two Recurrent Neural Network architectures are explored: LSTM and
GRU. The results are organized according to the hyper-parameters used to train and validate the networks and
are evaluated quantitatively, using the mean squared error as the loss function, and qualitatively, by inspecting the
response plots on the validation dataset. Predictions with mean squared errors on the order of 10⁻⁶ were achieved,
with a training time of 23 seconds per epoch on GPUs.
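The FDTD scheme used to generate the wave-field data can be illustrated with a minimal sketch. The grid size, wave speed, time step, and initial pulse below are hypothetical (the abstract does not specify them); the leapfrog update and the zero-gradient (Neumann) boundary treatment are the standard ingredients the abstract names:

```python
import numpy as np

def simulate_wave(nx=101, nt=300, c=1.0, dx=0.01, dt=0.005):
    """Leapfrog FDTD for u_tt = c^2 u_xx with Neumann (zero-gradient) ends.

    Returns an array of shape (nt + 1, nx): one snapshot per time step,
    the kind of sequence data an LSTM/GRU could be trained on.
    """
    r2 = (c * dt / dx) ** 2  # squared Courant number; stable for r2 <= 1
    x = np.linspace(0.0, (nx - 1) * dx, nx)
    # Gaussian pulse as a hypothetical initial condition, zero initial velocity
    u_prev = np.exp(-((x - x[nx // 2]) ** 2) / (2 * (5 * dx) ** 2))
    u = u_prev.copy()
    frames = [u.copy()]
    for _ in range(nt):
        u_next = np.empty_like(u)
        # second-order central differences in space and time
        u_next[1:-1] = (2 * u[1:-1] - u_prev[1:-1]
                        + r2 * (u[2:] - 2 * u[1:-1] + u[:-2]))
        # Neumann boundaries: copy the neighbouring interior value
        u_next[0] = u_next[1]
        u_next[-1] = u_next[-2]
        u_prev, u = u, u_next
        frames.append(u.copy())
    return np.array(frames)
```

Sliding windows over the rows of the returned array would then form the input/target pairs for training the recurrent networks.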