Capturing Provenance to Improve the Model Training of PINNs: first hands-on experiences with Grid5000
Keywords:
Physics-Informed Neural Networks, Eikonal Equation, Provenance, Hybrid Computing

Abstract
The growing popularity of Neural Networks in computational science and engineering raises several challenges in configuring training parameters and validating the resulting models. Machine learning has been used to approximate the solution of costly problems in computational mechanics, to discover equations through coefficient estimation, and to build surrogates. These applications fall outside the common usage of neural networks and require a different set of techniques, generally encompassed by Physics-Informed Neural Networks (PINNs). PINNs appear to be a good alternative for solving forward and inverse problems governed by PDEs in small data regimes, especially when Uncertainty Quantification is involved. PINNs have been successfully applied to problems in fluid dynamics, inference of hydraulic conductivity, velocity inversion, phase separation, and many others. Nevertheless, their computational aspects, especially their scalability when running on large-scale systems, still need to be investigated. Several hyperparameter configurations have to be evaluated before a trained model is reached, and despite a few existing setting recommendations, fine-tuning is often required. In PINNs, this fine-tuning requires analyzing the configurations and how they relate to the evaluation of the loss function. We propose provenance data capture and analysis techniques to improve the model training of PINNs. We also report our first experiences running PINNs on Grid5000 using hybrid CPU-GPU computing.
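To make the loss-function discussion above concrete (and to connect it to the Eikonal Equation listed in the keywords), the sketch below shows a minimal PINN residual loss for the 2D eikonal equation written in PyTorch. It is an illustrative sketch only, not the implementation evaluated in this work: the network architecture, the placeholder velocity model, the collocation points, and names such as EikonalPINN are hypothetical, and a complete PINN loss would also include boundary or source-point terms.

# Illustrative sketch only (not the authors' implementation): a minimal PINN
# residual loss for the 2D eikonal equation |grad T(x)| = 1/v(x), in PyTorch.
# The network size, velocity model, and collocation points are hypothetical.
import torch
import torch.nn as nn

class EikonalPINN(nn.Module):
    # Small fully connected network mapping coordinates (x, z) to travel time T.
    def __init__(self, width=50, depth=4):
        super().__init__()
        layers, in_dim = [], 2
        for _ in range(depth):
            layers += [nn.Linear(in_dim, width), nn.Tanh()]
            in_dim = width
        layers.append(nn.Linear(in_dim, 1))
        self.net = nn.Sequential(*layers)

    def forward(self, xz):
        return self.net(xz)

def eikonal_residual_loss(model, xz, velocity):
    # PDE residual |grad T| - 1/v evaluated at collocation points via autograd.
    xz = xz.requires_grad_(True)
    T = model(xz)
    grad_T = torch.autograd.grad(T, xz, grad_outputs=torch.ones_like(T),
                                 create_graph=True)[0]
    residual = grad_T.norm(dim=1) - 1.0 / velocity(xz)
    return (residual ** 2).mean()

# Hypothetical usage: uniform velocity, random collocation points, one Adam step.
model = EikonalPINN()
optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)
xz = torch.rand(1024, 2)                      # collocation points in [0, 1]^2
velocity = lambda p: torch.ones(p.shape[0])   # placeholder velocity model
loss = eikonal_residual_loss(model, xz, velocity)
loss.backward()
optimizer.step()

In this kind of setup, each choice of network width, depth, learning rate, and loss weighting is a hyperparameter configuration whose relation to the loss evaluation is exactly what the proposed provenance capture is meant to record and analyze.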