Conference paper, Year: 2025

ANAGRAM: a natural gradient relative to adapted model for efficient PINNS learning

Abstract

In recent years, Physics-Informed Neural Networks (PINNs) have received strong interest as a method to solve PDE-driven systems, in particular for data assimilation purposes. The method is still in its infancy, with many shortcomings and failures that remain not properly understood. In this paper we propose a natural gradient approach to PINNs that speeds up training and improves its accuracy. Based on an in-depth analysis of the differential geometric structures of the problem, we make two distinct contributions: (i) a new natural gradient algorithm that scales as min(P²S, S²P), where P is the number of parameters and S is the batch size; (ii) a mathematically principled reformulation of the PINNs problem that allows the extension of natural gradient to it, with proven connections to Green's function theory.
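The min(P²S, S²P) scaling in contribution (i) can be illustrated with a generic Gauss-Newton / natural-gradient-style update: given the S×P Jacobian J of the PDE residuals with respect to the P parameters, the update solves a least-squares system either through the P×P normal equations (cost O(P²S)) or through the S×S Gram system (cost O(S²P)), whichever is cheaper. The sketch below is only a minimal illustration of this scaling argument, not the paper's exact ANAGRAM algorithm; the function name, the damping term, and the surrounding setup are assumptions.

```python
# Minimal sketch (assumed, not the paper's ANAGRAM update): one natural-gradient-style
# step for a PINN, choosing between the primal and dual least-squares solves so that
# the cost scales as min(P^2 S, S^2 P).
import numpy as np

def natural_gradient_step(J, r, damping=1e-8):
    """Solve J @ delta ~= r for the parameter update delta.

    J : (S, P) Jacobian of the PDE residuals w.r.t. the P parameters
    r : (S,)   residual vector at the S collocation points

    Picks the cheaper of two equivalent solves:
      - primal (P x P) normal equations -> O(P^2 S)
      - dual   (S x S) Gram system      -> O(S^2 P)
    """
    S, P = J.shape
    if P <= S:
        # Primal: (J^T J + damping I) delta = J^T r
        A = J.T @ J + damping * np.eye(P)
        delta = np.linalg.solve(A, J.T @ r)
    else:
        # Dual: (J J^T + damping I) alpha = r, then delta = J^T alpha
        G = J @ J.T + damping * np.eye(S)
        delta = J.T @ np.linalg.solve(G, r)
    return delta

# Usage (inside a training loop, with J and r recomputed at each iteration):
#   theta -= learning_rate * natural_gradient_step(J, r)
```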

Main file
Papier_Features_Learning_for_PINNs.pdf (6.51 MB)
Origin: Files produced by the author(s)

Dates and versions

hal-04918272, version 1 (29-01-2025)

Identifiers

  • HAL Id: hal-04918272, version 1

Cite

Nilo Schwencke, Cyril Furtlehner. ANAGRAM: a natural gradient relative to adapted model for efficient PINNS learning. ICLR 2025 - International Conference on Learning Representations, Apr 2025, Singapore. ⟨hal-04918272⟩
