A Priori Neural Networks Versus A Posteriori MOOD Loop: A High Accurate 1D FV Scheme Testing Bed



Alexandre Bourriaud¹ · Raphaël Loubère¹ · Rodolphe Turpault¹

Received: 17 December 2019 / Revised: 2 June 2020 / Accepted: 9 July 2020
© Springer Science+Business Media, LLC, part of Springer Nature 2020

Abstract

In this work we present an attempt to replace the a posteriori MOOD loop used in a highly accurate Finite Volume (FV) scheme by a trained artificial Neural Network (NN). The MOOD loop, by decrementing the reconstruction polynomial degrees, ensures accuracy, essentially non-oscillatory behavior and robustness, and preserves physical features. Indeed, it replaces the classical a priori limiting strategy by an a posteriori troubled-cell detection, supplemented with a local time-step re-computation using a lower-order FV scheme (i.e. lower polynomial degree reconstructions). We have trained shallow NNs, made of only two so-called hidden layers and few perceptrons, which a priori produce an educated guess (classification) of the appropriate polynomial degree to be used in a given cell, knowing the physical and numerical states in its vicinity. We present a proof of concept in 1D. The strategy to train and use such NNs is described on several 1D toy models: scalar advection and Burgers' equation, and the isentropic Euler and radiative M1 systems. Each toy model brings new difficulties, which are highlighted on the obtained numerical solutions. On these toy models, and for the proposed test cases, we observe that an artificial NN can be trained and substituted for the a posteriori MOOD loop, mimicking the numerical admissibility criteria and safely predicting the appropriate polynomial degree to employ. The physical admissibility criteria are, however, still handled by the a posteriori MOOD loop. Constructing a valid training data set is of paramount importance, but once it is available, the numerical scheme supplemented with the NN produces promising results in this 1D setting.

Keywords Neural network · Machine learning · Finite Volume scheme · High accuracy · Hyperbolic system · a posteriori MOOD

Mathematics Subject Classification 65M08 · 65A04 · 65Z05 · 85A25
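The abstract describes a shallow network with two hidden layers that, given the physical and numerical states in a cell's vicinity, classifies the reconstruction polynomial degree to use in that cell. The following is a minimal sketch of such a classifier's forward pass; the stencil width, layer sizes, tanh activation, candidate degrees {0, 1, 2}, and the random (untrained) weights are all illustrative assumptions, not the architecture or parameters used in the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

def init_layer(n_in, n_out):
    # Small random weights stand in for trained parameters (assumption).
    return rng.standard_normal((n_in, n_out)) * 0.1, np.zeros(n_out)

W1, b1 = init_layer(5, 8)   # input: cell state plus neighbours (5-cell stencil, assumed)
W2, b2 = init_layer(8, 8)   # second hidden layer with "few perceptrons"
W3, b3 = init_layer(8, 3)   # output: one score per candidate degree 0, 1, 2

def predict_degree(stencil):
    """Forward pass: a priori classification of the polynomial degree for one cell."""
    h1 = np.tanh(stencil @ W1 + b1)
    h2 = np.tanh(h1 @ W2 + b2)
    scores = h2 @ W3 + b3          # logits over candidate degrees {0, 1, 2}
    return int(np.argmax(scores))

# Smooth local data; with untrained weights the prediction is arbitrary but
# always one of the admissible degrees.
smooth = np.array([0.9, 0.95, 1.0, 1.05, 1.1])
print(predict_degree(smooth) in (0, 1, 2))
```

In the scheme itself this prediction would replace the numerical part of the MOOD troubled-cell detection, while the physical admissibility checks remain a posteriori.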

Corresponding author: Raphaël Loubère ([email protected])
Alexandre Bourriaud ([email protected]) · Rodolphe Turpault ([email protected])

¹ Institut de Mathématiques de Bordeaux (IMB), CNRS, Bordeaux INP, Université de Bordeaux, 33400 Talence, France

Journal of Scientific Computing (2020) 84:31, Page 2 of 36

1 Introduction

Undoubtedly there is a frenetic activity revolving around the keywords 'Machine Learning', 'Artificial Intelligence', 'Neural Networks', etc. in most, if not all, branches of science. While the use of these seemingly new tools has brought some genuine successes, some unreasonable expectations (overstated by the media, society and, sometimes, scientists themselves) have crept into scientific laboratories. Unfortunately, a deep understanding of these 'revolutionary tools' is still far off. This