Neural networks catching up with finite differences in solving partial differential equations in higher dimensions
ORIGINAL ARTICLE
Vsevolod I. Avrutskiy
Received: 29 December 2018 / Accepted: 17 January 2020
© Springer-Verlag London Ltd., part of Springer Nature 2020
Abstract
Solving partial differential equations using neural networks is mostly a proof-of-concept approach. In the case of direct function approximation, a single neural network is constructed to be the solution of a particular boundary value problem. Independent variables are fed into the input layer, and a single output is considered as the solution's value. The network is substituted into the equation, and the residual is then minimized with respect to the weights of the network using a gradient-based method. Our previous work showed that by minimizing all derivatives of the residual up to the third order, one can obtain a machine-precise solution of a 2D boundary value problem using very sparse grids. The goal of this paper is to use this grid complexity advantage in order to obtain a solution faster than finite differences. However, the number of all possible high-order derivatives (and therefore the training time) increases with the number of dimensions, and it was unclear whether this goal could be achieved. Here, we demonstrate that this increase can be compensated for by using random directional derivatives instead. In the 2D case neural networks are slower than finite differences, but with each additional dimension the complexity grows approximately 4 times for neural networks and 125 times for finite differences. This allows neural networks to catch up in the 3D case for memory complexity and in the 5D case for time complexity. For the first time, a machine-precise solution was obtained with a neural network faster than with the finite difference method.

Keywords: Neural networks · Partial differential equations · Nonlinear Poisson equation · 5D boundary value problem
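The direct function approximation scheme described in the abstract can be sketched in a few lines. The toy problem, network size, and crude numerical-gradient optimizer below are illustrative assumptions and not the paper's setup: a small tanh network is trained so that the residual of the 1D problem u''(x) = -π² sin(πx), u(0) = u(1) = 0 (exact solution u = sin(πx)) is minimized with respect to the network weights.

```python
import numpy as np

rng = np.random.default_rng(0)
H = 10                               # hidden units (illustrative choice)
x = np.linspace(0.0, 1.0, 21)        # collocation grid

def unpack(p):
    """Split the flat parameter vector into layer weights."""
    return p[:H], p[H:2 * H], p[2 * H:]

def u(p, x):
    """Network output u(x) = sum_k w2_k * tanh(w1_k * x + b1_k)."""
    w1, b1, w2 = unpack(p)
    return np.tanh(np.outer(x, w1) + b1) @ w2

def u_xx(p, x):
    """Second input derivative of the tanh network, in closed form."""
    w1, b1, w2 = unpack(p)
    t = np.tanh(np.outer(x, w1) + b1)
    # d^2/dz^2 tanh(z) = -2 tanh(z) (1 - tanh(z)^2)
    return (-2.0 * t * (1.0 - t**2)) @ (w2 * w1**2)

def loss(p):
    res = u_xx(p, x) + np.pi**2 * np.sin(np.pi * x)   # PDE residual
    bc = u(p, np.array([0.0, 1.0]))                   # boundary residual
    return np.mean(res**2) + np.sum(bc**2)

p = 0.5 * rng.standard_normal(3 * H)
L0 = loss(p)
eps, lr = 1e-6, 1e-4
for _ in range(200):
    # numerical gradient of the loss over all weights (a stand-in for
    # the backpropagated gradient a real implementation would use)
    g = np.array([(loss(p + eps * e) - loss(p - eps * e)) / (2 * eps)
                  for e in np.eye(p.size)])
    p -= lr * g
print(L0, loss(p))   # the residual loss decreases during training
```

A production version would use automatic differentiation and a stronger optimizer; the point here is only the structure of the method: the network is substituted into the equation, and the squared residual plus boundary terms is driven down by adjusting the weights.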
1 Introduction

Partial differential equations have an enormous number of applications and are fundamental for predicting many natural phenomena. Sometimes pencil and paper are enough to obtain their solutions as closed-form expressions or infinite series, but in most cases numerical methods are the only remedy. A vast number of those have been developed [8, 17, 21, 25, 54, 55]. Operating with fixed resources, they all have to use some finite way to describe the functions they are trying to find. Whether by values on a grid or by a set of simple functions defined inside minute volumes, numerical methods use a finite set of real, or rather, rational parameters to construct a solution. In this paper, the solving method approximates functions using neural networks. Here is an example of what one can look like:

u(x) = \sum_k W^3_k \, \sigma\Big( \sum_j W^2_{kj} \, \sigma\Big( \sum_i W^1_{ji} x_i + W^0_j \Big) \Big)

Correspondence: Vsevolod I. Avrutskiy, [email protected]
1 Department of Aeromechanics and Flight Engineering, Moscow Institute of Physics and Technology, Institutsky Lane 9, Dolgoprudny, Moscow Region, Russia 141700
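A network of this form is a composition of affine maps and elementwise nonlinearities, and evaluating it is a pair of matrix products. The sketch below implements such a two-hidden-layer scalar network; the input dimension, layer widths, and random initialisation are illustrative assumptions, not values from the paper.

```python
import numpy as np

rng = np.random.default_rng(1)
d, n1, n2 = 5, 16, 16                 # input dimension and hidden widths
W1 = rng.standard_normal((n1, d))     # first-layer weights W^1_{ji}
W0 = rng.standard_normal(n1)          # first-layer bias, W^0_j
W2 = rng.standard_normal((n2, n1))    # second-layer weights W^2_{kj}
W3 = rng.standard_normal(n2)          # output weights W^3_k

def u(x):
    """Scalar network output for one input point x in R^d."""
    return W3 @ np.tanh(W2 @ np.tanh(W1 @ x + W0))

print(u(np.zeros(d)))   # a single real number
```

The weights W^0 … W^3 are the rational parameters the method tunes; unlike grid values in finite differences, each of them influences the approximated function globally.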