Machine Learning in Adaptive FETI-DP - Reducing the Effort in Sampling
Alexander Heinlein, Axel Klawonn, Martin Lanser, Janine Weber
Session: MS40: Machine Learning for Numerical Simulation
Session starts: Monday 30 September, 15:45
Presentation starts: 15:45 (duration: 25 mins)
Room: Lecture room 530


Alexander Heinlein (University of Cologne)
Axel Klawonn (University of Cologne)
Martin Lanser (University of Cologne)
Janine Weber (University of Cologne)


Abstract:
The convergence rate of classical domain decomposition methods generally deteriorates severely for large discontinuities in the coefficient functions of the considered partial differential equation. To retain robustness for such highly heterogeneous problems, the coarse space can be enriched by additional coarse basis functions. These can be obtained by solving local generalized eigenvalue problems on subdomain edges. In order to reduce the number of eigenvalue problems and thus the computational cost, we use a neural network to predict the geometric location of critical edges, i.e., edges where the eigenvalue problem is indispensable. As input data for the neural network, we use function evaluations of the coefficient function within the two subdomains adjacent to an edge. In the present article, we examine the effect of computing the input data only in a neighborhood of the edge, i.e., on slabs next to the edge. We show numerical results both for the training data and for a concrete test problem in the form of a subsection of a microsection, for linear elasticity problems. We observe that computing the sampling points in only one half or one quarter of each subdomain still yields robust algorithms.
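
The following is a minimal illustrative sketch, not the authors' implementation, of the general idea described above: evaluating the coefficient function on slabs next to a subdomain edge and feeding the resulting samples to a small feed-forward classifier that flags "critical" edges. The function names (sample_slab, train_edge_classifier), the slab geometry, the sampling resolution, and the network architecture are all hypothetical choices for demonstration; the actual sampling grids and network used in the work may differ.

    # Hedged sketch: coefficient sampling on slabs adjacent to an edge plus a
    # small neural-network classifier for critical edges. All names, sizes,
    # and parameters below are illustrative assumptions, not the paper's setup.
    import numpy as np
    from sklearn.neural_network import MLPClassifier

    def sample_slab(coeff, edge_x, y_range, slab_width, n_x=8, n_y=32):
        """Evaluate the coefficient function coeff(x, y) on regular grids of
        sampling points inside two slabs of width slab_width, one on each side
        of a vertical edge located at x = edge_x."""
        xs_left = np.linspace(edge_x - slab_width, edge_x, n_x)
        xs_right = np.linspace(edge_x, edge_x + slab_width, n_x)
        ys = np.linspace(y_range[0], y_range[1], n_y)
        left = np.array([[coeff(x, y) for y in ys] for x in xs_left])
        right = np.array([[coeff(x, y) for y in ys] for x in xs_right])
        # Concatenate the samples from both slabs into one feature vector.
        return np.concatenate([left.ravel(), right.ravel()])

    def train_edge_classifier(X, y):
        """X: one feature vector per edge; y: 1 if the local eigenvalue
        problem turned out to be necessary for that edge, else 0."""
        clf = MLPClassifier(hidden_layer_sizes=(64, 32), max_iter=2000)
        clf.fit(X, y)
        return clf

    # At solve time, eigenvalue problems would then be set up only for edges
    # the classifier predicts as critical, e.g.:
    #   critical = [e for e, x in zip(edges, features)
    #               if clf.predict(x.reshape(1, -1))[0] == 1]

Restricting the sampling points to slabs of half or a quarter of the subdomain width, as discussed in the abstract, corresponds to shrinking slab_width in this sketch while keeping the rest of the pipeline unchanged.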