European Numerical Mathematics and
Advanced Applications Conference 2019
30 September - 4 October 2019, Egmond aan Zee, The Netherlands
10:40   MS43: Efficient parametrization of complex models (Part 2)
Chair: Christian Rieger
10:40
25 mins
A representer theorem for deep kernel learning
Bastian Bohn, Michael Griebel, Christian Rieger
Abstract: In this talk, we deal with chained approximations by linear combinations of kernel translates and discuss their relationship to deep neural networks. We introduce a representer theorem for these kinds of approximations, both for the finite-data setting and for the case of infinitely many data points. We provide several numerical examples to illustrate how these approximation systems can be used to tackle function reconstruction and machine learning tasks.
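For orientation, the classical single-layer representer theorem states that the regularized empirical risk minimizer over a reproducing kernel Hilbert space is a finite linear combination of kernel translates; the talk extends this to chained approximations. A sketch in illustrative notation, not the speakers' exact statement:

```latex
% Single-layer representer theorem in illustrative notation: for data
% (x_i, y_i), i = 1, ..., N, a kernel K with reproducing kernel Hilbert
% space H_K, a loss L, and regularization parameter lambda > 0,
\min_{f \in \mathcal{H}_K} \sum_{i=1}^{N} L\bigl(y_i, f(x_i)\bigr)
    + \lambda \|f\|_{\mathcal{H}_K}^{2}
\quad \Longrightarrow \quad
f^{*}(\cdot) = \sum_{i=1}^{N} \alpha_i \, K(\cdot, x_i).
% In the chained setting f = f_L \circ \cdots \circ f_1, an analogous
% expansion holds layer by layer, with kernel translates centered at the
% data points mapped through the preceding layers.
```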
11:05
25 mins
Kernel-based reconstruction for parametric PDEs
Holger Wendland, Rüdiger Kempf, Christian Rieger
Abstract: In uncertainty quantification, an unknown quantity has to be reconstructed which typically depends on the solution of a partial differential equation. This partial differential equation may itself depend on parameters, some of which are deterministic and some random. To approximate the unknown quantity, one thus has to solve the partial differential equation (usually numerically) for several instances of the parameters and then reconstruct the quantity from these simulations. As the number of parameters may be large, this becomes a high-dimensional reconstruction problem. In this talk, I will address the topic of reconstructing such unknown quantities using kernel-based reconstruction methods on sparse grids. I will give an introduction to the topic, explain the reconstruction process and provide error estimates.
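As a minimal illustration of the reconstruction step only (not the speakers' sparse-grid method: the sample points below are random, the placeholder `q` stands in for the expensive PDE solves, and the Gaussian kernel and ridge term are assumptions), a kernel surrogate for a quantity of interest over the parameter domain can be fitted as follows:

```python
import numpy as np

def gaussian_kernel(X, Y, eps=1.0):
    """Gaussian kernel matrix K[i, j] = exp(-eps^2 * ||X[i] - Y[j]||^2)."""
    d2 = ((X[:, None, :] - Y[None, :, :]) ** 2).sum(axis=-1)
    return np.exp(-(eps ** 2) * d2)

rng = np.random.default_rng(0)
mu = rng.uniform(size=(50, 4))       # 50 samples in a 4-dim parameter domain
                                     # (a sparse grid would replace these)

q = np.sin(mu.sum(axis=1))           # hypothetical stand-in for PDE solves

# Fit the kernel interpolant: solve K alpha = q (small ridge for stability)
K = gaussian_kernel(mu, mu)
alpha = np.linalg.solve(K + 1e-10 * np.eye(len(mu)), q)

# Evaluate the surrogate at new parameter values
mu_new = rng.uniform(size=(5, 4))
q_approx = gaussian_kernel(mu_new, mu) @ alpha
```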
11:30
25 mins
Kernel methods for inverse problems in the context of parametric equations
Christian Rieger
Abstract: In this talk, we review kernel-based methods for parametric differential equations. We will mostly focus on the inverse problem of identifying the parameters of a parametric differential equation. We will present an error analysis using sampling inequalities and highlight similarities with methods from kernel-based greedy approximation.
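Since the error analysis rests on sampling inequalities, a hedged sketch of their typical shape may help (illustrative form; the exact statement in the talk may differ):

```latex
% A typical sampling inequality (constants and exponents depend on the
% setting): for u in the Sobolev space H^sigma(Omega) with sigma > d/2,
% and a discrete point set X in Omega with sufficiently small fill
% distance h,
\|u\|_{L^\infty(\Omega)} \le C \left( h^{\sigma - d/2}
    \|u\|_{H^\sigma(\Omega)} + \|u|_X\|_{\ell^\infty(X)} \right).
% Applied to the residual of a reconstruction, such inequalities turn
% discrete data-fit bounds into continuous error estimates.
```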
11:55
25 mins
The Kernel Tensor-Product Multi-Level Method
Rüdiger Kempf, Holger Wendland
Abstract: In applications such as machine learning and uncertainty quantification, one of the main tasks is to reconstruct an unknown function from given data whose data sites lie in a high-dimensional domain. Even for relatively small domain dimensions, this task is usually numerically difficult. We propose a new reconstruction scheme that combines the well-known kernel multi-level technique in low-dimensional domains with the anisotropic Smolyak algorithm, which allows us to derive a high-dimensional interpolation scheme. This new method has significantly lower complexity than traditional high-dimensional interpolation schemes. In this talk, I will give an introduction to the topics of kernel multi-level and anisotropic Smolyak algorithms before providing a convergence result for this new Kernel Tensor-Product Multi-Level method. If time permits, I will also give numerical examples. This talk is based upon joint work with H. Wendland (Bayreuth).
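To make the two building blocks concrete, here is a hedged sketch in illustrative notation (the talk's anisotropic weighting and exact operators may differ):

```latex
% Multi-level kernel correction: interpolate the residual on successively
% finer point sets X_1, ..., X_L, with the kernel scale adapted to each
% level (I_{X_l} denotes kernel interpolation on X_l):
s_0 = 0, \qquad
s_\ell = s_{\ell-1} + I_{X_\ell}\bigl(f - s_{\ell-1}\bigr),
\qquad \ell = 1, \dots, L.
% Classical (isotropic) Smolyak combination of univariate operators U^{i};
% the anisotropic variant weights coordinate directions differently:
A(q, d) = \sum_{q-d+1 \le |\mathbf{i}|_1 \le q} (-1)^{q - |\mathbf{i}|_1}
    \binom{d-1}{q - |\mathbf{i}|_1}
    \bigl( U^{i_1} \otimes \cdots \otimes U^{i_d} \bigr).
```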