European Numerical Mathematics and Advanced Applications (ENUMATH)
08:30 25 mins
High resolution alignment of shocks for reduced order models of parametric hyperbolic problems
Gerrit Welper

Abstract: Shocks for parametric hyperbolic problems in particular, and parameter dependent jumps in general, pose severe challenges for the construction of efficient reduced order models. Standard approaches perform poorly because the necessary regularity, such as fast decay of the singular values or of the Kolmogorov $n$-width, is not available. Nonetheless, several groups have developed more non-linear reduced order models suitable for jump singularities, such as shifted POD, TSMOR, displacement interpolation or transformed snapshot interpolation. Typically, these methods add a shift, transport or transformation of the physical variables to the reduced order model in order to realign all parameter dependent jumps to the correct location for the reconstruction at some online target parameter. To illustrate the idea, one may compute several snapshots $u(x, \mu_i)$ and shift them in the physical variables, $u(x - s_i(x, \mu), \mu_i)$, to align all jumps with the jump set of the solution $u(\cdot, \mu)$ we wish to approximate at some new parameter $\mu$. The values $s_i(x, \mu)$ specify how far each snapshot is shifted, depending on location and parameter. One common drawback of most contemporary methods is that the spatial resolution of $s_i(x, \mu)$ is very low; the shifts are usually low order polynomials or even constants in $x$. This quickly becomes too limiting if several jumps move in various directions with respect to $\mu$, since locally $s_i(x, \mu)$ must "track" each jump. In particular, if jumps are close or even collide, the shift $s_i(x, \mu)$ itself develops sharp gradients or jumps in $x$. In that case, the problem of finding suitable discretizations of $s_i(x, \mu)$ has all the features of the original problem: it is a parametric function with parameter dependent sharp gradients or jumps.
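The realignment idea described in the abstract can be illustrated with a minimal sketch (not taken from the talk): 1D step-function "snapshots" whose jump location plays the role of a parameter dependent shock, realigned with constant-in-$x$ shifts $s_i(\mu)$ before being combined. All names and the model problem are illustrative assumptions.

```python
import numpy as np

# Hypothetical 1D model problem: u(x, mu) is a step function whose jump
# sits at x = mu, a stand-in for a parameter dependent shock.
x = np.linspace(0.0, 1.0, 401)
u = lambda x, mu: np.where(x < mu, 1.0, 0.0)

mus_train = [0.3, 0.5]   # training parameters mu_i
mu_target = 0.4          # online target parameter mu

# Constant-in-x shifts s_i = mu_target - mu_i realign each snapshot's
# jump to the target jump location; the methods in the abstract would
# instead learn spatially varying shifts s_i(x, mu) from data.
shifted = [u(x - (mu_target - mu_i), mu_i) for mu_i in mus_train]

# After alignment, a plain linear combination reconstructs the target
# solution well; averaging the unshifted snapshots smears the jump.
recon = 0.5 * (shifted[0] + shifted[1])
err_aligned = np.mean(np.abs(recon - u(x, mu_target)))
err_raw = np.mean(np.abs(0.5 * (u(x, mus_train[0]) + u(x, mus_train[1]))
                         - u(x, mu_target)))
print(err_aligned, err_raw)
```

The raw average has an $O(1)$ error on the whole region swept by the jump, while the aligned reconstruction is essentially exact, which is the motivation for building the shift into the reduced order model.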
In the first part of the proposed talk, we present a judicious discretization of $s_i(x, \mu)$ that avoids this circular argument. Similar to neural networks, the shifts or transforms $s_i(x, \mu)$ are "trained" by minimizing the reconstruction error over a training sample of snapshots with a gradient descent method. We discuss an appropriate Banach space framework for this optimization problem, because naive approaches fail to yield satisfactory results for shifts $s_i(x, \mu)$ with high spatial resolution.

08:55 25 mins
Barycentric subspaces: natural series of embedded subspaces in manifolds for dimension reduction
Xavier Pennec

Abstract: In the context of geometric statistics (statistics on objects with a geometric structure), a key issue is the generalization of Principal Component Analysis (PCA) to manifolds and, potentially, to more general stratified spaces. Tangent PCA is often sufficient for analyzing data that are sufficiently centered around a central value (unimodal or Gaussian-like data), but it fails for multimodal or large support distributions (e.g. uniform on closed compact subspaces). Instead of a covariance matrix analysis, Principal Geodesic Analysis (PGA) and Geodesic PCA (GPCA) minimize the distance to geodesic subspaces (GS), which are spanned by the geodesics going through a point with tangent vector restricted to a linear subspace. Other methods, like Principal Nested Spheres (PNS), are restricted to simpler manifolds but emphasize the need for nestedness of the resulting principal subspaces. In this work, we first propose a new and more general family of subspaces in manifolds called barycentric subspaces. They are implicitly defined as the locus of points which are weighted means of $k+1$ reference points. As this definition relies on points and not on tangent vectors, it can also be extended to geodesic spaces which are not Riemannian.
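The weighted-mean construction behind barycentric subspaces can be made concrete with a small numerical sketch (illustrative only, not from the talk): a weighted Fréchet mean of $k+1 = 3$ reference points on the unit sphere, computed by the standard fixed-point iteration with the sphere's exp and log maps. Sweeping the weights over the simplex traces out a $k = 2$ barycentric subspace.

```python
import numpy as np

def sphere_exp(m, v):
    """Exponential map on the unit sphere at base point m."""
    n = np.linalg.norm(v)
    if n < 1e-12:
        return m
    return np.cos(n) * m + np.sin(n) * (v / n)

def sphere_log(m, p):
    """Log map: tangent vector at m pointing towards p."""
    w = p - np.dot(m, p) * m            # project p onto the tangent plane at m
    nw = np.linalg.norm(w)
    if nw < 1e-12:
        return np.zeros_like(m)
    return np.arccos(np.clip(np.dot(m, p), -1.0, 1.0)) * w / nw

def weighted_mean(points, weights, iters=100):
    """Weighted Fréchet mean via the iteration m <- exp_m(sum_i w_i log_m p_i)."""
    m = points[0]
    for _ in range(iters):
        v = sum(w * sphere_log(m, p) for w, p in zip(weights, points))
        m = sphere_exp(m, v)
    return m

# Three reference points on S^2 (the coordinate axes); equal weights give
# the symmetric barycenter (1, 1, 1) / sqrt(3).
refs = [np.array([1.0, 0.0, 0.0]),
        np.array([0.0, 1.0, 0.0]),
        np.array([0.0, 0.0, 1.0])]
m = weighted_mean(refs, [1 / 3, 1 / 3, 1 / 3])
print(m)
```

Because the definition only needs weighted means of points, the same locus makes sense wherever means can be defined, which is what allows the extension beyond the Riemannian setting.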
For instance, in stratified spaces it naturally allows principal subspaces that span several strata, which is not the case with PGA. Barycentric subspaces locally define a submanifold of dimension $k$ which generalizes geodesic subspaces. Like in PGA, barycentric subspaces can naturally be nested, which allows the construction of inductive forward nested subspaces (flags) approximating the data points and containing the Fréchet mean. However, it also allows the construction of backward flags which may not contain the mean. Second, we rephrase PCA in Euclidean spaces as an optimization on flags of linear subspaces (hierarchies of properly embedded linear subspaces of increasing dimension). For that purpose, we propose an extension of the unexplained variance criterion that generalizes nicely to flags of barycentric subspaces in Riemannian manifolds. This results in a particularly appealing generalization of PCA on manifolds, which we call Barycentric Subspace Analysis (BSA). We illustrate the method on spherical and hyperbolic spaces, and on diffeomorphisms encoding the deformation of the heart in cardiac image sequences.

09:20 25 mins
Reduced Order Modeling of a Free Boundary Osmotic Cell Swelling Problem with Exact Mass Conservation
Christoph Lehrenfeld, Stephan Rave

Abstract: We consider reduced basis techniques for the model order reduction of a free boundary problem describing an osmotic cell. For the description of the geometry we concentrate on an Arbitrary Lagrangian-Eulerian formulation, resulting in a highly nonlinear unsteady problem with an explicit parametrization of the geometry. We adjust the empirical interpolation procedure to preserve the mass conservation property of the full-order model exactly. Numerical experiments are provided that highlight the performance of the resulting reduced order model.
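The abstract does not spell out how the empirical interpolation is adjusted; a generic way to enforce exact conservation of a linear functional such as mass in a reduced approximation is a rank-one correction along a fixed profile. The following sketch illustrates only that generic idea, under assumed names and data, and is not the authors' method.

```python
import numpy as np

# Hypothetical setup: a full-order state u on a uniform 1D grid and a
# reduced reconstruction u_r that loses a little mass.
n = 200
dx = 1.0 / n
grid = np.linspace(0.0, 1.0, n)
u = np.exp(-((grid - 0.5) ** 2) / 0.01)                        # full-order state
u_r = u + 1e-3 * np.random.default_rng(0).standard_normal(n)   # reduced approx.

mass = lambda w: dx * w.sum()        # discrete mass functional
defect = mass(u) - mass(u_r)         # mass lost by the reduced model

# Restore the total mass exactly by correcting along a fixed profile phi
# (here simply constant); the correction is rank-one and cheap to apply.
phi = np.ones(n)
u_c = u_r + (defect / mass(phi)) * phi

print(abs(mass(u_c) - mass(u)))      # mass defect after correction
```

The corrected state carries exactly the full-order mass up to round-off, independently of how accurate the reduced reconstruction itself is.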
09:45 25 mins
Sub-Riemannian approaches and shape-aware metrics on shape spaces
Alain Trouve

Abstract: The Riemannian point of view of so-called diffeomorphometry integrates several important issues in the analysis of high dimensional shape variations within a tractable numerical framework. More recently, modular sub-Riemannian approaches on shape spaces have opened the possibility of a more decomposable, shape driven analysis of variations and evolutions. In this talk we will advocate that, paradoxically, quite sophisticated tools are still needed to allow a simple and user-friendly incorporation of meaningful prior knowledge into the mathematical shape analysis machinery.