Preprint-No.: 454
Published in: July 2016
PDF-File: IGPM454.pdf
Title: Parametric PDEs: Sparse or low-rank approximations?
Authors: Markus Bachmayr, Albert Cohen, Wolfgang Dahmen
We consider a class of parametric operator equations in which the parameters involved may be either deterministic or stochastic in nature. In both cases we focus on scenarios involving a large number of parameters. Typical strategies for addressing the challenges posed by high dimensionality use low-rank approximations of solutions based on a separation of spatial and parametric variables. One such strategy performs sparse best n-term approximations of the solution map in an a priori chosen system of tensor product form in the parametric variables. This approach has been extensively analyzed in the case of tensor product Legendre polynomial bases, for which approximation rates have been established. The objective of this paper is to investigate what can be gained by exploiting further low-rank structures, in particular by using optimized systems of basis functions obtained by singular value decomposition techniques. On the theoretical side, we show that optimized low-rank expansions can bring either significant improvement or none at all over sparse polynomial expansions, depending on the type of parametric problem. On the computational side, we analyze an adaptive solver which, at near-optimal computational cost for this type of approximation, exploits low-rank structure as well as sparsity of basis expansions.
Keywords: parameter-dependent PDEs, low-rank approximations, sparse polynomial expansions, a posteriori error estimates, adaptive methods, complexity bounds
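To illustrate the kind of SVD-based low-rank separation of spatial and parametric variables discussed in the abstract, the following minimal sketch builds a toy parameter-dependent solution sampled on a grid and compresses it with a truncated SVD. The model problem, grid sizes, and variable names are illustrative assumptions, not the operator equations or the adaptive solver studied in the paper.

```python
import numpy as np

# Hypothetical toy setup (assumption, not the paper's model problem):
# solution samples u(x_i, y_j) on a spatial grid x for parameter points y.
rng = np.random.default_rng(0)
x = np.linspace(0.0, 1.0, 200)           # spatial grid
y = rng.uniform(-1.0, 1.0, size=50)      # parameter samples

# A separable toy solution with smooth parameter dependence:
# u(x, y) = sin(pi x) / (2 + y) + 0.1 * y * x * (1 - x).
U = np.array(
    [np.sin(np.pi * x) / (2.0 + yj) + 0.1 * yj * x * (1.0 - x) for yj in y]
).T  # snapshot matrix, shape (n_x, n_y)

# SVD-based low-rank approximation: the rank-r truncated SVD separates
# spatial factors v_k(x) from parametric factors w_k(y), and is optimal
# among rank-r approximations in the Frobenius norm.
V, s, Wt = np.linalg.svd(U, full_matrices=False)
r = 2
U_r = V[:, :r] @ np.diag(s[:r]) @ Wt[:r, :]

rel_err = np.linalg.norm(U - U_r) / np.linalg.norm(U)
print(f"rank-{r} relative error: {rel_err:.2e}")
```

Since this toy solution is a sum of two separable terms, the snapshot matrix has exact rank 2 and the rank-2 truncation recovers it to machine precision; for genuinely high-dimensional parametric problems, the interesting question (addressed in the paper) is how fast such singular values decay compared with sparse polynomial truncation.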