Preprint No.: 412   Published in: November 2014   PDF file: IGPM412.pdf
Title: Convergence of Alternating Least Squares Optimisation for Rank-One Approximation to High Order Tensors
Authors: Mike Espig, Aram Khachatryan
Abstract:
The approximation of tensors has important applications in various disciplines, but it remains an extremely challenging task. It is well known that tensors of higher order can fail to have best low-rank approximations; an important exception is that a best rank-one approximation always exists. The most popular approach to low-rank approximation is the alternating least squares (ALS) method. In this paper we analyse the convergence of the ALS algorithm for the rank-one approximation problem, focusing on global convergence and the rate of convergence. It is shown that the ALS method can converge sublinearly, Q-linearly, and even Q-superlinearly. Our theoretical results are illustrated on explicit examples.
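For a third-order tensor, the ALS iteration analysed in the paper can be sketched as follows (a minimal NumPy illustration; the function name, initialisation, and stopping rule are our own choices, not taken from the paper):

```python
import numpy as np

def als_rank_one(T, iters=50, seed=0):
    """Approximate a 3rd-order tensor T by lam * (a o b o c) via ALS.

    Each sweep fixes two factors and solves the least-squares problem
    for the remaining one, which reduces to a tensor contraction
    followed by normalisation.
    """
    rng = np.random.default_rng(seed)
    # Random unit-norm starting guesses (illustrative choice)
    a = rng.standard_normal(T.shape[0]); a /= np.linalg.norm(a)
    b = rng.standard_normal(T.shape[1]); b /= np.linalg.norm(b)
    c = rng.standard_normal(T.shape[2]); c /= np.linalg.norm(c)
    for _ in range(iters):
        # Update each factor in turn by contracting T with the other two
        a = np.einsum('ijk,j,k->i', T, b, c); a /= np.linalg.norm(a)
        b = np.einsum('ijk,i,k->j', T, a, c); b /= np.linalg.norm(b)
        c = np.einsum('ijk,i,j->k', T, a, b); c /= np.linalg.norm(c)
    # Optimal scale for unit-norm factors
    lam = np.einsum('ijk,i,j,k->', T, a, b, c)
    return lam, a, b, c
```

On a tensor that is exactly rank one, the iteration recovers the factors up to sign, and `lam * a o b o c` reproduces the input; on general tensors the rate of convergence can vary, which is precisely the behaviour the paper characterises.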
Keywords: tensor format, tensor representation, tensor network, alternating least squares optimisation, orthogonal projection method