CUED Publications database

Regression on fixed-rank positive semidefinite matrices: A Riemannian approach

Meyer, G and Bonnabel, S and Sepulchre, R (2011) Regression on fixed-rank positive semidefinite matrices: A Riemannian approach. Journal of Machine Learning Research, 12. pp. 593-625. ISSN 1532-4435

Full text not available from this repository.

Abstract

The paper addresses the problem of learning a regression model parameterized by a fixed-rank positive semidefinite matrix. The focus is on the nonlinear nature of the search space and on scalability to high-dimensional problems. The mathematical developments rely on the theory of gradient descent algorithms adapted to the Riemannian geometry that underlies the set of fixed-rank positive semidefinite matrices. In contrast with previous contributions in the literature, no restrictions are imposed on the range space of the learned matrix. The resulting algorithms maintain linear complexity in the problem size and enjoy important invariance properties. We apply the proposed algorithms to the problem of learning a distance function parameterized by a positive semidefinite matrix. Good performance is observed on classical benchmarks. © 2011 Gilles Meyer, Silvere Bonnabel and Rodolphe Sepulchre.
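To illustrate the kind of problem the abstract describes, here is a minimal sketch of learning a distance function parameterized by a fixed-rank positive semidefinite matrix M = G Gᵀ, where updating the factor G keeps M positive semidefinite with rank at most r and keeps the per-step cost linear in the dimension. This is a plain (flat-space) gradient step on the factor for illustration only, not the paper's Riemannian algorithms; the function names and the squared-error loss are assumptions for this sketch.

```python
import numpy as np

def distance(G, x, y):
    # Squared distance under M = G G^T:
    # (x - y)^T M (x - y) = ||G^T (x - y)||^2.
    d = G.T @ (x - y)
    return float(d @ d)

def grad_step(G, x, y, target, lr=0.01):
    # One gradient step on the illustrative loss
    # l(G) = (dist_M(x, y) - target)^2 with M = G G^T.
    diff = x - y
    err = distance(G, x, y) - target
    # dl/dG = 4 * err * (diff diff^T) G; costs O(d * r), linear in d.
    grad = 4.0 * err * np.outer(diff, diff @ G)
    return G - lr * grad
```

Because only the d-by-r factor G is ever stored or updated, the PSD and rank constraints hold by construction; the paper's contribution is to replace the flat gradient step above with updates that respect the Riemannian geometry of the fixed-rank PSD set.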

Item Type: Article
Uncontrolled Keywords: Gradient-based learning; Linear regression; Low-rank approximation; Positive semidefinite matrices; Riemannian geometry
Subjects: UNSPECIFIED
Divisions: Div F > Control
Depositing User: Cron Job
Date Deposited: 07 Mar 2014 11:25
Last Modified: 08 Dec 2014 02:18
DOI: