Guessing Efficiently for Constrained Subspace Approximation
- Aditya Bhaskara,
- Sepideh Mahabadi,
- Madhusudhan Reddy Pittu,
- Ali Vakilian,
- David Woodruff
International Colloquium on Automata, Languages, and Programming (ICALP)
In this paper we study the constrained subspace approximation problem. Given a set of points {a_1, …, a_n} in R^d, the goal of subspace approximation is to find a k-dimensional subspace that best approximates the input points. More precisely, for a given p ≥ 1, we aim to minimize the pth power of the l_p norm of the error vector (||a_1 − Pa_1||, …, ||a_n − Pa_n||), where P denotes the projection matrix onto the subspace and the norms are Euclidean. In constrained subspace approximation (CSA), we additionally impose constraints on the projection matrix P: in its most general form, we require P to belong to a given subset S, described either explicitly or implicitly.
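To make the objective concrete: in the unconstrained case with p = 2, a classical fact (Eckart–Young) says the optimal k-dimensional subspace is spanned by the top-k right singular vectors of the data matrix. The NumPy sketch below (function names and setup are ours, not from the paper) evaluates the CSA objective for that special case and checks it against the tail of the singular-value spectrum.

```python
import numpy as np

def csa_objective(A, P, p):
    """Sum over points a_i (rows of A) of ||a_i - P a_i||^p, Euclidean norms."""
    residuals = np.linalg.norm(A - A @ P, axis=1)
    return np.sum(residuals ** p)

def best_subspace_p2(A, k):
    """Projection onto the top-k right singular subspace of A.

    For p = 2 (and no constraint set S), this is the optimal choice of P
    by the Eckart-Young theorem.
    """
    _, _, Vt = np.linalg.svd(A, full_matrices=False)
    V = Vt[:k].T          # d x k orthonormal basis
    return V @ V.T        # d x d projection matrix

rng = np.random.default_rng(0)
A = rng.standard_normal((100, 5))   # n = 100 points in R^5
P = best_subspace_p2(A, k=2)

# The optimal p = 2 cost equals the sum of squared discarded singular values.
cost = csa_objective(A, P, p=2)
tail = np.sum(np.linalg.svd(A, compute_uv=False)[2:] ** 2)
assert np.isclose(cost, tail)
```

The constrained variants studied in the paper replace `best_subspace_p2` with a search over projections restricted to the set S, where no such closed form is available in general.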
We introduce a general framework for constrained subspace approximation. Our approach, which we term coreset-guess-solve, yields either (1+ε)-multiplicative or ε-additive approximations for a variety of constraints. We show that it gives new algorithms for partition-constrained subspace approximation, with applications to fair subspace approximation, k-means clustering, and projected non-negative matrix factorization, among others. Specifically, while we recover the best known bounds for k-means clustering in Euclidean spaces, we improve the known results for all the other problems.