Membership Representation for Detecting Block-Diagonal Structure in Low-Rank or Sparse Subspace Clustering

Minsik Lee, Jieun Lee, Hyeogjin Lee, Nojun Kwak; The IEEE Conference on Computer Vision and Pattern Recognition (CVPR), 2015, pp. 1648-1656

Abstract


Recently, there have been many proposals with state-of-the-art results in subspace clustering that take advantage of low-rank or sparse optimization techniques. These methods are based on self-expressive models, which have well-established theoretical foundations. They produce matrices with (approximately) block-diagonal structure, to which spectral clustering is then applied. However, there is no definitive way to construct affinity matrices from these block-diagonal matrices, and it is unclear how the construction method affects performance. In this paper, we propose an alternative approach to detecting block-diagonal structure in these matrices. The proposed method shares the philosophy of the above subspace clustering methods in that it is a self-expressive system based on the Hadamard product with a membership matrix. To resolve the difficulty of handling the membership matrix, we solve a convex relaxation of the problem and then transform the representation into a doubly stochastic matrix, which is closely related to spectral clustering. The resulting matrix has eigenvalues normalized between zero and one, which makes it more reliable for estimating the number of clusters and performing spectral clustering. The proposed method shows competitive results in our experiments, even though we simply count the number of eigenvalues larger than a certain threshold to find the number of clusters.
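The cluster-counting idea at the end of the abstract can be illustrated with a small, hypothetical sketch (not the paper's code): an ideal doubly stochastic membership matrix is block diagonal with one eigenvalue equal to one per block and the rest zero, so counting eigenvalues above a threshold recovers the number of clusters. The function names and the threshold value 0.5 below are illustrative assumptions.

```python
import numpy as np

def block_uniform(sizes):
    """Ideal block-diagonal doubly stochastic matrix: each cluster of size s
    becomes a uniform s-by-s block with entries 1/s (rows and columns sum to 1).
    Illustrative construction, not the paper's optimization output."""
    n = sum(sizes)
    M = np.zeros((n, n))
    start = 0
    for s in sizes:
        M[start:start + s, start:start + s] = 1.0 / s
        start += s
    return M

def estimate_num_clusters(M, threshold=0.5):
    """Count eigenvalues of the symmetric matrix M above `threshold`.
    For an ideal block-diagonal doubly stochastic M, each block contributes
    exactly one eigenvalue equal to 1; all other eigenvalues are 0."""
    eigvals = np.linalg.eigvalsh(M)
    return int(np.sum(eigvals > threshold))

# Three clusters of sizes 4, 5, 6: eigenvalues are 1 (three times) and 0.
M = block_uniform([4, 5, 6])
print(estimate_num_clusters(M))  # -> 3
```

In the noisy, real case the eigenvalues are only approximately zero or one, but since they remain normalized to [0, 1], a fixed threshold is meaningful, which is the reliability the abstract points to.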

Related Material


[bibtex]
@InProceedings{Lee_2015_CVPR,
author = {Lee, Minsik and Lee, Jieun and Lee, Hyeogjin and Kwak, Nojun},
title = {Membership Representation for Detecting Block-Diagonal Structure in Low-Rank or Sparse Subspace Clustering},
booktitle = {The IEEE Conference on Computer Vision and Pattern Recognition (CVPR)},
month = {June},
year = {2015}
}