This paper proposes a new method for image segmentation based on spectral graph theory. The presented algorithm, called the normalized cut, has stimulated a great deal of similar work in this field (e.g., this one). The paper also discusses several related techniques and provides a concrete mathematical derivation of the algorithm, all of which I found insightful. I have recently been learning spectral methods myself and found this paper quite helpful :)
Normalized cuts is based on an intuitive idea: to cluster data properly, we should maximize the within-group similarity while minimizing the between-group similarity. The authors propose the Ncut measure to capture this. Although finding the exact optimum is NP-hard, relaxing the resulting optimization to the real domain turns it into a generalized eigenvector problem. The normalized cut is superior to the conventional minimum-cut criterion in that it does not favor cutting off small, isolated sets of points. In fact, the idea is quite similar in spirit to LDA, and both methods reduce to solving a generalized eigenvector problem.
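To make the relaxation concrete, here is a minimal sketch (my own, not the authors' code) of the two-way split: build an affinity matrix W, solve the generalized eigenproblem (D - W) y = λ D y, and threshold the second-smallest eigenvector. For simplicity I threshold at zero, whereas the paper searches over splitting points for the one with the smallest Ncut value; the Gaussian affinity and the toy 1D data are my own assumptions for illustration.

```python
# Minimal sketch of the normalized-cut relaxation, assuming a dense affinity matrix W.
import numpy as np
from scipy.linalg import eigh

def ncut_bipartition(W):
    """Split the nodes into two groups via the second-smallest generalized
    eigenvector of (D - W) y = lambda * D y."""
    d = W.sum(axis=1)
    D = np.diag(d)
    L = D - W                      # unnormalized graph Laplacian
    vals, vecs = eigh(L, D)        # generalized eigenproblem, eigenvalues ascending
    y = vecs[:, 1]                 # second-smallest eigenvector
    return y > 0                   # simple zero threshold (the paper searches for the best split)

# Toy usage: two well-separated 1D clusters with a Gaussian affinity (my choice of example).
x = np.array([0.0, 0.1, 0.2, 5.0, 5.1, 5.2])
sigma = 1.0
W = np.exp(-(x[:, None] - x[None, :]) ** 2 / (2 * sigma ** 2))
print(ncut_bipartition(W))         # e.g. [False False False  True  True  True]
```

Recursively applying this bipartition (or using more eigenvectors) gives a multi-way segmentation, which is how the paper extends the basic two-way cut.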
The experiments in the paper do help us appreciate the idea better. In particular, the experiment on 1D data shows how other spectral methods can fail in specific cases. I also like the spring-mass analogy; it is really interesting.