Exploring Collections of 3D Models using Fuzzy Correspondences
Vladimir G. Kim, Wilmot Li, Niloy J. Mitra, Stephen DiVerdi, Thomas Funkhouser
ACM SIGGRAPH 2012

Abstract:

Large collections of 3D models from the same object class (e.g., chairs, cars, animals) are now commonly available via many public repositories, but exploring the range of shape variations across such collections remains a challenging task. In this work, we present a new exploration interface that allows users to browse collections based on similarities and differences between shapes in user-specified regions of interest (ROIs). To support this interactive system, we introduce a novel analysis method for computing similarity relationships between points on 3D shapes across a collection. We encode the inherent ambiguity in these relationships using fuzzy point correspondences and propose a robust and efficient computational framework that estimates fuzzy correspondences using only a sparse set of pairwise model alignments. We evaluate our analysis method on a range of correspondence benchmarks and report substantial improvements in both speed and accuracy over existing alternatives. In addition, we demonstrate how fuzzy correspondences enable key features in our exploration tool, such as automated view alignment, ROI-based similarity search, and faceted browsing.
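To give a flavor of the idea, here is a minimal, hypothetical sketch (not the paper's actual algorithm) of how fuzzy point correspondences can be represented and propagated: each pairwise alignment between shapes is encoded as a soft correspondence matrix, and composing such matrices along a chain of shapes yields a fuzzy map whose rows are probability-like distributions, capturing ambiguity such as symmetric parts. The function name `compose_fuzzy` and the toy matrices are illustrative assumptions.

```python
import numpy as np

def compose_fuzzy(C_ab, C_bc):
    """Compose two soft correspondence matrices and row-normalize,
    so each row is a fuzzy (probability-like) map from shape A to shape C.
    Hypothetical simplification, not the paper's spectral formulation."""
    F = C_ab @ C_bc
    row_sums = F.sum(axis=1, keepdims=True)
    return F / np.maximum(row_sums, 1e-12)

# Toy example: 3 sample points per shape.
# A hard (one-to-one) alignment from shape A to shape B...
C_ab = np.eye(3)
# ...and an ambiguous alignment from B to C: point 0 of B could
# match either point 0 or point 1 of C (e.g., a symmetric part).
C_bc = np.array([[0.5, 0.5, 0.0],
                 [0.0, 1.0, 0.0],
                 [0.0, 0.0, 1.0]])

F_ac = compose_fuzzy(C_ab, C_bc)  # fuzzy map from A to C
print(F_ac[0])  # point 0 of A maps to points 0 and 1 of C with weight 0.5 each
```

In the paper itself, such sparse pairwise alignments are combined across the whole collection far more robustly; this snippet only illustrates why soft (fuzzy) matrices are a natural encoding for ambiguous point relationships.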

Code, data, etc.:

Please refer to project page.

Acknowledgements:

We thank Marc Alexa, Yaron Lipman, Amit Singer, and the anonymous reviewers for their comments and suggestions. The project was partially supported by NSERC, NSF, AFOSR, Intel, Google, and Marie Curie Career Integration Grant 303541.

Bibtex:

@article{klmdf_fuzzyCorrespondence_sigg12,
  author    = {Vladimir G. Kim and Wilmot Li and Niloy J. Mitra and Stephen DiVerdi and Thomas Funkhouser},
  title     = {Exploring Collections of 3D Models using Fuzzy Correspondences},
  journal   = {ACM Transactions on Graphics},
  volume    = {31},
  number    = {4},
  year      = {2012},
  pages     = {54:1--54:11},
  articleno = {54},
  numpages  = {11},
}

paper (24MB) slides (200MB)