
Papers

Towards Scalable 3D Shape Retrieval
Jul 14, 2017

Title: GIFT: Towards Scalable 3D Shape Retrieval

 Authors: Bai, S; Bai, X; Zhou, ZC; Zhang, ZX; Tian, Q; Latecki, LJ

 Author Full Names: Bai, Song; Bai, Xiang; Zhou, Zhichao; Zhang, Zhaoxiang; Tian, Qi; Latecki, Longin Jan

 Source: IEEE TRANSACTIONS ON MULTIMEDIA, 19 (6):1257-1271; 10.1109/TMM.2017.2652071 JUN 2017

 Language: English

 Abstract: Projective analysis is an important solution in three-dimensional (3D) shape retrieval, since human visual perception of 3D shapes relies on various 2D observations from different viewpoints. Although multiple informative and discriminative views are utilized, most projection-based retrieval systems suffer from heavy computational cost and thus cannot satisfy the basic scalability requirement of search engines. In the past three years, the shape retrieval contest (SHREC) has paid much attention to the scalability of 3D shape retrieval algorithms and has organized several large-scale tracks accordingly [1]-[3]. However, the experimental results indicate that conventional algorithms cannot be directly applied to large datasets. In this paper, we present a real-time 3D shape search engine based on the projective images of 3D shapes. The real-time property of our search engine results from the following aspects: 1) projection and view feature extraction are accelerated on the GPU; 2) a first inverted file, called F-IF, is utilized to speed up multiview matching; and 3) a second inverted file, which captures the local distribution of 3D shapes in the feature manifold, is adopted for efficient context-based reranking. As a result, the retrieval task for each query can be finished within one second, even with the necessary I/O overhead. We name the proposed 3D shape search engine GIFT, as it combines GPU acceleration and the inverted file (twice). Besides its high efficiency, GIFT also outperforms state-of-the-art methods significantly in retrieval accuracy on various shape benchmarks (ModelNet40, ModelNet10, PSB, and McGill datasets) and competitions (SHREC14LSGTB, ShapeNet Core55, WM-SHREC07).
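The abstract describes matching a query's projected views against database shapes through an inverted file. The following is a minimal Python sketch of that general idea only, not the paper's F-IF construction or its GPU feature extractor; the helpers quantize, build_inverted_file, and query, the codebook, and the toy data are all hypothetical, and view features are assigned to their nearest codeword so that a query view only votes for shapes sharing that codeword.

```python
# Hypothetical sketch of inverted-file multi-view matching (not the paper's F-IF code).
import numpy as np
from collections import defaultdict

def quantize(feature, codebook):
    """Assign a view feature to its nearest codeword (visual word)."""
    return int(np.argmin(np.linalg.norm(codebook - feature, axis=1)))

def build_inverted_file(db_view_features, codebook):
    """Map each codeword to the shapes whose views quantize to it."""
    inv = defaultdict(list)
    for shape_id, views in db_view_features.items():
        for f in views:
            inv[quantize(f, codebook)].append(shape_id)
    return inv

def query(query_views, inv, codebook):
    """Vote for database shapes that share codewords with the query's views."""
    votes = defaultdict(int)
    for f in query_views:
        for shape_id in inv[quantize(f, codebook)]:
            votes[shape_id] += 1
    return sorted(votes, key=votes.get, reverse=True)

# Toy usage with random data: 4 shapes x 12 views, 8-D features, 16 codewords.
rng = np.random.default_rng(0)
codebook = rng.normal(size=(16, 8))
db = {i: rng.normal(size=(12, 8)) for i in range(4)}
inv = build_inverted_file(db, codebook)
print(query(db[2][:5], inv, codebook))  # shape 2 should rank near the top
```

Because only the shapes indexed under the query's codewords are touched, matching cost scales with the lengths of a few posting lists rather than with the full database, which is the efficiency argument the abstract makes for the inverted file.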

 ISSN: 1520-9210

 eISSN: 1941-0077

 IDS Number: EY5YS

 Unique ID: WOS:000404059400012
