
Tracks Selection for Robust, Efficient and Scalable Large-Scale Structure from Motion
Oct 30, 2017

Title: Tracks Selection for Robust, Efficient and Scalable Large-Scale Structure from Motion

 Authors: Cui, HN; Shen, SH; Hu, ZY

 Author Full Names: Cui, Hainan; Shen, Shuhan; Hu, Zhanyi

Source: PATTERN RECOGNITION, 72: 341-354; DOI: 10.1016/j.patcog.2017.08.002; DEC 2017

 Language: English

Abstract: Currently, the global structure-from-motion (SfM) pipeline consists of four steps: estimating camera rotations, computing camera positions, triangulating tracks, and finally performing bundle adjustment. However, for large-scale SfM problems, the tracks are usually too noisy and redundant for bundle adjustment. In this work, we therefore propose a novel fast tracks selection method to improve both the efficiency and the robustness of bundle adjustment. First, three selection criteria are introduced: Compactness, Accurateness, and Connectedness, where the first two are used to compute a selection priority for each track and the third guarantees the completeness of the scene structure. Then, to satisfy these criteria, a more informative subset of tracks is selected by covering multiple spanning trees of the epipolar geometry graph. Since tracks selection acts only as an intermediate step in the whole SfM pipeline, it can in principle be embedded into any global SfM pipeline. To validate the effectiveness of our tracks selection module, we insert it into a state-of-the-art global SfM system and compare it with three other selection methods. Extensive experiments show that, with our tracks selection module embedded, the new SfM system performs similarly to or better than the original one in terms of reconstruction completeness and accuracy, while being much more efficient and scalable for large-scale scene reconstructions. Finally, our tracks selection module is further embedded into two other global SfM systems to demonstrate its versatility. (C) 2017 Elsevier Ltd. All rights reserved.
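
The following is a minimal, hypothetical Python sketch of the idea described in the abstract: assign each track a selection priority (here a stand-in that combines track length and reprojection residual) and greedily keep high-priority tracks until every camera pair in the epipolar geometry graph is covered a few times, as a rough proxy for covering multiple spanning trees. The names, the priority formula, and the greedy edge-cover loop are illustrative assumptions, not the paper's actual Compactness/Accurateness/Connectedness definitions or its spanning-tree covering algorithm.

# Illustrative sketch only -- not the authors' implementation. The priority
# formula and the greedy edge-cover loop below are assumptions made for
# illustration of the general idea in the abstract.
from dataclasses import dataclass, field
from itertools import combinations
from typing import Dict, List, Set, Tuple


@dataclass
class Track:
    track_id: int
    cameras: Set[int]          # cameras observing this track
    mean_residual: float       # mean reprojection error (px), lower is better
    priority: float = field(init=False)

    def __post_init__(self) -> None:
        # Hypothetical priority: longer tracks with lower residuals rank higher.
        self.priority = len(self.cameras) / (1.0 + self.mean_residual)


def select_tracks(tracks: List[Track], edges: Set[Tuple[int, int]],
                  cover_per_edge: int = 3) -> List[Track]:
    """Greedily pick high-priority tracks until every epipolar-graph edge
    (camera pair) is covered `cover_per_edge` times, a rough stand-in for
    covering multiple spanning trees of the epipolar geometry graph."""
    coverage: Dict[Tuple[int, int], int] = {e: 0 for e in edges}
    selected: List[Track] = []
    for track in sorted(tracks, key=lambda t: t.priority, reverse=True):
        # Camera pairs of the epipolar graph that this track could still help cover.
        track_edges = [tuple(sorted(p)) for p in combinations(track.cameras, 2)]
        useful = [e for e in track_edges
                  if e in coverage and coverage[e] < cover_per_edge]
        if useful:
            selected.append(track)
            for e in useful:
                coverage[e] += 1
        if all(c >= cover_per_edge for c in coverage.values()):
            break
    return selected


if __name__ == "__main__":
    # Toy epipolar graph with three cameras and three hypothetical tracks.
    edges = {(0, 1), (1, 2), (0, 2)}
    tracks = [
        Track(0, {0, 1, 2}, 0.4),
        Track(1, {0, 1}, 1.5),
        Track(2, {1, 2}, 0.2),
    ]
    kept = select_tracks(tracks, edges, cover_per_edge=1)
    print([t.track_id for t in kept])  # prints [0]: one long, accurate track covers all pairs

In this toy run the single long, low-residual track already covers every camera pair once, so the redundant shorter tracks are dropped, which mirrors the abstract's goal of passing a smaller but still connected set of tracks to bundle adjustment.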

 ISSN: 0031-3203

 eISSN: 1873-5142

 IDS Number: FH9PY

 Unique ID: WOS:000411545400025
