Title: FatRegion: A Fast Adaptive Tree-Structured Region Extraction Approach
Authors: Xing, JL; Hu, WM; Ai, HZ; Yan, SC
Author Full Names: Xing, Junliang; Hu, Weiming; Ai, Haizhou; Yan, Shuicheng
Source: IEEE TRANSACTIONS ON CIRCUITS AND SYSTEMS FOR VIDEO TECHNOLOGY, 28 (3):601-615; DOI 10.1109/TCSVT.2016.2615466; MAR 2018
Language: English
Abstract: Coherent image regions can serve as good features for many computer vision tasks, such as object tracking, segmentation, and recognition. Most previous region extraction methods, however, are not suitable for online applications because of either their heavy computation or their unsatisfactory results. We propose a seed-based region growing and merging approach that generates image regions which are simultaneously coherent and discriminative. We present a quadtree-based seed initialization algorithm that adaptively places seeds into different image areas and then grows them into regions by a color- and edge-guided growing procedure. To merge these regions at different levels, we propose the generalized boundary strength as a measure of the quality of a region merging result. In addition, we present a region merging algorithm of linear time complexity that performs efficient and effective region merging. Overall, our new approach simultaneously offers the following advantages: 1) it is extremely fast, with linear complexity in both time and space, taking less than 50 ms to process an HVGA image; 2) it gives direct control of the region number and adapts well to image regions of various sizes and shapes; and 3) it provides a tree-structured representation of the regions and can thus model the image at multiple scales. We evaluate the proposed approach on standard benchmarks with extensive comparisons against state-of-the-art methods. The experimental results demonstrate its good overall performance. Example applications that use the extracted regions as features for online object tracking and multiclass object segmentation also exhibit its potential for many computer vision tasks.
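Note: A minimal sketch of the kind of quadtree-based adaptive seed placement the abstract describes, assuming a split-on-color-variance criterion; the function name quadtree_seeds and the parameters var_thresh and min_size are illustrative assumptions, not the authors' actual procedure.

    import numpy as np

    def quadtree_seeds(image, var_thresh=100.0, min_size=8):
        """Illustrative quadtree seed placement (hypothetical parameters):
        recursively split blocks whose color variance is high, and place
        one seed at the center of each resulting leaf block."""
        h, w = image.shape[:2]
        seeds = []

        def split(y0, x0, y1, x1):
            block = image[y0:y1, x0:x1]
            # Total color variance of the block, summed over channels.
            var = float(block.reshape(-1, block.shape[-1]).var(axis=0).sum())
            if var > var_thresh and (y1 - y0) > min_size and (x1 - x0) > min_size:
                my, mx = (y0 + y1) // 2, (x0 + x1) // 2
                split(y0, x0, my, mx)   # top-left quadrant
                split(y0, mx, my, x1)   # top-right quadrant
                split(my, x0, y1, mx)   # bottom-left quadrant
                split(my, mx, y1, x1)   # bottom-right quadrant
            else:
                # Homogeneous (or minimally sized) block: drop a seed at its center.
                seeds.append(((y0 + y1) // 2, (x0 + x1) // 2))

        split(0, 0, h, w)
        return seeds

    # Example: a random 3-channel HVGA-sized image
    img = (np.random.rand(480, 320, 3) * 255).astype(np.float32)
    print(len(quadtree_seeds(img)))

Textured areas are subdivided more deeply and therefore receive more seeds, while smooth areas receive fewer, which matches the adaptive placement behavior the abstract attributes to the quadtree initialization.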
ISSN: 1051-8215
eISSN: 1558-2205
IDS Number: FY3BV
Unique ID: WOS:000426693100004