
Feature Selection by Optimizing A Lower Bound of Conditional Mutual Information
Oct 30, 2017

Title: Feature Selection by Optimizing A Lower Bound of Conditional Mutual Information

 Authors: Peng, HY; Fan, Y

 Author Full Names: Peng, Hanyang; Fan, Yong

 Source: INFORMATION SCIENCES, 418: 652-667; DOI: 10.1016/j.ins.2017.08.036; DEC 2017

 Language: English

 Abstract: A unified framework is proposed to select features by optimizing computationally feasible approximations of high-dimensional conditional mutual information (CMI) between features and their associated class label under different assumptions. Under this unified framework, state-of-the-art information-theoretic feature selection algorithms are re-derived, and a new algorithm is proposed to select features by optimizing a lower bound of the CMI with a weaker assumption than those adopted by existing methods. The new feature selection method integrates a plug-in component to distinguish redundant features from irrelevant ones, improving the robustness of feature selection. Furthermore, a novel metric is proposed to evaluate feature selection methods on simulated data. The proposed method has been compared with state-of-the-art feature selection methods using both the new evaluation metric and the classification performance of classifiers built upon the selected features. The experimental results demonstrate that the proposed method achieves promising performance across a variety of feature selection problems. (C) 2017 Elsevier Inc. All rights reserved.
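The kind of greedy, approximation-based CMI feature selection the abstract describes can be illustrated with a minimal sketch. Note that the paper's specific lower bound is not reproduced here; the code below uses a CMIM-style low-order surrogate (scoring each candidate by the minimum pairwise conditional mutual information with the already-selected features), which is one of the tractable approximations such unified frameworks cover. All function names are illustrative, and features are assumed discrete.

```python
import math
from collections import Counter

def mutual_information(xs, ys):
    """I(X;Y) in bits for two discrete sequences of equal length."""
    n = len(xs)
    cx, cy, cxy = Counter(xs), Counter(ys), Counter(zip(xs, ys))
    mi = 0.0
    for (x, y), nxy in cxy.items():
        pxy = nxy / n
        mi += pxy * math.log2(pxy / ((cx[x] / n) * (cy[y] / n)))
    return mi

def conditional_mutual_information(xs, ys, zs):
    """I(X;Y|Z) = sum over z of p(z) * I(X;Y | Z=z)."""
    n = len(xs)
    cmi = 0.0
    for z, nz in Counter(zs).items():
        idx = [i for i in range(n) if zs[i] == z]
        cmi += (nz / n) * mutual_information(
            [xs[i] for i in idx], [ys[i] for i in idx])
    return cmi

def greedy_cmi_selection(X, y, k):
    """Greedily pick k feature indices; X is a list of feature columns.

    The first feature maximizes I(X_f; y); each later candidate is scored
    by min over selected j of I(X_f; y | X_j) -- a low-dimensional
    stand-in for the intractable high-dimensional CMI.
    """
    selected = []
    remaining = set(range(len(X)))
    while len(selected) < k and remaining:
        def score(f):
            if not selected:
                return mutual_information(X[f], y)
            return min(conditional_mutual_information(X[f], y, X[j])
                       for j in selected)
        best = max(remaining, key=score)
        selected.append(best)
        remaining.discard(best)
    return selected
```

On a toy dataset where one feature copies the label exactly, that feature is picked first (it carries the full 1 bit of label information), while a redundant copy of it scores near zero conditional MI afterward, matching the redundancy-vs-irrelevance distinction the abstract emphasizes.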

 ISSN: 0020-0255

 eISSN: 1872-6291

 IDS Number: FI8LE

 Unique ID: WOS:000412253300041
