Nonlinear discriminant analysis based on vanishing component analysis
Dec 16, 2016

Title: Nonlinear discriminant analysis based on vanishing component analysis
Authors: Shao, YX; Gao, GL; Wang, CH
Author Full Names: Shao, Yunxue; Gao, Guanglai; Wang, Chunheng
Source: NEUROCOMPUTING, 218: 172-184; DOI: 10.1016/j.neucom.2016.08.058; DEC 19 2016
Language: English
Abstract: Most kernel-based nonlinear discriminant analysis methods must compute the kernel distance between each test sample and all of the training samples, which consumes considerable time and memory and may be impractical when the number of training samples is large. In this study, we propose a vanishing component analysis (VCA) based nonlinear discriminant analysis (VNDA) method. First, VNDA explicitly learns nonlinear mapping functions using a modified VCA method; it then employs these functions to map the input features onto a high-dimensional polynomial feature space, where linear discriminant analysis (LDA) is applied. We prove that principal component analysis plus LDA is a special case of VNDA, and that the set of mapping functions learned by VNDA is the best solution to the ratio trace problem in the degree-bounded polynomial feature space. Unlike kernel-based methods, VNDA stores only these mapping functions, rather than all of the training samples, in the test step. Experimental results obtained on four simulated data sets and 15 real data sets demonstrate that the proposed method yields highly competitive recognition results compared with state-of-the-art methods, while consuming fewer memory and time resources. (C) 2016 Elsevier B.V. All rights reserved.
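The two-step pipeline described in the abstract (explicit polynomial feature map, then LDA in the mapped space) can be sketched as follows. This is a minimal illustration, not the paper's method: the hand-coded degree-2 monomial map `poly2` is a hypothetical stand-in for the mapping functions that VNDA learns via modified VCA, and a plain two-class Fisher LDA is used in the feature space. The point it illustrates is the abstract's key efficiency claim: at test time only the fixed mapping and projection are stored, never the training samples themselves.

```python
import numpy as np

def poly2(X):
    """Explicit degree-2 polynomial feature map (hypothetical stand-in
    for the mapping functions VNDA would learn via modified VCA)."""
    x1, x2 = X[:, 0], X[:, 1]
    return np.column_stack([x1, x2, x1 * x2, x1 ** 2, x2 ** 2])

rng = np.random.default_rng(0)
n = 200
# Two classes that are not linearly separable in the input space:
# an inner Gaussian blob and a surrounding noisy ring.
X0 = rng.normal(scale=0.5, size=(n, 2))
ang = rng.uniform(0.0, 2.0 * np.pi, n)
X1 = np.column_stack([np.cos(ang), np.sin(ang)]) * 3.0
X1 += rng.normal(scale=0.2, size=(n, 2))
X = np.vstack([X0, X1])
y = np.array([0] * n + [1] * n)

# Step 1: map inputs into the degree-bounded polynomial feature space.
Z = poly2(X)

# Step 2: two-class Fisher LDA in that feature space.
m0, m1 = Z[y == 0].mean(axis=0), Z[y == 1].mean(axis=0)
Sw = np.cov(Z[y == 0].T) + np.cov(Z[y == 1].T)   # within-class scatter
w = np.linalg.solve(Sw + 1e-6 * np.eye(Sw.shape[0]), m1 - m0)

# Test step: only `poly2` and (w, thresh) are needed -- no kernel
# distances to stored training samples, unlike kernel discriminant methods.
thresh = (m0 @ w + m1 @ w) / 2.0
pred = (Z @ w > thresh).astype(int)
acc = (pred == y).mean()
print(acc)
```

In the polynomial space the two classes become linearly separable (the monomials x1^2 and x2^2 encode the radius), so the LDA direction separates them almost perfectly; the same separation in the raw 2-D input space is impossible for a linear discriminant.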
ISSN: 0925-2312
eISSN: 1872-8286
IDS Number: EC3VA
Unique ID: WOS:000388053700019