Classification via group sparsity promoting regularization
- 1 April 2009
- conference paper
- Published by Institute of Electrical and Electronics Engineers (IEEE)
- ISSN 1520-6149, pp. 861-864
- https://doi.org/10.1109/icassp.2009.4959720
Abstract
Recently, a new classification assumption was proposed in [1]: the training samples of a particular class approximately form a linear basis for any test sample belonging to that class. The classification algorithm in [1] was based on the idea that all the correlated training samples belonging to the correct class should be used to represent the test sample. Lasso regularization was proposed to select the representative training samples from the entire training set (consisting of all the training samples from all classes). Lasso, however, tends to select a single sample from a group of correlated training samples and thus does not promote representing the test sample in terms of all the training samples from the correct group. To overcome this problem, we propose two alternative regularization methods, the elastic net and the sum-over-l2-norm. Both of these regularization methods favor the selection of multiple correlated training samples to represent the test sample. Experimental results on benchmark datasets show that our regularization methods give better recognition results than [1].
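The classification scheme described in the abstract can be sketched as follows: express the test sample as a regularized linear combination of all training samples, then assign it to the class whose samples yield the smallest reconstruction residual. This is a minimal illustration using scikit-learn's `ElasticNet` on synthetic data; the data, parameter values, and the `classify` helper are assumptions for illustration, not the paper's exact setup.

```python
import numpy as np
from sklearn.linear_model import ElasticNet

rng = np.random.default_rng(0)

# Synthetic data: 2 classes, each a cluster of correlated training samples
n_per_class, dim = 10, 30
centers = [rng.normal(size=dim) for _ in range(2)]
X = np.vstack([c + 0.1 * rng.normal(size=(n_per_class, dim)) for c in centers])
labels = np.repeat([0, 1], n_per_class)

def classify(y, X, labels, alpha=0.01, l1_ratio=0.5):
    """Sparse-representation classification with elastic-net regularization.

    Fits y ~ X.T @ w (columns of X.T are training samples), then assigns
    y to the class whose training samples best reconstruct it.
    """
    model = ElasticNet(alpha=alpha, l1_ratio=l1_ratio,
                       fit_intercept=False, max_iter=10000)
    model.fit(X.T, y)  # one coefficient per training sample
    w = model.coef_
    residuals = []
    for c in np.unique(labels):
        mask = labels == c
        recon = X[mask].T @ w[mask]  # reconstruction using class c only
        residuals.append(np.linalg.norm(y - recon))
    return int(np.argmin(residuals))

# A test sample drawn near class 1's cluster
test = centers[1] + 0.1 * rng.normal(size=dim)
print(classify(test, X, labels))
```

Because the elastic net's l2 component spreads weight over correlated predictors, coefficients for several same-class training samples tend to be nonzero together, which is the group-selection behavior the abstract contrasts with plain Lasso.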
This publication has 3 references indexed in Scilit:
- For most large underdetermined systems of linear equations the minimal 𝓁1‐norm solution is also the sparsest solution — Communications on Pure and Applied Mathematics, 2006
- Regularization and Variable Selection Via the Elastic Net — Journal of the Royal Statistical Society Series B: Statistical Methodology, 2005
- Scheduling with generalized batch delivery dates and earliness penalties — IIE Transactions, 2000