Feature Selection using Search and Learning

Chapter Contents
Feature Selection. Dimensionality.

Narendra, P.M., Fukunaga, K.,
A Branch and Bound Algorithm for Feature Subset Selection,
TC(26), No. 9, September 1977, pp. 917-922. BibRef 7709
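The Narendra-Fukunaga algorithm above guarantees the optimal subset whenever the criterion is monotonic. As a minimal illustration (not the paper's implementation), the sketch below uses a toy additive criterion, assumed nonnegative, where J(S) is a sum of per-feature scores; monotonicity justifies pruning any branch whose current value cannot beat the incumbent:

```python
def branch_and_bound(scores, d):
    """Optimal d-of-n feature subset under a monotonic criterion,
    in the spirit of Narendra-Fukunaga branch and bound. The toy
    criterion J(S) = sum of per-feature (nonnegative) scores is
    monotonic: adding a feature never decreases J, so every
    d-subset of `current` scores at most J(current)."""
    n = len(scores)
    J = lambda s: sum(scores[i] for i in s)
    best_val, best_set = -1.0, None

    def search(current, last_removed):
        nonlocal best_val, best_set
        if J(current) <= best_val:      # bound: prune the whole branch
            return
        if len(current) == d:
            best_val, best_set = J(current), set(current)
            return
        # Branch: remove one more feature, in increasing index order,
        # so that each subset is generated exactly once.
        for f in sorted(current):
            if f > last_removed:
                search(current - {f}, f)

    search(frozenset(range(n)), -1)
    return best_val, best_set
```

With `scores = [0.1, 0.9, 0.4, 0.7]` and `d = 2`, the branch removing feature 1 first is pruned immediately, yet the optimum {1, 3} is still found via another branch.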

Rumelhart, D.E., Zipser, D.,
Feature Discovery by Computer Learning,
CogSci(9), 1985, pp. 75-112. BibRef 8500

Siedlecki, W., and Sklansky, J.,
On Automatic Feature Selection,
PRAI(2), No. 2, 1988, pp. 197-220. BibRef 8800

Siedlecki, W., and Sklansky, J.,
A Note on Genetic Algorithms for Large-Scale Feature Selection,
PRL(10), November 1989, pp. 335-347. BibRef 8911
And: A2, A1:
Large-Scale Feature Selection,
HPRCV97(Chapter I:3). (Univ California) (reprint) Discussion of best-first search in the space of feature subsets, including beam-search. BibRef
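The best-first search over feature subsets discussed in the chapter above, with a beam cut, can be sketched as follows. This is an illustrative reconstruction, not the authors' code; the criterion and the `beam` parameter are assumptions, and with a finite beam the search is heuristic rather than exhaustive:

```python
import heapq
from itertools import count

def best_first_select(criterion, n_features, d, beam=5):
    """Best-first search in the space of feature subsets with a
    beam cut on each expansion: always expand the most promising
    subset first, keeping only the `beam` best children."""
    tie = count()   # tiebreaker: heapq cannot compare frozensets
    frontier = [(-criterion(frozenset()), next(tie), frozenset())]
    seen = {frozenset()}
    best_val, best_set = float("-inf"), frozenset()
    while frontier:
        neg, _, s = heapq.heappop(frontier)  # most promising first
        if len(s) == d:
            if -neg > best_val:
                best_val, best_set = -neg, s
            continue
        # Expand by adding one unseen feature; beam-cut the children.
        children = sorted(
            (-criterion(s | {f}), next(tie), s | {f})
            for f in range(n_features)
            if f not in s and (s | {f}) not in seen
        )[:beam]
        for c_neg, t, c in children:
            seen.add(c)
            heapq.heappush(frontier, (c_neg, t, c))
    return best_val, best_set
```

With an additive criterion the search quickly converges on the best pair; a small beam trades optimality for speed.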

Siedlecki, W.[Wojciech], Siedlecka, K.[Kinga], Sklansky, J.[Jack],
An Overview of Mapping Techniques for Exploratory Pattern Analysis,
PR(21), No. 5, 1988, pp. 411-429.
Elsevier DOI BibRef 8800

Siedlecki, W.[Wojciech], Siedlecka, K.[Kinga], Sklansky, J.[Jack],
Experiments on Mapping Techniques for Exploratory Pattern Analysis,
PR(21), No. 5, 1988, pp. 431-438.
Elsevier DOI 0309

Yu, B.[Bin], Yuan, B.Z.[Bao-Zong],
A more efficient branch and bound algorithm for feature selection,
PR(26), No. 6, June 1993, pp. 883-889.
Elsevier DOI 0401

Kittler, J.V.,
Feature Selection and Extraction,
HPRIP86(59-83). Feature Selection. BibRef 8600

Novovicova, J., Pudil, P., Kittler, J.V.,
Divergence Based Feature-Selection for Multimodal Class Densities,
PAMI(18), No. 2, February 1996, pp. 218-223.
IEEE DOI BibRef 9602
Feature Selection Based on Divergence for Empirical Class Densities,
SCIA95(989-996). BibRef

Pudil, P., Novovicova, J., Choakjarernwanit, N., Kittler, J.V.,
Feature Selection Based on the Approximation of Class Densities by Finite Mixtures of Special Type,
PR(28), No. 9, September 1995, pp. 1389-1398.
Elsevier DOI BibRef 9509

Pudil, P., Novovicova, J., Choakjarernwanit, N., Kittler, J.V.,
An Analysis of the Max-Min Approach to Feature Selection and Ordering,
PRL(14), 1993, pp. 841-847. BibRef 9300

Pudil, P., Novovicova, J., Kittler, J.V.,
Floating Search Methods in Feature-Selection,
PRL(15), No. 11, November 1994, pp. 1119-1125. BibRef 9411
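The floating search idea of the PRL 1994 paper above can be sketched in a few lines. This is a simplified rendering, not the published algorithm: the forward step greedily adds the most significant feature, and backward steps are kept only while the reduced subset improves on the best previously recorded at that size (the paper's bookkeeping is slightly more involved):

```python
def sffs(criterion, n_features, d):
    """Sequential Forward Floating Search, minimal sketch:
    greedy inclusion followed by conditional exclusions, so the
    subset size 'floats' on its way toward d features."""
    selected = []
    best_at = {}  # best criterion value recorded per subset size

    def record(subset):
        v = criterion(subset)
        if v > best_at.get(len(subset), float("-inf")):
            best_at[len(subset)] = v

    while len(selected) < d:
        # Inclusion: add the single most significant new feature.
        remaining = [f for f in range(n_features) if f not in selected]
        selected.append(max(remaining,
                            key=lambda f: criterion(selected + [f])))
        record(selected)
        # Conditional exclusion: drop the least significant feature
        # while the reduced subset beats the best of its size.
        while len(selected) > 2:
            f_del = max(selected,
                        key=lambda f: criterion([g for g in selected
                                                 if g != f]))
            reduced = [g for g in selected if g != f_del]
            if criterion(reduced) > best_at.get(len(reduced),
                                                float("-inf")):
                selected = reduced
                record(selected)
            else:
                break
    return sorted(selected)
```

On an additive criterion the exclusion step never fires and SFFS reduces to plain sequential forward selection; its value shows with non-monotonic criteria, where an early greedy choice can later be undone.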

Somol, P., Pudil, P., Novovicová, J., Paclík, P.,
Adaptive floating search methods in feature selection,
PRL(20), No. 11-13, November 1999, pp. 1157-1163.
PDF File. 0001

Pudil, P., Ferri, F.J., Novovicova, J., Kittler, J.V.,
Floating Search Methods for Feature Selection with Nonmonotonic Criterion Functions,
IEEE DOI BibRef 9400

Pudil, P., Novovicová, J., Somol, P.,
Feature selection toolbox software package,
PRL(23), No. 4, February 2002, pp. 487-492.
Elsevier DOI 0202

Somol, P., Pudil, P.,
Feature selection toolbox,
PR(35), No. 12, December 2002, pp. 2749-2759.
Elsevier DOI 0209
Oscillating Search Algorithms for Feature Selection,
ICPR00(Vol II: 406-409).

Novovicová, J.[Jana], Somol, P.[Petr], Pudil, P.[Pavel],
Oscillating Feature Subset Search Algorithm for Text Categorization,
Springer DOI 0611

Somol, P., Pudil, P.,
Multi-Subset Selection for Keyword Extraction and Other Prototype Search Tasks Using Feature Selection Algorithms,
ICPR06(II: 736-739).

Somol, P.[Petr], Pudil, P.[Pavel], Kittler, J.V.[Josef V.],
Fast Branch & Bound Algorithms for Optimal Feature Selection,
PAMI(26), No. 7, July 2004, pp. 900-912.
IEEE Abstract. 0406
Predict criterion values to improve search. BibRef

Somol, P., Novovicova, J., Grim, J., Pudil, P.,
Dynamic Oscillating Search algorithm for feature selection,

Somol, P.[Petr], Novovicová, J.[Jana], Pudil, P.[Pavel],
Flexible-Hybrid Sequential Floating Search in Statistical Feature Selection,
Springer DOI 0608

Chen, X.W.[Xue-Wen],
An improved branch and bound algorithm for feature selection,
PRL(24), No. 12, August 2003, pp. 1925-1933.
Elsevier DOI 0304

Krishnapuram, B.[Balaji], Hartemink, A.J.[Alexander J.], Carin, L.[Lawrence], Figueiredo, M.A.T.[Mario A.T.],
A Bayesian Approach to Joint Feature Selection and Classifier Design,
PAMI(26), No. 9, September 2004, pp. 1105-1111.
IEEE Abstract. 0409
Learn both optimal classifier and the subset of relevant features. BibRef

Iannarilli, F.J., Rubin, P.A.,
Feature selection for multiclass discrimination via mixed-integer linear programming,
PAMI(25), No. 6, June 2003, pp. 779-783.
IEEE Abstract. 0306
Recast branch-and-bound feature selection as linear programming. BibRef

Kim, S.W.[Sang-Woon], Oommen, B.J.,
Enhancing prototype reduction schemes with LVQ3-type algorithms,
PR(36), No. 5, May 2003, pp. 1083-1093.
Elsevier DOI 0301

Kim, S.W.[Sang-Woon], Oommen, B.J.[B. John],
On using prototype reduction schemes to optimize kernel-based nonlinear subspace methods,
PR(37), No. 2, February 2004, pp. 227-239.
Elsevier DOI 0311

Kim, S.W.[Sang-Woon], Oommen, B.J.[B. John],
On using prototype reduction schemes to optimize locally linear reconstruction methods,
PR(45), No. 1, 2012, pp. 498-511.
Elsevier DOI 1410
Prototype reduction schemes (PRS) BibRef

Kim, S.W.[Sang-Woon], Oommen, B.J.[B. John],
On Utilizing Search Methods to Select Subspace Dimensions for Kernel-Based Nonlinear Subspace Classifiers,
PAMI(27), No. 1, January 2005, pp. 136-141.
IEEE Abstract. 0412
PCA. Determine the dimensions of the classifier. BibRef

Kim, S.W.[Sang-Woon], Oommen, B.J.[B. John],
On Using Prototype Reduction Schemes and Classifier Fusion Strategies to Optimize Kernel-Based Nonlinear Subspace Methods,
PAMI(27), No. 3, March 2005, pp. 455-460.
IEEE Abstract. 0501

Kim, S.W.[Sang-Woon], Oommen, B.J.[B. John],
Prototype reduction schemes applicable for non-stationary data sets,
PR(39), No. 2, February 2006, pp. 209-222.
Elsevier DOI 0512

Kim, S.W.[Sang-Woon], Oommen, B.J.[B. John],
On using prototype reduction schemes to optimize dissimilarity-based classification,
PR(40), No. 11, November 2007, pp. 2946-2957.
Elsevier DOI 0707
On Optimizing Kernel-Based Fisher Discriminant Analysis Using Prototype Reduction Schemes,
Springer DOI 0608
On Optimizing Dissimilarity-Based Classification Using Prototype Reduction Schemes,
ICIAR06(I: 15-28).
Springer DOI 0610
Dissimilarity representation; Dissimilarity-based classification; Prototype reduction schemes (PRSs); Mahalanobis distances (MDs). See also On Optimizing Subclass Discriminant Analysis Using a Pre-clustering Technique. BibRef

Kim, S.W.[Sang-Woon],
An empirical evaluation on dimensionality reduction schemes for dissimilarity-based classifications,
PRL(32), No. 6, 15 April 2011, pp. 816-823.
Elsevier DOI 1103
Dissimilarity-based classifications; Dimensionality reduction schemes; Prototype selection methods; Linear discriminant analysis BibRef

Kim, S.W.[Sang-Woon], Oommen, B.J.[B. John],
On using prototype reduction schemes to enhance the computation of volume-based inter-class overlap measures,
PR(42), No. 11, November 2009, pp. 2695-2704.
Elsevier DOI 0907
Prototype reduction schemes (PRS); k-nearest neighbor (k-NN) classifier; Data complexity; Class-overlapping BibRef

Kim, S.W.[Sang-Woon], Gao, J.[Jian],
A Dynamic Programming Technique for Optimizing Dissimilarity-Based Classifiers,
Springer DOI 0812
On Using Dimensionality Reduction Schemes to Optimize Dissimilarity-Based Classifiers,
Springer DOI 0809

Oh, I.S.[Il-Seok], Lee, J.S.[Jin-Seon], Moon, B.R.[Byung-Ro],
Hybrid Genetic Algorithms for Feature Selection,
PAMI(26), No. 11, November 2004, pp. 1424-1437.
IEEE Abstract. 0410
Local search-embedded genetic algorithms for feature selection,
ICPR02(II: 148-151).
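A bare-bones version of the genetic-algorithm search underlying the entries above (Siedlecki and Sklansky; Oh, Lee, and Moon) is sketched below. It is an illustration only: chromosomes are feature bitmasks, and the tournament size, mutation rate, and other parameters are illustrative defaults rather than values from any of the papers:

```python
import random

def ga_select(fitness, n_features, pop_size=40, gens=80,
              p_mut=0.15, seed=1):
    """Minimal GA over feature-subset bitmasks: tournament
    selection, one-point crossover, bit-flip mutation, elitism.
    `fitness` scores a tuple of 0/1 genes."""
    rng = random.Random(seed)
    pop = [tuple(rng.randint(0, 1) for _ in range(n_features))
           for _ in range(pop_size)]

    def tournament():
        a, b = rng.sample(pop, 2)
        return a if fitness(a) >= fitness(b) else b

    for _ in range(gens):
        nxt = [max(pop, key=fitness)]              # elitism
        while len(nxt) < pop_size:
            p1, p2 = tournament(), tournament()
            cut = rng.randrange(1, n_features)     # one-point crossover
            child = p1[:cut] + p2[cut:]
            child = tuple(1 - g if rng.random() < p_mut else g
                          for g in child)          # bit-flip mutation
            nxt.append(child)
        pop = nxt
    best = max(pop, key=fitness)
    return [i for i, g in enumerate(best) if g]
```

With a linear fitness (per-feature weights, some negative) the GA converges on exactly the positive-weight features; the hybrid and speciated variants cited here add local search or niching on top of this skeleton.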

Krishnapuram, B.[Balaji], Carin, L.[Lawrence], Figueiredo, M.A.T.[Mario A.T.], Hartemink, A.J.[Alexander J.],
Sparse Multinomial Logistic Regression: Fast Algorithms and Generalization Bounds,
PAMI(27), No. 6, June 2005, pp. 957-968.
IEEE Abstract. 0505
Sparse learning. Multiclass formulation based on regression, combined using optimization and a component update procedure. BibRef

Liu, Y.[Yi], Zheng, Y.F.[Yuan F.],
FS_SFS: A novel feature selection method for support vector machines,
PR(39), No. 7, July 2006, pp. 1333-1345.
Elsevier DOI 0606
Sequential forward search; Support vector machines BibRef

Wang, X.Y.[Xiang-Yang], Yang, J.[Jie], Teng, X.L.[Xiao-Long], Xia, W.J.[Wei-Jun], Jensen, R.[Richard],
Feature selection based on rough sets and particle swarm optimization,
PRL(28), No. 4, 1 March 2007, pp. 459-471.
Elsevier DOI 0701
Feature selection; Rough sets; Reduct; Genetic algorithms; Particle swarm optimization; Hill-climbing method; Stochastic method BibRef

Zhang, P.[Ping], Verma, B.[Brijesh], Kumar, K.[Kuldeep],
Neural vs. statistical classifier in conjunction with genetic algorithm based feature selection,
PRL(26), No. 7, 15 May 2005, pp. 909-919.
Elsevier DOI 0506

Hong, J.H.[Jin-Hyuk], Cho, S.B.[Sung-Bae],
Efficient huge-scale feature selection with speciated genetic algorithm,
PRL(27), No. 2, 15 January 2006, pp. 143-150.
Elsevier DOI 0512

Huang, J.J.[Jin-Jie], Cai, Y.[Yunze], Xu, X.M.[Xiao-Ming],
A hybrid genetic algorithm for feature selection wrapper based on mutual information,
PRL(28), No. 13, 1 October 2007, pp. 1825-1844.
Elsevier DOI 0709
A Wrapper for Feature Selection Based on Mutual Information,
ICPR06(II: 618-621).
Machine learning; Hybrid genetic algorithm; Feature selection; Mutual information BibRef
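The mutual-information relevance score that such wrappers and filters build on is straightforward to compute for discrete features. The sketch below (an illustrative interface, not code from the paper) estimates I(feature; class) from counts and ranks feature columns by it:

```python
from collections import Counter
from math import log2

def mutual_information(xs, ys):
    """I(X;Y) in bits for two discrete sequences of equal length,
    estimated from empirical joint and marginal counts."""
    n = len(xs)
    px, py, pxy = Counter(xs), Counter(ys), Counter(zip(xs, ys))
    return sum((c / n) * log2((c / n) / (px[x] * py[y] / n ** 2))
               for (x, y), c in pxy.items())

def rank_by_mi(columns, labels):
    """Rank feature columns by the filter criterion I(feature; class);
    `columns` is a list of discrete feature columns."""
    return sorted(range(len(columns)),
                  key=lambda j: mutual_information(columns[j], labels),
                  reverse=True)
```

A feature identical to the class labels scores 1 bit on a balanced binary problem, while an independent feature scores 0; MI-based selection methods combine such relevance scores with redundancy terms or, as above, embed them in a genetic search.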

Nakariyakul, S.[Songyot], Casasent, D.P.[David P.],
Adaptive branch and bound algorithm for selecting optimal features,
PRL(28), No. 12, 1 September 2007, pp. 1415-1427.
Elsevier DOI 0707
Branch and bound algorithm; Dimensionality reduction; Feature selection; Optimal subset search BibRef

Gavrilis, D.[Dimitris], Tsoulos, I.G.[Ioannis G.], Dermatas, E.[Evangelos],
Selecting and constructing features using grammatical evolution,
PRL(29), No. 9, 1 July 2008, pp. 1358-1365.
Elsevier DOI 0711
Artificial neural networks; Feature selection; Feature construction; Genetic programming; Grammatical evolution BibRef

Nakariyakul, S.[Songyot], Casasent, D.P.[David P.],
An improvement on floating search algorithms for feature subset selection,
PR(42), No. 9, September 2009, pp. 1932-1940.
Elsevier DOI 0905
Dimensionality reduction; Feature selection; Floating search methods; Weak feature replacement BibRef

Nakariyakul, S.[Songyot],
Suboptimal branch and bound algorithms for feature subset selection: A comparative study,
PRL(45), No. 1, 2014, pp. 62-70.
Elsevier DOI 1407
A new feature selection algorithm for multispectral and polarimetric vehicle images,
Branch and bound algorithm BibRef

Hong, Y.[Yi], Kwong, S.[Sam],
To combine steady-state genetic algorithm and ensemble learning for data clustering,
PRL(29), No. 9, 1 July 2008, pp. 1416-1423.
Elsevier DOI 0711
Clustering analysis; Ensemble learning; Genetic-guided clustering algorithms BibRef

Hong, Y.[Yi], Kwong, S.[Sam], Chang, Y.C.[Yu-Chou], Ren, Q.S.[Qing-Sheng],
Unsupervised feature selection using clustering ensembles and population based incremental learning algorithm,
PR(41), No. 9, September 2008, pp. 2742-2756.
Elsevier DOI 0806
Clustering ensembles; Dimensionality unbiased; Population based incremental learning algorithm; Unsupervised feature selection BibRef

Hong, Y.[Yi], Kwong, S.[Sam], Wang, H.[Hanli], Ren, Q.S.[Qing-Sheng],
Resampling-based selective clustering ensembles,
PRL(30), No. 3, 1 February 2009, pp. 298-305.
Elsevier DOI 0804
Clustering analysis; Clustering ensembles; Resampling technique BibRef

Yusta, S.C.[Silvia Casado],
Different metaheuristic strategies to solve the feature selection problem,
PRL(30), No. 5, 1 April 2009, pp. 525-534.
Elsevier DOI 0903
Feature selection; Floating search; Genetic Algorithm; GRASP; Tabu Search; Memetic Algorithm BibRef

Wang, Y.[Yong], Li, L.[Lin], Ni, J.[Jun], Huang, S.H.[Shu-Hong],
Feature selection using tabu search with long-term memories and probabilistic neural networks,
PRL(30), No. 7, 1 May 2009, pp. 661-670.
Elsevier DOI 0904
Feature selection; Tabu Search; Probabilistic neural network; Smoothing parameter BibRef

Park, M.S.[Myoung Soo], Choi, J.Y.[Jin Young],
Theoretical analysis on feature extraction capability of class-augmented PCA,
PR(42), No. 11, November 2009, pp. 2353-2362.
Elsevier DOI 0907
Feature extraction; CA-PCA (class-augmented principal component analysis); Class information; PCA (principal component analysis); Classification BibRef

Sun, Y.J.[Yi-Jun], Todorovic, S.[Sinisa], Goodison, S.[Steve],
Local-Learning-Based Feature Selection for High-Dimensional Data Analysis,
PAMI(32), No. 9, September 2010, pp. 1610-1626.

Cebe, M.[Mumin], Gunduz-Demir, C.[Cigdem],
Qualitative test-cost sensitive classification,
PRL(31), No. 13, 1 October 2010, pp. 2043-2051.
Elsevier DOI 1003
Cost-sensitive learning; Qualitative decision theory; Feature extraction cost; Feature selection BibRef

Rodriguez-Lujan, I., Cruz, C.S.[C. Santa], Huerta, R.,
On the equivalence of Kernel Fisher discriminant analysis and Kernel Quadratic Programming Feature Selection,
PRL(32), No. 11, 1 August 2011, pp. 1567-1571.
Elsevier DOI 1108
Kernel Fisher discriminant; Quadratic Programming Feature Selection; Feature selection; Kernel methods BibRef

Shah, M.[Mohak], Marchand, M.[Mario], Corbeil, J.[Jacques],
Feature Selection with Conjunctions of Decision Stumps and Learning from Microarray Data,
PAMI(34), No. 1, January 2012, pp. 174-186.
Finding features that are consistent and reliable. BibRef

Bellal, F.[Fazia], Elghazel, H.[Haytham], Aussem, A.[Alex],
A semi-supervised feature ranking method with ensemble learning,
PRL(33), No. 10, 15 July 2012, pp. 1426-1433.
Elsevier DOI 1205
Semi-supervised learning; Feature selection; Ensemble learning BibRef

Liu, J.[Jing], Zhao, F.[Feng], Liu, Y.[Yi],
Learning kernel parameters for kernel Fisher discriminant analysis,
PRL(34), No. 9, July 2013, pp. 1026-1031.
Elsevier DOI 1305
Kernel Fisher discriminant analysis (KFDA); Kernel parameter optimization; Feature extraction; Spectral regression kernel discriminant analysis (SRKDA) BibRef

Liu, B.[Bo], Fang, B.[Bin], Liu, X.[Xinwang], Chen, J.[Jie], Huang, Z.H.[Zheng-Hong], He, X.[Xiping],
Large Margin Subspace Learning for feature selection,
PR(46), No. 10, October 2013, pp. 2798-2806.
Elsevier DOI 1306
Feature selection; l2,1-norm regularization; Large margin maximization; Subspace learning BibRef

Shu, W.H.[Wen-Hao], Shen, H.[Hong],
Incremental feature selection based on rough set in dynamic incomplete data,
PR(47), No. 12, 2014, pp. 3890-3906.
Elsevier DOI 1410
Feature selection BibRef

Shu, W.H.[Wen-Hao], Shen, H.[Hong],
Multi-criteria feature selection on cost-sensitive data with missing values,
PR(51), No. 1, 2016, pp. 268-280.
Elsevier DOI 1601
Feature selection BibRef

Zhao, L., Hu, Q., Wang, W.,
Heterogeneous Feature Selection With Multi-Modal Deep Neural Networks and Sparse Group LASSO,
MultMed(17), No. 11, November 2015, pp. 1936-1948.
Data mining BibRef

Ben Brahim, A.[Afef], Limam, M.[Mohamed],
A hybrid feature selection method based on instance learning and cooperative subset search,
PRL(69), No. 1, 2016, pp. 28-34.
Elsevier DOI 1601
Feature selection BibRef

Huang, D., Cabral, R.S., de la Torre, F.,
Robust Regression,
PAMI(38), No. 2, February 2016, pp. 363-375.
Computational modeling BibRef

Wang, W., Yan, Y., Winkler, S., Sebe, N.,
Category Specific Dictionary Learning for Attribute Specific Feature Selection,
IP(25), No. 3, March 2016, pp. 1465-1478.
Dictionaries BibRef

Mohsenzadeh, Y.[Yalda], Sheikhzadeh, H.[Hamid], Nazari, S.[Sobhan],
Incremental relevance sample-feature machine: A fast marginal likelihood maximization approach for joint feature selection and classification,
PR(60), No. 1, 2016, pp. 835-848.
Elsevier DOI 1609
Sparse Bayesian learning BibRef

Wang, X.D.[Xiao-Dong], Chen, R.C.[Rung-Ching], Yan, F.[Fei], Zeng, Z.Q.[Zhi-Qiang],
Semi-supervised feature selection with exploiting shared information among multiple tasks,
JVCIR(41), No. 1, 2016, pp. 272-280.
Elsevier DOI 1612
Semi-supervised learning BibRef

Wang, X.D.[Xiao-Dong], Chen, R.C.[Rung-Ching], Hong, C.Q.[Chao-Qun], Zeng, Z.Q.[Zhi-Qiang],
Unsupervised feature analysis with sparse adaptive learning,
PRL(102), 2018, pp. 89-94.
Elsevier DOI 1802
Unsupervised learning, Feature selection, Adaptive structure learning, -Norm BibRef

Zeng, Z.Q.[Zhi-Qiang], Wang, X.D.[Xiao-Dong], Chen, Y.M.[Yu-Ming],
Multimedia annotation via semi-supervised shared-subspace feature selection,
JVCIR(48), No. 1, 2017, pp. 386-395.
Elsevier DOI 1708
Semi-supervised learning BibRef

Barbu, A.[Adrian], She, Y.Y.[Yi-Yuan], Ding, L.J.[Liang-Jing], Gramajo, G.[Gary],
Feature Selection with Annealing for Computer Vision and Big Data Learning,
PAMI(39), No. 2, February 2017, pp. 272-286.
Algorithm design and analysis BibRef

Zhou, H.J.[Hong-Jun], You, M.Y.[Ming-Yu], Liu, L.[Lei], Zhuang, C.[Chao],
Sequential data feature selection for human motion recognition via Markov blanket,
PRL(86), No. 1, 2017, pp. 18-25.
Elsevier DOI 1702
Sequential data BibRef

Piza-Davila, I.[Ivan], Sanchez-Diaz, G.[Guillermo], Lazo-Cortes, M.S.[Manuel S.], Rizo-Dominguez, L.[Luis],
A CUDA-based hill-climbing algorithm to find irreducible testors from a training matrix,
PRL(95), No. 1, 2017, pp. 22-28.
Elsevier DOI 1708
Pattern recognition BibRef

Wang, K.Z.[Kun-Zhe], Xiao, H.T.[Huai-Tie],
Sparse kernel feature extraction via support vector learning,
PRL(101), No. 1, 2018, pp. 67-73.
Elsevier DOI 1801
Kernel principal component analysis BibRef

Shi, H.L.[Hai-Lin], Zhu, X.Y.[Xiang-Yu], Lei, Z.[Zhen], Liao, S.C.[Sheng-Cai], Li, S.Z.[Stan Z.],
Learning Discriminative Features with Class Encoder,

Sato, Y.[Yoshikuni], Kozuka, K.[Kazuki], Sawada, Y.[Yoshihide], Kiyono, M.[Masaki],
Learning Multiple Complex Features Based on Classification Results,
Accuracy BibRef

Chang, Y.J.[Yao-Jen], Chen, T.H.[Tsu-Han],
Semi-supervised learning with kernel locality-constrained linear coding,
For low levels of labeled data; uses both labeled and unlabeled data. BibRef

Cortazar, E.[Esteban], Mery, D.[Domingo],
A Probabilistic Iterative Local Search Algorithm Applied to Full Model Selection,
Springer DOI 1111
For combinations of methods for supervised learning. BibRef

Sousa, R.[Ricardo], Oliveira, H.P.[Hélder P.], Cardoso, J.S.[Jaime S.],
Feature Selection with Complexity Measure in a Quadratic Programming Setting,
Springer DOI 1106

Duin, R.P.W.[Robert P. W.], Loog, M.[Marco], Pekalska, E.[Elzbieta], Tax, D.M.J.[David M. J.],
Feature-Based Dissimilarity Space Classification,
Springer DOI 1008

Shen, J.F.[Ji-Feng], Yang, W.K.[Wan-Kou], Sun, C.Y.[Chang-Yin],
Learning Discriminative Features Based on Distribution,

Kundu, P.P.[Partha Pratim], Mitra, S.[Sushmita],
Multi-objective Evolutionary Feature Selection,
Springer DOI 0912

Ramirez, R.[Rafael], Puiggros, M.[Montserrat],
A Genetic Programming Approach to Feature Selection and Classification of Instantaneous Cognitive States,
Springer DOI 0704

Azhar, H.B.[Hannan Bin], Dimond, K.[Keith],
A Stochastic Search Algorithm to Optimize an N-tuple Classifier by Selecting Its Inputs,
ICIAR04(I: 556-563).
Springer DOI 0409

Chapter on Pattern Recognition, Clustering, Statistics, Grammars, Learning, Neural Nets, Genetic Algorithms continues in
Probabilistic Latent Semantic Analysis, pLSA.

Last update: Mar 22, 2018 at 09:50:25