14.1.3.4 Feature Selection using Search and Learning

Chapter Contents
Feature Selection. Dimensionality.

Narendra, P.M., Fukunaga, K.,
A Branch and Bound Algorithm for Feature Subset Selection,
TC(26), No. 9, September 1977, pp. 917-922. BibRef 7709
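
The branch-and-bound scheme above exploits a monotonic criterion: a feature subset never scores higher than any of its supersets, so whole branches of the subset-search tree can be pruned once they cannot beat the best subset found so far. A minimal sketch of that pruning idea (illustrative only, not the paper's algorithm; the criterion `J` is a placeholder supplied by the caller and must be monotone):

```python
def bb_select(features, J, d):
    """Find the size-d subset maximizing a criterion J that is monotone:
    J(A) <= J(B) whenever A is a subset of B."""
    best = {"score": float("-inf"), "subset": None}

    def recurse(current, start):
        score = J(current)
        # Monotonicity: no subset of `current` can score above `score`,
        # so this whole branch can be pruned.
        if score <= best["score"]:
            return
        if len(current) == d:
            best["score"], best["subset"] = score, tuple(current)
            return
        # Branch: remove one remaining feature at a time; the index
        # discipline ensures each subset is visited at most once.
        for i in range(start, len(current)):
            recurse(current[:i] + current[i + 1:], i)

    recurse(list(features), 0)
    return best["subset"], best["score"]
```

With a monotone criterion the returned subset is optimal; the later "fast" and "adaptive" branch-and-bound papers in this section mainly reduce how many criterion evaluations such a search needs.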

Rumelhart, D.E., Zipser, D.,
Feature Discovery by Computer Learning,
CogSci(9), 1985, pp. 75-112. BibRef 8500

Siedlecki, W., and Sklansky, J.,
On Automatic Feature Selection,
PRAI(2), No. 2, 1988, pp. 197-220. BibRef 8800

Siedlecki, W., and Sklansky, J.,
A Note on Genetic Algorithms for Large-Scale Feature Selection,
PRL(10), November 1989, pp. 335-347. BibRef 8911
And: A2, A1:
Large-Scale Feature Selection,
HPRCV97(Chapter I:3). (Univ California) (reprint) Discussion of best-first search in the space of feature subsets, including beam-search. BibRef

Siedlecki, W.[Wojciech], Siedlecka, K.[Kinga], Sklansky, J.[Jack],
An Overview of Mapping Techniques for Exploratory Pattern Analysis,
PR(21), No. 5, 1988, pp. 411-429.
Elsevier DOI BibRef 8800

Siedlecki, W.[Wojciech], Siedlecka, K.[Kinga], Sklansky, J.[Jack],
Experiments on Mapping Techniques for Exploratory Pattern Analysis,
PR(21), No. 5, 1988, pp. 431-438.
Elsevier DOI 0309
BibRef

Yu, B.[Bin], Yuan, B.Z.[Bao-Zong],
A more efficient branch and bound algorithm for feature selection,
PR(26), No. 6, June 1993, pp. 883-889.
Elsevier DOI 0401
BibRef

Kittler, J.V.,
Feature Selection and Extraction,
HPRIP86(59-83). Feature Selection. BibRef 8600

Novovicova, J., Pudil, P., Kittler, J.V.,
Divergence Based Feature-Selection for Multimodal Class Densities,
PAMI(18), No. 2, February 1996, pp. 218-223.
IEEE DOI BibRef 9602
Earlier:
Feature Selection Based on Divergence for Empirical Class Densities,
SCIA95(989-996). BibRef

Pudil, P., Novovicova, J., Choakjarernwanit, N., Kittler, J.V.,
Feature Selection Based on the Approximation of Class Densities by Finite Mixtures of Special Type,
PR(28), No. 9, September 1995, pp. 1389-1398.
Elsevier DOI BibRef 9509

Pudil, P., Novovicova, J., Choakjarernwanit, N., Kittler, J.V.,
An Analysis of the Max-Min Approach to Feature Selection and Ordering,
PRL(14), 1993, pp. 841-847. BibRef 9300

Pudil, P., Novovicova, J., Kittler, J.V.,
Floating Search Methods in Feature-Selection,
PRL(15), No. 11, November 1994, pp. 1119-1125. BibRef 9411
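
The floating search of Pudil et al. alternates greedy forward inclusion with conditional backward exclusion: after each feature is added, features are dropped again as long as dropping one strictly beats the best score previously recorded at the smaller subset size. A rough sketch of that control flow (illustrative, not the published implementation; `J` is a caller-supplied subset criterion):

```python
def sffs(features, J, d):
    """Sequential floating forward selection, sketched: greedy forward
    steps with conditional backtracking (backward steps)."""
    selected = []
    best_at_size = {}  # best criterion value seen at each subset size
    while len(selected) < d:
        # Forward step: add the single feature that maximizes J.
        remaining = [f for f in features if f not in selected]
        selected.append(max(remaining, key=lambda f: J(selected + [f])))
        k = len(selected)
        best_at_size[k] = max(best_at_size.get(k, float("-inf")), J(selected))
        # Floating (backward) step: drop a feature while doing so strictly
        # beats the best score already recorded at the smaller size.
        while len(selected) > 2:
            f_worst = max(selected,
                          key=lambda f: J([g for g in selected if g != f]))
            reduced = [g for g in selected if g != f_worst]
            if J(reduced) > best_at_size.get(len(reduced), float("-inf")):
                selected = reduced
                best_at_size[len(selected)] = J(reduced)
            else:
                break
    return selected
```

Because each backward step strictly improves the recorded best at its subset size, the loop terminates; the oscillating and adaptive variants cited below generalize this add/remove schedule.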

Somol, P., Pudil, P., Novovicová, J., Paclík, P.,
Adaptive floating search methods in feature selection,
PRL(20), No. 11-13, November 1999, pp. 1157-1163.
PDF File. 0001
BibRef

Pudil, P., Ferri, F.J., Novovicova, J., Kittler, J.V.,
Floating Search Methods for Feature Selection with Nonmonotonic Criterion Functions,
ICPR94(B:279-283).
IEEE DOI BibRef 9400

Pudil, P., Novovicová, J., Somol, P.,
Feature selection toolbox software package,
PRL(23), No. 4, February 2002, pp. 487-492.
Elsevier DOI 0202
BibRef

Somol, P., Pudil, P.,
Feature selection toolbox,
PR(35), No. 12, December 2002, pp. 2749-2759.
Elsevier DOI 0209
BibRef
Earlier:
Oscillating Search Algorithms for Feature Selection,
ICPR00(Vol II: 406-409).
IEEE DOI 0009
BibRef

Novovicová, J.[Jana], Somol, P.[Petr], Pudil, P.[Pavel],
Oscillating Feature Subset Search Algorithm for Text Categorization,
CIARP06(578-587).
Springer DOI 0611
BibRef

Somol, P., Pudil, P.,
Multi-Subset Selection for Keyword Extraction and Other Prototype Search Tasks Using Feature Selection Algorithms,
ICPR06(II: 736-739).
IEEE DOI 0609
BibRef

Somol, P.[Petr], Pudil, P.[Pavel], Kittler, J.V.[Josef V.],
Fast Branch & Bound Algorithms for Optimal Feature Selection,
PAMI(26), No. 7, July 2004, pp. 900-912.
IEEE Abstract. 0406
Predicts criterion values to improve the search. BibRef

Somol, P., Novovicova, J., Grim, J., Pudil, P.,
Dynamic Oscillating Search algorithm for feature selection,
ICPR08(1-4).
IEEE DOI 0812
BibRef

Somol, P.[Petr], Novovicová, J.[Jana], Pudil, P.[Pavel],
Flexible-Hybrid Sequential Floating Search in Statistical Feature Selection,
SSPR06(632-639).
Springer DOI 0608
BibRef

Chen, X.W.[Xue-Wen],
An improved branch and bound algorithm for feature selection,
PRL(24), No. 12, August 2003, pp. 1925-1933.
Elsevier DOI 0304
BibRef

Krishnapuram, B.[Balaji], Hartemink, A.J.[Alexander J.], Carin, L.[Lawrence], Figueiredo, M.A.T.[Mario A.T.],
A Bayesian Approach to Joint Feature Selection and Classifier Design,
PAMI(26), No. 9, September 2004, pp. 1105-1111.
IEEE Abstract. 0409
Learns both the optimal classifier and the subset of relevant features. BibRef

Iannarilli, F.J., Rubin, P.A.,
Feature selection for multiclass discrimination via mixed-integer linear programming,
PAMI(25), No. 6, June 2003, pp. 779-783.
IEEE Abstract. 0306
Recasts branch-and-bound feature selection as linear programming. BibRef

Kim, S.W.[Sang-Woon], Oommen, B.J.,
Enhancing prototype reduction schemes with LVQ3-type algorithms,
PR(36), No. 5, May 2003, pp. 1083-1093.
Elsevier DOI 0301
BibRef

Kim, S.W.[Sang-Woon], Oommen, B.J.[B. John],
On using prototype reduction schemes to optimize kernel-based nonlinear subspace methods,
PR(37), No. 2, February 2004, pp. 227-239.
Elsevier DOI 0311
BibRef

Kim, S.W.[Sang-Woon], Oommen, B.J.[B. John],
On using prototype reduction schemes to optimize locally linear reconstruction methods,
PR(45), No. 1, 2012, pp. 498-511.
Elsevier DOI 1410
Prototype reduction schemes (PRS) BibRef

Kim, S.W.[Sang-Woon], Oommen, B.J.[B. John],
On Utilizing Search Methods to Select Subspace Dimensions for Kernel-Based Nonlinear Subspace Classifiers,
PAMI(27), No. 1, January 2005, pp. 136-141.
IEEE Abstract. 0412
PCA-based; determines the dimensionality of the classifier. BibRef

Kim, S.W.[Sang-Woon], Oommen, B.J.[B. John],
On Using Prototype Reduction Schemes and Classifier Fusion Strategies to Optimize Kernel-Based Nonlinear Subspace Methods,
PAMI(27), No. 3, March 2005, pp. 455-460.
IEEE Abstract. 0501
BibRef

Kim, S.W.[Sang-Woon], Oommen, B.J.[B. John],
Prototype reduction schemes applicable for non-stationary data sets,
PR(39), No. 2, February 2006, pp. 209-222.
Elsevier DOI 0512
BibRef

Kim, S.W.[Sang-Woon], Oommen, B.J.[B. John],
On using prototype reduction schemes to optimize dissimilarity-based classification,
PR(40), No. 11, November 2007, pp. 2946-2957.
Elsevier DOI 0707
BibRef
Earlier:
On Optimizing Kernel-Based Fisher Discriminant Analysis Using Prototype Reduction Schemes,
SSPR06(826-834).
Springer DOI 0608
BibRef
And:
On Optimizing Dissimilarity-Based Classification Using Prototype Reduction Schemes,
ICIAR06(I: 15-28).
Springer DOI 0610
Dissimilarity representation, Dissimilarity-based classification, Prototype reduction schemes (PRSs), Mahalanobis distances (MDs)
See also On Optimizing Subclass Discriminant Analysis Using a Pre-clustering Technique. BibRef

Tahir, M.A.[Muhammad Atif], Smith, J.[Jim],
Creating diverse nearest-neighbour ensembles using simultaneous metaheuristic feature selection,
PRL(31), No. 11, 1 August 2010, pp. 1470-1480.
Elsevier DOI 1008
Tabu Search, 1NN classifier, Feature selection, Ensemble classifiers BibRef

Kim, S.W.[Sang-Woon],
An empirical evaluation on dimensionality reduction schemes for dissimilarity-based classifications,
PRL(32), No. 6, 15 April 2011, pp. 816-823.
Elsevier DOI 1103
Dissimilarity-based classifications, Dimensionality reduction schemes; Prototype selection methods, Linear discriminant analysis BibRef

Kim, S.W.[Sang-Woon], Oommen, B.J.[B. John],
On using prototype reduction schemes to enhance the computation of volume-based inter-class overlap measures,
PR(42), No. 11, November 2009, pp. 2695-2704.
Elsevier DOI 0907
Prototype reduction schemes (PRS), k-nearest neighbor (k-NN) classifier, Data complexity, Class-overlapping BibRef

Kim, S.W.[Sang-Woon], Gao, J.[Jian],
A Dynamic Programming Technique for Optimizing Dissimilarity-Based Classifiers,
SSPR08(654-663).
Springer DOI 0812
BibRef
And:
On Using Dimensionality Reduction Schemes to Optimize Dissimilarity-Based Classifiers,
CIARP08(309-316).
Springer DOI 0809
BibRef

Oh, I.S.[Il-Seok], Lee, J.S.[Jin-Seon], Moon, B.R.[Byung-Ro],
Hybrid Genetic Algorithms for Feature Selection,
PAMI(26), No. 11, November 2004, pp. 1424-1437.
IEEE Abstract. 0410
BibRef
Earlier:
Local search-embedded genetic algorithms for feature selection,
ICPR02(II: 148-151).
IEEE DOI 0211
BibRef
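
GA wrappers like the one above encode a candidate feature subset as a binary mask and evolve a population of masks under a wrapper fitness function (e.g. cross-validated classifier accuracy). A toy sketch of that loop (the parameters, the operators, and seeding the population with the full mask are illustrative choices, not the paper's hybrid algorithm):

```python
import random

def ga_select(n_features, fitness, pop_size=20, generations=30,
              p_mut=0.05, seed=0):
    """Evolve binary feature masks: tournament selection, uniform
    crossover, bit-flip mutation, with elitism."""
    rng = random.Random(seed)
    # Seed with the full feature set so the result is never worse than
    # using all features; the rest of the population is random.
    pop = [[1] * n_features] + [
        [rng.randint(0, 1) for _ in range(n_features)]
        for _ in range(pop_size - 1)
    ]

    def tournament():
        a, b = rng.sample(pop, 2)
        return a if fitness(a) >= fitness(b) else b

    for _ in range(generations):
        new_pop = [max(pop, key=fitness)]  # elitism: keep the best mask
        while len(new_pop) < pop_size:
            p1, p2 = tournament(), tournament()
            child = [p1[i] if rng.random() < 0.5 else p2[i]
                     for i in range(n_features)]
            child = [1 - g if rng.random() < p_mut else g for g in child]
            new_pop.append(child)
        pop = new_pop
    return max(pop, key=fitness)
```

The hybrid/memetic variants in this section add a local search (e.g. bit-wise hill climbing) on the best masks of each generation.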

Krishnapuram, B.[Balaji], Carin, L.[Lawrence], Figueiredo, M.A.T.[Mario A.T.], Hartemink, A.J.[Alexander J.],
Sparse Multinomial Logistic Regression: Fast Algorithms and Generalization Bounds,
PAMI(27), No. 6, June 2005, pp. 957-968.
IEEE Abstract. 0505
Sparse learning. Multiclass formulation based on regression, combined using optimization and a component-update procedure. BibRef

Liu, Y.[Yi], Zheng, Y.F.[Yuan F.],
FS_SFS: A novel feature selection method for support vector machines,
PR(39), No. 7, July 2006, pp. 1333-1345.
Elsevier DOI 0606
Sequential forward search, Support vector machines BibRef

Wang, X.Y.[Xiang-Yang], Yang, J.[Jie], Teng, X.L.[Xiao-Long], Xia, W.J.[Wei-Jun], Jensen, R.[Richard],
Feature selection based on rough sets and particle swarm optimization,
PRL(28), No. 4, 1 March 2007, pp. 459-471.
Elsevier DOI 0701
Feature selection, Rough sets, Reduct, Genetic algorithms; Particle swarm optimization, Hill-climbing method, Stochastic method BibRef

Zhang, P.[Ping], Verma, B.[Brijesh], Kumar, K.[Kuldeep],
Neural vs. statistical classifier in conjunction with genetic algorithm based feature selection,
PRL(26), No. 7, 15 May 2005, pp. 909-919.
Elsevier DOI 0506
BibRef

Hong, J.H.[Jin-Hyuk], Cho, S.B.[Sung-Bae],
Efficient huge-scale feature selection with speciated genetic algorithm,
PRL(27), No. 2, 15 January 2006, pp. 143-150.
Elsevier DOI 0512
BibRef

Huang, J.J.[Jin-Jie], Cai, Y.Z.[Yun-Ze], Xu, X.M.[Xiao-Ming],
A hybrid genetic algorithm for feature selection wrapper based on mutual information,
PRL(28), No. 13, 1 October 2007, pp. 1825-1844.
Elsevier DOI 0709
BibRef
Earlier:
A Wrapper for Feature Selection Based on Mutual Information,
ICPR06(II: 618-621).
IEEE DOI 0609
Machine learning, Hybrid genetic algorithm, Feature selection, Mutual information BibRef
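
Mutual-information criteria such as the one in the wrapper above score a feature by how much information it carries about the class label; the filter side of that idea can be sketched with empirical MI over discrete features (the names and the plain top-k ranking are illustrative, not the paper's hybrid GA):

```python
from collections import Counter
from math import log2

def mutual_information(xs, ys):
    """Empirical mutual information I(X; Y), in bits, between two
    equal-length sequences of discrete values."""
    n = len(xs)
    px, py, pxy = Counter(xs), Counter(ys), Counter(zip(xs, ys))
    return sum((c / n) * log2((c / n) / ((px[x] / n) * (py[y] / n)))
               for (x, y), c in pxy.items())

def rank_by_mi(columns, labels, k):
    """Filter-style selection: keep the k features (columns) with the
    highest mutual information with the labels."""
    order = sorted(range(len(columns)),
                   key=lambda j: mutual_information(columns[j], labels),
                   reverse=True)
    return order[:k]
```

Such a ranking ignores redundancy between features, which is precisely what the hybrid and wrapper methods in this section try to correct.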

Nakariyakul, S.[Songyot], Casasent, D.P.[David P.],
Adaptive branch and bound algorithm for selecting optimal features,
PRL(28), No. 12, 1 September 2007, pp. 1415-1427.
Elsevier DOI 0707
Branch and bound algorithm, Dimensionality reduction, Feature selection; Optimal subset search BibRef

Gavrilis, D.[Dimitris], Tsoulos, I.G.[Ioannis G.], Dermatas, E.[Evangelos],
Selecting and constructing features using grammatical evolution,
PRL(29), No. 9, 1 July 2008, pp. 1358-1365.
Elsevier DOI 0711
Keywords: Artificial neural networks, Feature selection, Feature construction, Genetic programming, Grammatical evolution BibRef

Nakariyakul, S.[Songyot], Casasent, D.P.[David P.],
An improvement on floating search algorithms for feature subset selection,
PR(42), No. 9, September 2009, pp. 1932-1940.
Elsevier DOI 0905
Dimensionality reduction, Feature selection, Floating search methods; Weak feature replacement BibRef

Nakariyakul, S.[Songyot],
Suboptimal branch and bound algorithms for feature subset selection: A comparative study,
PRL(45), No. 1, 2014, pp. 62-70.
Elsevier DOI 1407
BibRef
Earlier:
A new feature selection algorithm for multispectral and polarimetric vehicle images,
ICIP09(2865-2868).
IEEE DOI 0911
Branch and bound algorithm BibRef

Hong, Y.[Yi], Kwong, S.[Sam],
To combine steady-state genetic algorithm and ensemble learning for data clustering,
PRL(29), No. 9, 1 July 2008, pp. 1416-1423.
Elsevier DOI 0711
Clustering analysis, Ensemble learning, Genetic-guided clustering algorithms BibRef

Hong, Y.[Yi], Kwong, S.[Sam], Wang, H.[Hanli], Ren, Q.S.[Qing-Sheng],
Resampling-based selective clustering ensembles,
PRL(30), No. 3, 1 February 2009, pp. 298-305.
Elsevier DOI 0804
Clustering analysis, Clustering ensembles, Resampling technique BibRef

Yusta, S.C.[Silvia Casado],
Different metaheuristic strategies to solve the feature selection problem,
PRL(30), No. 5, 1 April 2009, pp. 525-534.
Elsevier DOI 0903
Feature selection, Floating search, Genetic Algorithm, GRASP, Tabu Search, Memetic Algorithm BibRef

Wang, Y.[Yong], Li, L.[Lin], Ni, J.[Jun], Huang, S.H.[Shu-Hong],
Feature selection using tabu search with long-term memories and probabilistic neural networks,
PRL(30), No. 7, 1 May 2009, pp. 661-670.
Elsevier DOI 0904
Feature selection, Tabu Search, Probabilistic neural network, Smoothing parameter BibRef

Park, M.S.[Myoung Soo], Choi, J.Y.[Jin Young],
Theoretical analysis on feature extraction capability of class-augmented PCA,
PR(42), No. 11, November 2009, pp. 2353-2362.
Elsevier DOI 0907
Feature extraction, CA-PCA (class-augmented principal component analysis), Class information, PCA (principal component analysis); Classification BibRef

Sun, Y.J.[Yi-Jun], Todorovic, S.[Sinisa], Goodison, S.[Steve],
Local-Learning-Based Feature Selection for High-Dimensional Data Analysis,
PAMI(32), No. 9, September 2010, pp. 1610-1626.
IEEE DOI 1008
BibRef

Cebe, M.[Mumin], Gunduz-Demir, C.[Cigdem],
Qualitative test-cost sensitive classification,
PRL(31), No. 13, 1 October 2010, pp. 2043-2051.
Elsevier DOI 1003
Cost-sensitive learning, Qualitative decision theory, Feature extraction cost, Feature selection BibRef

Rodriguez-Lujan, I., Cruz, C.S.[C. Santa], Huerta, R.,
On the equivalence of Kernel Fisher discriminant analysis and Kernel Quadratic Programming Feature Selection,
PRL(32), No. 11, 1 August 2011, pp. 1567-1571.
Elsevier DOI 1108
Kernel Fisher discriminant, Quadratic Programming Feature Selection; Feature selection, Kernel methods BibRef

Shah, M.[Mohak], Marchand, M.[Mario], Corbeil, J.[Jacques],
Feature Selection with Conjunctions of Decision Stumps and Learning from Microarray Data,
PAMI(34), No. 1, January 2012, pp. 174-186.
IEEE DOI 1112
Finding features that are consistent and reliable. BibRef

Liu, J.[Jing], Zhao, F.[Feng], Liu, Y.[Yi],
Learning kernel parameters for kernel Fisher discriminant analysis,
PRL(34), No. 9, July 2013, pp. 1026-1031.
Elsevier DOI 1305
Kernel Fisher discriminant analysis (KFDA), Kernel parameter optimization, Feature extraction, Spectral regression kernel discriminant analysis (SRKDA) BibRef

Liu, B.[Bo], Fang, B.[Bin], Liu, X.W.[Xin-Wang], Chen, J.[Jie], Huang, Z.H.[Zheng-Hong], He, X.P.[Xi-Ping],
Large Margin Subspace Learning for feature selection,
PR(46), No. 10, October 2013, pp. 2798-2806.
Elsevier DOI 1306
Feature selection, l2,1-norm regularization, Large margin maximization, Subspace learning BibRef

Shu, W.H.[Wen-Hao], Shen, H.[Hong],
Incremental feature selection based on rough set in dynamic incomplete data,
PR(47), No. 12, 2014, pp. 3890-3906.
Elsevier DOI 1410
Feature selection BibRef

Shu, W.H.[Wen-Hao], Shen, H.[Hong],
Multi-criteria feature selection on cost-sensitive data with missing values,
PR(51), No. 1, 2016, pp. 268-280.
Elsevier DOI 1601
Feature selection BibRef

Naghibi, T., Hoffmann, S., Pfister, B.,
A Semidefinite Programming Based Search Strategy for Feature Selection with Mutual Information Measure,
PAMI(37), No. 8, August 2015, pp. 1529-1541.
IEEE DOI 1507
Approximation algorithms BibRef

Ben Brahim, A.[Afef], Limam, M.[Mohamed],
A hybrid feature selection method based on instance learning and cooperative subset search,
PRL(69), No. 1, 2016, pp. 28-34.
Elsevier DOI 1601
Feature selection BibRef

Huang, D., Cabral, R.S., de la Torre, F.,
Robust Regression,
PAMI(38), No. 2, February 2016, pp. 363-375.
IEEE DOI 1601
Computational modeling BibRef

Wang, W., Yan, Y., Winkler, S., Sebe, N.,
Category Specific Dictionary Learning for Attribute Specific Feature Selection,
IP(25), No. 3, March 2016, pp. 1465-1478.
IEEE DOI 1602
Dictionaries BibRef

Wang, W.[Wei], Yan, Y.[Yan], Nie, F.P.[Fei-Ping], Yan, S.C.[Shui-Cheng], Sebe, N.[Nicu],
Flexible Manifold Learning With Optimal Graph for Image and Video Representation,
IP(27), No. 6, June 2018, pp. 2664-2675.
IEEE DOI 1804
eigenvalues and eigenfunctions, graph theory, image classification, image representation, iterative methods, graph embedding BibRef

Wang, W.[Wei], Yan, Y.[Yan], Nie, F.P.[Fei-Ping], Pineda, X.[Xavier], Yan, S.C.[Shui-Cheng], Sebe, N.[Nicu],
Projective Unsupervised Flexible Embedding with Optimal Graph,
BMVC16(xx-yy).
HTML Version. 1805
BibRef

Mohsenzadeh, Y.[Yalda], Sheikhzadeh, H.[Hamid], Nazari, S.[Sobhan],
Incremental relevance sample-feature machine: A fast marginal likelihood maximization approach for joint feature selection and classification,
PR(60), No. 1, 2016, pp. 835-848.
Elsevier DOI 1609
Sparse Bayesian learning BibRef

Wang, X.D.[Xiao-Dong], Chen, R.C.[Rung-Ching], Yan, F.[Fei], Zeng, Z.Q.[Zhi-Qiang],
Semi-supervised feature selection with exploiting shared information among multiple tasks,
JVCIR(41), No. 1, 2016, pp. 272-280.
Elsevier DOI 1612
Semi-supervised learning BibRef

Wang, X.D.[Xiao-Dong], Chen, R.C.[Rung-Ching], Hong, C.Q.[Chao-Qun], Zeng, Z.Q.[Zhi-Qiang],
Unsupervised feature analysis with sparse adaptive learning,
PRL(102), 2018, pp. 89-94.
Elsevier DOI 1802
Unsupervised learning, Feature selection, Adaptive structure learning, l2,1-Norm BibRef

Zeng, Z.Q.[Zhi-Qiang], Wang, X.D.[Xiao-Dong], Chen, Y.M.[Yu-Ming],
Multimedia annotation via semi-supervised shared-subspace feature selection,
JVCIR(48), No. 1, 2017, pp. 386-395.
Elsevier DOI 1708
Semi-supervised, learning BibRef

Barbu, A.[Adrian], She, Y.Y.[Yi-Yuan], Ding, L.J.[Liang-Jing], Gramajo, G.[Gary],
Feature Selection with Annealing for Computer Vision and Big Data Learning,
PAMI(39), No. 2, February 2017, pp. 272-286.
IEEE DOI 1702
Algorithm design and analysis BibRef

Zhou, H.J.[Hong-Jun], You, M.Y.[Ming-Yu], Liu, L.[Lei], Zhuang, C.[Chao],
Sequential data feature selection for human motion recognition via Markov blanket,
PRL(86), No. 1, 2017, pp. 18-25.
Elsevier DOI 1702
Sequential data BibRef

Piza-Davila, I.[Ivan], Sanchez-Diaz, G.[Guillermo], Lazo-Cortes, M.S.[Manuel S.], Rizo-Dominguez, L.[Luis],
A CUDA-based hill-climbing algorithm to find irreducible testors from a training matrix,
PRL(95), No. 1, 2017, pp. 22-28.
Elsevier DOI 1708
Pattern recognition BibRef

Wang, K.Z.[Kun-Zhe], Xiao, H.T.[Huai-Tie],
Sparse kernel feature extraction via support vector learning,
PRL(101), No. 1, 2018, pp. 67-73.
Elsevier DOI 1801
Kernel principal component analysis BibRef

Zhao, Y.[Yue], You, X.G.[Xin-Ge], Yu, S.J.[Shu-Jian], Xu, C.[Chang], Yuan, W.[Wei], Jing, X.Y.[Xiao-Yuan], Zhang, T.P.[Tai-Ping], Tao, D.C.[Da-Cheng],
Multi-view manifold learning with locality alignment,
PR(78), 2018, pp. 154-166.
Elsevier DOI 1804
Discovers the low-dimensional space in which the high-dimensional input data are embedded. Manifold learning, Multi-view learning, Locality alignment BibRef

Liu, J.H.[Jing-Hua], Lin, Y.J.[Yao-Jin], Li, Y.[Yuwen], Weng, W.[Wei], Wu, S.X.[Shun-Xiang],
Online multi-label streaming feature selection based on neighborhood rough set,
PR(84), 2018, pp. 273-287.
Elsevier DOI 1809
Online feature selection, Multi-label learning, Neighborhood rough set, Granularity BibRef

Peng, Y.[Yali], Sehdev, P.[Paramjit], Liu, S.G.[Shi-Gang], Li, J.[Jun], Wang, X.L.[Xi-Li],
L_2,1-norm minimization based negative label relaxation linear regression for feature selection,
PRL(116), 2018, pp. 170-178.
Elsevier DOI 1812
BibRef

Yang, X.L.[Xiang-Lin], Wang, Y.J.[Yu-Jing], Ou, Y.[Yang], Tong, Y.H.[Yun-Hai],
Three-Fast-Inter Incremental Association Markov Blanket learning algorithm,
PRL(122), 2019, pp. 73-78.
Elsevier DOI 1904
Markov blanket, IAMB, Bayesian network BibRef

Li, C.S.[Chang-Sheng], Wang, X.F.[Xiang-Feng], Dong, W.S.[Wei-Shan], Yan, J.C.[Jun-Chi], Liu, Q.S.[Qing-Shan], Zha, H.Y.[Hong-Yuan],
Joint Active Learning with Feature Selection via CUR Matrix Decomposition,
PAMI(41), No. 6, June 2019, pp. 1382-1396.
IEEE DOI 1905
Feature selection. Feature extraction, Matrix decomposition, Image reconstruction, Iterative methods, Labeling, Optimization, matrix factorization BibRef

Shang, R.H.[Rong-Hua], Meng, Y.[Yang], Wang, W.B.[Wen-Bing], Shang, F.H.[Fan-Hua], Jiao, L.C.[Li-Cheng],
Local discriminative based sparse subspace learning for feature selection,
PR(92), 2019, pp. 219-230.
Elsevier DOI 1905
Local discriminant model, Subspace learning, Sparse constraint, Feature selection BibRef

Shah, M.H., Dang, X.,
Novel Feature Selection Method Using Bhattacharyya Distance for Neural Networks Based Automatic Modulation Classification,
SPLetters(27), 2020, pp. 106-110.
IEEE DOI 2001
Modulation, Feature extraction, Training, Neural networks, Probability distribution, Signal processing algorithms, CNN BibRef

Yu, K.[Kui], Liu, L.[Lin], Li, J.Y.[Jiu-Yong], Ding, W.[Wei], Le, T.D.[Thuc Duy],
Multi-Source Causal Feature Selection,
PAMI(42), No. 9, September 2020, pp. 2240-2256.
IEEE DOI 2008
Feature extraction, Diseases, Training, Search problems, Reliability, Predictive models, Markov processes, Causal feature selection, causal invariance BibRef

Song, X.F.[Xian-Fang], Zhang, Y.[Yong], Gong, D.W.[Dun-Wei], Sun, X.Y.[Xiao-Yan],
Feature selection using bare-bones particle swarm optimization with mutual information,
PR(112), 2021, pp. 107804.
Elsevier DOI 2102
Feature selection, Particle swarm, Swarm initialization, Mutual information, Local search BibRef

Ma, W.P.[Wen-Ping], Zhou, X.B.[Xiao-Bo], Zhu, H.[Hao], Li, L.W.[Long-Wei], Jiao, L.C.[Li-Cheng],
A two-stage hybrid ant colony optimization for high-dimensional feature selection,
PR(116), 2021, pp. 107933.
Elsevier DOI 2106
Feature selection, Ant colony optimization, High-dimensional data, Classification, Optimal feature subset size BibRef

Zhao, J.[Jie], Ling, Y.[Yun], Huang, F.[Faliang], Wang, J.[Jiahai], See-To, E.W.K.[Eric W.K.],
Incremental feature selection for dynamic incomplete data using sub-tolerance relations,
PR(148), 2024, pp. 110125.
Elsevier DOI 2402
Incremental feature selection, Tolerance rough set, Sub-tolerance relation, Significance measure BibRef


Caiafa, C.F.[Cesar F.], Wang, Z.[Ziyao], Solé-Casals, J.[Jordi], Zhao, Q.B.[Qi-Bin],
Learning from Incomplete Features by Simultaneous Training of Neural Networks and Sparse Coding,
LLID21(2621-2630).
IEEE DOI 2109
Training, Deep learning, Sufficient conditions, Dictionaries, Simulation, Supervised learning, Neural networks BibRef

Shi, H.L.[Hai-Lin], Zhu, X.Y.[Xiang-Yu], Lei, Z.[Zhen], Liao, S.C.[Sheng-Cai], Li, S.Z.[Stan Z.],
Learning Discriminative Features with Class Encoder,
Robust16(1119-1125).
IEEE DOI 1612
BibRef

Sato, Y.[Yoshikuni], Kozuka, K.[Kazuki], Sawada, Y.[Yoshihide], Kiyono, M.[Masaki],
Learning Multiple Complex Features Based on Classification Results,
ICPR14(3369-3373).
IEEE DOI 1412
Accuracy BibRef

Xin, X.[Xin], Li, Z.[Zhu], Ma, Z.[Zhan], Katsaggelos, A.K.[Aggelos K.],
Robust feature selection with self-matching score,
ICIP13(4363-4366).
IEEE DOI 1402
compact visual descriptor;mobile visual search;self-matching score BibRef

Rodrigues, D.[Douglas], Pereira, L.A.M.[Luis A. M.],
Optimizing Feature Selection through Binary Charged System Search,
CAIP13(377-384).
Springer DOI 1308
BibRef

Chang, Y.J.[Yao-Jen], Chen, T.H.[Tsu-Han],
Semi-supervised learning with kernel locality-constrained linear coding,
ICIP11(2977-2980).
IEEE DOI 1201
Uses both labeled and unlabeled data when little labeled data is available. BibRef

Cortazar, E.[Esteban], Mery, D.[Domingo],
A Probabilistic Iterative Local Search Algorithm Applied to Full Model Selection,
CIARP11(675-682).
Springer DOI 1111
Selects combinations of supervised learning methods. BibRef

Sousa, R.[Ricardo], Oliveira, H.P.[Hélder P.], Cardoso, J.S.[Jaime S.],
Feature Selection with Complexity Measure in a Quadratic Programming Setting,
IbPRIA11(524-531).
Springer DOI 1106
BibRef

Duin, R.P.W.[Robert P. W.], Loog, M.[Marco], Pelkalska, E.[Elzabieta], Tax, D.M.J.[David M. J.],
Feature-Based Dissimilarity Space Classification,
ICPR-Contests10(46-55).
Springer DOI 1008
BibRef

Shen, J.F.[Ji-Feng], Yang, W.K.[Wan-Kou], Sun, C.Y.[Chang-Yin],
Learning Discriminative Features Based on Distribution,
ICPR10(1401-1404).
IEEE DOI 1008
BibRef

Kundu, P.P.[Partha Pratim], Mitra, S.[Sushmita],
Multi-objective Evolutionary Feature Selection,
PReMI09(74-79).
Springer DOI 0912
BibRef

Ramirez, R.[Rafael], Puiggros, M.[Montserrat],
A Genetic Programming Approach to Feature Selection and Classification of Instantaneous Cognitive States,
EvoIASP07(311-319).
Springer DOI 0704
BibRef

Azhar, H.B.[Hannan Bin], Dimond, K.[Keith],
A Stochastic Search Algorithm to Optimize an N-tuple Classifier by Selecting Its Inputs,
ICIAR04(I: 556-563).
Springer DOI 0409
BibRef

Chapter on Pattern Recognition, Clustering, Statistics, Grammars, Learning, Neural Nets, Genetic Algorithms continues in
Ranking.


Last update: Mar 16, 2024 at 20:36:19