Research Article

Dimensionality Reduction with Ensemble Learning using Growing Hierarchical Adaptive Self-Organizing Map

by Chiraz Jlassi, Ameni Filali, Najet Arous
International Journal of Computer Applications
Foundation of Computer Science (FCS), NY, USA
Volume 187 - Issue 82
Published: February 2026
Authors: Chiraz Jlassi, Ameni Filali, Najet Arous
10.5120/ijca2026926428

Chiraz Jlassi, Ameni Filali, Najet Arous. Dimensionality Reduction with Ensemble Learning using Growing Hierarchical Adaptive Self-Organizing Map. International Journal of Computer Applications. 187, 82 (February 2026), 17-23. DOI=10.5120/ijca2026926428

@article{10.5120/ijca2026926428,
  author    = {Chiraz Jlassi and Ameni Filali and Najet Arous},
  title     = {Dimensionality Reduction with Ensemble Learning using Growing Hierarchical Adaptive Self-Organizing Map},
  journal   = {International Journal of Computer Applications},
  year      = {2026},
  volume    = {187},
  number    = {82},
  pages     = {17-23},
  doi       = {10.5120/ijca2026926428},
  publisher = {Foundation of Computer Science (FCS), NY, USA}
}
%0 Journal Article
%D 2026
%A Chiraz Jlassi
%A Ameni Filali
%A Najet Arous
%T Dimensionality Reduction with Ensemble Learning using Growing Hierarchical Adaptive Self-Organizing Map
%J International Journal of Computer Applications
%V 187
%N 82
%P 17-23
%R 10.5120/ijca2026926428
%I Foundation of Computer Science (FCS), NY, USA
Abstract

Feature selection reduces the complexity of learning models by retaining only the most relevant features, which enhances interpretability while maintaining strong generalization performance. This study addresses the challenge of choosing a subset of the most important features for each cluster within a dataset. The proposed method extends the Random Forests approach to unlabelled data by incorporating the growing hierarchical adaptive self-organizing map (GH_AdSOM) variant. It assesses out-of-bag feature importance across multiple partitions, each generated from a different bootstrap sample and a random subset of features. GH_AdSOM is a neural network architecture that combines the benefits of two key enhancements to the self-organizing map, dynamic growth and a hierarchical structure. This allows the map size to adapt to the data and the representation to be organized in layers, yielding a powerful and flexible neural network model.
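As a rough illustration of the ensemble feature-importance idea described in the abstract, the sketch below replaces GH_AdSOM with a minimal k-means clusterer (purely a stand-in, since the authors' map implementation is not given here) and scores each feature by how much out-of-bag cluster assignments change when that feature's values are permuted, aggregated over bootstrap rounds with random feature subsets. All names and parameters (`simple_kmeans`, `oob_feature_importance`, `subset_frac`, and so on) are assumptions for illustration, not the paper's implementation.

```python
import numpy as np

def simple_kmeans(X, k, iters=20, rng=None):
    """Minimal k-means; a stand-in for the GH_AdSOM clusterer."""
    X = np.asarray(X, dtype=float)
    rng = rng if rng is not None else np.random.default_rng(0)
    centers = X[rng.choice(len(X), k, replace=False)]
    for _ in range(iters):
        labels = np.argmin(((X[:, None] - centers) ** 2).sum(-1), axis=1)
        for j in range(k):
            if np.any(labels == j):           # keep old center if cluster empties
                centers[j] = X[labels == j].mean(axis=0)
    return centers

def oob_feature_importance(X, k=2, n_rounds=20, subset_frac=0.6, seed=0):
    """Permutation-based out-of-bag importance over bootstrapped partitions.

    Each round: draw a bootstrap sample and a random feature subset, fit a
    clustering on the in-bag points, then measure how often out-of-bag
    assignments change when one feature is permuted (higher = more important).
    """
    X = np.asarray(X, dtype=float)
    rng = np.random.default_rng(seed)
    n, d = X.shape
    scores, counts = np.zeros(d), np.zeros(d)
    for _ in range(n_rounds):
        feats = rng.choice(d, max(2, int(subset_frac * d)), replace=False)
        boot = rng.choice(n, n, replace=True)
        oob = np.setdiff1d(np.arange(n), boot)
        if len(oob) == 0:
            continue
        centers = simple_kmeans(X[np.ix_(boot, feats)], k, rng=rng)
        Xo = X[np.ix_(oob, feats)]
        base = np.argmin(((Xo[:, None] - centers) ** 2).sum(-1), axis=1)
        for i, f in enumerate(feats):
            Xp = Xo.copy()
            Xp[:, i] = rng.permutation(Xp[:, i])
            perm = np.argmin(((Xp[:, None] - centers) ** 2).sum(-1), axis=1)
            scores[f] += np.mean(perm != base)  # disagreement after permuting f
            counts[f] += 1
    return scores / np.maximum(counts, 1)
```

On synthetic data where only the first feature separates the clusters, permuting that feature scrambles roughly half of the out-of-bag assignments, so it receives a markedly higher score than the noise features.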

References
  • Kaur, S., Kumar, Y., Koul, A., and Kumar Kamboj, S. 2023. A systematic review on metaheuristic optimization techniques for feature selections in disease diagnosis, open issues and challenges. Archives of Computational Methods in Engineering, 30(3), pp 1863-1895.
  • Fan, Y., Liu, J., Tang, J., Liu, P., Lin, Y., and Du, Y. 2024. Learning correlation information for multi-label feature selection. Pattern Recognition, 145, 109899.
  • Strehl, A., and Ghosh, J. 2002. Cluster ensembles - a knowledge reuse framework for combining multiple partitions. Journal of Machine Learning Research, 3, pp 583–617.
  • Dong, X., Yu, Z., Cao, W., Shi, Y., and Ma, Q. 2020. A survey on ensemble learning. Frontiers of Computer Science, 14, 241-258.
  • Deng, S., Zhu, Y., Yu, Y., and Huang, X. 2024. An integrated approach of ensemble learning methods for stock index prediction using investor sentiments. Expert Systems with Applications, 238, 121710.
  • Nie, F., Huang, H., Cai, X., and Ding, C. 2010. Efficient and robust feature selection via joint l2,1-norms minimization. In Proc. Advances in Neural Information Processing Systems (NIPS), pp. 1813–1821.
  • Xu, Z., King, I., Lyu, M. R.-T., and Jin, R. 2010. Discriminative semi-supervised feature selection via manifold regularization. IEEE Trans. Neural Netw., 21(7), pp 1033–1047.
  • Li, Z., Yang, Y., Liu, J., Zhou, X., and Lu, H. 2012. Unsupervised feature selection using nonnegative spectral analysis. In Proc. Conf. AAAI, pp. 1026–1032.
  • Fred, A., and Jain, A. 2005. Combining multiple clusterings using evidence accumulation. IEEE Trans. Pattern Anal. Mach. Intell., 27(6), pp 835–850.
  • Li, F., and Yang, Y. 2005. Analysis of recursive feature elimination methods. In Proceedings of the 28th annual international ACM SIGIR conference on Research and development in information retrieval pp. 633-634.
  • Nogales, R.E., Benalcázar, M.E. 2023. Analysis and Evaluation of Feature Selection and Feature Extraction Methods. Int J Comput Intell Syst 16, 153.
  • Deschênes, T., Tohoundjona, F. W. E., Plante, P. L., Di Marzo, V., and Raymond, F. 2023. Gene-based microbiome representation enhances host phenotype classification. Msystems, 8(4), e00531-23.
  • Bommert, A., Sun, X., Bischl, B., Rahnenführer, J., and Lang, M. 2020. Benchmark for filter methods for feature selection in high-dimensional classification data. Computational Statistics & Data Analysis, 143, 106839.
  • Liu, H., and Setiono, R. 2022. Feature selection and classification - a probabilistic wrapper approach. In Industrial and engineering applications of artificial intelligence and expert systems (pp. 419-424). CRC Press.
  • Dudoit, S., and Fridlyand, J. 2003. Bagging to improve the accuracy of a clustering procedure. Bioinformatics, 19(9), pp 1090–1099.
  • Breiman, L. 2001. Random forests. Mach. Learn., 45(1), pp 5–32.
  • Guyon, I., and Elisseeff, A. 2003. An introduction to variable and feature selection. J. Mach. Learn. Res., 3, pp 1157–1182.
  • Mundra, P. A., and Rajapakse, J. C. 2009. SVM-RFE with MRMR filter for gene selection. IEEE Transactions on Nanobioscience, 9(1), pp 31-37.
  • Nakao, H., Imaoka, M., Hida, M., Imai, R., Nakamura, M., Matsumoto, K., and Kita, K. 2023. Determination of individual factors associated with hallux valgus using SVM-RFE. BMC Musculoskeletal Disorders, 24(1), 534.
  • Dittenbach, M., Merkl, D., and Rauber, A. 2000. The growing hierarchical self-organizing map. In Proceedings of the International Joint Conference on Neural Networks (IJCNN), Vol. 6, pp 15-19.
  • Kohonen, T. 2001. Self-Organizing Maps, third edition, Springer.
  • Meila, M. 2005. Comparing clusterings: An axiomatic view. In Proc. 22nd Int. Conf. Machine Learning (ICML), Bonn, Germany, pp. 577–584.
  • Filali, A., Jlassi, C., and Arous, N. 2017. Recursive Feature Elimination with Ensemble Learning Using SOM Variants. International Journal of Computational Intelligence and Applications (IJCIA), Vol. 16, No. 01, 1750004.
Index Terms
Computer Science
Information Sciences
Keywords

Growing hierarchical adaptive self-organizing map; random forest; feature selection; recursive feature elimination
