Optimization of Feature Selection using Firework Algorithm for Machine Learning Algorithm
DOI: https://doi.org/10.21015/vtcs.v13i2.2128

Abstract
In machine learning and deep learning, optimal feature selection plays an important role in improving performance, reducing computational cost, and increasing model interpretability. The performance of a machine learning algorithm is degraded by the noisy, redundant, and irrelevant features found in most classification datasets. Feature selection is the process of searching for an optimal subset of features, guided by a fitness function, to improve accuracy and execution time. This research applies the fireworks algorithm as a tool for optimizing feature selection. By navigating the feature space, the proposed technique finds the subset of features that maximizes a model's performance. The algorithm iteratively refines the feature subset by evaluating a fitness function that combines model complexity with the predictive ability of the chosen features. The paper uses widely known breast cancer datasets with a limited number of features. The classification performance of the selected feature subsets is evaluated with three classifiers: support vector machine, logistic regression, and a bagging classifier. The proposed algorithm outperforms the particle swarm optimization algorithm, the ant colony optimization algorithm, and principal component analysis. The results indicate that the recommended technique selects a feature subset that achieves higher accuracy than using all features. The optimal subset both improves accuracy and reduces the number of features, with the fitness function weighted toward a rapid reduction in feature count.
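To make the approach concrete, the following is a minimal, illustrative sketch (not the authors' code) of fireworks-style feature selection on the scikit-learn breast cancer dataset with logistic regression. The fitness function (cross-validated accuracy minus a small size penalty), the spark counts, the explosion amplitudes, and the iteration budget are all assumptions chosen for brevity, not values from the paper.

```python
# Simplified binary fireworks algorithm for feature selection.
# Each "firework" is a boolean mask over the features; an explosion
# flips a few random bits to produce candidate "sparks".
import numpy as np
from sklearn.datasets import load_breast_cancer
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(42)
X, y = load_breast_cancer(return_X_y=True)
n_features = X.shape[1]

def fitness(mask):
    """Cross-validated accuracy of the subset minus a size penalty."""
    if not mask.any():
        return 0.0
    clf = LogisticRegression(max_iter=5000)
    acc = cross_val_score(clf, X[:, mask], y, cv=3).mean()
    return acc - 0.01 * mask.sum() / n_features  # assumed penalty weight

def explode(mask, n_sparks, amplitude):
    """Generate sparks by flipping up to `amplitude` random bits."""
    sparks = []
    for _ in range(n_sparks):
        spark = mask.copy()
        flips = rng.choice(n_features,
                           size=rng.integers(1, amplitude + 1),
                           replace=False)
        spark[flips] ^= True
        sparks.append(spark)
    return sparks

# Initialise a few random fireworks and iterate.
fireworks = [rng.random(n_features) < 0.5 for _ in range(4)]
best_mask, best_fit = None, -np.inf
for _ in range(6):
    candidates = list(fireworks)
    scores = [fitness(fw) for fw in fireworks]
    for rank, idx in enumerate(np.argsort(scores)):
        # Better fireworks get more sparks with a smaller amplitude,
        # mirroring the exploitation/exploration balance of the FWA.
        n_sparks = 2 + rank
        amplitude = max(1, n_features // (2 + rank))
        candidates += explode(fireworks[idx], n_sparks, amplitude)
    candidates = [c for c in candidates if c.any()]
    fits = [fitness(c) for c in candidates]
    top = np.argsort(fits)[-4:]
    fireworks = [candidates[i] for i in top]
    if fits[top[-1]] > best_fit:
        best_fit, best_mask = fits[top[-1]], candidates[top[-1]]

print(f"selected {best_mask.sum()}/{n_features} features, "
      f"fitness {best_fit:.3f}")
```

In practice the selected subset is much smaller than the full 30 features while the penalized cross-validated accuracy matches or exceeds that of the all-features baseline, which is the behaviour the abstract describes.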
License
Authors who publish with this journal agree to the following terms:
- Authors retain copyright and grant the journal right of first publication with the work simultaneously licensed under a Creative Commons Attribution License (CC BY) that allows others to share the work with an acknowledgment of the work's authorship and initial publication in this journal.
- Authors are able to enter into separate, additional contractual arrangements for the non-exclusive distribution of the journal's published version of the work (e.g., post it to an institutional repository or publish it in a book), with an acknowledgement of its initial publication in this journal.
- Authors are permitted and encouraged to post their work online (e.g., in institutional repositories or on their website) prior to and during the submission process, as it can lead to productive exchanges, as well as earlier and greater citation of published work (See The Effect of Open Access).
This work is licensed under a Creative Commons Attribution License CC BY