Metaheuristic-Based Hyperparameter Optimization for Machine Learning Classification: An Applied Experimental Study

Authors

  • Ahmed Majid, University of Information Technology and Communications

DOI:

https://doi.org/10.25195/ijci.v52i1.734

Keywords:

Metaheuristic optimization; Hyperparameter tuning; Machine learning classification; Particle Swarm Optimization; Grey Wolf Optimizer; Hybrid optimization.

Abstract

Hyperparameter selection is a key determinant of the predictive performance and generalization of machine learning models. In practice, poor hyperparameter choices often lead to suboptimal performance even when sophisticated learning algorithms are used. Intelligent optimization techniques are commonly adopted when conventional tuning methods, such as grid search and random search, become computationally infeasible. This paper presents an experimental study of metaheuristic-based hyperparameter optimization for machine learning classification. The study applies Particle Swarm Optimization (PSO), the Genetic Algorithm (GA), and the Grey Wolf Optimizer (GWO) to optimize the hyperparameters of popular classifiers, including Support Vector Machines, Random Forests, and k-Nearest Neighbors. A hybrid PSO–GWO framework is also proposed to combine complementary exploration and exploitation behaviors and to improve convergence stability and optimization effectiveness. Experiments on multiple benchmark datasets from the UCI Machine Learning Repository show that metaheuristic hyperparameter optimization consistently outperforms default configurations on classification tasks. Moreover, the proposed hybrid method achieves higher accuracy and more stable convergence than the individual optimizers. These results indicate that hybrid metaheuristic approaches offer a relevant and scalable means of improving machine learning classification in applied settings.
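The paper's algorithmic details are not reproduced on this page. As an illustration only, a minimal PSO hyperparameter-tuning loop of the general kind the abstract describes can be sketched as follows; the objective `validation_loss` is a hypothetical stand-in for a real cross-validation score (its two arguments mimic an SVM's `C` and `gamma`), and all function names, bounds, and PSO coefficients are illustrative assumptions, not the authors' settings:

```python
import random

def validation_loss(c, gamma):
    # Hypothetical stand-in for a cross-validated error; a smooth
    # bowl whose minimum at (c=10, gamma=0.1) mimics an optimal setting.
    return (c - 10.0) ** 2 + (gamma - 0.1) ** 2

def pso_tune(obj, bounds, n_particles=20, n_iters=100,
             w=0.7, c1=1.5, c2=1.5, seed=0):
    """Minimize obj over box-constrained hyperparameters with basic PSO."""
    rng = random.Random(seed)
    dim = len(bounds)
    # Initialize particle positions uniformly within bounds; zero velocities.
    pos = [[rng.uniform(lo, hi) for lo, hi in bounds] for _ in range(n_particles)]
    vel = [[0.0] * dim for _ in range(n_particles)]
    pbest = [p[:] for p in pos]                 # personal best positions
    pbest_val = [obj(*p) for p in pos]          # personal best scores
    g = min(range(n_particles), key=lambda i: pbest_val[i])
    gbest, gbest_val = pbest[g][:], pbest_val[g]  # global best so far
    for _ in range(n_iters):
        for i in range(n_particles):
            for d in range(dim):
                r1, r2 = rng.random(), rng.random()
                # Standard velocity update: inertia + cognitive + social terms.
                vel[i][d] = (w * vel[i][d]
                             + c1 * r1 * (pbest[i][d] - pos[i][d])
                             + c2 * r2 * (gbest[d] - pos[i][d]))
                # Move and clamp back into the search box.
                pos[i][d] = min(max(pos[i][d] + vel[i][d],
                                    bounds[d][0]), bounds[d][1])
            val = obj(*pos[i])
            if val < pbest_val[i]:
                pbest[i], pbest_val[i] = pos[i][:], val
                if val < gbest_val:
                    gbest, gbest_val = pos[i][:], val
    return gbest, gbest_val

best, loss = pso_tune(validation_loss, bounds=[(0.1, 100.0), (0.001, 1.0)])
```

In a real tuning run, `validation_loss` would wrap k-fold cross-validation of the classifier being tuned, and the swarm's best position would be the chosen hyperparameter vector.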

Published

2026-03-11