TY - GEN
T1 - Comparative Analysis of Hyperparameter Tuning Methods in Classification Models For Ensemble Learning
AU - Dabool, Hamzah
AU - Alashwal, Hany
AU - Alnuaimi, Hamda
AU - Alhouqani, Asma
AU - Alkaabi, Shaikha
AU - Al Ahbabi, Amal
N1 - Publisher Copyright:
© 2024 IEEE.
PY - 2024
Y1 - 2024
N2 - Hyperparameter tuning plays a critical role in optimizing machine learning models, directly impacting their accuracy and generalization capabilities. In this paper, we implement and compare four prominent hyperparameter tuning algorithms: Grid Search, Random Search, Bayesian Optimization, and Genetic Algorithm. Our goal is to evaluate these methods on a multiclass classification task, assessing them based on tuning time, computational complexity, accuracy score, and ease of use. Through an extensive experimental analysis, we identify the strengths and limitations of each approach, providing insights into their ideal use cases. The results reveal trade-offs between exhaustive search methods like Grid Search, which offer higher accuracy at the cost of time, and more efficient alternatives like Random Search and Bayesian Optimization, which balance exploration and exploitation. Genetic Algorithms, while less commonly used, show potential in discovering global optima.
AB - Hyperparameter tuning plays a critical role in optimizing machine learning models, directly impacting their accuracy and generalization capabilities. In this paper, we implement and compare four prominent hyperparameter tuning algorithms: Grid Search, Random Search, Bayesian Optimization, and Genetic Algorithm. Our goal is to evaluate these methods on a multiclass classification task, assessing them based on tuning time, computational complexity, accuracy score, and ease of use. Through an extensive experimental analysis, we identify the strengths and limitations of each approach, providing insights into their ideal use cases. The results reveal trade-offs between exhaustive search methods like Grid Search, which offer higher accuracy at the cost of time, and more efficient alternatives like Random Search and Bayesian Optimization, which balance exploration and exploitation. Genetic Algorithms, while less commonly used, show potential in discovering global optima.
KW - Bayesian Search
KW - Cross Validation
KW - Genetic Search
KW - Grid Search
KW - Hyperparameter tuning
KW - Hyperparameters
KW - Random Search
KW - XGBoost
UR - http://www.scopus.com/inward/record.url?scp=105000635986&partnerID=8YFLogxK
UR - http://www.scopus.com/inward/citedby.url?scp=105000635986&partnerID=8YFLogxK
U2 - 10.1109/ACAI63924.2024.10899492
DO - 10.1109/ACAI63924.2024.10899492
M3 - Conference contribution
AN - SCOPUS:105000635986
T3 - ACAI 2024 - 2024 7th International Conference on Algorithms, Computing and Artificial Intelligence
BT - ACAI 2024 - 2024 7th International Conference on Algorithms, Computing and Artificial Intelligence
A2 - Wang, Zenghui
PB - Institute of Electrical and Electronics Engineers Inc.
T2 - 7th International Conference on Algorithms, Computing and Artificial Intelligence, ACAI 2024
Y2 - 20 December 2024 through 22 December 2024
ER -