Better trees: an empirical study on hyperparameter tuning of classification decision tree induction algorithms
Mantovani, Rafael Gomes; Horváth, Tomáš; Rossi, André L. D.; Cerri, Ricardo; Barbon Junior, Sylvio; Vanschoren, Joaquin; de Carvalho, André C. P. L. F.
Authors
Rafael Gomes Mantovani
Tomáš Horváth
André L. D. Rossi
Ricardo Cerri
Sylvio Barbon Junior
Joaquin Vanschoren
André C. P. L. F. de Carvalho
Abstract
Machine learning algorithms often contain many hyperparameters whose values affect the predictive performance of the induced models in intricate ways. Due to the high number of possible hyperparameter configurations and their complex interactions, it is common to use optimization techniques to find settings that lead to high predictive performance. However, efficiently exploring this vast space of configurations and dealing with the trade-off between predictive and runtime performance remain challenging. Furthermore, there are cases where the default hyperparameter settings already provide a suitable configuration. Additionally, for many reasons, including model validation and compliance with new legislation, there is an increasing interest in interpretable models, such as those created by decision tree (DT) induction algorithms. This paper provides a comprehensive approach for investigating the effects of hyperparameter tuning on the two most commonly used DT induction algorithms, CART and C4.5. DT induction algorithms present high predictive performance and interpretable classification models, though many hyperparameters need to be adjusted. Experiments were carried out with different tuning strategies to induce models and to evaluate the relevance of hyperparameters using 94 classification datasets from OpenML. The experimental results show that tuning, with a different hyperparameter profile for each algorithm, provides statistically significant improvements in most of the datasets for CART, but only in one-third for C4.5. Although different algorithms may present different tuning scenarios, the tuning techniques generally required few evaluations to find accurate solutions. Furthermore, the best technique for all the algorithms was Irace. Finally, we found that tuning a specific small subset of hyperparameters is a good alternative for achieving optimal predictive performance.
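As an illustration of the kind of tuning the abstract describes, the sketch below searches over a small subset of CART hyperparameters with scikit-learn's DecisionTreeClassifier and a randomized search. The dataset, search ranges, budget, and scoring are placeholder choices for the example and are not the authors' experimental setup (which used Irace and other tuning techniques over 94 OpenML datasets).

```python
# Minimal sketch: tuning a small subset of CART hyperparameters.
# Dataset, ranges, and budget are illustrative assumptions, not the paper's setup.
from scipy.stats import randint
from sklearn.datasets import load_breast_cancer
from sklearn.model_selection import RandomizedSearchCV, train_test_split
from sklearn.tree import DecisionTreeClassifier

X, y = load_breast_cancer(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

# A small subset of hyperparameters, mirroring the finding that tuning a few
# well-chosen ones is often enough for near-optimal predictive performance.
param_distributions = {
    "min_samples_split": randint(2, 41),
    "min_samples_leaf": randint(1, 21),
    "max_depth": randint(2, 31),
    "criterion": ["gini", "entropy"],
}

search = RandomizedSearchCV(
    DecisionTreeClassifier(random_state=0),
    param_distributions=param_distributions,
    n_iter=50,          # modest budget: tuning often needs few evaluations
    cv=5,
    scoring="balanced_accuracy",
    random_state=0,
)
search.fit(X_train, y_train)

print("best hyperparameters:", search.best_params_)
print("test balanced accuracy:", search.score(X_test, y_test))
```

A different tuning technique (e.g., Irace, as used in the paper) or a different DT implementation (e.g., J48/C4.5 in Weka) would follow the same pattern: define a small hyperparameter space, evaluate candidate configurations under cross-validation, and keep the best-performing setting.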
Citation
Mantovani, R. G., Horváth, T., Rossi, A. L. D., Cerri, R., Barbon Junior, S., Vanschoren, J., & de Carvalho, A. C. P. L. F. (2024). Better trees: an empirical study on hyperparameter tuning of classification decision tree induction algorithms. Data Mining and Knowledge Discovery, 38, 1364–1416. https://doi.org/10.1007/s10618-024-01002-5
| Journal Article Type | Article |
| --- | --- |
| Acceptance Date | Jan 1, 2024 |
| Online Publication Date | Jan 31, 2024 |
| Publication Date | 2024 |
| Deposit Date | Mar 27, 2024 |
| Print ISSN | 1384-5810 |
| Electronic ISSN | 1573-756X |
| Publisher | Springer |
| Peer Reviewed | Peer Reviewed |
| Volume | 38 |
| Pages | 1364–1416 |
| DOI | https://doi.org/10.1007/s10618-024-01002-5 |
| Keywords | Decision tree induction algorithms, Hyperparameter tuning, Hyperparameter profile, J48, CART |
| Public URL | http://researchrepository.napier.ac.uk/Output/3577299 |