Machine Learning Research


Automated Machine Learning Models and State-Of-The-Art Effort in Mitigating Combined Algorithm Selection and Hyperparameter Optimization Problems: A Review

Automated machine learning (AutoML) refers to techniques that automate the application of machine learning to real-world problems; in particular, it automates the selection, composition, and parameterization of machine learning models. Automation makes machine learning more accessible to non-experts, and it often produces faster and more accurate results than hand-coded pipelines. For more than ten years, AutoML for supervised learning has been a major focus of artificial intelligence research, and significant progress has been made since: AutoML methods now ship with the most popular machine learning toolkits, and AutoML mechanisms are built into large-scale platforms such as Microsoft Azure. This paper provides a methodical analysis of the AutoML workflow and of state-of-the-art efforts to address the Combined Algorithm Selection and Hyperparameter Optimization (CASH) problem, drawing on published articles from several online repositories to examine the methods used across different domains and the levels of accuracy obtained. The findings indicate that the next generation of machine learning and artificial intelligence research is focused on automating the remaining phases of the end-to-end machine learning pipeline, from data comprehension to model deployment. With better deep learning algorithms and larger datasets, AutoML is expected to handle most of the data-cleaning process as well. AutoML is evolving into a highly human-competitive system that will change how we approach data research.
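The CASH problem described above can be illustrated concretely: the search must pick an algorithm *and* that algorithm's hyperparameters jointly, since each candidate algorithm has its own configuration space. A minimal sketch using scikit-learn follows; the two candidate classifiers and their grids are illustrative choices, not methods evaluated in this review.

```python
# Minimal CASH sketch: jointly search over candidate algorithms and their
# hyperparameters with cross-validated grid search (illustrative only).
from sklearn.datasets import load_iris
from sklearn.linear_model import LogisticRegression
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import GridSearchCV
from sklearn.pipeline import Pipeline

X, y = load_iris(return_X_y=True)

# One pipeline with a swappable "clf" step; each dict pairs a candidate
# algorithm with its own hyperparameter grid, so a single search covers
# both the algorithm-selection and hyperparameter-optimization dimensions.
pipe = Pipeline([("clf", LogisticRegression())])
search_space = [
    {"clf": [LogisticRegression(max_iter=500)],
     "clf__C": [0.1, 1.0, 10.0]},
    {"clf": [RandomForestClassifier(random_state=0)],
     "clf__n_estimators": [50, 100],
     "clf__max_depth": [None, 5]},
]

search = GridSearchCV(pipe, search_space, cv=5)
search.fit(X, y)

# The winning configuration names both the chosen algorithm and its settings.
best_model = search.best_params_["clf"]
print(type(best_model).__name__, round(search.best_score_, 3))
```

Full AutoML systems such as Auto-WEKA and auto-sklearn replace the exhaustive grid above with Bayesian optimization over this joint space, which scales to many algorithms and high-dimensional hyperparameter spaces.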

Transfer Learning, Machine Learning, Hyperparameter, Automation, Artificial Intelligence

Nwokonkwo Obi Chukwuemeka, John-Otumu Adetokunbo MacGregor, Nnadi Leonard Chukwualuka, Ogene Ferguson. (2022). Automated Machine Learning Models and State-Of-The-Art Effort in Mitigating Combined Algorithm Selection and Hyperparameter Optimization Problems: A Review. Machine Learning Research, 7(1), 1-7.

Copyright © 2022 Authors retain the copyright of this article.
This article is an open access article distributed under the Creative Commons Attribution License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.
