TY - GEN
T1 - Parameter Optimization with Conscious Allocation (POCA)
AU - Inman, Joshua
AU - Khandait, Tanmay
AU - Pedrielli, Giulia
AU - Sankar, Lalitha
N1 - Publisher Copyright:
© 2023 IEEE.
PY - 2023
Y1 - 2023
N2 - The performance of modern machine learning algorithms depends on the selection of a set of hyperparameters. Common examples of hyperparameters are the learning rate and the number of layers in a dense neural network. Auto-ML is a branch of optimization that has produced important contributions in this area. Within Auto-ML, Hyperband-based approaches, which eliminate poorly performing configurations after evaluating them at low budgets, are among the most effective. However, the performance of these algorithms depends strongly on how effectively they allocate the computational budget across hyperparameter configurations. We present Parameter Optimization with Conscious Allocation (POCA), a new Hyperband-based algorithm that adaptively allocates the input budget to the hyperparameter configurations it generates following a Bayesian sampling scheme. We compare POCA to its nearest competitor at optimizing the hyperparameters of an artificial toy function and of a deep neural network, and find that POCA identifies strong configurations faster in both settings.
AB - The performance of modern machine learning algorithms depends on the selection of a set of hyperparameters. Common examples of hyperparameters are the learning rate and the number of layers in a dense neural network. Auto-ML is a branch of optimization that has produced important contributions in this area. Within Auto-ML, Hyperband-based approaches, which eliminate poorly performing configurations after evaluating them at low budgets, are among the most effective. However, the performance of these algorithms depends strongly on how effectively they allocate the computational budget across hyperparameter configurations. We present Parameter Optimization with Conscious Allocation (POCA), a new Hyperband-based algorithm that adaptively allocates the input budget to the hyperparameter configurations it generates following a Bayesian sampling scheme. We compare POCA to its nearest competitor at optimizing the hyperparameters of an artificial toy function and of a deep neural network, and find that POCA identifies strong configurations faster in both settings.
UR - http://www.scopus.com/inward/record.url?scp=85185382664&partnerID=8YFLogxK
UR - http://www.scopus.com/inward/citedby.url?scp=85185382664&partnerID=8YFLogxK
U2 - 10.1109/WSC60868.2023.10407962
DO - 10.1109/WSC60868.2023.10407962
M3 - Conference contribution
AN - SCOPUS:85185382664
T3 - Proceedings - Winter Simulation Conference
SP - 3436
EP - 3447
BT - 2023 Winter Simulation Conference, WSC 2023
PB - Institute of Electrical and Electronics Engineers Inc.
T2 - 2023 Winter Simulation Conference, WSC 2023
Y2 - 10 December 2023 through 13 December 2023
ER -