TY - GEN
T1 - CGPT
T2 - 2023 Winter Simulation Conference, WSC 2023
AU - Jiang, Mengrui Mina
AU - Khandait, Tanmay
AU - Pedrielli, Giulia
N1 - Publisher Copyright:
© 2023 IEEE.
PY - 2023
Y1 - 2023
AB - In black-box optimization problems, Bayesian optimization algorithms are often applied by generating inputs and measuring values to discover hidden structure and determine where to sample sequentially. However, in some situations, information about system properties can be available, such as the trajectory of a dynamical system, the discrete states executed during a simulation, or the model generating the trajectories. In different learning tasks, we may know that the objective is the minimum of several functions, or a network. In this paper we consider the case where the structure of the objective function can be encoded as a tree. In particular, each node of the tree performs a computation on the input and, based on the outcome, a different branch is chosen. We propose the new Conditional Gaussian Process tree (CGPT) model for "tree functions" to embed the function structure and improve the predictive power of the Gaussian process. We utilize the intermediate information made available at the tree nodes to formulate a novel likelihood for the estimation of the CGPT parameters under different levels of knowledge of the structure. We formulate the learning problem and assess the performance of the proposed approach in a preliminary investigation. Our study shows that CGPT always outperforms a single Gaussian process model.
UR - http://www.scopus.com/inward/record.url?scp=85185372460&partnerID=8YFLogxK
UR - http://www.scopus.com/inward/citedby.url?scp=85185372460&partnerID=8YFLogxK
U2 - 10.1109/WSC60868.2023.10408654
DO - 10.1109/WSC60868.2023.10408654
M3 - Conference contribution
AN - SCOPUS:85185372460
T3 - Proceedings - Winter Simulation Conference
SP - 564
EP - 575
BT - 2023 Winter Simulation Conference, WSC 2023
PB - Institute of Electrical and Electronics Engineers Inc.
Y2 - 10 December 2023 through 13 December 2023
ER -