TY - GEN
T1 - Using SDP to Parameterize Universal Kernel Functions
AU - Colbert, Brendon K.
AU - Peet, Matthew M.
N1 - Funding Information:
This work was supported by the National Science Foundation under grants No. 1931270, No. 1935453, and No. 026257-001. Brendon K. Colbert is with the Department of Mechanical Engineering, Arizona State University, Tempe, AZ 85298 USA (brendon.colbert@asu.edu). Matthew M. Peet is with the Faculty of Mechanical Engineering, Arizona State University, Tempe, AZ 85298 USA (mpeet@asu.edu).
Publisher Copyright:
© 2019 IEEE.
PY - 2019/12
Y1 - 2019/12
N2 - We propose a new class of universal kernel functions which admit a linear parametrization using positive semidefinite matrices. We refer to kernels of this class as Tessellated Kernels (TKs) because, when applied in kernel-based learning algorithms, the resulting discriminants are continuous piecewise-polynomial functions defined on hyper-rectangular domains whose vertices are determined by the training data. The number of parameters used to define these TKs is determined by the length of an associated monomial basis. However, even for a single monomial basis function, the TKs are universal in the sense that the resulting discriminants occupy a hypothesis space which is dense in L2. This implies that using TKs for learning the kernel (also known as kernel learning) can obviate the need for Gaussian kernels and the associated problem of bandwidth selection - a conclusion verified through extensive numerical testing on soft-margin Support Vector Machine (SVM) problems. Furthermore, our results show that when the ratio of the number of training data to the number of features is high, the proposed method significantly outperforms other kernel-learning algorithms. Finally, TKs can be integrated efficiently with existing Multiple Kernel Learning (MKL) algorithms such as SimpleMKL.
AB - We propose a new class of universal kernel functions which admit a linear parametrization using positive semidefinite matrices. We refer to kernels of this class as Tessellated Kernels (TKs) because, when applied in kernel-based learning algorithms, the resulting discriminants are continuous piecewise-polynomial functions defined on hyper-rectangular domains whose vertices are determined by the training data. The number of parameters used to define these TKs is determined by the length of an associated monomial basis. However, even for a single monomial basis function, the TKs are universal in the sense that the resulting discriminants occupy a hypothesis space which is dense in L2. This implies that using TKs for learning the kernel (also known as kernel learning) can obviate the need for Gaussian kernels and the associated problem of bandwidth selection - a conclusion verified through extensive numerical testing on soft-margin Support Vector Machine (SVM) problems. Furthermore, our results show that when the ratio of the number of training data to the number of features is high, the proposed method significantly outperforms other kernel-learning algorithms. Finally, TKs can be integrated efficiently with existing Multiple Kernel Learning (MKL) algorithms such as SimpleMKL.
UR - http://www.scopus.com/inward/record.url?scp=85082455655&partnerID=8YFLogxK
UR - http://www.scopus.com/inward/citedby.url?scp=85082455655&partnerID=8YFLogxK
U2 - 10.1109/CDC40024.2019.9030084
DO - 10.1109/CDC40024.2019.9030084
M3 - Conference contribution
AN - SCOPUS:85082455655
T3 - Proceedings of the IEEE Conference on Decision and Control
SP - 4622
EP - 4629
BT - 2019 IEEE 58th Conference on Decision and Control, CDC 2019
PB - Institute of Electrical and Electronics Engineers Inc.
T2 - 58th IEEE Conference on Decision and Control, CDC 2019
Y2 - 11 December 2019 through 13 December 2019
ER -
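Note: the sketch below is a minimal, hypothetical illustration of the idea described in the abstract, not the authors' implementation. It fixes the TK to its simplest instance - a degree-0 monomial basis with the PSD parameter matrix taken to be the identity, so no SDP is solved - and assumes the data are scaled into [0, 1]^d. Under these assumptions the kernel reduces to k(x, y) = prod_i (1 - max(x_i, y_i)), a product of 1-D min-type kernels and hence positive semidefinite, which is then used as a precomputed kernel in a soft-margin SVM. The helper name tk_gram and the toy labels are illustrative.

    # Minimal sketch (assumed, simplified): degree-0 Tessellated Kernel
    # with identity PSD matrix, used as a precomputed kernel in an SVM.
    import numpy as np
    from sklearn.svm import SVC
    from sklearn.preprocessing import MinMaxScaler

    def tk_gram(X, Y):
        """Gram matrix of k(x, y) = prod_i (1 - max(x_i, y_i)) on [0, 1]^d."""
        # Pairwise elementwise max over the feature axis via broadcasting:
        # shapes (n, 1, d) and (1, m, d) combine to (n, m, d).
        pairwise_max = np.maximum(X[:, None, :], Y[None, :, :])
        return np.prod(1.0 - pairwise_max, axis=2)

    rng = np.random.default_rng(0)
    X = rng.normal(size=(200, 2))
    y = np.sign(X[:, 0] * X[:, 1])      # toy nonlinear labeling

    Xs = MinMaxScaler().fit_transform(X)  # map features into [0, 1]^d
    K = tk_gram(Xs, Xs)

    clf = SVC(kernel="precomputed", C=1.0)  # soft-margin SVM, as in the paper
    clf.fit(K, y)
    print("training accuracy:", clf.score(K, y))

In the paper itself the PSD matrix parameterizing the TK is optimized (via SDP, or jointly with MKL solvers such as SimpleMKL); fixing it to the identity above only illustrates the form of the kernel and its piecewise-polynomial discriminants.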