TY - GEN
T1 - (αD, αG)-GANs
T2 - 2023 IEEE International Symposium on Information Theory, ISIT 2023
AU - Welfert, Monica
AU - Otstot, Kyle
AU - Kurri, Gowtham R.
AU - Sankar, Lalitha
N1 - Publisher Copyright:
© 2023 IEEE.
PY - 2023
Y1 - 2023
AB - In an effort to address the training instabilities of GANs, we introduce a class of dual-objective GANs with different value functions (objectives) for the generator (G) and discriminator (D). In particular, we model each objective using α-loss, a tunable classification loss, to obtain (αD, αG)-GANs, parameterized by (αD, αG) ∈ (0, ∞]². For a sufficiently large number of samples and sufficiently large capacities for G and D, we show that the resulting non-zero-sum game simplifies to minimizing an f-divergence under appropriate conditions on (αD, αG). In the finite sample and capacity setting, we define the estimation error to quantify the gap in the generator's performance relative to the optimal setting with infinite samples, and we obtain upper bounds on this error, showing it to be order optimal under certain conditions. Finally, we highlight the value of tuning (αD, αG) in alleviating training instabilities for the synthetic 2D Gaussian mixture ring and the Stacked MNIST datasets.
UR - http://www.scopus.com/inward/record.url?scp=85168834716&partnerID=8YFLogxK
UR - http://www.scopus.com/inward/citedby.url?scp=85168834716&partnerID=8YFLogxK
U2 - 10.1109/ISIT54713.2023.10206844
DO - 10.1109/ISIT54713.2023.10206844
M3 - Conference contribution
AN - SCOPUS:85168834716
T3 - IEEE International Symposium on Information Theory - Proceedings
SP - 915
EP - 920
BT - 2023 IEEE International Symposium on Information Theory, ISIT 2023
PB - Institute of Electrical and Electronics Engineers Inc.
Y2 - 25 June 2023 through 30 June 2023
ER -