TY - GEN
T1 - Random minibatch projection algorithms for convex feasibility problems
AU - Nedic, Angelia
AU - Necoara, Ion
N1 - Funding Information:
This research was supported by the Executive Agency for Higher Education, Research and Innovation Funding (UEFISCDI), Romania, PNIII-P4-PCE-2016-0731, project ScaleFreeNet, no. 39/2017.
Publisher Copyright:
© 2019 IEEE.
PY - 2019/12
Y1 - 2019/12
N2 - This paper deals with the convex feasibility problem where the feasible set is given as the intersection of a (possibly infinite) number of closed convex sets. We assume that each set is specified algebraically as a convex inequality, where the associated convex function may even be non-differentiable. We present and analyze a random minibatch projection algorithm using special subgradient iterations for solving the convex feasibility problem described by the functional constraints. The updates are performed based on parallel random observations of several constraint components. For this minibatch method we derive asymptotic convergence results and, under a linear regularity condition on the functional constraints, we prove a linear convergence rate. We also derive conditions under which the rate depends explicitly on the minibatch size. To the best of our knowledge, this work is the first to prove that random minibatch subgradient-based projection updates have better complexity than their single-sample variants.
AB - This paper deals with the convex feasibility problem where the feasible set is given as the intersection of a (possibly infinite) number of closed convex sets. We assume that each set is specified algebraically as a convex inequality, where the associated convex function may even be non-differentiable. We present and analyze a random minibatch projection algorithm using special subgradient iterations for solving the convex feasibility problem described by the functional constraints. The updates are performed based on parallel random observations of several constraint components. For this minibatch method we derive asymptotic convergence results and, under a linear regularity condition on the functional constraints, we prove a linear convergence rate. We also derive conditions under which the rate depends explicitly on the minibatch size. To the best of our knowledge, this work is the first to prove that random minibatch subgradient-based projection updates have better complexity than their single-sample variants.
KW - Convex sets
KW - convergence analysis
KW - feasibility problem
KW - functional constraints
KW - random minibatch projections
UR - http://www.scopus.com/inward/record.url?scp=85082453339&partnerID=8YFLogxK
UR - http://www.scopus.com/inward/citedby.url?scp=85082453339&partnerID=8YFLogxK
U2 - 10.1109/CDC40024.2019.9029233
DO - 10.1109/CDC40024.2019.9029233
M3 - Conference contribution
AN - SCOPUS:85082453339
T3 - Proceedings of the IEEE Conference on Decision and Control
SP - 1507
EP - 1512
BT - 2019 IEEE 58th Conference on Decision and Control, CDC 2019
PB - Institute of Electrical and Electronics Engineers Inc.
T2 - 58th IEEE Conference on Decision and Control, CDC 2019
Y2 - 11 December 2019 through 13 December 2019
ER -