TY - GEN
T1 - An efficient algorithm for a class of fused Lasso problems
AU - Liu, Jun
AU - Yuan, Lei
AU - Ye, Jieping
PY - 2010
Y1 - 2010
N2 - The fused Lasso penalty enforces sparsity in both the coefficients and their successive differences, which is desirable for applications with features ordered in some meaningful way. The resulting problem is, however, challenging to solve, as the fused Lasso penalty is both non-smooth and non-separable. Existing algorithms have high computational complexity and do not scale to large-scale problems. In this paper, we propose an Efficient Fused Lasso Algorithm (EFLA) for optimizing this class of problems. One key building block in the proposed EFLA is the Fused Lasso Signal Approximator (FLSA). To efficiently solve FLSA, we propose to reformulate it as the problem of finding an "appropriate" subgradient of the fused penalty at the minimizer, and develop a Subgradient Finding Algorithm (SFA). We further design a restart technique to accelerate the convergence of SFA by exploiting the special "structures" of both the original and the reformulated FLSA problems. Our empirical evaluations show that both SFA and EFLA significantly outperform existing solvers. We also demonstrate several applications of the fused Lasso.
KW - Fused Lasso
KW - Restart
KW - Subgradient
KW - ℓ1 regularization
UR - http://www.scopus.com/inward/record.url?scp=77956206508&partnerID=8YFLogxK
UR - http://www.scopus.com/inward/citedby.url?scp=77956206508&partnerID=8YFLogxK
U2 - 10.1145/1835804.1835847
DO - 10.1145/1835804.1835847
M3 - Conference contribution
AN - SCOPUS:77956206508
SN - 9781450300551
T3 - Proceedings of the ACM SIGKDD International Conference on Knowledge Discovery and Data Mining
SP - 323
EP - 332
BT - KDD'10 - Proceedings of the 16th ACM SIGKDD International Conference on Knowledge Discovery and Data Mining
T2 - 16th ACM SIGKDD International Conference on Knowledge Discovery and Data Mining, KDD-2010
Y2 - 25 July 2010 through 28 July 2010
ER -