Results 1–7 of 7
Convergence Analysis of Alternating Direction Method of Multipliers for a Family of Nonconvex Problems
On the Convergence Rate of Multi-Block ADMM
, 2014
Abstract

Cited by 3 (1 self)
The alternating direction method of multipliers (ADMM) is widely used to solve structured convex optimization problems. Despite its success in practice, the convergence properties of the standard ADMM for minimizing the sum of N (N ≥ 3) convex functions with N block variables linked by linear constraints remained unclear for a long time. In this paper, we present convergence and convergence rate results for the standard ADMM applied to the N-block (N ≥ 3) convex minimization problem, under the condition that one of the functions is convex (not necessarily strongly convex) and the other N − 1 functions are strongly convex. Specifically, in that case the ADMM is proven to converge with rate O(1/t) in a certain ergodic sense, and o(1/t) in a non-ergodic sense, where t denotes the number of iterations.
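As a concrete illustration of this setting, the sketch below runs the standard multi-block ADMM (a Gauss-Seidel sweep over the blocks, then a dual update) on a toy three-block problem in which every block is a strongly convex quadratic. The closed-form block update is specific to this toy objective and is an illustrative assumption, not taken from the paper.

```python
import numpy as np

# Toy instance of the abstract's setting: minimize sum_i 0.5*||x_i - c_i||^2
# subject to x_1 + x_2 + x_3 = b, with all three blocks strongly convex.
rng = np.random.default_rng(0)
n = 4
c = [rng.standard_normal(n) for _ in range(3)]
b = rng.standard_normal(n)
rho = 1.0                       # penalty parameter

x = [np.zeros(n) for _ in range(3)]
u = np.zeros(n)                 # scaled dual variable

for t in range(200):
    for i in range(3):          # Gauss-Seidel sweep: use latest values of other blocks
        r = sum(x[j] for j in range(3) if j != i) - b
        # argmin_xi 0.5*||xi - c_i||^2 + (rho/2)*||xi + r + u||^2, in closed form
        x[i] = (c[i] - rho * (r + u)) / (1.0 + rho)
    u += sum(x) - b             # dual update on the constraint residual

print(np.linalg.norm(sum(x) - b))   # primal residual; decays geometrically here
```

For this toy problem the KKT conditions give x_i = c_i − λ with λ = (Σ c_i − b)/3, so the iterates can be checked against the exact optimum.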
Iteration complexity analysis of multi-block ADMM for a family of convex minimization without strong convexity
, 2015
Abstract

Cited by 1 (1 self)
The alternating direction method of multipliers (ADMM) is widely used to solve structured convex optimization problems due to its superior practical performance. On the theoretical side, however, a counterexample in [7] shows that the multi-block ADMM for minimizing the sum of N (N ≥ 3) convex functions with N block variables linked by linear constraints may diverge. It is therefore of great interest to investigate further sufficient conditions on the input that guarantee convergence of the multi-block ADMM. The existing results typically require strong convexity on parts of the objective. In this paper, we present convergence and convergence rate results for the multi-block ADMM applied to certain N-block (N ≥ 3) convex minimization problems without requiring strong convexity. Specifically, we prove the following two results: (1) the multi-block ADMM returns an ε-optimal solution within O(1/ε²) iterations by solving an associated perturbation of the original problem; (2) the multi-block ADMM returns an ε-optimal solution within O(1/ε) iterations when applied to a certain sharing problem, under the condition that the augmented Lagrangian function satisfies the Kurdyka–Łojasiewicz property, which essentially covers most convex optimization models except for some pathological cases.
Parallel Direction Method of Multipliers
Abstract

Cited by 1 (0 self)
We consider the problem of minimizing block-separable convex functions subject to linear constraints. While the Alternating Direction Method of Multipliers (ADMM) for two-block linear constraints has been intensively studied both theoretically and empirically, effective generalizations of ADMM to multiple blocks remain unclear in spite of some preliminary work. In this paper, we propose a randomized block coordinate method named Parallel Direction Method of Multipliers (PDMM) to solve optimization problems with multi-block linear constraints. PDMM randomly updates some primal blocks in parallel, behaving like parallel randomized block coordinate descent. We establish global convergence and the iteration complexity of PDMM with constant step size. We also show that PDMM can perform randomized block coordinate descent on overlapping blocks. Experimental results show that PDMM outperforms state-of-the-art methods in two applications, robust principal component analysis and overlapping group lasso.
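A minimal sketch of the update pattern described above, on a toy strongly convex quadratic: each round draws a random subset of blocks, updates them in parallel from a snapshot of the previous iterate, and then takes a damped dual step. The closed-form block update, the penalty rho, and the step size tau are illustrative assumptions, not the exact PDMM parameters or its backward correction steps.

```python
import numpy as np

# Toy multi-block problem: minimize sum_i 0.5*||x_i - c_i||^2
# subject to x_1 + ... + x_N = b.
rng = np.random.default_rng(1)
N, n, K = 4, 3, 2               # N blocks, dimension n, K blocks updated per round
rho, tau = 1.0, 0.25            # penalty and damped dual step size (assumed values)
c = [rng.standard_normal(n) for _ in range(N)]
b = rng.standard_normal(n)

x = [np.zeros(n) for _ in range(N)]
u = np.zeros(n)                 # scaled dual variable

for t in range(2000):
    picked = rng.choice(N, size=K, replace=False)    # random block subset
    snapshot = [xi.copy() for xi in x]               # Jacobi-style: read old values
    for i in picked:                                 # these updates are independent,
        r = sum(snapshot[j] for j in range(N) if j != i) - b  # hence parallelizable
        x[i] = (c[i] - rho * (r + u)) / (1.0 + rho)
    u += tau * (sum(x) - b)                          # damped dual update

print(np.linalg.norm(sum(x) - b))                    # primal residual
```

The snapshot makes the picked blocks' updates independent of one another, which is what allows them to run in parallel; the damping tau < 1 compensates for the staleness of the snapshot.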
Global Convergence of Unmodified 3-Block ADMM for a Class of Convex Minimization Problems
, 2015
Abstract
The alternating direction method of multipliers (ADMM) has been successfully applied to solve structured convex optimization problems due to its superior practical performance. The convergence properties of the 2-block ADMM have been studied extensively in the literature. Specifically, it has been proven that the 2-block ADMM globally converges for any penalty parameter γ > 0. In this sense, the 2-block ADMM allows the parameter to be free, i.e., there is no need to restrict its value to ensure convergence when implementing the algorithm. However, for the 3-block ADMM, Chen et al. [4] recently constructed a counterexample showing that it can diverge if no further condition is imposed. Existing sufficient conditions guaranteeing convergence of the 3-block ADMM usually require γ to be smaller than a certain bound, which is usually either difficult to compute or too small to yield a practical algorithm. In this paper, we show that the 3-block ADMM still globally converges with any penalty parameter γ > 0 when applied to a class of commonly encountered problems, which we call regularized least squares decomposition (RLSD); this class covers many important applications in practice.
Robust Network Compressive Sensing
Abstract
Networks constantly generate an enormous amount of rich, diverse information. Such information creates exciting opportunities for network analytics. However, a major challenge in enabling effective network analytics is the presence of missing data, measurement errors, and anomalies. Despite significant work in network analytics, fundamental issues remain: (i) existing works do not explicitly account for anomalies or measurement noise, and incur serious performance degradation under significant noise or anomalies, and (ii) they assume network matrices have low-rank structure, which may not hold in reality. To address these issues, in this paper we develop LENS decomposition, a novel technique to accurately decompose a network matrix into a low-rank matrix, a sparse anomaly matrix, an error matrix, and a small noise matrix. LENS has the following nice properties: (i) it is general: it can effectively support matrices with or without anomalies, and with or without low-rank structure, (ii) its parameters are self-tuned so that it can adapt to different types of data, (iii) it is accurate, incorporating domain knowledge such as temporal locality, spatial locality, and initial estimates (e.g., obtained from models), (iv) it is versatile and can support many applications, including missing value interpolation, prediction, and anomaly detection. We apply LENS to a wide range of network matrices from 3G, WiFi, mesh, sensor networks, and the Internet. Our results show that LENS significantly outperforms state-of-the-art compressive sensing schemes.
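To make the decomposition idea concrete, here is a hedged two-term robust-PCA-style sketch (low-rank plus sparse only, solved by 2-block ADMM); the full LENS model additionally carries error and noise terms and self-tuned parameters, which this sketch does not attempt.

```python
import numpy as np

# Two-term decomposition X = L + S: minimize ||L||_* + lam*||S||_1  s.t.  L + S = X.
rng = np.random.default_rng(0)
m = 20
L_true = np.outer(rng.standard_normal(m), rng.standard_normal(m))  # rank-1 part
S_true = np.zeros((m, m))
S_true.flat[rng.choice(m * m, size=10, replace=False)] = 5.0       # sparse anomalies
X = L_true + S_true

lam, rho = 1.0 / np.sqrt(m), 1.0    # common heuristic weight; assumed values

def svt(A, t):                      # prox of nuclear norm: singular-value thresholding
    W, s, Vt = np.linalg.svd(A, full_matrices=False)
    return W @ (np.maximum(s - t, 0.0)[:, None] * Vt)

def soft(A, t):                     # prox of l1 norm: elementwise soft thresholding
    return np.sign(A) * np.maximum(np.abs(A) - t, 0.0)

L = np.zeros_like(X); S = np.zeros_like(X); U = np.zeros_like(X)
for k in range(300):
    L = svt(X - S - U, 1.0 / rho)   # low-rank block update
    S = soft(X - L - U, lam / rho)  # sparse block update
    U += L + S - X                  # scaled dual update

print(np.linalg.norm(L + S - X))    # constraint residual
```

Because this is a 2-block convex ADMM, the constraint residual is driven to zero for any rho > 0, and on this easy synthetic instance the low-rank and sparse parts separate cleanly.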
Coordination in Security Games via ADMM
Abstract
Game-theoretic algorithms for the protection of critical infrastructure sites have been widely deployed in recent years. Important sites are protected by multiple agencies that assign their resources almost independently, but limited coordination has been shown to result in significant inefficiencies. Encouraging the agencies to coordinate requires developing mechanisms that can assign their resources simultaneously while sharing a minimal amount of sensitive information across agencies. We draw on the Alternating Direction Method of Multipliers (ADMM) — a distributed convex optimization method that plays a key role in machine learning research — to develop such coordination mechanisms for two agencies with provably optimal efficiency.