On the Global and Linear Convergence of the Generalized Alternating Direction Method of Multipliers
The formulation min f(x)+g(y) subject to Ax+By=b arises, either naturally or after variable splitting, in many application areas such as signal processing, imaging and image processing, statistics, and machine learning. In many common problems, one of the two objective functions is strongly convex and has a Lipschitz continuous gradient. For this class of problems, a very effective approach is the alternating direction method of multipliers (ADM, also known as ADMM), which solves a sequence of f/g-decoupled subproblems. However, its effectiveness has not been matched by a provably fast rate of convergence: only sublinear rates such as O(1/k) and O(1/k^2) have recently been established in the literature, although these rates do not require strong convexity. This paper shows that global linear convergence can be guaranteed under the above assumptions of strong convexity and a Lipschitz gradient on one of the two functions, together with certain rank assumptions on A and B. The result applies to generalized ADMs that allow the subproblems to be solved faster and less exactly in certain manners. In addition, the derived rate of convergence provides some theoretical guidance for optimizing the ADM parameters.
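To make the splitting concrete, the following is a minimal sketch (not taken from the paper) of the ADMM iteration on one instance of the formulation above: f(x) = 0.5||x − c||^2 (strongly convex with Lipschitz gradient, matching the abstract's assumptions), g(y) = λ||y||_1, with A = I, B = −I, b = 0. The problem data c and λ, and the penalty parameter ρ, are illustrative choices, not values from the paper.

```python
import numpy as np

def soft_threshold(v, t):
    """Proximal operator of t*||.||_1 (componentwise shrinkage)."""
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def admm_sketch(c, lam, rho=1.0, iters=500):
    """ADMM (scaled dual form) for min 0.5||x-c||^2 + lam*||y||_1
    subject to x - y = 0, i.e. A = I, B = -I, b = 0."""
    x = np.zeros_like(c)
    y = np.zeros_like(c)
    u = np.zeros_like(c)  # scaled dual variable
    for _ in range(iters):
        # x-update: argmin_x 0.5||x - c||^2 + (rho/2)||x - y + u||^2
        x = (c + rho * (y - u)) / (1.0 + rho)
        # y-update: argmin_y lam||y||_1 + (rho/2)||x - y + u||^2
        y = soft_threshold(x + u, lam / rho)
        # dual update enforcing the constraint x - y = 0
        u = u + x - y
    return x, y

c = np.array([3.0, -0.5, 1.2])
x, y = admm_sketch(c, lam=1.0)
# The exact minimizer here is soft_threshold(c, 1.0) = [2.0, 0.0, 0.2]
```

Each subproblem involves only one of f and g, which is the f/g-decoupling the abstract refers to; on this strongly convex instance the iterates contract linearly toward the minimizer.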
Citable link to this page: https://hdl.handle.net/1911/102203
- CAAM Technical Reports