Block Stochastic Gradient Iteration for Convex and Nonconvex Optimization
The stochastic gradient (SG) method can quickly minimize, to moderate accuracy, an objective function composed of a large number of differentiable functions, or solve a stochastic optimization problem. The block ...
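As a minimal sketch of the SG method on a finite-sum objective, the snippet below runs SGD on a least-squares problem f(x) = (1/n) Σᵢ (aᵢᵀx − bᵢ)²; the data, step size, and iteration count are illustrative assumptions, not from the paper.

```python
import numpy as np

# Illustrative SGD on a noiseless least-squares finite sum.
rng = np.random.default_rng(0)
n, d = 200, 5
A = rng.standard_normal((n, d))
x_true = rng.standard_normal(d)
b = A @ x_true

x = np.zeros(d)
step = 0.01
for t in range(2000):
    i = rng.integers(n)                    # sample one component function f_i
    grad = 2.0 * (A[i] @ x - b[i]) * A[i]  # gradient of f_i at the current x
    x -= step * grad                       # stochastic gradient step
```

Each step uses a single sampled gradient, which is why SG reaches moderate accuracy quickly even when n is large.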
Learning Circulant Sensing Kernels
In signal acquisition, Toeplitz and circulant matrices are widely used as sensing operators. They correspond to discrete convolutions and are easily or even naturally realized in various applications. For compressive ...
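Because a circulant matrix implements a circular convolution, applying it reduces to FFTs, which is what makes these sensing operators easy to realize. A minimal check of that equivalence (sizes and data are illustrative):

```python
import numpy as np

rng = np.random.default_rng(0)
n = 8
c = rng.standard_normal(n)   # first column of the circulant matrix
x = rng.standard_normal(n)   # signal to sense

# Explicit circulant matrix: C[i, j] = c[(i - j) % n]
C = np.array([[c[(i - j) % n] for j in range(n)] for i in range(n)])

# Same product via the circular convolution theorem, in O(n log n)
y_fft = np.fft.ifft(np.fft.fft(c) * np.fft.fft(x)).real

print(np.allclose(C @ x, y_fft))  # True
```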
Low-Rank Matrix Recovery using Unconstrained Smoothed-Lq Minimization
A low-rank matrix can be recovered from a small number of its linear measurements. As a special case, the matrix completion problem aims to recover the matrix from a subset of its entries. Such problems share many common ...
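To make the matrix completion setup concrete, the sketch below recovers an exactly rank-r matrix from a random subset of entries by simple alternating projection (enforce observed entries, then truncate the SVD to rank r). This is an illustrative baseline, not the paper's smoothed-Lq method; the sizes and sampling rate are assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)
m, n, r = 20, 15, 2
M = rng.standard_normal((m, r)) @ rng.standard_normal((r, n))  # rank-r target
mask = rng.random((m, n)) < 0.6                                # observed entries

X = np.zeros((m, n))
for _ in range(200):
    X[mask] = M[mask]                    # agree with the observed entries
    U, s, Vt = np.linalg.svd(X, full_matrices=False)
    X = (U[:, :r] * s[:r]) @ Vt[:r]      # project onto rank-r matrices
```

The recovery is possible because a rank-r matrix has only r(m + n − r) degrees of freedom, far fewer than mn entries.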
An Alternating Direction Algorithm for Matrix Completion with Nonnegative Factors
This paper introduces a novel algorithm for the nonnegative matrix factorization and completion problem, which aims to find nonnegative matrices X and Y from a subset of entries of a nonnegative matrix M so that XY approximates ...
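The problem setup can be sketched as follows: given observed entries of a nonnegative M, seek nonnegative X and Y with XY close to M on those entries. The snippet uses masked Lee-Seung multiplicative updates as a simple stand-in, not the paper's alternating direction algorithm; sizes, sampling rate, and iteration count are assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)
m, n, r = 30, 20, 3
M = rng.random((m, r)) @ rng.random((r, n))  # nonnegative, rank-r target
W = (rng.random((m, n)) < 0.7).astype(float)  # 1 on observed entries

X = rng.random((m, r))
Y = rng.random((r, n))
for _ in range(300):
    # weighted multiplicative updates; divisions are elementwise and the
    # updates preserve nonnegativity of X and Y
    Y *= (X.T @ (W * M)) / (X.T @ (W * (X @ Y)) + 1e-12)
    X *= ((W * M) @ Y.T) / ((W * (X @ Y)) @ Y.T + 1e-12)

err = np.linalg.norm(W * (X @ Y - M)) / np.linalg.norm(W * M)
```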
A Block Coordinate Descent Method for Multi-Convex Optimization with Applications to Nonnegative Tensor Factorization and Completion
This paper considers block multi-convex optimization, where the feasible set and objective function are generally non-convex but convex in each block of variables. We review some of its interesting examples and propose a ...
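A minimal instance of block multi-convexity: f(u, v) = ||M − uvᵀ||² is nonconvex in (u, v) jointly but is a least-squares problem in each block, so block coordinate descent can minimize each block exactly in closed form. The rank-1 target and sizes below are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)
M = np.outer(rng.standard_normal(10), rng.standard_normal(8))  # rank-1 target

u = rng.standard_normal(10)
v = rng.standard_normal(8)
for _ in range(50):
    u = M @ v / (v @ v)    # exact minimizer of f over the u-block
    v = M.T @ u / (u @ u)  # exact minimizer of f over the v-block

print(np.linalg.norm(M - np.outer(u, v)))  # essentially zero
```

Cycling through exact block minimizations is the basic pattern that the paper's method generalizes to factorizations with more blocks and constraints.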