
### Semi-Proximal Mirror Prox: Background, Key Components, Semi-MP Algorithm, Experiments


Abstract

First-order methods for composite minimization solve $\min_{x \in X} f(x) + h(x)$, where $f$ and $h$ are convex, $f$ is smooth, and $h$ is simple. (Accelerated) proximal gradient methods apply when $h$ is proximal-friendly, i.e. the proximal operator $\mathrm{prox}_h(\eta) = \operatorname{argmin}_{x \in X} \{\tfrac{1}{2}\|x - \eta\|_2^2 + h(x)\}$ is cheap to compute. For example, when $h(x) = \|x\|_1$, it reduces to soft thresholding. The worst-case complexity bound for first-order oracles is $O(1/\sqrt{\epsilon})$. Conditional gradient methods apply when $h$ is LMO-friendly, i.e. the (composite) linear minimization oracle $\mathrm{LMO}_h(\eta) = \operatorname{argmin}_{x \in X} \{\langle \eta, x \rangle + h(x)\}$ is cheap to compute. For example, when $h(x) = \|x\|_{\mathrm{nuc}}$ or $h(x) = \delta_{\|x\|_{\mathrm{nuc}} \le 1}(x)$, it reduces to computing the top pair of singular vectors. The worst-case (also optimal) complexity bound for LMOs is $O(1/\epsilon)$.
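The two oracles above can be sketched concretely for the examples named in the abstract. The following is a minimal illustration, not code from the paper: `prox_l1` implements soft thresholding (the prox of $\lambda\|x\|_1$), and `lmo_nuclear_ball` implements the LMO for the unit nuclear-norm ball, which needs only the top singular pair of $\eta$ (a full SVD is used here purely for brevity).

```python
import numpy as np

def prox_l1(eta, lam=1.0):
    """Prox of lam*||x||_1: soft thresholding, applied entrywise."""
    return np.sign(eta) * np.maximum(np.abs(eta) - lam, 0.0)

def lmo_nuclear_ball(eta):
    """LMO for h = indicator of {X : ||X||_nuc <= 1}:
    argmin_{||X||_nuc <= 1} <eta, X> = -u1 v1^T,
    where (u1, v1) is the top singular pair of eta."""
    u, s, vt = np.linalg.svd(eta, full_matrices=False)
    return -np.outer(u[:, 0], vt[0, :])
```

For large matrices one would replace the full SVD with a top-singular-pair routine (e.g. a few power iterations), which is exactly why LMO-based methods scale to problems where the full prox of the nuclear norm (a complete SVD plus thresholding) is too expensive.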

### 5.3. FlipFlop: Fast Lasso-based Isoform Prediction as a Flow Problem
