Large-scale learning with conditional gradient algorithms
We consider convex optimization problems arising in machine learning in large-scale settings. For several important learning problems, such as noisy matrix completion or multi-class classification, state-of-the-art optimization approaches such as composite minimization (a.k.a. proximal-gradient) algorithms are difficult to apply and do not scale up to large datasets. We propose three extensions of the conditional gradient algorithm (a.k.a. the Frank-Wolfe algorithm), suitable for large-scale problems, and establish their finite-time convergence guarantees. Promising experimental results are presented on large-scale real-world datasets.
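The paper's three extensions and their guarantees are not reproduced in this abstract. For reference, the sketch below shows the classical conditional gradient (Frank-Wolfe) step over a nuclear-norm ball, the constraint relevant to matrix completion, where the linear minimization oracle reduces to a single top singular pair and thus avoids the full SVD required by proximal-gradient methods. Function names, parameters, and the step-size rule are illustrative, not taken from the paper.

```python
import numpy as np
from scipy.sparse.linalg import svds


def frank_wolfe_nuclear_ball(grad, X0, radius, n_iters=100):
    """Conditional gradient (Frank-Wolfe) over the ball ||X||_* <= radius.

    grad   : callable returning the gradient of the smooth objective at X
    X0     : feasible starting point
    radius : nuclear-norm bound
    """
    X = X0.copy()
    for t in range(n_iters):
        G = grad(X)
        # Linear minimization oracle: only the top singular pair of the
        # gradient is needed, which is what keeps each iteration cheap.
        u, _, vt = svds(G, k=1)
        S = -radius * np.outer(u[:, 0], vt[0, :])  # extreme point of the ball
        gamma = 2.0 / (t + 2.0)                    # standard step-size schedule
        X = (1.0 - gamma) * X + gamma * S
    return X


# Illustrative use: least-squares matrix completion on observed entries.
rng = np.random.default_rng(0)
m, n = 200, 150
mask = rng.random((m, n)) < 0.2                # observation pattern
Y = rng.standard_normal((m, n)) * mask         # observed (noisy) entries
grad = lambda X: (X - Y) * mask                # gradient of 0.5 * ||mask*(X - Y)||_F^2
X_hat = frank_wolfe_nuclear_ball(grad, np.zeros((m, n)), radius=10.0)
```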
