File name: Block-Coordinate Gradient Descent method
File size: 506KB
File format: PDF
Last updated: 2015-04-05 14:41:03
block coordinate gradient descent
We consider the problem of minimizing the weighted sum of a smooth function f and a convex function P of n real variables, subject to m linear equality constraints. We propose a block-coordinate gradient descent method for solving this problem, with the coordinate block chosen by a Gauss-Southwell-q rule based on sufficient predicted descent. We establish global convergence to first-order stationarity for this method and, under a local error bound assumption, a linear rate of convergence. If f is convex with Lipschitz continuous gradient, then the method terminates in O(n^2/ε) iterations with an ε-optimal solution. If P is separable, then the Gauss-Southwell-q rule is implementable in O(n) operations when m = 1 and in O(n^2) operations when m > 1. In the special case of support vector machines training, for which f is convex quadratic, P is separable, and m = 1,
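To make the idea concrete, the Python sketch below illustrates block-coordinate gradient descent with a greedy, Gauss-Southwell-style block choice based on predicted descent, on a deliberately simplified instance: f is a least-squares term, P is an l1 penalty (separable and convex), blocks are single coordinates, and the linear equality constraints are omitted. This is an illustrative sketch under those assumptions, not the paper's algorithm; all function names and parameter values here are chosen for the example.

```python
import numpy as np

# Illustrative sketch (not the paper's exact method): minimize
#   0.5*||A x - b||^2 + lam*||x||_1
# by updating one coordinate at a time, picking the coordinate whose
# quadratic model predicts the largest descent (a Gauss-Southwell-style rule).
# Assumes every column of A is nonzero.

def soft_threshold(z, t):
    """Closed-form proximal map of t*|.| for the l1 penalty."""
    return np.sign(z) * np.maximum(np.abs(z) - t, 0.0)

def bcgd_l1(A, b, lam=0.1, max_iter=200, tol=1e-12):
    n = A.shape[1]
    x = np.zeros(n)
    L = (A ** 2).sum(axis=0)          # per-coordinate curvature constants
    r = A @ x - b                     # residual A x - b, kept up to date
    for _ in range(max_iter):
        grad = A.T @ r                # full gradient of the smooth term
        # Candidate coordinate updates and their predicted descent q_j
        x_cand = soft_threshold(x - grad / L, lam / L)
        d = x_cand - x
        q = grad * d + 0.5 * L * d ** 2 + lam * (np.abs(x_cand) - np.abs(x))
        j = int(np.argmin(q))         # coordinate with most negative predicted descent
        if q[j] > -tol:               # approximate stationarity reached
            break
        r += A[:, j] * d[j]           # refresh residual for the changed coordinate
        x[j] = x_cand[j]
    return x

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    A = rng.standard_normal((50, 20))
    b = rng.standard_normal(50)
    x = bcgd_l1(A, b, lam=0.5)
    print("objective:", 0.5 * np.linalg.norm(A @ x - b) ** 2 + 0.5 * np.abs(x).sum())
```

Because P is separable, each candidate coordinate step and its predicted descent can be evaluated in closed form, which is what makes a greedy block-selection rule of this kind cheap to implement, in line with the O(n) cost per selection stated in the abstract for the m = 1 case.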