Journal
COMPUTATIONAL OPTIMIZATION AND APPLICATIONS
Volume 71, Issue 1, Pages 115-145
Publisher
SPRINGER
DOI: 10.1007/s10589-018-9987-0
Keywords
Decomposition algorithm; Big data; Support vector machines; Parallel computing
We consider the convex quadratic linearly constrained problem with bounded variables and a huge, dense Hessian matrix, which arises in many applications such as the training problem of bias support vector machines. We propose a decomposition algorithmic scheme suitable for parallel implementations and prove global convergence under suitable conditions. Focusing on support vector machine training, we outline how these assumptions can be satisfied in practice and suggest various specific implementations. Extensions of the theoretical results to general linearly constrained problems are provided. We include numerical results on support vector machines to show the viability and effectiveness of the proposed scheme.
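As a concrete instance of the problem class described above, the dual of the standard (bias) SVM training problem is a convex QP with one linear equality constraint and bounded variables; the formulation below is the textbook one and its notation is not taken from the paper itself:

```latex
\min_{\alpha \in \mathbb{R}^n} \; \frac{1}{2}\,\alpha^{\top} Q \alpha - e^{\top}\alpha
\quad \text{s.t.} \quad y^{\top}\alpha = 0, \qquad 0 \le \alpha_i \le C, \; i = 1,\dots,n,
```

where $Q_{ij} = y_i y_j K(x_i, x_j)$ is the (dense, $n \times n$) kernel Hessian, $y \in \{-1,+1\}^n$ are the labels, $e$ is the all-ones vector, and $C > 0$ is the regularization parameter. For large training sets $Q$ cannot be stored in memory, which is what motivates decomposition schemes that optimize over a small working set of variables at each iteration.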