Journal
NEUROCOMPUTING
Volume 128, Issue -, Pages 4-14
Publisher
ELSEVIER
DOI: 10.1016/j.neucom.2013.03.051
Keywords
Extreme learning machine; Dual exterior penalty problem; Feedforward neural networks; Linear programming problem; Newton method
Abstract
In this paper, a novel 1-norm extreme learning machine (ELM) for regression and multiclass classification is proposed as a linear programming problem whose solution is obtained by solving its dual exterior penalty problem as an unconstrained minimization problem using a fast Newton method. The algorithm converges from any starting point and can be easily implemented in MATLAB. The main advantage of the proposed approach is that it leads to a sparse model representation: many components of the optimal solution vector become zero, so the decision function can be determined using far fewer hidden nodes than ELM requires. Numerical experiments were performed on a number of real-world benchmark datasets, and the results are compared with ELM using additive and radial basis function (RBF) hidden nodes, optimally pruned ELM (OP-ELM), and support vector machine (SVM) methods. Similar or better generalization performance of the proposed method on test data relative to ELM, OP-ELM, and SVM clearly illustrates its applicability and usefulness. (C) 2013 Elsevier B.V. All rights reserved.
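For context, the standard ELM baseline the abstract compares against trains only the output-layer weights: input weights and biases are drawn at random, and the output weights are obtained in closed form by least squares. Below is a minimal sketch of this baseline with additive sigmoid hidden nodes (all function names and the toy sine-regression setup are illustrative assumptions, not from the paper; the paper's own method instead solves a 1-norm linear program via a Newton method on its dual exterior penalty problem, which this sketch does not implement).

```python
import numpy as np

def elm_fit(X, y, n_hidden=30, seed=0):
    """Sketch of a basic ELM regressor with sigmoid hidden nodes."""
    rng = np.random.default_rng(seed)
    # Random input weights and biases -- never trained in ELM
    W = rng.standard_normal((X.shape[1], n_hidden))
    b = rng.standard_normal(n_hidden)
    H = 1.0 / (1.0 + np.exp(-(X @ W + b)))  # hidden-layer output matrix
    # Output weights via the Moore-Penrose pseudoinverse (least squares)
    beta = np.linalg.pinv(H) @ y
    return W, b, beta

def elm_predict(X, W, b, beta):
    H = 1.0 / (1.0 + np.exp(-(X @ W + b)))
    return H @ beta

# Toy usage: regress y = sin(x) on [0, pi]
X = np.linspace(0, np.pi, 200).reshape(-1, 1)
y = np.sin(X).ravel()
W, b, beta = elm_fit(X, y)
pred = elm_predict(X, W, b, beta)
```

Note that `beta` here is generally dense; the sparsity advantage claimed in the abstract comes from replacing this least-squares step with a 1-norm formulation, which drives many components of the solution vector to zero.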