Journal
NEURAL COMPUTING & APPLICATIONS
Volume 22, Issue 3-4, Pages 559-567
Publisher
SPRINGER
DOI: 10.1007/s00521-011-0798-9
Keywords
Extreme learning machine; Generalized Hessian matrix; Newton method; Single hidden layer feedforward neural networks; Smoothing technique; Support vector regression
Funding
- Council of Scientific and Industrial Research, India
Abstract
This paper proposes an extreme learning machine (ELM) for the epsilon-insensitive error loss regression problem, formulated in 2-norm as an unconstrained optimization problem in primal variables. Since the objective function of this unconstrained problem is not twice differentiable, the popular generalized Hessian matrix and smoothing approaches are applied, leading to optimization problems whose solutions are obtained with a fast Newton-Armijo algorithm. The main advantage of the algorithm is that only a system of linear equations must be solved at each iteration. In numerical experiments on a number of interesting synthetic and real-world datasets, the results of the proposed method are compared with those of ELM using additive and radial basis function hidden nodes and of support vector regression (SVR) using a Gaussian kernel. Similar or better generalization performance on the test data, in comparable computational time to ELM and SVR, clearly illustrates its efficiency and applicability.
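To make the smoothing approach described in the abstract concrete, the sketch below trains a single-hidden-layer ELM for epsilon-insensitive regression in 2-norm, replacing the non-differentiable plus function max(x, 0) with the standard smooth surrogate (1/α)·log(1 + exp(αx)) and minimizing by a Newton method with Armijo line search. This is an illustrative reconstruction from the abstract, not the authors' code: the hidden-node count, the smoothing parameter `a`, the regularization constant `C`, and the tanh activation are all assumed choices, and the `elm_smooth_newton` function name is invented for this example.

```python
import numpy as np

def smooth_plus(x, a=5.0):
    # Smooth approximation of max(x, 0): (1/a) * log(1 + exp(a*x)),
    # written via logaddexp for numerical stability.
    return np.logaddexp(0.0, a * x) / a

def sigmoid(x):
    # Numerically stable logistic function (derivative of smooth_plus up to a).
    return 0.5 * (1.0 + np.tanh(0.5 * x))

def elm_smooth_newton(X, y, n_hidden=40, C=100.0, eps=0.01, a=5.0,
                      max_iter=50, tol=1e-6, seed=0):
    """Smoothed 2-norm eps-insensitive ELM regression, solved by
    Newton's method with Armijo line search (illustrative sketch)."""
    rng = np.random.default_rng(seed)
    W = rng.uniform(-1, 1, (X.shape[1], n_hidden))  # random input weights
    b0 = rng.uniform(-1, 1, n_hidden)               # random hidden biases
    H = np.tanh(X @ W + b0)                         # hidden-layer output matrix
    beta = np.zeros(n_hidden)                       # output weights (optimized)

    def objective(beta):
        r = H @ beta - y
        loss = smooth_plus(r - eps, a) ** 2 + smooth_plus(-r - eps, a) ** 2
        return 0.5 * beta @ beta + 0.5 * C * loss.sum()

    for _ in range(max_iter):
        r = H @ beta - y
        u, v = r - eps, -r - eps
        pu, pv = smooth_plus(u, a), smooth_plus(v, a)
        su, sv = sigmoid(a * u), sigmoid(a * v)     # first derivatives of smooth_plus
        grad = beta + C * H.T @ (pu * su - pv * sv)
        if np.linalg.norm(grad) < tol:
            break
        # Diagonal weights of the Hessian of the smoothed loss.
        d = su**2 + a * pu * su * (1 - su) + sv**2 + a * pv * sv * (1 - sv)
        Hess = np.eye(n_hidden) + C * (H.T * d) @ H
        # Key property from the abstract: one linear system per iteration.
        step = np.linalg.solve(Hess, -grad)
        t, f0 = 1.0, objective(beta)                # Armijo backtracking
        while objective(beta + t * step) > f0 + 1e-4 * t * (grad @ step) and t > 1e-8:
            t *= 0.5
        beta = beta + t * step

    return lambda Xnew: np.tanh(Xnew @ W + b0) @ beta

# Usage on a synthetic sinc dataset, a common benchmark in ELM/SVR papers.
X = np.linspace(-3, 3, 200).reshape(-1, 1)
y = np.sinc(X).ravel()
predict = elm_smooth_newton(X, y)
rmse = float(np.sqrt(np.mean((predict(X) - y) ** 2)))
```

Because the smoothed objective is strongly convex in `beta` (the identity term keeps the Hessian positive definite), each Newton step is well defined and the Armijo search guarantees descent.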