Article

On extreme learning machine for ε-insensitive regression in the primal by Newton method

Journal

NEURAL COMPUTING & APPLICATIONS
Volume 22, Issue 3-4, Pages 559-567

Publisher

SPRINGER
DOI: 10.1007/s00521-011-0798-9

Keywords

Extreme learning machine; Generalized Hessian matrix; Newton method; Single hidden layer feedforward neural networks; Smoothing technique; Support vector regression

Funding

  1. Council of Scientific and Industrial Research, India

Abstract

In this paper, an extreme learning machine (ELM) for the ε-insensitive error loss function-based regression problem, formulated in 2-norm as an unconstrained optimization problem in primal variables, is proposed. Since the objective function of this unconstrained optimization problem is not twice differentiable, the popular generalized Hessian matrix and smoothing approaches are considered, which lead to optimization problems whose solutions are determined using a fast Newton-Armijo algorithm. The main advantage of the algorithm is that at each iteration only a system of linear equations is solved. By performing numerical experiments on a number of interesting synthetic and real-world datasets, the results of the proposed method are compared with those of ELM using additive and radial basis function hidden nodes and of support vector regression (SVR) using a Gaussian kernel. The similar or better generalization performance of the proposed method on the test data, achieved in computational time comparable to that of ELM and SVR, clearly illustrates its efficiency and applicability.
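
The following Python code is a minimal sketch of the idea described in the abstract, not the authors' implementation. It assumes a sigmoid additive hidden layer, the generalized-Hessian variant of the method, and illustrative hyperparameter values (C, ε, the Armijo constant); the function names (elm_hidden, newton_armijo_elm) and the toy sinc dataset are likewise invented for illustration. It shows the key property stated above: each Newton-Armijo iteration reduces to solving a single system of linear equations in the output weights.

```python
# Illustrative sketch (not the authors' code): ELM output weights for
# eps-insensitive regression solved in the primal by Newton-Armijo,
# using the generalized Hessian of the 2-norm loss.
import numpy as np


def elm_hidden(X, W, b):
    """Random additive hidden layer with sigmoid activation."""
    return 1.0 / (1.0 + np.exp(-(X @ W + b)))


def objective(w, H, y, C, eps):
    """0.5||w||^2 + (C/2) * sum_i max(0, |h_i^T w - y_i| - eps)^2"""
    viol = np.maximum(np.abs(H @ w - y) - eps, 0.0)
    return 0.5 * (w @ w) + 0.5 * C * (viol @ viol)


def newton_armijo_elm(H, y, C=10.0, eps=0.1, tol=1e-6, max_iter=50):
    """Each Newton iteration solves one m x m linear system, followed by
    Armijo backtracking on the step length."""
    n, m = H.shape
    w = np.zeros(m)
    for _ in range(max_iter):
        r = H @ w - y
        viol = np.maximum(np.abs(r) - eps, 0.0)
        grad = w + C * H.T @ (np.sign(r) * viol)
        if np.linalg.norm(grad) < tol:
            break
        D = (viol > 0.0).astype(float)           # active eps-tube violations
        GH = np.eye(m) + C * (H.T * D) @ H       # generalized Hessian
        d = np.linalg.solve(GH, -grad)           # the linear system per step
        lam, sigma, f = 1.0, 1e-4, objective(w, H, y, C, eps)
        while objective(w + lam * d, H, y, C, eps) > f + sigma * lam * (grad @ d):
            lam *= 0.5                           # Armijo backtracking
            if lam < 1e-8:
                break
        w = w + lam * d
    return w


# Toy usage: fit a noisy sinc curve with 50 random sigmoid hidden nodes.
rng = np.random.default_rng(0)
X = rng.uniform(-3.0, 3.0, size=(200, 1))
y = np.sinc(X[:, 0]) + 0.05 * rng.standard_normal(200)
W, b = rng.standard_normal((1, 50)), rng.standard_normal(50)
H = elm_hidden(X, W, b)
w = newton_armijo_elm(H, y)
y_hat = H @ w
```

The smoothing alternative mentioned in the abstract would instead replace the max(0, ·) term with a smooth plus-function approximation before applying an ordinary Newton step; the per-iteration cost remains one linear solve in the hidden-layer dimension.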
