Journal
EXPERT SYSTEMS WITH APPLICATIONS
Volume 40, Issue 16, Pages 6438-6446
Publisher
PERGAMON-ELSEVIER SCIENCE LTD
DOI: 10.1016/j.eswa.2013.05.053
Keywords
Neural networks; Asymmetric activation function; Free parameter; Simulated annealing; Tabu search; BPM algorithm; LM algorithm; Time series
Funding
- CNPq
- CAPES
- FACEPE
The use of neural network models for time series forecasting has been motivated by experimental results indicating a high capacity for function approximation with good accuracy. These models generally use activation functions with fixed parameters. However, it is known that the choice of activation function strongly influences the complexity and performance of the neural network, and that only a limited number of activation functions have been used in practice. We describe the use of a family of asymmetric activation functions with a free parameter for neural networks. We prove that this family satisfies the requirements of the universal approximation theorem. We present a methodology for the global optimization of this activation function family, together with the connection weights between the processing units of the neural network. The main idea is to simultaneously optimize the weights and the activation function used in a Multilayer Perceptron (MLP), through an approach that combines the advantages of simulated annealing, tabu search, and a local learning algorithm. We chose two local learning algorithms: backpropagation with momentum (BPM) and Levenberg-Marquardt (LM). The overall purpose is to improve performance in time series forecasting. (c) 2013 Elsevier Ltd. All rights reserved.
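The core idea in the abstract, a global search that adjusts the connection weights and the activation function's free parameter simultaneously, can be illustrated with a minimal sketch. The skewed-sigmoid family, network size, annealing schedule, and toy forecasting task below are illustrative assumptions, not the paper's exact definitions, and the tabu-search and BPM/LM local-refinement stages are omitted for brevity:

```python
import numpy as np

rng = np.random.default_rng(0)

def asym_sigmoid(x, lam):
    # Hypothetical asymmetric family: reduces to the standard logistic
    # sigmoid when lam = 1; other lam values skew the curve.
    return 1.0 / (1.0 + np.exp(-x)) ** lam

def forward(theta, X, n_hidden):
    # theta packs hidden weights, hidden biases, output weights,
    # output bias, and the activation's free parameter, in that order.
    n_in = X.shape[1]
    i = 0
    W1 = theta[i:i + n_in * n_hidden].reshape(n_in, n_hidden); i += n_in * n_hidden
    b1 = theta[i:i + n_hidden]; i += n_hidden
    W2 = theta[i:i + n_hidden]; i += n_hidden
    b2 = theta[i]; i += 1
    lam = np.exp(theta[i])  # keep the free parameter positive
    H = asym_sigmoid(X @ W1 + b1, lam)
    return H @ W2 + b2

def mse(theta, X, y, n_hidden):
    return np.mean((forward(theta, X, n_hidden) - y) ** 2)

def sa_train(X, y, n_hidden=4, iters=3000, T0=1.0):
    # Simulated annealing over the weights AND the activation parameter
    # together, so the search explores both spaces at once.
    dim = X.shape[1] * n_hidden + n_hidden + n_hidden + 1 + 1
    cur = rng.normal(scale=0.5, size=dim)
    cur_err = mse(cur, X, y, n_hidden)
    best, best_err = cur.copy(), cur_err
    for t in range(iters):
        T = T0 * 0.999 ** t  # geometric cooling schedule (assumed)
        cand = cur + rng.normal(scale=0.1, size=dim)
        err = mse(cand, X, y, n_hidden)
        # Accept improvements always; accept worse moves with Boltzmann probability.
        if err < cur_err or rng.random() < np.exp((cur_err - err) / max(T, 1e-9)):
            cur, cur_err = cand, err
            if err < best_err:
                best, best_err = cand.copy(), err
    return best, best_err

# Toy one-step-ahead forecast: predict x[t] from (x[t-2], x[t-1]).
series = np.sin(np.linspace(0, 8 * np.pi, 203))
X = np.column_stack([series[:-2], series[1:-1]])
y = series[2:]
theta, err = sa_train(X, y)
```

In the paper's full hybrid, the solution found by the global search would additionally be refined by a tabu list (to avoid revisiting candidates) and a local gradient-based pass (BPM or LM); here the annealing loop alone stands in for that pipeline.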