Article
Mathematics, Applied
T. Niri, M. Heydari, M. M. Hosseini
Summary: The paper introduces two nonmonotone versions of the adaptive cubic regularization (ARC) method, which combine the ARC algorithm with nonmonotone line search methods. Global convergence is established for both iterative algorithms, and numerical examples demonstrate the efficiency and robustness of the proposed methods.
INTERNATIONAL JOURNAL OF COMPUTER MATHEMATICS
(2021)
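The ARC method named in the record above works by minimizing, at each iterate, a cubic-regularized model of the objective. As a hedged illustration (not the paper's algorithm, which adds nonmonotone line searches), here is a one-dimensional sketch: the cubic model m(s) = g·s + ½b·s² + (σ/3)|s|³ has a closed-form minimizer, and iterating it on a toy quadratic shows the characteristic shortened, regularized Newton-like step. The function name, fixed σ, and toy objective are all illustrative assumptions.

```python
import math

def arc_step_1d(g, b, sigma):
    """Minimize the scalar cubic model m(s) = g*s + 0.5*b*s**2 + (sigma/3)*|s|**3.

    Setting m'(s) = g + b*s + sigma*s*|s| = 0 and taking s with sign
    opposite to g gives a closed-form root (assumes b >= 0, sigma > 0).
    """
    if g == 0.0:
        return 0.0
    mag = (-b + math.sqrt(b * b + 4.0 * sigma * abs(g))) / (2.0 * sigma)
    return math.copysign(mag, -g)

# Toy driver: minimize f(x) = (x - 2)**2 with a fixed regularization weight.
x, sigma = 0.0, 1.0
for _ in range(50):
    g = 2.0 * (x - 2.0)   # f'(x)
    b = 2.0               # f''(x)
    x += arc_step_1d(g, b, sigma)
print(round(x, 4))  # approaches the minimizer x = 2
```

Because the cubic term shortens the Newton step, the iterates approach the minimizer monotonically from one side; full ARC additionally adapts σ based on how well the model predicts the actual decrease.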
Article
Operations Research & Management Science
T. Dehghan Niri, M. Heydari, M. M. Hosseini
Summary: The current study introduces two modified adaptive cubic regularized (ARC) methods for solving unconstrained optimization problems. These methods combine ARC with a nonmonotone Armijo-type line search and compute step lengths along different search paths. The global convergence properties of the proposed methods are analyzed, and the results demonstrate their superior performance compared to other existing algorithms.
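The nonmonotone Armijo-type line search mentioned above can be sketched in a few lines. This is the classical Grippo-Lampariello-Lucidi acceptance test, offered as a generic illustration rather than the paper's specific variant; the function name, defaults, and toy objective are assumptions.

```python
from collections import deque

def nonmonotone_armijo(f, x, d, g_dot_d, history, delta=1e-4, rho=0.5, max_backtracks=30):
    """Backtracking line search with a nonmonotone Armijo condition:
    accept the step length alpha once

        f(x + alpha * d) <= max(recent f-values) + delta * alpha * g^T d,

    where `history` holds the last few objective values. Replacing the
    max with f(x) recovers the standard (monotone) Armijo rule.
    """
    f_ref = max(history)          # reference value: worst of recent iterates
    alpha = 1.0
    for _ in range(max_backtracks):
        if f(x + alpha * d) <= f_ref + delta * alpha * g_dot_d:
            return alpha
        alpha *= rho              # shrink the trial step
    return alpha

# Usage on the 1-D quadratic f(x) = x**2 starting at x = 3 with d = -f'(x).
f = lambda x: x * x
x0, d0 = 3.0, -6.0                # d0 = -f'(3)
hist = deque([f(x0)], maxlen=5)
alpha = nonmonotone_armijo(f, x0, d0, g_dot_d=-36.0, history=hist)
```

Allowing the comparison against the worst of several recent values lets the iterates temporarily increase the objective, which is what makes such searches attractive inside regularized or trust-region frameworks.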
Article
Mathematics, Applied
Wenqing Ouyang, Jiong Tao, Andre Milzarek, Bailin Deng
Summary: Anderson acceleration (AA) is a popular method for accelerating fixed-point iterations, but it may suffer from instability and stagnation. We propose a globalization method for AA that improves stability and achieves unified global and local convergence. Unlike existing AA globalization approaches that rely on safeguarding operations and might hinder fast local convergence, our method adopts a nonmonotone trust-region framework and introduces an adaptive quadratic regularization together with a tailored acceptance mechanism. We prove global convergence and show that our algorithm attains the same local convergence as AA under appropriate assumptions. The effectiveness of our method is demonstrated in several numerical experiments.
JOURNAL OF SCIENTIFIC COMPUTING
(2023)
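For readers unfamiliar with Anderson acceleration (AA), the record above concerns a globalized version of the following basic scheme: keep the last m iterate and residual differences and combine them through a least-squares fit of the current residual. The sketch below is a bare Type-II AA(m) with none of the safeguards or trust-region machinery that the summarized paper contributes; names and defaults are illustrative.

```python
import numpy as np

def anderson(g, x0, m=3, iters=20):
    """Minimal Anderson acceleration AA(m) for a fixed-point map g.

    Solves min ||f_k - dF @ gamma|| over the last m residual differences
    and extrapolates the next iterate. No stabilization is included.
    """
    x = np.atleast_1d(np.asarray(x0, dtype=float))
    xs, fs = [x], [g(x) - x]                      # iterates and residuals
    for _ in range(iters):
        mk = min(m, len(xs) - 1)
        if mk == 0:
            x_new = xs[-1] + fs[-1]               # plain fixed-point step
        else:
            dF = np.column_stack([fs[-i] - fs[-i - 1] for i in range(1, mk + 1)])
            dX = np.column_stack([xs[-i] - xs[-i - 1] for i in range(1, mk + 1)])
            gamma, *_ = np.linalg.lstsq(dF, fs[-1], rcond=None)
            x_new = xs[-1] + fs[-1] - (dX + dF) @ gamma
        xs.append(x_new)
        fs.append(g(x_new) - x_new)
        xs, fs = xs[-(m + 1):], fs[-(m + 1):]     # keep a sliding window
    return xs[-1]

# Usage: accelerate the fixed-point iteration x = cos(x).
root = anderson(np.cos, x0=1.0)
```

The instability this bare version can exhibit on harder maps is precisely what motivates the nonmonotone trust-region globalization described in the summary.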
Article
Mathematics, Applied
Ting Zhao, Hongwei Liu, Zexian Liu
Summary: This paper proposes two new subspace minimization conjugate gradient methods based on p-regularization models, introducing new solutions by analyzing special scaled norms and deriving new directions in a two-dimensional subspace. The methods exhibit global convergence and are also analyzed for R-linear convergence. Numerical results demonstrate the superiority of the proposed methods over four existing conjugate gradient methods in the CUTEr library.
NUMERICAL ALGORITHMS
(2021)
Article
Mathematics, Applied
Gonglin Yuan, Zhan Wang, Pengyuan Li
Summary: This paper presents a nonmonotone Broyden family method for unconstrained optimization problems; the method is globally convergent and performs better than comparable algorithms in numerical experiments.
COMPUTATIONAL & APPLIED MATHEMATICS
(2022)
Article
Operations Research & Management Science
Xiaobo Li, Xianfu Wang, Manish Krishan Lal
Summary: The paper introduces a nonmonotone trust region method for unconstrained optimization problems on Riemannian manifolds, showing global convergence to first-order stationary points and establishing local R-linear, super-linear, and quadratic convergence rates. Preliminary experiments suggest the algorithm's efficiency.
JOURNAL OF OPTIMIZATION THEORY AND APPLICATIONS
(2021)
Article
Mathematics, Applied
Keke Zhang, Hongwei Liu, Zexian Liu
Summary: This paper introduces and analyzes a new adaptive subspace minimization three-term conjugate gradient algorithm, which computes search directions by minimizing a quadratic approximation of the objective function, proposes an adaptive rule for choosing among different directions, and ensures that each choice of direction satisfies a sufficient descent condition. The algorithm is shown to be globally convergent for general nonlinear functions, and numerical experiments demonstrate its efficiency.
JOURNAL OF COMPUTATIONAL MATHEMATICS
(2021)
Article
Operations Research & Management Science
A. L. Custodio, R. Garmanjani, M. Raydan
Summary: We propose a derivative-free separable quadratic modeling and cubic regularization technique for solving smooth unconstrained minimization problems. The approach builds a quadratic model by interpolation, or by a minimum Frobenius norm approach when there are not enough sample points for a complete quadratic model. This model generates the approximate gradient vector and Hessian matrix of the objective function at each iteration. We also introduce a specialized cubic regularization strategy that exploits separability to minimize the quadratic model at each iteration. Convergence results, including worst-case complexity, are discussed for reaching first-order stationary points. Preliminary numerical results are presented to demonstrate the robustness of the specialized separable cubic algorithm.
4OR-A QUARTERLY JOURNAL OF OPERATIONS RESEARCH
(2023)
Article
Mathematics, Applied
Hassan Mohammad, Kamaluddeen Umar Danmalam
Summary: This work proposes a matrix-free algorithm that combines structured spectral parameters with a nonmonotone line search to solve nonlinear least squares problems. The structured Hessian matrix associated with the Newton-like direction is approximated by a scalar multiple of the identity, where the scalar is a convex combination of the spectral parameters and approximately satisfies the structured quasi-Newton condition. Using a nonmonotone line search with appropriate conditions, the global convergence of the sequence generated by the proposed algorithm is established. Numerical results demonstrate the competitiveness of the algorithm compared to existing algorithms in the literature.
APPLIED NUMERICAL MATHEMATICS
(2023)
Article
Operations Research & Management Science
Jingyong Tang, Jinchuan Zhou, Hongchao Zhang
Summary: In this paper, an accelerated smoothing Newton method (ASNM) is proposed for solving the weighted complementarity problem (wCP) by reformulating it as a system of nonlinear equations using a smoothing function. ASNM computes an additional approximate Newton step when the iterates are close to the solution set of the nonlinear system. This additional step can be obtained at much reduced computational cost when a Lipschitz continuity condition holds on the gradient of the smoothing function at two checking points. Numerical experiments verify the local cubic convergence rate of ASNM and show its improved computational efficiency compared with some benchmark smoothing Newton methods.
JOURNAL OF OPTIMIZATION THEORY AND APPLICATIONS
(2023)
Article
Mathematics, Applied
Suthep Suantai, Pachara Jailoka, Adisak Hanjing
Summary: This study introduces an accelerated viscosity-type algorithm for solving a convex minimization problem of the sum of two convex functions in a Hilbert space. The algorithm does not require a Lipschitz continuity assumption on the gradient, and a strong convergence result is established under some control conditions. Numerical experiments demonstrate that the method is more efficient than well-known methods in the literature.
JOURNAL OF INEQUALITIES AND APPLICATIONS
(2021)
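The generic template behind splitting methods for min f(x) + g(x), as in the record above, is the forward-backward (proximal gradient) iteration. The sketch below is that plain template, not the paper's accelerated viscosity scheme (which adds inertia and avoids the fixed-step Lipschitz requirement); the step size, names, and toy problem are assumptions.

```python
import numpy as np

def forward_backward(grad_f, prox_g, x0, step=0.1, iters=200):
    """Plain forward-backward iteration for min f(x) + g(x):
    x <- prox_{step*g}(x - step * grad_f(x)).
    """
    x = np.asarray(x0, dtype=float)
    for _ in range(iters):
        x = prox_g(x - step * grad_f(x), step)
    return x

# Usage: toy problem f(x) = 0.5*(x - 3)^2, g(x) = |x|, whose prox is
# soft-thresholding; the minimizer of f + g is x = 3 - 1 = 2.
soft = lambda v, t: np.sign(v) * np.maximum(np.abs(v) - t, 0.0)
x_star = forward_backward(lambda x: x - 3.0, soft, x0=np.array([0.0]))
```

With a fixed step size this scheme needs grad_f to be Lipschitz to converge; dropping that assumption, as the summarized algorithm does, requires a line search or viscosity-type modification.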
Article
Mathematics, Applied
Yonggang Pei, Shaofang Song, Detong Zhu
Summary: In this paper, a novel filter sequential adaptive cubic regularization (ARC) algorithm is proposed for solving nonlinear equality constrained optimization. The algorithm employs composite-step and reduced Hessian methods to handle the linearized constraints, and determines the new iterate using the ARC framework together with filter methods. Global convergence of the algorithm is analyzed, and experimental results demonstrate its performance.
NUMERICAL ALGORITHMS
(2022)
Article
Mathematics, Applied
Xing Li, Wen-li Dong, Zheng Peng
Summary: A new nonmonotone trust region BB method is proposed in this paper, which combines a modified Metropolis criterion, the BB stepsize, and the trust region method. The new method uses the reciprocal of the BB stepsize to approximate the Hessian matrix of the objective function and accepts some non-improving trial points based on the modified Metropolis criterion. Preliminary numerical results show that the new method is more efficient than the existing trust region BB method.
ACTA MATHEMATICAE APPLICATAE SINICA-ENGLISH SERIES
(2021)
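The BB (Barzilai-Borwein) stepsize used in the record above has a simple closed form: alpha_k = s^T s / s^T y with s and y the last iterate and gradient differences, so 1/alpha_k is a scalar approximation of the Hessian along s. The sketch below shows only this ingredient in a plain gradient method (no Metropolis criterion or trust region); names, the small initial step, and the toy quadratic are assumptions.

```python
import numpy as np

def bb_gradient(grad, x0, iters=50):
    """Gradient descent with the (long) Barzilai-Borwein step size.

    alpha_k = s^T s / s^T y, s = x_k - x_{k-1}, y = g_k - g_{k-1};
    the reciprocal 1/alpha_k acts as a scalar Hessian approximation.
    """
    x = np.asarray(x0, dtype=float)
    g = grad(x)
    alpha = 1e-3                      # small first step before BB info exists
    for _ in range(iters):
        x_new = x - alpha * g
        g_new = grad(x_new)
        s, y = x_new - x, g_new - g
        sy = s @ y
        if sy > 1e-12:                # keep the step size positive and finite
            alpha = (s @ s) / sy
        x, g = x_new, g_new
    return x

# Usage: minimize the quadratic f(x) = 0.5 * x^T diag(1, 10) x.
xmin = bb_gradient(lambda x: np.array([1.0, 10.0]) * x, x0=np.array([1.0, 1.0]))
```

The BB iteration is famously nonmonotone in the objective, which is why it pairs naturally with the nonmonotone acceptance criteria described in the summary.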
Article
Computer Science, Interdisciplinary Applications
Abubakar Sani Halilu, Arunava Majumder, Mohammed Yusuf Waziri, Kabiru Ahmed, Aliyu Muhammed Awwal
Summary: The research proposes a new choice of the nonnegative parameter t in the Dai-Liao conjugate gradient method. By combining conjugate gradient algorithms with a Newton-method approach, a derivative-free method is presented for solving constrained monotone and general systems of nonlinear equations. The new algorithm has been successfully applied to the motion control of two-joint planar robotic manipulators.
ENGINEERING COMPUTATIONS
(2022)
Article
Operations Research & Management Science
Zexian Liu, Wangli Chu, Hongwei Liu
Summary: This paper proposes an efficient gradient method with approximately optimal step sizes for unconstrained optimization using regularization models. Numerical experiments show the promising performance and efficiency of the proposed method.
RAIRO-OPERATIONS RESEARCH
(2022)
Article
Operations Research & Management Science
Tommaso Bianconcini, Giampaolo Liuzzi, Benedetta Morini, Marco Sciandrone
COMPUTATIONAL OPTIMIZATION AND APPLICATIONS
(2015)
Article
Engineering, Civil
Luca Kubin, Tommaso Bianconcini, Douglas Coimbra de Andrade, Matteo Simoncini, Leonardo Taccari, Francesco Sambo
Summary: The article introduces a novel deep learning method for detecting vehicle accidents from on-board sensor data, which outperforms other approaches in accuracy and can be easily deployed on embedded devices. The method combines a tailored neural architecture, multimodal self-supervised training, and data augmentation techniques to improve generalization and counteract extreme class imbalance.
IEEE TRANSACTIONS ON INTELLIGENT TRANSPORTATION SYSTEMS
(2022)
Article
Operations Research & Management Science
Tommaso Bianconcini, David Di Lorenzo, Alessandro Lori, Fabio Schoen, Leonardo Taccari
EURO JOURNAL ON TRANSPORTATION AND LOGISTICS
(2018)