Article
Automation & Control Systems
Lu Zhang, Bob Carpenter, Andrew Gelman, Aki Vehtari
Summary: Pathfinder is a variational method for approximately sampling from differentiable probability densities. It locates normal approximations to the target density along a quasi-Newton optimization path and returns draws from the approximation with the lowest estimated KL divergence to the target. Compared with automatic differentiation variational inference and short MCMC warm-up runs, Pathfinder requires substantially fewer log-density and gradient evaluations while delivering comparable or better approximations.
JOURNAL OF MACHINE LEARNING RESEARCH
(2022)
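A minimal sketch of the idea described above, assuming a generic differentiable log density `logp` with gradient `grad`. The real Pathfinder reuses the L-BFGS inverse-Hessian factors for the covariance of each normal approximation; the finite-difference diagonal curvature below is only a crude stand-in.

```python
import numpy as np
from scipy.optimize import minimize

def elbo(logp, mu, sigma, rng, n_mc=64):
    # Monte Carlo ELBO of q = N(mu, diag(sigma^2)); maximizing the ELBO
    # minimizes KL(q || p) up to the unknown normalizing constant
    z = mu + sigma * rng.standard_normal((n_mc, mu.size))
    logq = -0.5 * np.sum(((z - mu) / sigma) ** 2
                         + np.log(2.0 * np.pi * sigma ** 2), axis=1)
    return np.mean(np.array([logp(zi) for zi in z]) - logq)

def pathfinder_sketch(logp, grad, x0, n_draws=100, seed=0, eps=1e-5):
    rng = np.random.default_rng(seed)
    path = [np.asarray(x0, float)]
    minimize(lambda x: -logp(x), x0, jac=lambda x: -grad(x),
             method="L-BFGS-B", callback=lambda xk: path.append(np.array(xk)))
    best_elbo, best_q = -np.inf, None
    for mu in path:
        g0 = grad(mu)
        # finite-difference diagonal curvature: a crude stand-in for the
        # paper's L-BFGS-based inverse-Hessian (covariance) estimate
        curv = np.array([(grad(mu + eps * ei) - g0)[i]
                         for i, ei in enumerate(np.eye(mu.size))]) / eps
        sigma = 1.0 / np.sqrt(np.maximum(-curv, 1e-8))
        value = elbo(logp, mu, sigma, rng)
        if value > best_elbo:
            best_elbo, best_q = value, (mu, sigma)
    mu, sigma = best_q
    return mu + sigma * rng.standard_normal((n_draws, mu.size))

# e.g. a standard normal target:
draws = pathfinder_sketch(lambda x: -0.5 * (x @ x), lambda x: -x, np.ones(3))
```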
Article
Automation & Control Systems
Haishan Ye, Luo Luo, Zhihua Zhang
Summary: The paper proposes a unified framework for analyzing the local and global convergence properties of second-order methods, bridging the gap between current convergence theory and the empirical performance observed in real applications.
JOURNAL OF MACHINE LEARNING RESEARCH
(2021)
Article
Operations Research & Management Science
Matteo Lapucci, Pierluigi Mansueto
Summary: This paper introduces a limited-memory quasi-Newton method for unconstrained multi-objective optimization. The algorithm approximates a convex combination of the Hessian matrices of the objectives by a positive definite matrix. It is shown to be well defined and to converge globally, at an R-linear rate, to Pareto optimality for twice continuously differentiable, strongly convex problems. Experimental results show that the proposed approach outperforms state-of-the-art Newton and quasi-Newton methods in most cases, making it an efficient tool for Pareto front approximation within a global optimization framework.
COMPUTATIONAL OPTIMIZATION AND APPLICATIONS
(2023)
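For orientation, a single BFGS update of a positive definite matrix approximating the convex combination of objective Hessians might look like the sketch below; the paper's limited-memory form, weight selection, and safeguards are not reproduced here, and the curvature pair shown in the comment is a hypothetical illustration.

```python
import numpy as np

def bfgs_update(B, s, y):
    # standard BFGS update; B stays positive definite whenever y @ s > 0
    Bs = B @ s
    return B - np.outer(Bs, Bs) / (s @ Bs) + np.outer(y, y) / (y @ s)

# hypothetical curvature pair for objectives f_1..f_m with weights lam
# summing to one:
#   y = sum(lam[j] * (grad_new[j] - grad_old[j]) for j in range(m))
```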
Article
Automation & Control Systems
Sifan Liu, Art B. Owen
Summary: The study demonstrates that randomized quasi-Monte Carlo (RQMC) sampling can improve optimization in machine learning problems, especially where second-order methods are ineffective. When RQMC yields gradient estimates with lower root mean squared error than plain Monte Carlo, the resulting optimization is correspondingly more accurate.
JOURNAL OF MACHINE LEARNING RESEARCH
(2021)
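A toy comparison of plain Monte Carlo against scrambled Sobol' points using `scipy.stats.qmc`; the integrand and sample sizes here are illustrative choices, not taken from the paper.

```python
import numpy as np
from scipy.stats import qmc

d, n = 4, 256  # n a power of two, as Sobol' sequences prefer
g = lambda u: np.prod(np.exp(u), axis=1)   # toy integrand on [0,1]^d
truth = (np.e - 1.0) ** d                  # exact value of E[g(U)]

u_mc = np.random.default_rng(0).random((n, d))            # plain Monte Carlo
u_rqmc = qmc.Sobol(d=d, scramble=True, seed=0).random(n)  # scrambled Sobol'

print("MC   error:", abs(g(u_mc).mean() - truth))
print("RQMC error:", abs(g(u_rqmc).mean() - truth))
```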
Article
Engineering, Electrical & Electronic
Ting-Jui Chang, Shahin Shahrampour
Summary: In this paper, the authors explore large-scale finite-sum minimization over a reproducing kernel Hilbert space (RKHS) using kernel approximation to speed up the Newton method. They provide a novel second-order algorithm with local superlinear convergence and global linear convergence, and derive the theoretical lower bound for the number of random features required for the approximated Hessian.
IEEE TRANSACTIONS ON SIGNAL PROCESSING
(2022)
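A toy sketch of the general idea, assuming random Fourier features for a Gaussian (RBF) kernel and a ridge-regression objective; the paper's actual algorithm, feature counts, and convergence machinery are not reproduced.

```python
import numpy as np

rng = np.random.default_rng(0)
n, d, m, lam = 500, 5, 100, 1e-2
X = rng.standard_normal((n, d))
y = np.sin(X[:, 0]) + 0.1 * rng.standard_normal(n)

# random Fourier features approximating a Gaussian (RBF) kernel
W = rng.standard_normal((d, m))
b = rng.uniform(0.0, 2.0 * np.pi, m)
Phi = np.sqrt(2.0 / m) * np.cos(X @ W + b)

# ridge regression in feature space; one Newton step from w = 0 is exact
# because the objective is quadratic
w = np.zeros(m)
g = Phi.T @ (Phi @ w - y) / n + lam * w   # gradient at w
H = Phi.T @ Phi / n + lam * np.eye(m)     # Hessian built from random features
w = w - np.linalg.solve(H, g)
```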
Article
Mathematics, Applied
Wah June Leong, Sharareh Enshaei, Sie Long Kek
Summary: This paper introduces a class of low-memory quasi-Newton methods with standard backtracking line search for large-scale unconstrained minimization. The methods are derived through a least-change updating technique; convergence properties are established for specific members of the class, and sufficient conditions for superlinear convergence are given. Numerical results demonstrate the utility of these methods in large-scale minimization.
NUMERICAL ALGORITHMS
(2021)
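The standard backtracking (Armijo) line search the summary refers to can be sketched as follows; the parameter values are conventional defaults, not the paper's.

```python
import numpy as np

def backtracking(f, g, x, d, alpha=1.0, rho=0.5, c=1e-4):
    # standard Armijo backtracking: shrink alpha until sufficient decrease
    fx, slope = f(x), c * (g(x) @ d)
    while f(x + alpha * d) > fx + alpha * slope:
        alpha *= rho
    return alpha

# e.g. a steepest-descent step on f(x) = x.x
f, g = lambda x: x @ x, lambda x: 2 * x
x = np.ones(3)
step = backtracking(f, g, x, -g(x))
```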
Article
Operations Research & Management Science
Donghui Li, Xiaozhou Wang, Jiajian Huang
Summary: This paper introduces two diagonal BFGS-type updates and proves their global convergence under appropriate conditions when minimizing convex or non-convex functions. Additionally, the diagonal quasi-Newton method with inverse diagonal BFGS update can achieve superlinear convergence if the function to be minimized is uniformly convex and completely separable. Furthermore, the efficiency of the L-BFGS methods with diagonal BFGS updates is demonstrated through numerical results.
COMPUTATIONAL OPTIMIZATION AND APPLICATIONS
(2022)
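One natural diagonal analogue of the BFGS update, shown purely for illustration; the paper's two updates and their safeguards may differ from this formula.

```python
import numpy as np

def diag_bfgs_update(D, s, y):
    # the diagonal of the usual BFGS update when B = diag(D);
    # positivity-preserving whenever y @ s > 0
    Ds = D * s
    return D - Ds ** 2 / (s @ Ds) + y ** 2 / (y @ s)
```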
Article
Computer Science, Software Engineering
Christian Kanzow, Daniel Steck
Summary: This paper discusses regularized Newton methods, which are flexible unconstrained optimization algorithms that compete with line search and trust region methods and potentially combine their attractive elements. The focus is on combining regularization with limited memory quasi-Newton methods by exploiting their special structure. The paper shows the global convergence of regularization methods under mild assumptions and discusses the details of regularized limited memory quasi-Newton updates, including their compact representations. Numerical results using large-scale test problems indicate that the regularized version of L-BFGS is competitive with state-of-the-art line search and trust-region L-BFGS algorithms and may outperform some of them, especially when dealing with nonmonotonicity.
MATHEMATICAL PROGRAMMING COMPUTATION
(2023)
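The core regularized step can be sketched as below, assuming a dense quasi-Newton matrix B for simplicity; the paper's contribution is performing this solve cheaply through the compact limited-memory representation. In practice mu is adapted like a trust-region radius, based on the ratio of actual to predicted reduction.

```python
import numpy as np

def regularized_step(B, g, mu):
    # d solves (B + mu*I) d = -g: mu -> 0 recovers the quasi-Newton step,
    # large mu bends d toward steepest descent (trust-region-like behaviour)
    return np.linalg.solve(B + mu * np.eye(len(g)), -g)
```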
Article
Geosciences, Multidisciplinary
Meng-Xue Dai, Bing-Shou He, Wei-Sen Huang
Summary: Full waveform inversion (FWI) is a nonlinear optimization problem that seeks quantitative information about subsurface structure by minimizing the difference between observed seismic data and the predicted wavefield. The limited-memory Broyden-Fletcher-Goldfarb-Shanno (L-BFGS) method, a quasi-Newton method, is efficient in FWI because of its low computation and storage requirements. However, the conventional L-BFGS method neglects the available objective-function values; the paper presents a modified L-BFGS method that incorporates this information to achieve higher-order accuracy in approximating the Hessian matrix. Numerical results show that the modified method has a higher convergence rate, better inversion results, and stronger noise robustness than the conventional L-BFGS method.
FRONTIERS IN EARTH SCIENCE
(2023)
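For reference, the conventional L-BFGS direction that the modified method builds on is computed by the classical two-loop recursion; the paper's modification of the curvature pairs is not shown here.

```python
import numpy as np

def lbfgs_direction(grad, s_list, y_list):
    # classical two-loop recursion: returns -H_k @ grad, with H_k the
    # L-BFGS inverse-Hessian approximation built from stored (s, y) pairs
    q, alphas = grad.copy(), []
    for s, y in zip(reversed(s_list), reversed(y_list)):
        a = (s @ q) / (y @ s)
        alphas.append(a)
        q = q - a * y
    if s_list:  # initial scaling gamma_k * I
        s, y = s_list[-1], y_list[-1]
        q = q * (s @ y) / (y @ y)
    for (s, y), a in zip(zip(s_list, y_list), reversed(alphas)):
        b = (y @ q) / (y @ s)
        q = q + (a - b) * s
    return -q
```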
Article
Operations Research & Management Science
David Ek, Anders Forsgren
Summary: This paper focuses on exact linesearch methods for minimizing a quadratic function with a positive definite Hessian. It introduces a class of limited-memory quasi-Newton Hessian approximations, showing their finite termination in exact arithmetic and performance in finite precision arithmetic through numerical simulations. It also provides a compact representation of Hessian approximations for general unconstrained optimization problems in the full Broyden class.
COMPUTATIONAL OPTIMIZATION AND APPLICATIONS
(2021)
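On a quadratic with positive definite Hessian H, the exact linesearch step along a descent direction d has the closed form used throughout such analyses:

```python
import numpy as np

def exact_step(H, g, d):
    # exact minimizer of the quadratic along d, where g is the gradient
    # at the current point:  alpha* = -(g @ d) / (d @ H @ d)
    return -(g @ d) / (d @ H @ d)
```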
Article
Engineering, Electrical & Electronic
Jiaojiao Zhang, Huikang Liu, Anthony Man-Cho So, Qing Ling
Summary: This work investigates stochastic quasi-Newton methods for minimizing a finite sum of cost functions over a decentralized network. A general algorithmic framework is developed where each node constructs a local, inexact quasi-Newton direction that approaches the global, exact one asymptotically. Two fully decentralized stochastic quasi-Newton methods are designed to construct Hessian inverse approximations with uniformly bounded positive eigenvalues, without requiring extra sampling or communication. Numerical results demonstrate that the proposed methods are much faster than existing decentralized stochastic first-order algorithms.
IEEE TRANSACTIONS ON SIGNAL PROCESSING
(2023)
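A heavily simplified sketch of the decentralized structure, assuming a ring of four nodes, a doubly stochastic mixing matrix W, and per-node diagonal inverse-Hessian surrogates with clipped eigenvalues; the paper's actual constructions are considerably more sophisticated.

```python
import numpy as np

# doubly stochastic mixing matrix for four nodes on a ring
W = np.array([[.50, .25, .00, .25],
              [.25, .50, .25, .00],
              [.00, .25, .50, .25],
              [.25, .00, .25, .50]])

def decentralized_qn_step(X, local_grads, Hinv_diag, lr=0.1):
    # X: (nodes, dim) stacked local iterates; each node averages its
    # neighbours, then scales its local stochastic gradient by a diagonal
    # inverse-Hessian surrogate with eigenvalues clipped for stability
    Hinv = np.clip(Hinv_diag, 0.1, 10.0)
    return W @ X - lr * Hinv * local_grads
```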
Article
Multidisciplinary Sciences
Hanger Liu, Yan Li, Maojun Zhang
Summary: This paper presents a stochastic quasi-Newton algorithm for nonconvex stochastic optimization, derived from a classical modified BFGS formula. The update formula extends naturally to the limited-memory setting. Numerical experiments on machine learning problems demonstrate the promise of the proposed algorithm.
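One classical modification in this family (in the style of Li and Fukushima) replaces y by a corrected vector so the curvature condition holds even for nonconvex objectives; whether this matches the paper's exact formula is an assumption.

```python
import numpy as np

def modified_y(s, y, gnorm, C=1e-2):
    # correction y_bar = y + t*s with t chosen so that
    # y_bar @ s >= C * gnorm * (s @ s) > 0, even when y @ s <= 0
    t = C * gnorm + max(0.0, -(y @ s) / (s @ s))
    return y + t * s
```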
Article
Computer Science, Software Engineering
Nicolas Boutet, Joris Degroote, Rob Haelterman
Summary: This paper investigates how to estimate the Jacobian matrix in quasi-Newton methods as closely as possible to the true matrix. The authors propose a method that combines multi-secant and symmetric updates into a single formula; its novelty lies in grouping and ordering the secant equations by their relative importance. The method applies to a range of problems that give rise to multiple secant equations.
OPTIMIZATION METHODS & SOFTWARE
(2022)
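A generic multi-secant Broyden-type update that satisfies several secant equations at once is sketched below; the paper's contribution, combining this with symmetry and an ordering of the equations, is not reproduced.

```python
import numpy as np

def multisecant_update(B, S, Y):
    # generalized Broyden update: B_new @ S = Y holds for all stored secant
    # pairs when S has full column rank (columns of S = steps, Y = gradient
    # or residual differences)
    return B + (Y - B @ S) @ np.linalg.pinv(S)
```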
Article
Mathematics, Applied
Wei Zhou, Jianrong Wu
Summary: This paper establishes a version of the Ekeland variational principle in fuzzy quasi-normed spaces and applies it to derive Caristi's fixed point theorem and Takahashi's minimization theorem in that setting. The equivalence among these theorems is also proved.
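For reference, the classical metric-space statement that the fuzzy quasi-normed version generalizes (a standard textbook formulation, not quoted from the paper):

```latex
% Classical Ekeland variational principle (metric-space form)
Let $(X,d)$ be a complete metric space and let
$f\colon X \to \mathbb{R}\cup\{+\infty\}$ be proper, lower semicontinuous,
and bounded below. Given $\varepsilon>0$ and $x_0\in X$ with
$f(x_0)\le \inf_X f+\varepsilon$, for every $\lambda>0$ there exists
$\bar x\in X$ such that
\[
  f(\bar x)\le f(x_0),\qquad d(\bar x,x_0)\le\lambda,\qquad
  f(x) > f(\bar x)-\tfrac{\varepsilon}{\lambda}\,d(x,\bar x)
  \quad\text{for all }x\ne\bar x .
\]
```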
Article
Mathematics
Xinsheng Wang, Weisheng Wu, Yujun Zhu
Summary: This paper investigates the local entropy and preimage structure for noninvertible maps. Various relative and local versions of preimage entropies are introduced and studied for a topological dynamical system and its factor. The relationships between these quantities are discussed, and variational principles are established. Furthermore, it is shown that each local version of these relative preimage entropies coincides with its corresponding global version for a system with uniform separation of preimages.
JOURNAL OF DIFFERENTIAL EQUATIONS
(2022)
Article
Mathematics
Wah June Leong
BULLETIN OF THE MALAYSIAN MATHEMATICAL SCIENCES SOCIETY
(2016)
Article
Environmental Sciences
Chee Kong Yap, Amiruddin Jusoh, Wah June Leong, Ali Karami, Ghim Hock Ong
ENVIRONMENTAL MONITORING AND ASSESSMENT
(2015)
Article
Engineering, Multidisciplinary
Fakhreddin Abedi, Wah June Leong, Mohammad Abedi
MATHEMATICAL PROBLEMS IN ENGINEERING
(2015)
Article
Astronomy & Astrophysics
A. Kazemi Nasab, A. Kilicman, Z. Pashazadeh Atabakan, W. J. Leong
Article
Computer Science, Theory & Methods
Shamsollah Ghanbari, Mohamed Othman, Mohd Rizam Abu Bakar, Wah June Leong
FUTURE GENERATION COMPUTER SYSTEMS-THE INTERNATIONAL JOURNAL OF ESCIENCE
(2016)
Article
Mathematics, Applied
Ibrahim Arzuka, Mohd R. Abu Bakar, Wah June Leong
JOURNAL OF INEQUALITIES AND APPLICATIONS
(2016)
Article
Automation & Control Systems
Fakhreddin Abedi, Wah June Leong
EUROPEAN JOURNAL OF CONTROL
(2017)
Article
Mathematics, Applied
Mohd Asrul Hery Ibrahim, Mustafa Mamat, Wah June Leong
ABSTRACT AND APPLIED ANALYSIS
(2014)
Article
Multidisciplinary Sciences
Farzin Modarres Khiyabani, Wah June Leong
ARABIAN JOURNAL FOR SCIENCE AND ENGINEERING
(2014)
Article
Operations Research & Management Science
Aliyu Usman Moyi, Wah June Leong
Article
Operations Research & Management Science
Sharareh Enshaei, Mahboubeh Farid, Wah June Leong, S. Mohsen Hashemi Ardestani
Article
Operations Research & Management Science
Chuei Yee Chen, Abdul Hakim Ghazali, Wah June Leong
Summary: The study introduces a scaling function into some Weierstrass-like parallel iterative methods to reduce their dependence on the generated midpoints, yielding a more efficient search for zeros and narrower enclosing intervals. Tests on 120 problems show that the proposed procedures outperform the original methods, reaching smaller final interval widths in fewer iterations.
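The classical (non-interval) Weierstrass/Durand-Kerner simultaneous iteration underlying these methods, as a point sketch; the paper works with interval variants and a scaling function not shown here.

```python
import numpy as np

def weierstrass_step(z, coeffs):
    # one Durand-Kerner / Weierstrass sweep: every root estimate of the
    # monic polynomial is corrected simultaneously using all the others
    z_new = z.copy()
    for i in range(len(z)):
        denom = np.prod(z[i] - np.delete(z, i))
        z_new[i] = z[i] - np.polyval(coeffs, z[i]) / denom
    return z_new

z = np.array([0.4 + 0.9j, -1.0 + 0.4j, 0.3 - 1.1j])  # rough starting points
for _ in range(20):
    z = weierstrass_step(z, [1, 0, 0, -1])            # roots of z^3 - 1
```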
Article
Automation & Control Systems
Gillian Yi Han Woo, Hong Seng Sim, Yong Kheng Goh, Wah June Leong
Summary: In this research, the ℓ0-norm sparse optimization problem is tackled by treating an underdetermined linear system as a constraint and applying Lagrangian and proximal variable-metric methods. The approach reduces memory requirements by approximating the full Hessian matrix with a diagonal matrix. The proposed method is compared against existing proximal gradient methods on simulated and real-world datasets and is shown to be more robust and stable at finding sparse solutions.
JOURNAL OF THE FRANKLIN INSTITUTE-ENGINEERING AND APPLIED MATHEMATICS
(2023)
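The proximal step for the ℓ0 penalty under a diagonal metric has a closed form (hard thresholding), sketched here as a simple diagonal variable-metric proximal gradient step; the paper's exact algorithm may differ.

```python
import numpy as np

def prox_l0_diag(v, D, lam):
    # exact prox of lam*||z||_0 under the diagonal positive metric D:
    # coordinate v_i is kept only when 0.5 * D_i * v_i**2 > lam, since
    # that is the cost of setting it to zero instead
    return np.where(0.5 * D * v ** 2 > lam, v, 0.0)

def sparse_step(x, g, D, lam):
    # one diagonal variable-metric proximal gradient step, g = gradient at x
    return prox_l0_diag(x - g / D, D, lam)
```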
Proceedings Paper
Mathematics, Applied
Hong Seng Sim, Wah June Leong, Chuei Yee Chen, Siti Nur Iqmal Ibrahim
ADVANCES IN INDUSTRIAL AND APPLIED MATHEMATICS
(2016)
Article
Multidisciplinary Sciences
Choong Boon Ng, Wah June Leong, Mansor Monsi
PERTANIKA JOURNAL OF SCIENCE AND TECHNOLOGY
(2014)