A scalable surrogate L0 sparse regression method for generalized linear models with applications to large scale data
Published 2020
Keywords
Generalized linear models, High dimensional massive sample size data, L0 penalty, Ridge regression, Variable selection
Journal
JOURNAL OF STATISTICAL PLANNING AND INFERENCE
Volume 213, Pages 262-281
Publisher
Elsevier BV
Online
2020-12-18
DOI
10.1016/j.jspi.2020.12.001
References
Related references
Note: Only some of the references are listed.
- Empirical confidence interval calibration for population-level effect estimation studies in observational healthcare data
- (2018) Martijn J. Schuemie et al. PROCEEDINGS OF THE NATIONAL ACADEMY OF SCIENCES OF THE UNITED STATES OF AMERICA
- Broken adaptive ridge regression and its asymptotic properties
- (2018) Linlin Dai et al. JOURNAL OF MULTIVARIATE ANALYSIS
- Exact post-selection inference, with application to the lasso
- (2016) Jason D. Lee et al. ANNALS OF STATISTICS
- Efficient Regularized Regression with L0 Penalty for Variable Selection and Network Construction
- (2016) Zhenqiu Liu et al. Computational and Mathematical Methods in Medicine
- Adaptive conditional feature screening
- (2016) Lu Lin et al. COMPUTATIONAL STATISTICS & DATA ANALYSIS
- An Adaptive Ridge Procedure for L0 Regularization
- (2016) Florian Frommlet et al. PLoS One
- Regularization Paths for Cox's Proportional Hazards Model via Coordinate Descent
- (2015) Noah Simon et al. Journal of Statistical Software
- Strong oracle optimality of folded concave penalized estimation
- (2014) Jianqing Fan et al. ANNALS OF STATISTICS
- A significance test for the lasso
- (2014) Richard Lockhart et al. ANNALS OF STATISTICS
- Feature Selection for Varying Coefficient Models With Ultrahigh-Dimensional Covariates
- (2014) Jingyuan Liu et al. JOURNAL OF THE AMERICAN STATISTICAL ASSOCIATION
- The Sparse MLE for Ultrahigh-Dimensional Feature Screening
- (2014) Chen Xu et al. JOURNAL OF THE AMERICAN STATISTICAL ASSOCIATION
- Massive Parallelization of Serial Inference Algorithms for a Complex Generalized Linear Model
- (2013) Marc A. Suchard et al. ACM Transactions on Modeling and Computer Simulation
- Quantile-adaptive model-free variable screening for high-dimensional heterogeneous data
- (2013) Xuming He et al. ANNALS OF STATISTICS
- Robust rank correlation based screening
- (2012) Gaorong Li et al. ANNALS OF STATISTICS
- Model-Free Feature Screening for Ultrahigh-Dimensional Data
- (2012) Li-Ping Zhu et al. JOURNAL OF THE AMERICAN STATISTICAL ASSOCIATION
- Likelihood-Based Selection and Sharp Parameter Estimation
- (2012) Xiaotong Shen et al. JOURNAL OF THE AMERICAN STATISTICAL ASSOCIATION
- Feature Screening via Distance Correlation Learning
- (2012) Runze Li et al. JOURNAL OF THE AMERICAN STATISTICAL ASSOCIATION
- Nonparametric Independence Screening in Sparse Ultra-High-Dimensional Additive Models
- (2011) Jianqing Fan et al. JOURNAL OF THE AMERICAN STATISTICAL ASSOCIATION
- Sure independence screening in generalized linear models with NP-dimensionality
- (2010) Jianqing Fan et al. ANNALS OF STATISTICS
- Forward Regression for Ultra-High Dimensional Variable Screening
- (2010) Hansheng Wang JOURNAL OF THE AMERICAN STATISTICAL ASSOCIATION
- Coordinate descent algorithms for lasso penalized regression
- (2008) Tong Tong Wu et al. Annals of Applied Statistics
- Extended Bayesian information criteria for model selection with large model spaces
- (2008) J. Chen et al. BIOMETRIKA
- Sure independence screening for ultrahigh dimensional feature space
- (2008) Jianqing Fan et al. JOURNAL OF THE ROYAL STATISTICAL SOCIETY SERIES B-STATISTICAL METHODOLOGY
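The adaptive-ridge references above (Frommlet et al., 2016; Dai et al., 2018) describe the general idea behind ridge-based surrogates for the L0 penalty: repeatedly solve a weighted ridge problem with weights w_j = 1/(beta_j^2 + delta^2), so that small coefficients receive an ever-larger penalty and collapse toward zero. The sketch below illustrates only that generic idea for ordinary linear regression; it is not the paper's algorithm, and all data, parameter names, and settings (lam, delta, the toy design matrix) are my own assumptions.

```python
# Illustrative sketch (not the paper's method): a surrogate-L0 fit via
# iteratively reweighted ridge regression, in pure Python.

def solve(A, b):
    """Solve A x = b by Gaussian elimination with partial pivoting."""
    n = len(b)
    M = [row[:] + [b[i]] for i, row in enumerate(A)]
    for col in range(n):
        piv = max(range(col, n), key=lambda r: abs(M[r][col]))
        M[col], M[piv] = M[piv], M[col]
        for r in range(col + 1, n):
            f = M[r][col] / M[col][col]
            for c in range(col, n + 1):
                M[r][c] -= f * M[col][c]
    x = [0.0] * n
    for r in range(n - 1, -1, -1):
        x[r] = (M[r][n] - sum(M[r][c] * x[c] for c in range(r + 1, n))) / M[r][r]
    return x

def surrogate_l0_ridge(X, y, lam=0.1, delta=1e-3, iters=30):
    """Iterate beta <- argmin ||y - X beta||^2 + lam * sum_j w_j beta_j^2
    with w_j = 1/(beta_j^2 + delta^2); as delta -> 0 the weighted ridge
    penalty lam * beta_j^2 / (beta_j^2 + delta^2) approaches an L0 count."""
    n, p = len(X), len(X[0])
    XtX = [[sum(X[i][a] * X[i][b] for i in range(n)) for b in range(p)]
           for a in range(p)]
    Xty = [sum(X[i][a] * y[i] for i in range(n)) for a in range(p)]
    # Initialize with a plain ridge fit (all weights equal to 1).
    beta = solve([[XtX[a][b] + (lam if a == b else 0.0) for b in range(p)]
                  for a in range(p)], Xty)
    for _ in range(iters):
        w = [1.0 / (b * b + delta * delta) for b in beta]
        A = [[XtX[a][b] + (lam * w[a] if a == b else 0.0) for b in range(p)]
             for a in range(p)]
        beta = solve(A, Xty)
    return beta

# Hypothetical toy data: true coefficients (2, 0, -1) plus a small
# deterministic perturbation standing in for noise.
X = [[1.0, 0.5, 0.2], [0.3, 1.0, 0.4], [0.6, 0.1, 1.0],
     [1.2, 0.7, 0.3], [0.4, 1.1, 0.8], [0.9, 0.2, 0.5]]
eps = [0.01, -0.01, 0.02, 0.0, -0.02, 0.01]
y = [2.0 * X[i][0] - 1.0 * X[i][2] + eps[i] for i in range(6)]

beta = surrogate_l0_ridge(X, y)
```

After the reweighting iterations, the coefficient of the irrelevant second feature is driven essentially to zero, while the two active coefficients stay close to their true values; the large-scale GLM setting of the paper would replace the least-squares step with an iteratively reweighted GLM fit.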