Article

A mixture of g-priors for variable selection when the number of regressors grows with the sample size

Journal

TEST
Volume 26, Issue 2, Pages 377-404

Publisher

SPRINGER
DOI: 10.1007/s11749-016-0516-0

Keywords

Model selection consistency; Misspecified models; General class of distributions of errors; Kullback-Leibler divergence

Abstract

We consider the problem of variable selection in linear regression using mixtures of g-priors. A number of mixtures that work well have been proposed in the literature, especially when the number of regressors p is fixed. In this paper, we propose a mixture of g-priors suitable for the case where p grows with the sample size n, more specifically when p = O(n^b) for some 0 < b < 1. The marginal density based on the proposed mixture admits an accurate closed-form approximation, which makes the method as tractable to apply as an information criterion. The proposed method satisfies fundamental properties such as model selection consistency when the true model lies in the model space, and consistency in an appropriate sense under a misspecified-model setup. The method is robust in that these properties are not confined to normal linear models; under reasonable conditions, they continue to hold for a general class of error distributions. Finally, we compare the performance of the proposed prior theoretically with that of some other mixtures of g-priors, and also with several other Bayesian model selection methods on simulated data sets. Both theoretically and in simulations, it emerges that, unlike most other model selection methods, the proposed prior reliably selects the true model irrespective of its dimension.
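To make the tractability claim concrete, the sketch below shows the general g-prior model-comparison workflow in Python. It is not the mixture proposed in the paper (whose exact form is not given in this abstract); instead it uses the well-known closed-form Bayes factor for Zellner's g-prior with a fixed g, as in Liang et al. (2008), with g = n as an illustrative unit-information choice. The function name and toy data are hypothetical.

```python
import numpy as np

def log_bayes_factor_g_prior(y, X, g=None):
    """Log Bayes factor of the model spanned by the columns of X against
    the intercept-only null model, under Zellner's g-prior with fixed g:
        BF = (1 + g)^{(n - 1 - k)/2} * [1 + g (1 - R^2)]^{-(n - 1)/2}.
    A fixed-g stand-in for the paper's mixture, where g is integrated out.
    """
    n, k = X.shape
    if g is None:
        g = n  # unit-information choice; the paper's prior on g differs
    yc = y - y.mean()
    Xc = X - X.mean(axis=0)  # centering absorbs the intercept
    beta, *_ = np.linalg.lstsq(Xc, yc, rcond=None)
    rss = np.sum((yc - Xc @ beta) ** 2)
    r2 = 1.0 - rss / np.sum(yc ** 2)  # ordinary coefficient of determination
    return 0.5 * (n - 1 - k) * np.log1p(g) - 0.5 * (n - 1) * np.log1p(g * (1.0 - r2))

# Toy comparison: the true model uses only the first two of five regressors.
rng = np.random.default_rng(0)
n, p = 100, 5
X = rng.standard_normal((n, p))
y = 1.0 + 2.0 * X[:, 0] - 1.5 * X[:, 1] + rng.standard_normal(n)

candidates = {"x1,x2": [0, 1], "x1": [0], "x1..x5": list(range(p))}
for name, cols in candidates.items():
    print(name, round(log_bayes_factor_g_prior(y, X[:, cols]), 2))
```

Selecting the model with the largest log Bayes factor mirrors picking the minimizer of an information criterion, which is the sense in which the abstract calls the method "as tractable as an information criterion-based method"; the paper's contribution is a mixture over g whose marginal retains such a closed-form approximation even when p grows with n.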

