Article

Parallel Bayesian Additive Regression Trees

Journal

Journal of Computational and Graphical Statistics
Volume 23, Issue 3, Pages 830-852

Publisher

American Statistical Association
DOI: 10.1080/10618600.2013.841584

Keywords

Big Data; Markov chain Monte Carlo; Nonlinear; Scalable; Statistical computing

Funding

  1. U.S. Department of Energy Office of Science, Office of Advanced Scientific Computing Research, Scientific Discovery through Advanced Computing (SciDAC) program
  2. National Science Foundation, Directorate for Mathematical & Physical Sciences, Division of Mathematical Sciences [1107046]

Abstract

Bayesian additive regression trees (BART) is a Bayesian approach to flexible nonlinear regression that has been shown to be competitive with the best modern predictive methods, such as those based on bagging and boosting. BART offers some advantages. For example, the stochastic search Markov chain Monte Carlo (MCMC) algorithm can provide a more complete search of the model space, and variation across MCMC draws can capture the level of uncertainty in the usual Bayesian way. The BART prior is robust in that reasonable results are typically obtained with a default prior specification. However, the publicly available implementation of the BART algorithm in the R package BayesTree is not fast enough to be considered interactive with over a thousand observations, and is unlikely to even run with 50,000 to 100,000 observations. In this article, we show how the BART algorithm may be modified and then computed using single program, multiple data (SPMD) parallel computation implemented using the Message Passing Interface (MPI) library. The approach scales nearly linearly in the number of processor cores, enabling the practitioner to perform statistical inference on massive datasets. Our approach can also handle datasets too massive to fit on any single data repository.
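The SPMD pattern behind this kind of parallel MCMC can be sketched in a few lines. The sketch below is illustrative, not the authors' code: each worker holds one shard of the data and computes the sufficient statistics (per-leaf counts and residual sums) that a Gaussian BART tree proposal needs, and a reduce step sums them across workers, the role that MPI_Allreduce plays in a real MPI implementation. Names such as `local_suff_stats` and `allreduce_sum` are assumptions made for the example.

```python
# Hedged sketch of SPMD sufficient-statistic aggregation for a BART-style
# tree proposal. Real implementations would run one process per shard under
# MPI; here the "workers" are simulated in a single process.
import random

def local_suff_stats(residuals, leaf_ids, n_leaves):
    """Per-shard sufficient statistics: count and residual sum for each leaf."""
    counts = [0] * n_leaves
    sums = [0.0] * n_leaves
    for r, leaf in zip(residuals, leaf_ids):
        counts[leaf] += 1
        sums[leaf] += r
    return counts, sums

def allreduce_sum(shard_stats):
    """Stand-in for MPI_Allreduce: element-wise sum of every worker's stats."""
    counts = [sum(c) for c in zip(*(s[0] for s in shard_stats))]
    sums = [sum(s) for s in zip(*(s[1] for s in shard_stats))]
    return counts, sums

rng = random.Random(0)
n, n_workers, n_leaves = 12_000, 4, 3
residuals = [rng.gauss(0.0, 1.0) for _ in range(n)]
leaf_ids = [rng.randrange(n_leaves) for _ in range(n)]

# SPMD: the same local computation runs on each worker's shard of the data.
shard = n // n_workers
shard_stats = [
    local_suff_stats(residuals[i * shard:(i + 1) * shard],
                     leaf_ids[i * shard:(i + 1) * shard], n_leaves)
    for i in range(n_workers)
]
counts, sums = allreduce_sum(shard_stats)
```

Because the Gaussian leaf likelihood depends on the data only through these low-dimensional statistics, each MCMC step communicates a few numbers per leaf rather than the raw observations, which is what makes near-linear scaling in the number of cores plausible and lets the data stay distributed across machines.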

