Article

Checking for Prior-Data Conflict Using Prior-to-Posterior Divergences

Journal

STATISTICAL SCIENCE
Volume 35, Issue 2, Pages 234-253

Publisher

Institute of Mathematical Statistics
DOI: 10.1214/19-STS731

Keywords

Bayesian inference; model checking; prior-data conflict; variational Bayes

Funding

  1. Singapore Ministry of Education Academic Research Fund Tier 2 grant [R-155000-143-112]
  2. Singapore Ministry of Education [MOE2012-T3-1009]
  3. National Research Foundation of Singapore
  4. Natural Sciences and Engineering Research Council of Canada [10671]

Abstract

When using complex Bayesian models to combine information, checking the consistency of the information contributed by different components of the model is good statistical practice. Here a new method is developed for detecting prior-data conflicts in Bayesian models, based on comparing the observed value of a prior-to-posterior divergence to its distribution under the prior predictive distribution for the data. The divergence measure used in our model check quantifies how much beliefs have changed from prior to posterior, and can be thought of as a measure of the overall size of a relative belief function. It is shown that the proposed method is intuitive, has desirable properties, can be extended to hierarchical settings, and is related asymptotically to Jeffreys' and reference prior distributions. Where exact calculations are difficult, variational approximations are suggested as a way of relieving the computational burden. The methods are compared in a number of examples with an alternative but closely related approach in the literature based on the prior predictive distribution of a minimal sufficient statistic.
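The check described in the abstract can be sketched in a few lines for a conjugate normal model. Everything below (the normal-normal model, parameter values, and function names) is an illustrative assumption, not taken from the paper: the observed prior-to-posterior KL divergence is compared against its distribution when the data are instead drawn from the prior predictive, and a small tail probability signals prior-data conflict.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical conjugate setup: theta ~ N(mu0, tau0^2),
# y_i | theta ~ N(theta, sigma^2) with sigma known, n observations.
mu0, tau0, sigma, n = 0.0, 1.0, 1.0, 20

def kl_prior_to_posterior(ybar):
    """KL(posterior || prior) for the conjugate normal model (closed form)."""
    prec = 1.0 / tau0**2 + n / sigma**2                  # posterior precision
    mu_n = (mu0 / tau0**2 + n * ybar / sigma**2) / prec  # posterior mean
    var_n = 1.0 / prec                                   # posterior variance
    # KL divergence between N(mu_n, var_n) and the prior N(mu0, tau0^2)
    return 0.5 * (var_n / tau0**2 + (mu_n - mu0)**2 / tau0**2
                  - 1.0 + np.log(tau0**2 / var_n))

# Observed data centred far from the prior mean -> should flag conflict.
y_obs = rng.normal(5.0, sigma, size=n)
kl_obs = kl_prior_to_posterior(y_obs.mean())

# Reference distribution of the divergence under the prior predictive:
# draw theta from the prior, then a dataset mean given theta, recompute KL.
theta_sim = rng.normal(mu0, tau0, size=5000)
ybar_sim = rng.normal(theta_sim, sigma / np.sqrt(n))
kl_sim = kl_prior_to_posterior(ybar_sim)

p_value = np.mean(kl_sim >= kl_obs)  # small value signals prior-data conflict
print(f"observed KL = {kl_obs:.2f}, tail probability = {p_value:.3f}")
```

With the data centred at 5 against a standard normal prior, the observed divergence sits far in the tail of its prior predictive distribution, so the tail probability is essentially zero.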



Recommended

Article Statistics & Probability

Marginally Calibrated Deep Distributional Regression

Nadja Klein, David J. Nott, Michael Stanley Smith

Summary: This article presents an approach to construct predictive distributions that are marginally calibrated, improving uncertainty quantification. By combining DNN regression models with implicit copula processes, the method enhances accuracy in applications like ecological time series analysis.

JOURNAL OF COMPUTATIONAL AND GRAPHICAL STATISTICS (2021)

Article Statistics & Probability

Assessment and Adjustment of Approximate Inference Algorithms Using the Law of Total Variance

Xuejun Yu, David J. Nott, Minh-Ngoc Tran, Nadja Klein

Summary: The article introduces a moment-based alternative method for assessing and adjusting approximate inference methods by relating prior and posterior expectations and covariances. The method adjusts approximate inferences to maintain correct prior to posterior relationships. Examples include using an auxiliary model in likelihood-free inference, corrections for variational Bayes approximations in a deep neural network GLMM, and using a deep neural network surrogate for approximating Gaussian process regression predictive inference.

JOURNAL OF COMPUTATIONAL AND GRAPHICAL STATISTICS (2021)
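The prior-to-posterior moment identities that the summary above refers to can be illustrated with a small simulation. The conjugate normal model and all numbers below are hypothetical stand-ins for an approximate inference method: under data drawn from the prior predictive, the law of total variance says the prior variance must equal the average posterior variance plus the variance of the posterior mean.

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical conjugate normal model: theta ~ N(0, 1),
# ybar | theta ~ N(theta, s2), where s2 plays the role of sigma^2 / n.
s2 = 0.25
theta = rng.normal(0.0, 1.0, size=200_000)   # draws from the prior
ybar = rng.normal(theta, np.sqrt(s2))        # prior predictive dataset means

# Exact posterior moments for each simulated dataset (closed form here;
# an approximate inference method would be plugged in instead and checked).
post_var = 1.0 / (1.0 + 1.0 / s2)            # same for every dataset
post_mean = post_var * ybar / s2

# Law of total variance: Var(theta) = E[Var(theta|y)] + Var(E[theta|y]).
lhs = theta.var()
rhs = post_var + post_mean.var()
print(f"prior variance ~ {lhs:.3f}, total-variance decomposition ~ {rhs:.3f}")
```

An approximate method whose decomposition fails to recover the prior variance is miscalibrated, which is the basis for the adjustment the article proposes.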

Article Statistics & Probability

Checking for model failure and for prior-data conflict with the constrained multinomial model

Berthold-Georg Englert, Michael Evans, Gun Ho Jang, Hui Khoon Ng, David Nott, Yi-Lin Seah

Summary: Multinomial models can be challenging to use with probability constraints. An exact model checking procedure has been developed based on a uniform prior, and a consistency theorem has been proven for inference with a nonuniform prior. The study also introduces a new elicitation methodology for multinomial models with ordered probabilities.

METRIKA (2021)

Article Economics

Fast and accurate variational inference for models with many latent variables

Ruben Loaiza-Maya, Michael Stanley Smith, David J. Nott, Peter J. Danaher

Summary: Models with a large number of latent variables can be difficult to estimate, but variational inference methods provide an attractive solution. This study proposes a tractable variational approximation that combines a parsimonious approximation for the parameter posterior with the exact conditional posterior of the latent variables. The method is more accurate and faster to calibrate, and allows for the implementation of data sub-sampling in variational inference.

JOURNAL OF ECONOMETRICS (2022)

Article Multidisciplinary Sciences

Quantum Dynamical Simulation of a Transversal Stern-Gerlach Interferometer

Mikolaj M. Paraniak, Berthold-Georg Englert

Summary: This study revisits the fundamental question of the reversibility of evolution in quantum mechanics, showing that the quantum dynamics of the transversal Stern-Gerlach interferometer is fundamentally irreversible even under ideal conditions with perfect control of the associated magnetic fields and beams.

SYMMETRY-BASEL (2021)

Article Economics

Variational Bayes approximation of factor stochastic volatility models

David Gunawan, Robert Kohn, David Nott

Summary: The study develops fast and accurate variational Bayes methods to approximate the posterior distribution of states and parameters in high-dimensional multivariate factor stochastic volatility models, and extends them for prediction. Validated on simulated and real datasets, the methods are shown to produce results faster than traditional approaches.

INTERNATIONAL JOURNAL OF FORECASTING (2021)

Article Statistics & Probability

Bayesian Inference Using Synthetic Likelihood: Asymptotics and Adjustments

David T. Frazier, David J. Nott, Christopher Drovandi, Robert Kohn

Summary: Implementing Bayesian inference in complex models with difficult likelihood calculations can be challenging. Synthetic likelihood, which constructs an approximate likelihood by simulating from the model, offers a computationally efficient alternative. This article demonstrates that the Bayesian implementation of synthetic likelihood is more efficient than approximate Bayesian computation and provides adjusted inference methods to further speed up computation. Supplementary materials are available online.

JOURNAL OF THE AMERICAN STATISTICAL ASSOCIATION (2022)

Article Computer Science, Theory & Methods

Variational inference and sparsity in high-dimensional deep Gaussian mixture models

Lucas Kock, Nadja Klein, David J. Nott

Summary: This research introduces a new clustering method that combines mixtures of factor analyzers, sparse priors, and Bayesian inference to handle high-dimensional problems. The experimental results demonstrate the effectiveness of the proposed method in both simulated and real data.

STATISTICS AND COMPUTING (2022)

Article Chemistry, Physical

Contactium: A strongly correlated model system

Jerzy Cioslowski, Berthold-Georg Englert, Martin-Isbjoern Trappe, Jun Hao Hue

Summary: In the limit of infinite confinement strength, the ground state of a system of two interacting fermions or bosons in harmonic confinement remains strongly correlated. The natural orbitals of this system exhibit peculiar properties, such as nonzero collective occupancies for all angular momenta and a relationship with the eigenfunctions and eigenvalues of a zero-energy Schrödinger equation with an attractive Gaussian potential. These properties have implications for the decay behavior and energy contributions of the system.

JOURNAL OF CHEMICAL PHYSICS (2023)

Article Quantum Science & Technology

Randomized Linear Gate-Set Tomography

Yanwu Gu, Rajesh Mishra, Berthold-Georg Englert, Hui Khoon Ng

Summary: Randomized linear gate-set tomography is an easy-to-implement procedure that combines state-preparation-and-measurement-error-free characterization with no-design randomized tomographic circuits. In simulations and experiments it performs comparably to standard gate-set tomography while requiring less computational time.

PRX QUANTUM (2021)

Article Mathematics, Interdisciplinary Applications

Using Prior Expansions for Prior-Data Conflict Checking

David J. Nott, Max Seah, Luai Al-Labadi, Michael Evans, Hui Khoon Ng, Berthold-Georg Englert

Summary: In Bayesian analysis, combining information from different model components and detecting conflicts between sources of information are crucial. By expanding the prior used for analysis into a larger family of priors and considering a marginal likelihood score statistic, it is possible to gain insights into the nature of conflicts and choose appropriate expansions for sensitive conflict checks. Extensions to hierarchically specified priors and connections with other approaches are considered, with illustrations of implementation in complex situations using two applications.

BAYESIAN ANALYSIS (2021)

Article Optics

Direct estimation of minimum gate fidelity

Yiping Lu, Jun Yan Sim, Jun Suzuki, Berthold-Georg Englert, Hui Khoon Ng

PHYSICAL REVIEW A (2020)

Article Optics

User-specified random sampling of quantum channels and its applications

Jun Yan Sim, Jun Suzuki, Berthold-Georg Englert, Hui Khoon Ng

PHYSICAL REVIEW A (2020)
