Article

Markov chain Monte Carlo methods for hierarchical clustering of dynamic causal models

Journal

HUMAN BRAIN MAPPING
Volume 42, Issue 10, Pages 2973-2989

Publisher

WILEY
DOI: 10.1002/hbm.25431

Keywords

computational psychiatry; functional magnetic resonance imaging; generative embedding; Markov chain Monte Carlo sampling; model inversion

Funding

  1. University of Zurich
  2. Rene und Susanne Braginsky Foundation


Abstract
In this article, we address technical difficulties that arise when applying Markov chain Monte Carlo (MCMC) to hierarchical models designed to perform clustering in the space of latent parameters of subject-wise generative models. Specifically, we focus on the case where the subject-wise generative model is a dynamic causal model (DCM) for functional magnetic resonance imaging (fMRI) and clusters are defined in terms of effective brain connectivity. While an attractive approach for detecting mechanistically interpretable subgroups in heterogeneous populations, inverting such a hierarchical model represents a particularly challenging case, since DCM is often characterized by high posterior correlations between its parameters. In this context, standard MCMC schemes exhibit poor performance and extremely slow convergence. In this article, we investigate the properties of hierarchical clustering which lead to the observed failure of standard MCMC schemes and propose a solution designed to improve convergence but preserve computational complexity. Specifically, we introduce a class of proposal distributions which aims to capture the interdependencies between the parameters of the clustering and subject-wise generative models and helps to reduce random walk behaviour of the MCMC scheme. Critically, these proposal distributions only introduce a single hyperparameter that needs to be tuned to achieve good performance. For validation, we apply our proposed solution to synthetic and real-world datasets and also compare it, in terms of computational complexity and performance, to Hamiltonian Monte Carlo (HMC), a state-of-the-art Monte Carlo technique. Our results indicate that, for the specific application domain considered here, our proposed solution shows good convergence performance and superior runtime compared to HMC.
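The abstract describes a Metropolis-style MCMC scheme whose Gaussian proposal distribution captures correlations between model parameters and is controlled by a single step-size hyperparameter. The following is a minimal illustrative sketch of that general idea, not the authors' actual scheme: a Metropolis-Hastings sampler with a full-covariance proposal, applied to a toy two-dimensional target with the kind of strong posterior correlation the abstract attributes to DCM parameters. All function and variable names here are hypothetical.

```python
import numpy as np

def metropolis(log_post, x0, proposal_cov, scale=0.5, n_steps=5000, seed=0):
    """Metropolis-Hastings with a correlation-aware Gaussian proposal.

    `scale` is the single tuning hyperparameter: it rescales the
    proposal covariance, which otherwise encodes the dependencies
    between parameters.
    """
    rng = np.random.default_rng(seed)
    chol = np.linalg.cholesky(scale**2 * proposal_cov)
    x = np.asarray(x0, dtype=float)
    lp = log_post(x)
    samples, accepted = [], 0
    for _ in range(n_steps):
        prop = x + chol @ rng.standard_normal(x.size)
        lp_prop = log_post(prop)
        # Accept with probability min(1, posterior ratio).
        if np.log(rng.random()) < lp_prop - lp:
            x, lp = prop, lp_prop
            accepted += 1
        samples.append(x.copy())
    return np.array(samples), accepted / n_steps

# Toy target: a strongly correlated 2-D Gaussian, mimicking the high
# posterior correlations described for DCM parameters.
target_cov = np.array([[1.0, 0.95], [0.95, 1.0]])
prec = np.linalg.inv(target_cov)
log_post = lambda x: -0.5 * x @ prec @ x

# Using the target covariance as the proposal covariance (an oracle
# choice, for illustration only) lets the chain move along the
# correlated directions instead of random-walking across them.
samples, acc_rate = metropolis(log_post, np.zeros(2), target_cov)
```

With an isotropic proposal, the same chain would either take tiny steps or reject most moves; shaping the proposal to the parameter dependencies is what reduces the random-walk behaviour the abstract refers to.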


