Article

Automatic Differentiation is no Panacea for Phylogenetic Gradient Computation

Journal

Genome Biology and Evolution
Volume 15, Issue 6

Publisher

Oxford University Press
DOI: 10.1093/gbe/evad099

Keywords

phylogenetics; Bayesian inference; variational inference; gradient


Summary

Gradients of probabilistic model likelihoods are crucial for computational statistics and machine learning. General-purpose machine-learning libraries such as TensorFlow and PyTorch provide automatic differentiation for arbitrary models, but for phylogenetic problems they can be slower than specialized code. This paper compares six gradient implementations and finds that automatic differentiation is slower than carefully implemented analytic gradients. A mixed approach combining phylogenetic libraries with machine-learning libraries is recommended for the best balance of speed and model flexibility.

Abstract

Gradients of probabilistic model likelihoods with respect to their parameters are essential for modern computational statistics and machine learning. These calculations are readily available for arbitrary models via automatic differentiation implemented in general-purpose machine-learning libraries such as TensorFlow and PyTorch. Although these libraries are highly optimized, it is not clear if their general-purpose nature will limit their algorithmic complexity or implementation speed for the phylogenetic case compared to phylogenetics-specific code. In this paper, we compare six gradient implementations of the phylogenetic likelihood functions, in isolation and also as part of a variational inference procedure. We find that although automatic differentiation can scale approximately linearly in tree size, it is much slower than the carefully implemented gradient calculation for tree likelihood and ratio transformation operations. We conclude that a mixed approach combining phylogenetic libraries with machine learning libraries will provide the optimal combination of speed and model flexibility moving forward.
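As a toy illustration of the kind of calculation compared in the paper (this example is not taken from the paper itself), the sketch below uses PyTorch's reverse-mode automatic differentiation to compute the gradient of a two-taxon Jukes-Cantor log-likelihood with respect to its single branch length; the function name and site counts are hypothetical.

```python
import torch

# Toy sketch (not from the paper): differentiate a two-taxon Jukes-Cantor
# log-likelihood with respect to its single branch length using PyTorch's
# reverse-mode automatic differentiation.

def jc69_log_likelihood(t, n_same, n_diff):
    """JC69 log-likelihood of a pairwise alignment separated by branch length t."""
    e = torch.exp(-4.0 * t / 3.0)
    p_same = 0.25 + 0.75 * e   # probability both tips show the same state
    p_diff = 0.25 - 0.25 * e   # probability of one specific differing state
    # 0.25 is the stationary frequency of the state at one tip.
    return n_same * torch.log(0.25 * p_same) + n_diff * torch.log(0.25 * p_diff)

t = torch.tensor(0.1, requires_grad=True)           # branch length, substitutions/site
log_lik = jc69_log_likelihood(t, n_same=90.0, n_diff=10.0)
log_lik.backward()                                   # reverse-mode AD through the likelihood
print(f"log-likelihood = {log_lik.item():.4f}, d(logL)/dt = {t.grad.item():.4f}")
```

The "mixed approach" recommended in the abstract can be pictured as wrapping an analytic gradient from phylogenetics-specific code so that a machine-learning library can still drive inference. The following is a hedged sketch of one way to do this in PyTorch with a custom autograd function; `analytic_loglik_and_gradient` is a placeholder standing in for a specialized phylogenetic library, not a real API.

```python
import torch

# Hypothetical sketch: expose an externally computed phylogenetic log-likelihood
# and its analytic gradient to PyTorch's automatic differentiation.

def analytic_loglik_and_gradient(branch_lengths):
    # Placeholder for a phylogenetics-specific library returning the
    # log-likelihood and its gradient w.r.t. the branch lengths.
    log_lik = -(branch_lengths ** 2).sum()
    grad = -2.0 * branch_lengths
    return log_lik, grad

class PhyloLogLikelihood(torch.autograd.Function):
    @staticmethod
    def forward(ctx, branch_lengths):
        log_lik, grad = analytic_loglik_and_gradient(branch_lengths.detach())
        ctx.save_for_backward(grad)   # stash the analytic gradient for backward
        return log_lik

    @staticmethod
    def backward(ctx, grad_output):
        (grad,) = ctx.saved_tensors
        return grad_output * grad     # chain rule against the external gradient

branch_lengths = torch.full((5,), 0.1, requires_grad=True)
loss = -PhyloLogLikelihood.apply(branch_lengths)     # e.g., a negative log-likelihood term
loss.backward()
print(branch_lengths.grad)
```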


