Bayesian neural networks via MCMC: a Python-based tutorial
Rohitash Chandra and Joshua Simmons
Abstract: Bayesian inference provides a methodology for parameter estimation and uncertainty quantification in machine learning and deep learning methods. Variational inference and Markov chain Monte Carlo (MCMC) sampling are the two main approaches for implementing Bayesian inference. Over the past three decades, MCMC sampling methods have faced challenges in scaling to larger models (such as those in deep learning) and big data problems. Advanced proposal distributions that incorporate gradients, such as the Langevin proposal distribution, provide a means to address some of the limitations of MCMC sampling for Bayesian neural networks. Furthermore, MCMC methods have typically been confined to the statistics community and are currently not well known among deep learning researchers. We present a tutorial on MCMC methods that covers simple Bayesian linear and logistic models as well as Bayesian neural networks. The aim of this tutorial is to bridge the gap between theory and implementation via coding, given the general sparsity of libraries and tutorials to this end. The tutorial provides Python code, data, and instructions that enable its use and extension. We provide results for selected benchmark problems, showing the strengths and weaknesses of implementing the respective Bayesian models via MCMC. We highlight the challenges of sampling multi-modal posterior distributions in the case of Bayesian neural networks and the need for further improvement of convergence diagnosis methods.
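To illustrate the kind of workflow the tutorial covers, the following is a minimal sketch of a random-walk Metropolis sampler for a Bayesian linear model. The data, priors, and step size are illustrative assumptions, not the paper's exact setup; the tutorial itself additionally covers gradient-based (Langevin) proposals and Bayesian neural networks.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic data from a known linear model: y = 2x + 1 + Gaussian noise
x = rng.uniform(-1, 1, size=50)
y = 2.0 * x + 1.0 + rng.normal(scale=0.3, size=50)

def log_posterior(theta):
    """Log posterior (up to a constant) for slope and intercept,
    with a Gaussian likelihood (known noise) and N(0, 10^2) priors."""
    slope, intercept = theta
    residuals = y - (slope * x + intercept)
    log_likelihood = -0.5 * np.sum(residuals**2) / 0.3**2
    log_prior = -0.5 * (slope**2 + intercept**2) / 10.0**2
    return log_likelihood + log_prior

def metropolis(n_samples=5000, step=0.1):
    """Random-walk Metropolis: propose a Gaussian perturbation and
    accept with probability min(1, posterior ratio)."""
    theta = np.zeros(2)
    lp = log_posterior(theta)
    samples = np.empty((n_samples, 2))
    for i in range(n_samples):
        proposal = theta + rng.normal(scale=step, size=2)
        lp_prop = log_posterior(proposal)
        if np.log(rng.uniform()) < lp_prop - lp:  # accept/reject step
            theta, lp = proposal, lp_prop
        samples[i] = theta
    return samples

samples = metropolis()
posterior_mean = samples[2000:].mean(axis=0)  # discard burn-in
print(posterior_mean)  # posterior mean should lie near the true (2.0, 1.0)
```

The posterior mean of the retained samples approximates the true parameters; in practice one would also monitor acceptance rates and run convergence diagnostics, which the tutorial discusses.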
Submission history
From: Rohitash Chandra
[v1] Sun, 2 Apr 2023 02:19:15 UTC (8,734 KB)
[v2] Tue, 2 Apr 2024 12:38:25 UTC (4,665 KB)
[v3] Mon, 26 Aug 2024 11:35:52 UTC (4,665 KB)