08 — Bayesian and Probabilistic Machine Learning
This module explores the probabilistic foundations of machine learning, focusing on uncertainty quantification, high-dimensional sampling, and the infinite-width limit of neural networks. We bridge the gap between classical Bayesian inference and modern deep learning, providing tools for robust and reliable AI systems.
Prerequisite Tier: 2-3 — Intermediate/Advanced (requires probability theory, calculus, and basic linear algebra)
📚 Course Modules
- Lecture: Unified Mathematical Foundations
- Practice: Exercises and Coding Tasks
- Project: BNNs and Conformal Prediction Implementation
📄 Core Literature
- Neal, R. M. (1996): Bayesian Learning for Neural Networks - the foundational text on BNNs and Hamiltonian Monte Carlo (HMC).
- Lee, J., et al. (2018): Deep Neural Networks as Gaussian Processes - establishes the NNGP correspondence for infinite-width networks.
- Angelopoulos, A. N., & Bates, S. (2021): A Gentle Introduction to Conformal Prediction and Distribution-Free Uncertainty Quantification - the modern standard reference for distribution-free uncertainty.
- Blei, D. M., et al. (2017): Variational Inference: A Review for Statisticians - a comprehensive survey of VI.
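The project pairs BNNs with conformal prediction, so here is a minimal sketch of split conformal prediction for regression. The synthetic data, the polynomial point predictor, and the 90% coverage level are all illustrative assumptions, not part of the course material; any fitted model can stand in for the predictor.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical 1-D regression data: y = sin(x) + Gaussian noise.
x = rng.uniform(-3, 3, size=500)
y = np.sin(x) + rng.normal(scale=0.2, size=500)

# Split into a proper training set and a held-out calibration set.
x_train, y_train = x[:300], y[:300]
x_cal, y_cal = x[300:], y[300:]

# Any point predictor works; a degree-5 polynomial fit stands in here.
coeffs = np.polyfit(x_train, y_train, deg=5)

def predict(x_new):
    return np.polyval(coeffs, x_new)

# Nonconformity scores: absolute residuals on the calibration set.
scores = np.abs(y_cal - predict(x_cal))

# Conformal quantile for 1 - alpha = 90% marginal coverage.
alpha = 0.1
n = len(scores)
q = np.quantile(scores, np.ceil((n + 1) * (1 - alpha)) / n, method="higher")

# Distribution-free prediction interval for a new input.
x_new = 1.5
lo, hi = predict(x_new) - q, predict(x_new) + q
```

The interval `[f(x) - q, f(x) + q]` has guaranteed marginal coverage of at least 90% on exchangeable data, regardless of whether the underlying model is well specified; that model-agnostic guarantee is what makes the method a natural complement to Bayesian approaches.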