Discusses the Dirichlet distribution, Bayesian inference, posterior mean and variance, conjugate priors, and the predictive distribution in the Dirichlet-Multinomial model.
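
For reference, the standard Dirichlet-Multinomial conjugate update that these topics refer to (stated here as a generic result, not quoted from the summarized document): with a Dirichlet prior over category probabilities theta and multinomial counts n_1, ..., n_K, the posterior, its mean and variance, and the one-step predictive distribution are:

    \theta \sim \mathrm{Dir}(\alpha_1,\dots,\alpha_K), \qquad
    (n_1,\dots,n_K)\mid\theta \sim \mathrm{Mult}(n,\theta)
    \;\Rightarrow\;
    \theta \mid \text{data} \sim \mathrm{Dir}(\alpha_1+n_1,\dots,\alpha_K+n_K)

    \mathbb{E}[\theta_k \mid \text{data}] = \frac{\alpha_k+n_k}{\hat\alpha_0}, \qquad
    \mathrm{Var}[\theta_k \mid \text{data}] = \frac{(\alpha_k+n_k)\,(\hat\alpha_0-\alpha_k-n_k)}{\hat\alpha_0^{2}\,(\hat\alpha_0+1)}, \qquad
    \hat\alpha_0 = \sum_{j}(\alpha_j+n_j)

    p(x_{\text{new}}=k \mid \text{data}) = \frac{\alpha_k+n_k}{\hat\alpha_0}
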
Delves into the fundamental limits of gradient-based learning for neural networks, covering tools such as the binomial theorem, the exponential series, and moment-generating functions.
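
For orientation, a standard identity that ties together the tools named above (a generic textbook fact, not a result from the summarized document): the moment-generating function of a Binomial(n, p) variable follows from the binomial theorem, and the exponential series is what links an MGF to moments:

    M_X(t) = \mathbb{E}\!\left[e^{tX}\right]
           = \sum_{k=0}^{n} \binom{n}{k} p^{k}(1-p)^{n-k} e^{tk}
           = \left(1-p+pe^{t}\right)^{n},
    \qquad
    e^{tX} = \sum_{m\ge 0} \frac{(tX)^{m}}{m!}
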
Introduces Bayesian estimation, covering classical versus Bayesian inference, conjugate priors, MCMC methods, and practical examples like temperature estimation and choice modeling.
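
As an illustration of the conjugate-prior machinery behind the temperature-estimation example mentioned here, a minimal sketch assuming a Normal prior on the unknown mean temperature and Gaussian measurement noise with known variance (the model choice and all numbers are assumptions for illustration, not taken from the document):

    import numpy as np

    def posterior_normal(prior_mean, prior_var, obs, noise_var):
        """Closed-form posterior for a Normal prior with a Normal likelihood (known noise variance)."""
        obs = np.asarray(obs, dtype=float)
        n = obs.size
        # Precisions add; the posterior mean is a precision-weighted average of prior and data.
        post_var = 1.0 / (1.0 / prior_var + n / noise_var)
        post_mean = post_var * (prior_mean / prior_var + obs.sum() / noise_var)
        return post_mean, post_var

    # Hypothetical usage: prior belief 20 C with std 5 C, three noisy readings with std 1 C.
    readings = [21.3, 19.8, 20.9]
    mean, var = posterior_normal(prior_mean=20.0, prior_var=25.0, obs=readings, noise_var=1.0)
    print(f"posterior mean={mean:.2f}, std={var**0.5:.2f}")
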
Covers topic models, focusing on Latent Dirichlet Allocation (LDA), and also treats clustering, Gaussian mixture models (GMMs), the Dirichlet distribution, learning LDA, and applications in digital humanities.
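
To make the LDA connection concrete, a minimal sketch of LDA's generative process in plain NumPy (the topic count, vocabulary size, and hyperparameter values are made up for illustration and are not the document's settings):

    import numpy as np

    rng = np.random.default_rng(0)
    n_topics, vocab_size, doc_len = 3, 10, 20

    # Topic-word distributions beta_k ~ Dir(eta) and a per-document topic mix theta ~ Dir(alpha).
    beta = rng.dirichlet(np.full(vocab_size, 0.1), size=n_topics)   # shape (K, V)
    theta = rng.dirichlet(np.full(n_topics, 0.5))                   # shape (K,)

    # For each word position: draw a topic z from theta, then a word w from beta[z].
    z = rng.choice(n_topics, size=doc_len, p=theta)
    words = np.array([rng.choice(vocab_size, p=beta[k]) for k in z])
    print(z, words)
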