Explores the consistency and asymptotic properties of the Maximum Likelihood Estimator, including challenges in proving its consistency and constructing MLE-like estimators.
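For reference, a minimal statement of the objects under discussion, assuming the standard i.i.d. setting (an assumption made here for illustration; the summary itself does not specify the model class):

```latex
% Standard i.i.d. setting, assumed for illustration:
% X_1, \dots, X_n \sim f(\cdot \mid \theta_0), \quad \theta_0 \in \Theta.
\hat{\theta}_n = \arg\max_{\theta \in \Theta} \frac{1}{n} \sum_{i=1}^{n} \log f(X_i \mid \theta),
\qquad
\hat{\theta}_n \xrightarrow{\;p\;} \theta_0,
\qquad
\sqrt{n}\,\bigl(\hat{\theta}_n - \theta_0\bigr) \xrightarrow{\;d\;} \mathcal{N}\!\bigl(0,\; I(\theta_0)^{-1}\bigr)
% where I(\theta_0) is the Fisher information. Consistency and asymptotic
% normality rely on regularity conditions (identifiability, a well-behaved
% likelihood); when these fail, one falls back on MLE-like estimators.
```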
Explores computing the density of states and performing Bayesian inference via importance sampling, demonstrating the lower variance and parallelizability of the proposed method.
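A minimal sketch of self-normalized importance sampling for a Bayesian posterior expectation, assuming a toy Gaussian model with a broad Gaussian proposal; the data, densities, and diagnostics here are illustrative stand-ins, not the proposed method's actual implementation:

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy setup (assumed for illustration): Gaussian likelihood with unknown mean,
# standard normal prior, and a wide Gaussian proposal distribution.
data = rng.normal(loc=1.5, scale=1.0, size=50)

def log_prior(theta):
    # N(0, 1) prior, up to an additive constant
    return -0.5 * theta**2

def log_likelihood(theta):
    # Gaussian likelihood with unit variance, up to an additive constant
    return -0.5 * np.sum((data[None, :] - theta[:, None])**2, axis=1)

# Draw from a broad proposal q(theta) = N(0, 3^2); the draws are independent,
# so this step parallelizes trivially across workers.
n_samples = 10_000
theta = rng.normal(loc=0.0, scale=3.0, size=n_samples)
log_q = -0.5 * (theta / 3.0)**2 - np.log(3.0)

# Importance weights w_i proportional to p(theta_i | data) / q(theta_i),
# computed in log space for numerical stability; constants cancel after
# self-normalization.
log_w = log_prior(theta) + log_likelihood(theta) - log_q
w = np.exp(log_w - log_w.max())
w /= w.sum()

# Self-normalized estimate of the posterior mean, plus the effective sample
# size as a rough proxy for the estimator's variance.
posterior_mean = np.sum(w * theta)
ess = 1.0 / np.sum(w**2)
print(f"posterior mean ~ {posterior_mean:.3f}, ESS ~ {ess:.0f}")
```

The larger the effective sample size relative to the number of draws, the lower the variance of the estimate; this is the kind of diagnostic typically used when comparing an importance-sampling scheme against MCMC-style alternatives.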
Delves into the intersection of physics and data in machine learning models, covering topics like atomic cluster expansion force fields and unsupervised learning.