Bayesian Model Selection
About
The course covers the foundations of Bayesian methods in machine learning: the construction of Bayesian models and inference in them. The notion of an optimal Bayesian forecast is introduced, and methods for constructing it are studied. Methods for handling non-linearities and inhomogeneities in data are discussed, and the handling of missing data and optimal model evolution over time are considered from a Bayesian perspective.
Syllabus
- Introduction: a review of key definitions from probability theory and statistics.
- Multiple hypothesis testing and prior selection.
- Naive Bayes, its generalizations, and the optimal Bayesian forecast.
- Exponential family of distributions and sufficient statistics.
- Bayesian Linear Regression. Evidence (see the sketch after this list).
- Bayesian Logistic Regression and feature selection via the maximum-evidence principle.
- EM-algorithm and Variational EM-algorithm.
- Gaussian processes and optimal model evolution over time.
- Constructing adequate multimodels.
- Markov chain Monte Carlo (MCMC) methods.
- Hamiltonian Monte Carlo (HMC) methods.
- Bayesian Optimization.
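Since evidence-based model selection is the central theme above, here is a minimal sketch of the idea, not course material: a Bayesian linear regression with a fixed zero-mean Gaussian prior on the weights and Gaussian noise, where the log evidence (marginal likelihood) of each candidate feature set is computed in closed form and the model with the higher evidence is preferred. The function name, hyperparameter values, and toy data are all illustrative assumptions.

```python
import numpy as np

def log_evidence(X, y, alpha=1.0, sigma2=0.01):
    """Log marginal likelihood of y under the model
    y = X w + eps, with w ~ N(0, alpha^{-1} I) and eps ~ N(0, sigma2 I).
    Marginally, y ~ N(0, C) with C = X X^T / alpha + sigma2 I."""
    n = len(y)
    C = X @ X.T / alpha + sigma2 * np.eye(n)
    _, logdet = np.linalg.slogdet(C)
    return -0.5 * (n * np.log(2 * np.pi) + logdet + y @ np.linalg.solve(C, y))

# Toy comparison: linear vs. cubic feature sets on (almost) linear data.
rng = np.random.default_rng(0)
x = rng.uniform(-1.0, 1.0, size=50)
y = 2.0 * x + 0.1 * rng.standard_normal(50)
X_linear = np.column_stack([np.ones_like(x), x])
X_cubic = np.column_stack([np.ones_like(x), x, x**2, x**3])
# The simpler model typically gets the higher evidence (Occam's razor).
print(log_evidence(X_linear, y), log_evidence(X_cubic, y))
```

The cubic model fits the data at least as well, yet the linear one typically scores higher: its marginal distribution concentrates on data sets it can actually explain. This Occam effect is what the max-evidence items in the syllabus refer to.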
Labworks
4 theoretical tasks (done individually), 1 practical task and 1 competition (both done in teams), and 2 tests. There is a written and an oral exam (the latter can be replaced by a presentation on a selected topic).
Grading
Each theoretical task is worth up to 50 points (doubled to 100 for the TOP-1 student on that task; the surplus counts as extra points), and the practical task and the competition are each worth up to 50 points as well. The written and oral exams together add up to 250 points. The final mark is computed by scaling the total score by the maximal score attainable assuming no extra points are gained; see the sketch below.
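A worked illustration of that scaling, with assumptions flagged: the two tests are not assigned points in the text above and are therefore omitted here, and the 10-point final scale is a guess.

```python
# Maximal score with no extra (doubled) points, per the breakdown above:
# 4 theoretical tasks x 50 + practical 50 + competition 50 + exams 250.
# NOTE: the two tests carry no stated points, so they are omitted; the
# 10-point final scale is an assumption, not stated in the text.
MAX_SCORE = 4 * 50 + 50 + 50 + 250  # 550

def final_mark(total_points: float, scale: int = 10) -> float:
    """Scale a raw total (extra points included) to the final mark."""
    return round(scale * total_points / MAX_SCORE, 1)

print(final_mark(420))  # 420 raw points -> 7.6 on the assumed 10-point scale
```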
Prerequisites
Probability, Basic Machine Learning, Linear Algebra, Optimization.