Course detail
Bayesian Models for Machine Learning (in English)
FIT-BAYa
Academic year: 2019/2020
Probability theory and probability distributions, Bayesian Inference, Inference in Bayesian models with conjugate priors, Inference in Bayesian Networks, Expectation-Maximization algorithm, Approximate inference in Bayesian models using Gibbs sampling, Variational Bayes inference, Stochastic VB, Infinite mixture models, Dirichlet Process, Chinese Restaurant Process, Pitman-Yor Process for Language modeling, Expectation propagation, Gaussian Process, Auto-Encoding Variational Bayes, Practical applications of Bayesian inference
Supervisor
Offered to foreign students
Of all faculties
Learning outcomes of the course unit
Not applicable.
Prerequisites
Not applicable.
Co-requisites
Not applicable.
Recommended optional programme components
Not applicable.
Recommended or required reading
http://www.fit.vutbr.cz/study/courses/BAYa/public/
C. Bishop: Pattern Recognition and Machine Learning, Springer, 2006.
S. J. Gershman and D. M. Blei: A tutorial on Bayesian nonparametric models, Journal of Mathematical Psychology, 2012.
P. Orbanz: Tutorials on Bayesian Nonparametrics, http://stat.columbia.edu/~porbanz/npb-tutorial.html
D. P. Kingma and M. Welling: Auto-Encoding Variational Bayes, ICLR, Banff, 2014.
Planned learning activities and teaching methods
Not applicable.
Assessment methods and criteria linked to learning outcomes
- Half-semestral exam (24 points)
- Submission and presentation of a project (25 points)
- Semestral exam (51 points)
Language of instruction
English
Work placements
Not applicable.
Aims
To demonstrate the limitations of deep neural networks (DNNs), which have become a very popular and successful machine learning tool in many areas, but which excel only when a sufficient amount of well-annotated training data is available. To present Bayesian models (BMs), which allow robust decisions even with scarce training data because they take the uncertainty of the model parameter estimates into account. To introduce the concept of latent variables, which makes BMs modular (i.e. more complex models can be built out of simpler ones) and well suited to cases with missing data (e.g. unsupervised learning, where annotations are missing). To build basic skills and intuitions about BMs and to develop more advanced topics such as approximate inference methods necessary for more complex models, infinite mixture models based on non-parametric BMs, and Auto-Encoding Variational Bayes. The course is taught in English.
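As a minimal illustration of the point about scarce data (a hypothetical sketch, not part of the official course materials), consider conjugate Beta-Bernoulli inference: with only a few observations the posterior stays wide, and a credible interval makes the parameter uncertainty explicit where a maximum-likelihood point estimate would hide it.

# Hypothetical sketch (not course material): Beta-Bernoulli conjugate
# inference. With a Beta(a, b) prior on a Bernoulli parameter theta and
# k successes in n trials, the posterior is Beta(a + k, b + n - k).
from scipy import stats

a, b = 1.0, 1.0   # uniform Beta(1, 1) prior
k, n = 2, 3       # scarce data: 2 successes in 3 trials

posterior = stats.beta(a + k, b + n - k)
print("posterior mean:", posterior.mean())                 # 0.6
print("95% credible interval:", posterior.interval(0.95))  # wide
# The posterior predictive probability of the next success equals the
# posterior mean and is pulled towards the prior, unlike the MLE k/n:
print("posterior predictive p(success):", (a + k) / (a + b + n))

With only three observations the 95% credible interval spans most of the unit interval, which is exactly the uncertainty a point estimate would discard.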
Classification of course in study plans
- Programme IT-MGR-2 Master's
branch MGMe , any year of study, winter semester, 5 credits, compulsory-optional
- Programme MITAI Master's
specialization NADE , any year of study, winter semester, 5 credits, elective
specialization NBIO , any year of study, winter semester, 5 credits, elective
specialization NGRI , any year of study, winter semester, 5 credits, elective
specialization NNET , any year of study, winter semester, 5 credits, elective
specialization NVIZ , any year of study, winter semester, 5 credits, elective
specialization NCPS , any year of study, winter semester, 5 credits, elective
specialization NSEC , any year of study, winter semester, 5 credits, elective
specialization NEMB , any year of study, winter semester, 5 credits, elective
specialization NHPC , any year of study, winter semester, 5 credits, elective
specialization NISD , any year of study, winter semester, 5 credits, elective
specialization NIDE , any year of study, winter semester, 5 credits, elective
specialization NISY , any year of study, winter semester, 5 credits, elective
specialization NMAL , any year of study, winter semester, 5 credits, compulsory
specialization NMAT , any year of study, winter semester, 5 credits, elective
specialization NSEN , any year of study, winter semester, 5 credits, elective
specialization NVER , any year of study, winter semester, 5 credits, elective
specialization NSPE , any year of study, winter semester, 5 credits, elective
- Programme IT-MGR-1H Master's
branch MGH , any year of study, winter semester, 5 credits, recommended
Type of course unit
Lecture
26 hours, optional
Teacher / Lecturer
Syllabus
- Probability theory and probability distributions
- Bayesian Inference (priors, uncertainty of the parameter estimates, posterior predictive probability)
- Inference in Bayesian models with conjugate priors
- Inference in Bayesian Networks (loopy belief propagation)
- Expectation-Maximization algorithm (with application to Gaussian Mixture Model; see the sketch after this list)
- Approximate inference in Bayesian models using Gibbs sampling
- Variational Bayes inference, Stochastic VB
- Infinite mixture models, Dirichlet Process, Chinese Restaurant Process
- Pitman-Yor Process for Language modeling
- Expectation propagation
- Gaussian Process
- Auto-Encoding Variational Bayes
- Practical applications of Bayesian inference
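The Expectation-Maximization item above lends itself to a short sketch. The following is a minimal, hypothetical illustration of EM for a one-dimensional, two-component Gaussian Mixture Model (an assumed setup; the actual course demonstrations are the instructor-provided Python examples):

# Hypothetical sketch (not course material): EM for a 1-D GMM with two
# components, fitted to simulated data.
import numpy as np

rng = np.random.default_rng(0)
x = np.concatenate([rng.normal(-2.0, 1.0, 200), rng.normal(3.0, 1.0, 300)])

w = np.array([0.5, 0.5])      # initial mixture weights
mu = np.array([-1.0, 1.0])    # initial component means
var = np.array([1.0, 1.0])    # initial component variances

def normal_pdf(x, mu, var):
    return np.exp(-0.5 * (x - mu) ** 2 / var) / np.sqrt(2 * np.pi * var)

for _ in range(50):
    # E-step: responsibilities gamma[n, k] = p(component k | x_n)
    gamma = w * normal_pdf(x[:, None], mu, var)
    gamma /= gamma.sum(axis=1, keepdims=True)
    # M-step: re-estimate parameters from responsibility-weighted statistics
    Nk = gamma.sum(axis=0)
    w = Nk / len(x)
    mu = (gamma * x[:, None]).sum(axis=0) / Nk
    var = (gamma * (x[:, None] - mu) ** 2).sum(axis=0) / Nk

print("weights:", w)     # approx. [0.4, 0.6]
print("means:", mu)      # approx. [-2, 3]
print("variances:", var)

Each iteration alternates an E-step (inferring the posterior probability of each latent component given a data point) with an M-step (re-estimating weights, means, and variances from the responsibility-weighted statistics), which monotonically increases the data likelihood.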
Fundamentals seminar
13 hours, compulsory
Teacher / Lecturer
Syllabus
Lectures will be immediately followed by demonstration exercises presenting examples in Python. The code and data for all demonstrations will be made available to students and will form the basis of the project.
Project
13 hours, compulsory
Teacher / Lecturer
Syllabus
The project will build on the demonstration exercises: students will work in teams, in "evaluation" mode, on the provided (simulated or real) data and will present their results at the final lecture/exercise.