Teaching
Statistical Methods in AI - Monsoon 2018-19
The course serves as an introduction to statistical methods used in machine learning and AI. The topics covered include pattern representation, classification, and clustering. A good working knowledge of Linear Algebra and Probability and Statistics is expected of students as a prerequisite.
Topics
- Introduction, Feature Representation
- Random Variables, Probability Densities, Multivariate Densities
- Linear Discriminant Functions
- Perceptron Learning (a minimal sketch follows this list)
- Minimum Squared Error Procedures
- Bayesian Decision Theory
- Naive Bayes Classifier
- Maximum Likelihood Estimation (MLE)
- Machine Learning Fundamentals
- Logistic Regression; Feature Selection
- Principal Component Analysis and Eigenfaces
- Linear Discriminant Analysis and Fisherfaces
- Nearest Neighbour Classifier
- Max-Margin Classification (SVM), SVM variants, Kernelization
- Neural Networks, Backpropagation, Training Methods
- Data Clustering, K-means (EM) and variants, Hierarchical Clustering
- Decision Trees, Random Forests
- Graphical Models, Bayesian Belief Networks
- Combining Classifiers, Boosting
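As a taste of the material, here is a minimal sketch of the perceptron learning rule mentioned in the list above. It is not part of the course materials; the function names, parameters, and toy data are illustrative assumptions, written in NumPy.

```python
import numpy as np

def perceptron_train(X, y, lr=1.0, epochs=100):
    """Mistake-driven perceptron training.

    X : (n_samples, n_features) inputs
    y : (n_samples,) labels in {-1, +1}
    Returns the learned weight vector with a bias term appended.
    """
    # Append a constant 1 to each sample so the bias is learned as an extra weight.
    Xb = np.hstack([X, np.ones((X.shape[0], 1))])
    w = np.zeros(Xb.shape[1])
    for _ in range(epochs):
        mistakes = 0
        for xi, yi in zip(Xb, y):
            # Update only on misclassified samples: w <- w + lr * y * x
            if yi * np.dot(w, xi) <= 0:
                w += lr * yi * xi
                mistakes += 1
        if mistakes == 0:  # all training points correctly classified
            break
    return w

def perceptron_predict(X, w):
    Xb = np.hstack([X, np.ones((X.shape[0], 1))])
    return np.sign(Xb @ w)

# Toy usage on a linearly separable 2-D dataset (hypothetical example data)
if __name__ == "__main__":
    X = np.array([[2.0, 1.0], [1.5, 2.0], [-1.0, -1.5], [-2.0, -0.5]])
    y = np.array([1, 1, -1, -1])
    w = perceptron_train(X, y)
    print(perceptron_predict(X, w))  # expected: [ 1.  1. -1. -1.]
```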
Reference Books
- Pattern Classification by Duda, Hart & Stork
- Neural Networks by Simon Haykin
- Machine Learning - A Probabilistic Perspective by Kevin Murphy (free ebook available online)
Scribes
- Introduction to Random Variables and Density Functions
- Matrix Algebra
- Introduction
- Mathematics for ML
- Linear Methods-I
- Linear Methods-II
- Linear Methods-III
- Linear Methods-IV
- Linear Methods-VI
- Linear Dimensionality Reduction
- Linear SVMs
- Kernels and Nonlinear SVMs
- More on Dimensionality Reduction: LDA and KPCA
- Multi Layer Perceptrons (MLP)