CSE 427: Machine Learning
This is a foundational course on machine learning. The first part of the course discusses PAC learning and different learnability paradigms: we try to understand what makes a problem learnable. The next part of the course focuses on traditional algorithms. The course is highly mathematical and is meant to give students a very strong foundation for further study in machine learning and related subjects.
Anybody who understands the material, does the homework, and attends class regularly should get a very good grade. The marking is as follows:
- Attendance & Participation: TBD
- Homework & Quizzes: TBD
- Midterm: TBD
- Final: TBD
Because this is, in some sense, a hard course, attendance is extremely important. Students who don't attend class should expect to be lost. If you cannot attend class because you or your parents are very sick, or because you are traveling, send me a message on the class Slack group before the class you will miss.
There is a Slack group for this class. All announcements, homework, and discussions will be posted on Slack. Homework will be assigned every two weeks. Late homework will be heavily penalized.
The course text is
- Understanding Machine Learning: From Theory to Algorithms, by Shai Shalev-Shwartz and Shai Ben-David, Cambridge University Press

We will follow this book very closely. There are plenty of other books on ML, such as
- An Introduction to Statistical Learning, by James, Witten, Hastie, and Tibshirani, Springer
- Bayesian Reasoning and Machine Learning, by Barber, Cambridge University Press
- Machine Learning: The Art and Science of Algorithms that Make Sense of Data by Flach, Cambridge University Press
- Introduction to Machine Learning by Alpaydin, MIT Press.
Nowadays, a student’s best friend is YouTube: you can learn almost anything from YouTube and various other online resources. Some very useful online lectures are:
- Shai Ben-David’s Waterloo lectures
- Lots and lots of other stuff online…
Some resources for the needed mathematical background are the texts
Machine Learning: Summer 2018
| Course | Days | Time | Room |
|---|---|---|---|
| CSE 427: Machine Learning: Section 1 | Su, Tu | 5:00-6:20 | UB30101 |
- Lecture 1
- Lecture 3
- Lecture 4
- Lecture 5
- Lecture 6
- Lecture 7: VC Dimension (complete!)
- Lecture 8: Nonuniform Learnability (complete!)
- Lecture 9: Computational Complexity of Learning
- Lecture 10: Linear Predictors (Part I)
- Lecture 10: Linear Predictors (Part II)
- Lecture 11: Boosting
- Lecture 11: Boosting (slides)
- Lecture 12: Convexity
- Lecture 13: More techniques
- Problem Set 1: Practical
- Problem Set 2: Practical
- Problem Set 3: Theory
- Problem Set 4: From UML: p. 35, Exercises 4.1 and 4.2; p. 42, Exercise 5.3; p. 54, Exercises 6.1 to 6.11
- What is Machine Learning? Examples and Motivation.
- Continued Motivation and Basic Definitions
- ERM Predictors
- PAC Learnability
- Uniform Convergence and PAC Learning
- No Free Lunch Theorem
- Problem Solving Class: Finding m
- Problem Solving Class: More examples
- VC Dimension
- Fundamental Theorem of Statistical Learning
- Sauer’s Lemma
- Nonuniform Learnability
- Computational Complexity of Learning
- Linear Predictors
- More techniques, SVMs, Kernels, etc.
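To give a flavor of the ERM and PAC learnability topics above, here is a minimal sketch of Empirical Risk Minimization over the class of one-dimensional threshold classifiers, a standard warm-up example in UML. The function name, candidate-threshold trick, and toy data are illustrative assumptions, not material from the course itself.

```python
import random

def erm_threshold(samples):
    """ERM over the class of threshold classifiers h_t(x) = 1 if x >= t, else 0.

    For this class it suffices to consider thresholds at the sample points
    themselves (plus one sentinel below all of them): the empirical error is
    constant between consecutive sample points.
    samples: list of (x, label) pairs with labels in {0, 1}.
    """
    xs = sorted(x for x, _ in samples)
    candidates = [xs[0] - 1.0] + xs

    def emp_error(t):
        # Fraction of samples misclassified by h_t.
        return sum(1 for x, y in samples if (x >= t) != (y == 1)) / len(samples)

    return min(candidates, key=emp_error)

# Toy data: labels generated by a true threshold at 0.6, no noise.
random.seed(0)
data = [(x, 1 if x >= 0.6 else 0)
        for x in (random.random() for _ in range(200))]
t_hat = erm_threshold(data)
```

In the realizable (noise-free) setting, the learned threshold lands just above the true one, and with enough samples its true error shrinks — exactly the guarantee PAC learning makes precise for finite-VC-dimension classes like this one.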