CSE 427: Machine Learning

This is a foundational course on machine learning. The first part of the course covers PAC learning and different learnability paradigms: we try to understand what it means for a problem to be learnable. The next part focuses on classical algorithms. The course is highly mathematical and is meant to give students a strong foundation for graduate study in machine learning and related subjects.
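The starting point of the PAC learning part of the course is the Empirical Risk Minimization (ERM) rule: pick the hypothesis that makes the fewest mistakes on the training sample. As a purely illustrative taste (not part of the course materials; the threshold class and data below are made up), here is a minimal sketch:

```python
# Illustrative sketch (not from the course materials): the Empirical Risk
# Minimization (ERM) rule. Given a finite hypothesis class and a labeled
# sample, ERM returns the hypothesis with the smallest empirical error.

def erm(hypotheses, sample):
    """Return the hypothesis in `hypotheses` minimizing empirical error
    on `sample`, a list of (x, y) pairs."""
    def empirical_error(h):
        return sum(1 for x, y in sample if h(x) != y) / len(sample)
    return min(hypotheses, key=empirical_error)

# Toy hypothesis class: threshold classifiers h_t(x) = 1[x >= t], t = 0..10.
hypotheses = [lambda x, t=t: int(x >= t) for t in range(11)]
# Labeled sample generated by the true threshold t = 4 (made-up data).
sample = [(x, int(x >= 4)) for x in range(11)]

best = erm(hypotheses, sample)
assert all(best(x) == y for x, y in sample)  # best fits the sample perfectly
```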


Anybody who understands the material, does the homework, and attends class regularly should get a very good grade. The grading breakdown is as follows:

  • Attendance & Participation: TBD
  • Homework & Quizzes: TBD
  • Midterm: TBD
  • Final: TBD

Because this is, in some sense, a hard course, attendance is extremely important. Students who do not attend class should expect to be lost. If you cannot attend because you or your parents are very sick, or you are traveling, send me a message on the Slack group before the class you will miss.

Class structure

There is a Slack group for this class. All announcements, homework, and discussion will take place on Slack. Homework will be assigned every two weeks; late homework will be heavily penalized.

Course Book:

We will follow the course textbook very closely. There are plenty of other books on machine learning, such as:

  • An Introduction to Statistical Learning by James, Witten, Hastie, and Tibshirani, Springer.
  • Bayesian Reasoning and Machine Learning by Barber, Cambridge University Press.
  • Machine Learning: The Art and Science of Algorithms that Make Sense of Data by Flach, Cambridge University Press.
  • Introduction to Machine Learning by Alpaydin, MIT Press.

Nowadays, a student’s best friend is YouTube: you can learn almost anything from YouTube or various other online resources, including recorded lectures and texts covering the needed mathematical background.

Machine Learning: Summer 2018

  BRACU Class                            Days    Time       Room
  CSE 427: Machine Learning, Section 1   Su, Tu  5:00-6:20  UB30101

Lecture Notes

  1. Lecture 1
  2. Lecture 3
  3. Lecture 4
  4. Lecture 5
  5. Lecture 6
  6. Lecture 7: VC Dimension (complete!)
  7. Lecture 8: Nonuniform Learnability (complete!)
  8. Lecture 9: Computational Complexity of Learning
  9. Lecture 10: Linear Predictors (Part I)
  10. Lecture 10: Linear Predictors (Part II)
  11. Lecture 11: Boosting
  12. Lecture 11: Boosting (slides)
  13. Lecture 12: Convexity
  14. Lecture 13: More techniques

Problem Sets

Lecture Topics

  1. What is Machine Learning? Examples and Motivation.
  2. Continued Motivation and Basic Definitions
  3. ERM Predictors
  4. PAC Learnability
  5. Uniform Convergence and PAC Learning
  6. No Free Lunch Theorem
  7. Problem Solving Class: Finding m.
  8. Problem Solving Class: More examples
  9. VC Dimension
  10. Review
  11. Fundamental Theorem of Statistical Learning
  12. Sauer’s Lemma
  13. Nonuniform Learnability
  14. Computational Complexity of Learning
  15. Linear Predictors
  16. Boosting
  17. Convexity
  18. More techniques, SVMs, Kernels, etc.