
CS260 [Fall 2022] Machine Learning Algorithms


This course introduces the foundational concepts and algorithms of machine learning and deep learning. The goal of this course is to equip students with a) a solid understanding of the foundational concepts of machine learning, and b) modern machine learning techniques such as deep learning. Topics to be covered include empirical risk minimization, PAC learning, agnostic PAC learning, the perceptron, linear regression, boosting, stochastic gradient descent, support vector machines, multi-layer perceptrons, convolutional neural networks, recurrent neural networks, and attention mechanisms. Slides and homework assignments will be released on this website. Homework solutions will only be released on Bruinlearn.
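To give a flavor of how two of the topics above fit together, here is a rough, hypothetical sketch (not course material; the data, learning rate, and epoch count are all illustrative) of empirical risk minimization for one-dimensional linear regression under the squared loss, minimized with stochastic gradient descent in plain Python:

```python
import random

# Illustrative sketch: fit y = w*x by minimizing the empirical risk
# (average squared loss over the training set) with stochastic gradient
# descent, i.e. one gradient step per training example.

random.seed(0)
true_w = 2.0  # hypothetical ground-truth weight used to generate data
data = [(x / 10.0, true_w * (x / 10.0)) for x in range(-50, 51)]

w = 0.0    # initial parameter
lr = 0.01  # learning rate (step size)

for epoch in range(50):
    random.shuffle(data)          # SGD visits examples in random order
    for x, y in data:
        # gradient of the per-example squared loss (w*x - y)^2 w.r.t. w
        grad = 2.0 * (w * x - y) * x
        w -= lr * grad

print(round(w, 3))  # converges very close to the true weight 2.0
```

With noiseless data and a small step size, each SGD pass contracts the error `w - true_w`, so the estimate converges to the true weight; the course develops when and how fast such convergence holds in general.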


Prerequisites

Calculus, linear algebra, probability and statistics, and Python programming.


Programming Language

Python, PyTorch


Grading Policy

Grades will be computed based on the following factors:


| # | Date | Topics | Reading | Homework |
|---|------|--------|---------|----------|
| 1 | 9/26 | Introduction (slides) | Chapter 1, 2.1 of [SSBD] | |
| 2 | 9/28 | Empirical Risk Minimization, PAC Learning (slides) | Chapter 2 of [SSBD] | HW1 Out (LaTeX Template) |
| | 9/30 | TA Session Week 1 (1A slides) (1B slides) (1C slides) | | |
| 3 | 10/3 | Agnostic PAC Learning (slides) | Chapters 3, 4 of [SSBD] | |
| 4 | 10/5 | Bias-Complexity Tradeoff (slides) | Chapters 5, 11 of [SSBD] | |
| | 10/7 | TA Session Week 2 (1A slides) (1B slides) (1C slides) | | |
| 5 | 10/10 | Perceptron, Linear Regression (slides) | Chapters 9, 19 of [SSBD] | HW1 Due |
| 6 | 10/12 | Boosting (slides) | Chapter 10 of [SSBD] | HW2 Out |
| | 10/14 | TA Session Week 3 (1A slides) (1B slides) (1C slides) | | |
| 7 | 10/17 | Boosting, Convex Learning and SGD (slides) | Chapters 12, 14 of [SSBD] | |
| 8 | 10/19 | AI4Database Guest Lecture | | |
| | 10/21 | TA Session Week 4 (1A slides) (1B slides) (1C slides) | | HW2 Due |
| 9 | 10/24 | Convex Learning and SGD (slides) | Chapters 12, 14 of [SSBD] | HW3 Out |
| 10 | 10/26 | Regularization and Stability, Support Vector Machines (slides) | Chapters 13, 15 of [SSBD] | |
| | 10/28 | TA Session Week 5 (1A slides) (1B slides) (1C slides) | | |
| 11 | 10/31 | Kernel Methods (slides) | Chapter 16 of [SSBD] | |
| 12 | 11/2 | Multi-layer Perceptron I (slides) | Chapters 4, 5 of [ZLLS] | HW3 Due |
| | 11/4 | TA Session Week 6 (1B slides) | | |
| | 11/7 | Midterm Exam | | HW4 Out |
| 13 | 11/9 | Multi-layer Perceptron II (slides) | Chapters 4, 5 of [ZLLS] | |
| | 11/11 | TA Session Week 7 | | |
| 14 | 11/14 | Convolutional Neural Networks I (slides) | Chapter 7 of [ZLLS] | |
| 15 | 11/16 | Convolutional Neural Networks II (slides) | Chapter 8 of [ZLLS] | HW4 Due, HW5 Out |
| | 11/18 | TA Session Week 8 (1B slides) | | |
| 16 | 11/21 | Recurrent Neural Networks I (slides) | Chapter 9 of [ZLLS] | |
| 17 | 11/23 | Recurrent Neural Networks II (slides) | Chapter 10 of [ZLLS] | HW5 Due, HW6 Out |
| | 11/28 | Canceled Due to NeurIPS | | |
| 18 | 11/30 | Attention Mechanisms (slides) | Chapter 11 of [ZLLS] | |
| | 12/2 | TA Session Week 10 (slides) | | |
| | 12/7 | | | HW6 Due |
| | 12/8 | Final Project Presentation | | |
| | 12/11 | | | Project Report/Slides Due |

Academic Integrity Policy

Students are encouraged to read the UCLA Student Conduct Code for Academic Integrity.


Homework

There will be about 5 homework assignments during the semester as we cover the corresponding material. Homework consists of mathematical derivations, algorithm analysis, and programming. Homework must be written in LaTeX; a LaTeX homework template can be found here. Your lowest homework score will be dropped.

Unless otherwise indicated, you may discuss the homework problems with other students, but each student must hand in their own answers and write their own code for the programming part. You must also indicate on each homework with whom you collaborated and cite any other sources you use, including Internet sites. Students may not use solution sets from previous offerings of this class or the textbook's solution manual under any circumstances.

Homework assignments will be submitted through Gradescope.

Please submit your homework on time. Homework is worth full credit before the due date and zero credit after it.


Midterm Exam

There will be one in-class midterm exam on Nov 7 (see the course schedule).


Quizzes

There will be 6 in-class pop quizzes for the purpose of reviewing newly learned concepts. The quizzes are closed-book and closed-notes; no electronic aids or cheat sheets are allowed. Your lowest quiz score will be dropped.


Course Project

Students are required to complete a project in this class. The goal of the course project is to provide you with an opportunity to either do machine learning research or solve a real-world problem using machine learning.

The best outcome of the project is a manuscript publishable at a major machine learning conference (COLT, ICML, NeurIPS, ICLR, AISTATS, UAI, etc.) or journal (Journal of Machine Learning Research, Machine Learning).