CSCI 2952-M
The Works that Made and Changed Machine Learning
(Fall 2021)

Course Description
This seminar is aimed at current and prospective graduate students who want to gain technical depth and perspective on the field of statistical machine learning. Students will read, present, and discuss some of the original papers that had a transformative impact on the development of machine learning. Topics range from mathematical foundations to major algorithmic advances, and on to breakthrough works on deep learning and its applications in vision and NLP. Ideal students will have a mix of: 1) motivation to learn how to read, present, and evaluate technical papers, 2) mathematical maturity and basic ML background, and 3) willingness to participate in and contribute to discussions. Enrollment will be limited and will be finalized after the first class. No formal prerequisites.
Essential Info
Instructor: Eli Upfal
Class Meetings: Mondays, 3:00-5:00pm, CIT 316.
Prerequisites: No formal prerequisites. Previous experience in statistical machine learning (e.g., CSCI 1550 or equivalent research experience) is recommended.
IMPORTANT: Check Canvas for additional information on the first lecture (including Zoom link).
Important Links
Canvas for discussions and additional class resources.
For questions, discussion, and other course-related posts, use Canvas.
If you have an atypical question that you are certain does not belong on Canvas, email the instructor.
Background Material
A.M. Turing. Computing machinery and intelligence. Original [PDF], easier to read [PDF].
P. W. L. Fong. How to Read a CS Research Paper?
S. Keshav. How to Read a Paper.
Course Format
1. Paper(s) assigned for each meeting will be posted on the course website a week before the meeting.
2. Students will read the assigned paper(s) and submit a 1-2 page writeup before the meeting, answering the following questions:
   a. What is the main contribution of the paper?
   b. Why is the contribution important (or not)?
   c. What is left unsolved by the paper? What follow-up research would you suggest?
3. Each student will give at least one class presentation.
4. Presentations will be followed by a class discussion.
Course Papers
The seminar will cover a subset of the following papers, and possibly other papers suggested by the participants.
Perceptron (Before Deep Learning...)
SVM - Support Vector Machine (The First Modern ML Algorithm?)
VC – PAC Learning (The theory)
Boosting - ADABoost
Rademacher Complexity (The “modern” theory)
Transfer Learning
Deep Learning
NLP – Natural Language Processing
Reinforcement Learning