Deep Learning

Brown University – CSCI 1470 – Fall 2017


Over the past few years, Deep Learning has become a popular area of research, with deep neural network methods obtaining state-of-the-art results on applications in computer vision (self-driving cars), natural language processing (Google Translate), and reinforcement learning (AlphaGo). This course aims to give students a practical understanding of the field of Deep Learning, through lectures and labs covering both the theory and application of neural networks to the above areas (and more!). We introduce students to the core concepts of deep neural networks, including the backpropagation algorithm for training them, as well as specific operations like convolution (in the context of computer vision), and word embeddings and recurrent neural networks (in the context of natural language processing). We also teach the Tensorflow framework for expressing deep neural network models.
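As a taste of the core concepts named above (forward pass, cross-entropy loss, gradient descent), here is a minimal NumPy sketch of a single-layer softmax classifier on toy data, in the spirit of the course's first MNIST w/ NumPy assignment. This is an illustrative sketch, not course-provided code; all names, sizes, and hyperparameters here are assumptions.

```python
import numpy as np

def softmax(z):
    # Subtract the row-wise max for numerical stability before exponentiating.
    e = np.exp(z - z.max(axis=1, keepdims=True))
    return e / e.sum(axis=1, keepdims=True)

def cross_entropy(probs, labels):
    # Mean negative log-probability assigned to the correct class.
    n = labels.shape[0]
    return -np.log(probs[np.arange(n), labels]).mean()

rng = np.random.default_rng(0)
X = rng.normal(size=(8, 4))       # 8 toy examples, 4 features (illustrative)
y = rng.integers(0, 3, size=8)    # 3 classes
W = np.zeros((4, 3))              # weights
b = np.zeros(3)                   # biases
lr = 0.5                          # learning rate

for _ in range(100):
    probs = softmax(X @ W + b)    # forward pass
    loss = cross_entropy(probs, y)
    # Backward pass: the gradient of the mean cross-entropy with respect to
    # the logits is (probs - one_hot(y)) / n; chain through the linear layer.
    grad = probs.copy()
    grad[np.arange(8), y] -= 1.0
    grad /= 8
    W -= lr * (X.T @ grad)        # gradient-descent update
    b -= lr * grad.sum(axis=0)

print(loss)  # decreases from ln(3) (uniform predictions) toward 0
```

With the weights initialized to zero, the initial predictions are uniform and the loss starts at ln(3); each gradient step lowers it, since this loss is convex in W and b.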

Professor: Eugene Charniak (ec)
Time & Location: M/W/F 12–12:50pm in Salomon 001
Documents: Course Missive
Contact course staff:


Fill out this new form for lab sections:
Please complete this form by 11:59pm on Friday, September 15th.

Sign the collaboration policy Google form by Friday, September 15th — we can't grade your work unless you sign this!


Date Topic Announcements
Week 1 Introduction
Wed, 9/6 Introduction to Deep Learning!
Fri, 9/8 Introduction to Feed-Forward Neural Networks
Lab No Lab - Sign Up for Lab Sections!
Week 2 Feed-Forward Neural Networks
Mon, 9/11 Feed-Forward Networks (Architecture + Forward Pass)
Wed, 9/13 Feed-Forward Networks (Derivation of Cross-Entropy Loss + Motivation)
Fri, 9/15 Gradient Descent + Backpropagation
Lab (Optional) Python Setup, Python Tutorial + Linear Algebra Refresher
Week 3 Feed-Forward Neural Networks + Tensorflow Book Chapter 2
Mon, 9/18 Backprop (continued), Matrix Representation of Feed-Forward Networks (Due: MNIST w/ NumPy; Out: MNIST w/ Tensorflow)
Wed, 9/20 More Matrices + Introduction to Tensorflow
Fri, 9/22 Tensorflow Continued (Motivation of Sessions, the Computation Graph)
Lab [Code Tutorial] Tensorflow Setup, Hello World, Getting Comfortable with the TF API
Week 4 Convolutional Neural Networks (Week 1) Book Chapter 3
Mon, 9/25 Introduction to CNNs Due: MNIST w/ Tensorflow
Wed, 9/27 What is Convolution? Intuitive Explanation
Fri, 9/29 More Convolution: Pooling, Strides
Lab [Visualization] Visualizing Convolution in Neural Networks: What are our Filters actually doing?
Week 5 Word Embeddings + Language Modeling (Week 1) Book Chapter 4
Mon, 10/2 Introduction to Language Modeling + Perplexity
Wed, 10/4 Distributed/Vector Representations of Words (Motivation: Sparse to Dense Representations)
Fri, 10/6 Guest Lectures - Deep Learning Research on Campus
Lab [Case-Study] Training Very Deep Convolutional Networks
Week 6 Word Embeddings + Language Models (Week 2) Book Chapter 4
Mon, 10/9 NO LECTURE - Indigenous Peoples' Day
Wed, 10/11 Feed-Forward Bigram/N-Gram Language Modeling
Fri, 10/13 Language and Perplexity
Lab Recurrent Neural Networks
Week 7 RNNs + Sequence-to-Sequence Models (Week 1) Book Chapter 5
Mon, 10/23 Introduction to Recurrent Neural Networks
Wed, 10/25 More Recurrent Neural Networks - GRU Cells
Fri, 10/27 Recurrent Neural Network Conclusion
Lab [Case-Study] Word2Vec + Language Modeling


Professor: Eugene Charniak (ec)

HTA: Sidd Karamcheti (skaramch)

UTA: Arun Drelich (adrelich)

UTA: Dilip Arumugam (darumuga)

UTA: Evan Cater (ecater1)

UTA: Jacob Beck (jab11)

UTA: Melrose Roderick (maroderi)

UTA: Nathaniel Brennan (nbrennan)

UTA: Raphael Kargon (rkargon)

UTA: Sean Segal (ss97)

UTA: Zhenhao Hou (zhou)