Fields Academy Shared Graduate Course: Deep Learning for Natural Scientists
Description
Instructor: Prof. Joel Zylberberg
Email: joelzy@yorku.ca
Course Dates: September 6th - December 6th, 2023
Mid-Semester Break: October 9th-13th, 2023
Lecture Times: Mondays, Wednesdays, & Fridays | 11:30 AM - 12:20 PM (ET)
Office Hours: Mondays & Wednesdays | 3:00 PM - 4:00 PM (ET) | Location - York University, Keele Campus, Petrie Building, Room 243 & via Zoom
Registration Fee: Principal Sponsoring University (PSU) Students - Free | Other Students - CAD $500
Capacity Limit: 35 students
Format: Hybrid synchronous delivery
- In-Person - York University, Keele Campus, Lumbers Hall, Room 306
- Online - via Zoom
Course Description
Please view the course outline HERE.
Prerequisites: (recommended) Multivariable Calculus; Linear Algebra; Introduction to Computer Programming
Evaluation:
The tentative rubric for this course is:
- Homework Assignments [4 Assignments] - 30%
- Seminar / Paper Presentation - 10%
- Final Project: Proposal - 5%
- Final Project: Status Update - 10%
- Final Project: Final Report - 30%
- Final Project: Final Presentation - 15%
Homework Assignments: Four programming assignments will give students hands-on experience implementing and evaluating simple machine learning algorithms. Students are encouraged to use Google Colab for these assignments: Colab is a free, web-based tool that lets students run Python notebooks on Google servers equipped with both CPUs and GPUs. This spares students from having to set up and maintain their own Python environments, and from having to own expensive GPU-enabled computers.
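For example, a notebook can confirm that a GPU runtime is active before training. The snippet below is a minimal sketch assuming PyTorch (pre-installed on Colab); the course assignments may use a different framework.

```python
# Minimal check that a Colab GPU runtime is available (assumes PyTorch,
# which comes pre-installed on Colab; other frameworks have similar checks).
import torch

device = torch.device("cuda" if torch.cuda.is_available() else "cpu")
print(f"Training will run on: {device}")

# Run a small tensor operation on the chosen device as a sanity check.
x = torch.randn(3, 3, device=device)
print(x @ x.T)
```

In Colab, a GPU can be enabled under Runtime > Change runtime type.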
Seminar / Paper Presentation: Each student will identify one recent journal article (published in 2015 or later) that applies machine learning to a scientific research topic of their choosing. They will lead a class discussion of that paper: introducing the scientific problem addressed in the paper; explaining the machine learning method used; and discussing the effectiveness of that solution. These presentations will take place in mid-to-late November.
Final Project: Students will define a problem that can be solved using deep learning. They will implement a deep learning solution and evaluate that solution.
The project could focus on a “standard” machine learning problem (e.g., develop a new kind of artificial neuron “unit” and test networks built with that unit on MNIST or CIFAR-10 classification tasks). Students are also encouraged to define their own problem, possibly one related to their thesis project.
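As an illustration of the “standard” option, a custom unit can be written as a small module and dropped into an otherwise ordinary classifier. The sketch below assumes PyTorch, and the unit shown (a ReLU variant with a learnable negative slope) is purely hypothetical, not a required design.

```python
# Hypothetical custom "unit": a ReLU-like activation with a learnable
# negative slope, inserted into a small MNIST-sized classifier (assumes PyTorch).
import torch
import torch.nn as nn

class LearnableSlopeUnit(nn.Module):
    """Toy custom unit: acts like ReLU for x >= 0, scales negatives by a learned slope."""
    def __init__(self):
        super().__init__()
        self.slope = nn.Parameter(torch.tensor(0.1))

    def forward(self, x):
        return torch.where(x >= 0, x, self.slope * x)

model = nn.Sequential(
    nn.Flatten(),               # 28x28 images -> 784-dim vectors
    nn.Linear(28 * 28, 128),
    LearnableSlopeUnit(),       # the new unit goes where ReLU normally would
    nn.Linear(128, 10),         # 10 MNIST classes
)

# Sanity check on a batch of random MNIST-shaped inputs.
logits = model(torch.randn(8, 1, 28, 28))
print(logits.shape)  # torch.Size([8, 10])
```

(This particular unit is essentially PyTorch’s built-in PReLU; a real project would explore a design that is not already available off the shelf.)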
Students will be required to submit a 1-page written proposal (Deadline: Oct. 20), a 2-page written status update (Deadline: Nov. 17), and their final written report (5-10 pages, Deadline: Dec. 13). Finally, students will be required to give a 20-minute oral presentation at the end of the term where they will share their work with the class.
Resources:
The course textbook will be:
- Goodfellow, Bengio, and Courville. Deep Learning. MIT Press, 2016. (https://www.deeplearningbook.org/)
Research papers and other resources will be posted.
Course Content: (Chapter numbers refer to the Goodfellow, Bengio, and Courville Deep Learning book)
- Machine Learning Basics (Ch. 5)
- Deep Feedforward Networks (Ch. 6)
- Regularization for Deep Learning (Ch. 7)
- Optimization for Training Deep Models (Ch. 8)
- Convolutional Networks (Ch. 9)
- Sequence Modeling: Recurrent and Recursive Nets (Ch. 10)
- Attention and Transformers
- Recent Applications of Deep Learning in the Natural Sciences (papers to be provided)