timeseries2018

DS-GA 3001.001 Modeling Time Series Data

Lecture

Tue, 3:20-5:00pm, in 60 5th Av, C10

Lab (required for all students)

Thu, 6:45-7:35pm, in 60 5th Av, C12

Instructor

Cristina Savin, csavin@nyu.edu Office hours: Tue, 5:00-6:00pm, Room 608

TA

Yiqiu (Artie) Shen, ys1001@nyu.edu Office hours: Thu, 11am-12pm, Room 660

Overview

This graduate-level course presents fundamental tools for characterizing data with statistical dependencies over time, and for using this knowledge to predict future outcomes. These methods have broad applications, from econometrics to neuroscience. The course emphasizes generative models for time series, and inference and learning in such models. We will cover a range of approaches, including AR(I)MA, Kalman filtering, HMMs, and Gaussian processes, and their application to several kinds of data.
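As a small, self-contained flavor of the first topics on the syllabus (simulating simple stochastic processes and characterizing them with basic statistics), the sketch below simulates an AR(1) process and plots its sample autocorrelation. It is an illustrative Python example, not part of the course materials; the parameter values are arbitrary assumptions.

```python
# Illustrative sketch (not course material): simulate an AR(1) process
# x_t = phi * x_{t-1} + eps_t and look at its sample autocorrelation.
import numpy as np
import matplotlib.pyplot as plt

rng = np.random.default_rng(0)
phi, sigma, n = 0.8, 1.0, 500   # AR coefficient, noise std, series length (arbitrary)

x = np.zeros(n)
for t in range(1, n):
    x[t] = phi * x[t - 1] + rng.normal(scale=sigma)

# Sample autocorrelation up to lag 20; for a stationary AR(1) process it
# decays roughly as phi**lag.
xc = x - x.mean()
acf = np.correlate(xc, xc, mode="full")[n - 1:] / (xc @ xc)

plt.stem(range(21), acf[:21])
plt.xlabel("lag")
plt.ylabel("sample autocorrelation")
plt.show()
```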

Note: the information presented here is tentative; the syllabus may change as the course progresses.

Grading

Problem sets (35%) + midterm exam (25%) + final project (25%) + participation (15%).

Piazza

We will use Piazza to answer questions and post announcements about the course. Students' use of Piazza, in particular answering other students' questions well, will contribute to the participation grade.

Online recordings

Lecture videos will be posted to NYU Classes. Class attendance is still required.

Schedule and detailed syllabus

| Date | Lecture | Extras |
|------|---------|--------|
| Jan. 23 | Lecture 1: Logistics. Introduction. Basic statistics for characterizing time series. | Shumway & Stoffer, Ch. 1 |
| Jan. 25 | Lab 1: Simulating simple stochastic processes. Basic statistics. | |
| Jan. 30 | Lecture 2: AR(I)MA | Shumway & Stoffer, Ch. 3 |
| Feb. 1 | Lab 2: AR(I)MA | Problem set 1 (solution), due Feb. 12 |
| Feb. 6 | Lecture 3: LDS; Kalman filtering | kalmanderivations.pdf; brainstorm project ideas |
| Feb. 8 | Lab 3: Basic probability review. LDS inference | Project proposal due Feb. 27 |
| Feb. 13 | Lecture 4: EM. Particle filtering | LDSlearning.pdf, particlefiltering.pdf |
| Feb. 15 | Lab 4: LDS learning | |
| Feb. 20 | Lecture 5: HMMs | hmm.pdf; Problem set 2, due March 2 |
| Feb. 22 | Lab 5: HMMs | Problem set 3, due March 30 |
| Feb. 27 | Lecture 6: A unified view of linear models. Beyond linear. | Roweis and Ghahramani, 1999 |
| March 1 | Lab 6: Revisiting ARIMA, focus on applications | arima.pdf |
| March 6 | Midterm | |
| March 8 | No lab | |
| March 20 | Guest lecture: State space models in the brain (Il Memming Park) | Email CS for slides |
| March 22 | No lab | |
| March 27 | Lecture 8: GP basics | |
| March 29 | Lab: GP | |
| April 3 | Lecture 9: RNNs (Kyunghyun Cho) | |
| April 5 | Project status discussion | |
| April 12 | Lab: GP | Problem set 4, due April 26 |
| April 17 | Lecture 10: Sparse GP methods | |
| April 19 | No lab | |
| April 24 | Lecture 11: Spectral methods | |
| April 26 | Lab: Spectral methods | |
| May 1 | Project presentations | Instructions; final reports due May 8 |

Bibliography

There is no required textbook. Assigned readings will come from freely-available online material.

Core materials

  • Time Series Analysis and Its Applications, Shumway & Stoffer, 4th edition (freely available PDF)
  • Pattern Recognition and Machine Learning, Bishop
  • Gaussian Processes for Machine Learning, Rasmussen & Williams (materials freely available online, including the gpml library)

Useful extras

Academic honesty

We expect you to try solving each problem set on your own. However, if you are stuck, you should discuss it with other students in the class, subject to the following rules:

  • Brainstorming and verbally discussing the problem with other students is fine, as is going through possible solutions together, but this should not involve one student telling another a complete solution.
  • Once you solve the homework, you must write up your solutions on your own.
  • You must write down the names of anyone with whom you discussed the problem. This will not affect your grade.
  • Do not consult other people's solutions from similar courses.

Late submission policy

Over the full semester you are allowed a maximum of 5 total days of extension on homework assignments. Each late day carries a penalty of 20% off that assignment.