ACL2022_deepnets_tutorial

Materials for ACL-2022 tutorial: A Gentle Introduction to Deep Nets and Opportunities for the Future

License: Creative Commons Zero v1.0 Universal (CC0-1.0)

T1: A Gentle Introduction to Deep Nets and Opportunities for the Future

Abstract

The first half of this tutorial will make deep nets more accessible to a broader audience, following “Deep Nets for Poets” and “A Gentle Introduction to Fine-Tuning.” We will also introduce GFT (general fine-tuning), a little language for fine-tuning deep nets with short (one-line) programs that are as easy to code as regression in statistics packages such as R using glm (general linear models). Based on the success of these methods on a number of benchmarks, one might come away with the impression that deep nets are all we need. However, we believe the glass is half full: while there is much that can be done with deep nets, there is always more to do. The second half of this tutorial will discuss some of these opportunities.

Quick Links:

  1. Videos 📽️
    1. 📽️ 10 minute TEASER (for both halves)
    2. 🆕📽️ First half (1 hour 16 minutes) UNABRIDGED (YouTube); mirror of above (Bilibili)
  2. gft code
  3. paper
  4. longer journal paper on gft (general fine-tuning)
  5. Slides
    1. pdf
    2. 3 pptx files and 1 more pptx file

Part A: Half Full / Part B: Half Empty

Presenters:

  1. Ken Church, Baidu, USA
  2. Valia Kordoni, Humboldt-Universität zu Berlin, Germany
  3. Yanjun Ma, Baidu, China
  4. Gary Marcus, New York University
  5. Zeyu Chen, Baidu, China
  6. Ernest Davis, New York University