Collection of Natural Language Processing Models and References
PAPER: Attention Is All You Need
POST: The Illustrated Transformer
POST: Transformer Architecture: Attention Is All You Need
PAPER: BERT: Pre-training of Deep Bidirectional Transformers for Language Understanding
VIDEO (Korean ver.): PR-121: BERT: Pre-training of Deep Bidirectional Transformers for Language Understanding
VIDEO (Korean ver.): #BERT #ShockingPaper #ClovaAI
VIDEO (Korean ver.): BERT Seminar - TmaxData NLP, Minho Park
VIDEO: CS224n: Natural Language Processing with Deep Learning
VIDEO (Korean ver.): Natural Language Processing with Deep Learning - Prof. Kyunghyun Cho (NYU)
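The references above all build on the Transformer's core operation, scaled dot-product attention. As a quick companion to the papers and videos, here is a minimal NumPy sketch of that operation; the function name, shapes, and toy data are illustrative choices, not taken from any of the linked sources.

```python
import numpy as np

def scaled_dot_product_attention(Q, K, V):
    """softmax(Q K^T / sqrt(d_k)) V, as in "Attention Is All You Need"."""
    d_k = Q.shape[-1]
    scores = Q @ K.T / np.sqrt(d_k)               # similarity of each query to each key
    scores -= scores.max(axis=-1, keepdims=True)  # subtract row max for numerical stability
    weights = np.exp(scores)
    weights /= weights.sum(axis=-1, keepdims=True)  # row-wise softmax: weights sum to 1
    return weights @ V                            # weighted average of the value vectors

# Toy example (illustrative): 2 queries, 3 key/value pairs, dimension d_k = 4
rng = np.random.default_rng(0)
Q = rng.standard_normal((2, 4))
K = rng.standard_normal((3, 4))
V = rng.standard_normal((3, 4))
out = scaled_dot_product_attention(Q, K, V)
print(out.shape)  # one output vector per query: (2, 4)
```

Each output row is a convex combination of the rows of V, with weights determined by query-key similarity; the 1/sqrt(d_k) scaling keeps the softmax from saturating as the dimension grows.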