Pinned Repositories
BERT-QE
Code and resources for the paper "BERT-QE: Contextualized Query Expansion for Document Re-ranking".
DACON-Competition1
Korean document extractive summarization AI competition
DACON-Competition2
Novel author classification AI competition - top 2%, final 1st place
kobart-transformers
KoBART model on huggingface transformers
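As a quick illustration (not this repository's own code), a minimal sketch of loading a KoBART checkpoint through huggingface transformers; the checkpoint id "gogamza/kobart-base-v2" is an assumption, and the repo may expose its own loader helpers instead.

from transformers import PreTrainedTokenizerFast, BartForConditionalGeneration

# Assumed Hugging Face hub checkpoint id for KoBART.
model_id = "gogamza/kobart-base-v2"
tokenizer = PreTrainedTokenizerFast.from_pretrained(model_id)
model = BartForConditionalGeneration.from_pretrained(model_id)

# Encode a Korean input sentence and generate a short output sequence.
inputs = tokenizer("한국어 문장을 요약해 봅니다.", return_tensors="pt")
output_ids = model.generate(inputs["input_ids"], max_length=32, num_beams=4)
print(tokenizer.decode(output_ids[0], skip_special_tokens=True))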
KoELECTRA
Pretrained ELECTRA Model for Korean
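A minimal sketch of using the pretrained Korean ELECTRA weights via transformers; the checkpoint id "monologg/koelectra-base-v3-discriminator" is an assumption, so check the KoELECTRA README for the exact released ids.

import torch
from transformers import ElectraModel, ElectraTokenizer

# Assumed checkpoint id for the base-size Korean ELECTRA discriminator.
model_id = "monologg/koelectra-base-v3-discriminator"
tokenizer = ElectraTokenizer.from_pretrained(model_id)
model = ElectraModel.from_pretrained(model_id)

# Encode a Korean sentence and take the contextual token embeddings.
inputs = tokenizer("한국어 ELECTRA 모델을 불러옵니다.", return_tensors="pt")
with torch.no_grad():
    outputs = model(**inputs)
print(outputs.last_hidden_state.shape)  # (batch, seq_len, hidden_size)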
NLP-Study
NLP study & paper reviews
Projects
School coursework projects
RNN-example
Topic classification using LSTM
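A generic sketch of the idea, assuming a PyTorch implementation (not necessarily this repository's exact architecture or hyperparameters): embed token ids, run an LSTM, and classify the topic from the final hidden state.

import torch
import torch.nn as nn

class LSTMTopicClassifier(nn.Module):
    """Embeds token ids, runs an LSTM, and classifies from the final hidden state."""

    def __init__(self, vocab_size, embed_dim, hidden_dim, num_classes):
        super().__init__()
        self.embedding = nn.Embedding(vocab_size, embed_dim, padding_idx=0)
        self.lstm = nn.LSTM(embed_dim, hidden_dim, batch_first=True)
        self.fc = nn.Linear(hidden_dim, num_classes)

    def forward(self, token_ids):
        embedded = self.embedding(token_ids)  # (batch, seq_len, embed_dim)
        _, (hidden, _) = self.lstm(embedded)  # hidden: (1, batch, hidden_dim)
        return self.fc(hidden[-1])            # (batch, num_classes)

# Toy forward pass: batch of 2 sequences, 8 tokens each, 5 topic classes.
model = LSTMTopicClassifier(vocab_size=10000, embed_dim=128, hidden_dim=256, num_classes=5)
logits = model(torch.randint(1, 10000, (2, 8)))
print(logits.shape)  # torch.Size([2, 5])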
transformers
🤗 Transformers: State-of-the-art Natural Language Processing for PyTorch and TensorFlow 2.0.
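For reference, the library's high-level pipeline API covers common tasks in a few lines; the example below uses the default English sentiment model that pipeline() downloads when no model is specified.

from transformers import pipeline

# Downloads a default pretrained sentiment model on first use.
classifier = pipeline("sentiment-analysis")
print(classifier("Transformers makes state-of-the-art NLP easy to use."))
# e.g. [{'label': 'POSITIVE', 'score': 0.99...}]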