marcomoldovan's Stars
microsoft/MSMARCO-Document-Ranking
MS MARCO (Microsoft Machine Reading Comprehension) is a large-scale dataset focused on machine reading comprehension, question answering, and passage/document ranking
microsoft/MSMARCO-Question-Answering
MS MARCO (Microsoft Machine Reading Comprehension) is a large-scale dataset focused on machine reading comprehension and question answering
richliao/textClassifier
Text classifier for Hierarchical Attention Networks for Document Classification
ahkarami/Deep-Learning-in-Production
In this repository, I will share some useful notes and references about deploying deep learning-based models in production.
dangkhoasdc/awesome-ai-residency
List of AI Residency Programs
LiqunW/Long-document-dataset
isekulic/longformer-marco
Longformer for MS MARCO document re-ranking task.
amazon-science/wqa_tanda
This repo provides code and data used in our TANDA paper.
google-research/text-to-text-transfer-transformer
Code for the paper "Exploring the Limits of Transfer Learning with a Unified Text-to-Text Transformer"
deepampatel/TwinBert
PyTorch implementation of the TwinBERT paper
jingtaozhan/RepBERT-Index
RepBERT is a competitive first-stage retrieval technique. It represents documents and queries with fixed-length contextualized embeddings, and their inner products are used as relevance scores. Its efficiency is comparable to that of bag-of-words methods.
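The description above captures the dual-encoder idea behind RepBERT: fixed-length embeddings for query and document, scored by their inner product. A minimal sketch of that scoring scheme follows; it is not RepBERT's actual code, and the mean-pooled BERT outputs, model name, and max length are illustrative assumptions.

```python
# Sketch of dual-encoder scoring: fixed-length embeddings, inner product as relevance.
# Mean pooling over BERT outputs is an assumption, not RepBERT's exact pooling.
import torch
from transformers import AutoTokenizer, AutoModel

tokenizer = AutoTokenizer.from_pretrained("bert-base-uncased")
encoder = AutoModel.from_pretrained("bert-base-uncased")

def embed(text: str) -> torch.Tensor:
    """Return a fixed-length embedding by mean-pooling the last hidden states."""
    inputs = tokenizer(text, return_tensors="pt", truncation=True, max_length=256)
    with torch.no_grad():
        hidden = encoder(**inputs).last_hidden_state      # (1, seq_len, hidden)
    mask = inputs["attention_mask"].unsqueeze(-1)          # (1, seq_len, 1)
    return (hidden * mask).sum(dim=1) / mask.sum(dim=1)    # (1, hidden)

query_vec = embed("what is machine reading comprehension")
doc_vec = embed("Machine reading comprehension asks a system to answer questions about a passage.")
relevance = (query_vec * doc_vec).sum().item()             # inner product used as the relevance score
print(relevance)
```

Because document embeddings are query-independent, they can be precomputed and indexed, which is what makes this kind of first-stage retrieval efficient.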
google-research/language
Shared repository for open-sourced projects from the Google AI Language team.
liangsi03/hibert_model
mhagiwara/100-nlp-papers
100 Must-Read NLP Papers
rajpurkar/SQuAD-explorer
Visually Explore the Stanford Question Answering Dataset
allenai/bi-att-flow
The Bi-directional Attention Flow (BiDAF) network is a multi-stage hierarchical process that represents the context at different levels of granularity and uses a bi-directional attention flow mechanism to obtain a query-aware context representation without early summarization.
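A hedged sketch of the attention-flow layer that description refers to (not the allenai implementation): a shared similarity matrix drives context-to-query and query-to-context attention, which are concatenated into a query-aware representation of each context word. Tensor shapes and the trilinear similarity form follow the BiDAF paper; everything else here is illustrative.

```python
# Sketch of a bi-directional attention flow layer in PyTorch.
import torch
import torch.nn as nn
import torch.nn.functional as F

class AttentionFlow(nn.Module):
    def __init__(self, hidden: int):
        super().__init__()
        # trilinear similarity: sim(h, u) = w^T [h; u; h * u]
        self.w = nn.Linear(3 * hidden, 1, bias=False)

    def forward(self, H: torch.Tensor, U: torch.Tensor) -> torch.Tensor:
        # H: (batch, T, d) context encodings, U: (batch, J, d) query encodings
        T, J = H.size(1), U.size(1)
        H_exp = H.unsqueeze(2).expand(-1, -1, J, -1)        # (batch, T, J, d)
        U_exp = U.unsqueeze(1).expand(-1, T, -1, -1)        # (batch, T, J, d)
        S = self.w(torch.cat([H_exp, U_exp, H_exp * U_exp], dim=-1)).squeeze(-1)  # (batch, T, J)

        # context-to-query: each context word attends over all query words
        U_tilde = torch.bmm(F.softmax(S, dim=2), U)         # (batch, T, d)

        # query-to-context: attend to the context words most relevant to any query word
        b = F.softmax(S.max(dim=2).values, dim=1)           # (batch, T)
        h_tilde = torch.bmm(b.unsqueeze(1), H)              # (batch, 1, d)
        H_tilde = h_tilde.expand(-1, T, -1)                 # (batch, T, d)

        # query-aware representation of each context word (no early summarization)
        return torch.cat([H, U_tilde, H * U_tilde, H * H_tilde], dim=-1)  # (batch, T, 4d)
```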
zzj0402/NQinSQuAD
Google's Natural Question dataset in SQuAD format
sohamray19/distilBertNQ
Question answering on the Google Natural Questions data corpus using HuggingFace's Transformers library
infinite007/Natural_Questions
Solving the problem of multi-hop open-domain question answering on the Natural Questions dataset created by Google.
google-research-datasets/natural-questions
Natural Questions (NQ) contains real user questions issued to Google search, and answers found from Wikipedia by annotators. NQ is designed for the training and evaluation of automatic question answering systems.
dmolony3/SMITH
PyTorch implementation of SMITH (Siamese Multi-depth Transformer-based Hierarchical encoder)
pandeykartikey/Hierarchical-Attention-Network
Implementation of Hierarchical Attention Networks in PyTorch
uvipen/Hierarchical-attention-networks-pytorch
Hierarchical Attention Networks for document classification
HLTCHKUST/dialogue-emotion
Hierarchical Attention for Dialogue Emotion Classification (SemEval, NAACL)
nlpyang/hiersumm
Code for the ACL 2019 paper "Hierarchical Transformers for Multi-Document Summarization"
Hellisotherpeople/CX_DB8
A contextual, biasable, word-, sentence-, or paragraph-level extractive summarizer powered by the latest text embeddings (BERT, Universal Sentence Encoder, Flair)
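A minimal sketch of the general idea that description points at (embedding-based, biasable extractive summarization), not CX_DB8's actual pipeline: embed each candidate sentence and a bias/query string, then keep the sentences most similar to the bias vector. The model name, top_k value, and example sentences are illustrative assumptions.

```python
# Sketch of query-biasable extractive summarization with sentence embeddings.
from sentence_transformers import SentenceTransformer, util

model = SentenceTransformer("all-MiniLM-L6-v2")  # assumed embedding model

def biased_extractive_summary(sentences, bias_query, top_k=3):
    sent_emb = model.encode(sentences, convert_to_tensor=True)
    bias_emb = model.encode(bias_query, convert_to_tensor=True)
    scores = util.cos_sim(bias_emb, sent_emb)[0]              # similarity of each sentence to the bias
    top = scores.topk(k=min(top_k, len(sentences))).indices.tolist()
    return [sentences[i] for i in sorted(top)]                # keep original sentence order

summary = biased_extractive_summary(
    ["Transformers dominate NLP benchmarks.",
     "The weather was pleasant that day.",
     "Pre-trained language models transfer well to ranking tasks."],
    bias_query="neural ranking models",
    top_k=2,
)
print(summary)
```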
huggingface/transformers
🤗 Transformers: State-of-the-art Machine Learning for PyTorch, TensorFlow, and JAX.
flairNLP/flair
A very simple framework for state-of-the-art Natural Language Processing (NLP)
lipiji/hierarchical-encoder-decoder
Hierarchical encoder-decoder framework for sequences of words, sentences, paragraphs and documents using LSTM and GRU in Theano
sanyam5/skip-thoughts
The first public PyTorch implementation of Skip-Thought Vectors