Pinned Repositories
DSSA
Document Sequence with Subtopic Attention
FLARE
Forward-Looking Active REtrieval-augmented generation (FLARE)
LAMA
LAnguage Model Analysis
lm-calibration
LPAQA
Language model Prompt And Query Archive
oie_rank
Iterative Rank-Aware Open IE
OmniTab
Pretraining with Natural and Synthetic Data for Few-shot Table-based Question Answering
ReAtt
Retrieval as Attention
RelEnt
X-FACTR
jzbjyb's Repositories
jzbjyb/oie_rank
Iterative Rank-Aware Open IE
jzbjyb/LAMA
LAnguage Model Analysis
jzbjyb/imagehash
A Python Perceptual Image Hashing Module
jzbjyb/clickmodels
jzbjyb/allennlp
An open-source NLP research library, built on PyTorch.
jzbjyb/attention-is-all-you-need-pytorch
A PyTorch implementation of the Transformer model in "Attention is All You Need".
jzbjyb/globalLinearModel
jzbjyb/gumbel-softmax
Categorical variational autoencoder using the Gumbel-Softmax estimator
jzbjyb/hotTopic
jzbjyb/language-style-transfer
jzbjyb/rri
jzbjyb/rri_match
jzbjyb/supervised-oie
Code for training a Neural Open IE model (NAACL 2018)
jzbjyb/tensorflow
Computation using data flow graphs for scalable machine learning
jzbjyb/Tensorflow-101
TensorFlow Tutorials
jzbjyb/weibo
jzbjyb/word2vec
This tool provides an efficient implementation of the continuous bag-of-words and skip-gram architectures for computing vector representations of words. These representations can be subsequently used in many natural language processing applications and for further research.