Pinned Repositories
Change-Agent
Change-Agent: Towards Interactive Comprehensive Remote Sensing Change Interpretation and Analysis
ChineseASR
examples
FastBiDAF
lhotse
Math-LLaVA
Code for Math-LLaVA: Bootstrapping Mathematical Reasoning for Multimodal Large Language Models
MG-LLaVA
Official repository for the paper MG-LLaVA: Towards Multi-Granularity Visual Instruction Tuning (https://arxiv.org/abs/2406.17770).
neural-summ-cnndm-pytorch
Notes on the real copy trick: record the source-side OOV words as well. If the vocabulary size is 1000, the OOV ids become 1000, 1001, and so on. Maintain two versions of the encoder input x: one where OOVs are UNK, and one where the UNK ids are replaced by the OOV ids. Because the copy score is computed only against the hidden states, OOV positions still receive attention. At output time, the OOV slots are concatenated onto the 1000-way pred_y, so a word with id 1001 can be copied out as well. The improvement is very large. (A minimal sketch of this mechanism follows the pinned list below.)
nq_model
Multi-span extraction model based on BERT
pan-baidu-download
Baidu Netdisk download script
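The neural-summ-cnndm-pytorch note above describes its extended-vocabulary copy step only in prose. Below is a minimal PyTorch sketch of that idea under stated assumptions: the names (copy_step, V, UNK, the dot-product attention) are illustrative and not the repository's actual code.

```python
# Minimal sketch of the extended-vocabulary copy trick described above.
# Assumption: fixed vocabulary of size V; source-side OOV words get temporary
# ids V, V+1, ... so they can be copied even though the embedding table and
# the generator head only cover the first V ids.
import torch
import torch.nn.functional as F

V = 1000   # fixed vocabulary size (illustrative)
UNK = 1    # id of the UNK token (illustrative)
H = 64     # hidden size (illustrative)

def copy_step(dec_hidden, enc_hidden, src_ext, vocab_logits, p_gen, max_oov):
    """One decoding step with copying.

    dec_hidden:   (B, H)    decoder state
    enc_hidden:   (B, T, H) encoder states (attention keys)
    src_ext:      (B, T)    source ids with OOVs mapped to V, V+1, ...
    vocab_logits: (B, V)    generation logits over the fixed vocabulary
    p_gen:        (B, 1)    probability of generating vs. copying
    max_oov:      int       largest number of OOVs in the batch
    """
    # Attention over encoder states; OOV positions get scores too, because
    # the score depends only on the hidden states, not on the word id.
    attn = torch.softmax(
        torch.bmm(enc_hidden, dec_hidden.unsqueeze(2)).squeeze(2), dim=1)  # (B, T)

    # Generation distribution over the fixed vocabulary, padded with max_oov
    # zero slots so that ids V .. V+max_oov-1 exist in the output ("concatenate
    # the OOV slots onto the 1000-way pred_y").
    gen_dist = F.softmax(vocab_logits, dim=1)
    gen_dist = torch.cat(
        [gen_dist, gen_dist.new_zeros(gen_dist.size(0), max_oov)], dim=1)

    # Copy distribution: scatter attention weights onto the *extended* ids,
    # so attention on an OOV position lands on its id >= V.
    copy_dist = gen_dist.new_zeros(gen_dist.size())
    copy_dist.scatter_add_(1, src_ext, attn)

    # Final mixture; a word with id 1001 can now be "copied out".
    return p_gen * gen_dist + (1.0 - p_gen) * copy_dist


# Toy usage: batch of 1, source length 5, two OOV words (ids 1000 and 1001).
# Two versions of x are maintained: src_unk would feed the encoder embedding,
# while src_ext is only used to place copy probability mass.
if __name__ == "__main__":
    B, T, max_oov = 1, 5, 2
    dec_hidden = torch.randn(B, H)
    enc_hidden = torch.randn(B, T, H)
    src_unk = torch.tensor([[7, UNK, 42, UNK, 3]])    # encoder input with UNKs
    src_ext = torch.tensor([[7, 1000, 42, 1001, 3]])  # copy layer sees OOV ids
    vocab_logits = torch.randn(B, V)
    p_gen = torch.sigmoid(torch.randn(B, 1))
    dist = copy_step(dec_hidden, enc_hidden, src_ext, vocab_logits, p_gen, max_oov)
    print(dist.shape)  # torch.Size([1, 1002]) -> ids 1000 and 1001 are copyable
```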
helloword12345678's Repositories
helloword12345678/Change-Agent
Change-Agent: Towards Interactive Comprehensive Remote Sensing Change Interpretation and Analysis
helloword12345678/ChineseASR
helloword12345678/examples
helloword12345678/FastBiDAF
helloword12345678/lhotse
helloword12345678/Math-LLaVA
Code for Math-LLaVA: Bootstrapping Mathematical Reasoning for Multimodal Large Language Models
helloword12345678/MG-LLaVA
Official repository for the paper MG-LLaVA: Towards Multi-Granularity Visual Instruction Tuning (https://arxiv.org/abs/2406.17770).
helloword12345678/neural-summ-cnndm-pytorch
Notes on the real copy trick: record the source-side OOV words as well. If the vocabulary size is 1000, the OOV ids become 1000, 1001, and so on. Maintain two versions of the encoder input x: one where OOVs are UNK, and one where the UNK ids are replaced by the OOV ids. Because the copy score is computed only against the hidden states, OOV positions still receive attention. At output time, the OOV slots are concatenated onto the 1000-way pred_y, so a word with id 1001 can be copied out as well. The improvement is very large.
helloword12345678/nq_model
Multi-span extraction model based on BERT
helloword12345678/pan-baidu-download
Baidu Netdisk download script
helloword12345678/recurrentshop
Framework for building complex recurrent neural networks with Keras
helloword12345678/RocketQA
🚀 RocketQA, dense retrieval for information retrieval and question answering, including both Chinese and English state-of-the-art models.
helloword12345678/You-Only-Speak-Once
Deep learning: one-shot learning for speaker recognition using filter banks