Pinned Repositories
CLIP
Contrastive Language-Image Pretraining
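CLIP's core training objective is a symmetric contrastive (InfoNCE-style) loss over paired image and text embeddings. A minimal self-contained sketch of that loss, with random vectors standing in for real encoder outputs (plain NumPy; the function name and temperature value are illustrative, not CLIP's actual code):

```python
import numpy as np

def softmax(x, axis):
    # Numerically stable softmax.
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def clip_contrastive_loss(image_emb, text_emb, temperature=0.07):
    # L2-normalize so the dot product is cosine similarity.
    image_emb = image_emb / np.linalg.norm(image_emb, axis=-1, keepdims=True)
    text_emb = text_emb / np.linalg.norm(text_emb, axis=-1, keepdims=True)
    # Pairwise similarity logits; matching pairs lie on the diagonal.
    logits = image_emb @ text_emb.T / temperature
    # Cross-entropy in both directions: image->text (rows) and text->image (columns).
    loss_i = -np.log(softmax(logits, axis=1).diagonal()).mean()
    loss_t = -np.log(softmax(logits, axis=0).diagonal()).mean()
    return (loss_i + loss_t) / 2

# Toy batch: 4 image/text embedding pairs of dimension 8.
rng = np.random.default_rng(0)
loss = clip_contrastive_loss(rng.normal(size=(4, 8)), rng.normal(size=(4, 8)))
```

The symmetric averaging is what pulls each image toward its own caption and away from the other captions in the batch, and vice versa.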
euclidesprobationem.github.io
Machine Proof Group Website
leetcode-master
LeetCode problem-solving guide from 《代码随想录》: a recommended order for 200 classic problems, 600,000+ words of detailed illustrated explanations, video breakdowns of tricky points, and 50+ mind maps, with solutions in C++, Java, Python, Go, JavaScript, and more. No more getting lost while learning algorithms! 🔥🔥 Take a look; you'll wish you had found it sooner! 🚀
macaron-net
Codes for "Understanding and Improving Transformer From a Multi-Particle Dynamic System Point of View"
numeracy-literacy
Code for paper: "Numeracy Enhances the Literacy of Language Models"
SHU-selfreport
Automated daily check-in (one or two reports per day) for Shanghai University's 健康之路 (Road to Health) health-reporting system
SlowFast
PySlowFast: video understanding codebase from FAIR for reproducing state-of-the-art video models.
tensorpack
A Neural Net Training Interface on TensorFlow, with focus on speed + flexibility
TimeSformer
The official PyTorch implementation of our paper "Is Space-Time Attention All You Need for Video Understanding?"
Transformer-Explainability
[CVPR 2021] Official PyTorch implementation for Transformer Interpretability Beyond Attention Visualization, a novel method to visualize classifications by Transformer-based networks.