facebookresearch/maws
Code and models for the paper "The effectiveness of MAE pre-pretraining for billion-scale pretraining" (https://arxiv.org/abs/2303.13496).
Language: Jupyter Notebook · License: NOASSERTION
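Since the repository publishes pretrained models, a quick way to try one is via torch.hub (the upstream repo ships a `hubconf.py`). The sketch below is a minimal example under that assumption; the model name `vit_b16_maws` follows the repo's naming scheme but should be verified against its README.

```python
import torch

# Load a MAWS-pretrained ViT-B/16 through the repo's torch.hub entry points.
# NOTE: the model name "vit_b16_maws" is an assumption; check the repo's
# README / hubconf.py for the exact identifiers it registers.
model = torch.hub.load("facebookresearch/maws", model="vit_b16_maws")
model.eval()

# Dummy forward pass on a 224x224 RGB image batch to sanity-check the load.
with torch.no_grad():
    features = model(torch.randn(1, 3, 224, 224))
print(features.shape)
```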
Stargazers
- antx-code
- asetsuna
- cenkbircanoglu (Adevinta)
- flrngel (@Ainbr)
- fly51fly (PRIS)
- g-moschetti
- GitHub30 (Osaka, Japan)
- godlikeldh
- hkf
- imisra (Facebook)
- ipruning (Edinburgh, UK ⇌ Shanghai, China)
- jihaonew (CUHK, MMLab)
- karanotsingyu (China)
- licongguan (Beijing Jiaotong University)
- mannatsingh (Facebook)
- maxi-w (Berlin)
- michalwols (New York)
- mmuckley (FAIR at Meta)
- moein-shariatnia
- muzairkhattak (EPFL)
- QuentinDuval (Montreal)
- riccardomusmeci-enel
- rohit-gupta (University of Central Florida)
- rohitgirdhar (@facebookresearch)
- ryadav-ias
- senlinuc (Beijing)
- seshurajup (@dolcera)
- sgjheywa (Google DeepMind)
- sinwoobang (@a15t)
- sour4bh
- tsb0601
- ucalyptus2 (Africa)
- Vibashan (Johns Hopkins University)
- willard-yuan (@meituan -> @kwai)
- xiaoxin83121 (Shanghai)
- zhangquan920