Milabench Size
Delaunay commented
- llama3 - 70B is big (132G)
- We could generate it on the fly #250
- One possible issue is that it generates 8 checkpoints for 8 GPUs
- Can we load the model on 4 GPUs?
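One way to avoid storing (and duplicating) checkpoints at all is to regenerate the same fake weights on every rank from a shared seed. This is a minimal sketch of that idea, not Milabench's actual implementation; the function name `fake_weights` and the sizes used are hypothetical.

```python
import random

def fake_weights(seed: int, n: int) -> list:
    """Deterministic pseudo-weights: every rank that uses the same
    seed regenerates identical values, so no per-GPU checkpoint
    needs to be written to disk. (Hypothetical helper.)"""
    rng = random.Random(seed)
    return [rng.uniform(-1.0, 1.0) for _ in range(n)]

# Two "ranks" seeded identically produce the same weights,
# instead of each materializing its own copy on disk.
rank0 = fake_weights(42, 4)
rank1 = fake_weights(42, 4)
assert rank0 == rank1
```

Whether this helps depends on whether the benchmark needs real llama3 weights or only weights with the right shapes.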
```
2.5M   ./cache
325M   ./data/processed
263G   ./data/llama3_70B    # <= Duplicated weights
252M   ./data/raw
 15G   ./data/llama3_8B
 25G   ./data/FakeImageNet
304G   ./data
2.8M   ./runs
5.7G   ./venv/gnn           # <= GNN benchmarks need their own PyTorch
6.4G   ./venv/torch
 13G   ./venv
316G   .
```