Issues
Adding Lightning Attention 2 Support
#13 opened by James4Ever0 - 0
Problem installing
#12 opened by user-23xyz - 1
Maybe parametrize tokenizer input max length to rise to Andromeda's magnificent potential
#11 opened by JacobFV - 1
Traceback (most recent call last):
  File "train_distributed_accelerate.py", line 664, in <module>
    main()
  File "train_distributed_accelerate.py", line 569, in main
    optim, train_loader, lr_scheduler = accelerator.prepare(
  File "/home/ubuntu/.local/lib/python3.8/site-packages/accelerate/accelerator.py", line 1139, in prepare
    result = self._prepare_deepspeed(*args)
  File "/home/ubuntu/.local/lib/python3.8/site-packages/accelerate/accelerator.py", line 1381, in _prepare_deepspeed
    raise ValueError(
ValueError: You cannot create a DummyScheduler without specifying a scheduler in the config file.
#4 opened by kyegomez - 0
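The error in #4 is raised by Accelerate's DeepSpeed integration when `accelerator.prepare` is asked to wrap a scheduler but the DeepSpeed config file has no `scheduler` section. One possible fix, sketched below under the assumption that DeepSpeed's built-in `WarmupLR` scheduler is acceptable, is to add a `scheduler` block to the DeepSpeed config; the learning-rate and warmup values here are placeholders, not values from this repository:

```json
{
  "scheduler": {
    "type": "WarmupLR",
    "params": {
      "warmup_min_lr": 0,
      "warmup_max_lr": 3e-5,
      "warmup_num_steps": 500
    }
  }
}
```

With a `scheduler` entry present in the config, Accelerate can construct its `DummyScheduler` wrapper instead of raising the `ValueError` shown in the traceback.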
Datasets scripts
#1 opened by kyegomez - 1
Build Dataset Script Fails
#2 opened by evannorstrand-mp