Utilities and experiments for training RAVE.
Currently, A10 instances on Lambda Cloud (cloud.lambdalabs.com) are the target deployment environment.
- Start your A10 instance on Lambda Cloud, creating and downloading a PEM key file if you haven't already
- Once running, launch the instance's Cloud IDE (JupyterLab)
- Make sure your audio data is in a directory called `/home/ubuntu/training-data`. You can do this e.g. by using the JupyterLab upload feature.
- Create a new terminal in JupyterLab
- Clone this repository:
  ```
  git clone https://github.com/becker929/rave-training.git && cd rave-training
  ```
- Make the setup script executable:
  ```
  chmod +x ./setup-rave-lambdalabs.sh
  ```
- Run the script:
  ```
  ./setup-rave-lambdalabs.sh
  ```
- To begin training, replace `<TRAINING-RUN-NAME>` and run the following command (`nohup ... &` keeps training running in the background even if your SSH session disconnects):
  ```
  nohup /home/ubuntu/.pyenv/versions/3.10.11/bin/python3.10 ./rave-training.py --name="<TRAINING-RUN-NAME>" &
  ```
- Confirm that RAVE is training:
  - You may need to press `return` or open a new terminal
  - Run `tail -f /home/ubuntu/rave-training/nohup.out` to follow the training log
- To monitor training, run TensorBoard in a terminal on the instance:
  ```
  tensorboard --logdir runs --port 6080
  ```
- In your local terminal, forward the TensorBoard port over SSH:
  ```
  ssh -i ~/path/to/lambda-key.pem -N -f -L localhost:16080:localhost:6080 ubuntu@<your.instance.ip>
  ```
- Now, you should be able to view the progress of the training via TensorBoard by visiting `localhost:16080` in your browser
- To export for use in MaxMSP etc., run the following (a sketch for sanity-checking the exported file in Python follows this list):
  ```
  /home/ubuntu/.pyenv/versions/3.10.11/bin/rave export --run="/home/ubuntu/runs/<RUN-FOLDER>" --streaming
  ```
- Other uses may require that you remove the `--streaming` flag. See the official RAVE README for details.
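The export writes a TorchScript file into the run folder, which tools like MaxMSP's `nn~` object consume. To sanity-check it outside MaxMSP, you can load it in Python. This is a minimal sketch under assumptions not confirmed by this repo: the exported file name, the sampling rate, and the `(batch, channels, samples)` input shape are all placeholders, so check the official RAVE README against your actual export:

```python
import torch

# Placeholder path: the exported file name depends on your run.
model = torch.jit.load("/home/ubuntu/runs/<RUN-FOLDER>/exported-model.ts")
model.eval()

# One second of silent mono audio; 44100 Hz is an assumed sampling rate.
x = torch.zeros(1, 1, 44100)
with torch.no_grad():
    y = model(x)  # full pass: encode to latents, then decode back to audio
print(y.shape)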
Too complicated? Reach out to me and I may be able to help.
The `rave` subfolder contains a fork of `RAVE/rave`.

Local changes to `rave`:
- Apply Automatic Mixed Precision (AMP) to the forward pass (see the sketch below)
- Default to the V2 + Wasserstein config
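For context on the AMP change: in PyTorch, wrapping the forward pass in `torch.cuda.amp.autocast` runs eligible ops in fp16 while keeping weights in fp32, which typically saves memory and speeds up training on GPUs like the A10. A minimal sketch of the standard pattern, using a placeholder model and loss rather than the actual training step in this fork:

```python
import torch

# Placeholder model and optimizer; the real training step lives in the fork.
model = torch.nn.Linear(128, 128).cuda()
optimizer = torch.optim.Adam(model.parameters())
scaler = torch.cuda.amp.GradScaler()  # scales the loss to avoid fp16 underflow

def training_step(batch):
    optimizer.zero_grad()
    # Run the forward pass (and loss computation) in mixed precision.
    with torch.cuda.amp.autocast():
        loss = model(batch).pow(2).mean()
    # Backpropagate the scaled loss, then unscale and apply the update.
    scaler.scale(loss).backward()
    scaler.step(optimizer)
    scaler.update()
    return loss.item()
```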
Todo:
- Experiment with an increased batch size (24 vs. 8)
- Compilation using `torch.compile` (in progress; see the sketch below)
- Check for simple best practices
- Experiment with Composer
- Experiment with a reimplementation in JAX / Haiku
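On the `torch.compile` item: in PyTorch 2.x this is a thin wrapper that returns an optimized module with the same call signature. A minimal sketch with a placeholder model, not the RAVE model itself:

```python
import torch

model = torch.nn.Linear(128, 128)
# Returns an optimized callable with the same interface as `model`;
# the first call triggers compilation, later calls reuse the compiled graph.
compiled_model = torch.compile(model)

x = torch.randn(4, 128)
print(compiled_model(x).shape)
```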