[BUG] LLava-RLHF run with BFloat16 failed
Closed this issue · 1 comment
hxhcreate commented
In your implementation in llava_rlhf_chat.py, you set the dtype and compute dtype to torch.bfloat16, which fails with:
mat1 and mat2 must have the same dtype, but got Half and BFloat16
Changing it to torch.float16 solves the problem.
zycheiheihei commented
Thanks for pointing this out. We have fixed it.