MoE support
James4Ever0 opened this issue · 1 comments
James4Ever0 commented
Mixtral 8x7B is out. Is there any plan for RWKV to support MoE in the future, with a corresponding inference speedup? Looking forward to it.
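For context, the inference speedup in question comes from sparse routing: a router selects only the top-k experts per token, so most expert weights are never touched. A minimal toy sketch (all names and sizes here are illustrative, not RWKV's or Mixtral's actual implementation):

```python
import numpy as np

rng = np.random.default_rng(0)

def softmax(x):
    e = np.exp(x - x.max(axis=-1, keepdims=True))
    return e / e.sum(axis=-1, keepdims=True)

class TinyMoE:
    """Toy Mixture-of-Experts layer: a learned router scores every
    expert per token, but only the top-k experts actually run --
    that sparsity is where the inference speedup comes from."""
    def __init__(self, d_model=8, n_experts=8, top_k=2):
        self.top_k = top_k
        self.router = rng.standard_normal((d_model, n_experts)) * 0.1
        self.experts = [rng.standard_normal((d_model, d_model)) * 0.1
                        for _ in range(n_experts)]

    def __call__(self, x):  # x: (tokens, d_model)
        logits = x @ self.router                            # (tokens, n_experts)
        top = np.argsort(logits, axis=-1)[:, -self.top_k:]  # chosen expert ids
        out = np.zeros_like(x)
        for t in range(x.shape[0]):
            gates = softmax(logits[t, top[t]])  # renormalize over chosen experts
            for k, e in enumerate(top[t]):
                out[t] += gates[k] * (x[t] @ self.experts[e])
        return out

moe = TinyMoE()
y = moe(rng.standard_normal((4, 8)))
print(y.shape)  # (4, 8)
```

With 8 experts and top_k=2, each token pays the compute cost of only 2 expert matmuls while the model keeps the capacity of all 8 (this mirrors Mixtral 8x7B's 2-of-8 routing).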
BlinkDL commented
Yes :) Likely in a few months.