Support other embeddings
Closed this issue · 1 comment
robvanvolt commented
You say: "Currently we only support models for inverting OpenAI text-embedding-ada-002 embeddings but are hoping to add more soon. (We can provide the GTR inverters used in the paper upon request.)".
I want to use a different embedding model (in my case BAAI/bge-m3, which uses 1024 dimensions). Does the training procedure work the same way?
Or would I need to adjust it / use different base models?
And could you share your code for training step 0?
jxmorris12 commented
Hi! The code can be used to train step 0 and includes instructions on how to do so; you just have to swap out the embedding model. We are also hoping to release more inversion models by the end of this year.
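As a rough illustration of what "swapping out the embedding model" could involve, here is a minimal sketch that computes BAAI/bge-m3 embeddings with the sentence-transformers library. How these embeddings get wired into the vec2text training scripts is an assumption, and the helper function name is hypothetical:

```python
# Minimal sketch (assumptions: sentence-transformers is installed and the
# training pipeline accepts an arbitrary text -> embedding function).
from sentence_transformers import SentenceTransformer
import torch

# BAAI/bge-m3 produces 1024-dimensional dense embeddings.
model = SentenceTransformer("BAAI/bge-m3")

def embed_texts(texts: list[str]) -> torch.Tensor:
    """Return a (len(texts), 1024) tensor of bge-m3 dense embeddings."""
    return model.encode(
        texts,
        convert_to_tensor=True,
        normalize_embeddings=True,  # cosine-friendly unit-norm vectors
    )

if __name__ == "__main__":
    vecs = embed_texts(["hello world", "an example sentence"])
    print(vecs.shape)  # expected: torch.Size([2, 1024])
```

The inverter's decoder would then be trained against these 1024-dimensional targets instead of the 1536-dimensional text-embedding-ada-002 vectors, so the projection layer sizes in the model config need to match the new dimensionality.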