nomic-ai/pygpt4all

Accessing model weights for bending?

domkirke opened this issue · 3 comments

First, thank you so much for this work! It works amazingly on my computer (Mac M1, with gpt4all).

A slightly weird question: is there any way, by chance, to access the weights/activations of the model? I am working on a network bending library, and would love to apply it to llama.

Thank you so much!

Hi @domkirke,

Thanks for the kind words. Glad you found it useful!
Yeah, a weird question, but a good one 😆

I haven't tried it myself, but the idea off the top of my head is: if you have the llama model (the .pth file), just load it with PyTorch; I think you can then access the weights/activations of the model. Have you tried that?
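Something like this rough, untested sketch should let you poke at the weights and activations (the checkpoint path and module names below are placeholders, not something pygpt4all provides):

```python
import torch

# The LLaMA .pth checkpoint is just a state dict: a mapping of parameter
# names to tensors (path is a placeholder).
state_dict = torch.load("consolidated.00.pth", map_location="cpu")

for name, tensor in state_dict.items():
    print(name, tuple(tensor.shape))  # embeddings, attention weights, norms, ...

# If you rebuild the model with the reference LLaMA code, activations can be
# captured with forward hooks (the module path here is purely illustrative):
# activations = []
# def hook(module, inputs, output):
#     activations.append(output.detach())
# model.layers[0].register_forward_hook(hook)
```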

Hi @abdeladim-s,
Thanks for the answer! OK, I've filled out the Google Form and am waiting for the model. I assume it will be sufficient afterwards to convert the model into a ggml one using convert-pth-to-ggml.py?

Hi @domkirke,

If you just want to access the weights, you don't need to convert it to ggml; just use PyTorch.

Another idea off the top of my head (though I'm not sure it will work) is to do the inverse: use convert-ggml-to-pth.py to convert the ggml file back to .pth and load the model state with PyTorch!
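If that route works, the converted checkpoint can then be loaded and modified like any other PyTorch state dict, which is probably what you want for bending. A minimal sketch (the output filename and the parameter key are assumptions, not actual names from the conversion script):

```python
import torch

# Load the .pth produced by convert-ggml-to-pth.py (filename is a placeholder).
state_dict = torch.load("llama-converted.pth", map_location="cpu")

# Example of "bending": scale one weight tensor in place, then save the
# modified checkpoint (the key name below is purely illustrative).
# state_dict["layers.0.attention.wq.weight"] *= 0.5
# torch.save(state_dict, "llama-bent.pth")
```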