aws-neuron/transformers-neuronx

Backward compatibility with saved Llama 2 compiled artifacts


After upgrading to transformers-neuronx == 0.9.474, I am not able to reload the compiled artifacts for my Llama 2 model that I saved with transformers-neuronx == 0.8.268. Loading now fails with:

FileNotFoundError: Could not find a matching NEFF for your HLO in this directory. Ensure that the model you are trying to load is the same type and has the same parameters as the one you saved or call "save" on this model to reserialize it.

The model is identical and so are my parameters. Could you confirm that this is expected (I assume the LlamaForSampling HLO has changed between the two versions)?
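For reference, my save / reload flow looks roughly like this (a minimal sketch; the paths and parameters such as tp_degree and amp are illustrative, not my exact configuration):

```python
from transformers_neuronx.llama.model import LlamaForSampling

# With transformers-neuronx == 0.8.268: compile once and serialize the compiled artifacts.
model = LlamaForSampling.from_pretrained('./llama-2-7b-split', batch_size=1, tp_degree=8, amp='f16')
model.to_neuron()                  # compile the model for Neuron
model.save('./neuron-artifacts')   # serialize the compiled NEFFs to disk

# After upgrading to transformers-neuronx == 0.9.474, the same reload path
# now raises the FileNotFoundError shown above.
model = LlamaForSampling.from_pretrained('./llama-2-7b-split', batch_size=1, tp_degree=8, amp='f16')
model.load('./neuron-artifacts')
model.to_neuron()
```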

Will future releases be backward compatible? This is important to know when defining a serialization strategy for Neuron models.

@dacorvo - Unfortunately, we do not support this: the model needs to be recompiled with the new release. We are aware of this limitation, but at the moment I cannot give you an ETA for the fix.
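In the meantime, the practical pattern is to catch the failed load and fall back to a fresh compile with the installed release, roughly like this (a rough sketch, not an official recommendation; load_or_recompile is just an illustrative helper, and the paths and parameters are placeholders):

```python
from transformers_neuronx.llama.model import LlamaForSampling

def load_or_recompile(checkpoint_dir, artifacts_dir, **kwargs):
    """Reuse saved compiled artifacts when they still match, otherwise recompile and reserialize."""
    model = LlamaForSampling.from_pretrained(checkpoint_dir, **kwargs)
    try:
        model.load(artifacts_dir)   # point at the previously saved NEFFs
        model.to_neuron()           # succeeds only if they match the current HLO
    except FileNotFoundError:
        # Artifacts were produced by an older release: recompile and save fresh ones.
        model = LlamaForSampling.from_pretrained(checkpoint_dir, **kwargs)
        model.to_neuron()
        model.save(artifacts_dir)
    return model

model = load_or_recompile('./llama-2-7b-split', './neuron-artifacts',
                          batch_size=1, tp_degree=8, amp='f16')
```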