huggingface/peft

Backward compatibility on saved config.

xkszltl opened this issue · 5 comments

Feature request

A model trained on a newer version of peft should be loadable with older versions when possible.

Motivation

We found that a model trained on peft 0.10 cannot be loaded with an older peft due to the unknown entry `layer_replication` in the adapter config JSON.
This entry is never used by us, so it's probably just the default value.
Default values like this should not be exported.

Your contribution

N/A

What you are asking is not backwards compatibility, but forward compatibility. The case you describe is backwards compatible.

Removing defaults is not always a good idea, because it means we cannot change the defaults in the future if we find it necessary. Being explicit is better here.

For your use case, you can just manually delete the attributes you don't need and it should work.
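The suggested workaround can be sketched as a small script that strips unrecognized keys from the saved `adapter_config.json` before loading it with an older peft. The key name `layer_replication` comes from this issue; the path and any additional keys to strip are assumptions you would adjust for your checkpoint and installed version.

```python
import json
import os
import tempfile

# Keys known only to newer peft versions; `layer_replication` is the one
# reported in this issue -- extend the set for your version pair.
UNSUPPORTED_KEYS = {"layer_replication"}

def strip_unsupported(config_path):
    """Remove config entries an older peft does not recognize, in place."""
    with open(config_path) as f:
        config = json.load(f)
    for key in UNSUPPORTED_KEYS:
        config.pop(key, None)  # drop the entry if present, ignore if absent
    with open(config_path, "w") as f:
        json.dump(config, f, indent=2)

# Demo on a hypothetical adapter_config.json written to a temp directory.
with tempfile.TemporaryDirectory() as d:
    path = os.path.join(d, "adapter_config.json")
    with open(path, "w") as f:
        json.dump({"peft_type": "LORA", "r": 8, "layer_replication": None}, f)
    strip_unsupported(path)
    with open(path) as f:
        cleaned = json.load(f)
    print(sorted(cleaned))  # `layer_replication` is gone, other keys survive
```

In practice you would run this once on the saved adapter directory before calling the older peft's loading code.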

It is backward compatibility of the model file schema.
A new version of the runtime should avoid breaking changes to the resulting model files.

And regarding changing defaults in the future: it's better for the future version to worry about that as part of its own compatibility.
IMO it's simply a bad idea to change an implicit default without introducing a new API, because that's a breaking change.
If there's a need for a variable default, be explicit: have something like "auto" for users to opt into that rolling behavior, or provide a utility function that generates that version's default values.
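The "auto" opt-in pattern described above can be sketched as follows. This is a hypothetical illustration, not peft's actual config code: only users who explicitly keep `"auto"` track the version-dependent default, so changing that default later is not a silent breaking change for everyone else.

```python
from dataclasses import dataclass
from typing import Any

# Whatever the current library version considers the right default;
# a future version may change this without breaking explicit configs.
CURRENT_DEFAULT_LAYER_REPLICATION: Any = None

@dataclass
class AdapterConfig:
    """Hypothetical config where "auto" opts into the rolling default."""
    layer_replication: Any = "auto"

    def resolve(self) -> "AdapterConfig":
        if self.layer_replication == "auto":
            # Opted-in users follow the current version's default.
            self.layer_replication = CURRENT_DEFAULT_LAYER_REPLICATION
        return self

# Explicit values are preserved; "auto" resolves per version.
print(AdapterConfig(layer_replication=[0, 1]).resolve().layer_replication)
print(AdapterConfig().resolve().layer_replication)
```

The design choice is that the serialized file stores either an explicit value or the literal `"auto"`, so older and newer readers agree on what was requested.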

This issue has been automatically marked as stale because it has not had recent activity. If you think this still needs to be addressed please comment on this thread.

Not stale