There should be documentation of how to use nimble on Huggingface transformers
Closed this issue · 0 comments
lsb commented
🚀 Feature
In much the same way that there is documentation of wrapping Nimble around a torchvision model, there should be documentation (and benchmarks?) around wrapping Nimble around 🤗 language models.
Unclear whether this is a docs-only change.
Motivation
Nimble looks interesting, and I am interested in speeding up my NLP runs, but I only see docs for torchvision models.
Pitch
- Adapt Nimble to ingest some 🤗 Transformers or other models. (This may be a no-op.)
- Write it up in the README.