This repository contains the source code for the Zero-TPrune website.
If you find Zero-TPrune useful for your work, please cite:
@article{wang2023zero,
  title={Zero-TPrune: Zero-Shot Token Pruning through Leveraging of the Attention Graph in Pre-Trained Transformers},
  author={Wang, Hongjie and Dedhia, Bhishma and Jha, Niraj K.},
  journal={arXiv preprint arXiv:2305.17328},
  year={2023}
}
This work is licensed under a Creative Commons Attribution-ShareAlike 4.0 International License.