PyTorch implementation of FNet: Mixing Tokens with Fourier Transforms.
Clone this repository.
git clone https://github.com/jaketae/fnet.git
Navigate to the cloned directory. You can then start using the model via
>>> from fnet import FNet
>>> model = FNet()
By default, the model comes with the following parameters:
FNet(
d_model=256,
expansion_factor=2,
dropout=0.5,
num_layers=6,
)
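As a quick sanity check, the sketch below runs a random tensor through the model. It assumes the encoder consumes token embeddings of shape (batch_size, seq_len, d_model) and returns an output of the same shape; the exact input format may differ if you pair the model with your own embedding layer.
>>> import torch
>>> from fnet import FNet
>>> model = FNet(d_model=256, expansion_factor=2, dropout=0.5, num_layers=6)
>>> x = torch.randn(8, 64, 256)  # hypothetical batch: 8 sequences of length 64, dimension 256
>>> out = model(x)               # assumed to preserve the (batch, seq_len, d_model) shape
>>> out.shape
torch.Size([8, 64, 256])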
While transformers have proven successful in a variety of domains, the O(n^2) computational complexity of self-attention is often considered a structural weakness, and many attempts have been made to optimize the architecture. The authors of the paper present FNet, a model that replaces self-attention with standard, unparametrized Fourier transforms. Not only is FNet faster and computationally more efficient than the classic transformer, it also retains 92% of BERT's accuracy on the GLUE benchmark. With a smaller number of parameters, FNet even outperformed its transformer counterpart.
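For intuition, the token-mixing sublayer described in the paper applies a 2D discrete Fourier transform, one along the hidden dimension and one along the sequence dimension, and keeps only the real part of the result. Below is a minimal sketch of that mixing step using torch.fft; it illustrates the idea and is not necessarily the exact code used in this repository.
>>> import torch
>>> def fourier_mix(x):
...     # FFT over the hidden dimension, then over the sequence dimension; keep only the real part
...     return torch.fft.fft(torch.fft.fft(x, dim=-1), dim=-2).real
...
>>> x = torch.randn(1, 16, 256)  # (batch, seq_len, d_model)
>>> fourier_mix(x).shape
torch.Size([1, 16, 256])
Because the mixing step has no learnable weights, the only trained parameters in each encoder block come from the feed-forward sublayer and layer normalization.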