ModuleNotFoundError: No module named 'aqt'
Opened this issue · 4 comments
I tried to run the first Colab to fine-tune ViT-B_16 on cifar10, but got the error "ModuleNotFoundError: No module named 'aqt'" when importing bert from flaxformer.architectures.bert. It seems to be an issue with flaxformer; does anyone know how to fix it?
I just tried
https://colab.research.google.com/github/google-research/vision_transformer/blob/main/vit_jax.ipynb
on a fresh Colab VM, and !pip install -qr vision_transformer/vit_jax/requirements.txt completed without errors. Maybe the error was transient?
You can specify a specific commit from their history, e.g. flaxformer@git+https://github.com/google/flaxformer#5400cce007f3141c7f223a11846a2093b03a3588, if you want to install an older version.
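For instance, as a Colab cell (a sketch only: the commit hash is the one quoted above, and pip's usual way of pinning a git ref is @<commit> after the repository URL):

```
# Sketch: install flaxformer pinned to the older commit suggested above.
# Note that pip pins a git ref with "@<commit>" after the repository URL.
!pip install "flaxformer @ git+https://github.com/google/flaxformer@5400cce007f3141c7f223a11846a2093b03a3588"
```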
(See line 33 in 62a446f; unfortunately, flaxformer provides neither tags for the GitHub repo nor a versioned PyPI package.)

Thanks for the kind reply.
Unfortunately, I think the error is not transient. I also tried the Colab on fresh Colab VMs several times, but the error always appears:
ModuleNotFoundError                       Traceback (most recent call last)
in <cell line: 15>()
     13 from vit_jax import input_pipeline
     14 from vit_jax import utils
---> 15 from vit_jax import models
     16 from vit_jax import train
     17 from vit_jax.configs import common as common_config

4 frames

/content/./vision_transformer/vit_jax/models.py in <module>
     13 # limitations under the License.
     14
---> 15 from vit_jax import models_lit
     16 from vit_jax import models_mixer
     17 from vit_jax import models_vit

/content/./vision_transformer/vit_jax/models_lit.py in <module>
     29 from vit_jax import preprocess
     30
---> 31 from flaxformer.architectures.bert import bert
     32 from flaxformer.architectures.bert import configs
     33

/usr/local/lib/python3.10/dist-packages/flaxformer/architectures/bert/bert.py in <module>
     23
     24 from flaxformer import transformer_common as common
---> 25 from flaxformer.architectures.bert import heads
     26 from flaxformer.components import dense
     27 from flaxformer.components import embedding

/usr/local/lib/python3.10/dist-packages/flaxformer/architectures/bert/heads.py in <module>
     22 import jax.numpy as jnp
     23
---> 24 from flaxformer.components import dense
     25 from flaxformer.components import initializers
     26 from flaxformer.types import Activation

/usr/local/lib/python3.10/dist-packages/flaxformer/components/dense.py in <module>
     21 from typing import Any, Callable, Iterable, Mapping, Optional, Sequence, Tuple, Union
     22
---> 23 from aqt.jax_legacy.jax import flax_layers as aqt_flax_layers
     24 from aqt.jax_legacy.jax import quant_config as aqt_config
     25 from aqt.jax_legacy.jax import quantization as aqt

ModuleNotFoundError: No module named 'aqt'
Since I'm using ViT models and the error seems to appear only when importing LiT models, I commented out the lines about LiT in models.py (lines 15, 28, 37 and 38) to bypass the error.
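A quick sketch of that workaround as a Colab cell (the line numbers are the ones given above; double-check them against your checkout of models.py before running, since they may shift between commits):

```
# Workaround sketch: comment out the LiT-related lines in models.py.
# Line numbers (15, 28, 37, 38) are taken from the comment above; verify them first.
!sed -i -e '15s/^/# /' -e '28s/^/# /' -e '37s/^/# /' -e '38s/^/# /' ./vision_transformer/vit_jax/models.py
```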
This error happens to me as well when using a copy of https://colab.research.google.com/github/google-research/vision_transformer/blob/main/vit_jax.ipynb. @ZigeW's solution bypasses the error, but I guess it removes the ability to use the LiT models.
I tried out the specific-commit fix in vision_transformer/vit_jax/setup.py suggested by @andsteing, but unfortunately it does not work; it throws the same error:
install_requires = [
'absl-py',
'clu',
'einops',
'flax',
'flaxformer @ git+https://github.com/google/flaxformer#5400cce007f3141c7f223a11846a2093b03a3588',
'jax',
'ml-collections',
'numpy',
'packaging',
'pandas',
'scipy',
'tensorflow_datasets',
'tensorflow_probability',
'tensorflow',
'tensorflow_text',
'tqdm',
]
So I guess commenting out the lines is the way forward for now?
OK, the problem is actually in the latest release of the aqtp package: google/aqt#196

You can fix this by specifying !pip install aqtp==0.1.0 at the top of the Colab.

(PR fixing this repo is pending...)
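A minimal sketch of that fix as the first cell of the Colab (both commands appear earlier in this thread; installing the pinned aqtp first should keep the later requirements install from pulling in the broken release):

```
# Sketch of the fix: pin aqtp (see google/aqt#196) before installing the repo requirements.
!pip install aqtp==0.1.0
!pip install -qr vision_transformer/vit_jax/requirements.txt
```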