FlagOpen/FlagPerf

Running the NVIDIA transformer example fails with: No module named 'fairseq.data.batch_C'

Closed · 1 comment

stezpy commented

Specifically, I am using "transformer:pytorch_1.13:A100:1:8:1": "/home/datasets_ckpt/transformer/train/". After a quick test, fairseq.data.batch_C is installed inside the container and imports fine when imported directly, but the import fails inside the training flow. The script I tested is FlagPerf/training/benchmarks/transformer/pytorch/run_pretraining.py


Full error log:
Traceback (most recent call last):
File "/data/peiyuan.zhang/FlagPerf/training/benchmarks/transformer/pytorch/run_pretraining.py", line 13, in <module>
from train.evaluator import Evaluator
File "/data/peiyuan.zhang/FlagPerf/training/benchmarks/transformer/pytorch/train/evaluator.py", line 4, in <module>
from fairseq.data import data_utils
File "/data/peiyuan.zhang/FlagPerf/training/benchmarks/transformer/pytorch/fairseq/data/__init__.py", line 25, in <module>
from .language_pair_dataset import LanguagePairDataset, load_dataset_splits
File "/data/peiyuan.zhang/FlagPerf/training/benchmarks/transformer/pytorch/fairseq/data/language_pair_dataset.py", line 26, in <module>
from . import data_utils
File "/data/peiyuan.zhang/FlagPerf/training/benchmarks/transformer/pytorch/fairseq/data/data_utils.py", line 30, in <module>
import fairseq.data.batch_C
ModuleNotFoundError: No module named 'fairseq.data.batch_C'

Thanks for the report — #163 fixes this issue. Please reopen if the problem persists.