RUCAIBox/TextBox

AttributeError: 'SingleSentenceDataLoader' object has no attribute 'vocab_size'

only-yao opened this issue · 4 comments

When I run python run_textbox.py --model=LeakGAN --dataset=COCO --task_type=unconditional, it always raises AttributeError: 'SingleSentenceDataLoader' object has no attribute 'vocab_size'. Also, if I want to use a pretrained model, where does the LeakGAN model fit in? 🙏

We have fixed the bug you mentioned. Thanks for your report.
Currently, our LeakGAN only supports an RNN-based architecture, following the original paper.

I updated to the latest code, and now I get the following error:

Traceback (most recent call last):
  File "run_textbox.py", line 19, in <module>
    run_textbox(
  File "/data/TextBox/textbox/quick_start/quick_start.py", line 64, in run_textbox
    best_valid_score, best_valid_result = trainer.fit(train_data, valid_data, saved=saved)
  File "/data/TextBox/textbox/trainer/trainer.py", line 1394, in fit
    train_loss = self._g_train_epoch(train_data, epoch_idx)
  File "/data/TextBox/textbox/trainer/trainer.py", line 1350, in _g_train_epoch
    total_loss = self._optimize_step(losses, total_loss, self.model.generator, self.g_optimizer)
  File "/data/TextBox/textbox/trainer/trainer.py", line 1274, in _optimize_step
    loss.backward(retain_graph=True if i < len(opt) - 1 else False)
  File "/opt/conda/lib/python3.8/site-packages/torch/tensor.py", line 221, in backward
    torch.autograd.backward(self, gradient, retain_graph, create_graph)
  File "/opt/conda/lib/python3.8/site-packages/torch/autograd/__init__.py", line 130, in backward
    Variable._execution_engine.run_backward(
RuntimeError: one of the variables needed for gradient computation has been modified by an inplace operation: [torch.cuda.FloatTensor [32, 300]], which is output 0 of TBackward, is at ver

torch == 1.7.0
transformers == 4.1.1
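For context, this RuntimeError comes from PyTorch's autograd version counters: a tensor that an earlier operation saved for its backward pass was later mutated in place, so backward() refuses to use the stale value. A minimal standalone sketch of the same failure mode (illustrative only, not TextBox's actual trainer code):

```python
import torch

# sigmoid saves its *output* tensor for use in the backward pass
w = torch.randn(3, requires_grad=True)
h = torch.sigmoid(w)
loss = h.sum()

# In-place mutation bumps h's version counter after it was saved,
# so the subsequent backward() raises the same RuntimeError.
h.add_(1.0)
try:
    loss.backward()
except RuntimeError as e:
    print("RuntimeError:", e)

# Fix: use an out-of-place op so the saved tensor is left untouched.
w2 = torch.randn(3, requires_grad=True)
h2 = torch.sigmoid(w2)
loss2 = (h2 + 1.0).sum()  # out-of-place add; h2's version is unchanged
loss2.backward()          # succeeds
```

Which exact tensor is mutated depends on the model code; the general fix is to replace the in-place update (`add_`, `+=`, index assignment) with an out-of-place equivalent, or `.clone()` the tensor before mutating it.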

We have fixed this bug, which was due to a version difference. Thanks for your report.

The problem is solved. Thank you very much for your reply!