kanyun-inc/fairseq-gec

Inference results using "generate.sh"

Opened this issue · 1 comments

Hi ☺

First of all, thank you for your great work.

I tried to generate inference results using your pretrained model and my own dataset (it consists of two files: source and target).

But after preprocessing with "preprocess.sh", I encountered an error when running "generate.sh":

>>> bash generate.sh 0 _expr_ht _last
Traceback (most recent call last):
  File "generate.py", line 196, in <module>
    cli_main()
  File "generate.py", line 192, in cli_main
    main(args)
  File "generate.py", line 111, in main
    hypos = task.inference_step(generator, models, sample, prefix_tokens)
  File "/data02/jeiyoon_park/fairseq-gec/fairseq/tasks/fairseq_task.py", line 243, in inference_step
    return generator.generate(models, sample, prefix_tokens=prefix_tokens)
  File "/data02/jeiyoon_park/anaconda3/envs/ydfu/lib/python3.7/site-packages/torch/autograd/grad_mode.py", line 27, in decorate_context
    return func(*args, **kwargs)
  File "/data02/jeiyoon_park/fairseq-gec/fairseq/sequence_generator.py", line 382, in generate
    scores.view(bsz, beam_size, -1)[:, :, :step],
  File "/data02/jeiyoon_park/fairseq-gec/fairseq/search.py", line 83, in step
    torch.div(self.indices_buf, vocab_size, out=self.beams_buf)
RuntimeError: result type Float can't be cast to the desired output type Long
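(For context: this `RuntimeError` is caused by a behavior change in PyTorch 1.5+, where `torch.div` performs true division and returns a Float tensor, which can no longer be written into the Long `out=` buffer that fairseq's beam search uses to recover the beam index from a flattened (beam, vocab) index. A minimal sketch of the usual workaround, replacing the failing call in `fairseq/search.py` with explicit integer floor division; the tensor values here are made up for illustration:)

```python
import torch

# Hypothetical stand-ins for fairseq's buffers: flat indices into a
# (beam_size x vocab_size) score matrix, and a Long output buffer.
indices_buf = torch.tensor([5, 12, 27])    # dtype: torch.int64 (Long)
beams_buf = torch.empty_like(indices_buf)  # Long, as in fairseq
vocab_size = 10

# Old call -- fails on PyTorch >= 1.5 because true division yields Float:
# torch.div(indices_buf, vocab_size, out=self.beams_buf)

# Workaround: integer floor division keeps the Long dtype.
torch.floor_divide(indices_buf, vocab_size, out=beams_buf)
# Equivalent on PyTorch >= 1.8:
# torch.div(indices_buf, vocab_size, rounding_mode='floor', out=beams_buf)

print(beams_buf.tolist())  # beam index for each flat index
```

On PyTorch 1.12.1 either form should work; the one-line change in `fairseq/search.py` (line 83 in the traceback) is typically enough to get `generate.sh` running.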

I also checked the generated "outputema_last.nbest.txt" file and found these log lines:

Label file not found: /data02/jeiyoon_park/fairseq-gec/ht_result/test.label.src.txt
Label file not found: /data02/jeiyoon_park/fairseq-gec/ht_result/test.label.tgt.txt

I just want to run inference with your pretrained model on my dataset, not train a model.

Could you let me know how to run inference correctly?

Thank you!

I'm using PyTorch 1.12.1, by the way.