csebuetnlp/banglanmt

An error occurred while translating an English txt file.

Closed this issue · 2 comments

I'm not sure what I'm doing wrong. I followed the testing/evaluation guidelines. I'd like to convert an English text file to a Bangla text file. Kindly help.
Inside the input directory (here 'eng') I also have the 'data' and 'vocab' directories. Here are the logs:

python pipeline.py  --src_lang bn --tgt_lang en -i ./eng/ -o ./bang/  --eval_model ./base_en2bn.pt  --do_eval 
sh: line 1: spm_export_vocab: command not found
sh: line 1: spm_export_vocab: command not found
sh: line 1: spm_encode: command not found
sh: line 1: spm_encode: command not found
sh: line 1: spm_encode: command not found
sh: line 1: spm_encode: command not found
sh: line 1: spm_encode: command not found
sh: line 1: spm_encode: command not found
sh: line 1: spm_encode: command not found
sh: line 1: spm_encode: command not found
sh: line 1: spm_encode: command not found
sh: line 1: spm_encode: command not found
sh: line 1: spm_encode: command not found
sh: line 1: spm_encode: command not found
sh: line 1: spm_encode: command not found
sh: line 1: spm_encode: command not found
sh: line 1: spm_encode: command not found
sh: line 1: spm_encode: command not found
sh: line 1: spm_encode: command not found
sh: line 1: spm_encode: command not found
sh: line 1: spm_encode: command not found
Error: mkl-service + Intel(R) MKL: MKL_THREADING_LAYER=INTEL is incompatible with libgomp.so.1 library.
	Try to import numpy first or set the threading layer accordingly. Set MKL_SERVICE_FORCE_INTEL to force it.
Traceback (most recent call last):
  File "pipeline.py", line 355, in <module>
    main(args)
  File "pipeline.py", line 227, in main
    evaluate(args)
  File "pipeline.py", line 207, in evaluate
    translate(args.eval_model, "test", args)
  File "pipeline.py", line 105, in translate
    with open(merged_tgt_file) as inpf:
FileNotFoundError: [Errno 2] No such file or directory: './bang/temp/merged.tgt'

This looks like an installation problem. Since the spm_export_vocab and spm_encode commands could not be found, preprocessing most likely never produced ./bang/temp/merged.tgt, which is why the translation step then fails with the FileNotFoundError. Please make sure the SentencePiece CLI and the other dependencies are installed correctly.
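
The spm_export_vocab and spm_encode binaries come from the SentencePiece command-line tools (google/sentencepiece); the pip package typically ships only the Python bindings, so the CLI has to be installed separately. As a rough sketch (assuming a Debian/Ubuntu-like machine with cmake and a C++ toolchain available; adjust for your environment), you can build and install the CLI from source following the upstream README:

git clone https://github.com/google/sentencepiece.git
cd sentencepiece
mkdir build && cd build
cmake ..
make -j $(nproc)
sudo make install
sudo ldconfig -v

Afterwards, check that both commands resolve, e.g. with "which spm_encode spm_export_vocab". Once they are on PATH, the preprocessing step should be able to create the files under ./bang/temp/.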

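The MKL/libgomp message in the log is a separate problem from the missing binaries. As the error text itself suggests, it can usually be worked around by exporting one of the MKL environment variables before running the pipeline; which one you need depends on how numpy/MKL are installed in your environment, so treat this as a sketch:

export MKL_SERVICE_FORCE_INTEL=1

or

export MKL_THREADING_LAYER=GNU

Set one of these in the shell before rerunning the pipeline.py command above.
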
@marufmoinuddin Please take a look here.