RowitZou/topic-dialog-summ

Two questions

Closed this issue · 3 comments

wac81 commented
  1. How can I replace the BERT pretrained model, e.g., with BART from Facebook? Which file should be modified?
  2. Is the complete training data used?
  1. models/rl_model.py, line 44:
          self.encoder = Bert(args.bert_dir, args.finetune_bert)
    

You could replace BERT with other pre-trained encoders, e.g., RoBERTa. However, seq2seq architectures like BART are not supported yet, because we designed our own decoder.

  2. Yes.
wac81 commented

RuntimeError: Given normalized_shape=[768], expected input with shape [*, 768], but got input of size[4, 4, 1024]

I tried a RoBERTa model but got this error. Could you suggest all the code or args I need to modify?
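The error above is a hidden-size mismatch: roberta-large produces 1024-dimensional states, while the decoder's LayerNorm was built for BERT-base's 768. The checkpoint sizes below are facts about the public Hugging Face models; the helper itself is only an illustrative sanity check, not part of the repo.

```python
# Hidden sizes of the public checkpoints involved.
HIDDEN_SIZES = {
    "bert-base-uncased": 768,
    "roberta-base": 768,
    "roberta-large": 1024,
}

DECODER_DIM = 768  # the decoder's LayerNorm expects 768, per the error above


def encoder_hint(checkpoint: str, decoder_dim: int = DECODER_DIM) -> str:
    """Say whether a checkpoint plugs into the decoder, or what to change."""
    size = HIDDEN_SIZES[checkpoint]
    if size == decoder_dim:
        return "compatible"
    return (f"mismatch: {checkpoint} outputs {size}-dim states; use a "
            f"{decoder_dim}-dim checkpoint or add a Linear({size}, {decoder_dim}) "
            f"projection after the encoder")
```

`encoder_hint("roberta-large")` explains the RuntimeError in the comment: switching to roberta-base (768-dim) avoids it without changing the decoder.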

wac81 commented

It's done. I fixed all the args according to the error message.