A problem about the Paper: Adversarial Training Methods For Semi-Supervised Text Classification
xljhtq opened this issue · 1 comment
Hi:
I have read the paper: Adversarial Training Methods For Semi-Supervised Text Classification.
Does the normalization of the input vectors still need to be applied if the input embeddings are not trainable? I know that when the input embeddings are trainable, the normalization must be there, since otherwise the model could simply learn embeddings with large norms and make the perturbation insignificant.
What do you have in mind?
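For reference, the normalization I mean is the frequency-weighted one from the paper, where each embedding is shifted and scaled by the frequency-weighted mean and variance over the vocabulary. A minimal NumPy sketch (the toy vocabulary size, dimension, and frequencies are made up for illustration):

```python
import numpy as np

def normalize_embeddings(emb, freq):
    """Frequency-weighted normalization of word embeddings:
    v_bar = (v - E[v]) / sqrt(Var[v]), with expectations weighted
    by word frequency, as in the paper."""
    f = freq / freq.sum()                                # frequencies -> probabilities
    mean = (f[:, None] * emb).sum(axis=0)                # E[v], per dimension
    var = (f[:, None] * (emb - mean) ** 2).sum(axis=0)   # Var[v], per dimension
    return (emb - mean) / np.sqrt(var)

rng = np.random.default_rng(0)
emb = rng.normal(size=(100, 8))                     # toy vocab of 100 words, dim 8
freq = rng.integers(1, 50, size=100).astype(float)  # toy word counts
norm_emb = normalize_embeddings(emb, freq)
```

After this, the frequency-weighted mean of each dimension is 0 and its frequency-weighted variance is 1, so the scale of the adversarial perturbation is comparable across dimensions.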
Hello,
In my opinion, the goal of adding an adversarial perturbation to the word embeddings is to improve the quality of the embeddings, which can in turn improve classification performance, as the paper notes.
Since you are using pre-trained word embeddings and want to keep them fixed, I think the performance gain would be small compared to the large extra training time that adversarial training brings.
It may be helpful to add a fully-connected layer (mapping vocab size to vocab size) to adapt the pre-trained word embeddings to your task, and to optimize the parameters of that layer during adversarial training.
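In case it helps, here is how I picture that setup. One caveat: since the layer adapts embedding vectors, I sketch it as mapping embedding dimension to embedding dimension rather than vocab size to vocab size (my interpretation, not something the thread confirms). The lookup into the pre-trained table is frozen, only the adapter is trainable, and the perturbation follows the paper's r_adv = eps * g / ||g||_2. The gradient here is a random stand-in, since a real one would come from backpropagating a task loss:

```python
import numpy as np

rng = np.random.default_rng(0)
vocab, dim, eps = 50, 16, 0.02

pretrained = rng.normal(size=(vocab, dim))             # frozen pre-trained embeddings
W = np.eye(dim) + 0.01 * rng.normal(size=(dim, dim))   # trainable adapter, near-identity init
b = np.zeros(dim)

def embed(token_ids):
    """Frozen lookup followed by the trainable adapter layer."""
    v = pretrained[token_ids]   # (seq_len, dim); no gradient flows into this table
    return v @ W + b            # only W and b would be updated during training

def adversarial_perturbation(grad, eps):
    """FGSM-style perturbation r_adv = eps * g / ||g||_2, where grad is
    dLoss/dEmbedding for the whole (adapted) input sequence."""
    return eps * grad / (np.linalg.norm(grad) + 1e-12)

tokens = rng.integers(0, vocab, size=7)   # toy input sequence
v = embed(tokens)
g = rng.normal(size=v.shape)              # stand-in for a real backprop gradient
v_adv = v + adversarial_perturbation(g, eps)
```

The adversarial loss is then computed on `v_adv` instead of `v`, and only the adapter (plus the classifier on top) receives gradient updates, so the pre-trained table stays intact.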