Representation Learning using Paraphrase Objectives

ParaLM

This is a scratch REPO! BEWARE

This is an exploration of representation learning using paraphrases.

ParaSCI is used to train an encoder-decoder model on paraphrase pairs, reconstructing the second sentence of each pair from the first, and vice versa.
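Concretely, a bidirectional reconstruction step might look like the following toy sketch (the model sizes, vocab, and token tensors are placeholders for illustration, not the actual training code in this repo):

```python
import torch
import torch.nn as nn

VOCAB, DIM = 1000, 64  # placeholder vocab and model size

class TinySeq2Seq(nn.Module):
    def __init__(self):
        super().__init__()
        self.embed = nn.Embedding(VOCAB, DIM)
        self.transformer = nn.Transformer(
            d_model=DIM, nhead=4, num_encoder_layers=2,
            num_decoder_layers=2, batch_first=True)
        self.out = nn.Linear(DIM, VOCAB)

    def forward(self, src, tgt):
        h = self.transformer(self.embed(src), self.embed(tgt))
        return self.out(h)

model = TinySeq2Seq()
loss_fn = nn.CrossEntropyLoss()

# A batch of paraphrase pairs as token ids (random placeholders).
a = torch.randint(0, VOCAB, (2, 8))
b = torch.randint(0, VOCAB, (2, 8))

# Teacher forcing: predict each next token of the paraphrase,
# in both directions (a -> b and b -> a).
logits_ab = model(a, b[:, :-1])
logits_ba = model(b, a[:, :-1])
loss = loss_fn(logits_ab.reshape(-1, VOCAB), b[:, 1:].reshape(-1)) \
     + loss_fn(logits_ba.reshape(-1, VOCAB), a[:, 1:].reshape(-1))
loss.backward()
```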

We look at the following:

  • Simple seq2seq transformer in PyTorch trained to generate paraphrases
  • Autoencoder methods
  • VICReg
  • Simple Seq2Seq paraphrasing
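
Of the items above, VICReg is the least self-explanatory. Applied to paired sentence embeddings, its loss can be sketched as below (a minimal version following the standard formulation; the loss weights and the random embeddings are placeholders, not values from this repo):

```python
import torch
import torch.nn.functional as F

def vicreg_loss(z_a, z_b, sim_w=25.0, var_w=25.0, cov_w=1.0):
    """VICReg: invariance + variance + covariance terms on paired embeddings."""
    n, d = z_a.shape
    # Invariance: embeddings of a paraphrase pair should match.
    inv = F.mse_loss(z_a, z_b)
    # Variance: hinge keeps each dimension's std above 1 to prevent collapse.
    std_a = torch.sqrt(z_a.var(dim=0) + 1e-4)
    std_b = torch.sqrt(z_b.var(dim=0) + 1e-4)
    var = F.relu(1.0 - std_a).mean() + F.relu(1.0 - std_b).mean()
    # Covariance: penalize off-diagonal covariance to decorrelate dimensions.
    za = z_a - z_a.mean(dim=0)
    zb = z_b - z_b.mean(dim=0)
    cov_a = (za.T @ za) / (n - 1)
    cov_b = (zb.T @ zb) / (n - 1)
    off_diag = lambda c: (c.pow(2).sum() - c.pow(2).diagonal().sum()) / d
    cov = off_diag(cov_a) + off_diag(cov_b)
    return sim_w * inv + var_w * var + cov_w * cov

# Placeholder embeddings standing in for encoder outputs of a paraphrase pair.
z_a = torch.randn(16, 32)
z_b = z_a + 0.1 * torch.randn(16, 32)
loss = vicreg_loss(z_a, z_b)
```

Because none of the three terms uses negative pairs, VICReg avoids the large-batch contrastive sampling that methods like SimCLR require.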