Standup Generator

An exploration of text generation using OpenAI's GPT-2 model. The transformer is fine-tuned on standup comedy transcripts so that it can generate its own jokes. This task is inherently difficult, as computers are notoriously bad at understanding (and especially generating) humor. It's worth noting that the model is trained on transcripts from a wide variety of comedians with very different styles and deliveries, as sketched below.
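For reference, here is a minimal sketch of what fine-tuning GPT-2 on a transcript corpus could look like using the Hugging Face transformers library. The notebook itself may use a different toolchain, and the file path transcripts.txt, the output directory, and the hyperparameters below are illustrative placeholders rather than the exact settings behind the reported results.

# Minimal GPT-2 fine-tuning sketch (assumptions: Hugging Face transformers,
# a plain-text corpus at "transcripts.txt"; not necessarily this repo's setup).
from transformers import (
    GPT2LMHeadModel,
    GPT2TokenizerFast,
    TextDataset,
    DataCollatorForLanguageModeling,
    Trainer,
    TrainingArguments,
)

tokenizer = GPT2TokenizerFast.from_pretrained("gpt2")
tokenizer.pad_token = tokenizer.eos_token  # GPT-2 has no pad token by default
model = GPT2LMHeadModel.from_pretrained("gpt2")

# Chunk the raw transcripts into fixed-length blocks for language modeling.
dataset = TextDataset(tokenizer=tokenizer, file_path="transcripts.txt", block_size=512)
collator = DataCollatorForLanguageModeling(tokenizer=tokenizer, mlm=False)

trainer = Trainer(
    model=model,
    args=TrainingArguments(
        output_dir="standup-gpt2",       # placeholder output directory
        num_train_epochs=3,              # illustrative hyperparameters
        per_device_train_batch_size=2,
    ),
    data_collator=collator,
    train_dataset=dataset,
)
trainer.train()

# Sample a "joke" continuation from a short prompt.
prompt = tokenizer("So I went to the doctor the other day", return_tensors="pt")
output = model.generate(
    **prompt,
    max_length=100,
    do_sample=True,
    top_p=0.9,
    temperature=0.8,
    pad_token_id=tokenizer.eos_token_id,
)
print(tokenizer.decode(output[0], skip_special_tokens=True))

Sampling with top_p and temperature (rather than greedy decoding) matters here: comedic text benefits from the variety that nucleus sampling provides, at the cost of occasional incoherence.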

Results

Loss: 2.85

Perplexity: 17.29
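
The two numbers are consistent with each other: perplexity is the exponential of the average per-token cross-entropy loss, so exp(2.85) ≈ 17.29. A quick check in Python:

import math

loss = 2.85                  # average per-token cross-entropy (in nats)
perplexity = math.exp(loss)  # perplexity = e^loss
print(round(perplexity, 2))  # 17.29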

Note: Some aspects of the data collection, saving, and loading were adapted from https://gist.github.com/nwams/da9290c4a21c1fddfc5cba9f82f8ba5a