Code and samples from the paper "Language Models are Unsupervised Multitask Learners".
For now, we have only released a smaller (117M parameter) version of GPT-2.
See more details in our blog post.
## Installation

Download the model data (needs `gsutil`):

```
sh download_model.sh 117M
```
Install Python packages:

```
pip3 install -r requirements.txt
```
## Unconditional sample generation
| WARNING: Samples are unfiltered and may contain offensive content. |
| --- |
To generate unconditional samples from the small model:

```
python3 src/generate_unconditional_samples.py | tee samples
```
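Under the hood, unconditional generation is a simple autoregressive loop. The sketch below is illustrative only, not this repo's implementation; `model`, `end_of_text_id`, and `sample_next_token` are hypothetical stand-ins. "Unconditional" means the context starts from the model's end-of-text token rather than a user prompt:

```python
def generate_unconditional(model, end_of_text_id, length, sample_next_token):
    """Illustrative autoregressive sampling loop (not this repo's code).

    `model` maps a list of token ids to next-token logits;
    `sample_next_token` draws one token id from those logits
    (see the sketch under the flags example below).
    """
    tokens = [end_of_text_id]          # no user prompt: start from end-of-text
    for _ in range(length):
        logits = model(tokens)         # logits for the next position
        tokens.append(sample_next_token(logits))
    return tokens[1:]                  # drop the start marker
```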
There are various flags for controlling the samples:

```
python3 src/generate_unconditional_samples.py --top_k 40 --temperature 0.7 | tee samples
```
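For intuition about these two flags, here is a minimal self-contained numpy sketch (again illustrative, not the repo's sampling code) of how temperature and top-k truncation reshape the next-token distribution:

```python
import numpy as np

def sample_next_token(logits, temperature=0.7, top_k=40, rng=None):
    """Draw one token id from raw next-token logits (illustrative sketch)."""
    if rng is None:
        rng = np.random.default_rng()
    logits = np.asarray(logits, dtype=np.float64)

    # Temperature rescales the logits: values below 1.0 sharpen the
    # distribution (more conservative text), values above 1.0 flatten it.
    logits = logits / temperature

    # Top-k truncation: keep only the k highest-scoring tokens and discard
    # the long tail of unlikely tokens before sampling.
    if top_k > 0:
        cutoff = np.sort(logits)[-min(top_k, len(logits))]
        logits = np.where(logits < cutoff, -np.inf, logits)

    # Softmax over the surviving logits, then sample a token id.
    probs = np.exp(logits - logits.max())
    probs /= probs.sum()
    return int(rng.choice(len(probs), p=probs))
```

With temperature 1 and no top-k truncation, this reduces to sampling directly from the model's softmax distribution, which is the default setting referred to below.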
While we have not yet released GPT-2 itself, you can see some unconditional samples from it (with default settings of temperature 1 and no truncation) in `gpt-2-samples.txt`.
## Conditional sample generation
To give the model custom prompts, you can use:
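```
python3 src/interactive_conditional_samples.py
```

Conceptually this is the same loop as the unconditional sketch above, except that the initial context is your encoded prompt rather than the end-of-text token.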
## Future work

We may release code for evaluating the models on various benchmarks.
We are still considering release of the larger models.