Bibliography (13):

  1. TensorFlow (https://www.tensorflow.org/)

  2. openai/gpt-2: Code for the paper "Language Models are Unsupervised Multitask Learners" (https://github.com/openai/gpt-2)

  3. nshepperd/gpt-2: Code for the paper "Language Models are Unsupervised Multitask Learners" (https://github.com/nshepperd/gpt-2)

  4. ak9250/gpt-2-colab (https://github.com/ak9250/gpt-2-colab)

  5. Google Colaboratory (https://colab.research.google.com/)

  6. Radford et al. 2019, "Language Models are Unsupervised Multitask Learners" (OpenAI)

  7. "GPT-2 Neural Network Poetry" (gwern.net)

  8. AI Weirdness: "D&D character bios" (https://www.aiweirdness.com/d-and-d-character-bios-now-making-19-03-15/)

  9. minimaxir/textgenrnn: Easily train your own text-generating neural network of any size and complexity on any text dataset with a few lines of code (https://github.com/minimaxir/textgenrnn)

  10. minimaxir/gpt-2-simple (https://github.com/minimaxir/gpt-2-simple)

  11. Google Colaboratory notebook (https://colab.research.google.com/drive/1VLG8e7YSEwypxU-noRNhsv5dW4NfTGce)

  12. Wikipedia Bibliography:

    1. OpenAI (https://en.wikipedia.org/wiki/OpenAI)

    2. Recurrent neural network (https://en.wikipedia.org/wiki/Recurrent_neural_network)