“Crowdsourcing The Best GPT-2-1.5b Poetry”, Gwern (2020-02-09):

[Publicly-editable Google Docs document for coordinating a read-through of a large sample of neural-net-generated poetry, to locate the best poems for display in the GPT-2 writeup.]

I used a large neural net model, GPT-2-1.5b, trained on hundreds of megabytes of poetry, to generate 1 million words of poetry. That’s too much for me to read by myself to find the best poems. Perhaps you’d like to help?
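(The post itself doesn’t include the generation code, but for the curious, here is a minimal sketch of how such bulk sampling could be done with the HuggingFace `transformers` library. The `./gpt2-poetry` checkpoint path and all sampling parameters are hypothetical stand-ins for illustration, not the settings used to produce the original samples.)

```python
# Minimal sketch: bulk sampling from a fine-tuned GPT-2-1.5b with HuggingFace
# `transformers`. Assumptions (not from the original post): a poetry-fine-tuned
# checkpoint saved at "./gpt2-poetry", and illustrative nucleus-sampling settings.
import torch
from transformers import GPT2LMHeadModel, GPT2TokenizerFast

model_dir = "./gpt2-poetry"  # hypothetical fine-tuned checkpoint directory
tokenizer = GPT2TokenizerFast.from_pretrained(model_dir)
model = GPT2LMHeadModel.from_pretrained(model_dir)
model.eval()

# Start each batch from the end-of-text token for (roughly) unconditional samples.
prompt = tokenizer("<|endoftext|>", return_tensors="pt")

with torch.no_grad():
    samples = model.generate(
        **prompt,
        do_sample=True,          # stochastic sampling rather than greedy decoding
        top_p=0.95,              # nucleus sampling (illustrative value)
        temperature=1.0,
        max_length=1024,         # tokens per sample
        num_return_sequences=8,  # samples per batch; loop to accumulate ~1M words
        pad_token_id=tokenizer.eos_token_id,
    )

# Append the decoded batch to a growing plain-text dump of samples.
with open("poetry-samples.txt", "a", encoding="utf-8") as f:
    for s in samples:
        f.write(tokenizer.decode(s, skip_special_tokens=True) + "\n\n")
```

Repeating the generation loop and splitting the resulting dump into ≤ 1,000-line chunks would yield URLs like the ‘Open Samples’ below.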

It’s simple:

  1. Pick an unread URL from ‘Open Samples’ below, open it, and remove it from the list.

  2. Read it. (Each URL is ≤ 1,000 lines, so it should be fun.)

  3. Add any good poems to ‘Selected Samples’ at the end of this document.

  4. Enjoy reading the current ‘Selected Samples’—or pick another URL to read!