May 2020 News

May 2020 newsletter: GPT-3 scaling, implications, deep theory; anime GAN updates, and 1 book review.

2019-12-26–2023-03-27 · finished · certainty: certain · importance: log · bibliography

May 2020’s newsletter is now out; previous, April 2020 (archives). This is a collation of links and summary of major changes, overlapping with my Changelog; brought to you by my donors on Patreon.


  • Ganbooru prototype released: a 256px BigGAN trained on the Danbooru2019 Figures dataset


Mailing List Switch

The newsletter moved to Substack this month after reaching TinyLetter’s 5,000-subscriber limit. Please let me know of any issues beyond the known one of length truncation. (Reading the website version on desktop remains the recommended way to get annotations etc.)

On GPT-3: Meta-Learning, Scaling, Implications, And Deep Theory

On “GPT-3: Language Models are Few-Shot Learners”, Brown et al 2020 (poems & my followup GPT-3 Creative Writing, compare my old finetuned GPT-2 poetry; random samples; “OpenAI API” with real-world demos)

Moved to “The Scaling Hypothesis”.

  1. Don’t worry: we already have short-shorts & ear-TIPS to hedge against fursona inflation. That said, we advise taking a large position in equineties image macro funds to benefit from a flight to quality and herding: it’ll be a bear market for kinky bonds—and that’s no bull.↩︎

  2. Some interesting references on viral evolution: