The Unreasonable Effectiveness of Recurrent Neural Networks
Deep Learning for Assisting the Process of Music Composition (part 3)
karpathy/char-rnn: Multi-Layer Recurrent Neural Networks (LSTM, GRU, RNN) for Character-Level Language Models in Torch
ab-test#training-a-neural-net-to-generate-css
Decision Transformer: Reinforcement Learning via Sequence Modeling
https://citeseerx.ist.psu.edu/viewdoc/download?doi=10.1.1.258.5120&rep=rep1&type=pdf
https://workshop2015.iwslt.org/downloads/IWSLT_2015_RP_13.pdf
https://mi.eng.cam.ac.uk/projects/cued-rnnlm/papers/Interspeech15.pdf
https://homepages.inf.ed.ac.uk/abmayne/publications/sennrich2016NAACL.pdf
Generative Concatenative Nets Jointly Learn to Write and Classify Reviews
Controlling Linguistic Style Aspects in Neural Language Generation
Google’s Multilingual Neural Machine Translation System: Enabling Zero-Shot Translation
Physics of Language Models: Part 3.3, Knowledge Capacity Scaling Laws
minimaxir/textgenrnn: Easily Train Your Own Text-Generating Neural Network of Any Size and Complexity on Any Text Dataset With a Few Lines of Code
CTRL: A Conditional Transformer Language Model For Controllable Generation
MEGATRON-CNTRL: Controllable Story Generation with External Knowledge Using Large-Scale Language Models
Deep neural language modeling enables functional protein generation across families
RedCaps: web-curated image-text data created by the people, for the people
Controllable Natural Language Generation with Contrastive Prefixes