“Interacting With GPT-2 to Generate Controlled and Believable Musical Sequences in ABC Notation”, Cariña Geerlings, Albert Meroño-Peñuela, 2020-10-16:

Generating symbolic music with language models is a promising research area, with potential applications in automated music composition. Recent work shows that Transformer architectures can learn to generate compelling four-instrument scores from large MIDI datasets.

In this paper, we re-train the small (117M) GPT-2 model with a large dataset in ABC notation, and generate samples of single-instrument folk music.

Our quantitative (BLEU- and ROUGE-based) and qualitative (survey-based) evaluations suggest that the model learns ABC notation with syntactic and semantic correctness, and that its samples contain robust and believable n-grams.
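The core of a BLEU-style comparison between generated and reference music is modified n-gram precision: the fraction of n-grams in a generated sequence that also occur in the reference, with counts clipped so repeated n-grams cannot be over-credited. A minimal from-scratch sketch (the token sequences are hypothetical ABC-note tokens, not the paper's data):

```python
from collections import Counter

def ngram_precision(candidate, reference, n):
    """Modified n-gram precision (the core of BLEU): the fraction of
    candidate n-grams also present in the reference, with clipped counts."""
    cand_grams = [tuple(candidate[i:i + n]) for i in range(len(candidate) - n + 1)]
    if not cand_grams:
        return 0.0
    ref_counts = Counter(tuple(reference[i:i + n])
                         for i in range(len(reference) - n + 1))
    cand_counts = Counter(cand_grams)
    # Clip each candidate n-gram's count at its count in the reference.
    matched = sum(min(c, ref_counts[g]) for g, c in cand_counts.items())
    return matched / len(cand_grams)

# Hypothetical note-level tokens from a generated tune and a reference corpus.
generated = ["d2A", "AFA", "B2G", "GBd"]
reference = ["d2A", "AFA", "d2A", "AFA", "B2G", "GFG"]
print(round(ngram_precision(generated, reference, 2), 2))  # → 0.67
```

Full BLEU additionally combines precisions over several n-gram orders with a brevity penalty; this single-order version is enough to illustrate the "believable n-grams" measurement.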