Bibliography (3):
1. Xue et al., "ByT5: Towards a token-free future with pre-trained byte-to-byte models", TACL 2022.
2. Xue et al., "mT5: A massively multilingual pre-trained text-to-text transformer", NAACL 2021.
3. CharsiuG2P (multilingual grapheme-to-phoneme conversion toolkit): https://github.com/lingjzhu/CharsiuG2P