Bibliography (4):
Unsupervised Cross-lingual Representation Learning at Scale
mT5: A massively multilingual pre-trained text-to-text transformer
DistilBERT, a distilled version of BERT: smaller, faster, cheaper and lighter
Wikipedia Bibliography:
https://en.wikipedia.org/wiki/Amazon_Alexa :
https://en.wikipedia.org/wiki/Amazon_Alexa