Bibliography (4):

  1. Devlin, Chang, Lee, Toutanova. BERT: Pre-training of Deep Bidirectional Transformers for Language Understanding. NAACL 2019.

  2. Vaswani et al. Attention Is All You Need. NeurIPS 2017.

  3. Pre-trained BioBERT weights: https://github.com/naver/biobert-pretrained

  4. BioBERT code repository: https://github.com/dmis-lab/biobert