Bibliography (6):

  1. Socher et al. (2013). Recursive Deep Models for Semantic Compositionality Over a Sentiment Treebank. EMNLP.

  2. OLID: Offensive Language Identification Dataset. https://sites.google.com/site/offensevalsharedtask/olid

  3. Devlin et al. (2019). BERT: Pre-training of Deep Bidirectional Transformers for Language Understanding. NAACL.

  4. Yang et al. (2019). XLNet: Generalized Autoregressive Pretraining for Language Understanding. NeurIPS.

  5. He et al. (2021). DeBERTa: Decoding-enhanced BERT with Disentangled Attention. ICLR.

  6. TrojText code repository. https://github.com/UCF-ML-Research/TrojText