"Huggingface: transformers Repo" (and similar):
🤗 Transformers (formerly known as pytorch-transformers and pytorch-pretrained-bert) provides state-of-the-art general-purpose architectures (BERT, GPT-2, RoBERTa, XLM, XLNet, CTRL, ...) for Natural Language Understanding (NLU) and Natural Language Generation (NLG), with 32+ pretrained models in 100+ languages and deep interoperability between TensorFlow 2.0 and PyTorch.

Features
As easy to use as pytorch-transformers
As powerful and concise as Keras
High performance on NLU and NLG tasks
Low barrier to entry for educators and practitioners
State-of-the-art NLP for everyone:
Deep learning researchers
Hands-on practitioners
AI/ML/NLP teachers and educators
Lower compute costs, smaller carbon footprint:
Researchers can share trained models instead of always retraining
Practitioners can reduce compute time and production costs
10 architectures with over 30 pretrained models, some in more than 100 languages
Choose the right framework for every part of a model's lifetime:
Train state-of-the-art models in 3 lines of code
Deep interoperability between TensorFlow 2.0 and PyTorch models
Move a single model between TF2.0/PyTorch frameworks at will
Seamlessly pick the right framework for training, evaluation, production
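The "3 lines of code" claim above can be sketched with the library's pipeline API. This is a minimal illustration, assuming the transformers package and a deep-learning backend (PyTorch or TensorFlow 2.0) are installed; the default sentiment-analysis model is chosen by the library and downloaded on first use, not specified here:

```python
# Minimal sketch of training-free, state-of-the-art inference with
# 🤗 Transformers, assuming `transformers` and a backend are installed.
from transformers import pipeline

# Build a ready-to-use NLU pipeline from a pretrained model
# (the library picks and downloads a default model on first use).
classifier = pipeline("sentiment-analysis")

# Run inference on raw text; returns a list of dicts,
# each with "label" and "score" keys.
result = classifier("Transformers makes state-of-the-art NLP easy to use.")
print(result)
```

For the framework-interoperability bullets, the same pretrained weights can be loaded into either framework via the corresponding Auto classes (e.g. `AutoModel` for PyTorch and `TFAutoModel` for TensorFlow 2.0), so training, evaluation, and production can each use the framework that fits best.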