huggingface/transformers


Transformers (formerly known as pytorch-transformers and pytorch-pretrained-bert) provides state-of-the-art general-purpose architectures (BERT, GPT-2, RoBERTa, XLM, DistilBert, XLNet...) for Natural Language Understanding (NLU) and Natural Language Generation (NLG), with over 32 pretrained models in 100 languages and deep interoperability between TensorFlow 2.0 and PyTorch, so you can choose the right framework for every part of a model's lifetime.

This repo is tested on Python 2.7 and 3.5 (examples are tested only on Python 3.5), PyTorch 1.0.0 and TensorFlow 2.0.0-rc1.

To install Transformers, you first need to install one of, or both, TensorFlow 2.0 and PyTorch; this also holds if you install from source. Please refer to the TensorFlow installation page and/or the PyTorch installation page for the specific install command for your platform. Once TensorFlow 2.0 and/or PyTorch has been installed, Transformers itself can be installed with pip.
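For example, a minimal install (assuming the package name transformers on PyPI, which is how the repository distributes releases):

```bash
pip install transformers
```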
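To illustrate the TensorFlow 2.0 / PyTorch interoperability, here is a minimal sketch that loads the same pretrained BERT checkpoint into both frameworks; the checkpoint name bert-base-uncased is just an example, and both backends are assumed to be installed:

```python
import torch
from transformers import BertTokenizer, BertModel, TFBertModel

# The tokenizer is framework-agnostic.
tokenizer = BertTokenizer.from_pretrained("bert-base-uncased")

# Load the same pretrained weights as a PyTorch module...
pt_model = BertModel.from_pretrained("bert-base-uncased")

# ...or as a TensorFlow 2.0 (Keras) model.
tf_model = TFBertModel.from_pretrained("bert-base-uncased")

# Encode a sentence and run the PyTorch model.
input_ids = torch.tensor([tokenizer.encode("Hello, world!", add_special_tokens=True)])
with torch.no_grad():
    last_hidden_states = pt_model(input_ids)[0]  # shape: (1, sequence_length, 768)
```

Because both classes expose the same from_pretrained interface, you could, for instance, train a model in one framework and run inference with it in the other.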