Transformers 2.0: NLP library with deep interoperability between TensorFlow 2.0 and PyTorch
Last week, Hugging Face, a startup specializing in natural language processing, released a landmark update to its popular Transformers library, offering unprecedented compatibility between two major deep learning frameworks: PyTorch and TensorFlow 2.0.

Transformers (formerly known as pytorch-transformers and pytorch-pretrained-bert) provides general-purpose architectures (BERT, GPT-2, RoBERTa, XLM, DistilBert, XLNet, …) for Natural Language Understanding (NLU) and Natural Language Generation (NLG), with over 32 pretrained models in 100 languages and deep interoperability between TensorFlow 2.0 and PyTorch.
Oct-2-2019, 01:17:47 GMT