Scaling Training of HuggingFace Transformers With Determined

#artificialintelligence 

Training complex, state-of-the-art natural language processing (NLP) models is now far more approachable, thanks to HuggingFace. Its open-source Transformers library has become an essential go-to for data scientists and machine learning engineers, letting them implement state-of-the-art NLP models with straightforward library calls. As a result, the library has become a staple for NLP training at companies such as Baidu and Alibaba, and has contributed to state-of-the-art results on several NLP tasks. Our friends at Determined AI are hosting an exciting lunch-and-learn on training HuggingFace Transformers at scale using Determined. Learn to train Transformers with distributed training, hyperparameter searches, and cheap spot instances -- all without modifying your code. Please consider joining on Wednesday, June 30th at 10 AM PT for a hands-on tutorial from Liam Li, a Senior Machine Learning Engineer at Determined AI, and Angela Jiang, a Product Manager at Determined AI (lunch included!).
