Amazon AWS, Hugging Face team up to spread open-source deep learning

ZDNet

Two years ago, the New York-based startup Hugging Face burst onto the natural language processing scene with a way to let many more parties participate in state-of-the-art deep learning. Transformers, a programming kit for quickly grabbing any of a number of natural language neural networks, including Google's BERT, was released as a library that can be invoked from the dominant programming frameworks, PyTorch and TensorFlow. Its popularity has grown steadily: more than 5,000 organizations now use the library, the company says. That popularity has been recognized by Amazon AWS, the world's biggest cloud computing provider, and the two companies on Tuesday announced a partnership to combine Transformers with the best aspects of Amazon AWS's programming tools.


What Hugging Face and Microsoft's collaboration means for applied AI

#artificialintelligence

Last week, Hugging Face announced a new product in collaboration with Microsoft called Hugging Face Endpoints on Azure, which allows users to set up and run thousands of machine learning models on Microsoft's cloud platform. Having started as a chatbot application, Hugging Face made its fame as a hub for transformer models, a type of deep learning architecture that has been behind many recent advances in artificial intelligence, including large language models like OpenAI GPT-3 and DeepMind's protein-folding model AlphaFold. Large tech companies like Google, Facebook, and Microsoft have been using transformer models for several years. But the past couple of years has seen a growing interest in transformers among smaller companies, including many that don't have in-house machine learning talent. This is a great opportunity for companies like Hugging Face, whose vision is to become the GitHub for machine learning.


Hugging Face collaborates with Microsoft for new AI-powered service – TechCrunch

#artificialintelligence

Fresh off a $100 million funding round, Hugging Face, which provides hosted AI services and a community-driven portal for AI tools and data sets, today announced a new product in collaboration with Microsoft. Called Hugging Face Endpoints on Azure, Hugging Face co-founder and CEO Clément Delangue described it as a way to turn Hugging Face-developed AI models into "scalable production solutions." "The mission of Hugging Face is to democratize good machine learning," Delangue said in a press release. "We're striving to help every developer and organization build high-quality, machine learning-powered applications that have a positive impact on society and businesses. With Hugging Face Endpoints, we've made it simpler than ever to deploy state-of-the-art models, and we can't wait to see what Azure customers will build with them." The demand for AI remains high.


Hugging Face launches popular Transformers NLP library for TensorFlow

#artificialintelligence

Maker of the popular PyTorch-Transformers model library, Hugging Face today said it's bringing its NLP library to the TensorFlow machine learning framework. The PyTorch version of the library has seen more than 500,000 pip installs since the beginning of the year, Hugging Face CEO Clément Delangue told VentureBeat. The Transformers library for TensorFlow brings together the most advanced Transformer-based AI models, like Google's BERT and XLNet, Facebook's RoBERTa, and OpenAI's GPT and GPT-2. It also includes Hugging Face's DistilBERT. Several of these models rank at or near the top of the GLUE benchmark leaderboard, in some cases exceeding the human baseline.