New Research Shows How AI Can Act as Mediators

#artificialintelligence

According to VentureBeat, AI researchers at Uber have recently posted a paper to arXiv outlining a new platform intended to assist in the creation of distributed AI models. The platform, called Fiber, can be used to drive both reinforcement learning tasks and population-based learning. Fiber is designed to make large-scale parallel computation more accessible to non-experts, letting them take advantage of the power of distributed AI algorithms and models. Fiber has recently been open-sourced on GitHub; it requires Python 3.6 or above and runs on Linux with Kubernetes, typically in a cloud environment. According to the team of researchers, the platform can easily scale up to hundreds or thousands of individual machines.
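Fiber's interface is modelled on Python's standard multiprocessing module, so code written against a process pool can often be ported with little change. The following is a minimal, illustrative sketch in that style; the evaluate function is a hypothetical stand-in for an expensive simulation, and backend and cluster configuration details follow the project's README on GitHub rather than anything shown here.

    # Minimal sketch assuming Fiber's multiprocessing-style Pool API.
    # Install with `pip install fiber`; on a Kubernetes backend each
    # worker can run in its own container.
    from fiber import Pool

    def evaluate(x):
        # Hypothetical stand-in for an expensive simulation or policy rollout.
        return x * x

    if __name__ == "__main__":
        pool = Pool(processes=4)
        # map distributes the calls across the pool's workers.
        print(pool.map(evaluate, range(16)))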


Fiber: A Platform for Efficient Development and Distributed Training for Reinforcement Learning and Population-Based Methods

arXiv.org Machine Learning

Recent advances in machine learning are consistently enabled by increasing amounts of computation. Reinforcement learning (RL) and population-based methods in particular pose unique challenges for efficiency and flexibility to the underlying distributed computing frameworks. These challenges include frequent interaction with simulations, the need for dynamic scaling, and the need for a user interface with low adoption cost and consistency across different backends. In this paper we address these challenges while still retaining development efficiency and flexibility for both research and practical applications by introducing Fiber, a scalable distributed computing framework for RL and population-based methods. Fiber aims to significantly expand the accessibility of large-scale parallel computation to users of otherwise complicated RL and population-based approaches without the need for specialized computational expertise.
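The workloads the abstract describes share a common shape: each candidate in a population (or each policy rollout) is evaluated by an independent, expensive simulation, so evaluations can be farmed out to a pool of workers. The sketch below is not taken from the paper; it only illustrates that pattern with the standard library's multiprocessing.Pool, whose interface Fiber mirrors according to the project's documentation, so fiber.Pool could in principle be dropped in to distribute the same loop across a cluster. The fitness and mutate functions are hypothetical placeholders.

    # Illustrative population-based loop: parallel fitness evaluation,
    # truncation selection, then mutation of the survivors.
    import random
    from multiprocessing import Pool

    def fitness(params):
        # Hypothetical scoring function; in RL this would be a simulation rollout.
        return -sum((p - 0.5) ** 2 for p in params)

    def mutate(params, scale=0.1):
        # Gaussian perturbation of a candidate's parameters.
        return [p + random.gauss(0, scale) for p in params]

    if __name__ == "__main__":
        population = [[random.random() for _ in range(8)] for _ in range(32)]
        pool = Pool(processes=4)
        for generation in range(10):
            # Evaluations are independent, so they are farmed out to workers.
            scores = pool.map(fitness, population)
            ranked = sorted(zip(scores, population), key=lambda t: t[0], reverse=True)
            survivors = [p for _, p in ranked[: len(ranked) // 2]]
            # Refill the population with mutated copies of the survivors.
            population = survivors + [mutate(p) for p in survivors]
            print(f"generation {generation}: best fitness {ranked[0][0]:.4f}")
        pool.close()
        pool.join()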


Uber Open-Sources Fiber - A New Library For Distributed Machine Learning

#artificialintelligence

Technologies such as machine learning and deep learning require colossal amounts of data to improve the accuracy of their outcomes. However, it is nearly impossible for a single local computer to process such vast amounts of data. As a result, practitioners turn to distributed computing to obtain the computational power needed to deliver quick and accurate results. However, effectively managing distributed computation is not straightforward, and this hinders the training and evaluation of AI models. To address these challenges, Uber has open-sourced its Fiber framework to help researchers and developers streamline their large-scale parallel scientific computation.


OpenAI's Microscope, TensorFlow Profiler & More: AI Releases This Week

#artificialintelligence

This week, we witnessed open-source tools focusing mostly on making models lighter and more explainable. OpenAI, especially, has come up with an interesting tool to promote the interpretability of ML models. Furthermore, TensorFlow has made it even simpler for developers to execute their models. Let us take a look at the top AI news for developers from this week. The OpenAI Microscope tool is a collection of visualisations of every significant layer and neuron of eight vision 'model organisms', which are often studied in interpretability.
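On the TensorFlow side, the release highlighted in this roundup is the Profiler. As a rough sketch of how such profiling is typically driven, assuming TensorFlow 2.x where the profiler lives under tf.profiler.experimental (this example is illustrative and not taken from the article):

    import tensorflow as tf

    # Start a profiling session; the captured trace is written to "logdir"
    # and can be inspected in TensorBoard's Profile tab.
    tf.profiler.experimental.start("logdir")

    x = tf.random.normal([512, 512])
    for _ in range(10):
        x = tf.matmul(x, x)  # a small workload to capture in the trace

    tf.profiler.experimental.stop()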


Last Week in AI

#artificialintelligence

Every week, Invector Labs publishes a newsletter that covers the most recent developments in AI research and technology. You can find this week's issue below, and you can sign up to receive it there as well. Training is one of the most frequently overlooked elements of building machine learning solutions at scale. While training machine learning models seems conceptually simple, it becomes genuinely complicated when applied to large models or to a large number of models.