Last week I published a brief analysis of the OpenMined platform as one of the new technologies trying to enable truly decentralized artificial intelligence (AI) processes by leveraging blockchain technologies. In the article, I mentioned that OpenMined drew part of its inspiration from Google's research on federated learning as a mechanism to improve on the traditional centralized approach to training AI models. From my perspective, federated learning is one of the most interesting AI research breakthroughs of the last two years, and it is already powering mission-critical applications.
The emerging field of decentralized artificial intelligence (AI) is becoming one of the most exciting technology trends of the last few months. A lot has been written about the potential value of the intersection of AI and blockchain technologies, and this year we have even seen entire conferences dedicated to the subject of decentralized AI. However, I feel that much of the hype behind decentralized AI fails to highlight some of the key value propositions that can make this new technology movement one of the most foundational trends of the decade. If you believe that AI is going to become an increasingly influential factor in our daily lives, then decentralized AI will be an essential element in guiding the impact that machine intelligence has on future generations. Let's look at some of the economic dynamics behind decentralized AI to try to clarify this point.
In 2017 Google introduced Federated Learning (FL), "a specific category of distributed machine learning approaches which trains machine learning models using decentralized data residing on end devices such as mobile phones." A new Google paper now proposes a scalable production system for federated learning, designed to handle growing workloads through the addition of resources such as compute, storage, and bandwidth. The paper also addresses various FL challenges, solutions, and future prospects: "Federated Learning is a distributed machine learning approach which enables model training on a large corpus of decentralized data. We have built a scalable production system for Federated Learning in the domain of mobile devices, based on TensorFlow. In this paper, we describe the resulting high-level design, sketch some of the challenges and their solutions, and touch upon the open problems and future directions."
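The core training loop behind federated learning can be made concrete with a small simulation. The sketch below implements the federated averaging idea from Google's work: each client trains on its own private data, and a server only aggregates the resulting model weights, never the raw data. This is a minimal illustration using a toy linear model with NumPy, not Google's production system; all function names and hyperparameters here are my own choices.

```python
import numpy as np

def local_update(weights, X, y, lr=0.1, epochs=5):
    """One client's local gradient descent on its private data
    (linear model, squared loss). Raw data never leaves this function."""
    w = weights.copy()
    for _ in range(epochs):
        grad = 2 * X.T @ (X @ w - y) / len(y)
        w -= lr * grad
    return w

def federated_averaging(global_w, client_data, rounds=10):
    """Server loop: broadcast the global model, collect locally trained
    weights, and average them weighted by each client's data size."""
    for _ in range(rounds):
        updates, sizes = [], []
        for X, y in client_data:
            updates.append(local_update(global_w, X, y))
            sizes.append(len(y))
        sizes = np.array(sizes, dtype=float)
        global_w = np.average(updates, axis=0, weights=sizes / sizes.sum())
    return global_w

# Two simulated clients whose private datasets share the true relation y = 2x.
rng = np.random.default_rng(0)
clients = []
for n in (50, 100):
    X = rng.normal(size=(n, 1))
    clients.append((X, 2.0 * X[:, 0]))

w = federated_averaging(np.zeros(1), clients, rounds=20)
print(w)  # converges toward [2.0]
```

Note that the server sees only weight vectors, so the weighted average recovers the shared model without any client ever transmitting its examples; the production system described in the paper layers secure aggregation, device scheduling, and fault tolerance on top of this basic loop.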
The recent rapid development of artificial intelligence (AI), mainly driven by machine learning research and especially deep learning, has achieved phenomenal success in various applications. However, to further apply AI technologies in real-world contexts, several significant issues in the AI ecosystem must be addressed. The authors identify the main issues as data privacy, ownership, and exchange, which are difficult to solve within the current centralized paradigm of machine learning training. In response, they propose a novel blockchain-based model training paradigm, named Galaxy Learning, which aims to train a model over distributed data while reserving ownership of that data for its owners. In this new paradigm, encrypted models are moved around instead of the data, and are federated once trained. Model training, as well as communication, is coordinated through blockchain and its smart contracts. Training data is priced according to its contribution to the model, so the exchange involves no transfer of data ownership. In this position paper, the authors describe the motivation, paradigm, design, and challenges as well as opportunities of Galaxy Learning.
Today's AI still faces two major challenges. One is that in most industries, data exists in the form of isolated islands. The other is the tightening of data privacy and security requirements. The authors propose a possible solution to these challenges: secure federated learning. Going beyond the federated learning framework first proposed by Google in 2016, they introduce a comprehensive secure federated learning framework, which includes horizontal federated learning, vertical federated learning, and federated transfer learning. They provide definitions, architectures, and applications for the federated learning framework, along with a comprehensive survey of existing work on the subject. In addition, they propose building data networks among organizations based on federated mechanisms as an effective way to share knowledge without compromising user privacy.
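The distinction between the horizontal and vertical settings comes down to how the data matrix is split across organizations. The toy example below makes that split explicit; the "bank" and "shop" labels are hypothetical parties chosen for illustration, not drawn from the survey itself.

```python
import numpy as np

# A toy "full" dataset: 6 users (rows) x 4 features (columns).
full = np.arange(24).reshape(6, 4)

# Horizontal FL: parties hold DIFFERENT users over the SAME feature space,
# e.g. two regional banks with an identical schema. The data matrix is
# split by rows.
bank_a, bank_b = full[:3, :], full[3:, :]

# Vertical FL: parties hold the SAME users but DIFFERENT features,
# e.g. a bank and an e-commerce site with overlapping customers. The
# data matrix is split by columns.
bank, shop = full[:, :2], full[:, 2:]

print(bank_a.shape, bank_b.shape)  # (3, 4) (3, 4)
print(bank.shape, shop.shape)      # (6, 2) (6, 2)
```

Federated transfer learning covers the remaining case, where parties overlap in neither users nor features and knowledge must be transferred between the two domains instead of aggregated directly.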