Paper Summary: Neural Ordinary Differential Equations


NIPS 2018 (Montreal, Canada), now known as NeurIPS, is over, and I would like to take the opportunity to dissect one of the papers that received a Best Paper Award at this prestigious conference. The paper is Neural Ordinary Differential Equations (arXiv link), and its authors are affiliated with the famous Vector Institute at the University of Toronto. In this post, I will explain some of the paper's main ideas and discuss their potential implications for the future of Deep Learning. Since the paper is quite advanced and builds on concepts such as Ordinary Differential Equations (ODEs), Recurrent Neural Networks (RNNs), and Normalizing Flows (NFs), I suggest reading up on these topics if you are not familiar with them, as I will not cover them in detail. However, I will try to explain the paper's ideas as intuitively as possible, so that you can grasp the main concepts without going too deep into the technical details. If you are interested, you can read up on those details afterwards in the original paper.
