Why You Should (or Shouldn't) be Using Google's JAX in 2022


Since Google's JAX hit the scene in late 2018, it has been steadily growing in popularity, and for good reason. DeepMind announced in 2020 that it is using JAX to accelerate its research, and a growing number of publications and projects from Google Brain and others are using JAX as well. With all of this buzz, it seems like JAX is the next big Deep Learning framework, right?

In this article we'll clarify what JAX is (and isn't), why you should care (or shouldn't, but you probably should), and whether you should (or shouldn't) use it. If you're already familiar with JAX and want to skip the benchmarks, you can jump ahead to our recommendations on when to use it.

It may be best to start off with what JAX is *not*. JAX is not a Deep Learning framework or library, and it is not designed to ever become one in and of itself. In a sentence, JAX is a high-performance numerical computing library that incorporates composable function transformations[1]. This is the universal aspect of JAX that is relevant for any use case.
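To make "composable function transformations" concrete, here is a minimal sketch using three of JAX's core transformations (`jax.grad`, `jax.jit`, and `jax.vmap`). The function `f` below is an arbitrary toy example of ours, not something from the JAX docs, and the transformations can be stacked on top of one another:

```python
import jax
import jax.numpy as jnp

def f(x):
    """A plain Python function on JAX arrays: f(x) = x^2."""
    return x ** 2

# grad: transform f into a function computing df/dx.
df = jax.grad(f)
print(df(3.0))  # derivative of x^2 at x=3 is 6.0

# jit: transform f into an XLA-compiled version.
fast_f = jax.jit(f)
print(fast_f(3.0))

# vmap: transform df into a version mapped over a batch axis.
batched_df = jax.vmap(df)
print(batched_df(jnp.arange(3.0)))  # derivatives at [0., 1., 2.]

# Composability: the transformations nest freely.
fast_batched_df = jax.jit(jax.vmap(jax.grad(f)))
print(fast_batched_df(jnp.arange(3.0)))
```

Note that nothing here is specific to Deep Learning; gradients, compilation, and vectorization apply to any numerical function, which is the point of calling JAX a general numerical computing library.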