Evolving Neural Networks in JAX
"So why should I switch from insert-autodiff-library to JAX?". Here is my answer: JAX is not simply a fast library for automatic differentiation. If your scientific computing project wants to benefit from XLA, JIT-compilation and the bulk-array programming paradigm -- then JAX provides a wonderful API. While PyTorch relies on pre-compiled kernels and fast C code for most common Deep Learning applications, JAX allows us to leverage a high-level interface for programming your favorite accelerators. But this is not restricted to standard gradient-based optimization setups.
Feb-14-2021, 06:00:21 GMT
- Technology