Tensor Comprehensions in PyTorch
Tensor Comprehensions (TC) is a tool that lowers the barrier for writing high-performance code. It generates GPU code from a simple high-level language and autotunes that code for specific input sizes. We highly recommend reading the Tensor Comprehensions blogpost first. If you run into any of the following scenarios, TC is a useful tool for you. Tensor Comprehensions are seamless to use in PyTorch, interoperating with PyTorch Tensors and nn Variables.
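As a minimal illustration of the high-level language, a TC names its inputs and outputs with sized types and defines each output element by an index expression; loop bounds are inferred from the tensor sizes, so no explicit loops or ranges are written. For example, a matrix multiplication can be written as a single comprehension (a sketch in the TC language, where `+=!` denotes a sum-reduction whose accumulator is initialized to zero):

```
def matmul(float(M, K) A, float(K, N) B) -> (C) {
    # C(m, n) is the sum over k of A(m, k) * B(k, n);
    # the ranges of m, n, and k are inferred from the sizes M, K, N.
    C(m, n) +=! A(m, k) * B(k, n)
}
```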
[R] Tensor Comprehensions in PyTorch • r/MachineLearning
Up to a dozen assignment expressions per TC definition sounds reasonable. In practice it depends on the layer types. There are basically two scaling limiters: compilation time, and especially autotuning time, grows very fast with the number of expressions; and the amount of exploitable parallelism may decrease as the number of inter-dependent operations increases. One of the goals of TC is to make it easy to define new layers and look at the performance achieved in practice. It should be as easy as moving a line between two TC defs and changing the input/output tensor lists.
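As a sketch of what moving a line between two TC defs looks like, a fused bias-add-plus-ReLU can be split into two separate definitions by moving the second statement into its own def and promoting the intermediate tensor to an output of one def and an input of the next (TC-language sketch; the names are illustrative):

```
# Fused: one def, one kernel candidate.
def bias_relu(float(B, N) I, float(N) Bias) -> (O) {
    O(b, n) = I(b, n) + Bias(n)
    O(b, n) = fmax(O(b, n), 0)
}

# Split: the intermediate T now appears in both tensor lists.
def bias_add(float(B, N) I, float(N) Bias) -> (T) {
    T(b, n) = I(b, n) + Bias(n)
}
def relu(float(B, N) T) -> (O) {
    O(b, n) = fmax(T(b, n), 0)
}
```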
Announcing Tensor Comprehensions
Today, Facebook AI Research (FAIR) is announcing the release of Tensor Comprehensions, a C++ library and mathematical language that helps bridge the gap between researchers, who communicate in terms of mathematical operations, and engineers focusing on the practical needs of running large-scale models on various hardware backends. The main differentiating feature of Tensor Comprehensions is that it represents a unique take on Just-In-Time compilation to produce, automatically and on demand, the high-performance code that the machine learning community needs. Over the last few years, the deep learning community has grown to rely on high-performance libraries such as cuBLAS, MKL, and cuDNN to get high-performance code on GPUs and CPUs. Experimenting with ideas that deviate from the primitives provided in these libraries involves a level and magnitude of engineering that can be intimidating to researchers. We anticipate great practical value in open-sourcing a package that shortens this process from days or weeks to minutes.
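To make concrete the kind of layer that deviates from stock library primitives, a fully connected layer with bias and ReLU can be expressed as one fused TC instead of three separate library calls, and then JIT-compiled and autotuned for the actual input sizes (a sketch in the TC language; the names are illustrative, and `fmax` and the `+=!` reduction follow TC conventions):

```
def fcrelu(float(B, M) I, float(N, M) W, float(N) Bias) -> (O) {
    # Matrix multiply: sum-reduce over m, accumulator initialized to zero.
    O(b, n) +=! I(b, m) * W(n, m)
    # Bias add and ReLU, fused into the same definition.
    O(b, n) = O(b, n) + Bias(n)
    O(b, n) = fmax(O(b, n), 0)
}
```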