Review for NeurIPS paper: AdaTune: Adaptive Tensor Program Compilation Made Efficient

Neural Information Processing Systems

Weaknesses:

* OpenTuner: OpenTuner (http://opentuner.org) is a general-purpose auto-tuning framework, widely used in compiler and automatic program optimization settings, built for tuning the various knobs such systems expose. The authors do not mention or compare against it. The authors only use simulated annealing; I would like to see how the optimization times compare with OpenTuner, which can adaptively switch between different search techniques.
* Evaluation on larger NNs: Results from optimizing ResNet-50 would be more compelling.
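For context on the search strategy the review mentions: below is a minimal sketch of simulated annealing over a single integer tuning knob. The knob, cost function, and cooling schedule are all illustrative assumptions for exposition; this is not the paper's or OpenTuner's actual implementation.

```python
import math
import random

def simulated_annealing(cost, lo, hi, steps=500, t0=1.0, seed=0):
    """Minimize cost(x) over integers in [lo, hi] via simulated annealing.

    Illustrative sketch only: `cost` stands in for a measured program
    runtime and [lo, hi] for the range of one tuning knob.
    """
    rng = random.Random(seed)
    x = rng.randint(lo, hi)
    best_x, best_c = x, cost(x)
    cur_c = best_c
    for step in range(steps):
        t = t0 * (1 - step / steps) + 1e-9          # linear cooling schedule
        nx = min(hi, max(lo, x + rng.choice([-1, 1])))  # neighboring config
        nc = cost(nx)
        # accept downhill moves always; uphill with Boltzmann probability
        if nc <= cur_c or rng.random() < math.exp((cur_c - nc) / t):
            x, cur_c = nx, nc
            if cur_c < best_c:
                best_x, best_c = x, cur_c
    return best_x, best_c

# hypothetical "runtime" as a function of a tile-size knob, minimized at 17
runtime = lambda k: (k - 17) ** 2 + 3
print(simulated_annealing(runtime, 0, 64))
```

A framework like OpenTuner would instead manage an ensemble of such search techniques and allocate evaluation budget among them adaptively, which is the basis of the reviewer's comparison request.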


Multitask and Transfer Learning for Autotuning Exascale Applications

Sid-Lakhdar, Wissam M., Aznaveh, Mohsen Mahmoudi, Li, Xiaoye S., Demmel, James W.

arXiv.org Machine Learning

Multitask learning and transfer learning have proven useful in machine learning when additional knowledge is available to aid a prediction task. We aim to derive methods following these paradigms for use in autotuning, where the goal is to find the optimal performance parameters of an application treated as a black-box function. We show comparative results with state-of-the-art autotuning techniques: for instance, we observe an average 1.5x improvement in application runtime compared to the OpenTuner and HpBandSter autotuners. We explain how our approaches can be more suitable than some state-of-the-art autotuners for tuning applications in general, and expensive exascale applications in particular.
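To make the transfer-learning idea in the abstract concrete: one simple form of transfer is warm-starting a black-box search on a new task with configurations that performed well on a related task. The sketch below is an illustrative toy, with a hypothetical cost model and seed configurations, and is not the authors' actual method.

```python
import random

def tune(cost, seeded, budget, seed=0):
    """Return the best of `budget` evaluated configurations.

    Seeded candidates (e.g. optima transferred from a related task)
    are tried first; the remaining budget goes to random search over
    an assumed knob range [0, 127]. Illustrative sketch only.
    """
    rng = random.Random(seed)
    tried = list(seeded[:budget])
    while len(tried) < budget:
        tried.append(rng.randint(0, 127))
    return min(tried, key=cost)

# hypothetical runtime model for the target task, minimized at k = 40
target_cost = lambda k: abs(k - 40)

cold = tune(target_cost, [], budget=5)        # pure random search
warm = tune(target_cost, [38, 42], budget=5)  # seeded from a "source task"
print(warm, target_cost(warm))
```

Under this toy model the warm-started run is guaranteed a cost of at most 2, since a near-optimal transferred configuration is always evaluated; the cold run's quality depends entirely on its random draws. The paper's multitask approach is more sophisticated, but the expensive-exascale motivation is the same: every evaluation is costly, so prior-task knowledge shrinks the effective search.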