Improving Mask R-CNN Convergence with PyTorch Lightning and SageMaker Debugger


MLPerf training times represent the state of the art in machine learning performance: AI industry leaders publish their best training times for a set of common machine learning models. But optimizing for training speed means these models are often complex and difficult to move to practical applications. Last year, we published SageMakerCV, a collection of computer vision models based on MLPerf, but with added flexibility and optimization for use on Amazon SageMaker. The recently published MLPerf 2.0 adds a series of new optimizations. In this blog, we discuss those optimizations and show how we can use PyTorch Lightning and SageMaker Debugger to further improve training performance and flexibility.
