Classroom Contents
Accelerating Vision AI Applications Using NVIDIA Transfer Learning Toolkit and Pre-Trained Models
- 1 Intro
- 2 TRAINING CHALLENGES
- 3 TRANSFER LEARNING TOOLKIT (TLT)
- 4 TRANSFER LEARNING TOOLKIT 2.0
- 5 PURPOSE BUILT PRE-TRAINED NETWORKS Highly Accurate, Re-Trainable, Out-of-Box Deployment
- 6 QUANTIZATION AWARE TRAINING Maintain comparable Performance & Speed Up Inference using INT8 Precision
- 7 AUTOMATIC MIXED PRECISION (AMP) Train with half-precision while maintaining the same network accuracy as single precision (see the sketch after this list)
- 8 INSTANCE SEGMENTATION - MASK R-CNN
- 9 PEOPLENET
- 10 FACE MASK DETECTION
- 11 TRAINING WORKFLOW
- 12 CONVERT TO KITTI
- 13 TLT SPEC FILES
- 14 PREPARE THE DATASET
- 15 TRAIN - PRUNE - EVALUATE
- 16 TRAINING SPEC - DATASET AND MODEL
- 17 EVALUATION SPEC
- 18 TRAINING & EVALUATION
- 19 MODEL PRUNING
- 20 RE-TRAIN & EVALUATE
- 21 TRAINING KPI
- 22 QUANTIZATION & EXPORT
- 23 INFERENCE SPEC
- 24 DEPLOY USING DEEPSTREAM
- 25 SUMMARY
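
Chapters 6 and 7 above name two training-time optimizations, INT8 quantization-aware training and automatic mixed precision (AMP). As a rough illustration of the mixed-precision idea only, here is a minimal sketch using PyTorch's torch.cuda.amp API; the model, data, and training loop are hypothetical stand-ins, and this is not how the Transfer Learning Toolkit itself enables AMP (TLT configures it through its TensorFlow-based training spec).

```python
# Minimal, generic AMP training-loop sketch (illustration only, not TLT code).
import torch
import torch.nn as nn

device = "cuda" if torch.cuda.is_available() else "cpu"

# Hypothetical small model and optimizer, stand-ins for a real detection network.
model = nn.Sequential(nn.Linear(128, 64), nn.ReLU(), nn.Linear(64, 10)).to(device)
optimizer = torch.optim.SGD(model.parameters(), lr=1e-3)
loss_fn = nn.CrossEntropyLoss()

# GradScaler rescales the loss so small float16 gradients do not underflow.
scaler = torch.cuda.amp.GradScaler(enabled=(device == "cuda"))

for step in range(10):
    inputs = torch.randn(32, 128, device=device)          # dummy batch
    targets = torch.randint(0, 10, (32,), device=device)  # dummy labels

    optimizer.zero_grad()
    # Forward pass runs selected ops in half precision; accuracy-sensitive ops stay float32.
    with torch.cuda.amp.autocast(enabled=(device == "cuda")):
        loss = loss_fn(model(inputs), targets)

    scaler.scale(loss).backward()
    scaler.step(optimizer)
    scaler.update()
```

The point of the sketch is the division of labor AMP relies on: the forward pass uses half precision where it is numerically safe, while loss scaling and the master weights keep training behavior close to full single precision, which is the "same accuracy, faster training" claim in chapter 7.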