Refurbish Your Training Data - Reusing Partially Augmented Samples for Faster Deep Neural Network Training

USENIX via YouTube

Classroom Contents

  1. Intro
  2. DNN Training Pipeline
  3. Overhead of Data Augmentation
  4. Existing Approach: Data Echoing
  5. Our Approach: Data Refurbishing
  6. Analysis on Sample Diversity
  7. Standard Training
  8. Challenge: Inconsistent Batch Time
  9. PyTorch Dataloader
  10. Revamper
  11. Balanced Eviction
  12. Cache-Aware Shuffle
  13. Implementation
  14. Evaluation: Environments
  15. Evaluation: Baselines
  16. Evaluation: Accuracy & Throughput
  17. Conclusion
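
The "DNN Training Pipeline" and "PyTorch Dataloader" chapters refer to the standard setup in which every sample is re-augmented from scratch in every epoch; this per-epoch CPU work is the augmentation overhead the talk sets out to reduce. Below is a minimal sketch of that baseline pipeline, not the paper's Revamper implementation; the dataset (CIFAR-10), transforms, batch size, and worker count are illustrative assumptions.

```python
import torch
from torch.utils.data import DataLoader
from torchvision import datasets, transforms

# Standard augmentation chain: every sample passes through the full
# chain in every epoch, which is the CPU-side cost the talk targets.
train_transform = transforms.Compose([
    transforms.RandomCrop(32, padding=4),
    transforms.RandomHorizontalFlip(),
    transforms.ToTensor(),
    transforms.Normalize((0.4914, 0.4822, 0.4465),
                         (0.2470, 0.2435, 0.2616)),
])

train_set = datasets.CIFAR10(root="./data", train=True, download=True,
                             transform=train_transform)

# num_workers controls how many CPU processes run the augmentation;
# with heavy transforms the GPU can stall waiting on this stage.
train_loader = DataLoader(train_set, batch_size=128, shuffle=True,
                          num_workers=4, pin_memory=True)

for images, labels in train_loader:
    ...  # forward/backward pass on the accelerator
```

Data refurbishing, as the title suggests, instead caches partially augmented samples and re-applies only the remaining transforms each epoch, rather than rerunning the whole chain above for every sample.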
