Overview
Bringing a machine learning model into the real world involves a lot more than just modeling. This Specialization will teach you how to navigate various deployment scenarios and use data more effectively to train your model.
This second course teaches you how to run your machine learning models in mobile applications. You’ll learn how to prepare models for lower-powered, battery-operated devices, then execute models on both Android and iOS platforms. Finally, you’ll explore how to deploy on embedded systems using TensorFlow on Raspberry Pi and microcontrollers.
This Specialization builds upon our TensorFlow in Practice Specialization. If you are new to TensorFlow, we recommend that you take the TensorFlow in Practice Specialization first. To develop a deeper, foundational understanding of how neural networks work, we recommend that you take the Deep Learning Specialization.
Syllabus
- Device-based models with TensorFlow Lite
- Welcome to this course on TensorFlow Lite, an exciting technology that allows you to put your models directly and literally into people's hands. You'll start with a deep dive into the technology and how it works, learning how you can optimize your models for mobile use -- where battery power and processing power become important factors. You'll then look at building applications on Android and iOS that use models, and you'll see how to use the TensorFlow Lite Interpreter in these environments. You'll wrap up the course with a look at embedded systems and microcontrollers, running your models on Raspberry Pi and SparkFun Edge boards. Don't worry if you don't have access to the hardware -- for the most part you'll be able to do everything in emulated environments. So, let's get started by looking at what TensorFlow Lite is and how it works! (A short conversion sketch appears after this syllabus.)
- Running a TF model in an Android App
- Last week you learned about TensorFlow Lite and saw how to convert your models from TensorFlow to TensorFlow Lite format. You also learned about the standalone TensorFlow Lite Interpreter, which can be used to test these models. You wrapped up with an exercise that converted a Fashion MNIST-based model to TensorFlow Lite and then tested it with the interpreter (a sketch of that interpreter workflow follows this syllabus). This week you'll look at the first of the deployment types for this course: Android. Android is a versatile operating system that is used in a number of different device types, most commonly phones, tablets and TV systems. Using TensorFlow Lite you can run your models on Android, so you can bring ML to any of these device types. While it helps to understand some Android programming concepts, we hope that you'll be able to follow along even if you don't, and at the very least try out the full sample apps that we'll explore for Image Classification, Object Detection and more!
- Building the TensorFlow model on iOS
- The other popular mobile operating system is, of course, iOS. So this week you'll do very similar tasks to last week -- learning how to take models and run them on iOS. You'll need some programming background with Swift for iOS to fully understand everything we go through, but even if you don't have this expertise, I think this week's content is something you'll find fun to explore -- and you'll learn how to build a variety of ML applications that run on this important operating system!
- TensorFlow Lite on devices
- Now that you've looked at TensorFlow Lite and explored building apps on Android and iOS that use it, the next and final step is to explore embedded systems like Raspberry Pi, and learn how to get your models running on one. The nice thing is that the Pi is a full Linux system, so it can run Python, allowing you to either use the full TensorFlow for training and inference, or just the interpreter for inference. I'd recommend the latter, as training on a Pi can be slow! (See the Pi-oriented sketch below.)
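To make the week-one conversion step concrete, here is a minimal Python sketch of converting a Keras model to TensorFlow Lite with default optimizations (post-training quantization, which shrinks the model for battery- and processing-constrained devices). The tiny classifier architecture and the model.tflite filename are illustrative, not taken from the course materials.

```python
import tensorflow as tf

# An illustrative stand-in model -- any trained Keras model works here.
model = tf.keras.Sequential([
    tf.keras.layers.Flatten(input_shape=(28, 28)),
    tf.keras.layers.Dense(128, activation="relu"),
    tf.keras.layers.Dense(10, activation="softmax"),
])

# Convert to TensorFlow Lite, enabling default optimizations
# (post-training quantization) to reduce size for mobile use.
converter = tf.lite.TFLiteConverter.from_keras_model(model)
converter.optimizations = [tf.lite.Optimize.DEFAULT]
tflite_model = converter.convert()

# Write the flatbuffer to disk, ready to bundle with an app.
with open("model.tflite", "wb") as f:
    f.write(tflite_model)
```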
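And here is a minimal sketch of testing a converted model with the standalone TensorFlow Lite Interpreter, as recapped at the start of the Android week. It assumes the model.tflite file from the previous sketch; the random Fashion-MNIST-shaped input is a stand-in for real data.

```python
import numpy as np
import tensorflow as tf

# Load the converted flatbuffer with the standalone interpreter.
interpreter = tf.lite.Interpreter(model_path="model.tflite")
interpreter.allocate_tensors()

input_details = interpreter.get_input_details()
output_details = interpreter.get_output_details()

# Feed a dummy image shaped like a Fashion MNIST sample (28x28 grayscale).
image = np.random.rand(1, 28, 28).astype(np.float32)
interpreter.set_tensor(input_details[0]["index"], image)
interpreter.invoke()

# Read back the class scores and pick the most likely label.
scores = interpreter.get_tensor(output_details[0]["index"])
print("Predicted class:", int(np.argmax(scores)))
```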
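For the Raspberry Pi week, a sketch of the inference-only route recommended above: the lightweight tflite_runtime package ships just the interpreter, so you can skip installing full TensorFlow on the Pi. The model path and placeholder input are assumptions for illustration, and the sketch assumes a float32 model.

```python
# On a Raspberry Pi, the tflite_runtime package provides only the
# interpreter -- enough for inference, without the full TensorFlow install.
import numpy as np
from tflite_runtime.interpreter import Interpreter

# "model.tflite" is an assumed path to a previously converted model.
interpreter = Interpreter(model_path="model.tflite")
interpreter.allocate_tensors()

input_details = interpreter.get_input_details()
output_details = interpreter.get_output_details()

# Run a single inference on a placeholder input of the right shape.
input_shape = input_details[0]["shape"]
dummy = np.random.rand(*input_shape).astype(np.float32)
interpreter.set_tensor(input_details[0]["index"], dummy)
interpreter.invoke()
print(interpreter.get_tensor(output_details[0]["index"]))
```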
Taught by
Laurence Moroney