

Building and Enabling Voice Control with ARM Cortex-M

tinyML via YouTube

Overview

Explore a partner session from tinyML Asia 2021 focusing on building and enabling voice control with ARM Cortex-M. Discover Sensory's Edge-AI technology suite, with a special emphasis on their VoiceHub platform, which offers developers free tools to create custom wake words, phrase-spotted voice control, and large vocabulary grammars with NLU. Learn how embedded voice models are built, exported, and demonstrated on ARM Cortex-M platforms. Gain insights into Sensory's support for Voice AI, the advantages of tinyML on ultra-low-cost, ultra-low-power platforms, and various aspects of voice technology including language support, operation points, and voice user interfaces. Delve into topics such as TrulyHandsFree, Sensory Voice models, and command set projects throughout this 32-minute presentation.
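The session demonstrates this workflow through VoiceHub's web interface rather than source code, but the on-device side of a phrase-spotting application typically follows a simple pattern: audio frames are streamed from the microphone into a compact recognizer, and the application reacts when a wake word or command phrase is spotted. The C sketch below is only a generic illustration of that loop; the model handle and the ww_init, ww_process_frame, board_audio_read, and app_handle_command functions are hypothetical placeholders (stubbed so the sketch compiles) and do not represent Sensory's SDK or any particular vendor's API.

/*
 * Illustrative wake-word / command-spotting loop in the style of a Cortex-M
 * application. All identifiers are hypothetical placeholders, not Sensory's
 * actual SDK; the stubs exist only so the sketch compiles and runs.
 */
#include <stdint.h>
#include <stdbool.h>
#include <stdio.h>
#include <string.h>

#define FRAME_SAMPLES 160u              /* 10 ms of 16 kHz mono PCM */

typedef struct {
    int dummy;                          /* a real handle would hold model state */
} ww_model_t;

/* --- Placeholder "SDK" and board functions (stubs for illustration) --- */

static ww_model_t *ww_init(const uint8_t *model_data, uint32_t model_size)
{
    (void)model_data; (void)model_size;
    static ww_model_t m;
    return &m;
}

/* Returns the ID of a spotted phrase, or -1 when nothing was detected. */
static int ww_process_frame(ww_model_t *m, const int16_t *pcm, uint32_t n)
{
    (void)m; (void)pcm; (void)n;
    return -1;                          /* stub: never detects */
}

static bool board_audio_read(int16_t *pcm, uint32_t n)
{
    memset(pcm, 0, n * sizeof(int16_t)); /* stub: silence instead of a microphone */
    return true;
}

static void app_handle_command(int command_id)
{
    printf("command %d spotted\n", command_id);
}

/* --- The runtime pattern: stream frames, act on detections --- */

int main(void)
{
    static const uint8_t model_blob[1] = {0};   /* stand-in for a model blob in flash */
    static int16_t frame[FRAME_SAMPLES];

    ww_model_t *model = ww_init(model_blob, sizeof model_blob);

    for (int i = 0; i < 100; ++i) {     /* a real device would loop forever */
        if (!board_audio_read(frame, FRAME_SAMPLES))
            continue;
        int id = ww_process_frame(model, frame, FRAME_SAMPLES);
        if (id >= 0)
            app_handle_command(id);     /* e.g. toggle an LED or start an NLU session */
    }
    return 0;
}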

Syllabus

Introduction
Sensory
Agenda
Who is Sensory
How Sensory is supporting Voice AI
Why tinyML and Sensory can run on diverse ultra-low-cost, ultra-low-power platforms
TrulyHandsFree
Sensory
Shorthand
Natural
Sensory Voice
Language Support
Initial Size
Output Format
Operation Point
Text Input
Language Model
Download Language Model
Command Set Project
Voice User Interface
Questions
Handspring
Sponsors

Taught by

tinyML

