Model compression applied to small-footprint keyword spotting

G Tucker, M Wu, M Sun, S Panchapagesan, G Fu… - 2016 - amazon.science
Abstract
Several consumer speech devices feature voice interfaces that perform on-device keyword spotting to initiate user interactions. Accurate on-device keyword spotting within a tight CPU budget is crucial for such devices. Motivated by this, we investigated two ways to improve deep neural network (DNN) acoustic models for keyword spotting without increasing CPU usage. First, we used low-rank weight matrices throughout the DNN. This allowed us to increase representational power by increasing the number of hidden nodes per layer without changing the total number of multiplications. Second, we used knowledge distilled from an ensemble of much larger DNNs used only during training. We systematically evaluated these two approaches on a massive corpus of far-field utterances. Alone, each technique improves performance, and together they combine to give significant reductions in false alarms and misses without increasing CPU or memory usage.
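To make the two techniques concrete, the following is a minimal PyTorch sketch of a low-rank (bottleneck) layer and a distillation loss against soft targets from a larger teacher network. The layer sizes, rank, number of output targets, and temperature are illustrative assumptions, not values from the paper.

```python
# Sketch of the two ideas in the abstract, under assumed hyperparameters:
# (1) low-rank (bottleneck) weight matrices, which keep the multiply count
#     bounded while allowing wider hidden layers, and
# (2) knowledge distillation from a larger teacher used only during training.
import torch
import torch.nn as nn
import torch.nn.functional as F


class LowRankLinear(nn.Module):
    """Factor an (in_dim x out_dim) weight into (in_dim x rank)(rank x out_dim)."""

    def __init__(self, in_dim, out_dim, rank):
        super().__init__()
        self.down = nn.Linear(in_dim, rank, bias=False)  # in_dim * rank multiplies
        self.up = nn.Linear(rank, out_dim)               # rank * out_dim multiplies

    def forward(self, x):
        return self.up(self.down(x))


# Student keyword-spotting DNN with low-rank layers (sizes are assumptions).
student = nn.Sequential(
    LowRankLinear(400, 512, rank=64), nn.ReLU(),
    LowRankLinear(512, 512, rank=64), nn.ReLU(),
    nn.Linear(512, 3),  # e.g. keyword / filler / silence posteriors
)


def distillation_loss(student_logits, teacher_logits, temperature=2.0):
    """KL divergence between temperature-softened teacher and student posteriors."""
    t = temperature
    soft_targets = F.softmax(teacher_logits / t, dim=-1)
    log_student = F.log_softmax(student_logits / t, dim=-1)
    return F.kl_div(log_student, soft_targets, reduction="batchmean") * (t * t)
```

The factored layer costs in_dim * rank + rank * out_dim multiplications instead of in_dim * out_dim (for example, 400x64 + 64x512 = 58,368 versus 400x512 = 204,800), which is why the hidden layers can be made wider without increasing the total multiply count, as the abstract describes.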