mTransSee
Enabling Environment-Independent mmWave Sensing Based Gesture Recognition via Transfer Learning
Written by: Liu, Haipeng; Cui, Kening; Hu, Kaiyuan; Wang, Yuheng; Zhou, Anfu; Liu, Liang; Ma, Huadong
Abstract: Gesture recognition using millimeter-wave radios facilitates natural human-computer interaction, but existing works require a consistent environment, i.e., the recognition neural networks are trained and tested on the same users at fixed positions. Their performance degrades rapidly once they enter a new environment. To make the model applicable across environments, a straightforward approach is to collect gesture samples and re-train the model at every possible position for each new user. However, this demands an unacceptable amount of user time for adaptation, which makes it difficult to deploy widely in practice. In this paper, we first collect a large mmWave gesture dataset containing 59,280 samples as a benchmark to quantitatively investigate the impact of environment changes. We then propose a novel transfer-learning approach called mTransSee, which recognizes gestures in practice using pre-learned experience with minimal adaptation, i.e., retraining with only 8 samples per gesture to reach the same accuracy. mTransSee reduces the adaptation workload by dozens of times. We implement mTransSee on a commodity mmWave sensor and conduct a user study comparing mTransSee against the state-of-the-art solution in terms of user experience during adaptation.
Language: English
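The few-shot adaptation idea summarized in the abstract can be sketched as follows. This is a hypothetical illustration, not the paper's actual architecture: a pre-trained feature extractor is kept frozen (standing in for the "pre-learned experience"), and only a small classifier head is retrained on K samples per gesture (K = 8 in the paper). The extractor, the toy gesture data, and all dimensions below are invented for the example.

```python
# Hypothetical sketch of few-shot environment adaptation: freeze the
# pre-trained layers and retrain only a linear softmax head on K samples
# per gesture. Names, data, and dimensions are illustrative assumptions.
import math
import random

random.seed(0)

NUM_GESTURES = 3   # toy label set; the real system supports more gestures
K_SHOTS = 8        # samples per gesture used for adaptation (as in paper)
FEAT_DIM = 4

def frozen_extractor(raw):
    # Stand-in for the pre-trained network's frozen layers: maps a raw
    # sample (a list of floats) to a fixed-length feature vector.
    s = 0.5 * sum(raw)
    return [math.tanh(s - i) for i in range(FEAT_DIM)]

def make_samples(label, n):
    # Toy samples: each gesture class clusters around a different offset.
    return [([label + random.gauss(0, 0.1) for _ in range(2)], label)
            for _ in range(n)]

# Linear softmax head, trained from scratch on the few adaptation samples.
W = [[0.0] * FEAT_DIM for _ in range(NUM_GESTURES)]
b = [0.0] * NUM_GESTURES

def logits(feat):
    return [sum(w * f for w, f in zip(W[c], feat)) + b[c]
            for c in range(NUM_GESTURES)]

def softmax(z):
    m = max(z)
    exps = [math.exp(v - m) for v in z]
    s = sum(exps)
    return [e / s for e in exps]

# Adaptation set: K_SHOTS samples per gesture from the "new environment".
train = [s for g in range(NUM_GESTURES) for s in make_samples(g, K_SHOTS)]

lr = 0.5
for _ in range(200):                       # SGD on the head only
    for raw, label in train:
        feat = frozen_extractor(raw)
        probs = softmax(logits(feat))
        for c in range(NUM_GESTURES):      # cross-entropy gradient step
            err = probs[c] - (1.0 if c == label else 0.0)
            for j in range(FEAT_DIM):
                W[c][j] -= lr * err * feat[j]
            b[c] -= lr * err

def classify(raw):
    z = logits(frozen_extractor(raw))
    return max(range(NUM_GESTURES), key=lambda c: z[c])

# Evaluate on fresh samples drawn from the same toy distribution.
test = [s for g in range(NUM_GESTURES) for s in make_samples(g, 20)]
accuracy = sum(classify(raw) == lbl for raw, lbl in test) / len(test)
```

Freezing the extractor is what keeps the adaptation cost low: only `NUM_GESTURES * (FEAT_DIM + 1)` head parameters are updated, so a handful of samples per gesture suffices, mirroring the "least adaptation" goal the abstract describes.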