Author(s): |
Yi Chen, School of Mechatronics Engineering, University of Electronic Science and Technology of China, Chengdu, 611731, China; Ping Yang, School of Mechatronics Engineering, University of Electronic Science and Technology of China, Chengdu, 611731, China; Xuguang Chen, School of Mechatronics Engineering, University of Electronic Science and Technology of China, Chengdu, 611731, China |
Abstract: |
With the development of smartphones, novel user interfaces such as gesture input have drawn much attention from researchers. Meanwhile, most smartphones are equipped with a MEMS IMU (accelerometer and gyroscope), which enables gesture input. However, many recent approaches lack efficiency and robustness because of their modeling methods and their reliance on the accelerometer alone. This paper proposes a simple but practical IMU-based approach: first, in the gesture-definition phase, ten gestures are grouped into four categories according to their linguistic and operational similarity. In the recognition stage, a captured gesture is classified by a three-stage classifier using kinematic features extracted from the sensor data, and is then recognized according to its acceleration-change patterns. Meanwhile, strict feature-threshold restrictions on gestures significantly reduce false triggers from unconscious movements. An experiment with 16 volunteers achieved an average accuracy of 95.4% and a recognition time within 0.01 s, which validates the feasibility of the proposed method in terms of accuracy and efficiency.
|