An algorithmic framework is proposed to process acceleration and surface electromyographic (SEMG) signals for gesture recognition. It includes a novel segmentation scheme, a score-based sensor fusion scheme, and two new features. A Bayes linear classifier and an improved dynamic time-warping algorithm are utilized in the framework. In addition, a prototype system is developed to realize real-time gesture-based interaction; it comprises a wearable gesture sensing device (embedded with a three-axis accelerometer and four SEMG sensors) and a mobile phone application implementing the proposed algorithmic framework. With the device worn on the forearm, the user is able to manipulate a mobile phone using 19 predefined gestures or even personalized ones. Results suggest that the developed prototype responded to each gesture instruction within 300 ms on the mobile phone, with an average accuracy of 95.0% in user-dependent testing and 89.6% in user-independent testing. This performance during interaction testing, along with positive user experience questionnaire feedback, demonstrates the utility of the framework.
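To make the template-matching step concrete, the sketch below shows a standard dynamic time-warping (DTW) distance between a stored gesture template and a new acceleration trace. This is a minimal, generic illustration only: the framework uses an improved DTW variant and a score-based fusion with SEMG features whose details are not given in the abstract, and the sequence values here are hypothetical.

```python
import numpy as np

def dtw_distance(seq_a, seq_b):
    """Classic DTW distance between two 1-D signal sequences.

    Illustrative sketch only; the paper's improved DTW variant is not
    specified in the abstract.
    """
    n, m = len(seq_a), len(seq_b)
    # Accumulated-cost matrix with an infinite boundary row/column.
    cost = np.full((n + 1, m + 1), np.inf)
    cost[0, 0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            d = abs(seq_a[i - 1] - seq_b[j - 1])  # local distance
            # Cheapest of the three allowed warping steps.
            cost[i, j] = d + min(cost[i - 1, j],      # insertion
                                 cost[i, j - 1],      # deletion
                                 cost[i - 1, j - 1])  # match
    return cost[n, m]

# Hypothetical example: compare a gesture template against a query trace.
template = np.array([0.0, 0.2, 0.9, 1.0, 0.4, 0.1])
query = np.array([0.0, 0.1, 0.3, 0.8, 1.0, 0.9, 0.3, 0.0])
print(dtw_distance(template, query))
```

In a score-based fusion scheme of this kind, such a distance would typically be converted to a matching score per gesture class and combined with the SEMG classifier's scores before the final decision; the exact fusion rule used in the framework is described in the paper itself.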