Fusion and Comparison of IMU and EMG Signals for Wearable Gesture Recognition

  • Marcus Georgi
  • Christoph Amma
  • Tanja Schultz
Conference paper
Part of the Communications in Computer and Information Science book series (CCIS, volume 574)


We evaluate the performance of a wearable gesture recognition system for arm, hand, and finger motions, using the signals of an Inertial Measurement Unit (IMU) worn at the wrist and the Electromyogram (EMG) of muscles in the forearm. A set of 12 gestures was defined, similar to manipulatory movements and to gestures familiar from interaction with mobile devices. We recorded performances of our gesture set by five subjects in multiple sessions. The resulting data corpus is made publicly available to build a common ground for future evaluations and benchmarks. Hidden Markov Models (HMMs) are used as classifiers to discriminate between the defined gesture classes. We achieve a recognition rate of 97.8% in session-independent and 74.3% in person-independent recognition. We give a detailed analysis of the error characteristics and of each modality's influence on the results, underlining the benefits of using both modalities together.
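The classification approach described above can be sketched as follows: per-frame IMU and EMG features are concatenated into a single fused observation vector (feature-level fusion), and each gesture class is modeled by a Gaussian HMM; an unknown sequence is assigned to the class whose HMM yields the highest log-likelihood via the forward algorithm. This is a minimal NumPy sketch under assumed settings, not the paper's exact configuration: the gesture names (`swipe`, `grasp`), feature dimensions (6 IMU + 4 EMG), state count, and transition structure are all illustrative.

```python
import numpy as np

def logsumexp(a, axis=0):
    """Numerically stable log(sum(exp(a))) along an axis."""
    m = np.max(a, axis=axis, keepdims=True)
    return (m + np.log(np.sum(np.exp(a - m), axis=axis, keepdims=True))).squeeze(axis)

def log_gauss(x, mu, var):
    """Per-frame log density of x (T, D) under a diagonal-covariance Gaussian."""
    return -0.5 * np.sum(np.log(2 * np.pi * var) + (x - mu) ** 2 / var, axis=1)

def sequence_loglik(obs, log_pi, log_A, means, variances):
    """Total log-likelihood of obs (T, D) under a Gaussian HMM,
    computed with the forward algorithm in log space."""
    S = log_pi.shape[0]
    # emission log-probabilities per frame and state, shape (T, S)
    log_b = np.stack([log_gauss(obs, means[s], variances[s]) for s in range(S)], axis=1)
    alpha = log_pi + log_b[0]
    for t in range(1, obs.shape[0]):
        # sum over previous states (rows of log_A index "from", columns "to")
        alpha = log_b[t] + logsumexp(alpha[:, None] + log_A, axis=0)
    return logsumexp(alpha)

def classify(obs, models):
    """Pick the gesture whose HMM assigns the fused sequence the highest likelihood."""
    scores = {name: sequence_loglik(obs, *m) for name, m in models.items()}
    return max(scores, key=scores.get)

# Two illustrative 2-state left-to-right HMMs over a 10-dim fused feature space.
D, S = 10, 2
log_pi = np.log(np.array([1.0, 1e-10]))               # start in the first state
log_A = np.log(np.array([[0.9, 0.1], [1e-10, 1.0]]))  # left-to-right transitions

def make_model(offset):
    means = np.full((S, D), offset) + np.arange(S)[:, None] * 0.5
    variances = np.ones((S, D))
    return log_pi, log_A, means, variances

models = {"swipe": make_model(0.0), "grasp": make_model(2.0)}

# Simulate a fused IMU+EMG observation sequence that resembles the "grasp" model.
rng = np.random.default_rng(0)
imu = rng.normal(2.0, 1.0, size=(30, 6))   # e.g. 3-axis accelerometer + 3-axis gyroscope features
emg = rng.normal(2.0, 1.0, size=(30, 4))   # e.g. 4 EMG channel amplitude features
fused = np.hstack([imu, emg])              # feature-level (early) fusion
print(classify(fused, models))             # prints "grasp"
```

In practice the per-class HMM parameters would be estimated from the recorded training sessions (e.g. via Baum-Welch) rather than set by hand, and fusion could alternatively be performed at the decision level by combining per-modality classifier scores.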


Keywords: Wearable computing · Gesture recognition · Inertial Measurement Unit · Electromyography



This research was partly funded by Google through a Google Faculty Research Award.



Copyright information

© Springer International Publishing Switzerland 2015

Authors and Affiliations

  1. Cognitive Systems Lab, Institute for Anthropomatics and Robotics, Karlsruhe Institute of Technology, Karlsruhe, Germany