Emotion Detection Based on Smartphone Using User Memory Tasks and Videos

  • Nicolas Simonazzi
  • Jean-Marc Salotti
  • Caroline Dubois
  • Dominique Seminel
Conference paper
Part of the Advances in Intelligent Systems and Computing book series (AISC, volume 1253)


In this paper, we present a study on the classification of emotions from data gathered on a smartphone. To this end, we developed a mobile application that elicits emotions in participants through memory tasks with success–failure manipulation and through video clips. Interactions were recorded using the device's accelerometer and gyroscope sensors along with keystroke data. From these records, we trained supervised classification models to predict the nature of the elicited emotion along two dimensions (pleasure and activation) and the success or failure of the memory tasks. To evaluate the emotion induction, we proposed a self-assessment procedure. Using a protocol based on natural smartphone interactions, we achieved promising results on the pleasure dimension.
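The supervised pipeline the abstract describes can be sketched as follows. This is a minimal illustration, not the authors' implementation: the feature names (accelerometer/gyroscope statistics, inter-keystroke interval), the synthetic data, and the random-forest model are all assumptions introduced for the example.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)

# Illustrative per-interaction features (names are assumptions, not the
# paper's actual feature set): motion statistics plus keystroke timing.
n = 200
X = np.column_stack([
    rng.normal(0.0, 1.0, n),   # mean accelerometer magnitude
    rng.normal(0.0, 1.0, n),   # accelerometer variance
    rng.normal(0.0, 1.0, n),   # mean gyroscope magnitude
    rng.normal(300, 80, n),    # mean inter-keystroke interval (ms)
])

# Binary label on the pleasure dimension (1 = pleasant, 0 = unpleasant),
# derived here from a synthetic rule so the example is self-contained.
y = (X[:, 3] + 30 * X[:, 1] < 300).astype(int)

X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.25, random_state=0)

# Any off-the-shelf supervised classifier would fit this sketch; a random
# forest is used here only as a common baseline choice.
clf = RandomForestClassifier(n_estimators=100, random_state=0)
clf.fit(X_train, y_train)
accuracy = clf.score(X_test, y_test)
```

In practice the labels would come from the participants' self-assessments rather than a synthetic rule, and one model would be trained per target (pleasure, activation, task success/failure).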


Emotion recognition · Affective computing · Keystroke · Accelerometer · Gyroscope · Success–failure manipulation · Video emotion elicitation · Human–computer interaction



Copyright information

© The Editor(s) (if applicable) and The Author(s), under exclusive license to Springer Nature Switzerland AG 2021

Authors and Affiliations

  • Nicolas Simonazzi (1)
  • Jean-Marc Salotti (1)
  • Caroline Dubois (2)
  • Dominique Seminel (2)
  1. IMS, CNRS, Bordeaux INP, Bordeaux University, Talence, France
  2. Orange Labs Service, Pessac, France
