Certificate of Registration Media number Эл #ФС77-53688 of 17 April 2013. ISSN 2308-6033. DOI 10.18698/2308-6033

Emotion recognition system based on facial motor unit analysis

Published: 24.08.2016

Authors: Bobe A.S., Konyshev D.V., Vorotnikov S.A.

Published in issue: #9(57)/2016

DOI: 10.18698/2308-6033-2016-9-1530

Category: Mechanical Engineering and Machine Science | Chapter: Robots, Mechatronics, and Robotic Systems

The article describes an implementation of a human emotion recognition system intended to support verbal communication with anthropomorphic service robots, and reviews existing approaches to emotion recognition. We investigated a new algorithm that estimates P. Ekman's basic emotions from 20 informative facial image features; the intensity of each emotion is computed by three independent classifiers. The algorithm is implemented in the Qt environment and was tested on two image databases as well as in real time, showing an average recognition rate of about 85%. The system can be used in neurocomputer-interface applications in robotics and in psychological diagnosis systems.
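The pipeline sketched in the abstract (a 20-element facial-feature vector scored by three independent classifiers per basic emotion, with the most intense emotion chosen) can be illustrated as follows. This is only a structural sketch, not the authors' implementation: the linear scorers, their random weights, and the averaging rule are all assumptions made for illustration.

```python
# Illustrative sketch only (not the authors' code): each of P. Ekman's basic
# emotions gets three independent linear scorers over a 20-dimensional facial
# feature vector; their mean is taken as the emotion intensity, and the most
# intense emotion is the recognized one. All weights here are hypothetical.
import random

EMOTIONS = ["happiness", "sadness", "anger", "fear", "disgust", "surprise"]
N_FEATURES = 20      # 20 informative facial image features, as in the abstract
N_CLASSIFIERS = 3    # three independent classifiers per emotion

rng = random.Random(0)

def make_scorer():
    """Build one linear scorer with hypothetical (random) weights and bias."""
    weights = [rng.uniform(-1.0, 1.0) for _ in range(N_FEATURES)]
    bias = rng.uniform(-0.5, 0.5)
    return lambda feats: bias + sum(w * f for w, f in zip(weights, feats))

# The ensemble: three independent scorers per emotion.
ensemble = {e: [make_scorer() for _ in range(N_CLASSIFIERS)] for e in EMOTIONS}

def recognize(features):
    """Return (most intense emotion, dict of per-emotion intensities)."""
    intensities = {
        emotion: sum(score(features) for score in scorers) / len(scorers)
        for emotion, scorers in ensemble.items()
    }
    return max(intensities, key=intensities.get), intensities
```

In a real system the scorers would be trained on annotated facial-feature data (e.g. the Cohn-Kanade databases mentioned by the article) rather than initialized at random.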
