Robust Approach for Emotion Classification Using Gait
Abstract The ability to gauge a person's emotion from their gait is a key component in enabling machines to understand people in images, videos and in real time. Machine-based emotion recognition has been attempted through various methods: facial expressions require an unobstructed view of the face, physiological parameters require constant and prolonged monitoring of the subject, and sentiment analysis demands text written by the individual. In real-time situations, however, these features are not readily available, which makes it difficult to classify emotion. This paper therefore proposes a technique for classifying emotions without compromising detection accuracy, while utilising the most trivial of information: walking. A University of York dataset of videos segregated by emotion was chosen, on which pose estimation was performed and important features were extracted. These features were processed by multiple machine learning classification models (Decision Tree, Naive Bayes, SVM, K-Nearest Neighbours and Artificial Neural Network) to classify gait into five emotions. This approach was observed to demonstrate higher accuracy than most approaches proposed in the past. Keywords Gait analysis · Emotional classification · Computer vision · Machine learning · Deep learning
S. Srivastava · V. Rastogi (B) · C. Prakash · D. Sethi
Indira Gandhi Delhi Technical University for Women, Delhi, India
e-mail: [email protected]
S. Srivastava e-mail: [email protected]
C. Prakash e-mail: [email protected]
D. Sethi e-mail: [email protected]
C. Prakash
National Institute of Technology Delhi, New Delhi, India
© Springer Nature Singapore Pte Ltd. 2021
D. Gupta et al. (eds.), International Conference on Innovative Computing and Communications, Advances in Intelligent Systems and Computing 1165, https://doi.org/10.1007/978-981-15-5113-0_74
1 Introduction
Machines interact with humans in many forms beyond the conventional mouse and screen. The initiation of tasks performed by a machine has leapt from the click of a mouse to gesture recognition and motion sensing. This emergence of technology highlights the lack of one key element in human-computer interaction: emotions. Emotions describe many aspects of human behaviour, such as intent and actions. Emotions are typically gauged through facial cues and physiological parameters, both of which are demanding: it is difficult to obtain clear frontal faces or a delicate, noiseless environment for the above-mentioned methods [1]. In contrast, human gait patterns can be computed with relative ease, as less machinery is required. Recognising the distinguishing patterns to classify emotions can help resolve significant problems in many domains such as medical diagnosis, biometrics, athletics and the industrial sector [2]. Robots and AI devices that understand human speech via natural language processing interact better with humans when emotion recognition is associated with the speech [
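The pipeline outlined in the abstract, gait features extracted from pose estimation fed into several classifiers, can be sketched as below. This is a minimal illustration, not the authors' implementation: the feature vectors here are synthetic stand-ins (the paper's actual pose-derived features and the University of York videos are not reproduced), the emotion labels are assumed, and scikit-learn is used purely as a convenient stand-in for the five named model families.

```python
# Hypothetical sketch of the classification stage: five classifier families
# (as named in the abstract) trained on stand-in "gait feature" vectors.
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.tree import DecisionTreeClassifier
from sklearn.naive_bayes import GaussianNB
from sklearn.svm import SVC
from sklearn.neighbors import KNeighborsClassifier
from sklearn.neural_network import MLPClassifier

rng = np.random.default_rng(0)
EMOTIONS = ["angry", "happy", "neutral", "sad", "fearful"]  # assumed label set

# Synthetic stand-ins for pose-estimation features (e.g. stride length,
# cadence, joint angles); shape: (samples, features).
X = rng.normal(size=(500, 8))
y = rng.integers(0, len(EMOTIONS), size=500)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

models = {
    "Decision Tree": DecisionTreeClassifier(random_state=0),
    "Naive Bayes": GaussianNB(),
    "SVM": SVC(),
    "K-Nearest Neighbours": KNeighborsClassifier(n_neighbors=5),
    "Artificial Neural Network": MLPClassifier(max_iter=500, random_state=0),
}
for name, model in models.items():
    model.fit(X_train, y_train)
    print(f"{name}: test accuracy {model.score(X_test, y_test):.2f}")
```

With real gait features in place of the random vectors, the same loop compares the five model families on a held-out split, which mirrors the comparison the abstract describes.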