
      Emotion recognition based on customized smart bracelet with built-in accelerometer

research-article

          Abstract

Background: Recently, emotion recognition has become a hot topic in human-computer interaction. If computers could understand human emotions, they could interact better with their users. This paper proposes a novel method to recognize human emotions (neutral, happy, and angry) using a smart bracelet with a built-in accelerometer.

Methods: In this study, 123 participants wore a customized smart bracelet with a built-in accelerometer that tracked and recorded their movements. First, participants walked for two minutes at their normal pace, which served as walking behavior in the neutral emotion condition. Participants then watched emotional film clips to elicit happy and angry emotions; the interval between the two clips was more than four hours. After each film clip, participants walked for one minute, which served as walking behavior in the happy or angry condition. We collected raw data from the bracelet, extracted features from the raw data, and built classification models to distinguish the three emotions (neutral, happy, and angry).
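The pipeline described in the Methods, windowed accelerometer data reduced to hand-crafted features and fed to a classifier over three emotion labels, can be sketched roughly as follows. The specific features (per-axis statistics), the random-forest classifier, and the simulated data are assumptions for illustration, not the authors' actual implementation.

```python
# Illustrative sketch of an accelerometer-based emotion classification
# pipeline: per-axis statistical features + a classifier. All choices
# here are assumptions; the paper does not specify this exact design.
import numpy as np
from sklearn.ensemble import RandomForestClassifier

def extract_features(window):
    """window: (n_samples, 3) array of x/y/z acceleration."""
    feats = []
    for axis in range(window.shape[1]):
        a = window[:, axis]
        feats += [a.mean(), a.std(), a.min(), a.max(),
                  np.abs(np.diff(a)).mean()]  # mean absolute change
    return np.array(feats)

rng = np.random.default_rng(0)
# Simulated walking data: 100 windows of 128 samples x 3 axes
windows = rng.normal(size=(100, 128, 3))
labels = rng.integers(0, 3, size=100)  # 0=neutral, 1=happy, 2=angry

X = np.array([extract_features(w) for w in windows])
clf = RandomForestClassifier(n_estimators=100, random_state=0).fit(X, labels)
print(X.shape)  # (100, 15)
```

With 5 statistics per axis and 3 axes, each one-minute walking window collapses to a 15-dimensional feature vector, which is the kind of compact representation the two- and three-class models in the Results would be trained on.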

Results and Discussion: For two-category classification, accuracy reached 91.3% (neutral vs. angry), 88.5% (neutral vs. happy), and 88.5% (happy vs. angry). For differentiating among all three emotions (neutral, happy, and angry), accuracy reached 81.2%.

Conclusions: Using wearable devices, we found that it is possible to recognize human emotions (neutral, happy, and angry) with fair accuracy. These results may help improve the performance of human-computer interaction.

Most cited references (24)


Embodiment of sadness and depression: gait patterns associated with dysphoric mood.

          To analyze gait patterns associated with sadness and depression. Embodiment theories suggest a reciprocal relationship between bodily expression and the way in which emotions are processed. In Study 1, the gait patterns of 14 inpatients suffering from major depression were compared with those of matched never-depressed participants. In Study 2, we employed musical mood induction to induce sad and positive mood in a sample of 23 undergraduates. A Fourier-based description of walking data served as the basis for the computation of linear classifiers and for the analysis of gait parameters. Gait patterns associated with sadness and depression are characterized by reduced walking speed, arm swing, and vertical head movements. Moreover, depressed and sad walkers displayed larger lateral swaying movements of the upper body and a more slumped posture. The results of the present study indicate that a specific gait pattern characterizes individuals in dysphoric mood.
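The Fourier-based description of walking data that this reference uses as the basis for linear classifiers can be illustrated roughly as below. The signal model (a sinusoidal vertical head movement with reduced amplitude in sad walkers), the number of Fourier coefficients, and the logistic-regression classifier are all assumptions for the sketch; the data are simulated.

```python
# Rough sketch: Fourier amplitudes of a gait signal as features for a
# linear classifier. Signal model and parameters are invented here.
import numpy as np
from sklearn.linear_model import LogisticRegression

def fourier_features(signal, n_coeffs=10):
    """Amplitudes of the first n_coeffs Fourier components."""
    spectrum = np.fft.rfft(signal - signal.mean())
    return np.abs(spectrum[1:n_coeffs + 1])

rng = np.random.default_rng(1)
t = np.linspace(0, 4, 256)  # 4 s of walking, 64 Hz sampling
# Simulated vertical head movement at ~2 Hz step frequency:
# sad walkers move with smaller amplitude, as the abstract reports.
sad = [0.5 * np.sin(2 * np.pi * 2 * t) + rng.normal(0, 0.1, t.size)
       for _ in range(30)]
neutral = [1.0 * np.sin(2 * np.pi * 2 * t) + rng.normal(0, 0.1, t.size)
           for _ in range(30)]

X = np.array([fourier_features(s) for s in sad + neutral])
y = np.array([1] * 30 + [0] * 30)  # 1 = dysphoric mood
clf = LogisticRegression(max_iter=1000).fit(X, y)
print(clf.score(X, y))
```

Because the amplitude difference shows up directly in one low-frequency Fourier bin, a linear classifier separates the two simulated groups easily, which mirrors the reference's finding that reduced movement amplitude characterizes dysphoric gait.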

            CASME II: An Improved Spontaneous Micro-Expression Database and the Baseline Evaluation

A robust automatic micro-expression recognition system would have broad applications in national safety, police interrogation, and clinical diagnosis. Developing such a system requires high-quality databases with sufficient training samples, which are currently not available. We reviewed the previously developed micro-expression databases and built an improved one (CASME II), with higher temporal resolution (200 fps) and spatial resolution (about 280×340 pixels on the facial area). We elicited participants' facial expressions in a well-controlled laboratory environment with proper illumination (e.g., no light flickering). Among nearly 3000 facial movements, 247 micro-expressions were selected for the database, with action units (AUs) and emotions labeled. For baseline evaluation, LBP-TOP and SVM were employed for feature extraction and classification, respectively, with the leave-one-subject-out cross-validation method. The best performance is 63.41% for 5-class classification.
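The baseline protocol this reference names, an SVM scored with leave-one-subject-out cross-validation, looks roughly like this in scikit-learn. The placeholder features stand in for LBP-TOP descriptors, and the sample counts are invented for illustration.

```python
# Minimal sketch of leave-one-subject-out cross-validation with an SVM.
# Random vectors stand in for LBP-TOP features; sizes are invented.
import numpy as np
from sklearn.model_selection import LeaveOneGroupOut, cross_val_score
from sklearn.svm import SVC

rng = np.random.default_rng(2)
X = rng.normal(size=(60, 16))            # placeholder feature vectors
y = rng.integers(0, 5, size=60)          # 5 emotion classes
subjects = np.repeat(np.arange(10), 6)   # 10 subjects, 6 samples each

# Each fold holds out all samples of one subject, so the classifier is
# never tested on a person it was trained on.
scores = cross_val_score(SVC(kernel="linear"), X, y,
                         cv=LeaveOneGroupOut(), groups=subjects)
print(len(scores))  # 10
```

Grouping the folds by subject matters for both this micro-expression baseline and the bracelet study: per-sample splits would leak subject-specific signal into the test set and inflate accuracy.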

              What is an emotion?


                Author and article information

Journal: PeerJ (PeerJ Inc., San Francisco, USA)
ISSN: 2167-8359
Published: 26 July 2016
Volume 4: e2258
Affiliations
[1] School of Computer and Control Engineering, University of Chinese Academy of Sciences, Beijing, China
[2] Institute of Psychology, Chinese Academy of Sciences, Beijing, China
DOI: 10.7717/peerj.2258
PMCID: 4974923
PMID: 27547564
                © 2016 Zhang et al.

                This is an open access article distributed under the terms of the Creative Commons Attribution License, which permits unrestricted use, distribution, reproduction and adaptation in any medium and for any purpose provided that it is properly attributed. For attribution, the original author(s), title, publication source (PeerJ) and either DOI or URL of the article must be cited.

History
Received: 13 January 2016
Accepted: 24 June 2016
                Funding
                Funded by: National High-tech R&D Program of China
                Award ID: 2013AA01A606
                Funded by: National Basic Research Program of China
                Award ID: 2014CB744600
                Funded by: Key Research Program of Chinese Academy of Sciences (CAS)
                Award ID: KJZD-EWL04
                Funded by: CAS Strategic Priority Research Program
                Award ID: XDA06030800
The authors received support from the National High-tech R&D Program of China (2013AA01A606), the National Basic Research Program of China (2014CB744600), the Key Research Program of the Chinese Academy of Sciences (CAS) (KJZD-EWL04), and the CAS Strategic Priority Research Program (XDA06030800). The funders had no role in study design, data collection and analysis, decision to publish, or preparation of the manuscript.
                Categories
                Kinesiology
                Psychiatry and Psychology
                Computational Science

Keywords: emotion recognition, wearable smart device, smart bracelet, accelerometer
