Wearable-Based Affect Recognition—A Review
Abstract
1. Introduction
2. Interdisciplinary Background
2.1. Working Definitions of Affective Phenomena
2.2. Emotion Models
- Categorical models: Different emotions are represented as discrete categories.
- Dimensional models: Emotions are mapped into a multidimensional space, where each axis represents a continuous variable.
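The relation between the two model families can be illustrated with a minimal sketch (not taken from the review; the function name, quadrant labels, and thresholds are illustrative assumptions): a point in a two-dimensional valence-arousal space is mapped to a coarse discrete category.

```python
# Illustrative sketch: mapping dimensional (valence/arousal) coordinates
# to coarse categorical labels. Labels and thresholds are assumptions.

def quadrant_label(valence: float, arousal: float) -> str:
    """Map valence/arousal values in [-1, 1] to a coarse emotion category."""
    if valence >= 0 and arousal >= 0:
        return "happy/excited"    # high valence, high arousal
    if valence >= 0:
        return "calm/content"     # high valence, low arousal
    if arousal >= 0:
        return "angry/stressed"   # low valence, high arousal
    return "sad/bored"            # low valence, low arousal
```

Such a mapping is lossy by design: the dimensional representation preserves intensity information that the discrete labels discard.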
2.3. Stress Models
3. Physiological Changes and Objective Measures
3.1. Affective States and Their Physiological Indicators
- Physiological measures are indirect measures of an affective state.
- Emotions are subjective but physiological data are not.
- Although some physiological patterns are shared across subjects, individual responses to a stimulus can differ strongly.
- Multimodal affect detecting systems reach higher accuracies than unimodal systems [1].
- The physiological signal quality often suffers from noise, induced by motion artefacts and misplacement.
3.2. Frequently Employed Sensors
3.2.1. Cardiac Activity
3.2.2. Electrodermal Activity
3.2.3. EMG
3.2.4. Respiration
3.2.5. Skin Temperature
3.2.6. EEG and EOG
- Both require the placement of electrodes on the face/scalp. Hence, EEG and EOG are quite intrusive and impractical for everyday life.
- They strongly limit the participants' movement and are therefore hardly applicable in real-world scenarios.
- EOG and EEG are prone to noise generated by muscle activity.
3.2.7. Inertial Sensors
3.2.8. Context
4. Affect-Related User Studies
4.1. Affect-Related User Studies in Laboratory Settings
- C1, Social-evaluative stressors: A task creating a socially relevant situation for the subject, for example, performing a task in front of a panel that evaluates the subject.
- C2, Cognitive stressors: A task demanding significant mental engagement and attention, for example, performing a challenging arithmetic task under time pressure.
- C3, Physical stressors: A task creating a physically uncomfortable situation, for example, being exposed to extreme heat or cold.
4.2. Affect-Related User Studies in The Field
4.2.1. Guidelines for Ecological-Momentary-Assessment
- Sampling rate: When defining the number of scheduled EMAs over the observation period, the trade-off between sampling as frequently as possible and not overloading the subject needs to be balanced. A good compromise is to schedule an EMA every two hours [29] or approximately five times over the day [118].
- General scheduling: A good practice is to schedule EMAs randomly. This ensures that the subjects are unprepared. If the EMAs should be distributed approximately evenly over the observation period, the following approach can be used: Divide the observation period into N sections (where N is the total number of EMAs over the observation period), and randomly schedule one EMA within each section. This approach was applied, for example, by Muaremi et al. [74]. Considering user studies in the lab, EMAs are typically scheduled directly after each affective stimulus or condition [47].
- Manual trigger: As EMAs are commonly scheduled randomly during field studies, these questionnaires are independent of the participants’ affective states. Therefore, it is good practice to allow subjects to file an EMA (in addition to the generally scheduled ones) whenever they feel a change in their affective state. For example, Gjoreski et al. [13] enabled their study participants to log stressful events whenever they occurred.
- Number of items: In order to avoid overloading subjects, the time required to answer an EMA should be minimized. Therefore, EMAs should focus on the goal of the study and include a minimal number of items. A good compromise is to include at most ten items per scheduled EMA, as discussed by Muaremi et al. [74]. Considering lab studies, the length of an EMA is usually less critical: Here, EMAs can be used during the cool-down phase after an affective stimulus, which allows the completion of longer EMAs.
- Situation labels: It is important to generate labels on the spot and not in hindsight. This is due to memorization effects (e.g., halo effect), where the occurrence of a certain emotion can influence the perception of other affective states experienced during the observation period. Considering a field study, however, it is good practice to review the labels together with the study participant, for example, on a daily basis [87,95].
- Length of labels: For a (mentally) healthy subject, affective states are expected to be stable on short time scales. However, when labels are generated using EMAs, the question arises of how long these labels remain valid. Considering lab studies, the labels generated using a questionnaire usually refer to the preceding stimulus (e.g., TSST). Considering field studies, however, the validity of labels is not as trivial. Depending on the focus of the study, one has to decide on a label length. If the study addresses mood, longer label periods, for example, 2 h [29], can be taken into account. If the study targets shorter affective states (e.g., emotions or stress), shorter label periods are used. For example, in order to detect and classify stress, Gjoreski et al. [13] considered ten minutes before and after each provided label.
- Ensure engagement: Considering field studies, the subjects' motivation is key, and keeping the subjects motivated will ensure high-quality labels, regarding both frequency and completeness. One way to boost motivation is an appropriate (incremental) reward system [9,87]. Another way to increase subjects' motivation might be to make the EMA optically appealing, for example, by including graphical measures like the SAM or PAM.
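The sectioned random-scheduling approach described under "General scheduling" can be sketched as follows (a minimal illustration; the function and parameter names are our own and not taken from the cited studies):

```python
import random

def schedule_emas(start_h: float, end_h: float, n_emas: int, seed: int = 0):
    """Divide the observation period [start_h, end_h] (in hours) into
    n_emas equal sections and draw one random EMA time within each
    section, so prompts are unpredictable yet roughly evenly spread."""
    rng = random.Random(seed)
    section = (end_h - start_h) / n_emas
    return [start_h + i * section + rng.uniform(0, section)
            for i in range(n_emas)]
```

For example, `schedule_emas(8, 22, 5)` yields five prompt times between 08:00 and 22:00, one per 2.8-hour section, matching the roughly-five-per-day guideline above.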
4.3. Publicly Available Datasets
5. Data Processing and Classification
5.1. Preprocessing and Segmentation
- Electrocardiogram Preprocessing: In the raw ECG signal, the R-peaks need to be identified. For this purpose, the Pan–Tompkins algorithm can be applied [138]. Once the R-peaks have been detected, the next step is to determine the RR intervals and assess their validity. For example, Hovsepian et al. [95] present an algorithm to assess the validity of candidate RR intervals. Behar et al. [139] presented an approach to assess the ECG signal quality with regard to arrhythmia in the context of intensive care units. Similar approaches could also be utilized to assess the ECG quality during affect-related user studies.
- Photoplethysmogram Preprocessing: A detailed description of preprocessing methods applied to PPG data can be found in Elgendi [140] or Biswas et al. [141]. In order to remove motion artefacts, adaptive (filtering) approaches can be applied [142,143]. In more recent work, peak matching approaches in the spectral domain were employed to remove movement artefacts [144,145]. For the determination of RR intervals from identified peaks, similar algorithms as mentioned for ECG preprocessing can be applied. In addition, as shown by Li and Clifford [146], the quality of a PPG signal can be assessed using a combination of dynamic time warping and a multilayer perceptron.
- Electrodermal Activity Preprocessing: In order to remove artefacts from EDA data, different approaches have been presented. These can be grouped into filtering-based and machine learning-based approaches. Only changes in the low-frequency domain of the EDA signal are physiologically plausible. Hence, low-pass filtering with a cut-off of, for example, 5 Hz [147] can be applied to remove high-frequency noise. After the noise removal, Soleymani et al. [31], for example, detrended the EDA signal by subtracting a moving average computed on a smoothed version of the signal. Machine learning-based approaches, using support vector machines or convex optimization, to identify and remove artefacts in EDA data can be found in Taylor et al. [148] and Greco et al. [149]. As detailed in Section 3, the EDA signal consists of two components: a slowly varying baseline conductivity, referred to as skin conductance level (SCL), and a series of peaks, referred to as skin conductance responses (SCRs). In the literature, different approaches to separate these two components can be found: Benedek and Kaernbach [150], for instance, present an approach to separate SCL and SCR relying on nonnegative deconvolution. Alternatively, Choi et al. [56] utilized a regularized least-squares detrending method to separate the two components.
- Electromyogram Preprocessing: Raw EMG data are often filtered to remove noise. For example, Wijsman et al. [69] report a two-step procedure: First, a bandpass filter, allowing frequencies from 20 to 450 Hz, was applied. Then, in order to remove residual power line interference from the data, notch filters were applied, attenuating the 50, 100, 150, 200, 250, and 350 Hz components of the signal. Cardiac artefacts are another common source of noise in EMG data. Hence, Willigenburg et al. [151] propose and compare different filtering procedures to remove ECG interference from the EMG signal.
- Respiration Preprocessing: Depending on the signal quality, noise removal filtering techniques (e.g., bandpass filter with cut-off frequencies at 0.1 and 0.35 Hz) have to be applied. In addition, the raw respiration signal can be detrended by subtracting a moving average [86].
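The moving-average detrending mentioned above for EDA and respiration signals can be sketched as follows (an illustrative NumPy implementation; the function name and window length are assumptions, not taken from the cited works):

```python
import numpy as np

def detrend_moving_average(signal: np.ndarray, window: int) -> np.ndarray:
    """Estimate the slow trend with a centred moving average and
    subtract it from the signal."""
    kernel = np.ones(window) / window
    # mode="same" keeps the output length equal to the input length;
    # values near the edges are based on a partially filled window.
    trend = np.convolve(signal, kernel, mode="same")
    return signal - trend
```

With a window chosen long relative to the phenomenon of interest (e.g., SCRs or single breaths), the slow baseline is removed while the faster responses remain.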
5.2. Physiological Feature Extraction
5.2.1. ACC-Based Features
5.2.2. ECG- and PPG-Based Features
5.2.3. EDA-Based Features
5.2.4. EMG-Based Features
5.2.5. Respiration-Based Features
5.2.6. Temperature-Based Features
5.3. Classification
- Feature selection can help to improve classification results.
- Feature selection identifies cost-effective and yet strong predictors.
- It provides a better understanding of the processes generating the data [167].
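A greedy forward-selection loop is one common way to identify a small set of strong predictors; the sketch below is an illustrative example (the scorer and all names are our own assumptions, not a specific method from the reviewed literature):

```python
import numpy as np

def forward_select(X, y, score_fn, k):
    """Greedy forward feature selection: repeatedly add the feature
    whose inclusion yields the best subset score, until k are chosen."""
    selected = []
    remaining = list(range(X.shape[1]))
    while remaining and len(selected) < k:
        best = max(remaining,
                   key=lambda j: score_fn(X[:, selected + [j]], y))
        selected.append(best)
        remaining.remove(best)
    return selected

def r2_score_fn(Xs, y):
    """Toy scorer: R^2 of an ordinary least-squares fit on the subset."""
    A = np.column_stack([Xs, np.ones(len(y))])
    coef, *_ = np.linalg.lstsq(A, y, rcond=None)
    resid = y - A @ coef
    return 1 - resid.var() / y.var()
```

In practice the scorer would be cross-validated classifier performance; the greedy structure, however, is the same, and the resulting small subsets are what makes the selected features cost-effective.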
6. Discussion and Outlook
Author Contributions
Funding
Conflicts of Interest
References
- D’mello, S.; Kory, J. A Review and Meta-Analysis of Multimodal Affect Detection Systems. ACM Comput. Surv. 2015, 47, 43:1–43:36.
- Bower, G.H. Mood and memory. Am. Psychol. 1981, 36, 129–148.
- McEwen, B.; Stellar, E. Stress and the individual: Mechanisms leading to disease. Arch. Intern. Med. 1993, 153, 2093–2101.
- Chrousos, G.; Gold, P. The concepts of stress and stress system disorders: Overview of physical and behavioral homeostasis. JAMA 1992, 267, 1244–1252.
- Rosmond, R.; Björntorp, P. Endocrine and metabolic aberrations in men with abdominal obesity in relation to anxio-depressive infirmity. Metabolism 1998, 47, 1187–1193.
- HSE. HSE on Work Related Stress. 2016. Available online: http://www.hse.gov.uk/-statistics/causdis/-ffstress/index.htm (accessed on 6 September 2017).
- Tzirakis, P.; Trigeorgis, G.; Zafeiriou, S. End-to-end multimodal emotion recognition using deep neural networks. arXiv 2017, arXiv:1704.08619.
- Mirsamadi, S.; Barsoum, E.; Zhang, C. Automatic speech emotion recognition using recurrent neural networks with local attention. In Proceedings of the IEEE International Conference on Acoustics, Speech and Signal Processing (ICASSP), New Orleans, LA, USA, 5–9 March 2017; pp. 2227–2231.
- Wang, R.; Chen, F.; Chen, Z.; Li, T.; Campbell, A. StudentLife: Assessing mental health, academic performance and behavioral trends of college students using smartphones. In Proceedings of the 2014 ACM International Joint Conference on Pervasive and Ubiquitous Computing, Seattle, WA, USA, 13–17 September 2014; pp. 3–14.
- Gangemi, A.; Presutti, V.; Recupero, D.R. Frame-Based Detection of Opinion Holders and Topics: A Model and a Tool. IEEE Comput. Intell. Mag. 2014, 9, 20–30.
- Gravina, R.; Li, Q. Emotion-relevant activity recognition based on smart cushion using multi-sensor fusion. Inf. Fusion 2019, 48, 1–10.
- Picard, R.; Vyzas, E.; Healey, J. Toward machine emotional intelligence: Analysis of affective physiological state. IEEE Trans. Pattern Anal. Mach. Intell. 2001, 23, 1175–1191.
- Gjoreski, M.; Luštrek, M.; Gams, M.; Gjoreski, H. Monitoring stress with a wrist device using context. J. Biomed. Inform. 2017, 73, 159–170.
- Costa, A.; Rincon, J.; Carrascosa, C.; Julian, V.; Novais, P. Emotions detection on an ambient intelligent system using wearable devices. Future Gener. Comput. Syst. 2019, 92, 479–489.
- Garmin Vívoactive 3. 2017. Available online: https://buy.garmin.com/en-US/US/p/567813 (accessed on 11 January 2018).
- Affectiva. 2017. Available online: https://www.affectiva.com/ (accessed on 6 January 2018).
- Poria, S.; Cambria, E.; Bajpai, R.; Hussain, A. A review of affective computing: From unimodal analysis to multimodal fusion. Inf. Fusion 2017, 37, 98–125.
- Miller, G. The Smartphone Psychology Manifesto. Perspect. Psychol. Sci. 2012, 7, 221–237.
- Rastgoo, M.; Nakisa, B.; Rakotonirainy, A.; Chandran, V.; Tjondronegoro, D. A Critical Review of Proactive Detection of Driver Stress Levels Based on Multimodal Measurements. ACM Comput. Surv. 2018, 51, 88.
- Kim, M.; Kim, M.; Oh, E.; Kim, S. A review on the computational methods for emotional state estimation from the human EEG. Comput. Math. Methods Med. 2013, 2013, 573734.
- Russell, J. Core affect and the psychological construction of emotion. Psychol. Rev. 2003, 110, 145.
- Liu, B. Many Facets of Sentiment Analysis. In A Practical Guide to Sentiment Analysis; Springer: Cham, Switzerland, 2017; pp. 11–39.
- Cicero, M. Cicero on the Emotions: Tusculan Disputations 3 and 4; University of Chicago Press: Chicago, IL, USA, 2002.
- Darwin, C. The Expression of the Emotions in Man and Animals, 3rd ed.; Introduction, Afterword and Commentaries by Paul Ekman; Essay on the History of the Illustrations by Phillip Prodger; HarperCollins Publishers: London, UK, 1999. First published in 1872.
- Ekman, P. An Argument for Basic Emotions. Cogn. Emot. 1992, 6, 169–200.
- Ekman, P.; Friesen, W. Facial Action Coding System: A Technique for Measurement of Facial Movement; Consulting Psychologists Press: Palo Alto, CA, USA, 1978.
- Ekman, P.; Friesen, W. Measuring facial movement. Environ. Psychol. Nonverbal Behav. 1976, 1, 56–75.
- Plutchik, R. Emotion: A Psychoevolutionary Synthesis; Harper & Row: New York, NY, USA, 1980.
- Zenonos, A.; Khan, A.; Sooriyabandara, M. HealthyOffice: Mood recognition at work using smartphones and wearable sensors. In Proceedings of the PerCom Workshops, Sydney, Australia, 14–18 March 2016.
- Russell, J. Affective Space Is Bipolar; American Psychological Association: Washington, DC, USA, 1979.
- Soleymani, M.; Lichtenauer, J.; Pun, T.; Pantic, M. A Multimodal Database for Affect Recognition and Implicit Tagging. IEEE Trans. Affect. Comput. 2012, 3, 42–55.
- Wundt, W. Vorlesung über die Menschen- und Tierseele; Voss Verlag: Leipzig, Germany, 1863.
- Becker-Asano, C. WASABI: Affect Simulation for Agents with Believable Interactivity; IOS Press: Amsterdam, The Netherlands, 2008.
- Kim, J.; André, E. Emotion recognition based on physiological changes in music listening. IEEE Trans. Pattern Anal. Mach. Intell. 2008, 30, 2067–2083.
- Koelstra, S.; Muhl, C.; Patras, I. DEAP: A database for emotion analysis using physiological signals. IEEE Trans. Affect. Comput. 2012, 3, 18–31.
- Valenza, G.; Citi, L.; Lanatá, A.; Scilingo, E.; Barbieri, R. Revealing real-time emotional responses: A personalized assessment based on heartbeat dynamics. Sci. Rep. 2014, 4, 4998.
- Abadi, M.; Subramanian, R.; Kia, S.; Avesani, P.; Patras, I.; Sebe, N. DECAF: MEG-based multimodal database for decoding affective physiological responses. IEEE Trans. Affect. Comput. 2015, 6, 209–222.
- Morris, J.D. Observations: SAM: The Self-Assessment Manikin; an efficient cross-cultural measurement of emotional response. J. Advert. Res. 1995, 35, 63–68.
- Subramanian, R.; Wache, J.; Abadi, M.; Vieriu, R.; Winkler, S.; Sebe, N. ASCERTAIN: Emotion and Personality Recognition using Commercial Sensors. IEEE Trans. Affect. Comput. 2017, 9, 147–160.
- Jirayucharoensak, S.; Pan-Ngum, S.; Israsena, P. EEG-based emotion recognition using deep learning network with principal component based covariate shift adaptation. Sci. World J. 2014, 2014, 627892.
- Cannon, W. Bodily Changes in Pain, Hunger, Fear and Rage; D Appleton & Company: New York, NY, USA, 1929.
- Selye, H. Stress without distress. In Psychopathology of Human Adaptation; Springer: Boston, MA, USA, 1974; pp. 26–39.
- Goldstein, D.; Kopin, I. Evolution of concepts of stress. Stress 2007, 10, 109–120.
- Lu, H.; Frauendorfer, D.; Choudhury, T. StressSense: Detecting stress in unconstrained acoustic environments using smartphones. In Proceedings of the 2012 ACM Conference on Ubiquitous Computing, Pittsburgh, PA, USA, 5–8 September 2012; pp. 351–360.
- Mozos, O.; Sandulescu, V.; Andrews, S.; Ellis, D.; Bellotto, N.; Dobrescu, R.; Ferrandez, J. Stress detection using wearable physiological and sociometric sensors. Int. J. Neural Syst. 2017, 27, 1650041.
- Plarre, K.; Raij, A.; Scott, M. Continuous inference of psychological stress from sensory measurements collected in the natural environment. In Proceedings of the 10th International Conference on Information Processing in Sensor Networks (IPSN), Chicago, IL, USA, 12–14 April 2011; pp. 97–108.
- Schmidt, P.; Reiss, A.; Dürichen, R.; Marberger, C.; Van Laerhoven, K. Introducing WESAD, a Multimodal Dataset for Wearable Stress and Affect Detection. In Proceedings of the 20th ACM International Conference on Multimodal Interaction, Boulder, CO, USA, 16–20 October 2018.
- Sanches, P.; Höök, K.; Vaara, E.; Weymann, C.; Bylund, M.; Ferreira, P.; Peira, N.; Sjölinder, M. Mind the Body!: Designing a Mobile Stress Management Application Encouraging Personal Reflection. In Proceedings of the 8th ACM Conference on Designing Interactive Systems, Aarhus, Denmark, 16–20 August 2010; pp. 47–56.
- Thayer, R. The Biopsychology of Mood and Arousal; Oxford University Press: Oxford, UK, 1990.
- Schimmack, U.; Reisenzein, R. Experiencing activation: Energetic arousal and tense arousal are not mixtures of valence and activation. Emotion 2002, 2, 412.
- Mehrotra, A.; Tsapeli, F.; Hendley, R.; Musolesi, M. MyTraces: Investigating correlation and causation between users’ emotional states and mobile phone interaction. PACM Interact. Mob. Wearable Ubiquitous Technol. 2017, 1, 83.
- James, W. What is an emotion? Mind 1884, 9, 188.
- Levenson, R.; Ekman, P.; Friesen, W. Voluntary facial action generates emotion-specific autonomic nervous system activity. Psychophysiology 1990, 27, 363–384.
- Friedman, B.H. Feelings and the body: The Jamesian perspective on autonomic specificity of emotion. Biol. Psychol. 2010, 84, 383–393.
- McCorry, L. Physiology of the autonomic nervous system. Am. J. Pharm. Educ. 2007, 71, 78.
- Choi, J.; Ahmed, B.; Gutierrez-Osuna, R. Development and evaluation of an ambulatory stress monitor based on wearable sensors. IEEE Trans. Inf. Technol. Biomed. 2012, 16, 279–286.
- Dawson, M.; Schell, A.; Filion, D. The electrodermal system. In Handbook of Psychophysiology, 2nd ed.; Cambridge University Press: Cambridge, UK, 2000; pp. 200–223.
- Kreibig, S. Autonomic nervous system activity in emotion: A review. Biol. Psychol. 2010, 84, 394–421.
- Broek, E.; Lisy, V.; Janssen, J.; Westerink, J.; Schut, M.; Tuinenbreijer, K.; Fred, A.; Filipe, J.; Gamboa, H. Affective Man-machine Interface: Unveiling Human Emotions through Biosignals. In International Joint Conference on Biomedical Engineering Systems and Technologies; Springer: Berlin/Heidelberg, Germany, 2009.
- Mahdiani, S.; Jeyhani, V.; Peltokangas, M.; Vehkaoja, A. Is 50 Hz high enough ECG sampling frequency for accurate HRV analysis? In Proceedings of the 37th Annual International Conference of the IEEE Engineering in Medicine and Biology Society (EMBC), Milan, Italy, 25–29 August 2015; pp. 5948–5951.
- Tamura, T.; Maeda, Y.; Sekine, M.; Yoshida, M. Wearable Photoplethysmographic Sensors-Past and Present. Electronics 2014, 3, 282–302.
- Lin, W.; Wu, D.; Li, C.; Zhang, H.; Zhang, Y. Comparison of Heart Rate Variability from PPG with That from ECG; Springer: Berlin/Heidelberg, Germany, 2014; pp. 213–215.
- Healey, J.; Picard, R. Detecting stress during real-world driving tasks using physiological sensors. IEEE Trans. Intell. Transp. Syst. 2005, 6, 156–166.
- Schmidt, P.; Reiss, A.; Dürichen, R.; Van Laerhoven, K. Labelling Affective States “in the Wild”: Practical Guidelines and Lessons Learned. In Proceedings of the 2018 ACM International Joint Conference and 2018 International Symposium on Pervasive and Ubiquitous Computing and Wearable Computers, Singapore, 8–12 October 2018.
- Lykken, D.T.; Venables, P.H. Direct measurement of skin conductance: A proposal for standardization. Psychophysiology 1971, 8, 656–672.
- Di Lascio, E.; Gashi, S.; Santini, S. Laughter Recognition Using Non-invasive Wearable Devices. In Proceedings of the 13th EAI International Conference on Pervasive Computing Technologies for Healthcare, Trento, Italy, 20–23 May 2019; ACM: New York, NY, USA, 2019; pp. 262–271.
- Heinisch, J.S.; Anderson, C.; David, K. Angry or Climbing Stairs? Towards Physiological Emotion Recognition in the Wild. In Proceedings of the 2019 IEEE International Conference on Pervasive Computing and Communications Workshops (PerCom Workshops), Kyoto, Japan, 11–15 March 2019; pp. 486–491.
- Van Boxtel, A. Optimal signal bandwidth for the recording of surface EMG activity of facial, jaw, oral, and neck muscles. Psychophysiology 2001, 38, 22–34.
- Wijsman, J.; Grundlehner, B.; Hermens, H. Trapezius muscle EMG as predictor of mental stress. In Proceedings of the Wireless Health 2010, San Diego, CA, USA, 5–7 October 2010; pp. 155–163.
- Lisetti, C.; Nasoz, F. Using Noninvasive Wearable Computers to Recognize Human Emotions from Physiological Signals. EURASIP J. Appl. Signal Process. 2004, 2004, 1672–1687.
- Kim, K.; Bang, S.; Kim, S. Emotion recognition system using short-term monitoring of physiological signals. Med. Biol. Eng. Comput. 2004, 42, 419–427.
- Soleymani, M.; Pantic, M.; Pun, T. Multimodal emotion recognition in response to videos. IEEE Trans. Affect. Comput. 2012, 3, 211–223.
- Ramos, J.; Hong, J.; Dey, A. Stress recognition: A step outside the lab. In Proceedings of the International Conference on Physiological Computing Systems, Lisbon, Portugal, 7–9 January 2014.
- Muaremi, A.; Arnrich, B.; Tröster, G. Towards measuring stress with smartphones and wearable devices during workday and sleep. BioNanoScience 2013, 3, 172–183.
- Kanjo, E.; Younis, E.M.; Ang, C.S. Deep learning analysis of mobile physiological, environmental and location sensor data for emotion detection. Inf. Fusion 2019, 49, 46–56.
- Birjandtalab, J.; Cogan, D.; Pouyan, M.B.; Nourani, M. A Non-EEG Biosignals Dataset for Assessment and Visualization of Neurological Status. In Proceedings of the 2016 IEEE International Workshop on Signal Processing Systems (SiPS), Dallas, TX, USA, 26–28 October 2016; pp. 110–114.
- Haag, A.; Goronzy, S.; Schaich, P.; Williams, J. Emotion recognition using bio-sensors: First steps towards an automatic system. In Tutorial and Research Workshop on Affective Dialogue Systems; Springer: Berlin/Heidelberg, Germany, 2004; pp. 36–48.
- Liu, C.; Rani, P.; Sarkar, N. An empirical study of machine learning techniques for affect recognition in human-robot interaction. In Proceedings of the 2005 IEEE/RSJ International Conference on Intelligent Robots and Systems, Edmonton, AB, Canada, 2–6 August 2005; pp. 2662–2667.
- Wagner, J.; Kim, J.; André, E. From physiological signals to emotions: Implementing and comparing selected methods for feature extraction and classification. In Proceedings of the IEEE International Conference on Multimedia and Expo (ICME), Amsterdam, The Netherlands, 6 July 2005; pp. 940–943.
- Leon, E.; Clarke, G.; Callaghan, V.; Sepulveda, F. A user-independent real-time emotion recognition system for software agents in domestic environments. Eng. Appl. Artif. Intell. 2007, 20, 337–345.
- Zhai, J.; Barreto, A. Stress detection in computer users through non-invasive monitoring of physiological signals. Biomed. Sci. Instrum. 2006, 42, 495–500.
- Kim, D.; Seo, Y.; Cho, J.; Cho, C. Detection of subjects with higher self-reporting stress scores using heart rate variability patterns during the day. In Proceedings of the 30th Annual International Conference of the IEEE Engineering in Medicine and Biology Society, Vancouver, BC, Canada, 20–25 August 2008; pp. 682–685.
- Katsis, C.; Katertsidis, N.; Ganiatsas, G.; Fotiadis, D. Toward Emotion Recognition in Car-Racing Drivers: A Biosignal Processing Approach. IEEE Trans. Syst. Man Cybern. 2008, 38, 502–512.
- Calvo, R.; Brown, I.; Scheding, S. Effect of Experimental Factors on the Recognition of Affective Mental States through Physiological Measures. In AI 2009: Advances in Artificial Intelligence; Springer: Berlin/Heidelberg, Germany, 2009; pp. 62–70.
- Chanel, G.; Kierkels, J.; Soleymani, M.; Pun, T. Short-term emotion assessment in a recall paradigm. Int. J. Hum. Comput. Stud. 2009, 67, 607–627.
- Khalili, Z.; Moradi, M. Emotion recognition system using brain and peripheral signals: Using correlation dimension to improve the results of EEG. In Proceedings of the 2009 International Joint Conference on Neural Networks, Atlanta, GA, USA, 14–19 June 2009; pp. 1571–1575.
- Healey, J.; Nachman, L.; Subramanian, S.; Shahabdeen, J.; Morris, M. Out of the Lab and into the Fray: Towards Modeling Emotion in Everyday Life. In Pervasive Computing; Springer: Berlin/Heidelberg, Germany, 2010.
- Hernandez, J.; Morris, R.; Picard, R.W. Call Center Stress Recognition with Person-Specific Models. In Affective Computing and Intelligent Interaction; Springer: Berlin/Heidelberg, Germany, 2011.
- Valenza, G.; Lanata, A.; Scilingo, E. The Role of Nonlinear Dynamics in Affective Valence and Arousal Recognition. IEEE Trans. Affect. Comput. 2012, 3, 237–249.
- Hamdi, H.; Richard, P.; Allain, P. Emotion assessment for affective computing based on physiological responses. In Proceedings of the 2012 IEEE International Conference on Fuzzy Systems (FUZZ-IEEE), Brisbane, Australia, 10–15 June 2012; pp. 1–8.
- Agrafioti, F.; Hatzinakos, D.; Anderson, A.K. ECG Pattern Analysis for Emotion Detection. IEEE Trans. Affect. Comput. 2012, 3, 102–115.
- Sano, A.; Picard, R. Stress Recognition Using Wearable Sensors and Mobile Phones. In Proceedings of the Humaine Association Conference on Affective Computing and Intelligent Interaction (ACII), Geneva, Switzerland, 2–5 September 2013; pp. 671–676.
- Martinez, H.; Bengio, Y.; Yannakakis, G. Learning deep physiological models of affect. IEEE Comput. Intell. Mag. 2013, 8, 20–33.
- Adams, P.; Rabbi, M.; Rahman, T.; Matthews, M.; Voida, A.; Gay, G.; Choudhury, T.; Voida, S. Towards personal stress informatics: Comparing minimally invasive techniques for measuring daily stress in the wild. In Proceedings of the 8th International Conference on Pervasive Computing Technologies for Healthcare, Oldenburg, Germany, 20–23 May 2014; pp. 72–79.
- Hovsepian, K.; al’Absi, M.; Kumar, S. cStress: Towards a gold standard for continuous stress assessment in the mobile environment. In Proceedings of the 2015 ACM International Joint Conference on Pervasive and Ubiquitous Computing, Osaka, Japan, 7–11 September 2015; pp. 493–504.
- Rubin, J.; Abreu, R.; Ahern, S. Time, Frequency & Complexity Analysis for Recognizing Panic States from Physiologic Time-Series. In Proceedings of the 10th EAI International Conference on Pervasive Computing Technologies for Healthcare, Cancun, Mexico, 16–19 May 2016.
- Jaques, N.; Taylor, S.; Nosakhare, E.; Sano, A.; Picard, R. Multi-task Learning for Predicting Health, Stress, and Happiness. In Proceedings of the NIPS Workshop on Machine Learning for Healthcare, Barcelona, Spain, 5–10 December 2016.
- Rathod, P.; George, K.; Shinde, N. Bio-signal based emotion detection device. In Proceedings of the 2016 IEEE 13th International Conference on Wearable and Implantable Body Sensor Networks (BSN), San Francisco, CA, USA, 14–17 June 2016; pp. 105–108.
- Zhu, Z.; Satizabal, H.; Blanke, U.; Perez-Uribe, A.; Tröster, G. Naturalistic Recognition of Activities and Mood Using Wearable Electronics. IEEE Trans. Affect. Comput. 2016, 7, 272–285.
- Taylor, S.A.; Jaques, N.; Nosakhare, E.; Sano, A.; Picard, R. Personalized Multitask Learning for Predicting Tomorrow’s Mood, Stress, and Health. IEEE Trans. Affect. Comput. 2018, 2018, 1.
- Girardi, D.; Lanubile, F.; Novielli, N. Emotion detection using noninvasive low cost sensors. In Proceedings of the Seventh International Conference on Affective Computing and Intelligent Interaction, San Antonio, TX, USA, 23–26 October 2017; pp. 125–130.
- Zhao, B.; Wang, Z.; Yu, Z.; Guo, B. EmotionSense: Emotion Recognition Based on Wearable Wristband. In Proceedings of the 2018 IEEE SmartWorld, Ubiquitous Intelligence Computing, Advanced Trusted Computing, Scalable Computing Communications, Cloud Big Data Computing, Internet of People and Smart City Innovation (SmartWorld/SCALCOM/UIC/ATC/CBDCom/IOP/SCI), Guangzhou, China, 8–12 October 2018; pp. 346–355.
- Marín-Morales, J.; Higuera-Trujillo, J.; Greco, A.; Guixeres, J.; Llinares, C.; Scilingo, E.; Alcañiz, M.; Valenza, G. Affective computing in virtual reality: Emotion recognition from brain and heartbeat dynamics using wearable sensors. Sci. Rep. 2018, 8, 13657.
- Santamaria-Granados, L.; Munoz-Organero, M.; Ramirez-González, G.; Abdulhay, E.; Arunkumar, N. Using Deep Convolutional Neural Network for Emotion Detection on a Physiological Signals Dataset (AMIGOS). IEEE Access 2019, 7, 57–67.
- Hassan, M.M.; Alam, M.G.R.; Uddin, M.Z.; Huda, S.; Almogren, A.; Fortino, G. Human emotion recognition using deep belief network architecture. Inf. Fusion 2019, 51, 10–18.
- Lang, P.J.; Bradley, M.M.; Cuthbert, B.N. International Affective Picture System (IAPS): Technical Manual and Affective Ratings; The Center for Research in Psychophysiology, University of Florida: Gainesville, FL, USA, 1999.
- Mikels, J.; Fredrickson, B.; Larkin, G.; Lindberg, C.; Maglio, S.; Reuter-Lorenz, P. Emotional category data on images from the International Affective Picture System. Behav. Res. Methods 2005, 37, 626–630.
- Gross, J.; Levenson, R. Emotion elicitation using films. Cogn. Emot. 1995, 9, 87–108.
- Samson, A.; Kreibig, S.; Gross, J. Eliciting positive, negative and mixed emotional states: A film library for affective scientists. Cogn. Emot. 2016, 30, 827–856.
- Hanai, T.; Ghassemi, M. Predicting Latent Narrative Mood Using Audio and Physiologic Data. In Proceedings of the Thirty-First AAAI Conference on Artificial Intelligence, San Francisco, CA, USA, 4–10 February 2017; pp. 948–954.
- Castellano, G.; Kessous, L.; Caridakis, G. Emotion Recognition through Multiple Modalities: Face, Body Gesture, Speech. In Affect and Emotion in Human-Computer Interaction: From Theory to Applications; Peter, C., Beale, R., Eds.; Springer: Berlin/Heidelberg, Germany, 2008; pp. 92–103.
- Dobrišek, S.; Gajšek, R.; Mihelič, F.; Pavešić, N.; Štruc, V. Towards Efficient Multi-Modal Emotion Recognition. Int. J. Adv. Robot. Syst. 2013, 10, 53. [Google Scholar] [CrossRef] [Green Version]
- Taylor, B.; Dey, A.; Siewiorek, D.; Smailagic, A. Using Physiological Sensors to Detect Levels of User Frustration Induced by System Delays. In Proceedings of the 2015 ACM International Joint Conference on Pervasive and Ubiquitous Computing, Osaka, Japan, 7–11 September 2015; pp. 517–528. [Google Scholar]
- Riva, G.; Mantovani, F.; Capideville, C.S.; Preziosa, A.; Morganti, F.; Villani, D.; Gaggioli, A.; Botella, C.; Alcañiz, M. Affective interactions using virtual reality: The link between presence and emotions. CyberPsychol. Behav. 2007, 10, 45–56. [Google Scholar] [CrossRef]
- Mason, J. A review of psychoendocrine research on the sympathetic-adrenal medullary system. Psychosom. Med. 1968, 30, 631–653. [Google Scholar] [CrossRef]
- Lupien, S.; Maheu, F.; Tu, M.; Fiocco, A.; Schramek, T. The effects of stress and stress hormones on human cognition: Implications for the field of brain and cognition. Brain Cogn. 2007, 65, 209–237. [Google Scholar] [CrossRef] [Green Version]
- Kirschbaum, C.; Pirke, K.; Hellhammer, D. The Trier Social Stress Test—A tool for investigating psychobiological stress responses in a laboratory setting. Neuropsychobiology 1993, 28, 76–81. [Google Scholar] [CrossRef]
- Gjoreski, M.; Gjoreski, H.; Gams, M. Continuous stress detection using a wrist device: In laboratory and real life. In Proceedings of the 2016 ACM International Joint Conference on Pervasive and Ubiquitous Computing: Adjunct, Heidelberg, Germany, 12–16 September 2016; pp. 1185–1193. [Google Scholar]
- Stroop, J.R. Studies of interference in serial verbal reactions. J. Exp. Psychol. 1935, 18, 643. [Google Scholar] [CrossRef]
- Wijsman, J.; Grundlehner, B.; Liu, H.; Hermens, H. Wearable Physiological Sensors Reflect Mental Stress State in Office-Like Situations. In Proceedings of the 2013 Humaine Association Conference on Affective Computing and Intelligent Interaction, Geneva, Switzerland, 2–5 September 2013; pp. 600–605. [Google Scholar]
- Rubin, J.; Eldardiry, H.; Abreu, R.; Ahern, S.; Du, H.; Pattekar, A.; Bobrow, D. Towards a mobile and wearable system for predicting panic attacks. In Proceedings of the 2015 ACM International Joint Conference on Pervasive and Ubiquitous Computing, Osaka, Japan, 7–11 September 2015; pp. 529–533. [Google Scholar]
- Sano, A.; Yu, A.; McHill, A.; Phillips, A.; Picard, R. Prediction of Happy-Sad mood from daily behaviors and previous sleep history. In Proceedings of the 2015 37th Annual International Conference of the IEEE Engineering in Medicine and Biology Society (EMBC), Milan, Italy, 25–29 August 2015. [Google Scholar]
- Cohen, S.; Kamarck, T.; Mermelstein, R. A global measure of perceived stress. J. Health Soc. Behav. 1983, 1983, 385–396. [Google Scholar] [CrossRef]
- Koh, K.; Park, J.; Kim, C.; Cho, S. Development of the Stress Response Inventory and its application in clinical practice. Psychosom. Med. 2001, 63, 668–678. [Google Scholar] [CrossRef]
- Kroenke, K.; Spitzer, R.; Williams, J. The PHQ-9. J. Gen. Intern. Med. 2001, 16, 606–613. [Google Scholar] [CrossRef]
- Russell, D. UCLA Loneliness Scale (Version 3): Reliability, Validity, and Factor Structure. J. Personal. Assess. 1996, 66, 20–40. [Google Scholar] [CrossRef]
- Buysse, D.; Reynolds, C.; Monk, T.; Berman, S.; Kupfer, D. The Pittsburgh Sleep Quality Index: A new instrument for psychiatric practice and research. Psychiatry Res. 1989, 28, 193–213. [Google Scholar] [CrossRef]
- Diener, E.; Wirtz, D.; Tov, W.; Kim-Prieto, C.; Choi, D.; Oishi, S.; Biswas-Diener, R. New well-being measures: Short scales to assess flourishing and positive and negative feelings. Soc. Indic. Res. 2010, 97, 143–156. [Google Scholar] [CrossRef]
- John, O.; Srivastava, S. The Big Five trait taxonomy: History, measurement, and theoretical perspectives. Handbook of Personality: Theory and Research; Guilford Press: New York, NY, USA, 1999; pp. 102–138. [Google Scholar]
- Morris, M.; Guilak, F. Mobile Heart Health: Project Highlight. IEEE Pervasive Comput. 2009, 8, 57–61. [Google Scholar] [CrossRef]
- Pollak, J.P.; Adams, P.; Gay, G. PAM: A Photographic Affect Meter for Frequent, in Situ Measurement of Affect. In Proceedings of the SIGCHI Conference on Human Factors in Computing Systems, Vancouver, BC, Canada, 7–12 May 2011. [Google Scholar]
- Shear, K.; Brown, T.; Barlow, D.; Money, R.; Sholomskas, D.; Woods, S.; Gorman, J.; Papp, L. Multicenter collaborative panic disorder severity scale. Am. J. Psychiatry 1997, 154, 1571–1575. [Google Scholar] [CrossRef]
- Horne, J.; Ostberg, O. A self-assessment questionnaire to determine morningness-eveningness in human circadian rhythms. Int. J. Chronobiol. 1976, 4, 97–110. [Google Scholar]
- Taamneh, S.; Tsiamyrtzis, P.; Dcosta, M.; Buddharaju, P.; Khatri, A.; Manser, M.; Ferris, T.; Wunderlich, R.; Pavlidis, I. A multimodal dataset for various forms of distracted driving. Sci. Data 2017, 4, 170110. [Google Scholar] [CrossRef] [Green Version]
- Bulling, A.; Blanke, U.; Schiele, B. A tutorial on human activity recognition using body-worn inertial sensors. ACM Comput. Surv. 2014, 46, 33. [Google Scholar] [CrossRef]
- García-Laencina, P.J.; Sancho-Gómez, J.L.; Figueiras-Vidal, A.R. Pattern classification with missing data: A review. Neural Comput. Appl. 2010, 19, 263–282. [Google Scholar] [CrossRef]
- Figo, D.; Diniz, P.C.; Ferreira, D.R.; Cardoso, J.M.P. Preprocessing techniques for context recognition from accelerometer data. Pers. Ubiquitous Comput. 2010, 14, 645–662. [Google Scholar] [CrossRef]
- Pan, J.; Tompkins, W.J. A Real-Time QRS Detection Algorithm. IEEE Trans. Biomed. Eng. 1985, BME-32, 230–236. [Google Scholar] [CrossRef]
- Behar, J.; Oster, J.; Li, Q.; Clifford, G.D. ECG Signal Quality During Arrhythmia and Its Application to False Alarm Reduction. IEEE Trans. Biomed. Eng. 2013, 60, 1660–1666. [Google Scholar] [CrossRef] [PubMed]
- Elgendi, M. On the Analysis of Fingertip Photoplethysmogram Signals. Curr. Cardiol. Rev. 2012, 8, 14–25. [Google Scholar] [CrossRef] [PubMed]
- Biswas, D.; Simões-Capela, N.; Van Hoof, C.; Van Helleputte, N. Heart Rate Estimation From Wrist-Worn Photoplethysmography: A Review. IEEE Sens. J. 2019, 19, 6560–6570. [Google Scholar] [CrossRef]
- Lee, B.; Han, J.; Baek, H.J.; Shin, J.H.; Park, K.S.; Yi, W.J. Improved elimination of motion artifacts from a photoplethysmographic signal using a Kalman smoother with simultaneous accelerometry. Physiol. Meas. 2010, 31, 1585. [Google Scholar] [CrossRef] [PubMed]
- Ram, M.R.; Madhav, K.V.; Krishna, E.H.; Komalla, N.R.; Reddy, K.A. A Novel Approach for Motion Artifact Reduction in PPG Signals Based on AS-LMS Adaptive Filter. IEEE Trans. Instrum. Meas. 2012, 61, 1445–1457. [Google Scholar] [CrossRef]
- Reiss, A.; Indlekofer, I.; Schmidt, P.; Van Laerhoven, K. Deep PPG: Large-Scale Heart Rate Estimation with Convolutional Neural Networks. Sensors 2019, 19, 3079. [Google Scholar] [CrossRef] [PubMed]
- Salehizadeh, S.M.A.; Dao, D.; Bolkhovsky, J.; Cho, C.; Mendelson, Y.; Chon, K.H. A Novel Time-Varying Spectral Filtering Algorithm for Reconstruction of Motion Artifact Corrupted Heart Rate Signals During Intense Physical Activities Using a Wearable Photoplethysmogram Sensor. Sensors 2016, 16, 10. [Google Scholar] [CrossRef]
- Li, Q.; Clifford, G.D. Dynamic time warping and machine learning for signal quality assessment of pulsatile signals. Physiol. Meas. 2012, 33, 1491. [Google Scholar] [CrossRef]
- Setz, C.; Arnrich, B.; Schumm, J.; La Marca, R.; Tröster, G.; Ehlert, U. Discriminating stress from cognitive load using a wearable EDA device. IEEE Trans. Inf. Technol. Biomed. 2010, 14, 410–417. [Google Scholar] [CrossRef]
- Taylor, S.; Jaques, N.; Chen, W.; Fedor, S.; Sano, A.; Picard, R. Automatic identification of artifacts in electrodermal activity data. In Proceedings of the 2015 37th Annual International Conference of the IEEE on Engineering in Medicine and Biology Society (EMBC), Milan, Italy, 25–29 August 2015. [Google Scholar]
- Greco, A.; Valenza, G.; Lanata, A.; Scilingo, E.P.; Citi, L. cvxEDA: A Convex Optimization Approach to Electrodermal Activity Processing. IEEE Trans. Biomed. Eng. 2016, 63, 797–804. [Google Scholar] [CrossRef]
- Benedek, M.; Kaernbach, C. Decomposition of skin conductance data by means of nonnegative deconvolution. Psychophysiology 2010, 47, 647–658. [Google Scholar] [CrossRef] [PubMed] [Green Version]
- Willigenburg, N.; Daffertshofer, A.; Kingma, I.; van Dieen, J. Removing ECG contamination from EMG recordings: A comparison of ICA-based and other filtering procedures. J. Electromyogr. Kinesiol. 2012, 22, 485–493. [Google Scholar] [CrossRef] [PubMed] [Green Version]
- Huynh, T.; Schiele, B. Analyzing Features for Activity Recognition. In Proceedings of the 2005 Joint Conference on Smart Objects and Ambient Intelligence: Innovative Context-aware Services: Usages and Technologies, Grenoble, France, 12–14 October 2005; pp. 159–163. [Google Scholar]
- Reiss, A.; Stricker, D. Introducing a new benchmarked dataset for activity monitoring. In Proceedings of the 16th International Symposium on Wearable Computers (ISWC), Newcastle, UK, 18–22 June 2012; pp. 108–109. [Google Scholar]
- Parkka, J.; Ermes, M.; Antila, K.; van Gils, M.; Manttari, A.; Nieminen, H. Estimating Intensity of Physical Activity: A Comparison of Wearable Accelerometer and Gyro Sensors and 3 Sensor Locations. In Proceedings of the 29th Annual International Conference of the IEEE Engineering in Medicine and Biology Society, Lyon, France, 22–26 August 2007; pp. 1511–1514. [Google Scholar]
- Malik, M. Task Force of the European Society of Cardiology and the North American Society of Pacing and Electrophysiology. Heart rate variability: Standards of measurement, physiological interpretation, and clinical use. Eur. Heart J. 1996, 17, 354–381. [Google Scholar] [CrossRef]
- Lim, C.L.; Rennie, C.; Barry, R.J.; Bahramali, H.; Lazzaro, I.; Manor, B.; Gordon, E. Decomposing skin conductance into tonic and phasic components. Int. J. Psychophysiol. 1997, 25, 97–109. [Google Scholar] [CrossRef]
- Rainville, P.; Bechara, A.; Naqvi, N.; Damasio, A. Basic emotions are associated with distinct patterns of cardiorespiratory activity. Int. J. Psychophysiol. 2006, 61, 5–18. [Google Scholar] [CrossRef] [PubMed]
- Kukolja, D.; Popovic, S.; Horvat, M.; Kovac, B.; Cosic, K. Comparative analysis of emotion estimation methods based on physiological measurements for real-time applications. Int. J. Hum. Comput. Interact 2014, 72, 717–727. [Google Scholar] [CrossRef]
- Christy, T.; Kuncheva, L.; Williams, K. Selection of Physiological Input Modalities for Emotion Recognition; Technical Report; Bangor University: Bangor, UK, 2012. [Google Scholar]
- Kollia, V. Personalization Effect on Emotion Recognition from Physiological Data: An Investigation of Performance on Different Setups and Classifiers. ArXiv 2016, arXiv:1607.05832. [Google Scholar]
- Fernández-Delgado, M.; Cernadas, E.; Barro, S.; Amorim, D. Do We Need Hundreds of Classifiers to Solve Real World Classification Problems? J. Mach. Learn. Res. 2014, 15, 3133–3181. [Google Scholar]
- Friedman, J.; Hastie, T.; Tibshirani, R. Additive logistic regression: A statistical view of boosting (With discussion and a rejoinder by the authors). Ann. Statist. 2000, 28, 337–407. [Google Scholar] [CrossRef]
- Breiman, L. Random Forests. Mach. Learn. 2001, 45, 5–32. [Google Scholar] [CrossRef] [Green Version]
- Freund, Y.; Schapire, R.E. A Short Introduction to Boosting. J. Jpn. Soc. Artif. Intell. 1999, 14, 1612. [Google Scholar]
- Hammerla, N.; Halloran, S.; Ploetz, T. Deep, Convolutional, and Recurrent Models for Human Activity Recognition using Wearables. arXiv 2016, arXiv:1604.08880. [Google Scholar]
- Münzner, S.; Schmidt, P.; Reiss, A.; Hanselmann, M.; Stiefelhagen, R.; Dürichen, R. CNN-based Sensor Fusion Techniques for Multimodal Human Activity Recognition. In Proceedings of the 2017 ACM International Symposium on Wearable Computers, Maui, HI, USA, 11–15 September 2017. [Google Scholar]
- Guyon, I.; Elisseeff, A. An Introduction to Variable and Feature Selection. J. Mach. Learn. Res. 2003, 3, 1157–1182. [Google Scholar]
- Empatica E4 Description. 2017. Available online: https://www.empatica.com/-e4-wristband (accessed on 7 September 2017).
- Ertin, E.; Stohs, N.; Kumar, S.; Raij, A.; al’Absi, M.; Shah, S. AutoSense: Unobtrusively wearable sensor suite for inferring the onset, causality, and consequences of stress in the field. In Proceedings of the 9th ACM Conference on Embedded Networked Sensor Systems, Seattle, WA, USA, 1–4 November 2011; pp. 274–287. [Google Scholar]
- BioPac. 2017. Available online: https://www.biopac.com/ (accessed on 9 January 2018).
- Vivalnk. 2017. Available online: http://vivalnk.com/ (accessed on 9 January 2018).
- Sadri, B.; Goswami, D.; Sala de Medeiros, M.; Pal, A.; Castro, B.; Kuang, S.; Martinez, R.V. Wearable and Implantable Epidermal Paper-Based Electronics. ACS Appl. Mater. Interfaces 2018, 10, 31061–31068. [Google Scholar] [CrossRef] [PubMed]
- Ameri, S.K.; Ho, R.; Jang, H.; Wang, Y.; Schnyer, D.M.; Akinwande, D.; Lu, N. Thinnest transparent epidermal sensor system based on graphene. In Proceedings of the 2016 IEEE International Electron Devices Meeting (IEDM), San Francisco, CA, USA, 3–7 December 2016. [Google Scholar]
- Reiss, A.; Amft, O. Design challenges of real wearable computers. In Fundamentals of Wearable Computers and Augmented Reality; CRC Press: Boca Raton, FL, USA, 2015; pp. 583–618. [Google Scholar]
- Lonini, L.; Shawen, N.; Ghaffari, R.; Rogers, J.; Jayarman, A. Automatic Detection of Spasticity from Flexible Wearable Sensors. In Proceedings of the 2017 ACM International Joint Conference on Pervasive and Ubiquitous Computing and Proceedings of the 2017 ACM International Symposium on Wearable Computers, Maui, HI, USA, 11–15 September 2017; pp. 133–136. [Google Scholar]
- Vrijkotte, T.; van Doornen, L.; de Geus, E. Effects of Work Stress on Ambulatory Blood Pressure, Heart Rate, and Heart Rate Variability. Hypertension 2000, 35, 880–886. [Google Scholar] [CrossRef] [PubMed]
- Gesche, H.; Grosskurth, D.; Küchler, G.; Patzak, A. Continuous blood pressure measurement by using the pulse transit time: Comparison to a cuff-based method. Eur. J. Appl. Physiol. 2012, 112, 309–315. [Google Scholar] [CrossRef] [PubMed]
- Pandia, K.; Ravindran, S.; Cole, R.; Kovacs, G.; Giovangrandi, L. Motion artifact cancellation to obtain heart sounds from a single chest-worn accelerometer. In Proceedings of the 2010 IEEE International Conference on Acoustics, Speech and Signal Processing, Dallas, TX, USA, 14–19 March 2010; pp. 590–593. [Google Scholar]
- Gao, W.; Emaminejad, S.; Nyein, H.; Challa, S.; Chen, K.; Peck, A.; Fahad, H.; Ota, H.; Shiraki, H.; Kiriya, D.; et al. Fully integrated wearable sensor arrays for multiplexed in situ perspiration analysis. Nature 2016, 529, 509–514. [Google Scholar] [CrossRef] [Green Version]
- Imani, S.; Bandodkar, A.; Mohan, V.; Kumar, R.; Yu, S.; Wang, J.; Mercier, P. A wearable chemical–electrophysiological hybrid biosensing system for real-time health and fitness monitoring. Nat. Commun. 2016, 7, 11650. [Google Scholar] [CrossRef]
- Peterson, R. On the use of college students in social science research: Insights from a second-order meta-analysis. J. Consum. Res. 2001, 28, 450–461. [Google Scholar] [CrossRef]
- Grünerbl, A.; Muaremi, A.; Osmani, V.; Bahle, G.; Lukowicz, P. Smartphone-Based Recognition of States and State Changes in Bipolar Disorder Patients. IEEE J. Biomed. Health 2015, 19, 140–148. [Google Scholar] [CrossRef]
- Popoola, G.A.; Graves, C.A.; Ford-Booker, P. Using Unsupervised Anomaly Detection to Analyze Physiological Signals for Emotion Recognition. In Proceedings of the 2018 IEEE International Symposium on Signal Processing and Information Technology (ISSPIT), Louisville, KY, USA, 6–8 December 2018. [Google Scholar]
- Yang, J.; Nguyen, M.; San, P.; Li, X.; Krishnaswamy, S. Deep Convolutional Neural Networks on Multichannel Time Series for Human Activity Recognition. In Proceedings of the 24th International Joint Conference on Artificial Intelligence (IJCAI), Buenos Aires, Argentina, 25–31 July 2015; pp. 3995–4001. [Google Scholar]
- Wöllmer, M.; Kaiser, M.; Eyben, F.; Schuller, B.; Rigoll, G. LSTM-Modeling of continuous emotions in an audiovisual affect recognition framework. Image Vis. Comput. 2013, 31, 153–163. [Google Scholar] [CrossRef] [Green Version]
- Enke, D.; Thawornwong, S. The use of data mining and neural networks for forecasting stock market returns. Expert Syst. Appl. 2005, 29, 927–940. [Google Scholar] [CrossRef]
- Ordóñez, F.; Roggen, D. Deep Convolutional and LSTM Recurrent Neural Networks for Multimodal Wearable Activity Recognition. Sensors 2016, 16, 115. [Google Scholar] [CrossRef] [PubMed]
- Vincent, P.; Larochelle, H.; Lajoie, I.; Bengio, Y.; Manzagol, P. Stacked denoising autoencoders: Learning useful representations in a deep network with a local denoising criterion. J. Mach. Learn. Res. 2010, 11, 3371–3408. [Google Scholar]
- Bhattacharya, S.; Lane, N.D. From smart to deep: Robust activity recognition on smartwatches using deep learning. In Proceedings of the 2016 IEEE International Conference on Pervasive Computing and Communication Workshops (PerCom Workshops), Sydney, Australia, 14–18 March 2016. [Google Scholar]
- Budner, P.; Eirich, J.; Gloor, P. “Making you happy makes me happy”—Measuring Individual Mood with Smartwatches. arXiv 2017, arXiv:1711.06134. [Google Scholar]
- Kanjo, E.; Al-Husain, L.; Chamberlain, A. Emotions in context: Examining pervasive affective sensing systems, applications, and analyses. Pers. Ubiquitous Comput. 2015, 19, 1197–1212. [Google Scholar] [CrossRef]
- Chan, S.; Torous, J.; Hinton, L.; Yellowlees, P. Mobile tele-mental health: Increasing applications and a move to hybrid models of care. Healthcare 2014, 2, 220–233. [Google Scholar] [CrossRef] [PubMed]
- Bergner, B.; Exner, J.; Zeile, P.; Rumber, M. Sensing the City—How to identify Recreational Benefits of Urban Green Areas with the Help of Sensor Technology; REAL CORP: Schwechat, Austria, 14–16 May 2012. [Google Scholar]
Sympathetic Nervous System (SNS) | Parasympathetic Nervous System (PNS) |
---|---|
 | Anger | Sadness (Non-Crying) | Amusement | Happiness |
---|---|---|---|---|
Cardiovascular: | | | | |
HR | ↑ | ↓ | ↑ | |
HRV | ↓ | ↓ | ↑ | ↓ |
Electrodermal: | | | | |
SCL | ↑ | ↓ | ↑ | |
# SCRs | ↑ | ↓ | ↑ | ↑ |
Respiration: | | | | |
Respiration rate | ↑ | ↑ | ↑ | ↑ |
Location | Physiological Signal Type | Derived Indicators |
---|---|---|
Head/Face | Electroencephalogram | Electric potential changes of brain neurons |
 | Electromyogram | Facial muscle activity (e.g., zygomaticus major) |
 | Electrooculography | Eye movements |
 | Photoplethysmogram (ear) | HR and HRV |
Torso/Back | Electrocardiogram | HR and HRV |
 | Electrodermal activity | Tonic and phasic component |
 | Electromyogram | Muscle activity |
 | Inertial sensor | Physical activity/body pose |
 | Respiratory inductive plethysmograph | Respiration rate and volume |
 | Body thermometer | Temperature |
Hand/Wrist | Electrodermal activity meter | Tonic and phasic component |
 | Blood oximeter | Blood oxygen saturation |
 | Sphygmomanometer | Blood pressure |
 | Inertial sensor | Physical activity |
 | Photoplethysmogram | HR and HRV |
 | Thermometer | Temperature |
Feet/Ankle | Electrodermal activity | Tonic and phasic component |
 | Inertial sensor | Physical activity |
Context | Sensors of a mobile phone (GPS, microphone, etc.) | Location, sound, activity, interaction |
Year | Author | Affective States | Sensor Signals |
---|---|---|---|
<2005 | Picard et al. [12] | Neutral, anger, hate, grief, joy, platonic/romantic love, reverence | EDA, EMG, PPG, RESP |
Haag et al. [77] | Low/medium/high arousal and positive/negative valence | ECG, EDA, EMG, TEMP, PPG, RESP | |
Lisetti and Nasoz [70] | Sadness, anger, fear, surprise, frustration, amusement | ECG, EDA, TEMP | |
2005 | Liu et al. [78] | Anxiety, boredom, engagement, frustration, anger | ECG, EDA, EMG |
Wagner et al. [79] | Joy, anger, pleasure, sadness | ECG, EDA, EMG, RESP | |
Healey and Picard [63] | Three stress levels | ECG, EDA, EMG, RESP | |
2007 | Leon et al. [80] | Neutral/positive/negative valence | EDA, HR, BP
2008 | Zhai and Barreto [81] | Relaxed and stressed | EDA, PD, PPG, TEMP |
Kim et al. [82] | Distinguish high/low stress group of individuals | PPG | |
Kim and André [34] | Four quadrants in valence-arousal space | ECG, EDA, EMG, RESP | |
Katsis et al. [83] | High/low stress, disappointment, euphoria | ECG, EDA, EMG, RESP | |
2009 | Calvo et al. [84] | Neutral, anger, hate, grief, joy, platonic/romantic love, reverence | ECG, EMG |
Chanel et al. [85] | Positively/negatively excited, calm-neutral (in valence-arousal space) | BP, EEG, EDA, PPG, RESP | |
Khalili and Moradi [86] | Positively/negatively excited, calm (valence-arousal space) | BP, EEG, EDA, RESP, TEMP | |
2010 | Healey et al. [87] | Points in valence-arousal space, moods | ACC, EDA, HR, audio
2011 | Plarre et al. [46] | Baseline, different types of stress (social, cognitive and physical), perceived stress | ACC, ECG, EDA, RESP, TEMP, ambient temperature |
Hernandez et al. [88] | Detect stressful calls | EDA | |
2012 | Valenza et al. [89] | Five classes of arousal and five valence levels | ECG, EDA, RESP |
Hamdi et al. [90] | Joy, sadness, disgust, anger, fear, surprise | ECG, EEG, EMG | |
Agrafioti et al. [91] | Neutral, gore, fear, disgust, excitement, erotica, game elicited mental arousal | ECG | |
Koelstra et al. [35] | Four quadrants in valence-arousal space | ECG, EDA, EEG, EMG, EOG, RESP, TEMP, facial video | |
Soleymani et al. [31] | Neutral, anxiety, amusement, sadness, joy, disgust, anger, surprise, fear | ECG, EDA, EEG, RESP, TEMP | |
2013 | Sano and Picard [92] | Stress vs. neutral | ACC, EDA, phone usage |
Martinez et al. [93] | Relaxation, anxiety, excitement, fun | EDA, PPG | |
2014 | Valenza et al. [36] | Four quadrants in valence-arousal space | ECG |
Adams et al. [94] | Stress vs. neutral (aroused vs. non-aroused) | EDA, audio | |
2015 | Hovsepian et al. [95] | Stress vs. neutral | ECG, RESP |
Abadi et al. [37] | High/Low valence, arousal and dominance | ECG, EOG, EMG, near-infrared face video, MEG | |
2016 | Rubin et al. [96] | Panic attack | ACC, ECG, RESP |
Jaques et al. [97] | Stress, happiness, health values | EDA, TEMP, ACC, phone usage | |
Rathod et al. [98] | Normal, happy, sad, fear, anger | EDA, PPG | |
Zenonos et al. [29] | Excited, happy, calm, tired, bored, sad, stressed, angry | ACC, ECG, PPG, TEMP | |
Zhu et al. [99] | Angle in valence-arousal space | ACC, phone context |
Birjandtalab et al. [76] | Relaxation, different types of stress (physical, emotional, cognitive) | ACC, EDA, TEMP, HR, SpO2 | |
2017 | Gjoreski et al. [13] | Lab: no/low/high stress; Field: stress vs. neutral | ACC, EDA, PPG, TEMP |
Mozos et al. [45] | Stress vs. neutral | ACC, EDA, PPG, audio | |
Taylor et al. [100] | Tomorrow’s mood, stress, health | ACC, EDA, context | |
Girardi et al. [101] | High vs. low valence and arousal | EEG, EDA, EMG | |
2018 | Schmidt et al. [64] | Neutral, amusement, stress | Torso: ACC, ECG, EDA, EMG, RESP, TEMP; Wrist: ACC, EDA, PPG, TEMP |
Zhao et al. [102] | LALV, LAHV, HALV, HAHV | EDA, PPG, TEMP | |
Marín-Morales et al. [103] | LALV, LAHV, HALV, HAHV | ECG, EEG | |
Santamaria-Granados et al. [104] | LALV, LAHV, HALV, HAHV | ECG, EDA |
2019 | Heinisch et al. [67] | High positive pleasure high arousal, high negative pleasure high arousal and neutral | EMG, PPG, TEMP |
Hassan et al. [105] | Happy, relaxed, disgust, sad and neutral | EDA, PPG, EMG (from DEAP) | |
Kanjo et al. [75] | Five valence classes | ACC, EDA, HR, TEMP, environmental, GPS | |
Di Lascio et al. [66] | Detect laughter episodes | ACC, EDA, PPG |
Questionnaires Employed Prior to or After the Study. | | | |
Goal | Tool and Description | Items | Source | Example Use
---|---|---|---|---
Stress level | PSS: subject’s perception and awareness of stress | 10 | Cohen et al. [123] | Sano and Picard [92] |
SRI: score severity of stress-related symptoms within time interval | 22 | Koh et al. [124] | Kim et al. [82] | |
Depression level | PHQ-9: scores the nine DSM-IV depression criteria | 9 | Kroenke et al. [125] | Wang et al. [9]
Loneliness level | UCLA loneliness scale: addressing loneliness and social isolation. | 20 | Russell [126] | Wang et al. [9] |
Sleep behaviour and quality | PSQI: Providing information about sleep quality | 19 | Buysse et al. [127] | Sano and Picard [92] |
Measure success areas | Flourishing scale: measure success, self-esteem, purpose and optimism | 8 | Diener et al. [128] | Wang et al. [9] |
Personality traits | BFI: indicating personality traits | 44 | John and Srivastava [129] | Taylor et al. [100], Sano et al. [122] |
Questionnaires employed in ecological momentary assessment (during study). | | | |
Affect in Valence-arousal space | Mood Map: a translation of the circumplex model of emotion | 2 | Morris and Guilak [130] | Healey et al. [87] |
SAM | 2 | Morris [38] | Schmidt et al. [64] | |
Positive and negative affect | Shortened PANAS | 10 | Muaremi et al. [74] | Muaremi et al. [74] |
Positive Affect of PANAS | PAM: choose one of 16 images, mapped to the valence-arousal space | 1 | Pollak et al. [131] | Wang et al. [9] |
Subjective mood indicator | Smartphone app querying user’s mood | 8 | HealthyOffice app | Zenonos et al. [29] |
Stress level assessment | Adaptation of PSS for ambulatory setting | 5 | Hovsepian et al. [95] | Hovsepian et al. [95] |
Log current Stress Level | 1 | Gjoreski et al. [13] Hernandez et al. [88] | Gjoreski et al. [13] Hernandez et al. [88] | |
Severity of panic attack symptoms | Symptoms from the DSM-IV and Panic Disorder Severity Scale standard instrument | 15 | Shear et al. [132] | Rubin et al. [121] |
Author | Employed Questionnaires and Their Scheduling | |
---|---|---|
Emotion | Healey et al. [87] | During study: Participants completed EMAs whenever they felt a change in their affective/physiological state. EMAs included a form of the circumplex model and a field for free text. Interviews were conducted at the end of each workday to generate additional labels and to revise existing ones.
Rubin et al. [121] | During study: Start/stop time and severity ratings of 15 panic attack symptoms were reported by the subject using a mobile app. | |
Jaques et al. [97] | During study: Students reported health, stress and happiness twice a day (morning and evening). | |
Stress | Hernandez et al. [88] | During study: Nine employees of a call center rated all their incoming calls on a 7-point Likert scale (endpoints marked as “extremely good/bad”).
Muaremi et al. [74] | During study: Participants were asked to fill in a shortened PANAS four times between 8 a.m. and 8 p.m. Before going to sleep, they answered the question: “How stressful have you felt today?” |
Kim et al. [82] | Pre-study: In order to divide the subjects into two groups they filled out a simplified SRI. | |
Sano and Picard [92] | Pre-study: Participants filled in a PSS, PSQI, and BFI. During study: Morning/evening EMAs on sleep, mood, stress level, health, and so forth. Post-study: Participants filled in questionnaires on health, mood, and stress. | |
Adams et al. [94] | Pre-study: Participants completed a PANAS, PSS, and a measure of mindfulness. During study: Self-reports approximately every 30 min (with small random variations). Participants reported on momentary stress and affect. Additional reports and a small free-text field were available too. Post-study: Semi-structured interview at the end of data collection. |
Hovsepian et al. [95] | During study: EMAs randomly scheduled approximately 15 times. During each EMA subjects filled in a shortened version of the PSS containing 6 items. | |
Gjoreski et al. [13] | During study: Subjects replied to 4 to 6 randomly scheduled EMAs. During each EMA subjects reported on their current stress level. | |
Schmidt et al. [64] | Pre-study: PSS and PSQI. During study: EMAs were scheduled every 2 h (with small random variations) during the wake time of the subjects. EMAs included valence+arousal SAM, basic emotions, stress level, shortened STAI, and PAM. |
Mood | Wang et al. [9] | Pre-study: Subjects filled in a number of behavioural and health surveys. During study: Every participant filled in 8 EMAs every day. The EMAs included measures of mood, health, stress, and other affective states. Post-study: Interviews and the same set of behavioural and health surveys were administered.
Sano et al. [122] | Pre-study: Subjects filled in the BFI, PSQI, and Morningness-Eveningness [133] questionnaires. During study: Similar to Sano and Picard [92], subjects filled in EMAs in the morning and evening, reporting on activities, sleep, social interaction, health, mood, stress level, and tiredness. Post-study: Subjects filled in a PSS, STAI, and other questionnaires related to physical and mental health. |
Zenonos et al. [29] | During study: EMAs were scheduled every two hours. For the EMAs an app was used, containing sliders from 0–100 for 8 moods. Additionally, a free text field was provided.
Name | Labels | Pop. | Sub. | Loc. | Included Modalities | |
---|---|---|---|---|---|---|
Emotion (E) | Eight-Emotion [12] | Neutral, anger, hate, grief, joy, platonic love, romantic love, reverence | GS | 1 | L | ECG, EDA, EMG, RESP
DEAP [35] | Continuous scale of valence, arousal, liking, dominance, Discrete scale of familiarity | 26.9 | 32 | L | ECG, EDA, EEG, EMG, EOG, RESP, TEMP, face video (not all subjects) | |
MAHNOB-HCI [31] | Discrete scale of valence, arousal, dominance, predictability, Emotional keywords | | 27 | L | ECG, EDA, EEG, RESP, TEMP, face and body video, eye gaze tracker, audio |
DECAF [37] | Discrete scale of valence, arousal, dominance | | 30 | L | ECG, EMG, EOG, MEG, near-infrared face video |
ASCERTAIN [39] | Discrete scale of valence, arousal, liking, engagement, familiarity, Big Five | 30 | 58 | L | ECG, EDA, EEG, facial activity data (facial landmark trajectories) | |
USI_Laughs [66] | Detect and distinguish laughter from other events | | 34 | L | ACC, EDA, PPG, TEMP |
Stress (S) | Driver [63] | Stress levels: low, medium, high | - | 24 | FC | ECG, EDA, EMG, RESP |
Non-EEG [76] | Four types of stress (physical, emotional, cognitive, none) | CS | 20 | L | ACC, EDA, HR, TEMP, SpO2 | |
Distracted Driving [134] | Driving being subject to no, emotional, cognitive, and sensorimotor distraction | Elder + Young | 68 | L | EDA, heart and respiration rate, facial expressions, eye tracking | |
StudentLife [9] | Sleep, activity, sociability, mental well-being, stress, academic performance | CS + GS | 48 | F | ACC, audio, context, GPS, smartphone usage | |
E+S | WESAD [64] | Three affective states: neutral, amusement, stress | 15 | L | chest: ACC, ECG, EDA, EMG, RESP, TEMP; wrist: ACC, EDA, PPG, TEMP |
| Modality | Features |
|---|---|
| ACC | Time-domain: statistical features (e.g., mean, median, standard deviation, absolute integral, correlation between axes), first and second derivative of acceleration energy. Frequency-domain: power ratio (0–2.75 Hz and 0–5 Hz bands), peak frequency, entropy of the normalised power spectral density. References: [45,137,153,154] |
| ECG/PPG | Time-domain: statistical features (e.g., mean, median, 20th and 80th percentile), HR, HRV, statistical features on HRV (e.g., Root Mean Square of Successive Differences (RMSSD), Standard Deviation of the RR Intervals (SDNN)), number and percentage of successive RR intervals differing by more than 20 ms (NN20, pNN20) or 50 ms (NN50, pNN50), pNN50/pNN20 ratio. Frequency-domain: ultra low (ULF, 0–0.003 Hz), very low (VLF, 0.003–0.03 Hz), low (LF, 0.03–0.15 Hz), and high (HF, 0.15–0.4 Hz) frequency bands of HRV, normalised LF and HF, LF/HF ratio. Non-linear: Lyapunov exponent, standard deviations (SD1 and SD2) from the Poincaré plot, SD1/SD2 ratio, sample entropy. Geometrical: triangular interpolation index. Multimodal: respiratory sinus arrhythmia, motion-compensated HR, respiration-based HRV decomposition. References: [56,63,89,95,96,155] |
| EDA | Time-domain: statistical features (mean, standard deviation, min, max, slope, average rising time, mean of derivative, etc.). Frequency-domain: spectral power in ten bands within 0–2.4 Hz. SCL features: statistical features, degree of linearity. SCR features: number of identified SCR segments, sum of SCR startle magnitudes and response durations, area under the identified SCRs. References: [56,63,147,148,149,156] |
| EMG | Time-domain: statistical features, number of myoresponses. Frequency-domain: mean and median frequency, energy. References: [34,35,69] |
| RESP | Time-domain: statistical features (e.g., mean, median, 80th percentile) applied to inhalation (I) and exhalation (E) duration, I/E ratio, stretch, volume of air inhaled/exhaled. Frequency-domain: breathing rate, mean power values of four subbands (0–0.1 Hz, 0.1–0.2 Hz, 0.2–0.3 Hz, and 0.3–0.4 Hz). Multimodal: RSA. References: [34,46,95,157,158] |
| TEMP | Time-domain: statistical features (e.g., mean, slope), intersection of the y-axis with a linear regression applied to the signal. References: [13,113] |
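Several of the ECG/PPG time-domain features listed above (SDNN, RMSSD, NN50/pNN50, mean HR) can be computed directly from a sequence of RR intervals. The following is a minimal sketch in plain Python; the function name and the millisecond-valued input are illustrative assumptions, not an implementation from the reviewed works:

```python
import math
from statistics import mean, stdev

def hrv_time_domain(rr_ms):
    """Common time-domain HRV features from a list of RR intervals (ms)."""
    diffs = [b - a for a, b in zip(rr_ms, rr_ms[1:])]  # successive differences
    sdnn = stdev(rr_ms)                                # SDNN: std. dev. of RR intervals
    rmssd = math.sqrt(mean(d * d for d in diffs))      # RMSSD
    nn50 = sum(1 for d in diffs if abs(d) > 50)        # successive diffs > 50 ms
    pnn50 = 100.0 * nn50 / len(diffs)                  # pNN50 in percent
    hr = 60000.0 / mean(rr_ms)                         # mean heart rate in bpm
    return {"HR": hr, "SDNN": sdnn, "RMSSD": rmssd, "NN50": nn50, "pNN50": pnn50}
```

The frequency-domain features (ULF/VLF/LF/HF power, LF/HF ratio) additionally require resampling the RR series and estimating its power spectral density, which is omitted here.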
| Year | Author | Algorithm | Classes | Set. | Sub. | Val. | Accuracy |
|---|---|---|---|---|---|---|---|
| <2005 | Picard et al. [12] | kNN | 8 | L | 1 | LOO | 81% |
| | Haag et al. [77] | NN | contin. | L | 1 | 3-fold split | AR: <96%, VA: <90% |
| | Lisetti and Nasoz [70] | kNN, LDA, NN | 6 | L | 14 | LOO | 72%; 75%; 84% |
| 2005 | Liu et al. [78] | BN, kNN, RT, SVM | 5 | L | 15 | LOO | 74%; 75%; 84%; 85% |
| | Wagner et al. [79] | kNN, LDF, NN | 4 | L | 1 | LOO | 81%; 80%; 81% |
| | Healey and Picard [63] | LDF | 3 | FC | 24 | LOO | 97% |
| 2007 | Leon et al. [80] | NN | 3 | L | 8+1 | LOSO | 71% |
| 2008 | Zhai and Barreto [81] | DT, NB, SVM | Bin. | L | 32 | 20-fold CV | 88%; 79%; 90% |
| | Kim et al. [82] | LR | Bin. | FC | 53 | 5-fold CV | ∼63% |
| | Kim and André [34] | LDA | 4 | L | 3 | LOO | sub. dependent/independent: 95%/70% |
| | Katsis et al. [83] | SVM | 4 | L | 10 | 10-fold CV | 79% |
| 2009 | Calvo et al. [84] | BN, FT, LR, NB, NN, SVM | 8 | L | 3 | 10-fold CV | one subject: 37–98%, all subjects: 23–71% |
| | Chanel et al. [85] | LDA, QDA, SVM | 3/Bin. | L | 10 | LOSO | <50%; <47%; <50%, Bin. <70% |
| | Khalili and Moradi [86] | QDA | 3 | L | 5 | LOO | 66.66% |
| 2010 | Healey et al. [87] | AB, DT, BN, NB | Bin. | F | 19 | 10-fold CV | None 2 |
| 2011 | Plarre et al. [46] | AB, DT, SVM/HMM | Bin. | L/F | 21/17 | 10-fold CV | 82%; 88%; 88%/0.71 3 |
| | Hernandez et al. [88] | SVM | Bin. | F | 9 | LOSO | 73% |
| 2012 | Valenza et al. [89] | QDA | 5 | L | 35 | 40-fold CV | >90% |
| | Hamdi et al. [90] | ANOVA | 6 | L | 16 | - | None 4 |
| | Agrafioti et al. [91] | LDA | Bin. | L | 31 | LOO | Active/Pas. AR: 78%/52%, Positive/Neg. VA: <62% |
| | Koelstra et al. [35] | NB | Bin. | L | 32 | LOO | AR/VA/LI: 57%/63%/59% |
| | Soleymani et al. [31] | SVM | 3 | L | 27 | LOSO | VA: 46%, AR: 46% |
| 2013 | Sano and Picard [92] | kNN, SVM | Bin. | F | 18 | 10-fold CV | <88% |
| | Martinez et al. [93] | CNN | 4 1 | L | 36 | 3-fold CV | learned features: <75%, hand-crafted: <69% |
| 2014 | Valenza et al. [36] | SVM | Bin. | L | 30 | LOO | VA: 79%, AR: 84% |
| | Adams et al. [94] | GMM | Bin. | F | 7 | - | 74% |
| 2015 | Hovsepian et al. [95] | SVM/BN | Bin. | L/F | 26/20 | LOSO | 92%/>40% |
| | Abadi et al. [37] | NB, SVM | Bin. | L | 30 | LOTO | VA/AR/DO: 50–60% |
| 2016 | Rubin et al. [96] | DT, GB, kNN, LR, PA, RF, RR, SVM | Bin. | F | 10 | 10-fold CV | Bin. panic: 73–97%, Bin. pre-panic: 71–91% |
| | Jaques et al. [97] | LR, NN, SVM | Bin. | F | 30 | 5-fold CV | <76%; <86%; <88% |
| | Rathod et al. [98] | Rule-based | 6 | L | 6 | - | <87% |
| | Zenonos et al. [29] | DT, kNN, RF | 5 | F | 4 | LOSO | 58%; 57%; 62% |
| | Zhu et al. [99] | RR | 1 | F | 18 | LOSO | 5 |
| | Birjandtalab et al. [76] | GMM | 4 | L | 20 | - | <85% |
| 2017 | Gjoreski et al. [13] | AB, BN, DT, kNN, RF, SVM | 3/Bin. | L/F | 21/5 | LOSO | <73%/<90% |
| | Mozos et al. [45] | AB, kNN, SVM | Bin. | L | 18 | CV | 94%; 93%; 87% |
| | Taylor et al. [100] | Single/Multitask LR, NN, SVM | Bin. | F | 104 | Cust. 6 | Mood: <78%, Stress/Health: <82% |
| | Girardi et al. [101] | DT, NB, SVM | Bin. | L | 19 | LOSO | |
| 2018 | Schmidt et al. [64] | AB, DT, kNN, LDA, RF | 3/Bin. | L | 15 | LOSO | <80%/<93% |
| | Zhao et al. [102] | NB, NN, RF, SVM | 4/Bin. | L | 15 | LOSO | 76% |
| | Marín-Morales et al. [103] | SVM | Bin. | L | 60 | LOSO | VA: <75%, AR: <82% |
| | Santamaria-Granados et al. [104] | CNN | Bin. | L | 40 | - | VA: 75%, AR: 71% |
| 2019 | Heinisch et al. [67] | DT, kNN, RF | 3 | L | 18 | LOSO | <67% |
| | Hassan et al. [105] | DBN+SVM | 5 | L | 32 | 10-fold CV | 89.53% (uses DEAP) |
| | Kanjo et al. [75] | CNN+LSTM | 5 | FC | 34 | User 7 | <95% |
| | Di Lascio et al. [66] | LR, RF, SVM | Bin. | L | 34 | LOSO | <81% |
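Many of the subject-independent results in the Val. column use leave-one-subject-out (LOSO) validation: each fold holds out all samples of one subject for testing and trains on the remaining subjects, so the model is never tested on a person it has seen during training. A minimal sketch of this splitting scheme, assuming one subject label per sample (function name is illustrative):

```python
def loso_splits(subject_ids):
    """Yield (held-out subject, train indices, test indices) for each
    Leave-One-Subject-Out fold, given one subject label per sample."""
    for held_out in sorted(set(subject_ids)):
        train = [i for i, s in enumerate(subject_ids) if s != held_out]
        test = [i for i, s in enumerate(subject_ids) if s == held_out]
        yield held_out, train, test
```

In contrast, record-wise k-fold CV can place samples from the same subject in both train and test sets, letting the model exploit subject-specific physiological patterns; this is one reason k-fold scores in the table are often higher than LOSO scores.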
© 2019 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (http://creativecommons.org/licenses/by/4.0/).
Schmidt, P.; Reiss, A.; Dürichen, R.; Laerhoven, K.V. Wearable-Based Affect Recognition—A Review. Sensors 2019, 19, 4079. https://doi.org/10.3390/s19194079