TY - GEN
T1 - Sentiment pen
T2 - 10th Augmented Human International Conference, AH 2019
AU - Han, Jiawen
AU - Chernyshov, George
AU - Zheng, Dingding
AU - Gao, Peizhong
AU - Narumi, Takuji
AU - Wolf, Katrin
AU - Kunze, Kai
N1 - Funding Information:
The work is supported by the Japan Science and Technology Agency CREST program under Grant No: JPMJCR16E1.
Publisher Copyright:
© 2019 Copyright held by the owner/author(s). Publication rights licensed to ACM.
PY - 2019/3/11
Y1 - 2019/3/11
N2 - In this paper, we discuss the assessment of the emotional state of the user from digitized handwriting for implicit human-computer interaction. The proposed concept exemplifies how a digital system could recognize the emotional context of the interaction. We discuss our approach to emotion recognition and the underlying neurophysiological mechanisms. To verify the viability of our approach, we conducted a series of tests in which participants were asked to perform simple writing tasks after being exposed to a series of emotionally stimulating video clips from EMDB [6], one set of four clips per quadrant of the circumplex model of emotion [28]. The user-independent Support Vector Classifier (SVC) built on the recorded data reaches up to 66% accuracy for certain types of writing tasks in 1-in-4 classification (1. High Valence, High Arousal; 2. High Valence, Low Arousal; 3. Low Valence, High Arousal; 4. Low Valence, Low Arousal). Under the same conditions, a user-dependent classifier reaches an average of 70% accuracy across all 12 study participants. While future work is required to improve the classification rate, this work should be seen as a proof of concept for emotion assessment of users while handwriting, aiming to motivate research on implicit interaction while writing and to enable emotion sensitivity in mobile and ubiquitous computing.
AB - In this paper, we discuss the assessment of the emotional state of the user from digitized handwriting for implicit human-computer interaction. The proposed concept exemplifies how a digital system could recognize the emotional context of the interaction. We discuss our approach to emotion recognition and the underlying neurophysiological mechanisms. To verify the viability of our approach, we conducted a series of tests in which participants were asked to perform simple writing tasks after being exposed to a series of emotionally stimulating video clips from EMDB [6], one set of four clips per quadrant of the circumplex model of emotion [28]. The user-independent Support Vector Classifier (SVC) built on the recorded data reaches up to 66% accuracy for certain types of writing tasks in 1-in-4 classification (1. High Valence, High Arousal; 2. High Valence, Low Arousal; 3. Low Valence, High Arousal; 4. Low Valence, Low Arousal). Under the same conditions, a user-dependent classifier reaches an average of 70% accuracy across all 12 study participants. While future work is required to improve the classification rate, this work should be seen as a proof of concept for emotion assessment of users while handwriting, aiming to motivate research on implicit interaction while writing and to enable emotion sensitivity in mobile and ubiquitous computing.
KW - Affective computing
KW - Emotional recognition
KW - Handwriting analysis
UR - http://www.scopus.com/inward/record.url?scp=85062980704&partnerID=8YFLogxK
UR - http://www.scopus.com/inward/citedby.url?scp=85062980704&partnerID=8YFLogxK
U2 - 10.1145/3311823.3311868
DO - 10.1145/3311823.3311868
M3 - Conference contribution
AN - SCOPUS:85062980704
T3 - ACM International Conference Proceeding Series
BT - Proceedings of the 10th Augmented Human International Conference, AH 2019
PB - Association for Computing Machinery
Y2 - 11 March 2019 through 12 March 2019
ER -