TY - GEN
T1 - Masktrap: Designing and Identifying Gestures to Transform Mask Strings into Interactive Wearables
T2 - 28th International Conference on Intelligent User Interfaces, IUI 2023
AU - Yamamoto, Takumi
AU - Masai, Katsutoshi
AU - Withana, Anusha
AU - Sugiura, Yuta
N1 - Funding Information:
Parts of this work were supported by JST PRESTO (grant number JPMJPR2134). This project was also partially funded by the Australian Research Council Discovery Early Career Researcher Award (DECRA) DE200100479. Dr. Withana is the recipient of a DECRA funded by the Australian Government. We wish to thank Kana Matsuo and Yohei Kawasaki for their instruction on the machine learning part, Ryota Matsui for the statistics part, and all participants of our user study.
Publisher Copyright:
© 2023 Owner/Author.
PY - 2023/3/27
Y1 - 2023/3/27
N2 - Embedding technology into day-to-day wearables and creating smart devices such as smartwatches and smart glasses has been a growing area of interest. In this paper, we explore the interaction around face masks, a common accessory worn by many to prevent the spread of infectious diseases. In particular, we propose a method of using the straps of a face mask as an input medium. We identified a set of plausible gestures on mask straps through an elicitation study (N = 20), in which the participants proposed different gestures for a given referent. We then developed a prototype that identifies the gestures performed on the mask straps and present its recognition accuracy from a user study with eight participants. Our results show that the system achieves 93.07% classification accuracy for 12 gestures.
AB - Embedding technology into day-to-day wearables and creating smart devices such as smartwatches and smart glasses has been a growing area of interest. In this paper, we explore the interaction around face masks, a common accessory worn by many to prevent the spread of infectious diseases. In particular, we propose a method of using the straps of a face mask as an input medium. We identified a set of plausible gestures on mask straps through an elicitation study (N = 20), in which the participants proposed different gestures for a given referent. We then developed a prototype that identifies the gestures performed on the mask straps and present its recognition accuracy from a user study with eight participants. Our results show that the system achieves 93.07% classification accuracy for 12 gestures.
KW - Mask
KW - Strings
KW - User Elicitation Study
KW - Wearable Device
UR - http://www.scopus.com/inward/record.url?scp=85152131373&partnerID=8YFLogxK
UR - http://www.scopus.com/inward/citedby.url?scp=85152131373&partnerID=8YFLogxK
U2 - 10.1145/3581641.3584062
DO - 10.1145/3581641.3584062
M3 - Conference contribution
AN - SCOPUS:85152131373
T3 - International Conference on Intelligent User Interfaces, Proceedings IUI
SP - 762
EP - 775
BT - IUI 2023 - Proceedings of the 28th International Conference on Intelligent User Interfaces
PB - Association for Computing Machinery
Y2 - 27 March 2023 through 31 March 2023
ER -