Universal earphones: Earphones with automatic side and shared use detection

Kohei Matsumura, Daisuke Sakamoto, Masahiko Inami, Takeo Igarashi

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

5 Citations (Scopus)

Abstract

We present universal earphones that use both a proximity sensor and a skin conductance sensor, and we demonstrate several implicit interaction techniques they enable by automatically detecting the context of use. The universal earphones have two main features. The first is detecting which ear, left or right, each earphone is inserted into, so that the correct audio channel is delivered to each ear; the second is detecting shared use of the earphones, in which case mixed stereo sound is delivered to both earphones. These features not only free users from having to check the left and right sides of the earphones but also let them enjoy sharing stereo audio with other people.
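The abstract describes a two-stage decision: use skin conductance to decide whether one person or two people are wearing the earphones, then either mix both channels for shared use or assign left/right channels from the side-detection reading. A minimal sketch of that routing logic is below; the sensor values, thresholds, and function name are illustrative assumptions, not the authors' actual implementation.

```python
# Hypothetical sketch of the context-detection routing described in the
# abstract. All thresholds and sensor semantics are assumptions.

def route_audio(prox_a: float, prox_b: float, conductance_ab: float,
                *, same_body_threshold: float = 1.0) -> dict:
    """Decide channel routing for two earbuds A and B.

    prox_a / prox_b -- proximity readings used for left/right detection
                       (assumed: higher reading means the left ear)
    conductance_ab  -- skin conductance measured between the two buds;
                       a weak path suggests two different wearers
    """
    shared = conductance_ab < same_body_threshold
    if shared:
        # Shared use: send the same mixed (L+R) signal to both earbuds.
        return {"A": "L+R", "B": "L+R"}
    # Single wearer: assign stereo channels from the side detection.
    if prox_a > prox_b:
        return {"A": "L", "B": "R"}
    return {"A": "R", "B": "L"}
```

In this sketch the shared-use test runs first, since channel assignment is only meaningful when a single wearer has both earbuds.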

Original language: English
Title of host publication: IUI'12 - Proceedings of the 17th International Conference on Intelligent User Interfaces
Pages: 305-306
Number of pages: 2
DOIs: https://doi.org/10.1145/2166966.2167025
Publication status: Published - 2012 Apr 26
Event: 2012 17th ACM International Conference on Intelligent User Interfaces, IUI'12 - Lisbon, Portugal
Duration: 2012 Feb 14 – 2012 Feb 17

Publication series

Name: International Conference on Intelligent User Interfaces, Proceedings IUI

Other

Other: 2012 17th ACM International Conference on Intelligent User Interfaces, IUI'12
Country: Portugal
City: Lisbon
Period: 12/2/14 – 12/2/17

Keywords

  • Earphones
  • Implicit interaction
  • Intelligent interface

ASJC Scopus subject areas

  • Software
  • Human-Computer Interaction


Cite this

Matsumura, K., Sakamoto, D., Inami, M., & Igarashi, T. (2012). Universal earphones: Earphones with automatic side and shared use detection. In IUI'12 - Proceedings of the 17th International Conference on Intelligent User Interfaces (pp. 305-306). (International Conference on Intelligent User Interfaces, Proceedings IUI). https://doi.org/10.1145/2166966.2167025