Multi-touch steering wheel for in-car tertiary applications using infrared sensors

Shunsuke Koyama, Yuta Sugiura, Masa Ogata, Anusha Withana, Yuji Uema, Makoto Honda, Sayaka Yoshizu, Chihiro Sannomiya, Kazunari Nawa, Masahiko Inami

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

6 Citations (Scopus)

Abstract

This paper proposes a multi-touch steering wheel for in-car tertiary applications. Existing interfaces for in-car applications, such as buttons and touch displays, have several operating problems. For example, drivers have to consciously move their hands to the interfaces because the interfaces are fixed in specific positions. Therefore, we developed a steering wheel on which any touch position can serve as an operating position. The system can recognize hand gestures at any position on the steering wheel by utilizing 120 infrared (IR) sensors embedded in it. The sensors are arranged in an array surrounding the whole wheel. A Support Vector Machine (SVM) algorithm is used to learn and recognize the different gestures from the data obtained from the sensors. The recognized gestures are flick, click, tap, stroke, and twist. Additionally, we implemented a navigation application and an audio application that utilize the torus shape of the steering wheel. We conducted an experiment to evaluate the ability of the proposed system to recognize flick gestures at three positions. Results show that an average of 92% of flick gestures were recognized.
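The gesture pipeline the abstract describes (an SVM classifying touch gestures from a ring of IR sensors) can be sketched roughly as follows. This is a hypothetical illustration, not the authors' code: the sensor count (120) and the five gesture classes come from the abstract, while the synthetic data, the class-dependent touch footprints, and the sorted-profile feature are assumptions made for the sketch.

```python
# Hypothetical sketch: SVM gesture classification over a 120-sensor IR ring.
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.svm import SVC

N_SENSORS = 120
GESTURES = ["flick", "click", "tap", "stroke", "twist"]

rng = np.random.default_rng(seed=0)

def synthetic_frame(label):
    """Fake one frame of IR readings: baseline noise plus a touch bump."""
    x = rng.normal(0.1, 0.02, N_SENSORS)       # ambient reflectance noise
    center = int(rng.integers(N_SENSORS))      # touch position on the ring
    width = 2 + GESTURES.index(label)          # assumed per-class footprint
    for offset in range(-width, width + 1):
        x[(center + offset) % N_SENSORS] += 0.8 / (1 + abs(offset))
    # Sorting the readings makes the feature independent of where on the
    # wheel the touch occurred, mirroring the paper's position-independent
    # recognition goal (an assumed encoding, not the authors' features).
    return np.sort(x)[::-1]

# 40 synthetic frames per gesture class.
X = np.array([synthetic_frame(g) for g in GESTURES for _ in range(40)])
y = np.array([g for g in GESTURES for _ in range(40)])

X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.25, stratify=y, random_state=0)
clf = SVC(kernel="rbf", gamma="scale").fit(X_train, y_train)
print(f"held-out accuracy: {clf.score(X_test, y_test):.2f}")
```

An RBF-kernel SVM is a natural fit here because each frame is a fixed-length, moderate-dimensional vector and the class boundaries (touch footprints of different shapes) are nonlinear; in a real system one would train on recorded sensor traces rather than single synthetic frames.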

Original language: English
Title of host publication: ACM International Conference Proceeding Series
Publisher: Association for Computing Machinery
ISBN (Print): 9781450327619
DOI: 10.1145/2582051.2582056
Publication status: Published - 2014
Externally published: Yes
Event: 5th Augmented Human International Conference, AH 2014 - Kobe, Japan
Duration: 2014 Mar 7 – 2014 Mar 8



Keywords

  • Automobile
  • Gesture Recognition
  • Infrared Sensor
  • Interaction Design
  • Multi-touch
  • Torus Interface

ASJC Scopus subject areas

  • Human-Computer Interaction
  • Computer Networks and Communications
  • Computer Vision and Pattern Recognition
  • Software

Cite this

Koyama, S., Sugiura, Y., Ogata, M., Withana, A., Uema, Y., Honda, M., ... Inami, M. (2014). Multi-touch steering wheel for in-car tertiary applications using infrared sensors. In ACM International Conference Proceeding Series [a5]. Association for Computing Machinery. https://doi.org/10.1145/2582051.2582056

@inproceedings{e9460425ded8425287e172815a778fd6,
title = "Multi-touch steering wheel for in-car tertiary applications using infrared sensors",
abstract = "This paper proposes a multi-touch steering wheel for in-car tertiary applications. Existing interfaces for in-car applications, such as buttons and touch displays, have several operating problems. For example, drivers have to consciously move their hands to the interfaces because the interfaces are fixed in specific positions. Therefore, we developed a steering wheel on which any touch position can serve as an operating position. The system can recognize hand gestures at any position on the steering wheel by utilizing 120 infrared (IR) sensors embedded in it. The sensors are arranged in an array surrounding the whole wheel. A Support Vector Machine (SVM) algorithm is used to learn and recognize the different gestures from the data obtained from the sensors. The recognized gestures are flick, click, tap, stroke, and twist. Additionally, we implemented a navigation application and an audio application that utilize the torus shape of the steering wheel. We conducted an experiment to evaluate the ability of the proposed system to recognize flick gestures at three positions. Results show that an average of 92{\%} of flick gestures were recognized.",
keywords = "Automobile, Gesture Recognition, Infrared Sensor, Interaction Design, Multi-touch, Torus Interface",
author = "Shunsuke Koyama and Yuta Sugiura and Masa Ogata and Anusha Withana and Yuji Uema and Makoto Honda and Sayaka Yoshizu and Chihiro Sannomiya and Kazunari Nawa and Masahiko Inami",
year = "2014",
doi = "10.1145/2582051.2582056",
language = "English",
isbn = "9781450327619",
booktitle = "ACM International Conference Proceeding Series",
publisher = "Association for Computing Machinery",

}

TY - GEN

T1 - Multi-touch steering wheel for in-car tertiary applications using infrared sensors

AU - Koyama, Shunsuke

AU - Sugiura, Yuta

AU - Ogata, Masa

AU - Withana, Anusha

AU - Uema, Yuji

AU - Honda, Makoto

AU - Yoshizu, Sayaka

AU - Sannomiya, Chihiro

AU - Nawa, Kazunari

AU - Inami, Masahiko

PY - 2014

Y1 - 2014

N2 - This paper proposes a multi-touch steering wheel for in-car tertiary applications. Existing interfaces for in-car applications, such as buttons and touch displays, have several operating problems. For example, drivers have to consciously move their hands to the interfaces because the interfaces are fixed in specific positions. Therefore, we developed a steering wheel on which any touch position can serve as an operating position. The system can recognize hand gestures at any position on the steering wheel by utilizing 120 infrared (IR) sensors embedded in it. The sensors are arranged in an array surrounding the whole wheel. A Support Vector Machine (SVM) algorithm is used to learn and recognize the different gestures from the data obtained from the sensors. The recognized gestures are flick, click, tap, stroke, and twist. Additionally, we implemented a navigation application and an audio application that utilize the torus shape of the steering wheel. We conducted an experiment to evaluate the ability of the proposed system to recognize flick gestures at three positions. Results show that an average of 92% of flick gestures were recognized.

AB - This paper proposes a multi-touch steering wheel for in-car tertiary applications. Existing interfaces for in-car applications, such as buttons and touch displays, have several operating problems. For example, drivers have to consciously move their hands to the interfaces because the interfaces are fixed in specific positions. Therefore, we developed a steering wheel on which any touch position can serve as an operating position. The system can recognize hand gestures at any position on the steering wheel by utilizing 120 infrared (IR) sensors embedded in it. The sensors are arranged in an array surrounding the whole wheel. A Support Vector Machine (SVM) algorithm is used to learn and recognize the different gestures from the data obtained from the sensors. The recognized gestures are flick, click, tap, stroke, and twist. Additionally, we implemented a navigation application and an audio application that utilize the torus shape of the steering wheel. We conducted an experiment to evaluate the ability of the proposed system to recognize flick gestures at three positions. Results show that an average of 92% of flick gestures were recognized.

KW - Automobile

KW - Gesture Recognition

KW - Infrared Sensor

KW - Interaction Design

KW - Multi-touch

KW - Torus Interface

UR - http://www.scopus.com/inward/record.url?scp=84899785830&partnerID=8YFLogxK

UR - http://www.scopus.com/inward/citedby.url?scp=84899785830&partnerID=8YFLogxK

U2 - 10.1145/2582051.2582056

DO - 10.1145/2582051.2582056

M3 - Conference contribution

AN - SCOPUS:84899785830

SN - 9781450327619

BT - ACM International Conference Proceeding Series

PB - Association for Computing Machinery

ER -