Experimental verification for motion control of a powered wheelchair using a gazing feature in an environment

Airi Ishizuka, Ayanori Yorozu, Masaki Takahashi

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

Abstract

This paper describes a motion control system for a powered wheelchair operated by gaze in an unknown environment. Recently, new Human-Computer Interfaces (HCIs) that replace the joystick have been developed for people with upper-body disabilities. In this paper, eye movement is used as the HCI. The proposed wheelchair control system aims to let the passenger simply gaze in the direction he or she wants to move in an unknown environment. Gazing features of the passenger in the 3D environment are acquired in real time, and the wheelchair is controlled accordingly. Environmental features such as passage entrances are detected, and the gazing feature is obtained by combining these features with the passenger's gaze point. The resulting estimate of the intended direction of travel becomes the operation input to the wheelchair, which is controlled using both this input and information about the environment. The underlying conventional motion control system achieves safe, smooth movement by avoiding obstacles. The effectiveness of the proposed system is demonstrated through experiments in a real environment with three participants.
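The pipeline the abstract describes — a 3D gaze point matched against detected passage entrances, yielding a direction that becomes the wheelchair's operation input — can be sketched roughly as follows. This is a minimal illustration under assumed conventions, not the authors' implementation; all names (`select_target`, `gaze_to_command`), the wheelchair frame (x forward, y left), and the gain values are hypothetical.

```python
import math

# Hypothetical sketch of the gaze-to-motion pipeline: the passenger's gaze
# point is matched against detected passage entrances, and the chosen
# entrance direction is turned into a (linear, angular) velocity command.

def select_target(gaze_xy, entrances):
    """Pick the passage entrance closest to the gaze point (2D, metres)."""
    return min(entrances,
               key=lambda e: math.hypot(e[0] - gaze_xy[0], e[1] - gaze_xy[1]))

def gaze_to_command(target_xy, v_max=0.5, k_turn=1.0):
    """Map a target point in the wheelchair frame (x forward, y left) to a
    (linear, angular) velocity command; the gains are illustrative only."""
    heading = math.atan2(target_xy[1], target_xy[0])   # desired heading [rad]
    v = v_max * max(0.0, math.cos(heading))            # slow down for sharp turns
    w = k_turn * heading                               # steer toward the target
    return v, w

entrances = [(2.0, -1.5), (2.5, 1.0)]   # detected passage entrances (sketch data)
target = select_target((2.3, 0.8), entrances)
v, w = gaze_to_command(target)
```

A real system would additionally fuse this command with the obstacle-avoidance controller mentioned in the abstract, rather than sending the raw gaze-derived velocity to the motors.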

Original language: English
Title of host publication: Proceedings of the 4th International Conference on Control, Mechatronics and Automation, ICCMA 2016
Publisher: Association for Computing Machinery
Pages: 147-151
Number of pages: 5
Volume: Part F126740
ISBN (Electronic): 9781450352130
DOI: 10.1145/3029610.3029614
Publication status: Published - 2016 Dec 7
Event: 4th International Conference on Control, Mechatronics and Automation, ICCMA 2016 - Barcelona, Spain
Duration: 2016 Dec 7 - 2016 Dec 11

Other

Other: 4th International Conference on Control, Mechatronics and Automation, ICCMA 2016
Country: Spain
City: Barcelona
Period: 16/12/7 - 16/12/11


Keywords

  • Control System
  • Eye Gaze Tracking
  • Human-Computer Interface
  • Powered Wheelchair

ASJC Scopus subject areas

  • Human-Computer Interaction
  • Computer Networks and Communications
  • Computer Vision and Pattern Recognition
  • Software

Cite this

Ishizuka, A., Yorozu, A., & Takahashi, M. (2016). Experimental verification for motion control of a powered wheelchair using a gazing feature in an environment. In Proceedings of the 4th International Conference on Control, Mechatronics and Automation, ICCMA 2016 (Vol. Part F126740, pp. 147-151). Association for Computing Machinery. https://doi.org/10.1145/3029610.3029614

