Task-oriented function detection based on operational tasks

Yuchi Ishikawa, Haruya Ishikawa, Shuichi Akizuki, Masaki Yamazaki, Yasuhiro Taniguchi, Yoshimitsu Aoki

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

Abstract

We propose a novel representation for the functions of an object, namely the Task-oriented Function, which improves upon the idea of Affordance in the field of Robotics Vision. We also propose a convolutional neural network to detect task-oriented functions. This network takes an operational task as well as an RGB image as input and assigns each pixel an appropriate label for every task. Task-oriented functions make it possible to describe various ways to use an object, because the outputs of the network differ depending on the operational task. We introduce a new dataset for task-oriented function detection, which contains about 1200 RGB images and 6000 pixel-level annotations assuming five tasks. Our proposed method reached 0.80 mean IoU on our dataset.
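The abstract reports performance as mean IoU (intersection-over-union averaged over classes). A minimal sketch of that metric is below; the class count and label layout are illustrative assumptions, not details taken from the paper.

```python
# Hedged sketch: per-class IoU averaged over classes that appear in either
# the prediction or the ground truth. Toy label maps, not the authors' data.
import numpy as np

def mean_iou(pred, target, num_classes):
    """Mean intersection-over-union across classes for integer label maps."""
    ious = []
    for c in range(num_classes):
        pred_c = pred == c
        target_c = target == c
        union = np.logical_or(pred_c, target_c).sum()
        if union == 0:  # class absent in both maps; skip it
            continue
        inter = np.logical_and(pred_c, target_c).sum()
        ious.append(inter / union)
    return float(np.mean(ious))

# Toy example: 2x2 label maps with two classes.
pred = np.array([[0, 1], [0, 1]])
target = np.array([[0, 1], [1, 1]])
print(round(mean_iou(pred, target, num_classes=2), 4))  # → 0.5833
```

In the paper's setting the network would produce one such label map per operational task, and the reported 0.80 would be this score averaged over the annotated images.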

Original language: English
Title of host publication: 2019 19th International Conference on Advanced Robotics, ICAR 2019
Publisher: Institute of Electrical and Electronics Engineers Inc.
Pages: 635-640
Number of pages: 6
ISBN (Electronic): 9781728124674
DOIs
Publication status: Published - 2019 Dec
Event: 19th International Conference on Advanced Robotics, ICAR 2019 - Belo Horizonte, Brazil
Duration: 2019 Dec 2 - 2019 Dec 6

Publication series

Name: 2019 19th International Conference on Advanced Robotics, ICAR 2019

Conference

Conference: 19th International Conference on Advanced Robotics, ICAR 2019
Country/Territory: Brazil
City: Belo Horizonte
Period: 19/12/2 - 19/12/6

ASJC Scopus subject areas

  • Artificial Intelligence
  • Mechanical Engineering
  • Control and Optimization
  • Modelling and Simulation
