Abstract
In this paper, we propose a novel object-centric human-object interaction (HOI) representation for understanding plausible tool use. Everyday tools, such as kitchen and DIY utensils, have preferred regions and directions in which they are touched by hands and surrounding objects. By analyzing human demonstrations, we accumulate the tactile history of each tool onto its surface as a "Tactile Log", which can help robot arms achieve plausible tool use. We also propose the Object-Centric Interaction (OCI) dataset for evaluating generated tactile logs; it consists of RGB-D videos of plausible tool use by humans and 3D models of the tools used in the videos. Our tactile logging achieves a precision of 0.77.
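The abstract does not specify how the tactile log is computed; the sketch below is only one plausible reading of the idea, accumulating observed contact points from demonstrations onto the nearest vertices of the tool's 3D model to form a per-vertex contact-frequency map. The function name `accumulate_tactile_log`, the nearest-vertex assignment, and the normalization are illustrative assumptions, not the authors' method.

```python
import numpy as np

def accumulate_tactile_log(vertices, contact_points):
    """Toy sketch: accumulate observed contact points onto the nearest
    mesh vertices to form a per-vertex "tactile log" (contact frequency).

    vertices:       (V, 3) array of tool-mesh vertex positions (tool frame)
    contact_points: (N, 3) array of hand/object contact positions observed
                    across human demonstrations (tool frame)
    """
    log = np.zeros(len(vertices))
    for p in contact_points:
        # Assign each observed contact to its nearest mesh vertex.
        nearest = np.argmin(np.linalg.norm(vertices - p, axis=1))
        log[nearest] += 1
    # Normalize so the log reads as a contact-frequency map over the surface.
    return log / max(log.sum(), 1)

# Example with synthetic data: a few contacts clustered near part of a dummy mesh.
verts = np.random.rand(100, 3)
contacts = verts[:5] + 0.01 * np.random.randn(5, 3)
tactile_log = accumulate_tactile_log(verts, contacts)
print(tactile_log.argsort()[-5:])  # indices of the most frequently touched vertices
```

In such a representation, the highest-valued vertices would mark the regions a robot arm should (or should not) grasp, depending on whether the log encodes hand contact or object contact.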
| Original language | English |
|---|---|
| Publication status | Published - 2019 |
| Event | 29th British Machine Vision Conference, BMVC 2018 - Newcastle, United Kingdom. Duration: 2018 Sept 3 → 2018 Sept 6 |
Conference
| Conference | 29th British Machine Vision Conference, BMVC 2018 |
|---|---|
| Country/Territory | United Kingdom |
| City | Newcastle |
| Period | 2018 Sept 3 → 2018 Sept 6 |
ASJC Scopus subject areas
- Computer Vision and Pattern Recognition