Grasping Point Estimation Based on Stored Motion and Depth Data in Motion Reproduction System

Xiaobai Sun, Takahiro Nozaki, Toshiyuki Murakami, Kouhei Ohnishi

Research output: Conference contribution

Abstract

Most countries are facing a labor shortage due to population aging and declining birthrates, and robot manipulators are expected to take over human work. However, it is still difficult for manipulators to perform simple tasks such as fruit harvesting, food cooking, or toy assembly. One problem for robotic automation arises from the difficulty of teaching manipulators how much force to apply when executing a task. The motion reproduction system, which uses bilateral control to store motion data, is one method for teaching manipulators motions that include both position and force. Its drawback is that reproduction fails if the environment changes between the motion-saving phase and the motion-reproducing phase; a motion reproduction system that can understand and adapt to the environment is therefore required. Vision sensors can sense the environment, but computer vision research mainly focuses on object classification, and vision information is seldom combined with motion control, especially force control. Therefore, we propose a motion reproduction system in which the reproduced motion is selected from several stored motions based on collected depth data. A convolutional neural network (CNN) is used to estimate a motion command from a depth image, and the saved force data is used to generate labels for training. This label-generation procedure differs from conventional machine learning algorithms.
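As a rough illustration of the approach described in the abstract, the following is a minimal sketch (not the authors' implementation) of how a CNN could map a depth image to one of several stored motion commands, with training labels derived from saved force data. The network shape, the number of candidate motions, the force-thresholding rule for label generation, and all names are assumptions made for this example.

```python
# Hypothetical sketch: CNN that selects one of several stored motion commands
# from a depth image; labels come from thresholding stored force profiles.
import torch
import torch.nn as nn

NUM_MOTIONS = 3   # assumed number of stored candidate motions
IMG_SIZE = 64     # assumed depth-image resolution (1 channel, 64x64)


class DepthToMotionCNN(nn.Module):
    """Small CNN classifier over single-channel depth images."""

    def __init__(self, num_motions: int = NUM_MOTIONS):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(1, 16, kernel_size=3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            nn.Conv2d(16, 32, kernel_size=3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
        )
        # Two 2x poolings reduce each spatial dimension by 4.
        self.classifier = nn.Linear(32 * (IMG_SIZE // 4) ** 2, num_motions)

    def forward(self, depth_image: torch.Tensor) -> torch.Tensor:
        x = self.features(depth_image)
        return self.classifier(x.flatten(1))


def label_from_force(force_profiles: torch.Tensor, grasp_threshold: float = 1.0) -> torch.Tensor:
    # Assumed labeling rule standing in for the paper's label generation:
    # for each trial, pick the stored motion whose peak reaction force
    # exceeds the threshold (i.e. the grasp that actually made contact).
    peaks = force_profiles.abs().max(dim=-1).values        # (trials, motions)
    return (peaks > grasp_threshold).float().argmax(dim=-1)


if __name__ == "__main__":
    # Synthetic stand-in data; real use would load recorded depth images and forces.
    depth = torch.rand(8, 1, IMG_SIZE, IMG_SIZE)            # depth images
    forces = torch.randn(8, NUM_MOTIONS, 100)               # force time series per motion
    labels = label_from_force(forces)

    model = DepthToMotionCNN()
    optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)
    criterion = nn.CrossEntropyLoss()

    logits = model(depth)                                    # one training step
    loss = criterion(logits, labels)
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()
    print("selected motion commands:", logits.argmax(dim=-1).tolist())
```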

Original language: English
Host publication title: Proceedings - 2019 IEEE International Conference on Mechatronics, ICM 2019
Publisher: Institute of Electrical and Electronics Engineers Inc.
Pages: 471-476
Number of pages: 6
ISBN (Electronic): 9781538669594
DOI: 10.1109/ICMECH.2019.8722836
Publication status: Published - 24 May 2019
Event: 2019 IEEE International Conference on Mechatronics, ICM 2019 - Ilmenau, Germany
Duration: 18 Mar 2019 to 20 Mar 2019

Publication series

Name: Proceedings - 2019 IEEE International Conference on Mechatronics, ICM 2019

Conference

Conference: 2019 IEEE International Conference on Mechatronics, ICM 2019
Country: Germany
City: Ilmenau
Period: 18 Mar 2019 to 20 Mar 2019

Fingerprint

Data Depth
Point Estimation
Grasping
Manipulators
Motion
Labels
Cooking
Motion control
Fruits
Manipulator
Computer vision
Learning systems
Teaching
Robotics
Automation
Aging of materials
Robots
Neural networks
Sensors
Robot Manipulator

ASJC Scopus subject areas

  • Artificial Intelligence
  • Human-Computer Interaction
  • Automotive Engineering
  • Mechanical Engineering
  • Control and Optimization
  • Industrial and Manufacturing Engineering

Cite this

Sun, X., Nozaki, T., Murakami, T., & Ohnishi, K. (2019). Grasping Point Estimation Based on Stored Motion and Depth Data in Motion Reproduction System. In Proceedings - 2019 IEEE International Conference on Mechatronics, ICM 2019 (pp. 471-476). [8722836] (Proceedings - 2019 IEEE International Conference on Mechatronics, ICM 2019). Institute of Electrical and Electronics Engineers Inc. https://doi.org/10.1109/ICMECH.2019.8722836

Keywords: bilateral control, image processing, motion control, motion reproduction