Robots operating in human environments have long been studied in the field of motion control. Next-generation robots will require human–robot interaction technologies; in particular, learning and displaying human haptic motion are important. The authors have therefore proposed a method for abstracting haptic motion and designed a haptic motion display system. The motion abstraction method divides a measured motion into individual actions from the viewpoint of force and position. Action modes are then defined to express each divided action, representing either force directionality or position directionality. Using the proposed motion abstraction method, various kinds of human motion are abstracted as action modes, and the designed haptic motion display system reproduces them. This paper further divides action modes into human action modes and environmental action modes: human action modes express the directionality of human action force, while environmental action modes express the directionality of the environmental position response. The haptic motion display system is then redesigned on the basis of these human and environmental action modes. The validity of the proposed method is confirmed by experimental results.
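The abstract does not give the authors' segmentation or classification rules, but the idea of labeling a measured action by its force or position directionality can be sketched roughly as follows. This is a minimal illustration under assumed thresholds and signal shapes, not the paper's algorithm; the function name, thresholds, and the peak-to-peak variation measure are all assumptions for illustration.

```python
def classify_action_mode(forces, positions, force_thresh=1.0, pos_thresh=0.05):
    """Label a motion segment by which signal is active (illustrative only).

    forces: sampled contact-force magnitudes [N] over the segment
    positions: sampled positions [m] over the same segment
    """
    # Peak-to-peak variation of each signal within the segment.
    f_var = max(forces) - min(forces)
    p_var = max(positions) - min(positions)

    force_active = f_var > force_thresh      # force changes significantly
    pos_active = p_var > pos_thresh          # position changes significantly

    if force_active and not pos_active:
        return "force-directional"           # e.g. pushing against a stiff wall
    if pos_active and not force_active:
        return "position-directional"        # e.g. free motion through space
    if force_active and pos_active:
        return "mixed"
    return "idle"

# A pushing action: force varies strongly while position barely changes.
push = classify_action_mode(forces=[0.1, 5.0, 10.0, 9.5],
                            positions=[0.00, 0.01, 0.01, 0.01])
# A free-space reach: position changes while contact force stays near zero.
reach = classify_action_mode(forces=[0.02, 0.02, 0.03, 0.02],
                             positions=[0.0, 0.1, 0.3, 0.5])
print(push, reach)  # → force-directional position-directional
```

In this toy scheme the pushing segment would map to a human action mode (force directionality) and the reach to an environmental action mode (position directionality); the actual decomposition in the paper may differ.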
ASJC Scopus subject areas
- Control and Systems Engineering
- Electrical and Electronic Engineering