We aim to develop a remote collaborative Mixed Reality (MR) system targeting the interaction between instructors and workers. In this system, both the instructor and the worker have their own real objects, and the instructor shows the worker how to operate the object by moving their own real object. The movement of the instructor's object is displayed as a CG trajectory overlaid on the worker's object for each operation, so the worker can easily see how the position of the instructor's object changes and can trace the movement along the trajectory. We built this model in object coordinates rather than world coordinates so that changes in the positions of the users and other objects in world coordinates can be ignored. We implemented a prototype and used it to conduct validation experiments. The results indicated that worker operation time is shorter with the proposed model than with conventional models when each user's orientation toward the object differs.
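The object-coordinate idea above can be sketched as follows: if each object's pose in the world is a rigid transform (rotation plus translation), a trajectory point recorded in the instructor's world frame can be stored relative to the instructor's object, which makes the stored trajectory invariant to where the users or objects sit in world coordinates. This is a minimal sketch under that rigid-pose assumption; the function and variable names are hypothetical, not taken from the paper's implementation.

```python
import numpy as np

def world_to_object(p_world, obj_rotation, obj_position):
    """Express a world-space point in an object's local frame.

    obj_rotation: 3x3 rotation matrix mapping the object frame to
                  the world frame.
    obj_position: the object's origin in world coordinates.
    """
    return obj_rotation.T @ (np.asarray(p_world) - np.asarray(obj_position))

def object_to_world(p_object, obj_rotation, obj_position):
    """Map an object-local point back into a (possibly different)
    world frame, e.g. for rendering over the worker's object."""
    return obj_rotation @ np.asarray(p_object) + np.asarray(obj_position)

# Instructor's object: rotated 90 degrees about z and translated.
theta = np.pi / 2
R = np.array([[np.cos(theta), -np.sin(theta), 0.0],
              [np.sin(theta),  np.cos(theta), 0.0],
              [0.0,            0.0,           1.0]])
t = np.array([1.0, 2.0, 0.0])

# A recorded trajectory point, stored relative to the object, so the
# same local point can be re-rendered wherever the worker's object is.
p_local = world_to_object([1.0, 3.0, 0.0], R, t)   # -> [1.0, 0.0, 0.0]
```

Rendering on the worker's side would then apply `object_to_world` with the worker object's own pose, so each user's position and orientation in world coordinates drops out of the stored trajectory.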