TY - GEN
T1 - Neural Implicit Event Generator for Motion Tracking
AU - Masuda, Mana
AU - Sekikawa, Yusuke
AU - Fujii, Ryo
AU - Saito, Hideo
N1 - Publisher Copyright:
© 2022 IEEE.
PY - 2022
Y1 - 2022
N2 - We present a novel framework for motion tracking from event data using an implicit representation. Our framework uses a pre-trained event-generation MLP called the implicit event generator (IEG) and carries out motion tracking by updating its state (position and velocity) based on the difference between the observed events and the events generated from the current state estimate. The difference is computed implicitly by the IEG. Unlike the conventional explicit approach, which requires dense computation to evaluate the difference, our implicit approach realizes efficient state updates directly from sparse event data. Our sparse algorithm is especially suitable for mobile robotics applications in which computational resources and battery life are limited. To verify the effectiveness of our method on real-world data, we applied it to an AR marker tracking application. We have confirmed that our framework works well in real-world environments in the presence of noise and background clutter.
AB - We present a novel framework for motion tracking from event data using an implicit representation. Our framework uses a pre-trained event-generation MLP called the implicit event generator (IEG) and carries out motion tracking by updating its state (position and velocity) based on the difference between the observed events and the events generated from the current state estimate. The difference is computed implicitly by the IEG. Unlike the conventional explicit approach, which requires dense computation to evaluate the difference, our implicit approach realizes efficient state updates directly from sparse event data. Our sparse algorithm is especially suitable for mobile robotics applications in which computational resources and battery life are limited. To verify the effectiveness of our method on real-world data, we applied it to an AR marker tracking application. We have confirmed that our framework works well in real-world environments in the presence of noise and background clutter.
UR - http://www.scopus.com/inward/record.url?scp=85136333716&partnerID=8YFLogxK
UR - http://www.scopus.com/inward/citedby.url?scp=85136333716&partnerID=8YFLogxK
U2 - 10.1109/ICRA46639.2022.9812142
DO - 10.1109/ICRA46639.2022.9812142
M3 - Conference contribution
AN - SCOPUS:85136333716
T3 - Proceedings - IEEE International Conference on Robotics and Automation
SP - 2200
EP - 2206
BT - 2022 IEEE International Conference on Robotics and Automation, ICRA 2022
PB - Institute of Electrical and Electronics Engineers Inc.
T2 - 39th IEEE International Conference on Robotics and Automation, ICRA 2022
Y2 - 23 May 2022 through 27 May 2022
ER -