Event Camera-Based Visual Odometry for Dynamic Motion Tracking of a Legged Robot Using Adaptive Time Surface
Shifan Zhu, Zhipeng Tang, Michael Yang, Erik Learned-Miller, Donghyun Kim
Our paper proposes a direct sparse visual odometry method that combines event and RGB-D data to estimate the pose of agile legged robots during dynamic locomotion and acrobatic behaviors. Event cameras offer high temporal resolution and high dynamic range, which eliminate the motion blur that afflicts RGB images during fast movements. This unique strength holds potential for accurate pose estimation of agi...