Visual Odometry Oscillations When Stationary with D435i on ROS 2 Humble #1418
Hello,

I'm experiencing an issue with RTAB-Map's visual odometry using an Intel RealSense D435i camera on ROS 2 Humble. When the system is stationary, the visual odometry output exhibits lateral oscillations, causing the estimated position to drift left and right. This behavior persists even when the sensor is completely still.

690_1734786483.mp4

Comments
Which topics are you using for visual odometry? With D435i, don't use the RGB camera (use left+depth instead with IR emitter disabled).
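In case it helps others who land on this issue, a minimal sketch of that left-IR + depth setup could look roughly like the block below. The emitter parameter and the topic names are assumptions on my part and vary between realsense2_camera versions, so they should be checked with `ros2 param list` and `ros2 topic list`.

```python
# Minimal sketch (not from this thread): left IR + depth odometry with the
# IR projector disabled. Topic and parameter names are assumptions and may
# differ between realsense2_camera versions.
from launch import LaunchDescription
from launch_ros.actions import Node

def generate_launch_description():
    return LaunchDescription([
        Node(
            package='realsense2_camera', executable='realsense2_camera_node',
            parameters=[{
                'enable_infra1': True,              # left IR stream
                'enable_depth': True,
                'depth_module.emitter_enabled': 0,  # turn the IR projector off
            }]),
        Node(
            package='rtabmap_odom', executable='rgbd_odometry', output='screen',
            parameters=[{'frame_id': 'camera_link'}],
            remappings=[
                # feed the left IR image in place of the RGB image
                ('rgb/image', '/camera/infra1/image_rect_raw'),
                ('rgb/camera_info', '/camera/infra1/camera_info'),
                ('depth/image', '/camera/depth/image_rect_raw'),
            ]),
    ])
```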
I'm currently using both RGB and depth images because I need the depth data enhanced by the infrared structured light for obstacle avoidance and navigation. This means I can't use the IR stream as the image source unless I toggle the infrared mode. Could using RGB and depth simultaneously on the D435i or D455 cameras cause any specific issues?

I've tried the following configuration, and the jitter seems to have been reduced. Watching the X and Y positions, the jitter now stays within approximately 0.01 cm, whereas it was around 0.05 cm when Odom/Strategy was set to 0.

'frame_id': 'head_camera_link',

However, I don't fully understand the principles behind these parameters. With the current setup, I still notice occasional jitter in areas with fewer features. Under normal conditions, should there be absolutely no jitter when the system is stationary? Any advice would be greatly appreciated. Thank you!
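For context, only one line of that configuration survived above ('frame_id': 'head_camera_link'). A hypothetical reconstruction of the kind of parameter block being described could look like the following; apart from frame_id and Odom/Strategy, every value and topic name here is an assumption.

```python
# Hypothetical reconstruction of the RGB-D odometry setup being discussed.
# Only 'frame_id' and 'Odom/Strategy' come from this thread; the topic names
# and everything else are assumptions.
from launch import LaunchDescription
from launch_ros.actions import Node

def generate_launch_description():
    return LaunchDescription([
        Node(
            package='rtabmap_odom', executable='rgbd_odometry', output='screen',
            parameters=[{
                'frame_id': 'head_camera_link',
                # RTAB-Map library parameters are passed as strings:
                # 0 = Frame-to-Map (F2M), 1 = Frame-to-Frame (F2F)
                'Odom/Strategy': '1',
            }],
            remappings=[
                ('rgb/image', '/camera/color/image_raw'),
                ('rgb/camera_info', '/camera/color/camera_info'),
                ('depth/image', '/camera/aligned_depth_to_color/image_raw'),
            ]),
    ])
```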
I think with the D455 you could already have better results than with the D435i if you need to use the RGB camera: the RGB camera on the D435i has a rolling shutter, while on the D455 it has a global shutter. The odometry will always shake based on the noise in the point cloud; in other words, the shaking you see is related to noise in the disparity. For example, if the camera is looking at far objects, it will shake more than when looking at close objects.
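To make that distance dependence concrete, here is the standard stereo triangulation relation (general stereo geometry, not something specific to this thread): with focal length $f$ in pixels, baseline $B$ and disparity $d$,

$$ Z = \frac{fB}{d} \qquad\Rightarrow\qquad \Delta Z \approx \frac{Z^{2}}{fB}\,\Delta d $$

so for a fixed disparity noise $\Delta d$ the depth error grows roughly quadratically with distance, which is why far scenes make the estimated pose jitter more than close ones, and why a larger baseline or longer focal length reduces the effect.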
A larger FOV helps to be more robust to that. VIO could help in some cases, though if you stay stationary looking at a white wall, a VIO approach would also drift over time. If you have wheel odometry, it is possible to feed rgbd_odometry with it as a motion guess (see the guess_frame_id parameter). Another approach could be to integrate IMU, wheel odometry and VO in a robot_localization EKF node.
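A rough sketch of the robot_localization option mentioned above, fusing wheel odometry, IMU and the visual odometry output in an ekf_node. The topic names (/wheel/odom, /vo, /camera/imu), frame names and sensor config masks are all assumptions for illustration, not values confirmed in this thread.

```python
# Hypothetical sketch: fusing wheel odometry, IMU and visual odometry with a
# robot_localization EKF. Topics, frames and config masks are assumptions.
from launch import LaunchDescription
from launch_ros.actions import Node

def generate_launch_description():
    return LaunchDescription([
        Node(
            package='robot_localization', executable='ekf_node',
            name='ekf_filter_node', output='screen',
            parameters=[{
                'frequency': 30.0,
                'two_d_mode': True,
                'odom_frame': 'odom',
                'base_link_frame': 'base_link',
                'world_frame': 'odom',
                # Wheel odometry: trust planar velocities and yaw rate.
                # Mask order: x y z, roll pitch yaw, vx vy vz, vroll vpitch vyaw, ax ay az
                'odom0': '/wheel/odom',
                'odom0_config': [False, False, False,
                                 False, False, False,
                                 True,  True,  False,
                                 False, False, True,
                                 False, False, False],
                # Visual odometry: use planar position and yaw.
                'odom1': '/vo',
                'odom1_config': [True,  True,  False,
                                 False, False, True,
                                 False, False, False,
                                 False, False, False,
                                 False, False, False],
                # IMU: orientation and angular velocity.
                'imu0': '/camera/imu',
                'imu0_config': [False, False, False,
                                True,  True,  True,
                                False, False, False,
                                True,  True,  True,
                                False, False, False],
            }]),
    ])
```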
First, I'd like to clarify a configuration mistake I made earlier. I didn't find the parameter Vis/CoreType in rqt, but instead found Vis/CorType. When I changed Vis/CorType=1, the odometry drifted significantly under certain conditions. Additionally, I couldn't find the Vis/KeyFrame parameter you mentioned earlier. Could this be due to differences in parameter names between ROS 2 versions?

If I use the F2M method, my odometry only publishes at 15 Hz. Even when I set the camera frame rate to 60 Hz, the output remains at 15 Hz. Are there any recommended optimizations for this?

Regarding the guess_frame_id parameter, this approach seems more robust than robot_localization. Even if the wheel odometry has some errors, it appears that the VO can correct it, similar to how AMCL adjusts the TF tree. I tried this setup, and my TF tree looked like:

map -> vo
odom_wheel -> base_link

However, vo and odom_wheel were not connected, so I couldn't get the transform from map -> base_link. To fix this, I changed the frame_id in RTAB-Map to odom_wheel, and then the TF tree became:

map -> vo -> odom_wheel -> base_link

Does this configuration seem reasonable? Or should both guess_frame_id and frame_id be set to odom_wheel?

I've also looked into the Odom/ResetCountdown parameter, and I think it's quite useful. However, I have a question regarding its behavior: the wheel odometry experiences slippage and drift, and I'm concerned that in such cases the visual odometry might unintentionally adopt the errors caused by wheel slippage. Is there any recommended configuration or approach to handle this kind of situation more robustly?

Thanks again for creating such a fantastic tool, and I really appreciate your time and insights!
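Not an authoritative answer to the question above, but for reference this is one way the guess_frame_id wiring is commonly written down, as a sketch only; the frame names and the reset value are taken from the discussion and are otherwise assumptions.

```python
# Hypothetical sketch: rgbd_odometry using the wheel-odometry TF as a motion
# guess. Frame names ('vo', 'odom_wheel', 'base_link') and the reset value
# are assumptions from the discussion above, not a confirmed answer.
from launch import LaunchDescription
from launch_ros.actions import Node

def generate_launch_description():
    return LaunchDescription([
        Node(
            package='rtabmap_odom', executable='rgbd_odometry', output='screen',
            parameters=[{
                'frame_id': 'base_link',         # robot base frame
                'odom_frame_id': 'vo',           # frame published by visual odometry
                'guess_frame_id': 'odom_wheel',  # wheel-odometry TF used as the guess
                'publish_tf': True,
                'Odom/ResetCountdown': '1',      # auto-reset one frame after odometry is lost
            }]),
    ])
```

As I understand the mechanism, with guess_frame_id set the node publishes only the correction vo -> odom_wheel, while odom_wheel -> base_link keeps coming from the wheel odometry, which would give the map -> vo -> odom_wheel -> base_link chain described above once the SLAM node publishes map -> vo.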