
Visual Odometry Oscillations When Stationary with D435i on ROS 2 Humble #1418

Open
guojiaweialex opened this issue Dec 21, 2024 · 4 comments


@guojiaweialex

Hello,

I'm experiencing an issue with RTAB-Map's visual odometry using an Intel RealSense D435i camera on ROS 2 Humble. When the system is stationary, the visual odometry output exhibits lateral oscillations, causing the estimated position to drift left and right. This behavior persists even when the sensor is completely still.

(video attachment: 690_1734786483.mp4)
@matlabbe
Member

Which topics are you using for visual odometry? With D435i, don't use the RGB camera (use left+depth instead with IR emitter disabled).
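A minimal launch sketch of that kind of setup is below. It is only an illustration, assuming the default realsense2_camera and rtabmap_odom names on ROS 2 Humble; exact parameter and topic names may differ between driver versions and were not taken from this thread.

```python
# Sketch only (not verified here): left IR + depth odometry with the IR emitter off.
from launch import LaunchDescription
from launch_ros.actions import Node

def generate_launch_description():
    return LaunchDescription([
        Node(
            package='realsense2_camera',
            executable='realsense2_camera_node',
            parameters=[{
                'enable_infra1': True,              # left IR stream, same viewpoint as depth
                'enable_depth': True,
                'depth_module.emitter_enabled': 0,  # IR projector off so the dot pattern
                                                    # does not corrupt feature tracking
            }],
        ),
        Node(
            package='rtabmap_odom',
            executable='rgbd_odometry',
            parameters=[{'frame_id': 'camera_link', 'approx_sync': False}],
            remappings=[
                ('rgb/image', '/camera/infra1/image_rect_raw'),
                ('rgb/camera_info', '/camera/infra1/camera_info'),
                ('depth/image', '/camera/depth/image_rect_raw'),
            ],
        ),
    ])
```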

@guojiaweialex
Author

> Which topics are you using for visual odometry? With D435i, don't use the RGB camera (use left+depth instead with IR emitter disabled).

I’m currently using both RGB and depth images because I need the depth data enhanced by infrared structured light for obstacle avoidance and navigation. This means I can’t use IR as the RGB image source unless I toggle the infrared mode.

Could using RGB and depth simultaneously on the D435i or D455 cause any specific issues?

I’ve tried the following configuration, and it seems that the jitter has been reduced. I observed the X and Y positions, and the jitter now seems to be within approximately 0.01 cm, whereas it was around 0.05 cm when Odom/Strategy was set to 0.

'frame_id': 'head_camera_link',
'subscribe_depth': True,
'subscribe_odom_info': True,
'approx_sync': True,
'wait_imu_to_init': True,
'qos': 2,
'queue_size': 10,
'Rtabmap/DetectionRate': '5',
'Odom/Strategy': '1',
'Vis/CorFlowGpu': 'true',
'Vis/CoreType': '1',
'GFTT/Gpu': 'true',
'SIFT/Gpu': 'true',
'Stereo/Gpu': 'true',
'Vis/BundleAdjustment': '1',
'Vis/MaxFeatures': '1000',
'Grid/MaxObstacleHeight': '1.5',

However, I don’t fully understand the principles behind these parameters. With the current setup, I still notice occasional jitter in areas with fewer features.

Under normal conditions, should there be absolutely no jitter when the system is stationary?
Do you have any suggestions for improving performance in environments with plain walls or areas with fewer features?
I currently have a wheel odometer and IMU, and I can also use the camera’s IMU if needed.

Any advice would be greatly appreciated.

Thank you!

@matlabbe
Member

I think with D455 you could already have better results than D435i if you need to use the RGB camera. The RGB camera on D435i has a rolling shutter, while on D455 it has a global shutter.

The odometry will always shake somewhat, depending on the noise in the point cloud; in other words, the shaking you see is related to noise in the disparity. For example, if the camera is looking at far objects, it will shake more than when looking at close objects.

Odom/Strategy=1 is F2F (Frame to KeyFrame approach) and Vis/CoreType is optical flow. This strategy may make the camera shake less when not moving, but it may actually shake more when moving, in particular if the Vis/KeyFrame threshold is low. Odom/Strategy=0 is F2M (Frame to Map approach), which does local bundle adjustment over some key frames. This generally gives less drift, though it may shake more when not moving.

> Do you have any suggestions for improving performance in environments with plain walls or areas with fewer features?
> I currently have a wheel odometer and IMU, and I can also use the camera’s IMU if needed.

A larger FOV helps to be more robust to that. VIO could help in some cases, though if you stay stationary looking at a white wall, a VIO approach would also drift over time. If you have wheel odometry, it is possible to feed rgbd_odometry with the guess_frame_id parameter set to the frame of the wheel odometry. Combined with a non-zero Odom/ResetCountdown, VO can reset automatically when it cannot track features anymore, re-initializing at a position based on the current wheel odometry pose.
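
A minimal sketch of that wiring is shown below. The frame names ('base_link', 'odom_wheel') are assumptions for illustration, not values taken from this thread:

```python
# Sketch only: rgbd_odometry using the wheel odometry TF as motion guess, and
# auto-resetting from that pose when visual tracking is lost.
from launch import LaunchDescription
from launch_ros.actions import Node

def generate_launch_description():
    return LaunchDescription([
        Node(
            package='rtabmap_odom',
            executable='rgbd_odometry',
            parameters=[{
                'frame_id': 'base_link',         # robot base frame tracked by VO
                'guess_frame_id': 'odom_wheel',  # TF frame published by the wheel odometry
                'Odom/ResetCountdown': '1',      # reset one frame after tracking is lost,
                                                 # starting again from the wheel odometry pose
            }],
        ),
    ])
```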

Another approach could be to integrate the IMU, wheel odometry and VO into a robot_localization EKF node.
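
For the EKF route, a sketch of the robot_localization side could look like the block below. The topic names ('/odom_wheel', '/vo', '/imu/data') are assumptions, and the *_config vectors would need tuning for a real robot:

```python
# Sketch only: fuse wheel odometry, visual odometry and IMU yaw rate in an EKF.
from launch import LaunchDescription
from launch_ros.actions import Node

def generate_launch_description():
    return LaunchDescription([
        Node(
            package='robot_localization',
            executable='ekf_node',
            name='ekf_filter_node',
            parameters=[{
                'frequency': 30.0,
                'two_d_mode': True,
                'odom_frame': 'odom',
                'base_link_frame': 'base_link',
                'world_frame': 'odom',
                # each *_config lists: x, y, z, roll, pitch, yaw,
                #                      vx, vy, vz, vroll, vpitch, vyaw, ax, ay, az
                'odom0': '/odom_wheel',
                'odom0_config': [False, False, False, False, False, False,
                                 True,  True,  False, False, False, True,
                                 False, False, False],
                'odom1': '/vo',
                'odom1_config': [False, False, False, False, False, False,
                                 True,  True,  False, False, False, True,
                                 False, False, False],
                'imu0': '/imu/data',
                'imu0_config': [False, False, False, False, False, False,
                                False, False, False, False, False, True,
                                False, False, False],
            }],
        ),
    ])
```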

@guojiaweialex
Author

guojiaweialex commented Dec 24, 2024

> I think with D455 you could already have better results than D435i if you need to use the RGB camera. The RGB camera on D435i has a rolling shutter, while on D455 it has a global shutter.
>
> The odometry will always shake somewhat, depending on the noise in the point cloud; in other words, the shaking you see is related to noise in the disparity. For example, if the camera is looking at far objects, it will shake more than when looking at close objects.
>
> Odom/Strategy=1 is F2F (Frame to KeyFrame approach) and Vis/CoreType is optical flow. This strategy may make the camera shake less when not moving, but it may actually shake more when moving, in particular if the Vis/KeyFrame threshold is low. Odom/Strategy=0 is F2M (Frame to Map approach), which does local bundle adjustment over some key frames. This generally gives less drift, though it may shake more when not moving.
>
> > Do you have any suggestions for improving performance in environments with plain walls or areas with fewer features?
> > I currently have a wheel odometer and IMU, and I can also use the camera’s IMU if needed.
>
> A larger FOV helps to be more robust to that. VIO could help in some cases, though if you stay stationary looking at a white wall, a VIO approach would also drift over time. If you have wheel odometry, it is possible to feed rgbd_odometry with the guess_frame_id parameter set to the frame of the wheel odometry. Combined with a non-zero Odom/ResetCountdown, VO can reset automatically when it cannot track features anymore, re-initializing at a position based on the current wheel odometry pose.
>
> Another approach could be to integrate the IMU, wheel odometry and VO into a robot_localization EKF node.

First, I’d like to clarify a configuration mistake I made earlier. I couldn’t find the parameter Vis/CoreType in rqt; instead I found Vis/CorType. When I set Vis/CorType=1, the odometry drifted significantly under certain conditions.

Additionally, I couldn’t find the Vis/KeyFrame parameter you mentioned earlier. Could this be due to differences in parameter names between ROS 2 versions?

(image attachment: test)

If I use the F2M method, my odometry only publishes at 15 Hz. Even when I set the camera frame rate to 60 Hz, the output remains at 15 Hz.

Are there any recommended optimizations for this?
I’ve already enabled GPU acceleration. Or is 15 Hz generally sufficient for odometry?

Regarding the guess_frame_id parameter, this approach seems more robust than robot_localization. Even if the wheel odometry has some errors, it appears that visual odometry can correct them, similar to how AMCL adjusts the TF tree.

I tried this setup, and my TF tree looked like:

map -> vo

odom_wheel -> base_link

However, vo and odom_wheel were not connected, so I couldn’t get the transform from map -> base_link.

To fix this, I changed the frame_id in RTAB-Map to odom_wheel, and then the TF tree became:

map -> vo -> odom_wheel -> base_link

Does this configuration seem reasonable? Or should both guess_frame_id and frame_id be set to odom_wheel?
I’m concerned that my configuration might not be entirely correct.

I’ve looked into the Odom/ResetCountdown parameter, and I think it’s quite useful. However, I have a question regarding its behavior:
If visual features are lost and the visual odometry resets, it may rely on the wheel odometry pose to re-initialize.
Now, consider the following scenario:

1. The wheel odometry experiences slippage and drift.
2. The visual odometry can initially correct the TF tree based on its observations.
3. But later, if the visual odometry also loses features, will it reset back to the drifted position derived from the wheel odometry?

I’m concerned that in such cases, the visual odometry might unintentionally adopt the errors caused by wheel slippage.

Is there any recommended configuration or approach to handle this kind of situation more robustly?

Thanks again for creating such a fantastic tool, and I really appreciate your time and insights!
