
Wheel Odometry Integration

Note

Wheel odometry integration requires visual-inertial-kinematics (VIK) calibration, which is available as part of commercial projects. Please contact support@slamcore.com for more information before capturing sequences as described below.

Wheeled ground robots often have sensors on their motors that can be used to obtain odometry estimates. Combining wheel odometry with visual-inertial SLAM can provide highly robust, accurate positioning. This tutorial describes the process for integrating wheel odometry into SLAMcore’s positioning sensor fusion algorithms within a ROS environment (ROS1 Melodic or ROS2 Foxy).

Wheel odometry integration allows your system to run SLAM with accurate positioning based on the visual feed, the IMU and the wheel odometry obtained from your robotic platform.

To synchronise the sensor data from the visual, inertial and odometry sources, SLAMcore provides calibration customised to your robotic setup. Two stages of calibration are carried out:

  1. visual-inertial (VI) calibration to estimate the intrinsic and extrinsic parameters of your IMU and camera, and

  2. visual-inertial-kinematics calibration to estimate the parameters for wheel odometry.

Requirements

  • Wheel odometry ROS nav_msgs/Odometry messages published in ROS1 Melodic or ROS2 Foxy.

  • Wheel odometry is reported in its own frame of reference (O), which is different from the visual-inertial frame of reference of the robotic system (S).

  • The format of the odometry is (x, y, theta). Measurement covariance is not required.

  • The odometry measurement frequency should be constant and at least 10 Hz.

Note

For an example of how to publish odometry information over ROS, see the Publishing Odometry Information over ROS tutorial on the ROS wiki.

You may obtain the pose directly from your wheel encoders instead of using tf.
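
As a minimal illustration, the sketch below publishes an (x, y, theta) wheel pose as nav_msgs/Odometry in ROS1 Melodic. The /odom topic name, the frame ids and the get_wheel_pose() helper are hypothetical placeholders; replace them with your platform’s own encoder integration.

#!/usr/bin/env python
# Minimal wheel-odometry publisher sketch (ROS1 Melodic).
# /odom, the frame ids and get_wheel_pose() are hypothetical placeholders.
import rospy
import tf.transformations as tft
from nav_msgs.msg import Odometry


def get_wheel_pose():
    # Hypothetical helper: return (x, y, theta) integrated from wheel encoders.
    return 0.0, 0.0, 0.0


def main():
    rospy.init_node("wheel_odometry_publisher")
    pub = rospy.Publisher("/odom", Odometry, queue_size=10)
    rate = rospy.Rate(20)  # keep the rate constant and at least 10 Hz
    while not rospy.is_shutdown():
        x, y, theta = get_wheel_pose()
        msg = Odometry()
        msg.header.stamp = rospy.Time.now()
        msg.header.frame_id = "odom"     # wheel odometry frame (O)
        msg.child_frame_id = "base_link"
        msg.pose.pose.position.x = x
        msg.pose.pose.position.y = y
        # planar (x, y, theta) pose: rotation about Z only
        qx, qy, qz, qw = tft.quaternion_from_euler(0.0, 0.0, theta)
        msg.pose.pose.orientation.x = qx
        msg.pose.pose.orientation.y = qy
        msg.pose.pose.orientation.z = qz
        msg.pose.pose.orientation.w = qw
        pub.publish(msg)
        rate.sleep()


if __name__ == "__main__":
    main()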

VI Calibration

Please refer to RealSense Camera Calibration for instructions on calibrating your Intel RealSense D435i or D455 camera.

VIK Calibration

During VIK calibration, the following parameters are estimated from the visual-inertial-kinematics datasets:

  • Time offset between the camera and wheel odometry measurements.

  • Parameters accounting for wheel odometry’s systematic and non-systematic errors.

  • Wheel odometry to IMU extrinsic transformation: T_SO.

VIK Calibration Dataset

To perform the VIK calibration customised to your camera and wheeled robotic platform, we require multiple datasets recorded from that specific camera and platform, with visual, inertial and wheel odometry data. When recording the calibration datasets, ensure that your camera is mounted securely on your robotic platform in the position and orientation required for deployment.

Recording Procedure

Note

On NVIDIA Jetson kits flashed with the default release of an L4T image, you may need superuser permissions to run the SLAMcore ROS dataset recorder. If so, run sudo su before running the launch files.

With wheel odometry input turned on for your robotic platform, you may record a dataset with visual, inertial and odometry data in ROS. On ROS1 Melodic, run:

$ source /opt/ros/melodic/setup.bash

# initialise your odometry provider and robot controller
# ...

$ roslaunch slamcore_slam run_dataset_recorder.launch \
>   output_dir:=<output-dir> \
>   odom_reading_topic:=<odometry-provider-topic> \
>   override_realsense_depth:=true \
>   realsense_depth_override_value:=false

[OPTIONAL] To enable ROS1 visualisation in RViz, run the setup_monitoring.launch file in another terminal window or machine:

$ roslaunch slamcore_viz setup_monitoring.launch

On ROS2 Foxy, run:

$ source /opt/ros/foxy/setup.bash

# initialise your odometry provider and robot controller
# ...

$ ros2 launch slamcore_slam dataset_recorder.launch.py \
>   output_dir:=<output-dir> \
>   odom_reading_topic:=<odometry-provider-topic> \
>   override_realsense_depth:=true \
>   realsense_depth_override_value:=false

See more parameter options and defaults under ROS1 API and ROS2 API.

Calibration Sequences

Note

Please take into account the following while recording the calibration sequences.

  • Start and finish recording with the camera in the same place, pointing in the same direction with the same orientation in all your sequences.

  • Ensure the camera motion is smooth but not too slow.

  • Record in a space where there is enough visual texture in the scene within 2 metres of the camera.

  • Before starting to record each one of the VIK calibration sequences, stop and restart the odometry-publishing node, or at least make sure that the odometry measurements start from zero translation and rotation.

  • Make sure that the published odometry data abides by the coordinate frame conventions of ROS (REP 103). This means that when the robot moves forward, the X component of the odometry translation should increase, and when it moves laterally, its Y component should change. The rotations should also follow this convention: for in-plane 2D motion, and assuming the (x, y, z, w) quaternion convention, only the Z and W components should change while the robot is operating. A quick way to verify this is shown in the sketch below.
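
A short subscriber that prints the relevant components is enough for this check. This is a minimal ROS1 sketch; the /odom topic name is an assumption, so substitute the topic your platform publishes:

#!/usr/bin/env python
# Convention check: X should grow when driving forward, Y should change when
# moving laterally, and for planar motion only qz and qw should vary.
# The /odom topic name is an assumption.
import rospy
from nav_msgs.msg import Odometry


def callback(msg):
    p = msg.pose.pose.position
    q = msg.pose.pose.orientation
    rospy.loginfo("x=%.3f y=%.3f | qx=%.3f qy=%.3f qz=%.3f qw=%.3f",
                  p.x, p.y, q.x, q.y, q.z, q.w)


rospy.init_node("odom_convention_check")
rospy.Subscriber("/odom", Odometry, callback)
rospy.spin()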

A number of datasets or sequences are required for the VIK calibration:

  • 5 calibration sequences with good conditions for the visual-inertial SLAM system. For each sequence, perform 2 consecutive square loops, either clockwise or counterclockwise, along the perimeter of a 5×5 m square on flat ground.

    • While 5 calibration sequences would be ideal, 1-2 will suffice if there are time constraints.

    • While a 5×5 m square area is ideal, a 1×1, 1.5×1.5, 2×2, 3×3 or 4×4 m square will suffice if there are space constraints.

    • The starting point for each sequence can be different, but each sequence should end at roughly the same position and orientation at which it started.

  • 1 long sequence in which the robot travels over different surfaces, such as carpets, hardwood, tatami or rugs. Ideally, these should be the surfaces that the robot will interact with most during the intended deployment. The goal of this sequence is to estimate the non-systematic errors (i.e., the noise model of the odometry measurements).

Dataset Folder Structure

Before uploading the VIK datasets for calibration, please ensure that the recorded datasets follow the structure below:

VIK_calibration_datasets/
├── VIK_calib0/
│   ├── capture_info.json
│   ├── imu0/
│   ├── ir0/
│   ├── ir1/
│   └── odometry0/
├── VIK_calib1/
├── VIK_calib2/
├── VIK_calib3/
├── VIK_calib4/
└── VIK_calib_long/ --------------> robot travelling on different surfaces
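
As a quick sanity check before uploading, a short script can confirm that each dataset folder contains the entries shown above. This is a minimal sketch; the root path is an assumption for illustration.

#!/usr/bin/env python
# Check that each VIK dataset folder contains the expected entries.
# The root path is an assumption for illustration.
import os

EXPECTED = ["capture_info.json", "imu0", "ir0", "ir1", "odometry0"]
ROOT = "VIK_calibration_datasets"

for name in sorted(os.listdir(ROOT)):
    dataset = os.path.join(ROOT, name)
    if not os.path.isdir(dataset):
        continue
    missing = [e for e in EXPECTED
               if not os.path.exists(os.path.join(dataset, e))]
    print("%s: %s" % (name, "OK" if not missing else "missing " + ", ".join(missing)))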

Send to SLAMcore for Calibration

Dataset Validation

Before sending the datasets to SLAMcore, ensure that the recorded sequences are suitable for calibration by running visual-inertial SLAM on them in the SLAMcore Visualiser:

$ slamcore_visualiser dataset -u <path/to/dataset>

For all the datasets, ensure that the trajectories look reasonable with no large jumps and that loop closures are present (if possible).

Note

The VIK calibration is an iterative process, during which we might request additional datasets or a different recording setup to improve calibration results. Validating the datasets prior to calibration will improve the efficiency of the calibration process.

Compress the files

Compress each dataset into its own archive for ease of file sharing. For example, for the first dataset you record, run the following in your terminal window:

# rename the dataset directory, then compress
$ mv <path/to/dataset> <path/to/VI_calib0>
$ tar -czvf VI_calib0.tar.gz <path/to/VI_calib0>

The folder structure of the compressed datasets should roughly be:

SLAMcore_<company_name>_calibration_datasets/
├── VI_calibration_datasets/
│   ├── VI_calib0.tar.gz
│   └── VI_calib1.tar.gz
└── VIK_calibration_datasets/
    ├── VIK_calib0.tar.gz
    ├── VIK_calib1.tar.gz
    ├── VIK_calib2.tar.gz
    ├── VIK_calib3.tar.gz
    ├── VIK_calib4.tar.gz
    └── VIK_calib_long.tar.gz

Obtain the Camera-Odometry Frame of Reference Transformation

Based on your robotic platform’s setup, roughly measure the physical transformation between the wheel odometry frame of reference and the visual-inertial sensor’s frame of reference, expressed as a 4×4 homogeneous matrix.


Fig. 59 Example robot and camera-odometry frames of reference

We can work with either one of the following transformations:

  • T_SO: The transformation of the wheel odometry frame (O) with respect to the IMU sensor’s frame of reference (S), or

  • T_CO: The transformation of the wheel odometry frame (O) with respect to the left IR sensor’s frame of reference (C).

The rough estimate will be overwritten during the calibration process.


Fig. 60 T_SO Transformation Matrix

As an example, for the robot shown in this tutorial, the T_SO would be:

[  0, -1,  0,  0.0154;
   0,  0, -1,  0.2358;
   1,  0,  0, -0.0874;
   0,  0,  0,  1.0000  ]

Fig. 61 Approximate distance between frames of reference

The wheel odometry frame of reference is approximately 1.5 cm to the right, 23.6 cm below and 8.7 cm behind the camera’s IMU sensor frame of reference. The wheel odometry frame of reference is also rotated 90 degrees in X and -90 degrees in Z relative to the visual-inertial frame of reference; this rotation is given by the rotation matrix shown above. You may find an online rotation converter useful for converting between different representations of a 3D rotation.
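
To sanity-check the rough estimate, you can convert the rotation matrix to a quaternion programmatically. The sketch below uses SciPy (1.4 or newer; older releases name the constructor Rotation.from_dcm) and reproduces the wxyz values shown in the configuration excerpt further below.

# Convert the example T_SO rotation matrix to a wxyz quaternion.
from scipy.spatial.transform import Rotation

R_SO = [[0, -1,  0],
        [0,  0, -1],
        [1,  0,  0]]

x, y, z, w = Rotation.from_matrix(R_SO).as_quat()  # SciPy returns xyzw
print([w, x, y, z])  # wxyz: [0.5, 0.5, -0.5, 0.5]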

Detailed look at the visual-inertial frame of reference (camera’s coordinate system)

The Intel RealSense camera’s coordinate system (used for the visual-inertial frame of reference) is shown in the figure below in detail (as illustrated on RealSense’s tutorial page).


Fig. 62 RealSense camera coordinate system (Visual-inertial frame of reference)

As mentioned earlier, the rough estimate provided above will be overwritten during the calibration process. The refined T_SO can be found in the VIK configuration file in the following form, with the rotation expressed as a wxyz quaternion:

"T_SO": {
    "R": [
      0.4988096468110019,
      0.5038227837046615,
      -0.5036569976989878,
      0.4936407271864948
    ],
    "T": [
      0.015391235851986106,
      0.23577251157419954,
      -0.08736454907300172
    ]
}
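
As a reference for consuming the calibrated values, the sketch below loads the refined T_SO and rebuilds the 4×4 homogeneous matrix. The file name and the top-level key layout are assumptions based on the excerpt above.

# Rebuild the 4x4 homogeneous T_SO from the VIK configuration file.
# "vik_config.json" and the top-level "T_SO" key are assumptions.
import json

import numpy as np
from scipy.spatial.transform import Rotation

with open("vik_config.json") as f:
    cfg = json.load(f)

w, x, y, z = cfg["T_SO"]["R"]  # the config stores wxyz
T_SO = np.eye(4)
T_SO[:3, :3] = Rotation.from_quat([x, y, z, w]).as_matrix()  # SciPy wants xyzw
T_SO[:3, 3] = cfg["T_SO"]["T"]
print(T_SO)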

Email SLAMcore

Email us at support@slamcore.com with the following information:

  1. A link to download the folder containing the calibration datasets.

  2. The rough estimation of T_SO or T_CO, specifying which has been provided: the transform to the IMU or the left IR sensor.

  3. A photo showing the robot and camera sensor setup and the orientation of their respective frames of reference.

Once the calibration is complete, we will provide you with a VIK configuration file in JSON format, which can be used to run VIK on your setup.

Run the system in VIK Mode

Warning

The calibration configuration only applies to the camera and robotic platform setup on which you recorded the calibration sequences. Please do not alter the camera position on the robot, or use a different camera/robot with the VIK configuration file. You will need to calibrate again if the camera moves even slightly from the calibrated position or orientation, for example due to being bumped.

Run VIK SLAM on live camera (ROS)

To run VIK SLAM live on your robotic platform with our ROS wrapper, ensure you can provide visual, inertial and odometry data. On ROS1 Melodic, run:

$ source /opt/ros/melodic/setup.bash

# initialise your odometry provider and robot controller
# ...

$ roslaunch slamcore_slam run_slam.launch \
>   config_file:=<path/to/vik-configuration-file> \
>   odom_reading_topic:=<odometry-provider-topic>

[OPTIONAL] To enable ROS1 visualisation in RViz, run the setup_monitoring.launch file in another terminal window or machine:

$ roslaunch slamcore_viz setup_monitoring.launch

On ROS2 Foxy, run:

$ source /opt/ros/foxy/setup.bash

# initialise your odometry provider and robot controller
# ...

$ ros2 launch slamcore_slam slam_publisher.launch.py \
>   config_file:=<path/to/vik-configuration-file> \
>   odom_reading_topic:=<odometry-provider-topic>

Run in Localisation mode with VIK (ROS)

To run in localisation mode, you must have a previously created session map. Load the session map by providing the full path to the session file when launching the SLAM wrapper ROS node. On ROS1 Melodic, run:

$ source /opt/ros/melodic/setup.bash

# initialise your odometry provider and robot controller
# ...

$ roslaunch slamcore_slam run_slam.launch \
>   config_file:=<path/to/vik-configuration-file> \
>   session_file:=<path/to/session/file> \
>   odom_reading_topic:=<odometry-provider-topic>

[OPTIONAL] To enable ROS1 visualisation in RViz, run the setup_monitoring.launch file in another terminal window or machine:

$ roslaunch slamcore_viz setup_monitoring.launch

On ROS2 Foxy, run:

$ source /opt/ros/foxy/setup.bash

# initialise your odometry provider and robot controller
# ...

$ ros2 launch slamcore_slam slam_publisher.launch.py \
>   config_file:=<path/to/vik-configuration-file> \
>   session_file:=<path/to/session/file> \
>   odom_reading_topic:=<odometry-provider-topic>

Run VIK SLAM with a dataset

You may run VIK SLAM on any dataset that has been recorded with odometry data by passing a VIK configuration file. If a VIK configuration file is not provided, the SLAMcore tools default to running in visual-inertial SLAM mode on any type of dataset. To run in the SLAMcore Visualiser:

$ slamcore_visualiser dataset \
>   -u <path/to/dataset> -c <path/to/vik-configuration-file>

You may verify that you are running VIK SLAM in the Visualiser by checking the Mode displayed on the left sidebar.

To process the dataset without visualisation, run the dataset processor:
$ slamcore_dataset_processor dataset \
>   -u <path/to/dataset> \
>   -c <path/to/vik-configuration-file> \
>   -o <output/directory>
On ROS1 Melodic, run:

$ source /opt/ros/melodic/setup.bash

$ roslaunch slamcore_slam run_slam.launch \
>   dataset_path:=<path/to/dataset> \
>   config_file:=<path/to/vik-configuration-file>

[OPTIONAL] To enable ROS1 visualisation in RViz, run the setup_monitoring.launch file in another terminal window or machine:

$ roslaunch slamcore_viz setup_monitoring.launch

On ROS2 Foxy, run:

$ source /opt/ros/foxy/setup.bash

$ ros2 launch slamcore_slam slam_publisher.launch.py \
>   dataset_path:=<path/to/dataset> \
>   config_file:=<path/to/vik-configuration-file>