Wheel Odometry Integration

Note

Wheel odometry integration requires visual-inertial-kinematics (VIK) calibration, which is available as part of commercial projects. Please contact support@slamcore.com for more information before capturing sequences as described below.

Wheeled ground robots often have sensors on their motors that can be used to obtain odometry estimates. Combining wheel odometry with visual-inertial SLAM can provide highly robust, accurate positioning. This tutorial describes the process to follow to integrate wheel odometry into SLAMcore’s Position sensor fusion algorithms within a ROS1 Melodic or ROS2 Foxy environment.

Wheel odometry integration allows your system to run SLAM with an accurate positioning system based on the visual feed, the IMU and the wheel odometry obtained from your robotic platform.

To synchronise the sensor data from the visual, inertial and odometry sources, SLAMcore provides calibration customised to your robotic setup. Two stages of calibration are carried out:

  1. visual-inertial (VI) calibration to estimate the intrinsic and extrinsic parameters of your IMU and camera, and

  2. visual-inertial-kinematics (VIK) calibration to estimate the parameters for wheel odometry.

Requirements

  • Wheel odometry published as ROS nav_msgs/Odometry messages, in ROS1 Melodic or ROS2 Foxy.

  • Wheel odometry reported in its own frame of reference (O), which is different from the visual-inertial frame of reference of the robotic system (S).

  • The format of the odometry is (x, y, theta). Measurement covariance is not required.

  • The odometry measurement frequency should be constant and at least 10Hz.

Note

For an example of how to publish odometry information over ROS, see the Publishing Odometry Information tutorial on the ROS wiki (http://wiki.ros.org/navigation/Tutorials/RobotSetup/Odom). You may obtain the pose directly from your wheel encoders instead of using tf.
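Below is a minimal ROS1 sketch of such a publisher. The /odom topic name and the get_wheel_pose() helper are hypothetical placeholders to replace with your platform’s encoder interface:

#!/usr/bin/env python
# Minimal sketch: publish a planar (x, y, theta) wheel-encoder pose as
# nav_msgs/Odometry. The '/odom' topic and get_wheel_pose() are
# hypothetical placeholders for your platform's encoder interface.
import math

import rospy
from geometry_msgs.msg import Quaternion
from nav_msgs.msg import Odometry


def get_wheel_pose():
    """Placeholder: return (x, y, theta) integrated from the wheel encoders."""
    return 0.0, 0.0, 0.0


def main():
    rospy.init_node('wheel_odometry_publisher')
    pub = rospy.Publisher('/odom', Odometry, queue_size=10)
    rate = rospy.Rate(20)  # keep the rate constant and at least 10 Hz
    while not rospy.is_shutdown():
        x, y, theta = get_wheel_pose()
        msg = Odometry()
        msg.header.stamp = rospy.Time.now()
        msg.header.frame_id = 'odom'       # odometry frame of reference (O)
        msg.child_frame_id = 'base_link'
        msg.pose.pose.position.x = x
        msg.pose.pose.position.y = y
        # Planar pose: yaw-only rotation encoded as a quaternion.
        msg.pose.pose.orientation = Quaternion(
            0.0, 0.0, math.sin(theta / 2.0), math.cos(theta / 2.0))
        pub.publish(msg)
        rate.sleep()


if __name__ == '__main__':
    main()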

VI Calibration

During VI calibration, the following parameters are estimated from the visual-inertial datasets:

  • Camera intrinsics and distortion parameters

  • Camera to IMU (Sensor) extrinsics

  • Time offset between the camera and IMU measurements

  • Noise densities for the accelerometer and gyro IMU biases

  • Accelerometer and gyro IMU bias priors.

VI Calibration Dataset

To perform the VI calibration customised to your camera, we require a dataset recorded from that specific camera, with visual and inertial data. The ideal recording environment for VI calibration is a well-lit space with textured structures within 2 metres of the camera, such as in a standard office or home, with the camera hand-held.

Warning

Please record using the same camera that will be used with your robotic platform as the VI calibration is camera-specific.

Recording Procedure

Record 2 datasets, from different starting points, following this procedure:

  1. Place the camera on a flat surface and take note of its exact starting position.

  2. Launch the dataset recorder (either with our ROS wrappers or with SLAMcore tools):

    ################## if using SLAMcore tools with GUI ###################
    $ slamcore_dataset_recorder --no-depth
    
    
    ################# if using SLAMcore tools without GUI #################
    $ slamcore_dataset_recorder_cli --no-depth -o <output-dir>
    
    
    ############################ if using ROS1 ############################
    $ source /opt/ros/melodic/setup.bash
    
    $ roslaunch slamcore_slam run_dataset_recorder.launch \
    > override_realsense_depth:=true \
    > realsense_depth_override_value:=false \
    > output_dir:=<output_dir>
    
    # To enable ROS1 visualisation, run in another terminal or machine (optional)
    $ roslaunch slamcore_viz setup_monitoring.launch
    
    ############################ if using ROS2 ############################
    $ source /opt/ros/foxy/setup.bash
    
    $ ros2 launch slamcore_slam run_dataset_recorder.launch.py \
    > override_realsense_depth:=true \
    > realsense_depth_override_value:=false \
    > output_dir:=<output_dir>
    
  3. Walk around the space with smooth, steady motion of the camera through all 6 degrees of freedom (x, y, z, roll, pitch, yaw). A slow figure-of-eight motion works well.

  4. Capture around 2 minutes of data and ensure you return to the starting point with roughly the same orientation.

  5. Stop the recording.

Note

Ensure that the trajectory does not contain sudden jerks or rotations.

Dataset Folder Structure

You should end up with the following dataset structure:

VI_calibration_datasets/
├── VI_calib0/
│   ├── capture_info.json
│   ├── imu0/
│   ├── ir0/
│   └── ir1/
└── VI_calib1/

VIK Calibration

During VIK calibration, the following parameters are estimated from the visual-inertial-kinematics datasets:

  • Time offset between the camera and wheel odometry measurements

  • Parameters accounting for wheel odometry’s systematic and non-systematic errors.

  • Wheel odometry to IMU extrinsics transformation: T_SO.

VIK Calibration Dataset

To perform the VIK calibration customised to your camera and wheeled robotic platform, we require multiple datasets recorded from that specific camera and platform, with visual, inertial and wheel odometry data. When recording the calibration datasets, ensure that your camera is mounted securely on your robotic platform in the position and orientation required for deployment.

Recording Procedure

Note

On NVIDIA Jetson kits flashed with the default release of an L4T image, you may be required to have superuser permissions before running the SLAMcore ROS dataset recorder. To do so, run sudo su before running the launch files.

With wheel odometry input turned on for your robotic platform, you may record a dataset with visual, inertial and odometry data in ROS by running this:

############################ if using ROS1 ############################
# source the setup.bash file according to your shell before running any of the ROS nodes or launchfiles provided
$ source /opt/ros/melodic/setup.bash

# initialise your odometry provider and robot controller

# connect to the camera and odometry provider and record
$ roslaunch slamcore_slam run_dataset_recorder.launch \
> output_dir:=<output-dir> \
> odom_reading_topic:=<odometry-provider-topic> \
> override_realsense_depth:=true \
> realsense_depth_override_value:=false

############################ if using ROS2 ############################
$ source /opt/ros/foxy/setup.bash

$ ros2 launch slamcore_slam run_dataset_recorder.launch.py \
> output_dir:=<output-dir> \
> odom_reading_topic:=<odometry-provider-topic> \
> override_realsense_depth:=true \
> realsense_depth_override_value:=false

To enable ROS1 visualisation of the camera feed in rviz, run in another terminal window or machine (optional):

$ roslaunch slamcore_viz setup_monitoring.launch

See more parameter options and defaults under ROS1 API and ROS2 API.
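Before recording, it is worth confirming that your odometry provider meets the frequency requirement (a constant rate of at least 10Hz), for example with rostopic hz in ROS1 or ros2 topic hz in ROS2. The sketch below performs an equivalent check in ROS1; the /odom topic name is an assumption:

#!/usr/bin/env python
# Sketch: estimate the publish rate of a wheel odometry topic (ROS1) to
# confirm it is steady and at least 10 Hz before recording a dataset.
# The '/odom' topic name is an assumption -- use your provider's topic.
import rospy
from nav_msgs.msg import Odometry

stamps = []


def callback(msg):
    stamps.append(msg.header.stamp.to_sec())
    if len(stamps) == 100:  # report after 100 messages
        dts = [b - a for a, b in zip(stamps, stamps[1:])]
        mean_dt = sum(dts) / len(dts)
        rospy.loginfo('mean rate: %.1f Hz (min dt: %.3f s, max dt: %.3f s)',
                      1.0 / mean_dt, min(dts), max(dts))
        rospy.signal_shutdown('done')


rospy.init_node('odom_rate_check')
rospy.Subscriber('/odom', Odometry, callback)
rospy.spin()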

Calibration Sequences

A number of datasets or sequences are required for the VIK calibration:

  1. 5 calibration sequences with good conditions for the visual-inertial SLAM system. For each sequence, perform 2 consecutive square loops (the first loop clockwise, the second loop counter-clockwise) along the perimeter of a 5×5 m square on flat ground, as in the sketch after this list.

  • While 5 calibration sequences would be ideal, 1-2 will suffice if there are time constraints.

  • While a 5×5 m square area is ideal, a 1×1, 1.5×1.5, 2×2, 3×3 or 4×4 m square will suffice if there are space constraints.

  • The starting point of each sequence should be different where possible, but the ending point and orientation should roughly match the starting point.

  2. 1 long sequence in which the robot travels over different surfaces, such as carpets, hardwood, tatami or rugs.
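If your platform accepts geometry_msgs/Twist velocity commands, the square loops can be traced with a simple open-loop driver such as the sketch below. The /cmd_vel topic name, speeds and side length are assumptions to adapt to your robot, and you should supervise the robot throughout:

#!/usr/bin/env python
# Open-loop sketch: drive two consecutive square loops (clockwise, then
# counter-clockwise) by alternating straight segments and 90-degree turns.
# Assumes the platform listens for geometry_msgs/Twist on '/cmd_vel'
# (an assumption); tune the speeds and side length for your robot.
import math

import rospy
from geometry_msgs.msg import Twist


def drive(pub, linear, angular, duration):
    """Publish a constant velocity command for 'duration' seconds, then stop."""
    msg = Twist()
    msg.linear.x = linear
    msg.angular.z = angular
    end = rospy.Time.now() + rospy.Duration(duration)
    rate = rospy.Rate(20)
    while rospy.Time.now() < end and not rospy.is_shutdown():
        pub.publish(msg)
        rate.sleep()
    pub.publish(Twist())  # stop


rospy.init_node('square_loops')
pub = rospy.Publisher('/cmd_vel', Twist, queue_size=10)
rospy.sleep(1.0)  # give the publisher time to connect

side, speed, turn_rate = 5.0, 0.25, 0.5  # metres, m/s, rad/s
for direction in (-1.0, 1.0):  # negative yaw = clockwise, positive = counter-clockwise
    for _ in range(4):
        drive(pub, speed, 0.0, side / speed)  # straight segment
        drive(pub, 0.0, direction * turn_rate, (math.pi / 2.0) / turn_rate)  # 90-degree turn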

Note

Here are some recording tips to ensure good calibration results:

  • Ensure the camera motion is smooth but not too slow.

  • Start and finish recording with the camera in the same place, pointing in the same direction with the same orientation.

  • Record in a space where there are enough visual textures in the scene within 2 metres from the camera.

Dataset Folder Structure

VIK_calibration_datasets/
├── VIK_calib0/
│   ├── capture_info.json
│   ├── imu0/
│   ├── ir0/
│   ├── ir1/
│   └── odometry0/
├── VIK_calib1/
├── VIK_calib2/
├── VIK_calib3/
├── VIK_calib4/
└── VIK_calib_long/ --------------> robot travelling on different surfaces

Send to SLAMcore for Calibration

Dataset Validation

Before sending the datasets to SLAMcore, ensure that the recorded sequences are suitable for calibration by running visual-inertial SLAM on them in the SLAMcore Visualiser:

$ slamcore_visualiser dataset -u <path/to/dataset>

For all the datasets, ensure that the trajectories look reasonable with no large jumps and that loop closures are present (if possible).
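You can also check that each dataset folder contains the expected sensor streams before sending it off. The sketch below assumes the folder layouts shown earlier and takes the parent directory of the datasets as its argument:

#!/usr/bin/env python
# Sketch: verify each recorded dataset contains the entries expected from
# the folder layouts shown above (odometry0 only for VIK datasets).
# Usage: python check_datasets.py <path/to/datasets-parent-folder>
import os
import sys

VI_EXPECTED = ['capture_info.json', 'imu0', 'ir0', 'ir1']
VIK_EXPECTED = VI_EXPECTED + ['odometry0']

root = sys.argv[1]
for name in sorted(os.listdir(root)):
    dataset = os.path.join(root, name)
    if not os.path.isdir(dataset):
        continue
    expected = VIK_EXPECTED if name.startswith('VIK') else VI_EXPECTED
    missing = [e for e in expected if not os.path.exists(os.path.join(dataset, e))]
    status = 'OK' if not missing else 'missing: ' + ', '.join(missing)
    print('%-16s %s' % (name, status))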

Note

The VIK calibration is an iterative process, during which we might request additional datasets or a different recording setup to improve calibration results. Validating the datasets prior to calibration will improve the efficiency of the calibration process.

Compress the files

Compress each individual dataset into its own archive for ease of file sharing. For example, run in your terminal window:

$ tar -czvf VI_calib0.tar.gz <path/to/dataset>

The folder structure of the compressed datasets should roughly be:

SLAMcore_<company_name>_calibration_datasets/
├── VI_calibration_datasets/
│   ├── VI_calib0.tar.gz
│   └── VI_calib1.tar.gz
└── VIK_calibration_datasets/
    ├── VIK_calib0.tar.gz
    ├── VIK_calib1.tar.gz
    ├── VIK_calib2.tar.gz
    ├── VIK_calib3.tar.gz
    ├── VIK_calib4.tar.gz
    └── VIK_calib_long.tar.gz

Obtain the Camera-Odometry Frame of Reference Transformation

Based on your robotic platform’s setup, roughly measure the physical transformation between the odometry frame and the visual-inertial sensor’s frame of reference, expressed as a 4×4 homogeneous matrix.

We can work with either one of the following transformations:

  • T_SO: The transformation between the odometry frame (O) and the IMU sensor’s frame of reference (S), or

  • T_CO: The transformation between the odometry frame (O) and the left IR sensor’s frame of reference (C).

The rough estimate will be overwritten during the calibration process.

Fig. 49 T_SO Transformation Matrix

The Intel RealSense D435i reference frame is as illustrated on RealSense’s tutorial page.
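As an illustration, a rough T_SO estimate can be assembled from tape-measured offsets and the known mounting orientation. In the sketch below, the rotation and translation are placeholders, not values for a real setup:

#!/usr/bin/env python
# Sketch: assemble a rough 4x4 homogeneous transform T_SO from a measured
# rotation and translation. The values below are placeholders for
# illustration only -- measure them on your own platform.
import numpy as np

# Rotation of the odometry frame (O) expressed in the IMU frame (S).
# Placeholder: identity, i.e. the frames are assumed axis-aligned.
R_SO = np.eye(3)

# Translation from S to the origin of O, in metres (placeholder values).
t_SO = np.array([0.0, 0.20, -0.30])

T_SO = np.eye(4)
T_SO[:3, :3] = R_SO
T_SO[:3, 3] = t_SO
print(T_SO)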

Email SLAMcore

Email us at support@slamcore.com with the following information:

  1. A link to download the folder containing the calibration datasets

  2. The rough estimate of T_SO or T_CO, specifying which has been provided: the transform to the IMU or to the left IR sensor.

  3. A photo showing the robot and camera sensor setup and the orientation of their respective frames of reference.

Once the calibration is complete, we will provide you with a VIK configuration file in JSON format, which can be used to run VIK on your setup.

Run the system in VIK Mode

Warning

The calibration configuration only applies to the camera and robotic platform setup on which you recorded the calibration sequences. Please do not alter the camera position on the robot, or use a different camera or robot with the VIK configuration file. You will need to calibrate again if the camera moves even slightly from the calibrated position or orientation, for example due to bumping.

Run VIK SLAM on live camera (ROS)

To run VIK SLAM live on your robotic platform with our ROS wrapper, ensure you can provide visual, inertial and odometry data and run:

############################ if using ROS1 ############################
# source the setup.bash file according to your shell before running any of the ROS nodes or launchfiles provided
$ source /opt/ros/melodic/setup.bash

# initialise your odometry provider and robot controller

# connect to the camera and odometry provider and run VIK SLAM
$ roslaunch slamcore_slam run_slam.launch \
> config_file:=<path/to/vik-configuration-file> \
> odom_reading_topic:=<odometry-provider-topic>


############################ if using ROS2 ############################
$ source /opt/ros/foxy/setup.bash

$ ros2 launch slamcore_slam slam_publisher.launch.py \
> config_file:=<path/to/vik-configuration-file> \
> odom_reading_topic:=<odometry-provider-topic>

To enable ROS1 visualisation of the camera feed in rviz, run in another terminal window or machine:

$ roslaunch slamcore_viz setup_monitoring.launch

Run in Localisation mode with VIK (ROS)

To run in localisation mode, you must have a previously created session map. Load the session map by providing the full path to the session file when launching the SLAM wrapper ROS node:

############################ if using ROS1 ############################
# source the setup.bash file according to your shell before running any of the ROS nodes or launchfiles provided
$ source /opt/ros/melodic/setup.bash

# initialise your odometry provider and robot controller

# connect to the camera and odometry provider and run VIK SLAM with a session file
$ roslaunch slamcore_slam run_slam.launch \
> config_file:=<path/to/vik-configuration-file> \
> session_file:=<path/to/session/file> \
> odom_reading_topic:=<odometry-provider-topic>


############################ if using ROS2 ############################
# source the setup.bash file according to your shell before running any of the ROS nodes or launchfiles provided
$ source /opt/ros/foxy/setup.bash

# initialise your odometry provider and robot controller

# connect to the camera and odometry provider and run VIK SLAM with a session file
$ ros2 launch slamcore_slam slam_publisher.launch.py \
> config_file:=<path/to/vik-configuration-file> \
> session_file:=<path/to/session/file> \
> odom_reading_topic:=<odometry-provider-topic>

Run VIK SLAM with a dataset

SLAMcore Visualiser and Dataset Processor

You may run VIK SLAM on any dataset that has been recorded with odometry data, enabling kinematics by passing the VIK configuration file.

In the SLAMcore Visualiser:

$ slamcore_visualiser dataset -u <path/to/dataset> -c <path/to/vik-configuration-file>

You may verify that you are running VIK SLAM on the visualiser by checking the Mode displayed on the left sidebar.


To run VIK SLAM in the SLAMcore Dataset Processor:

$ slamcore_dataset_processor dataset -u <path/to/dataset> -c <path/to/vik-configuration-file> -o <output/directory>

SLAMcore ROS Wrappers

To run VIK SLAM on a dataset with the ROS1 wrapper:

$ source /opt/ros/melodic/setup.bash

$ roslaunch slamcore_slam run_slam.launch dataset_path:=<path/to/dataset> config_file:=<path/to/vik-configuration-file>

To run VIK SLAM on a dataset with the ROS2 wrapper:

$ source /opt/ros/foxy/setup.bash

$ ros2 launch slamcore_slam slam_publisher.launch.py dataset_path:=<path/to/dataset> config_file:=<path/to/vik-configuration-file>

Note

If a VIK configuration file is not provided, the SLAMcore tools will default to running in visual-inertial SLAM mode on any type of dataset.