Slamcore ROS 2 Wrapper

Introduction

Slamcore provides a ROS 2 Wrapper which allows interaction with Slamcore algorithms via ROS 2 middleware for the following ROS 2 distributions:

  • ROS 2 Foxy on Ubuntu 20.04 and JetPack 5

  • ROS 2 Galactic on Ubuntu 20.04 and JetPack 5

  • ROS 2 Humble on Ubuntu 22.04

We also provide a Docker image available on slamcore-ros2-docker to enable users to run our ROS 2 Wrappers in an isolated environment or on unsupported operating systems (e.g. ROS 2 Humble on JetPack 5).

Warning

The Slamcore ROS 2 Wrapper depends on the Slamcore C++ API, which you will need to install before installing the Slamcore ROS 2 Debian package. Further details are provided below.

The installation instructions are as follows:

A working ROS 2 installation is required before installing the Slamcore ROS 2 Wrapper. More specifically, ros-<ros-version>-ros-base should be installed, while the ros-<ros-version>-desktop package is optional. The Slamcore C++ API must also be installed before installing the Slamcore ROS 2 Wrapper.

  1. To install and get started with ROS 2, follow the official instructions provided in the ROS documentation for Foxy, Galactic or Humble.

  2. Obtain the Slamcore C++ API package from the Download Slamcore Software link on the Slamcore Portal. This will download a Debian package named slamcore-dev.

  3. Next, obtain the Slamcore ROS 2 Wrapper package from the Download Slamcore Software link on the Slamcore Portal. Select the package for the ROS 2 distribution you want to use, e.g., ROS 2 Humble.

  4. Install the packages using apt or an apt-based tool to automatically resolve dependencies.

# Update the list of available packages
$ sudo apt update

# To understand what will be installed, use a command like:
$ apt install --dry-run /path/to/ros-<ros-version>-slamcore-ros*.deb
#                       ^~~~~ Specify the .deb file.

# To install the slamcore-dev Debian package, use a command like:
$ sudo apt install /path/to/slamcore-dev*.deb
#                  ^~~~~~~~~~ Specify the .deb file.

# To install the slamcore-ros Debian package, use a command like:
$ sudo apt install /path/to/ros-<ros-version>-slamcore-ros*.deb
#                  ^~~~~~~~~~ Specify the .deb file.

Alternatively, you can use our Docker image, in which case no ROS 2 installation is required before installing the Slamcore ROS 2 Wrapper. Follow these instructions to set up our ROS 2 Wrapper in a Docker container. This is the preferred method if you are on a platform which does not match the Ubuntu version required by ROS.

  1. Download the Slamcore C++ API which matches your desired ROS distribution from the Download Slamcore Software link on the Slamcore Portal. This will download a Debian package named slamcore-dev.

  2. Download the Slamcore ROS 2 Wrapper from the Download Slamcore Software link on the Slamcore Portal. Select the package for the ROS 2 distribution you want to use, e.g., ROS 2 Humble.

  3. Clone the slamcore-ros2-docker repository and follow the instructions on GitHub to build Slamcore’s ROS 2 Docker image.

    $ git clone https://github.com/slamcore/slamcore-ros2-docker.git
    

Below we describe how to use the Slamcore ROS 2 Wrapper on Ubuntu with Foxy, Galactic or Humble natively installed. Details for running with the Docker container are provided at slamcore-ros2-docker. Instructions and examples on how to integrate the ROS 2 wrapper with Nav2 can be found in the Nav2 Integration Overview tutorial.

Usage - SLAM

Our ROS 2 node streams poses only when the tracking status is OK.

Note

Make sure you have sourced the setup.bash file (adjust the suffix according to your shell) before running any of the ROS 2 nodes or launch files provided.

The following modes can be used through the ROS 2 wrapper:

Mode                    Description
SLAM                    A sparse map is built on the go, without any prior knowledge about the environment.
Odometry                Positioning without using a map.
Localisation            Use a sparse map previously created in SLAM mode.
Height Mapping          Build a height and occupancy map for autonomous navigation.
Panoptic Segmentation   Segment and detect object instances. See Panoptic Segmentation API for requirements and installation.

Live SLAM

To run live SLAM, make sure you have connected a registered, supported camera and have sourced the ROS 2 environment.

$ source /opt/ros/<ros-version>/setup.bash

Launch live Single Session SLAM with the following command:

$ ros2 launch slamcore_slam slam_publisher.launch.py

In a new terminal, verify that the node is streaming poses:

$ ros2 topic echo /slamcore/pose

Expect output in the form:

# header:
#   stamp:
#     sec: 1627314066
#     nanosec: 961842688
#   frame_id: map
# pose:
#   position:
#     x: -0.00044667969888344613
#     y: -0.00023092165271676365
#     z: 0.0003275341432887679
#   orientation:
#     x: 0.059299501965805336
#     y: 0.003310471496864338
#     z: -0.0005325452114257774
#     w: 0.998234604810249
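
The message layout above matches geometry_msgs/PoseStamped, so the pose stream can also be consumed from your own node. Below is a minimal rclpy subscriber sketch; the message type is assumed from the echoed output, so adjust it if your wrapper version publishes a different type.

import rclpy
from rclpy.node import Node
from geometry_msgs.msg import PoseStamped  # assumed from the echoed message layout above


class PoseListener(Node):
    def __init__(self):
        super().__init__("slamcore_pose_listener")
        # Subscribe to the pose topic advertised by slam_publisher
        self.create_subscription(PoseStamped, "/slamcore/pose", self.on_pose, 10)

    def on_pose(self, msg):
        p = msg.pose.position
        self.get_logger().info(
            f"pose in '{msg.header.frame_id}': x={p.x:.3f} y={p.y:.3f} z={p.z:.3f}")


def main():
    rclpy.init()
    rclpy.spin(PoseListener())


if __name__ == "__main__":
    main()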

To save a SLAM session in ROS 2, call the /slamcore/save_session service:

$ ros2 service call /slamcore/save_session std_srvs/srv/Trigger

The session file is saved by default in the working directory from which you called ros2 launch slamcore_slam slam_publisher.launch.py.
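
The service can also be called from an application node. A minimal rclpy client sketch, using the std_srvs/srv/Trigger interface listed under Advertised Services below (the client node name here is arbitrary):

import rclpy
from rclpy.node import Node
from std_srvs.srv import Trigger


def main():
    rclpy.init()
    node = Node("session_saver")
    client = node.create_client(Trigger, "/slamcore/save_session")
    if not client.wait_for_service(timeout_sec=5.0):
        node.get_logger().error("/slamcore/save_session not available - is slam_publisher running?")
        return
    future = client.call_async(Trigger.Request())
    rclpy.spin_until_future_complete(node, future)
    result = future.result()
    node.get_logger().info(f"success={result.success} message='{result.message}'")
    rclpy.shutdown()


if __name__ == "__main__":
    main()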

Note

It is possible to record a dataset whilst running live SLAM. See Dataset Recording during a SLAM run for more information.

SLAM using a recorded dataset

To run SLAM using a prerecorded dataset instead of a live camera, set the path to the dataset using either the dataset_path launch argument or the /slamcore/dataset_path ROS 2 parameter:

# set parameter and run
$ ros2 launch slamcore_slam slam_publisher.launch.py dataset_path:=</path/to/dataset>

VIO Mode

To run in Visual-Inertial Odometry mode, launch the SLAM node whilst passing in the following JSON configuration file:

{
    "Version": "1.0.0",
    "Base":
    {
        "ProfileName": "vio_mode"
    },
    "Position":
    {
        "PositioningMode": "ODOMETRY_ONLY",
    }
}

Launch with:

$ ros2 launch slamcore_slam slam_publisher.launch.py config_file:=</path/to/vio.json>

Learn more about VIO in the Visual-Inertial Odometry Positioning tutorial.

Localisation Mode

Warning

Slamcore SDK v23.01 brings several breaking changes, one of them being the session file format. Session files (.session) generated using older software versions (v21.06) will no longer be usable with v23.01 and above. You may use the new software to generate a new session file if you have retained the original dataset.

Please contact support@slamcore.com, providing your Slamcore account email address, if you require access to the v21.06 SDK packages and its supporting documentation to use in conjunction with your old session files.

In localisation mode, our system uses a previously created session map. To load a session map, provide the full path to the session file when launching the SLAM wrapper ROS 2 node, e.g.:

$ ros2 launch slamcore_slam slam_publisher.launch.py session_file:=<path/to/session/file>
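
When a session file is loaded, the /slamcore/static_map service (listed under Advertised Services below) can be used to fetch the map contained in the session. A minimal rclpy client sketch, assuming the SLAM node is already running in localisation mode:

import rclpy
from rclpy.node import Node
from nav_msgs.srv import GetMap


def main():
    rclpy.init()
    node = Node("static_map_client")
    client = node.create_client(GetMap, "/slamcore/static_map")
    if not client.wait_for_service(timeout_sec=5.0):
        node.get_logger().error("/slamcore/static_map not available - it is only advertised when a session file is loaded")
        return
    future = client.call_async(GetMap.Request())
    rclpy.spin_until_future_complete(node, future)
    grid = future.result().map
    node.get_logger().info(
        f"static map: {grid.info.width}x{grid.info.height} cells at {grid.info.resolution:.3f} m/cell")
    rclpy.shutdown()


if __name__ == "__main__":
    main()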

Height Mapping Mode

In height mapping mode, our system runs in SLAM mode but also generates a height map and an occupancy map which can be used in autonomous navigation.

To generate a height map, set the generate_map2d parameter to true and ensure the depth stream is enabled (true by default on x64, false by default on arm64):

$  ros2 launch slamcore_slam slam_publisher.launch.py generate_map2d:=true override_realsense_depth:=true realsense_depth_override_value:=true

Panoptic Segmentation

Warning

To run panoptic segmentation, make sure you have installed the slamcore-panoptic-segmentation-plugin. This can be downloaded from the Slamcore Portal.

This feature is currently only supported with ROS 2 Foxy and Galactic on NVIDIA Jetson platforms running JetPack 5 and using an Intel RealSense D435i or D455 camera.

Fig. 20 Detected person in RViz2

To run real-time panoptic segmentation on top of other modes such as SLAM and Height Mapping, you will need to enable the panoptic plugin, as well as the color and depth streams.

$ ros2 launch slamcore_slam slam_publisher.launch.py \
>   enable_panoptic_plugin:=true \
>   override_realsense_depth:=true \
>   realsense_depth_override_value:=true \
>   enable_color:=true

Note

It is possible to run segmentation without depth; however, no 3D pose estimate or bounding box will be produced for the detected objects.

With panoptic segmentation enabled, users will be able to get 2D semantic segmentation results from the slamcore/segmentation/segmentation_mask image topic, where each pixel in the image is assigned a class based on the detected objects. The currently available classes are:

Class     Class Label
0         Person
1         Background
2         Unknown

The slamcore/segmentation/instance_mask topic provides an image, where each pixel is assigned an instance id based on all instances currently present in the image, irrespective of their class. Pixels which do not contain an object instance will be assigned an id of 0.

Note

The panoptic segmentation API does not currently track object instances over time. Therefore, the instance id of an object may change over multiple frames.

Lastly, if depth is enabled, the slamcore/segmentation/detections_3d topic provides an array of vision_msgs/Detection3D messages, containing the 3D pose (X-Y-Z) as well as the 3D bounding box for each detected object, relative to the slamcore/base_frame, also known as the Body frame. Note this topic will only be published when objects have been detected in the frame and depth data is available for the segmented pixels in the RGB image.
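
A minimal rclpy subscriber sketch for this topic is shown below. It only reads the bbox field of each vision_msgs/Detection3D, which has the same layout (a centre pose and a size vector) across the Foxy, Galactic and Humble versions of vision_msgs:

import rclpy
from rclpy.node import Node
from vision_msgs.msg import Detection3DArray


class DetectionListener(Node):
    def __init__(self):
        super().__init__("detection_listener")
        self.create_subscription(
            Detection3DArray, "/slamcore/segmentation/detections_3d", self.on_detections, 10)

    def on_detections(self, msg):
        for i, det in enumerate(msg.detections):
            c = det.bbox.center.position   # 3D position relative to slamcore/base_frame
            s = det.bbox.size              # bounding box dimensions in metres
            self.get_logger().info(
                f"detection {i}: centre=({c.x:.2f}, {c.y:.2f}, {c.z:.2f}) "
                f"box=({s.x:.2f} x {s.y:.2f} x {s.z:.2f})")


def main():
    rclpy.init()
    rclpy.spin(DetectionListener())


if __name__ == "__main__":
    main()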

Visualize Panoptic Segmentation topics in RViz2

RViz2 expects image pixel values to be in the range 0-255; however, the pixels in the slamcore/segmentation/segmentation_mask and slamcore/segmentation/instance_mask images correspond to segmentation classes and instances and therefore typically have small values such as 0, 1 or 2. Due to this limited range, these image topics are unsuitable for direct visual inspection without further post-processing. To visualize them, you will have to remap the detected classes to specific colors and republish the images.
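
A sketch of one way to do this is shown below: a small rclpy node that maps the class ids from the table above to colours and republishes the mask as an rgb8 image that RViz2 can display. It assumes the segmentation mask is published with mono8 encoding, like the instance mask; the output topic and node names are hypothetical.

import numpy as np
import rclpy
from rclpy.node import Node
from sensor_msgs.msg import Image

# Class id -> RGB colour (ids taken from the class table above)
COLOURS = {0: (255, 0, 0),      # Person
           1: (0, 0, 0),        # Background
           2: (128, 128, 128)}  # Unknown


class MaskColouriser(Node):
    def __init__(self):
        super().__init__("mask_colouriser")
        self.pub = self.create_publisher(Image, "/segmentation_mask_coloured", 10)
        self.create_subscription(
            Image, "/slamcore/segmentation/segmentation_mask", self.on_mask, 10)

    def on_mask(self, msg):
        # Interpret the mono8 mask as an HxW array of class ids (respecting the row step)
        ids = np.frombuffer(msg.data, dtype=np.uint8).reshape(msg.height, msg.step)[:, :msg.width]
        rgb = np.zeros((msg.height, msg.width, 3), dtype=np.uint8)
        for class_id, colour in COLOURS.items():
            rgb[ids == class_id] = colour
        self.pub.publish(Image(header=msg.header, height=msg.height, width=msg.width,
                               encoding="rgb8", is_bigendian=0, step=msg.width * 3,
                               data=rgb.tobytes()))


def main():
    rclpy.init()
    rclpy.spin(MaskColouriser())


if __name__ == "__main__":
    main()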

To visualise the 3D Detections and their respective bounding boxes, you can use the vision_msgs_rviz_plugins package. As this package is only available from apt for ROS 2 Humble, you will have to build the package from source to be able to visualize this topic in Foxy or Galactic. To do that, follow the steps below.

First clone the vision_msgs repo:

$  git clone https://github.com/slamcore/vision_msgs

Checkout the correct branch for your ROS 2 version:

$  cd vision_msgs
$  git checkout foxy

Build the package:

$  colcon build

Finally source the setup.bash file and launch RViz2:

$  source install/setup.bash
$  rviz2

In RViz2, click Add then scroll down the By display type list and add the Detection3DArray message.


Then open the menu of the Detection3DArray message you just added and edit the Topic field to /slamcore/segmentation/detections_3d.

Next time a person stands in front of the camera with the SLAM node running and the panoptic plugin enabled, you should be able to see a bounding box appear around the detected person.

Usage - Dataset Recording

It is possible to record a dataset with our standalone Dataset Recorder node or whilst running SLAM.

Dataset Recording during a SLAM run

To record a dataset while SLAM is running, simply provide the dataset_write_path when launching slam_publisher.launch.py.

$ ros2 launch slamcore_slam slam_publisher.launch.py dataset_write_path:=/home/<user>/<output-directory>

All streams that have been enabled will be recorded to the provided directory, taking into account any camera configuration settings that may have been loaded on launch using a JSON config_file, e.g., config_file:=$HOME/depth_preset.json.

Standalone Dataset Recorder

To record a dataset with our standalone Dataset Recorder, without running SLAM, run:

$ ros2 launch slamcore_slam dataset_recorder.launch.py output_dir:=/home/<user>/<output-directory>

If the output_dir argument is not specified, the dataset will be saved under the current working directory in a directory named output. When recording a dataset, you can also provide a config file, for example to adjust the camera/depth settings: config_file:=$HOME/depth_preset.json.

See the troubleshooting page to help diagnose common errors.

ROS 2 API

This section outlines the topics, services, and parameters that the Slamcore ROS 2 Wrapper uses as either inputs or outputs.

Lifecycle Nodes

Note

See the official ROS 2 documentation for more information about lifecycle nodes.

The Slamcore ROS 2 nodes can be launched as managed nodes, and therefore you can change their state using the ros2 lifecycle command:

# execute a state change
$ ros2 lifecycle set /slamcore/slam_node <lifecycle-state>

Run the following to query the current state of the node:

$ ros2 service call /slamcore/slam_node/get_state lifecycle_msgs/GetState

Run the following to view the full list of available transitions at the current state of the node:

$ ros2 service call /slamcore/slam_node/get_available_transitions lifecycle_msgs/srv/GetAvailableTransitions

Run the following to reset SLAM:

$ ros2 lifecycle set /slamcore/slam_node deactivate
$ ros2 lifecycle set /slamcore/slam_node cleanup
$ ros2 lifecycle set /slamcore/slam_node configure
$ ros2 lifecycle set /slamcore/slam_node activate

Note

The list of available transitions will be different depending on what the current state of the node is.

  • /slamcore/slam_node/configure
  • /slamcore/slam_node/cleanup
    • Type: lifecycle_msgs/Transition

    • Description: Clear all state and return the node to a functionally equivalent state as when first created. If the cleanup cannot be successfully achieved it will transition to ErrorProcessing.

  • /slamcore/slam_node/activate
  • /slamcore/slam_node/deactivate
    • Type: lifecycle_msgs/Transition

    • Description: This method is expected to do any cleanup to start executing, and should reverse the onActivate changes.
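
The reset sequence shown above can also be driven programmatically through the node's standard lifecycle services. The sketch below assumes the change_state service is exposed at /slamcore/slam_node/change_state, which is the standard interface ros2 lifecycle set calls for managed nodes:

import rclpy
from rclpy.node import Node
from lifecycle_msgs.msg import Transition
from lifecycle_msgs.srv import ChangeState


def change_state(node, client, transition_id):
    request = ChangeState.Request()
    request.transition.id = transition_id
    future = client.call_async(request)
    rclpy.spin_until_future_complete(node, future)
    return future.result().success


def main():
    rclpy.init()
    node = Node("slam_resetter")
    client = node.create_client(ChangeState, "/slamcore/slam_node/change_state")
    client.wait_for_service()
    # Same sequence as the `ros2 lifecycle set` commands above
    for transition in (Transition.TRANSITION_DEACTIVATE, Transition.TRANSITION_CLEANUP,
                       Transition.TRANSITION_CONFIGURE, Transition.TRANSITION_ACTIVATE):
        if not change_state(node, client, transition):
            node.get_logger().error(f"transition {transition} failed")
            break
    rclpy.shutdown()


if __name__ == "__main__":
    main()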

Advertised Topics

List of topics advertised by slam_publisher. All topics are published using the default Quality of Service settings unless otherwise specified:

  • /diagnostics
  • /slamcore/accel
    • Type: sensor_msgs/Imu

    • Description: Accelerometer data, interpolated to align with the gyroscope measurements.

  • /slamcore/gyro
  • /slamcore/imu_bias
    • Type: slamcore_msgs/ImuBias

    • Description: IMU biases (systematic errors or offsets present in the measurements provided by the IMU) estimated once every frame. The IMU biases reference frame is the IMU and the units for the accelerometer bias and the gyroscope bias are meter/s^2 and radian/s respectively. The timestamp of the IMU biases corresponds to the source acquisition timestamp of the latest pose, which is the camera timestamp of the frame.

  • /slamcore/{infrared, visual, depth}{, _0, _1, …}/camera_info
  • /slamcore/{infrared, visual}{_0, _1, …}/image_raw
  • /slamcore/depth/image_raw
    • Type: sensor_msgs/Image

    • Description: Depth data. Data is provided with float encoding, where each pixel value is the depth in metres.

  • /slamcore/local_point_cloud
  • /slamcore/segmentation/segmentation_mask
  • /slamcore/segmentation/instance_mask
    • Type: sensor_msgs/Image

    • Description: Raw mono8 instance segmentation image, where each pixel is assigned an instance id out of all detected instances. See Panoptic Segmentation for more information.

  • /slamcore/segmentation/detections_3d
    • Type: vision_msgs/Detection3DArray

    • Description: Array of 3D detections, containing detected objects’ pose and 3D bounding box relative to the slamcore/base_frame. This topic will only be published when objects are detected in the frame. See Panoptic Segmentation for more information.

  • /slamcore/metadata/{num_features, tracked_features}
    • Type: std_msgs/Int64

    • Description: Metadata published periodically by the SLAM algorithm.

  • /slamcore/metadata/tracking_status
    • Type: slamcore_msgs/TrackingStatus

    • Description: Metadata published periodically reporting the SLAM positional tracking status. See Tracking Information for more information on possible tracking statuses.

  • /slamcore/metadata/slam_event
    • Type: slamcore_msgs/SLAMEvent

    • Description: Metadata published when a SLAM event such as a loop closure occurs. See Tracking Information for more information on possible SLAM events.

  • /slamcore/metadata/distance_travelled
    • Type: std_msgs/Float64

    • Description: Distance travelled (metres) published periodically by the SLAM algorithm.

  • /slamcore/pose
  • /slamcore/odom
    • Type: nav_msgs/Odometry

    • Description: Smooth Pose, and Twist of the sensor. The pose is in the odometry frame of reference (see the /slamcore/odom_frame ROS Parameter) and the twist is in the instantaneous body frame of the sensor (see the /slamcore/base_frame ROS Parameter).

  • /slamcore/sparse_map
  • /slamcore/map
    • Type: nav_msgs/OccupancyGrid

    • Description: Computed occupancy grid map. This will be published using a Volatile durability QoS policy in Single Session SLAM mode and a Transient Local durability QoS policy in Localisation mode (see the subscriber sketch at the end of this section).

  • /tf
  • /tf_static

Run the following to view the full list of topics:

$ ros2 topic list
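
As noted for /slamcore/map above, the occupancy grid uses Transient Local durability in Localisation mode, so a late-joining subscriber must request a matching durability to receive the latched map. A minimal rclpy sketch:

import rclpy
from rclpy.node import Node
from rclpy.qos import DurabilityPolicy, QoSProfile, ReliabilityPolicy
from nav_msgs.msg import OccupancyGrid


class LatchedMapListener(Node):
    def __init__(self):
        super().__init__("latched_map_listener")
        # Transient Local durability matches the publisher in Localisation mode,
        # so the last published map is delivered even if this node starts late.
        qos = QoSProfile(depth=1,
                         reliability=ReliabilityPolicy.RELIABLE,
                         durability=DurabilityPolicy.TRANSIENT_LOCAL)
        self.create_subscription(OccupancyGrid, "/slamcore/map", self.on_map, qos)

    def on_map(self, grid):
        # Cell values: -1 = unknown, 0..100 = occupancy probability in percent
        self.get_logger().info(
            f"received {grid.info.width}x{grid.info.height} map at {grid.info.resolution:.3f} m/cell")


def main():
    rclpy.init()
    rclpy.spin(LatchedMapListener())


if __name__ == "__main__":
    main()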

Advertised Services

Below is the list of services advertised by slam_publisher. These are used mainly for client/user interaction with the SLAM system. Run the following to call a service:

$ ros2 service call /slamcore/<service-name> <service_type>

List of advertised services:

  • /slamcore/save_session
    • Type: std_srvs/Trigger

    • Description: Save session map file (saved in the current working directory by default).

  • /slamcore/static_map
    • Type: nav_msgs/GetMap

    • Description: Retrieve the static map (only advertised if loading from a session file).

Run the following to view the full list of services:

$ ros2 service list -t

Advertised Parameters

Below is the list of parameters that slam_publisher and dataset_recorder fetch during initialisation. To set the parameters, provide the corresponding launch arguments when launching:

# set the parameter via the corresponding launch argument
$ ros2 launch slamcore_slam slam_publisher.launch.py <param-name>:=<desired-value>   # for slam_publisher.launch.py, OR
$ ros2 launch slamcore_slam dataset_recorder.launch.py <param-name>:=<desired-value> # for dataset_recorder.launch.py

Run the following to view the full list of advertised parameters.

For slam_publisher:

$ ros2 launch slamcore_slam slam_publisher.launch.py --show-args

For dataset_recorder:

$ ros2 launch slamcore_slam dataset_recorder.launch.py --show-args

Note

The full path(s) must be provided for the relevant parameter values e.g. /home/<user>/<path/to/file>.
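
Launch arguments can also be set from your own launch file by including slam_publisher.launch.py. A sketch, assuming the launch file is installed under the slamcore_slam package's launch/ directory (the usual layout); the argument values are placeholders:

# example_slam.launch.py
from launch import LaunchDescription
from launch.actions import IncludeLaunchDescription
from launch.launch_description_sources import PythonLaunchDescriptionSource
from launch.substitutions import PathJoinSubstitution
from launch_ros.substitutions import FindPackageShare


def generate_launch_description():
    slam = IncludeLaunchDescription(
        PythonLaunchDescriptionSource(
            PathJoinSubstitution([FindPackageShare("slamcore_slam"),
                                  "launch", "slam_publisher.launch.py"])),
        # Any of the launch arguments listed below can be set here
        launch_arguments={"generate_map2d": "true",
                          "session_file": "/path/to/session/file"}.items())
    return LaunchDescription([slam])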

List of parameters advertised by slam_publisher:

  • /slamcore/slam_node/device_id
    • Type: String

    • Description: Integer device index for multiple reader types.

  • /slamcore/slam_node/config_file
    • Type: String or a List of strings

    • Description: Path to the configuration/preset file(s). One can provide a single config file, e.g. config_file:=/usr/share/slamcore/presets/outdoor.json, or multiple config files as a list config_file:=[/usr/share/slamcore/presets/outdoor.json,$HOME/depth_preset.json]

  • /slamcore/slam_node/advertise_images
    • Type: Boolean

    • Default: True

    • Description: Advertise (publish) images.

  • /slamcore/slam_node/override_realsense_depth
    • Type: Boolean

    • Default: False

    • Description: Overwrite the depth stream, and therefore the depth topic, for a RealSense camera with the value provided in realsense_depth_override_value. See FAQs for more information.

  • /slamcore/slam_node/realsense_depth_override_value
    • Type: Boolean

    • Default: False

    • Description: The value used to overwrite the depth stream, and therefore the depth topic, for a RealSense camera when override_realsense_depth is set. See FAQs for more information.

  • /slamcore/slam_node/enable_color
    • Type: Boolean

    • Default: False

    • Description: Publish RGB image.

  • /slamcore/slam_node/generate_map2d
    • Type: Boolean

    • Default: False

    • Description: Generate a new 2D map (Requires depth to be ON).

  • /slamcore/slam_node/enable_panoptic_plugin
    • Type: Boolean

    • Default: False

    • Description: Enable Panoptic Segmentation. Color stream must also be enabled. For 3D detections (3D bounding boxes) to be published, the depth stream must also be enabled. See Panoptic Segmentation for more information.

  • /slamcore/slam_node/session_file
    • Type: String

    • Description: Path to the session map to load to run in localisation mode (details).

  • /slamcore/slam_node/session_save_dir
    • Type: String

    • Default: current working directory

    • Description: Directory to save the session map to. If none is provided, maps will be saved to the current working directory from which you called ros2 launch slamcore_slam slam_publisher.launch.py.

  • /slamcore/slam_node/odom_reading_topic
    • Type: String

    • Description: Topic to read input wheel odometry from. The slam_publisher will listen on this topic for nav_msgs/Odometry messages (a minimal publisher sketch follows this parameter list). A VIK calibration is required to run SLAM with wheel odometry; see the Wheel Odometry Integration tutorial.

  • /slamcore/slam_node/map_frame
    • Type: String

    • Default: map

    • Description: Name given to the frame_id the pose is published relative to.

  • /slamcore/slam_node/odom_frame
    • Type: String

    • Default: odom

    • Description: Name given to the frame_id of the smooth world frame published on /tf (see REP-105).

  • /slamcore/slam_node/base_frame
    • Type: String

    • Default: base_link

    • Description: Name given to the frame_id the pose refers to.

  • /slamcore/slam_node/dataset_write_path
    • Type: String

    • Description: Directory to save dataset to whilst running live SLAM. If none is provided, no streams will be recorded during the SLAM run.

  • /slamcore/slam_node/dataset_path
  • /slamcore/slam_node/dataset_timescale
    • Type: Float

    • Default: -1 (runs as fast as possible).

    • Description: Control the playback speed by selecting a playback timescale. Setting it to values greater than zero sets a corresponding playback speed, and setting the timescale to 1.0 will play back in emulated real time. If no timescale is specified the playback will run as fast as possible to process every frame.
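
For reference, the wrapper expects standard nav_msgs/Odometry messages on the topic passed via odom_reading_topic. A minimal rclpy publisher sketch; the topic name and values below are placeholders, and real wheel odometry should come from your robot base driver:

import rclpy
from rclpy.node import Node
from nav_msgs.msg import Odometry


class WheelOdomPublisher(Node):
    def __init__(self):
        super().__init__("wheel_odom_publisher")
        # Publish on the topic passed to the wrapper via odom_reading_topic
        self.pub = self.create_publisher(Odometry, "/wheel_odom", 10)
        self.create_timer(0.05, self.publish_odom)  # 20 Hz

    def publish_odom(self):
        msg = Odometry()
        msg.header.stamp = self.get_clock().now().to_msg()
        msg.header.frame_id = "odom"
        msg.child_frame_id = "base_link"
        # Placeholder values - replace with measurements from the wheel encoders
        msg.pose.pose.position.x = 0.0
        msg.twist.twist.linear.x = 0.0
        self.pub.publish(msg)


def main():
    rclpy.init()
    rclpy.spin(WheelOdomPublisher())


if __name__ == "__main__":
    main()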

List of parameters advertised by dataset_recorder:

  • /dataset_recorder/device_id
    • Type: String

    • Description: Integer device index for multiple reader types.

  • /dataset_recorder/config_file
    • Type: String or a List of strings

    • Description: Path to the configuration/preset file(s). One can provide a single config file, e.g. config_file:=/usr/share/slamcore/presets/autoexposure_limit.json, or multiple config files as a list config_file:=[/usr/share/slamcore/presets/autoexposure_limit.json,$HOME/depth_preset.json]

  • /dataset_recorder/override_realsense_depth
    • Type: Boolean

    • Default: False

    • Description: Overwrite the depth stream, and therefore the depth topic, for a RealSense camera with the value provided in realsense_depth_override_value. See FAQs for more information.

  • /dataset_recorder/realsense_depth_override_value
    • Type: Boolean

    • Default: False

    • Description: The value used to overwrite the depth stream, and therefore the depth topic, for a RealSense camera when override_realsense_depth is set. See FAQs for more information.

  • /dataset_recorder/enable_color
    • Type: Boolean

    • Default: False

    • Description: Publish RGB image.

  • /dataset_recorder/odom_reading_topic
    • Type: String

    • Description: Odometry input topic to subscribe to. The dataset_recorder will listen on this topic for nav_msgs/Odometry messages.

  • /dataset_recorder/output_dir
    • Type: String

    • Default: $PWD/output

    • Description: Specify directory to save the recorded dataset at. If none is provided, it will be saved under the current working directory with the name output.