Nav2 Integration¶

This page presents a working example of integrating the Slamcore SLAM algorithms into Nav2 (the ROS2 Navigation Stack) on ROS2 Foxy, using them as the core component both to map the environment and to provide accurate positioning of the robotic platform. The tutorial includes steps for running the example natively on Ubuntu 20.04 or in a Docker container, for systems such as the NVIDIA Jetson NX which do not yet support Ubuntu 20.04.

Note

Using ROS1? Visit the ROS1 Navigation Stack Integration Tutorial Page

Goal¶

The goal of this demonstration is to use the Slamcore SDK as the main source of positioning during navigation, as well as for mapping the environment before or during navigation. In the Nav2 documentation’s examples, these two tasks are normally carried out using SLAM Toolbox, an open-source 2D graph-based SLAM library that relies on 2D laser scans. AMCL, which also uses 2D laser scans, is suggested as a localisation alternative.

Instead, we’ll be using the 2D Occupancy Mapping capabilities of our SDK to generate an occupancy grid map and our visual-inertial SLAM positioning to localise in that map. Additionally, we will integrate wheel odometry into our SLAM system to increase localisation robustness - this is available for customers as a paid add-on.

Hardware Setup¶

We are using the Kobuki robotic platform, the Intel RealSense D435i camera and the NVIDIA Jetson NX during this demonstration. We’re also using a custom mounting plate for placing the board and the camera on the robot. Lastly, we will use a separate laptop as a visualisation machine.

Note

The Jetson Xavier NX used for this example runs JetPack 4.6 (based on Ubuntu 18.04); however, ROS2 Foxy targets Ubuntu 20.04. Therefore, for similar cases where Ubuntu 20.04 might not be available, instructions are provided below on how to run this example using a Docker container.

Fig. 69 Main setup for navigation¶

Nav2 Setup¶

Traditionally Nav2 requires the following components to be in place:

  • An occupancy grid map of the environment, either generated ahead of time, or live.

  • A global planner and a controller (also known as the local planner in ROS1) which guide your robot from the start to the end location. The default choices are NavFn for the global planner and DWB for the controller. Other available options are discussed in the Selecting the Algorithm Plugins section of the Nav2 docs.

  • A global and a local costmap, which assign costs to the aforementioned grid map so that the planner chooses to go through or to avoid certain routes in the map.

  • A localisation module, such as SLAM Toolbox or AMCL.

As discussed earlier, we’ll be using Slamcore software to generate a map of the environment as well as to localise the robot in it. On top of that, we’ll use the NavFn global planner and the DWB controller for navigation. Lastly, we will be using the local point cloud published by our software for obstacle avoidance, with costmap2D’s obstacle layer plugin.

Positioning information is transmitted to Nav2 using TF, so we will need to make sure the correct transforms are set up and being broadcast for Nav2 to function correctly. A short introduction to the required transforms is provided in the Nav2 Setting Up Transformations tutorial page. As explained in the drop-down below, the Slamcore ROS Wrapper abides by REP-105 and will, by default, publish both the map \(\rightarrow\) odom and odom \(\rightarrow\) base_footprint transforms required for navigation.

Abiding by REP-105 - map and odom frames

Note

Many popular ROS frameworks, like the navigation stack, abide by the ROS Coordinate Frames convention (REP-105) and thus require two transformations to operate:

  • map \(\rightarrow\) base_footprint

  • odom \(\rightarrow\) base_footprint

where base_footprint (sometimes called base_link) is the main frame of reference of the robot platform in use.

This way, a ROS node that is interested in the pose of the robot can query either map \(\rightarrow\) base_footprint or odom \(\rightarrow\) base_footprint. If it queries the former, it will get the most accurate estimate of the robot pose; however, that estimate may include discontinuities or jumps. The odom \(\rightarrow\) base_footprint transform, on the other hand, drifts and is overall less accurate, but is guaranteed to change smoothly over time.

Traditionally, the localisation module would compute the map \(\rightarrow\) base_footprint transform and would use the latest odom \(\rightarrow\) base_footprint, as published by the odometry node (e.g. dead reckoning via wheel odometry), to eventually publish map \(\rightarrow\) odom and abide by REP-105.

To abide by this standard and also increase the overall accuracy of these transforms, the Slamcore ROS Wrappers incorporate the latest odometry information and publish both the map \(\rightarrow\) odom and odom \(\rightarrow\) base_footprint transforms. This way we can provide a smooth odom \(\rightarrow\) base_footprint transform that potentially uses the wheel odometry, as well as information from the visual and inertial sensors.

For more on Slamcore’s frames of reference convention, see Frames of Reference Convention.
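Once the wrapper and navigation are running, you can verify that these transforms are actually being broadcast using the standard tf2 command-line tools, for example:

$ # Print the transform currently broadcast between two frames
$ ros2 run tf2_ros tf2_echo map odom
$ ros2 run tf2_ros tf2_echo odom base_footprint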

As seen above, Nav2 has similar requirements to the ROS1 Navigation Stack - you need a map, some form of positioning (TF) and some sensor streams for obstacle avoidance, and these are all provided by our software. The main difference with ROS1 is that Nav2 no longer uses the move_base finite state machine and instead uses Behaviour Trees to call modular servers to complete an action (e.g. compute a path, navigate…). This allows the user to configure the navigation behaviour easily using plugins in a behaviour tree XML file. A detailed comparison with the ROS1 Navigation Stack can be found in the ROS to ROS2 Navigation Nav2 docs page.

Fig. 70 Slamcore integration into Nav2¶

Nav2’s parameters and plugins, which can be configured for your unique use case, are included and can easily be modified in the nav2-demo-params YAML file in our repository. Details about the Nav2 configuration and obstacle avoidance parameters can be found in the Nav2 Configuration section further below.

Outline¶

Following is the list of steps for this demo.

Fig. 71 Outline of the demo¶

We’ll delve into each one of these steps in more detail in the next sections.

  1. Set Up Robot

  2. [OPTIONAL] Set Up Visualisation Machine

  3. [OPTIONAL] Run Visual-Inertial-Kinematic Calibration, to improve the overall performance.

  4. Compute the slamcore/base_link ➞ base_footprint Transformation

  5. Create a Map and Run Live Navigation while teleoperating your robot. For detailed steps on map creation and navigation, see Navigation in single session SLAM mode or Navigation in localisation mode using a prerecorded map.

  6. Interact with the Navigation Demo, set waypoints and navigation goals using navigation_monitoring_launch.py.

Set Up Robot¶

You will need to download the Slamcore ROS2 Wrapper, regardless of whether you would like to run this demo on native Ubuntu 20.04 or through a Docker container. See the Getting Started page for details on how to download the “Slamcore Tools” and “ROS2 Wrapper” Debian packages. Installing “Slamcore Tools” on a separate laptop can be useful for inspecting recorded datasets and the generated occupancy grids.

Once you have downloaded the Slamcore ROS2 Wrapper, you may continue with setup and installation following the steps below.

On Native Ubuntu 20.04, a working ROS2 installation is required before installing the Slamcore ROS2 Wrapper. Follow the steps on the Slamcore ROS2 Wrapper page for details on how to install ROS2 Foxy and the Slamcore ROS2 Wrapper.
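As a quick reference, once ROS2 Foxy is installed, the shell setup and wrapper installation typically look like the sketch below (the Debian package filename is only a placeholder; follow the Slamcore ROS2 Wrapper page for the authoritative steps):

$ # Source the ROS2 Foxy environment (assumes it is installed under /opt/ros/foxy)
$ source /opt/ros/foxy/setup.bash
$ # Install the downloaded Slamcore ROS2 Wrapper package (placeholder filename)
$ sudo apt install ./<downloaded-slamcore-ros2-wrapper>.deb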

Set up Binary Dependencies

When not using a Dockerfile, you will need to manually install a series of packages using apt.

Installing apt dependencies
$ apt-get update && \
>   apt-get upgrade -y && \
>   apt-get install --no-install-recommends --assume-yes \
>   software-properties-common \
>   udev \
>   keyboard-configuration \
>   python3-colcon-* \
>   python3-pip \
>   python3-rosdep \
>   ros-foxy-diagnostic-updater \
>   ros-foxy-ecl-build \
>   ros-foxy-joint-state-publisher \
>   ros-foxy-kobuki* \
>   ros-foxy-nav2* \
>   ros-foxy-navigation2* \
>   ros-foxy-nonpersistent-voxel-layer \
>   ros-foxy-rviz2 \
>   ros-foxy-xacro

Set up ROS2 workspace

You will have to create a new ROS2 workspace by cloning the slamcore-ros2-examples repository. This repository holds all the navigation-related nodes and configuration for enabling the demo. Before compiling the workspace, install vcstool, which is used to fetch the additional ROS2 source packages.

Install vcstool
$ pip3 install --user --upgrade vcstool

Collecting vcstool
  Downloading https://files.pythonhosted.org/packages/86/ad/01fcd69b32933321858fc5c7cf6ec1fa29daa8942d37849637a8c87c7def/vcstool-0.2.15-py3-none-any.whl (42kB)
Collecting PyYAML (from vcstool)
  Downloading https://files.pythonhosted.org/packages/7a/5b/bc0b5ab38247bba158504a410112b6c03f153c652734ece1849749e5f518/PyYAML-5.4.1-cp36-cp36m-manylinux1_x86_64.whl (640kB)
Collecting setuptools (from vcstool)
  Downloading https://files.pythonhosted.org/packages/4e/78/56aa1b5f4d8ac548755ae767d84f0be54fdd9d404197a3d9e4659d272348/setuptools-57.0.0-py3-none-any.whl (821kB)
Installing collected packages: PyYAML, setuptools, vcstool
Successfully installed PyYAML-5.4.1 setuptools-57.0.0 vcstool-0.2.15

After that, clone the repository, run vcstool and finally colcon build the packages.

Setting up ROS2 Workspace
$ git clone git@github.com:slamcore/slamcore-ros2-examples
Cloning into 'slamcore-ros2-examples'...

$ cd slamcore-ros2-examples
$ vcs import src < repos.yaml
...
=== src/kobuki_ros (git) ===
Cloning into '.'...
=== src/kobuki_ros_interfaces (git) ===
Cloning into '.'...

$ colcon build
...

$ source install/setup.bash
...

Once you have set up the new workspace, make sure you set up the appropriate udev rules to communicate with the Kobuki, as explained below.

In this case, no ROS2 installation is required before installing the Slamcore ROS2 Wrapper. Follow these instructions to set up our ROS2 Wrapper and example repository in a Docker container. This is the preferred method if you are on a platform such as NVIDIA’s Xavier NX, which does not yet support Ubuntu 20.04, or if you prefer an isolated installation.

  1. You should have downloaded the Slamcore ROS2 Wrapper for your host system from the Download Slamcore Software link on the Slamcore Portal.

  2. Clone the slamcore-ros2-examples repository.

    $ git clone https://github.com/slamcore/slamcore-ros2-examples.git
    
  3. Set the SLAMCORE_DEB variable for the current shell to the path to your downloaded Slamcore ROS2 Wrapper by using the following command.

    $ export SLAMCORE_DEB=$HOME/path/to/the/downloaded/debian/package
    
  4. Run make build from within the /slamcore-ros2-examples directory. This will build the corresponding Docker image, downloading all the necessary Debian packages, setting up the ROS workspace, etc.

    $ cd slamcore-ros2-examples/
    $ make build
    
  5. To bring up the container for the first time use the following from within the current directory:

    $ make run
    

    To bring up additional terminals in the same running container, run the following from a new terminal:

    $ make login
    

    Note

    You can run make help to see all available commands along with their descriptions.

  6. Now you should be in the container and should have /ros_ws in your $PATH. The /slamcore-ros2-examples directory on your machine has been mounted in the container inside the /ros_ws workspace. Therefore, files saved inside the /ros_ws/slamcore-ros2-examples directory in the container should appear in the /slamcore-ros2-examples directory on your machine.

    We will use vcstool to fetch the additional ROS2 source packages needed for this demo.

    /ros_ws$ cd slamcore-ros2-examples/
    /ros_ws/slamcore-ros2-examples$ vcs import src < repos.yaml
    
  7. Build the packages from within the /ros_ws directory:

    /ros_ws/slamcore-ros2-examples$ cd ..
    /ros_ws$ colcon build
    

    Once built, source the install/setup.bash file. You will need to source this file in every new terminal.

    /ros_ws$ source install/setup.bash
    

If you would like to exit the running container, simply use Ctrl + D or type exit.

Once you have set up your container, make sure you also set up the Kobuki udev rules correctly on your system (from outside the container) to communicate with Kobuki, as detailed below.

Warning

In order to communicate with the Kobuki, you will also need to set up the appropriate udev rules. To do that, copy the 60-kobuki.rules file, available from the kobuki_ftdi repository, to the /etc/udev/rules.d directory of your system.

# Download the file or clone the repository
$ wget https://raw.githubusercontent.com/kobuki-base/kobuki_ftdi/devel/60-kobuki.rules
# Copy it to the correct directory
$ sudo cp 60-kobuki.rules /etc/udev/rules.d
$ sudo service udev reload
$ sudo service udev restart

You may need to reboot your machine for changes to take effect.
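To confirm the rules have taken effect, you can reload udev and check that the Kobuki device node appears once the robot is plugged in; this assumes the standard kobuki_ftdi rule, which creates a /dev/kobuki symlink:

$ # Reload the udev rules without rebooting
$ sudo udevadm control --reload-rules && sudo udevadm trigger
$ # With the Kobuki plugged in, the rule should create a /dev/kobuki symlink
$ ls -l /dev/kobuki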

Set Up Visualisation Machine¶

You may repeat the same steps outlined above on a second machine for visualisation. We will then be able to take advantage of ROS2’s network capabilities to visualise the topics being published by the robot on this second machine with minimal effort - as long as both machines are on the same network.
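If topics published by the robot do not appear on the visualisation machine, a common cause is mismatched ROS 2 domain IDs. As a quick check, make sure both machines use the same ROS_DOMAIN_ID (the default is 0) and that the robot’s topics are visible:

$ # Both machines must share the same domain ID for DDS discovery to work (example value shown)
$ export ROS_DOMAIN_ID=0
$ # The robot’s topics should be listed here once discovery is working
$ ros2 topic list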

Note

You may use the provided Dockerfile, even if your system runs Ubuntu 20.04, for an easier, isolated setup.

Run Visual-Inertial-Kinematic Calibration [Paid Add-on]¶

To increase the overall accuracy of the pose estimation we will fuse the wheel-odometry measurements of the robot encoders into our SLAM processing pipeline. This also makes our positioning robust to kidnapping issues (objects partially or totally blocking the camera field of view) since the algorithm can now depend on the odometry to maintain tracking.

To enable the wheel-odometry integration, follow the corresponding tutorial: Wheel Odometry Integration. After the aforementioned calibration step, you will receive a VIK configuration file similar to the one shown below:

VIK configuration file
{
  "Version": "1.1.0",
  "Patch": {
    "Base": {
      "Sensors": [
        {
          "EstimationScaleY": false,
          "InitialRotationVariance": 0.01,
          "InitialRotationVariancePoorVisual": 0.01,
          "InitialTranslationVariance": 0.01,
          "InitialTranslationVariancePoorVisual": 1e-8,
          "ReferenceFrame": "Odometry_0",
          "ScaleTheta": 0.9364361585576232,
          "ScaleX": 1.0009030227774692,
          "ScaleY": 1.0,
          "SigmaCauchyKernel": 0.0445684,
          "SigmaTheta": 1.94824,
          "SigmaX": 0.0212807,
          "SigmaY": 0.00238471,
          "TimeOffset": "42ms",
          "Type": [
            "Odometry",
            0
          ]
        }
      ],
      "StaticTransforms": [
        {
          "ChildReferenceFrame": "Odometry_0",
          "ReferenceFrame": "IMU_0",
          "T": {
            "R": [
              0.5062407414595297,
              0.4924240392715688,
              -0.4916330719846658,
              0.50944656222699
            ],
            "T": [
              0.0153912358519861,
              0.2357725115741995,
              -0.0873645490730017
            ]
          }
        }
      ]
    }
  },
  "Position": {
    "Backend": {
      "Type": "VisualInertialKinematic"
    }
  }
}

Warning

The configuration file format for running SLAM on customized parameters has changed in v23.01. This affects all VIK calibration files previously provided to you. Please see JSON configuration file migration for more information.

Compute the slamcore/base_link ➞ base_footprint Transformation¶

Before you can start navigating in the environment, you need to provide the transformation between the frame of reference of the robot base, base_footprint in our example, and the frame of reference of the Slamcore pose estimation algorithm, i.e. slamcore/base_link.

The translation and rotation parts of this transform should be specified in the xyz and rpy fields of the slamcore_camera_to_robot_base_transform parameter in the nav2-demo-params yaml config file. You may copy this file and modify the parameters to suit your setup. You can then load this file by passing in the path to the file using the params_file argument when launching navigation.

Order of Transforms in TF - Validation of specified transform

The order of transforms in the TF tree is as follows:

map \(\rightarrow\) odom \(\rightarrow\) slamcore/base_link \(\rightarrow\) base_footprint

Note that slamcore/base_link, i.e. the frame of the Slamcore pose estimation, is the parent of base_footprint. Thus, xyz and rpy should encode the transformation of the base_footprint relative to the slamcore/base_link frame.

Also note that when the camera is pointing forwards and parallel to the robot platform surface, the axes of the slamcore/base_link frame should be:

  • Z pointing forwards

  • X pointing to the right side

  • Y pointing downwards

In contrast, the robot base_footprint axes commonly are as follows:

  • X pointing forwards

  • Y pointing to the left side

  • Z pointing upwards

You can also visually inspect the validity of your transformation using the view_model_launch.py file. Note that this might take some time to load, during which RViz2 will be unresponsive. Here’s the relative transform of the aforementioned frames in our setup:

$ ros2 launch slamcore_ros2_examples view_model_launch.py
(RViz2 view of the slamcore/base_link and base_footprint frames)

You can also refer to the Troubleshooting section for a simplified version of the overall TF Tree.

The more accurate this transform is, the better, but for now a rough estimate will do. In our case, since the camera is placed 23 cm above and 9.3 cm in front of the base_footprint, we used the following values.

slamcore_camera_to_robot_base_transform:
 ros__parameters:
   parent_frame: slamcore/base_link
   child_frame: base_footprint
   xyz: [0.015, 0.236, -0.087]
   rpy: [0.000, -1.571, 1.571]
slamcore_camera_to_robot_base_transform variables and VIK configuration parameters

Note that, if you are also integrating wheel odometry measurements, the slamcore/base_link \(\rightarrow\) base_footprint transform will be specified in two places: once in the slamcore_camera_to_robot_base_transform variables in nav2-demo-params.yaml and a second time in the kinematic parameters (see the Run Visual-Inertial-Kinematic Calibration [Paid Add-on] section) of the slam-config.json file. You can save some time by copying the T_SO.T section of the VIK config file to the slamcore_camera_to_robot_base_transform xyz values in nav2-demo-params.yaml. Note, however, that the calibration procedure cannot compute the height difference between the two frames, since it is not observable in planar motion. Therefore, you will have to measure the (Y) value of slamcore_camera_to_robot_base_transform’s xyz variable and edit it manually.

Create a Map and Run Live Navigation¶

There are two ways of running this example:

  1. Navigation in single session SLAM mode, creating a map live and navigating inside it as the robot explores a space.

  2. Navigation in localisation mode using a prerecorded map, creating a map first and then loading the map at startup.

Note

This demo assumes that the visualisation and the processing (SLAM and navigation) happen on separate machines (a SLAM Machine and a Visualisation Machine) and takes advantage of ROS2’s convenient network capabilities. If, however, you have an external monitor connected to your robot, you can also run the visualisation commands on your robot. We currently do not support launching RViz2 via docker on Jetson platforms.

Navigation in single session SLAM mode¶

In single session SLAM mode, the robot will generate a map as it moves around a space. When SLAM is stopped, the map will be discarded unless the slamcore/save_session service is called before.

You can bring up the robot, SLAM and Nav2 in single session SLAM mode with the following command on your robot. If you have created new SLAM and Nav2 configuration files, don’t forget to pass these in with the config_file and params_file arguments respectively, to override the default ones. The config_file should contain the VIK calibration parameters which will enable the robot to use Visual-Inertial-Kinematic SLAM (if you have completed a VIK calibration) and any additional Slamcore configuration parameters, detailed in SLAM Configuration.

$ ros2 launch slamcore_ros2_examples kobuki_live_navigation_launch.py \
>   config_file:=</path/to/slam/config/json> \
>   params_file:=</path/to/params/yaml/>

Note

If you have already brought up the Kobuki with our kobuki_setup_comms_launch.py launch script, to, for example, teleoperate the robot, you can launch the above file with the comms argument set to false, to only launch the navigation and SLAM components.

$ ros2 launch slamcore_ros2_examples kobuki_live_navigation_launch.py \
>   config_file:=</path/to/slam/config/json> \
>   params_file:=</path/to/params/yaml/> \
>   comms:=false

To create the map, you can teleoperate the robot, using your keyboard or a PS4 controller, by running the following in another terminal:

$ # Use the arrow keys to move around
$ ros2 run slamcore_ros2_examples kobuki_teleop_key
$ # Hold L1 and use the left joystick for Fwd/Back, right joystick for Left/Right
$ ros2 run slamcore_ros2_examples kobuki_teleop_joy

Note

If the above mapping does not work with your joystick/driver, you may try the following alternative using the joystick_mode:=old argument:

$ # Press L1 and use the left joystick for Fwd/Back, right joystick for Left/Right
$ ros2 run slamcore_ros2_examples kobuki_teleop_joy --ros-args -p joystick_mode:=old

See the PS4 Button Mapping section in the Appendix for an illustration of the PS4 controller buttons to be used.

On your visualisation machine, run the following to visualise the map being created in RViz2:

$ ros2 launch slamcore_ros2_examples navigation_monitoring_launch.py

Note

This demo assumes that the visualisation and the processing (SLAM and navigation) happen on separate machines (a SLAM Machine and a Visualisation Machine). If, however, you have an external monitor connected to your robot, you can also open RViz2 by running the above in a new terminal or by passing the use_rviz:=true argument when launching navigation with our launch file. We currently do not support launching RViz2 via Docker on Jetson platforms.

$ ros2 launch slamcore_ros2_examples kobuki_live_navigation_launch.py use_rviz:=true

Once Nav2 is up and running, see Interact with the Navigation Demo to learn how to set single goals or multiple waypoints for navigation.

If you would like to save the map to reuse it in the future, you can call the slamcore/save_session service from another terminal on your robot:

$ ros2 service call /slamcore/save_session std_srvs/Trigger

By default, the session file will be saved to the working directory from which kobuki_live_navigation_launch.py was called and SLAM will be paused while the file is being generated. If you would like the service to save the session file to a specific directory, you must set the session_save_dir parameter when launching kobuki_live_navigation_launch.py. Note that when using our Docker container, files should be saved inside the slamcore-ros2-docker directory of the container, as otherwise they will not be saved to your machine after exiting the container.
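For instance, assuming session_save_dir is exposed as a regular launch argument by kobuki_live_navigation_launch.py, the call might look like the following sketch:

$ ros2 launch slamcore_ros2_examples kobuki_live_navigation_launch.py \
>   session_save_dir:=</path/to/save/dir>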

The next time you launch navigation you can provide the path to the session file using the session_file argument as explained in the section below.

Navigation in localisation mode using a prerecorded map¶

In localisation mode, we can navigate in a map that has been recorded previously. This map can either be generated from a recorded dataset or at the end of a previous live run by using the slamcore/save_session service as mentioned above. If running on a limited compute platform, it might be preferable to record a dataset and then generate the map on a more powerful machine. Check the drop-down below to see the benefits and disadvantages of creating a map live vs. from a recorded dataset.

Pros/Cons of generating a map live vs from a recorded dataset

Instead of first recording a dataset and then creating a session and map from that dataset, you could alternatively create a session at the end of a standard SLAM run. Compared to the dataset-based approach, this has a few pros and cons worth mentioning:

  • ✅ No need to record a dataset, or move it to another machine and run SLAM there

  • ✅ You can interactively see the map as it gets built and potentially focus on the areas that are under-mapped

  • ❌ Generating a session at the end of the run may take considerably longer if you are running on a Jetson NX compared to running on an x86_64 machine.

  • ❌ You can’t modify the configuration file and see its effects as you would when having separate dataset recording and mapping steps.

  • ❌ If something goes wrong in the pose estimation or mapping procedure, you don’t have the dataset to further investigate and potentially report the issue back to Slamcore

The two options for map creation are detailed below.

We can generate a map by first recording a dataset and then processing it using Slamcore Visualiser (GUI) or slamcore_dataset_processor (command line tool).

Record Dataset to Map the Environment

We will be using the dataset_recorder.launch.py launch file to capture a dataset that contains visual-inertial, depth, and kinematic information. We’ll also use the kobuki_teleop_key script to teleoperate the robot with the keyboard. We also need to bring up the Kobuki to be able to drive it around.

First, bring up the Kobuki:

$ ros2 launch slamcore_ros2_examples kobuki_setup_comms_launch.py

Then run teleoperation on a separate terminal:

$ # For Keyboard Teleop - Use the arrow keys to move around
$ ros2 run slamcore_ros2_examples kobuki_teleop_key

$ # For Joystick Teleop - Hold L1 and use the left joystick for Fwd/Back, right joystick for Left/Right
$ ros2 run slamcore_ros2_examples kobuki_teleop_joy

Note

If the above mapping does not work with your joystick/driver, you may try the following alternative using the joystick_mode:=old argument:

$ # Press L1 and use the left joystick for Fwd/Back, right joystick for Left/Right
$ ros2 run slamcore_ros2_examples kobuki_teleop_joy --ros-args -p joystick_mode:=old

See the PS4 Button Mapping section in the Appendix for an illustration of the PS4 controller buttons to be used.

Finally, launch the Slamcore dataset recorder:

$ ros2 launch slamcore_slam dataset_recorder.launch.py \
>   override_realsense_depth:=true \
>   realsense_depth_override_value:=true \
>   odom_reading_topic:=/odom

Note that recording the kinematic measurements (by subscribing to a wheel odometry topic) is not strictly necessary, since we can generate a map using purely the visual-inertial information from the camera. Kinematics will, however, increase the overall accuracy if recorded and used.
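If you do record kinematics, it is worth confirming that the wheel-odometry topic is actually being published before starting the recorder, for example:

$ # Confirm the wheel-odometry topic exists and check its publishing rate
$ ros2 topic info /odom
$ ros2 topic hz /odom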

When you have covered all the space that you want to map, send a Ctrl-c signal to the application to stop. By default, the dataset will be saved in the current working directory, unless the output_dir argument is specified. Note that when using our Docker container, files should be saved inside the slamcore-ros2-docker directory of the container, as otherwise they will not be saved to your machine after exiting the container.

You now have to process this dataset and generate the .session file. In our case, we compressed and copied the dataset to an x86_64 machine in order to accelerate the overall procedure.

$ tar cvfz mydataset.tgz mydataset/
$ rsync --progress -avt mydataset.tgz <ip-addr-of-x86_64-machine>:
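On the x86_64 machine, extract the archive before processing it:

$ tar xvfz mydataset.tgz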

Create Session and Map for Navigation

Once you have the (uncompressed) dataset on the machine where you want to do the processing, use slamcore_visualiser to process the whole dataset and, at the end of it, save the resulting session.

$ # Launch slamcore_visualiser, enable mapping features - `-m`
$ slamcore_visualiser dataset \
>   -u mydataset/ \
>   -c /usr/share/slamcore/presets/mapping/default.json \
>   -m

Alternatively, you can use the slamcore_dataset_processor command-line tool; however, you won’t be able to visualise the map being built.

$ # Launch slamcore_dataset_processor, enable mapping features `-m` and session saving `-s`
$ slamcore_dataset_processor dataset \
>   -u mydataset/ \
>   -c /usr/share/slamcore/presets/mapping/default.json \
>   -m \
>   -s

Note

Refer to Step 2 - Prepare the mapping configuration file in case you want to tune the mapping configuration file in use or include the VIK configuration parameters.

2.5D Map and 2D Occupancy Grid being generated

Edit the Generated Session/Map

You can optionally use slamcore_session_explorer and the editing tool of your choice, e.g. GIMP, to create the final session and corresponding embedded map. See Slamcore Session Explorer for more. When done, copy the session file over to the machine that will be running SLAM, if it is not already there.

Instead of recording a dataset, we can save a session map directly when running SLAM in Height Mapping/Single Session mode. If you are already running Nav2 in single session SLAM mode with the commands shown in Navigation in single session SLAM mode, you can simply call the slamcore/save_session service to save the current map that is being generated.

$ ros2 service call /slamcore/save_session std_srvs/Trigger

Otherwise, we can create a new map from scratch interactively by simply running the robot teleoperation script and launching SLAM in height mapping mode (note that with the commands below we do not bring up Nav2, so autonomous navigation capabilities will not be available).

First, bring up the Kobuki:

$ ros2 launch slamcore_ros2_examples kobuki_setup_comms_launch.py

Then run teleoperation on a separate terminal:

$ # For Keyboard Teleop - Use the arrow keys to move around
$ ros2 run slamcore_ros2_examples kobuki_teleop_key

$ # For Joystick Teleop - Hold L1 and use the left joystick for Fwd/Back, right joystick for Left/Right
$ ros2 run slamcore_ros2_examples kobuki_teleop_joy

Note

If the above mapping does not work with your joystick/driver, you may try the following alternative using the joystick_mode:=old argument:

$ # Press L1 and use the left joystick for Fwd/Back, right joystick for Left/Right
$ ros2 run slamcore_ros2_examples kobuki_teleop_joy --ros-args -p joystick_mode:=old

See the PS4 Button Mapping section in the Appendix for an illustration of the PS4 controller buttons to be used.

Finally, launch SLAM in Height Mapping Mode:

$ ros2 launch slamcore_slam slam_publisher.launch.py generate_map2d:=true

You can visualise the map being created in RViz2 by subscribing to the /slamcore/map topic.
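To quickly confirm that the map topic is advertised before opening RViz2, you can query it from a terminal:

$ # Check that the occupancy map topic is advertised and inspect its type
$ ros2 topic list | grep slamcore
$ ros2 topic info /slamcore/map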

To save the map, call the slamcore/save_session service from another terminal on your robot:

$ ros2 service call /slamcore/save_session std_srvs/Trigger

The session file will be saved to the current working directory by default. Note that when using our Docker container, files should be saved inside the slamcore-ros2-docker directory of the container, as otherwise they will not be saved to your machine after exiting the container.

Once you have the map, you can pass the path to the session file as an argument when launching navigation. If you have created new SLAM and Nav2 configuration files, don’t forget to pass these in with the config_file and params_file arguments respectively, to override the default ones. The config_file should contain the VIK calibration parameters which will enable the robot to use Visual-Inertial-Kinematic SLAM (if you have completed a VIK calibration) and any additional Slamcore configuration parameters, detailed in SLAM Configuration.

$ ros2 launch slamcore_ros2_examples kobuki_live_navigation_launch.py \
>   session_file:=<path/to/session/file> \
>   config_file:=</path/to/slam/config/json> \
>   params_file:=</path/to/params/yaml/>

Note

If you have already brought up the Kobuki with our kobuki_setup_comms_launch.py launch script, to, for example, teleoperate the robot, you can launch the above command with the comms argument set to false, to only launch the navigation and SLAM components.

$ ros2 launch slamcore_ros2_examples kobuki_live_navigation_launch.py \
>   session_file:=<path/to/session/file> \
>   config_file:=</path/to/slam/config/json> \
>   params_file:=</path/to/params/yaml/> \
>   comms:=false

You can visualise the robot navigating in the map on a separate machine by running:

$ ros2 launch slamcore_ros2_examples navigation_monitoring_launch.py

Note

This demo assumes that the visualisation and the processing (SLAM and navigation) happen on separate machines (a SLAM Machine and a Visualisation Machine). If, however, you have an external monitor connected to your robot, you can also open RViz2 by running the above in a new terminal or by passing the use_rviz:=true argument when launching navigation with our launch file. We currently do not support launching RViz2 via Docker on Jetson platforms.

$ ros2 launch slamcore_ros2_examples kobuki_live_navigation_launch.py use_rviz:=true

If the map is not rendered in RViz2, you can launch a teleoperation node to manually drive the robot around until it relocalises in the loaded map:

$ # For Keyboard Teleop - Use the arrow keys to move around
$ ros2 run slamcore_ros2_examples kobuki_teleop_key

$ # For Joystick Teleop - Hold L1 and use the left joystick for Fwd/Back, right joystick for Left/Right
$ ros2 run slamcore_ros2_examples kobuki_teleop_joy

Note

If the above mapping does not work with your joystick/driver, you may try the following alternative using the joystick_mode:=old argument:

$ # Press L1 and use the left joystick for Fwd/Back, right joystick for Left/Right
$ ros2 run slamcore_ros2_examples kobuki_teleop_joy --ros-args -p joystick_mode:=old

See the PS4 Button Mapping section in the Appendix for an illustration of the PS4 controller buttons to be used.

Once Nav2 is up and running, see Interact with the Navigation Demo below to learn how to set single goals or multiple waypoints for navigation.

Interact with the Navigation Demo¶

At first, we may need to teleoperate the robot manually until the SLAM algorithm relocalises. We can see that the relocalisation has taken place by looking at the RViz2 view, where the local and global costmaps start getting rendered, or by subscribing to the /slamcore/pose topic, where we start seeing incoming pose messages.
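For example, from a terminal on either machine:

$ # Pose messages only start arriving once the system has relocalised
$ ros2 topic echo /slamcore/pose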

This is how the RViz2 view should look after the robot has relocalised.

Fig. 72 RViz2 view during navigation¶

Similar to ROS1, in RViz2 we can set individual navigation goals with the Nav2 Goal button or by publishing a PoseStamped message to the /goal_pose topic. The robot will then try to plan a path and start navigating towards the goal if the path is feasible.

Example - How to publish a goal to the /goal_pose topic
$ ros2 topic pub /goal_pose geometry_msgs/PoseStamped "{header: {stamp: {sec: 0}, frame_id: 'map'}, pose: {position: {x: 0.2, y: 0.0, z: 0.0}, orientation: {w: 1.0}}}"
Fig. 73 Robot navigating towards single goal¶

It is also possible to issue multiple waypoints using Nav2’s Waypoint Mode button. When Waypoint Mode is selected, we can set multiple waypoints on the map with the Nav2 Goal button. When all waypoints have been set, press the Start Navigation button and the robot will attempt to navigate through all the waypoints.

Fig. 74 Waypoint mode¶

Nav2 Configuration¶

The nav2-demo-params yaml file contains parameters, such as the planner and costmap parameters, that can be tuned to obtain the best navigation performance. The Nav2 docs include a Configuration Guide with information on the available parameters and how to use them.

In this file, we set the obstacle_layer and voxel_layer that are used in the global and local costmap for obstacle avoidance. The observation source for these is the /slamcore/local_point_cloud published by our software. This point cloud can be trimmed using a Slamcore JSON configuration file to, for example, remove points below a certain height that should not be marked as obstacles. More details on point cloud trimming can be found on the Point Cloud Configuration page.

Note

In addition to trimming the point cloud using a Slamcore JSON configuration file, we set the ROS obstacle layer’s min_obstacle_height parameter in nav2-demo-params.yaml to -0.18.

This parameter lets you set a height (measured from the map frame) above which all points are considered valid and can be marked as obstacles. All points below are simply ignored (they are not removed, as is the case with point cloud trimming). As in this example the map frame is at camera height, we want to make sure that points in the cloud that are below the camera (between the camera height (23 cm) and the floor) are included. If the map frame were at ground level, min_obstacle_height could be kept at e.g. 0.05, so that only points 5 cm above the map Z coordinate would be considered when marking obstacles.

This parameter can be used together with point cloud trimming, as is the case in this demo, or without point cloud trimming, if you would like to keep the raw local point cloud.

You may copy these files, found in the config folder, and adjust them to your setup. Remember to provide the path to the modified files when launching navigation to override the default ones.

$ ros2 launch slamcore_ros2_examples kobuki_live_navigation_launch.py \
>   session_file:=<path/to/session/file> \
>   config_file:=</path/to/slam/config/json> \
>   params_file:=</path/to/params/yaml/>

Appendix¶

Troubleshooting¶

My TF tree seems to be split into two main parts

As described in Nav2 Setup, our ROS2 Wrapper publishes both the map \(\rightarrow\) odom and odom \(\rightarrow\) base_footprint transformations in the TF tree. Thus, to avoid conflicts when publishing the transforms (note that TF allows only a single parent frame for each frame), make sure that there is no other node publishing any of the aforementioned transformations. For example, it is common for the wheel-odometry node, in our case the kobuki_node, to also publish its wheel-odometry estimates in the transformation tree. We disable this behaviour by setting the Kobuki node’s publish_tf parameter to False in our nav2-demo-params.yaml config file.
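To inspect the TF tree on your own setup, you can generate a diagram with tf2_tools (on ROS2 Foxy the executable is typically view_frames.py; on newer distributions it is view_frames):

$ # Listens to /tf for a few seconds and writes frames.pdf in the current directory
$ ros2 run tf2_tools view_frames.py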

For reference, here’s a simplified version of how the TF tree looks when executing the Slamcore Nav2 demo:

Fig. 75 Reference TF Tree during Navigation¶

PS4 Button Mapping¶

(Figure: PS4 controller button mapping used for teleoperation)

Existing issues in Nav2 / ROS2¶

  • RViz2 may crash when setting a new goal if the Controller visualisation checkbox is enabled. (See https://github.com/ros2/rviz/issues/703 for more details)
