Glossary

Dataset

Camera, IMU and other sensor data collected, for
example, using the SLAMcore Dataset Recorder tool. Each
dataset can be used as an input to create a session.

Gyroscope

Measures the angular velocity of the camera, from which
changes in its orientation can be tracked.

IMU

An inertial measurement unit (IMU) measures the device’s
specific force, angular rate and orientation.

Localisation Mode

In this mode, a map that was previously created in SLAM
mode is loaded into the session, and the SLAM system
localises against this existing map. New landmarks
are discovered and used for tracking in localisation
mode, but they are not added to the existing map or
saved at the end of the run. See the
Localisation Mode tutorial for more.

Loop Closure

The assertion that the system has returned to a
previously visited location, and the corresponding
update of the pose estimation beliefs.
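
As a hedged illustration only (not the SLAMcore
implementation), the Python sketch below shows one common
way loop closures are detected: comparing a place
descriptor for the current frame against descriptors of
previously stored keyframes and reporting the accumulated
drift when a match is found. The names and the similarity
threshold are hypothetical.

    import numpy as np

    # Hypothetical keyframe store: one place descriptor and one estimated
    # 2D position per previously visited keyframe.
    keyframe_descriptors = []   # e.g. bag-of-words or learned place descriptors
    keyframe_positions = []     # (x, y) estimate when each keyframe was added

    def detect_loop_closure(descriptor, position, threshold=0.9):
        """Return the accumulated drift if the current place matches an
        earlier keyframe, otherwise None."""
        for past_desc, past_pos in zip(keyframe_descriptors, keyframe_positions):
            similarity = float(
                np.dot(descriptor, past_desc)
                / (np.linalg.norm(descriptor) * np.linalg.norm(past_desc))
            )
            if similarity > threshold:
                # The gap between the current estimate and the stored
                # estimate of the same place is the drift to be corrected.
                return np.asarray(position) - np.asarray(past_pos)
        return None

In a complete system, the returned drift would be passed
to a back end (for example a pose-graph optimiser) that
redistributes the correction over the past trajectory.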

Point Cloud

In this context, a point cloud is a persistent map of
individual points plotted in three dimensions.
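
As a rough illustration of the data structure only (not
how SLAMcore stores its maps), a point cloud can be
represented as an N x 3 array of landmark positions:

    import numpy as np

    # A point cloud as an N x 3 array of landmark positions (x, y, z) in metres.
    point_cloud = np.array([
        [0.10,  0.25, 1.40],
        [0.32, -0.08, 2.10],
        [-0.51, 0.12, 0.95],
    ])

    # New landmarks are appended as they are triangulated.
    new_landmark = np.array([[0.05, 0.40, 1.80]])
    point_cloud = np.vstack([point_cloud, new_landmark])

    print(point_cloud.shape)  # (4, 3)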

ROS

The Robot Operating System is a flexible framework for
writing robot software. See https://www.ros.org/

Session

Instance during which a live camera feed or a
dataset is processed to output pose and mapping
data.

Session File

SLAMcore file with a .session file extension
containing the mapping data created after processing
a live camera feed or dataset.

SLAM

Simultaneous localisation and mapping: constructing or
updating a map of an unknown environment while
simultaneously keeping track of an agent or robot's
location within that environment.

SLAM Mode

In SLAM mode, the system tracks and stores the location
of natural features to create a live point-cloud which
is used to calculate the real-time position of the
robot. In this mode it is possible to detect locations
that have been visited before, triggering a loop closure
and correcting for any drift that may have accumulated.
See the Single Session SLAM Positioning Mode tutorial for more.

Visual-Inertial Tracking System

A system that uses visual features from camera(s) and
inertial data from an IMU to determine the position of
the device.

Visual Odometry

The process of determining the position and orientation
of the camera by analysing the camera images.
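
For illustration, the sketch below estimates the relative
rotation and (up-to-scale) translation between two
consecutive greyscale frames using OpenCV feature matching
and the essential matrix. This is a minimal two-view
example under assumed names and parameters, not the
SLAMcore pipeline.

    import cv2
    import numpy as np

    def relative_pose(img0, img1, K):
        """Estimate rotation R and translation direction t between two frames.

        img0, img1: consecutive greyscale images; K: 3x3 camera intrinsic matrix.
        """
        # Detect and describe features in both frames.
        orb = cv2.ORB_create(2000)
        kp0, des0 = orb.detectAndCompute(img0, None)
        kp1, des1 = orb.detectAndCompute(img1, None)

        # Match descriptors between the frames.
        matcher = cv2.BFMatcher(cv2.NORM_HAMMING, crossCheck=True)
        matches = matcher.match(des0, des1)
        pts0 = np.float32([kp0[m.queryIdx].pt for m in matches])
        pts1 = np.float32([kp1[m.trainIdx].pt for m in matches])

        # Recover relative camera motion from the essential matrix.
        E, mask = cv2.findEssentialMat(pts0, pts1, K, method=cv2.RANSAC)
        _, R, t, _ = cv2.recoverPose(E, pts0, pts1, K, mask=mask)
        return R, t  # translation is recovered only up to scale

A visual-inertial system additionally fuses IMU
measurements to recover metric scale and improve
robustness.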

Visual-Inertial Odometry Mode

In this mode, the system tracks the location of natural
features to create a live point-cloud and calculate the
real-time position of the robot. However, it does not
store any history of these features or the historic
position estimates of the robot. The position estimate
is smooth but subject to drift over time. See the
Visual-Inertial Odometry Mode tutorial for more.