You’re reading an older version of the Slamcore SDK documentation. The latest one is 23.01.
Localisation Mode
This positioning mode operates using a previously created, offline point-cloud map of the area. In this mode the system matches the live view from the sensor against the offline map, providing an accurate, real-time position in the offline map’s reference frame. This mode provides the most accurate, robust and computationally efficient performance. New landmarks are also discovered and used for tracking in this mode, but they will not be added to the existing map or saved at the end of the run. It is also possible to share this offline point-cloud with multiple robots so that they all calculate their positions at the same time, in the same spatial reference frame. This is described in the final tutorial of this series: Multi-Agent Localisation.
For this tutorial you will first use the master dataset captured in the Capturing Datasets tutorial to create a point-cloud map of your entire test space. This will provide the global coordinate frame. You then have the choice of running the system live or using one of the Evaluation Datasets you captured previously. The system will attempt to match each frame from the cameras to a position within the offline point-cloud and provide you with the sensor’s global position in the offline point-cloud coordinate frame.
Step 1 - Create a master configuration file
You are able to change some of the core parameters for the system using a configuration file. Create a new file localisation_mode_master.json in a convenient directory, e.g. ~/slamcore/custom/localisation_mode_master.json.
Edit the file by running this in a terminal window:
$ gedit ~/slamcore/custom/localisation_mode_master.json
Paste the following text into the editor:
{
    "Version": "1.0.0",
    "Base":
    {
        "ProfileName": "localisation_mode_master"
    },
    "Position":
    {
        "PositioningMode": "SLAM",
        "Frontend":
        {
            "NumKeypoints": 300
        }
    }
}
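If you prefer to stay in the terminal, the same file can also be written with a heredoc — a minimal sketch assuming a bash shell and that python3 is available for the optional validation step:

```shell
# Create the directory and write the configuration in one go.
mkdir -p ~/slamcore/custom
cat > ~/slamcore/custom/localisation_mode_master.json <<'EOF'
{
    "Version": "1.0.0",
    "Base":
    {
        "ProfileName": "localisation_mode_master"
    },
    "Position":
    {
        "PositioningMode": "SLAM",
        "Frontend":
        {
            "NumKeypoints": 300
        }
    }
}
EOF

# Optional: confirm the file parses as valid JSON before using it.
python3 -m json.tool ~/slamcore/custom/localisation_mode_master.json > /dev/null && echo "config OK"
```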
The ProfileName parameter can be set to any name that you wish to be displayed on the SLAMcore Visualiser when the configuration file is being used.
The positioning mode is controlled by the PositioningMode parameter, which can be set to SLAM or ODOMETRY_ONLY. For this step we will set it to SLAM, as shown above. Save the text file.
You can also use the configuration file to set other parameters that affect the performance of the system. For example, we will use this config file to set the number of features that will be detected in each frame and used to triangulate the position of the sensor. The number of features can be varied by adjusting the NumKeypoints parameter, which can be set between 10 and 500. The higher the number, the better the accuracy and robustness of the system, but the computational load will also increase. As this step does not need to operate in real-time, you can choose a higher number. We recommend >300.
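To catch an out-of-range value before launching the visualiser, you can sanity-check the parameter from the terminal. A sketch using python3, demonstrated here on a throwaway copy (point CONFIG at your real file in practice):

```shell
# Demonstrated on a temporary file; set CONFIG to your real config in practice.
CONFIG=$(mktemp)
cat > "$CONFIG" <<'EOF'
{ "Position": { "Frontend": { "NumKeypoints": 300 } } }
EOF

python3 - "$CONFIG" <<'PY'
import json, sys

with open(sys.argv[1]) as f:
    cfg = json.load(f)

n = cfg["Position"]["Frontend"]["NumKeypoints"]
# The documented supported range is 10-500.
assert 10 <= n <= 500, f"NumKeypoints {n} is outside the supported range 10-500"
print(f"NumKeypoints = {n}: within range")
PY
```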
Step 2 - Create the Offline Master Point-cloud
Step 2.1 - Launch the slamcore_visualiser tool
Launch the slamcore_visualiser tool using the master dataset you have captured of the test environment:
$ slamcore_visualiser dataset -u <path/to/top-level/folder/for/dataset> -c ~/slamcore/custom/localisation_mode_master.json
This will open the Slamcore Visualiser tool:

Fig. 33 SLAMcore Visualiser Opening Screen
Step 2.2 - Generate the Offline Master Point-cloud
Click the Start button to start processing the dataset.
The tool will now process the dataset to create a point-cloud of the space.

Fig. 34 Example Point Cloud
When the dataset has been fully processed, you will see the following message:

Fig. 35 Message when dataset finished processing
Step 2.3 - Save the Point-cloud
Now save the point-cloud by clicking the GENERATE button:
You will be asked where you want to store this file.

Fig. 36 Message displayed whilst saving dataset
The point-cloud will be saved as a single .session file, which may take anywhere from a few minutes to an hour depending on how large the environment is and how many points have been mapped.
Step 3 - Create an Evaluation Configuration File
Now that you have a fixed point-cloud map of your space, you can start to evaluate how well the system localises the robot as it moves within it.
You will need to decide how many features you wish to track per frame during the evaluation, and change the configuration file accordingly. This does not need to be the same as the setting you used to create the original point-cloud. Here real-time operation may be important, so we would recommend setting NumKeypoints to 100. Make a copy of the localisation_mode_master.json file you created earlier and name it localisation_mode_eval.json.
Edit the file by running this in a terminal window:
$ gedit ~/slamcore/custom/localisation_mode_eval.json
Edit or paste the following text into the editor and save it.
{
    "Version": "1.0.0",
    "Base":
    {
        "ProfileName": "localisation_mode_eval"
    },
    "Position":
    {
        "PositioningMode": "SLAM",
        "Frontend":
        {
            "NumKeypoints": 100
        }
    }
}
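The copy-and-edit above can also be scripted. A sketch using python3 that rewrites only the two fields which differ from the master config (for self-containment it recreates the Step 1 file first if it is missing):

```shell
mkdir -p ~/slamcore/custom
# Recreate the Step 1 master config if it is not already present.
if [ ! -f ~/slamcore/custom/localisation_mode_master.json ]; then
cat > ~/slamcore/custom/localisation_mode_master.json <<'EOF'
{
    "Version": "1.0.0",
    "Base": { "ProfileName": "localisation_mode_master" },
    "Position": { "PositioningMode": "SLAM", "Frontend": { "NumKeypoints": 300 } }
}
EOF
fi

# Copy the master config, then change only the profile name and keypoint count.
cp ~/slamcore/custom/localisation_mode_master.json ~/slamcore/custom/localisation_mode_eval.json
python3 - <<'PY'
import json, os

path = os.path.expanduser("~/slamcore/custom/localisation_mode_eval.json")
with open(path) as f:
    cfg = json.load(f)

cfg["Base"]["ProfileName"] = "localisation_mode_eval"
cfg["Position"]["Frontend"]["NumKeypoints"] = 100

with open(path, "w") as f:
    json.dump(cfg, f, indent=4)
print("wrote", path)
PY
```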
Step 4 - Evaluation: Localise within the Point-cloud
We will use the Evaluation Datasets you captured previously to localise within the point-cloud generated in the previous step. We can now process any dataset captured within the same environment and the system will calculate the robot’s trajectory in the reference frame of the pre-built point-cloud. If you wish to run the system with a live sensor feed instead, you can skip ahead to Step 5.
Step 4.1 - Load the Evaluation Datasets
In the terminal window run:
$ slamcore_visualiser dataset -u <path/to/evaluation/dataset> -c ~/slamcore/custom/localisation_mode_eval.json
This will launch the SLAMcore Visualiser tool.

Fig. 37 SLAMcore Visualiser Opening Screen
Step 4.2 - Load the Point-cloud to Run the Dataset Within
Before clicking start, you will need to load the point-cloud you generated in the previous step. Click the Load button:
Select the .session file.
Step 4.3 - Process the Evaluation Dataset
Click the Start button to start processing the dataset:
It may take a few seconds for the system to recognise its location. When this happens you will see the 6DoF pose estimation marker update in real-time within the point-cloud.

Fig. 38 View when system is localising in the saved point-cloud
The current estimated position in 6DoF, along with an estimate of the total distance travelled, is displayed in the top left-hand corner of the screen.
Step 5 - Evaluate the System with a Live Sensor Feed
If no dataset is specified when the software is launched, the system will default to processing the live feed from the sensor, as long as you have a registered RealSense camera plugged in.
Step 5.1 - Launch the Positioning Tool
To launch the positioning software, run the following in a terminal:
$ slamcore_visualiser -c ~/slamcore/custom/localisation_mode_eval.json
This will launch the main positioning tool.

Fig. 39 SLAMcore Position Opening Screen
Step 5.2 - Load the Point-cloud to Use for Live Localisation
Before clicking start, you will need to load the saved point-cloud you wish to use to localise the live sensor feed. Click the Load button and select the session file you wish to use.
Step 5.3 - Run the System Live
Click the Start button to start processing the live sensor feed:
As before, it may take a few seconds for the system to recognise its location. When this happens you will see the 6DoF pose estimation marker update in real-time. If the system is not able to localise, the 6DoF axis will remain stationary even when you move the sensor. This can normally be fixed by moving the sensor in a figure of eight.

Fig. 40 View when system is localising in the saved point-cloud
Note
The system is very tolerant to both lighting and structural change. However, if conditions differ extremely from those when the original dataset was captured, localisation mode may struggle to estimate a position. To resolve this, capture a new master dataset of the full environment, generate a point-cloud from that dataset and save this new point-cloud.