Capturing Datasets

Whilst it is possible to run all positioning modes in real-time with a live sensor feed, for detailed evaluation it is often easier and more efficient to pre-record your datasets and process them at a later date. This allows you to repeat your tests (using the recorded dataset) and evaluate the effects of changing key localisation/mapping parameters of the system.

The positioning software has been optimised for a ground-based robot, but it will work with the sensor mounted on any mobile platform, including a drone or a headset, or simply held and moved by hand.

To fully assess all of the positioning modes, you should capture at least two datasets in your test environment.

  1. Master Dataset - This should be a single recording where the sensor is moved around the entire test environment.

  2. Evaluation Dataset - This can be a number of shorter recordings that represent the sorts of paths you expect to follow during live operation.

Note

If you do not have a supported camera, you may download the EuRoC MAV Datasets [1] to use in the other tutorials:

  1. Master Dataset: Download the file MH_03_medium.zip

  2. Evaluation Dataset: Download the file MH_01_easy.zip

  3. Download the SLAMcore configuration file capture_info.json and place it at the root of the uncompressed dataset.
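The setup steps above can be sketched as a short shell session. This is an illustration only: the `mav0` folder layout is the standard EuRoC ASL zip structure, and the placeholder `capture_info.json` created here stands in for the real configuration file downloaded from SLAMcore.

```shell
set -e
# Stand-ins so this sketch runs on its own; in practice, MH_03_medium.zip
# comes from the EuRoC download page and capture_info.json from SLAMcore.
echo '{}' > capture_info.json      # placeholder for the real config file
mkdir -p MH_03_medium/mav0         # stand-in for the uncompressed dataset

# The actual step: place the configuration file at the dataset root.
cp capture_info.json MH_03_medium/
ls MH_03_medium
```

The same layout applies to the evaluation dataset: uncompress `MH_01_easy.zip` and copy `capture_info.json` to its root as well.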

Step 1 - Capture a Master Dataset of the entire test space

The aim of this step is to record a dataset of your entire test environment to create the master point-cloud used by the Localisation using a pre-built point-cloud positioning mode (also referred to as Localisation Mode).

Step 1.1 - Launch the Dataset Recorder

Type the following into the terminal:

$ slamcore_dataset_recorder --no-depth

The --no-depth flag turns off the infrared projector and the depth stream of the RealSense sensor. SLAMcore’s positioning software uses only the passive stereo camera pair and the IMU, so these feeds are not needed; if left on, the projector’s dot pattern may reduce map accuracy. We will make use of the IR functionality later in the Mapping Software Tutorial (coming soon).

You should see the following:

_images/dataset_recorder_gui_annotated.png

Fig. 21 SLAMcore Dataset Recorder

Step 1.2 - Start the recording

Start recording your dataset by clicking the Record button in the top left:

_images/record_btn_full.png

Fig. 22 Button to start recording

Click on the folder icon (the Select Directory button) to choose where the dataset is saved. The tool will automatically create a folder named with the current date inside the location you specify here:

_images/save_dir_btn_full.png

Fig. 23 Location where dataset will be saved

Step 1.3 - Move the sensor around the test environment twice

If your sensor is mounted on a robot, manually move or drive it around the test environment with a motion and speed similar to the way it will move during live operation. If the sensor is mounted on a wearable, or is simply held by hand, move it smoothly around the environment at walking speed. Move the sensor around the entire test environment, ensuring that the camera “sees” each area, preferably from a distance of less than two metres. For optimal results, travel the path twice during the same recording session.

Step 1.4 - Stop recording

Click the Stop button to stop the recording:

_images/stop_btn_full.png

Fig. 24 Button to stop recording

Step 2 - Capture Evaluation Datasets

Once you have a full dataset of your test space, you may wish to record shorter test sequences within that space to evaluate position estimation for different test cases in Localisation Mode. You launch the recorder in the same way as before:

$ slamcore_dataset_recorder --no-depth

You can start and stop the recording, and set the save location, in the same way as described earlier.
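After each recording, a quick sanity check is to confirm that a new date-named folder has appeared under your chosen save directory. The sketch below simulates this with a temporary directory and illustrative folder names; the exact naming scheme depends on the recorder version.

```shell
set -e
# Illustration only: simulate a save directory containing two recordings
# named by capture date, as the recorder creates them.
SAVE_DIR=$(mktemp -d)
mkdir -p "$SAVE_DIR/2023-01-10_09-30-00" "$SAVE_DIR/2023-01-11_14-05-12"

# Because the names sort chronologically, the most recent recording is last:
ls "$SAVE_DIR" | sort | tail -n 1
```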

[1] https://projects.asl.ethz.ch/datasets/doku.php?id=kmavvisualinertialdatasets