This challenge has ended!

This documentation is only for the Real Robot Challenge 2020, which has ended. Subsequent challenges have their own documentation; see the challenge website for more information.

Robot/Camera Data Files

During execution, all data from the robot and the cameras is stored in the files robot_data.dat and camera_data.dat.

When running locally in simulation, these files are written to the directory specified with the --output-dir argument. When running on the actual robots, you can download them once the job has finished; see Accessing Recorded Data.

The data is stored in a custom binary format. We provide APIs to load them.

Load Recorded Robot Data

Use the robot log reader class to load the log file.

Python Example:

import robot_interfaces

if __name__ == "__main__":
    log = robot_interfaces.trifinger.BinaryLogReader("robot_data.dat")
    for entry in log.data:
        print("Step {}: Torque: {}".format(
            entry.timeindex, entry.observation.torque))

You can also convert the binary file to a plain-text CSV file using a conversion script provided by the robot_fingers package. You can easily execute it from the challenge Singularity image:

./realrobotchallenge.sif rosrun robot_fingers \
    robot_data.dat robot_data.csv

After the conversion, you can use the provided plotting script to plot data from the CSV file. For example, to plot the observed position of the first three joints:

./realrobotchallenge.sif rosrun robot_fingers \
    robot_data.csv observation_position_{0,1,2}
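If you prefer to process the CSV file in your own scripts, it can be read with standard tools. Below is a minimal sketch using Python's csv module. The column names observation_position_0 to observation_position_2 are taken from the plotting example above; the time_index column name and the sample values are assumptions for illustration and may differ from the actual file.

```python
import csv
import io

# Made-up sample standing in for robot_data.csv (one row per control step).
# Column names are assumptions; check the header of your actual file.
sample = """\
time_index,observation_position_0,observation_position_1,observation_position_2
0,0.10,0.20,0.30
1,0.11,0.21,0.31
"""


def read_joint_positions(csv_file):
    """Return a list of (time_index, [pos0, pos1, pos2]) tuples."""
    reader = csv.DictReader(csv_file)
    rows = []
    for row in reader:
        positions = [
            float(row["observation_position_%d" % i]) for i in range(3)
        ]
        rows.append((int(row["time_index"]), positions))
    return rows


if __name__ == "__main__":
    # With a real log, pass an opened robot_data.csv instead of the sample.
    data = read_joint_positions(io.StringIO(sample))
    print(data[0])
```

From here the positions can be fed into any plotting or analysis tool of your choice.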

Load Recorded Camera Data

Python Example:

import cv2

import trifinger_object_tracking.py_tricamera_types as tricamera
from trifinger_cameras.utils import convert_image

if __name__ == "__main__":
    log_reader = tricamera.LogReader("camera_data.dat")

    for observation in log_reader.data:
        print("Object position:", observation.object_pose.position)

        image = convert_image(observation.cameras[0].image)
        cv2.imshow("camera60", image)
        cv2.waitKey(100)  # brief pause so the window is actually rendered

There are also tools to view a log file and to convert the camera stream to a video file, so it can easily be shared with others. You can execute them directly using the Singularity image.

To view the recordings of all three cameras:

./realrobotchallenge.sif rosrun trifinger_object_tracking \
    camera_data.dat

There are several options for visualizing the result of the object tracking, e.g. -v to draw the cube pose in the images. See --help for a complete list.

To convert the recordings of one camera into a video file (you need to specify one of the cameras “camera60”, “camera180”, “camera300”):

./realrobotchallenge.sif rosrun trifinger_object_tracking \
    camera_data.dat video.avi -c camera60

Load Robot and Camera Data and Synchronize

The robot and the cameras provide observations at different rates, so technically there are separate “time indices” for the robot and the camera observations. To provide some synchronization, the TriFingerPlatformFrontend class allows accessing the camera observations through robot time indices by internally mapping them to the corresponding camera time index.

The TriFingerPlatformLog class provides an API for the same kind of access when reading the log files. It is basically a wrapper around the robot and camera log readers described above, performing the same robot-index-to-camera-index mapping as the TriFingerPlatformFrontend class.

Python Example:

import robot_fingers

if __name__ == "__main__":
    log = robot_fingers.TriFingerPlatformLog("robot_data.dat",
                                             "camera_data.dat")

    for t in range(log.get_first_timeindex(), log.get_last_timeindex() + 1):
        # TriFingerPlatformLog provides the same getters as
        # TriFingerPlatformFrontend:
        robot_observation = log.get_robot_observation(t)
        camera_observation = log.get_camera_observation(t)
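The index mapping itself is not exposed, but the underlying idea can be sketched: for each robot time step, pick the most recent camera observation whose timestamp is not later than the robot step's timestamp. The following is a simplified illustration with made-up timestamps, not the actual implementation inside TriFingerPlatformLog.

```python
import bisect


def map_robot_to_camera_index(robot_timestamp, camera_timestamps):
    """Index of the latest camera observation taken at or before the
    given robot timestamp (-1 if no camera observation exists yet)."""
    return bisect.bisect_right(camera_timestamps, robot_timestamp) - 1


if __name__ == "__main__":
    # Made-up timestamps (seconds): robot at 1 kHz, cameras at 10 Hz.
    camera_timestamps = [0.0, 0.1, 0.2, 0.3]
    for step, ts in [(0, 0.000), (1, 0.001), (150, 0.150), (305, 0.305)]:
        print("robot step", step, "-> camera index",
              map_robot_to_camera_index(ts, camera_timestamps))
```

Because the camera runs much slower than the robot, many consecutive robot steps map to the same camera observation, which is exactly what get_camera_observation(t) reflects.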

API Documentation

class robot_interfaces.trifinger.BinaryLogReader(filename: str)

See read_file().

property data

Contains the log entries.



read_file(filename: str)

Read data from the specified binary robot log file.

The data is stored to data.


filename (str) – Path to the robot log file.

class robot_interfaces.trifinger.LogEntry

Represents the logged data of one time step.

timeindex: int
timestamp: float
observation: Observation
desired_action: Action
applied_action: Action
status: Status

class trifinger_object_tracking.py_tricamera_types.LogReader(filename: str)

See read_file().

data: List[trifinger_object_tracking.py_tricamera_types.TriCameraObjectObservation]

List of camera observations from the log file.

timestamps: List[float]

List of timestamps of the camera observations. This refers to the time when the observation was added to the time series. The timestamps of when the images were acquired from the cameras are stored in the observations themselves.
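Since timestamps is a plain list of floats, simple statistics are easy to compute, for example an estimate of the camera frame rate from successive differences. A minimal sketch with made-up values, assuming the timestamps are in seconds:

```python
def mean_frame_interval(timestamps):
    """Average time between successive camera observations."""
    diffs = [b - a for a, b in zip(timestamps, timestamps[1:])]
    return sum(diffs) / len(diffs)


if __name__ == "__main__":
    # Made-up timestamps of a roughly 10 Hz camera stream.
    ts = [0.0, 0.10, 0.21, 0.30, 0.40]
    interval = mean_frame_interval(ts)
    print("mean interval: %.3f s (~%.1f Hz)" % (interval, 1.0 / interval))
```

With a real log, pass log_reader.timestamps instead of the made-up list.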

read_file(filename: str)

Read data from the specified camera log file.

The data is stored in data and timestamps.


filename (str) – Path to the camera log file.

class robot_fingers.TriFingerPlatformLog(robot_log_file: str, camera_log_file: str)

Load robot and camera log and match observations like during runtime.

The robot and camera observations are provided asynchronously. To access both through a common time index, TriFingerPlatformFrontend maps “robot time indices” to the corresponding camera observations based on the timestamps. This mapping is not explicitly saved in the log files. Therefore, the TriFingerPlatformLog class provides an interface to load robot and camera logs together and performs the mapping from robot to camera time index in the same way as TriFingerPlatformFrontend does.

  • robot_log_file (str) – Path to the robot log file.

  • camera_log_file (str) – Path to the camera log file.

get_applied_action(t: int) → Action

Get actually applied action of time step t.

get_camera_observation(t: int) → trifinger_object_tracking.py_tricamera_types.TriCameraObjectObservation

Get camera observation of robot time step t.

get_desired_action(t: int) → Action

Get desired action of time step t.

get_first_timeindex() → int

Get the first time index in the log file.

get_last_timeindex() → int

Get the last time index in the log file.

get_robot_observation(t: int) → Observation

Get robot observation of time step t.

get_robot_status(t: int) → Status

Get robot status of time step t.

get_timestamp_ms(t: int) → float

Get timestamp (in milliseconds) of time step t.