Robot/Camera Data Files

During execution, all data from the robot and cameras is stored in the files robot_data.dat and camera_data.dat.

When running locally in simulation, these files are written to the directory specified with the --output-dir argument. When running on the actual robots, you can download them once the job has finished; see Accessing Recorded Data.

The data is stored in a custom binary format. We provide APIs to load them.

Load Recorded Robot Data

Use the class robot_interfaces.trifinger.BinaryLogReader to load the robot data file.

Python Example:

import robot_interfaces

if __name__ == "__main__":
    log = robot_interfaces.trifinger.BinaryLogReader("robot_data.dat")
    for entry in log.data:
        print("Step {}: Torque: {}".format(
            entry.timeindex, entry.observation.torque
        ))

To plot selected fields of the log file, you can use plot_trifinger_log, which can be executed via the challenge Singularity image. It expects as arguments the log file and a list of fields to add to the plot. For example, to compare the desired and observed position of joint 3:

./rrc2021.sif ros2 run robot_interfaces plot_trifinger_log robot_data.dat \
    desired_action.position[3] observation.position[3]
[Image: example plot of desired vs. observed position of joint 3]

Convert to CSV

You can also convert the binary file to a plain-text CSV file using robot_log_dat2csv:

./rrc2021.sif ros2 run robot_fingers robot_log_dat2csv robot_data.dat robot_data.csv
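The resulting CSV can then be processed with standard tools. As a minimal sketch, the snippet below reads such a file with Python's csv module; note that the in-memory sample and its column names are purely illustrative stand-ins, since the actual columns depend on which fields were logged:

```python
import csv
import io

# Stand-in for a converted robot_data.csv; the real file has one column
# per logged field, so we inspect the header instead of relying on
# these illustrative names.
sample = io.StringIO(
    "time_index,timestamp,observation_position_0\n"
    "0,0.001,0.12\n"
    "1,0.002,0.13\n"
)

reader = csv.DictReader(sample)
rows = list(reader)
print("columns:", reader.fieldnames)
print("first entry:", rows[0])
```

For the real file, replace the StringIO object with `open("robot_data.csv", newline="")`.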

Load Recorded Camera Data

Important

The camera observation type differs between Task 1 and Task 2: in Task 1 the object pose estimation is integrated in the observation, while in Task 2 no object tracking is provided. Unfortunately, this requires separate classes/tools for handling the log files.

The examples below are for Task 1, where object tracking is provided. If not noted otherwise, they work the same for log files of Task 2 when replacing all occurrences of “trifinger_object_tracking” with “trifinger_cameras”.

Use the class trifinger_object_tracking.py_tricamera_types.LogReader to load the camera data file.

Python Example:

import cv2

import trifinger_object_tracking.py_tricamera_types as tricamera
from trifinger_cameras.utils import convert_image

if __name__ == "__main__":
    log_reader = tricamera.LogReader("camera_data.dat")

    for observation in log_reader.data:
        print("Object position:", observation.object_pose.position)

        image = convert_image(observation.cameras[0].image)
        cv2.imshow("camera60", image)
        cv2.waitKey(100)

There are also tools to view a log file and to convert the camera stream to a video file, so it can easily be shared with others. You can execute them directly using the Singularity image.

View Recorded Images

./rrc2021.sif ros2 run trifinger_object_tracking tricamera_log_viewer camera_data.dat

There are several options for visualizing the result of the object tracking, e.g. -v to draw the cube pose in the images. See --help for a complete list.

Convert to Video

To convert the recordings of one camera into a video file (you need to specify one of the cameras “camera60”, “camera180”, “camera300”):

./rrc2021.sif ros2 run trifinger_object_tracking tricamera_log_converter \
    camera_data.dat video.avi -c camera60

Extract Object Poses as CSV (Only for Task 1)

If you are only interested in the object poses and want to convert them to a CSV file (e.g. for easier processing with other applications):

./rrc2021.sif ros2 run trifinger_object_tracking tricamera_log_extract_object_poses \
    camera_data.dat object_poses.csv
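The extracted poses can then be analysed with ordinary CSV tooling. The following sketch computes the total path length of the object from such a file; the in-memory sample and the column names (pos_x, pos_y, pos_z) are illustrative assumptions, as the actual column layout may differ:

```python
import csv
import io
import math

# Stand-in for an extracted object_poses.csv (column names are assumed).
sample = io.StringIO(
    "timestamp,pos_x,pos_y,pos_z\n"
    "0.0,0.00,0.00,0.0325\n"
    "0.1,0.01,0.00,0.0325\n"
    "0.2,0.01,0.02,0.0325\n"
)
rows = list(csv.DictReader(sample))

# Sum the Euclidean distances between consecutive object positions.
dist = 0.0
for a, b in zip(rows, rows[1:]):
    dist += math.dist(
        [float(a[k]) for k in ("pos_x", "pos_y", "pos_z")],
        [float(b[k]) for k in ("pos_x", "pos_y", "pos_z")],
    )
print("total path length [m]:", round(dist, 3))
```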

Load Robot and Camera Data and Synchronize

The robot and the cameras provide observations at different rates, so technically there are separate "time indices" for the robot and the camera observations. To synchronize them, the TriFingerPlatformFrontend/TriFingerPlatformWithObjectFrontend class allows accessing camera observations through robot time indices by internally mapping to the corresponding camera time index.
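The idea behind this mapping can be pictured as follows: for a given robot time step, pick the most recent camera observation whose timestamp is not later than that step's timestamp. This is a self-contained sketch with made-up timestamps (robot at 1 kHz, cameras at 10 Hz), not the library's actual implementation:

```python
import bisect

# Hypothetical timestamps in seconds: 200 robot steps at 1 kHz,
# two camera frames arriving at t=0.0 and t=0.1.
robot_stamps = [t / 1000 for t in range(200)]
camera_stamps = [0.0, 0.1]

def camera_index_for(robot_t: int) -> int:
    """Index of the latest camera frame at or before robot step robot_t."""
    ts = robot_stamps[robot_t]
    return bisect.bisect_right(camera_stamps, ts) - 1

print(camera_index_for(50))   # frame 0: next frame only arrives at t=0.1
print(camera_index_for(150))  # frame 1
```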

The robot_fingers.TriFingerPlatformLog/robot_fingers.TriFingerPlatformWithObjectLog class provides an API for the same kind of access when reading the log files. It is essentially a wrapper around the robot and camera log readers described above, performing the same "robot index to camera index" mapping as the TriFingerPlatformFrontend class.

Python Example:

import robot_fingers

if __name__ == "__main__":
    # for logs of stage 1
    log = robot_fingers.TriFingerPlatformWithObjectLog("robot_data.dat", "camera_data.dat")
    # for logs of stage 2
    # log = robot_fingers.TriFingerPlatformLog("robot_data.dat", "camera_data.dat")

    for t in range(log.get_first_timeindex(), log.get_last_timeindex() + 1):
        # TriFingerPlatformLog provides the same getters as
        # TriFingerPlatformFrontend:
        robot_observation = log.get_robot_observation(t)
        camera_observation = log.get_camera_observation(t)

Important

As shown in the example above, there are different classes for reading logs of stages 1 and 2. This is equivalent to the distinction between the frontend classes; see TriFingerPlatformFrontend vs TriFingerPlatformWithObjectFrontend.

API Documentation

class robot_interfaces.trifinger.BinaryLogReader(filename: str)

See read_file().

property data

Contains the log entries.

Type: List[LogEntry]

read_file(filename: str)

Read data from the specified binary robot log file.

The data is stored to data.

Parameters: filename (str) – Path to the robot log file.

class robot_interfaces.trifinger.LogEntry

Represents the log entry of one time step.

timeindex: int
timestamp: float
observation: Observation
desired_action: Action
applied_action: Action
status: Status
class trifinger_cameras.py_tricamera_types.LogReader(filename: str)

See read_file().

Note

This is for camera logs of stage 2. For logs of stage 1 (where the object pose is added to the camera observation) use trifinger_object_tracking.py_tricamera_types.LogReader.

data: List[trifinger_cameras.py_tricamera_types.TriCameraObservation]

List of camera observations from the log file.

timestamps: List[float]

List of timestamps of the camera observations. This refers to the time when the observation was added to the time series. The timestamps of when the images were acquired from the cameras are stored in the observations themselves.

read_file(filename: str)

Read data from the specified camera log file.

The data is stored in data and timestamps.

Parameters: filename (str) – Path to the camera log file.

class trifinger_object_tracking.py_tricamera_types.LogReader

Same as trifinger_cameras.py_tricamera_types.LogReader but for logs with observation type TriCameraObjectObservation (stage 1).

class robot_fingers.TriFingerPlatformLog(robot_log_file: str, camera_log_file: str)

Load robot and camera log and match observations like during runtime.

The robot and camera observations are provided asynchronously. To access both through a common time index, TriFingerPlatformFrontend maps “robot time indices” to the corresponding camera observations based on the time stamps. This mapping is not explicitly saved in the log files. Therefore, TriFingerPlatformLog class provides an interface to load robot and camera logs together and performs the mapping from robot to camera time index in the same way as it is happening in TriFingerPlatformFrontend.

Parameters
  • robot_log_file (str) – Path to the robot log file.

  • camera_log_file (str) – Path to the camera log file.

Note

This is for logs of stage 2. For logs of stage 1 (where the object pose is added to the camera observation) use robot_fingers.TriFingerPlatformWithObjectLog.

get_applied_action(t: int) → Action

Get actually applied action of time step t.

get_camera_observation(t: int)

Get camera observation of robot time step t.

get_desired_action(t: int) → Action

Get desired action of time step t.

get_first_timeindex() → int

Get the first time index in the log file.

get_last_timeindex() → int

Get the last time index in the log file.

get_robot_observation(t: int) → Observation

Get robot observation of time step t.

get_robot_status(t: int) → Status

Get robot status of time step t.

get_timestamp_ms(t: int) → float

Get timestamp (in milliseconds) of time step t.

class robot_fingers.TriFingerPlatformWithObjectLog

The API of this class is the same as that of robot_fingers.TriFingerPlatformLog, with the only difference that the type of the camera observation is TriCameraObjectObservation (including the object pose). See also TriFingerPlatformFrontend vs TriFingerPlatformWithObjectFrontend.