***********************
Robot/Camera Data Files
***********************

During execution, all data from the robot and cameras is stored in the files
``robot_data.dat`` and ``camera_data.dat``.  When running locally in
simulation, these files are written to the directory specified with the
``--output-dir`` argument.  When running on the actual robots, you can
download them once the job has finished, see :ref:`download_log_files`.

The data is stored in a custom binary format.  We provide APIs to load it.


Load Recorded Robot Data
========================

Use the class :class:`robot_interfaces.trifinger.BinaryLogReader` to load the
robot data file.

**Python Example:**

.. literalinclude:: ../examples/robot_log_reader.py

To simply plot selected fields of the log file, you can use
``plot_trifinger_log``.  You can execute it using the challenge Singularity
image.  It expects as arguments the log file and a list of fields that are
added to the plot.  For example, to compare the desired and observed position
of joint 3::

    ./rrc2021.sif ros2 run robot_interfaces plot_trifinger_log robot_data.dat \
        desired_action.position[3] observation.position[3]

.. image:: ../images/plot_robot_log_example.png


Convert to CSV
--------------

You can also convert the binary file to a plain-text CSV file using
``robot_log_dat2csv``::

    ./rrc2021.sif ros2 run robot_fingers robot_log_dat2csv robot_data.dat robot_data.csv


Load Recorded Camera Data
=========================

.. important::

   **The camera observation type is different in Task 1 and 2.**  In Task 1
   the object pose estimation is integrated in the observation, while in
   Task 2 no object tracking is provided.  Unfortunately, this results in
   the need for separate classes/tools for handling the log files.

   The examples below are for Task 1, where object tracking is provided.
   Unless noted otherwise, they work the same for log files of Task 2 when
   replacing all occurrences of "trifinger_object_tracking" with
   "trifinger_cameras".

Use the class :class:`trifinger_object_tracking.py_tricamera_types.LogReader`
to load the camera data file.

**Python Example:**

.. literalinclude:: ../examples/camera_log_reader.py

There are also tools to view a log file and to convert the camera stream to a
video file, so it can easily be shared with others.  You can execute them
directly using the Singularity image.


View Recorded Images
--------------------

.. code-block:: bash

   ./rrc2021.sif ros2 run trifinger_object_tracking tricamera_log_viewer camera_data.dat

There are several options for visualizing the result of the object tracking,
e.g. ``-v`` to draw the cube pose in the images.  See ``--help`` for a
complete list.


.. _camera_log_to_video:

Convert to Video
----------------

To convert the recordings of one camera into a video file, specify one of the
cameras "camera60", "camera180" or "camera300":

.. code-block:: bash

   ./rrc2021.sif ros2 run trifinger_object_tracking tricamera_log_converter \
       camera_data.dat video.avi -c camera60
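The same conversion can also be scripted in Python via the log reader, e.g. to
post-process frames before encoding.  The following is a minimal sketch, not
an official tool: it assumes that each observation in ``log.data`` exposes a
``cameras`` list (ordered camera60, camera180, camera300) whose entries hold a
raw Bayer-pattern ``image`` array, so the frames are debayered with OpenCV
before writing.

.. code-block:: python

   import cv2
   import trifinger_object_tracking.py_tricamera_types as tricamera

   CAMERA_INDEX = 0  # assumption: 0 = camera60, 1 = camera180, 2 = camera300

   log = tricamera.LogReader("camera_data.dat")

   # Assumption: images are stored as raw Bayer patterns, so convert to BGR.
   frames = [
       cv2.cvtColor(obs.cameras[CAMERA_INDEX].image, cv2.COLOR_BAYER_BG2BGR)
       for obs in log.data
   ]

   # Estimate the frame rate from the log timestamps.
   duration = log.timestamps[-1] - log.timestamps[0]
   fps = (len(log.timestamps) - 1) / duration if duration > 0 else 10.0

   # Write all frames of the selected camera to a video file.
   height, width = frames[0].shape[:2]
   writer = cv2.VideoWriter(
       "video.avi", cv2.VideoWriter_fourcc(*"XVID"), fps, (width, height)
   )
   for frame in frames:
       writer.write(frame)
   writer.release()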
Extract Object Poses as CSV (Only for Task 1)
---------------------------------------------

If you are only interested in the object poses and want to convert them to a
CSV file (e.g. for easier processing with other applications):

.. code-block:: bash

   ./rrc2021.sif ros2 run trifinger_object_tracking tricamera_log_extract_object_poses \
       camera_data.dat object_poses.csv


Load Robot and Camera Data and Synchronize
==========================================

The robot and the cameras provide observations at different rates, so
technically there are separate "time indices" for the robot and the camera
observations.  To get some synchronization, the
:class:`~robot_fingers.TriFingerPlatformFrontend`/:class:`~robot_fingers.TriFingerPlatformWithObjectFrontend`
class allows accessing the camera observations using robot time indices by
internally mapping them to the corresponding camera time index.

The :class:`robot_fingers.TriFingerPlatformLog`/:class:`robot_fingers.TriFingerPlatformWithObjectLog`
class provides an API for the same kind of access when reading the log files.
It is basically a wrapper around the robot and camera log readers described
above, performing the same "robot index to camera index" mapping as the
:class:`~robot_fingers.TriFingerPlatformFrontend` class.

**Python Example:**

.. literalinclude:: ../examples/trifinger_platform_log.py

.. important::

   As shown in the example above, there are different classes for reading
   logs of stages 1 and 2.  This is equivalent to the distinction between
   the frontend classes, see :ref:`with_object_vs_without`.


API Documentation
=================

.. autoclass:: robot_interfaces.trifinger.BinaryLogReader
   :members:

.. autoclass:: robot_interfaces.trifinger.LogEntry

   .. attribute:: timeindex
      :type: int

   .. attribute:: timestamp
      :type: float

   .. attribute:: observation
      :type: Observation

   .. attribute:: desired_action
      :type: Action

   .. attribute:: applied_action
      :type: Action

   .. attribute:: status
      :type: Status

.. autoclass:: trifinger_cameras.py_tricamera_types.LogReader

   .. note::

      This is for camera logs of stage 2.  For logs of stage 1 (where the
      object pose is added to the camera observation) use
      :class:`trifinger_object_tracking.py_tricamera_types.LogReader`.

   .. attribute:: data
      :type: List[trifinger_cameras.py_tricamera_types.TriCameraObservation]

      List of camera observations from the log file.

   .. attribute:: timestamps
      :type: List[float]

      List of timestamps of the camera observations.  This refers to the
      time when the observation was added to the time series.  The
      timestamps of when the images were acquired from the cameras are
      stored in the observations themselves.

   .. automethod:: read_file

.. class:: trifinger_object_tracking.py_tricamera_types.LogReader

   Same as :class:`trifinger_cameras.py_tricamera_types.LogReader` but for
   logs with observation type
   :class:`~trifinger_object_tracking.py_tricamera_types.TriCameraObjectObservation`
   (stage 1).

.. autoclass:: robot_fingers.TriFingerPlatformLog
   :members:

   .. note::

      This is for logs of stage 2.  For logs of stage 1 (where the object
      pose is added to the camera observation) use
      :class:`robot_fingers.TriFingerPlatformWithObjectLog`.

.. class:: robot_fingers.TriFingerPlatformWithObjectLog

   The API of this class is the same as
   :class:`robot_fingers.TriFingerPlatformLog`, with the only difference
   that the type of the camera observation is
   :class:`~trifinger_object_tracking.py_tricamera_types.TriCameraObjectObservation`
   (including the object pose).  See also :ref:`with_object_vs_without`.
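To make the :class:`~robot_interfaces.trifinger.LogEntry` fields documented
above concrete, here is a small sketch that loads a robot log and computes the
mean absolute deviation between desired and observed joint positions.  It
assumes the reader exposes the parsed entries via a ``data`` attribute
(analogous to the camera log reader above) and that the ``position`` fields
are ``numpy``-compatible arrays.

.. code-block:: python

   import numpy as np
   import robot_interfaces

   log = robot_interfaces.trifinger.BinaryLogReader("robot_data.dat")

   # Each log entry provides the fields documented above: timeindex,
   # timestamp, observation, desired_action, applied_action and status.
   errors = [
       np.abs(
           np.asarray(entry.desired_action.position)
           - np.asarray(entry.observation.position)
       )
       for entry in log.data
   ]
   print("Mean absolute joint position error:", np.mean(errors))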