Attention

This challenge has ended!

This documentation covers only the Real Robot Challenge 2020, which has ended. Subsequent challenges have their own documentation; see the challenge website for more information.

About the Robot Platform

../_images/finger_and_camera_names.png

Manipulators

The three fingers are referred to as “finger0”, “finger120” and “finger240”. The number denotes the approximate angular position (in degrees) at which the finger is mounted on the platform. In the robot observations, the fingers are ordered as listed above.
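For example, when reading joint vectors from the robot observations, the values can be grouped per finger. This is just a sketch: the helper name is hypothetical, and it assumes three joints per finger, concatenated in the finger order listed above.

```python
# Hypothetical helper: slice a 9-dimensional joint vector (e.g. a
# position or velocity observation) into per-finger groups, assuming
# three joints per finger in the order finger0, finger120, finger240.
FINGER_SLICES = {
    "finger0": slice(0, 3),
    "finger120": slice(3, 6),
    "finger240": slice(6, 9),
}

def joints_of(finger_name, joint_vector):
    """Return the three joint values belonging to the given finger."""
    return joint_vector[FINGER_SLICES[finger_name]]
```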

Initial Position and Limits

If all joint positions are set to zero, the fingers point straight down. Note, however, that this configuration is not reachable, as the fingers would collide with the floor.

The following table shows the initial position (after initializing the robot) as well as the lower and upper limits for each joint (all values in radians).

                  initial    lower limit    upper limit
  upper joint     0          -0.33          1.0
  middle joint    0.9        0.0            1.57
  lower joint     -1.7       -2.7           0.0

The limits are enforced in software. If an action contains a position command outside of the limits, the value is clipped to the limit. Likewise, if a joint moves out of its limits, actions for that joint are overwritten by a position command pushing it back inside the valid range. These modifications of the desired action can be seen by looking at the applied action. If a joint exceeds the limits by too much, the robot stops with an error.
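The clipping of out-of-range position commands can be sketched as follows. This is only an illustration, not the actual software; the limit values are taken from the table above.

```python
import numpy as np

# Joint limits from the table above, repeated for the three fingers
# (order per finger: upper, middle, lower joint).
LOWER = np.array([-0.33, 0.0, -2.7] * 3)
UPPER = np.array([1.0, 1.57, 0.0] * 3)

def clamp_position_command(desired):
    """Clip a desired joint position command to the valid range,
    mimicking how out-of-range commands are set to the limit value."""
    return np.clip(desired, LOWER, UPPER)
```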

Note

When starting a job, the robot moves to the initial position, but the motors are then disabled until the first action is sent by the user. The fingers will “fall down” a bit due to gravity, so the actual position in the first robot observation will not exactly match the initial position specified above.

Torque and Velocity Limits

To avoid damage to the system, the maximum torque on all joints is limited to +/- 0.396 Nm.

In addition, velocity damping is applied to prevent the robot from moving too fast.
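Conceptually, the torque limit and damping amount to something like the following sketch. The 0.396 Nm value is from above; the damping gain here is a purely illustrative placeholder, not the value used on the real robot.

```python
import numpy as np

MAX_TORQUE = 0.396  # Nm, applies to all joints

def safe_torque(desired_torque, joint_velocity, damping_gain=0.0):
    """Subtract a simple velocity-damping term (gain is illustrative
    only) and clip the result to +/- MAX_TORQUE per joint."""
    torque = np.asarray(desired_torque) - damping_gain * np.asarray(joint_velocity)
    return np.clip(torque, -MAX_TORQUE, MAX_TORQUE)
```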

Push Sensors

Each finger is equipped with a push sensor at the finger tip. The readings of these sensors are included as tip_force in the robot observations. The value is unit-less and lies in the interval [0, 1], where a higher value corresponds to a higher force.

Note

The sensors have some offset, so they typically report a non-zero value even when there is no contact. You can compensate for this by measuring the offset at the beginning, in a position where no contact is expected (the “initial position” mentioned above is suitable for this).
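One possible offset compensation, following the suggestion above (a sketch; the function names are hypothetical, only the tip_force field is from the text):

```python
import numpy as np

def measure_tip_force_offset(tip_force_samples):
    """Average several tip_force readings taken while the fingers are
    in a contact-free pose (e.g. right after initialization)."""
    return np.mean(tip_force_samples, axis=0)

def compensated_tip_force(tip_force, offset):
    """Subtract the measured offset and clip at zero, so that a free
    finger reports (approximately) no force."""
    return np.maximum(np.asarray(tip_force) - offset, 0.0)
```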

Cameras

The platform is equipped with three cameras which are distributed as shown in the image above. The numbers in the camera names follow the same principle as in the finger names and refer to the (very) approximate angular position at which they are mounted. In the camera observations, the cameras are ordered like this:

  1. camera60

  2. camera180

  3. camera300

Specification

  • Frame rate: 10 Hz

  • Resolution: 270x270 pixels

  • Format: Raw Bayer pattern (see Converting Images)

Converting Images

For performance reasons and to keep the logs smaller, the images in the camera observations are provided as raw Bayer pattern (“BayerBG” in OpenCV’s nomenclature), so they need to be demosaiced first.

For Python we provide a utility function which performs the demosaicing and also converts the image from the underlying C++ type to a NumPy array:

from trifinger_cameras.utils import convert_image

image_bgr = convert_image(observation.cameras[0].image, format="bgr")

The format argument specifies the output format. Supported options are “bgr” (default), “rgb” and “gray”.
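To illustrate what the raw Bayer layout means, here is a minimal half-resolution demosaic in pure NumPy. It assumes each 2x2 cell is laid out as [[R, G], [G, B]] (in OpenCV's naming, the two letters in “BayerBG” refer to the second row of the pattern). For actual use, prefer convert_image above or OpenCV.

```python
import numpy as np

def demosaic_half_res(raw):
    """Half-resolution demosaic of a Bayer pattern image: each 2x2
    cell, assumed to be [[R, G], [G, B]], becomes one BGR pixel.
    Loses resolution but requires no interpolation."""
    bgr = np.empty((raw.shape[0] // 2, raw.shape[1] // 2, 3), raw.dtype)
    bgr[..., 0] = raw[1::2, 1::2]  # blue sample of each cell
    bgr[..., 1] = raw[0::2, 1::2]  # one of the two green samples
    bgr[..., 2] = raw[0::2, 0::2]  # red sample of each cell
    return bgr
```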

In C++ you can directly use OpenCV’s cvtColor:

#include <opencv2/opencv.hpp>

cv::Mat image_bgr;
cv::cvtColor(observation.cameras[0].image, image_bgr, cv::COLOR_BayerBG2BGR);

Example Images

Here is an example of how the images look:

camera60

../_images/camera60.png

camera180

../_images/camera180.png

camera300

../_images/camera300.png

Calibration Parameters

The calibration parameters can be found in /etc/trifingerpro when running a job on the robot. The relevant files are

  • /etc/trifingerpro/camera60_cropped_and_downsampled.yml

  • /etc/trifingerpro/camera180_cropped_and_downsampled.yml

  • /etc/trifingerpro/camera300_cropped_and_downsampled.yml

The files loosely follow the format used by ROS. Each file contains the camera matrix, the distortion coefficients and the transformation matrix from the world frame to the camera frame (tf_world_to_camera). Example:

camera_name: camera60
image_height: 270
image_width: 270
camera_matrix:
  rows: 3
  cols: 3
  data:
  - 292.94724166159165
  - 0.0
  - 136.68649965201232
  - 0.0
  - 295.25639707403457
  - 140.63058822128338
  - 0.0
  - 0.0
  - 1.0
distortion_coefficients:
  rows: 1
  cols: 5
  data:
  - -0.2378013349466862
  - 0.11109761570329618
  - -0.003131726005877879
  - 0.00011634395549213093
  - -0.03310351988084355
tf_world_to_camera:
  rows: 4
  cols: 4
  data:
  - -0.6955754909119396
  - 0.7184373308961673
  - 0.002795771925792812
  - -0.00569374483526391
  - 0.538670722840072
  - 0.5240940434744955
  - -0.6596561387518308
  - -0.006918245900169414
  - -0.4753891602040141
  - -0.4573386516273895
  - -0.7515513715926616
  - 0.5231744133393436
  - 0.0
  - 0.0
  - 0.0
  - 1.0
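As an illustration of how these parameters can be used, the following sketch projects a point from the world frame into pixel coordinates with a plain pinhole model. Lens distortion is ignored for brevity; the matrix values are copied from the example file above.

```python
import numpy as np

# Values copied from the example calibration file above (camera60).
camera_matrix = np.array([
    [292.94724166159165, 0.0, 136.68649965201232],
    [0.0, 295.25639707403457, 140.63058822128338],
    [0.0, 0.0, 1.0],
])
tf_world_to_camera = np.array([
    [-0.6955754909119396, 0.7184373308961673, 0.002795771925792812, -0.00569374483526391],
    [0.538670722840072, 0.5240940434744955, -0.6596561387518308, -0.006918245900169414],
    [-0.4753891602040141, -0.4573386516273895, -0.7515513715926616, 0.5231744133393436],
    [0.0, 0.0, 0.0, 1.0],
])

def project_world_point(point_xyz):
    """Project a 3D point (world frame, metres) to (u, v) pixel
    coordinates, ignoring lens distortion."""
    p_cam = tf_world_to_camera @ np.append(point_xyz, 1.0)
    uvw = camera_matrix @ p_cam[:3]
    return uvw[:2] / uvw[2]
```

Projecting the world origin this way lands roughly in the middle of the 270x270 image, which is plausible since the cameras look at the center of the arena.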