***************
Robot Interface
***************

We provide an easy-to-use interface for safely controlling the robot with
either C++ or Python 3.

Quick Start Example
===================

Sending actions to and getting observations from the robot is straightforward.
See the following example, which simply sends a constant position command:

Python:

.. literalinclude:: ../examples/trifinger_platform_frontend.py

C++:

.. literalinclude:: ../examples/trifinger_platform_frontend.cpp
   :language: c++

When using C++, you need to add the package ``robot_fingers`` as a build
dependency of your package.

.. _with_object_vs_without:

TriFingerPlatformFrontend vs TriFingerPlatformWithObjectFrontend
----------------------------------------------------------------

The examples above use different frontend types for stages 1 and 2.  This is
because in stage 1 object tracking of the cube is provided and integrated into
the camera observation, while in stage 2 no object tracking is provided (thus
resulting in a different observation type).

The API of both classes is identical; the only difference is that in the
``WithObject`` version the camera observations are of type
:class:`~trifinger_object_tracking.py_tricamera_types.TriCameraObjectObservation`,
while for the other they are of type
:class:`~trifinger_cameras.py_tricamera_types.TriCameraObservation`.


On Time Series and Time Relation of Actions and Observations
=============================================================

All data transfer between the front end (= user code) and the back end
(= robot hardware) goes through so-called *time series*.  When calling
:meth:`~robot_fingers.TriFingerPlatformFrontend.append_desired_action`, the
action is not applied immediately but is *appended* to the time series of
desired actions.  The back end runs in a loop at around 1 kHz and in each
iteration takes the next action from that time series and applies it to the
robot.
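The behaviour of the desired-action time series can be illustrated with a
small, self-contained Python sketch.  Note that this is a conceptual model
only, not the actual ``robot_interfaces`` implementation; the class and method
names below are made up for illustration:

.. code-block:: python

    from collections import deque


    class DesiredActionSeries:
        """Conceptual model of the desired-action time series (not the real API)."""

        def __init__(self, maxlen=1000):
            # only the most recent `maxlen` elements are kept
            self._buffer = deque(maxlen=maxlen)
            self._next_t = 0

        def append(self, action):
            """Append an action and return the step at which it will be applied."""
            self._buffer.append(action)
            t = self._next_t
            self._next_t += 1
            return t


    series = DesiredActionSeries()
    t0 = series.append("action A")  # t0 == 0
    t1 = series.append("action B")  # t1 == 1

Just like ``append_desired_action``, the sketched ``append`` does not apply
anything itself; it only enqueues the action and returns the time step at
which the back-end loop would pick it up.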
The time index ``t`` returned by the function indicates the time step at which
the given action will be applied.

All the get methods (e.g.
:meth:`~robot_fingers.TriFingerPlatformFrontend.get_robot_observation`)
require a time index as argument.  If the specified time step has already
passed, they immediately return the value of the corresponding step.  If it
lies in the future, the method blocks and waits until the specified time step
is reached, then returns.  Note that only the last 1000 elements are kept in
the buffer; trying to access an older time index results in an exception.


Time Relation of Actions and Observations
-----------------------------------------

The time index ``t`` returned by
:meth:`~robot_fingers.TriFingerPlatformFrontend.append_desired_action`
identifies the time step at which the given action is sent to the robot, i.e.
the moment at which the action *starts* being applied.  The observation
returned by
:meth:`~robot_fingers.TriFingerPlatformFrontend.get_robot_observation` also
belongs to this very moment.  This means the observation of step ``t``
corresponds to the moment when the action of step ``t`` just starts being
applied, so this observation is not yet affected by that action!


Send Action to Start Backend
----------------------------

At the beginning of the program execution, the back end is idle and waits for
the first action.  Only after the first action is received does it start the
loop that applies actions and writes observations to the time series.

.. important::

   This means **you first have to send an action before you can read the
   first observation!**

There are applications where an observation is needed before sending the
first real action (e.g. when the action depends on the current position).  A
safe solution in this case is to start with a zero-torque action:

Python:

.. code-block:: python

    # an action without arguments defaults to zero torque
    zero_torque_action = robot_interfaces.trifinger.Action()
    t = frontend.append_desired_action(zero_torque_action)
    first_observation = frontend.get_robot_observation(t)

C++:

.. code-block:: c++

    Action zero_torque_action = Action::Zero();
    auto t = frontend.append_desired_action(zero_torque_action);
    auto first_observation = frontend.get_robot_observation(t);


When Next Action Is Not Provided In Time
----------------------------------------

If the back end reaches a time step ``t`` for which the user has not yet
provided an action (e.g. because the user code is running slower than 1 kHz),
the back end automatically sets the desired action of step ``t`` to the one of
step ``t - 1``.  This is indicated to the user through the
:attr:`~robot_interfaces.Status.action_repetitions` field in the status
message, which contains the number of times the current action has been
repeated.


Safety Checks: Difference between Desired and Applied Action
============================================================

The *desired* action that the user sends through the front end is not
necessarily the same as the one that is actually *applied* to the robot.  The
reason is that before applying the action, some safety checks are performed
which may modify it.  The purpose of these checks is to make sure that the
robot does not break, no matter what actions are sent to it.

The actually applied action of a given time index ``t`` is returned by the
method :meth:`~robot_fingers.TriFingerPlatformFrontend.get_applied_action`.
There is also an equivalent method
:meth:`~robot_fingers.TriFingerPlatformFrontend.get_desired_action` which
returns the desired action (i.e. the one sent by the user) of that time step.

The following safety checks are performed:

- Position Limits: Prevent the joints from moving out of their allowed range.
- Velocity Damping: Prevents too fast movements.
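The repetition behaviour described above can be sketched in plain Python.
Again, this is a conceptual model rather than the real back-end code; all
names are illustrative:

.. code-block:: python

    class BackendSketch:
        """Illustrates how a missing action is filled in by repetition."""

        def __init__(self):
            self.desired_actions = {}   # time step -> desired action
            self.action_repetitions = 0  # as reported in the status message

        def step(self, t):
            """Determine the desired action for time step t."""
            if t not in self.desired_actions:
                # no action provided in time: repeat the previous one
                self.desired_actions[t] = self.desired_actions[t - 1]
                self.action_repetitions += 1
            else:
                self.action_repetitions = 0
            return self.desired_actions[t]


    backend = BackendSketch()
    backend.desired_actions[0] = "action A"
    backend.step(0)  # action was provided in time -> no repetition
    backend.step(1)  # not provided -> "action A" is repeated
    backend.step(2)  # still not provided -> repeated again

After the last call, ``action_repetitions`` is 2, which is exactly what the
corresponding status field would report on the real system.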
- Torque Limits: Limit the torque to avoid damage on collisions and
  overheating.

For more details on the safety checks, see :doc:`about_platform`.


Robot Frontend API
==================

Below is the documentation of the Python API.  For C++, see the C++
documentation of the packages robot_interfaces_ and robot_fingers_ (names of
classes and methods are the same).

.. autoclass:: robot_fingers.TriFingerPlatformFrontend
   :members:

.. class:: robot_fingers.TriFingerPlatformWithObjectFrontend

   The API of this class is the same as that of
   :class:`robot_fingers.TriFingerPlatformFrontend`, with the only difference
   that the type of the camera observation is
   :class:`~trifinger_object_tracking.py_tricamera_types.TriCameraObjectObservation`
   (including the object pose).  See also :ref:`with_object_vs_without`.


Order of Joints
---------------

Values like angular position, velocity or torque of the joints are represented
as a one-dimensional vector with one element per joint.  The order of the
joints in these vectors is as follows (see :doc:`about_platform` for the
position of the fingers in the platform):

0. Finger 0, upper joint
1. Finger 0, middle joint
2. Finger 0, lower joint
3. Finger 120, upper joint
4. Finger 120, middle joint
5. Finger 120, lower joint
6. Finger 240, upper joint
7. Finger 240, middle joint
8. Finger 240, lower joint


Actions
-------

.. autoclass:: robot_interfaces.trifinger.Action
   :members:


Observations
------------

.. autoclass:: robot_interfaces.trifinger.Observation
   :members:

.. autoclass:: trifinger_cameras.py_tricamera_types.TriCameraObservation
   :members:

.. autoclass:: trifinger_object_tracking.py_tricamera_types.TriCameraObjectObservation
   :members:

.. autoclass:: trifinger_object_tracking.py_object_tracker.ObjectPose
   :members:

.. autoclass:: trifinger_cameras.camera.CameraObservation
   :members:


Status
------

Each status message contains the fields listed below.  If an error is
reported in the status, the robot is no longer in an operational state.
In this case, trying to append another action will result in an exception.

.. autoclass:: robot_interfaces.Status

   .. attribute:: action_repetitions
      :type: int

      Number of times the current action has been repeated.  See `When Next
      Action Is Not Provided In Time`_.

   .. autoattribute:: error_status

   .. method:: get_error_message

      Get the error message.  Only meaningful if ``error_status != NO_ERROR``.

.. autoclass:: robot_interfaces.Status.ErrorStatus


Direct Simulation API
=====================

You can use the API described above in simulation to test your code before
submitting it to the actual robots, see :doc:`run_in_simulation_locally`.
However, if you want to pre-train in simulation, this may be a bit cumbersome.
In that case it may be better to use
:class:`trifinger_simulation.TriFingerPlatform` from the pre-stage, which
provides mostly the same API as
:class:`robot_fingers.TriFingerPlatformFrontend`.


Gym Environment
===============

In the example package rrc_example_package_, we provide a Gym environment
``RealRobotCubeTrajectoryEnv`` which works mostly the same as
:class:`~rrc_example_package.cube_trajectory_env.SimCubeTrajectoryEnv`, with
some modifications for the real robot.  The relevant differences are:

- Instead of the simulation, it uses
  :class:`robot_fingers.TriFingerPlatformFrontend`, so it can be used on the
  real robot.
- The observation of step ``t`` contains the robot observation of this step,
  not of step ``t + 1`` (because the latter is not possible on the real
  system).  Instead, we add the current action to the observation, as we
  describe in `this paper `_.

See the `usage example `_.

For stage 2, there is another environment ``RealRobotRearrangeDiceEnv`` which
is mostly the same but with some modifications for the :doc:`"rearrange dice"
task <../task_dice>`.

These environments serve as examples.  You may use them as they are, but you
can also modify them according to your needs.
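Interaction with these environments follows the usual Gym ``reset``/``step``
pattern.  The sketch below uses a stand-in class with the same interface,
since the real ``RealRobotCubeTrajectoryEnv`` requires access to the robot;
the observation layout and episode length here are made up for illustration
(the stand-in also mimics adding the current action to the observation, as
described above):

.. code-block:: python

    class DummyEnv:
        """Stand-in with the classic Gym interface (reset/step); illustrative only."""

        def __init__(self, episode_length=3):
            self._episode_length = episode_length
            self._t = 0

        def reset(self):
            self._t = 0
            return {"robot_position": [0.0] * 9}

        def step(self, action):
            self._t += 1
            # the current action is included in the observation
            observation = {"robot_position": [0.0] * 9, "action": action}
            reward = 0.0
            done = self._t >= self._episode_length
            return observation, reward, done, {}


    env = DummyEnv()
    observation = env.reset()
    done = False
    while not done:
        action = [0.0] * 9  # a policy would compute this from the observation
        observation, reward, done, info = env.step(action)

The same loop works unchanged with the real environment; only the environment
construction differs.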