FBM2 - “Object Manipulation”

To be updated for the 2015 RoCKIn Competition

List of objects

The following classes and instances of objects will be used in this functionality benchmark (a minimal lookup structure is sketched after the list):

  • Containers
    • EM-01 (Aid Tray)
    • EM-02 (Coverplates Box)
  • Bearing Boxes
    • AX-01 (Bearing Box Type A)
    • AX-16 (Bearing Box Type B)
  • Transmission Parts
    • AX-02 (Bearing)
    • AX-09 (Motor with Gearbox)
    • AX-03 (Axis)
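As an illustration only, this catalogue could be kept as a simple lookup structure in the robot's software. The sketch below uses a plain Python mapping with the class and instance names from the list above; the structure itself is an arbitrary choice, not something prescribed by the benchmark.

```python
# Catalogue of object classes and instances used in FBM2, mirrored from the
# list above. The representation is only illustrative; teams may organize
# this information however they like.
OBJECT_CATALOGUE = {
    "Containers": ["EM-01", "EM-02"],                    # Aid Tray, Coverplates Box
    "Bearing Boxes": ["AX-01", "AX-16"],                 # Type A, Type B
    "Transmission Parts": ["AX-02", "AX-09", "AX-03"],   # Bearing, Motor with Gearbox, Axis
}

def class_of(instance_name):
    """Return the class name of a given instance, or None if it is unknown."""
    for class_name, instances in OBJECT_CATALOGUE.items():
        if instance_name in instances:
            return class_name
    return None
```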

The following pictures show the possible object placements for the functionality benchmark:

!Images_FBM2_1.jpg! !Images_FBM2_2.jpg! !Images_FBM2_3.jpg!

Benchmark execution

  1. An object of unknown class and unknown instance will be placed on a table in front of the robot
  2. The robot must determine the object’s class and its instance within that class
  3. The robot must position the end effector in order to grasp the object
  4. The robot must grasp the object, lift it, and notify that grasping has occurred
  5. The robot must keep the grip for a given time while the referee verifies the lifting
  6. The preceding steps are repeated with different objects
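Steps 1-5 can be sketched, for a single object, as the routine below. The `robot` object and its method names are hypothetical placeholders for a team's own perception and manipulation stack, and the hold duration is the time prescribed by the Rule Book; none of these names are defined by the benchmark itself.

```python
import time

def run_single_object_trial(robot, hold_duration_s):
    """Hypothetical per-object routine for FBM2 (steps 1-5 above).

    `robot` is a placeholder bundling a team's own perception and
    manipulation interfaces; the method names are illustrative only."""
    # Step 2: determine the object's class and its instance within that class.
    object_class, object_instance = robot.classify_object()

    # Step 3: position the end effector in order to grasp the object.
    robot.move_to_pregrasp(object_class, object_instance)

    # Step 4: grasp the object, lift it, and notify that grasping has occurred.
    robot.close_gripper()
    robot.lift()
    robot.notify_grasp(object_class, object_instance)

    # Step 5: keep the grip for the given time while the referee verifies the lifting.
    time.sleep(hold_duration_s)
```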

See the Rule Book for further details.

For each presented object, the robot must produce a result consisting of:

  • object class name [string]
  • object instance name [string]

Example of expected result:

object_class_name: Containers
object_instance_name: EM-02


RefBox-Communication

The communication between the refbox and the robot is relatively straightforward (a minimal client loop is sketched after the list):

  1. The robot sends a "BeaconSignal":https://github.com/mas-group/rockin-refbox/blob/rockin/rockin/msgs/BeaconSignal.proto message at least every second.
  2. The robot waits for "BenchmarkState":https://github.com/mas-group/rockin-refbox/blob/rockin/rockin/msgs/BenchmarkState.proto messages. The robot is supposed to grasp the object(s) as soon as _state_ is equal to _RUNNING_. Otherwise, the robot should wait until the value changes from _PAUSED_ to _RUNNING_.
  3. As soon as the robot has produced the online benchmarking data to be sent to the referee box (see section "List of variables to be logged"), it has to send a "BenchmarkFeedback":https://github.com/mas-group/rockin-refbox/blob/rockin/rockin/msgs/BenchmarkFeedback.proto message with the required recognition results back to the referee box. The robot should keep sending it until the _state_ variable of the "BenchmarkState":https://github.com/mas-group/rockin-refbox/blob/rockin/rockin/msgs/BenchmarkState.proto messages changes from _RUNNING_ to _PAUSED_, and should then continue with step 2.
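A minimal client loop for these three steps might look like the sketch below. It assumes the Python modules generated by protoc from the linked .proto files, that the _state_ enum values are exposed as BenchmarkState.RUNNING / BenchmarkState.PAUSED, and hypothetical send_message() / latest_benchmark_state() helpers wrapping the team's transport to the referee box; hosts, ports, message fields, and framing are not shown here and must be taken from the refbox documentation.

```python
import time

# Hypothetical transport helpers; the actual connection to the referee box
# (host, port, protobuf framing) must follow the refbox documentation.
from my_refbox_transport import send_message, latest_benchmark_state

# Classes generated by protoc from the .proto files linked above.
from BeaconSignal_pb2 import BeaconSignal
from BenchmarkState_pb2 import BenchmarkState

def refbox_loop(produce_feedback):
    """Run the FBM2 communication loop described above.

    produce_feedback() is a team-provided callable that returns a filled
    BenchmarkFeedback message once the recognition result is available."""
    last_beacon = 0.0
    while True:
        # Step 1: send a BeaconSignal at least once per second.
        now = time.time()
        if now - last_beacon >= 1.0:
            send_message(BeaconSignal())  # required fields (team/peer name, time, ...) omitted here
            last_beacon = now

        # Step 2: wait until the benchmark state is RUNNING.
        state = latest_benchmark_state()
        if state is None or state.state != BenchmarkState.RUNNING:
            time.sleep(0.1)
            continue

        # Step 3: grasp, produce the benchmarking data, and keep sending the
        # BenchmarkFeedback until the state goes back to PAUSED.
        feedback = produce_feedback()
        while state is not None and state.state == BenchmarkState.RUNNING:
            send_message(feedback)  # a real client would also keep sending beacons here
            time.sleep(0.5)
            state = latest_benchmark_state()
```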


List of variables to be logged

The robot is required to log any sensor data used to perform the benchmark (e.g., images, point clouds). The modalities for this are explained by "this document":http://rm.isr.ist.utl.pt/attachments/626/robot_data.txt. Only relevant data is expected to be logged (e.g., the point cloud used to classify an object, or more than one if an algorithm requiring multiple point clouds is used). There are no restrictions on the frame rate: data can be saved, for the relevant parts of the benchmark, at the rate at which they are acquired or produced. The offline log may be a rosbag or the corresponding YAML representation, as specified in document:"RoCKIn YAML Data File Specification". Online data has to be sent directly to the referee box.

Online data: In order to send online benchmarking data to the referee box, the robot has to use the "BenchmarkFeedback":https://github.com/mas-group/rockin-refbox/blob/rockin/rockin/msgs/BenchmarkFeedback.proto message.

For this FBM the following variables need to be filled in the "BenchmarkFeedback":https://github.com/mas-group/rockin-refbox/blob/rockin/rockin/msgs/BenchmarkFeedback.proto message:

  • object_class_name [string]
  • object_instance_name [string]
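As an illustration, filling these two fields with the Python class generated from the .proto file could look like the sketch below; the field names follow the expected result shown above and should be checked against the actual BenchmarkFeedback.proto definition.

```python
from BenchmarkFeedback_pb2 import BenchmarkFeedback  # generated from BenchmarkFeedback.proto

def make_feedback(object_class, object_instance):
    """Build the online feedback message for one recognized object.

    Field names follow the expected result above; verify them against
    the actual BenchmarkFeedback.proto definition."""
    feedback = BenchmarkFeedback()
    feedback.object_class_name = object_class        # e.g. "Containers"
    feedback.object_instance_name = object_instance  # e.g. "EM-02"
    return feedback
```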


The following are expected ROS topic names and corresponding data types:

  • image [sensor_msgs/Image]: sensorial data used to localize and manipulate the object
  • pointcloud [sensor_msgs/PointCloud2]: data used to localize and manipulate the object
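A minimal offline-logging sketch is shown below; it records the two topics above into a rosbag using rospy. The topic names (image, pointcloud) follow the list, while the node and bag file names are arbitrary choices for this example.

```python
#!/usr/bin/env python
import rospy
import rosbag
from sensor_msgs.msg import Image, PointCloud2

# Record only the data actually used for the benchmark into an offline rosbag.
# Topic names follow the list above; the bag file name is arbitrary.
bag = rosbag.Bag("fbm2_log.bag", "w")

def log_image(msg):
    bag.write("image", msg)

def log_pointcloud(msg):
    bag.write("pointcloud", msg)

if __name__ == "__main__":
    rospy.init_node("fbm2_logger")
    rospy.Subscriber("image", Image, log_image)
    rospy.Subscriber("pointcloud", PointCloud2, log_pointcloud)
    rospy.spin()
    bag.close()  # close the bag on shutdown so its index is written
```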

Important! Calibration parameters for cameras must be saved. The same applies to other sensors (e.g., Kinect) that require calibration, whenever a calibration procedure has been applied instead of using the default values (e.g., those provided by OpenNI).
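One way to save camera calibration parameters is to dump the sensor_msgs/CameraInfo message of each camera to a YAML file, as in the sketch below. The camera_info topic name and output file name are placeholders that will differ per robot.

```python
#!/usr/bin/env python
import yaml
import rospy
from sensor_msgs.msg import CameraInfo

def save_calibration(msg, filename="camera_calibration.yaml"):
    """Dump the calibration part of a sensor_msgs/CameraInfo message to YAML."""
    data = {
        "image_width": msg.width,
        "image_height": msg.height,
        "distortion_model": msg.distortion_model,
        "distortion_coefficients": {"rows": 1, "cols": len(msg.D), "data": list(msg.D)},
        "camera_matrix": {"rows": 3, "cols": 3, "data": list(msg.K)},
        "rectification_matrix": {"rows": 3, "cols": 3, "data": list(msg.R)},
        "projection_matrix": {"rows": 3, "cols": 4, "data": list(msg.P)},
    }
    with open(filename, "w") as f:
        yaml.safe_dump(data, f)

if __name__ == "__main__":
    rospy.init_node("save_camera_calibration")
    # "camera_info" is a placeholder; use the camera's actual camera_info topic.
    msg = rospy.wait_for_message("camera_info", CameraInfo)
    save_calibration(msg)
```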