FBM1 - "Object perception"
== List of objects and associated frames ==
Revision as of 20:55, 29 April 2015
To be updated for the 2015 RoCKIn Competition
The following list of objects and their illustrated reference frame is going to be used in this functionality benchmark:
- AX-01 Bearing Box Type A
- AX-16 Bearing Box Type B (the teams can choose which of the two types of bearing box they want to use)
- AX-02 Bearing
- AX-03 Axis
- AX-09 Motor with Gearbox
- EM-01 Aid Tray
- EM-02 Coverplates Box (this is the filecard box from the rulebook)
IMPORTANT: The pose of each object has to be delivered to the referee box in the reference frame of the table on which the object is placed. This table reference frame is marked by a QR code as depicted in the following image:
- An object of unknown class and unknown instance will be placed on a table in front of the robot
- The robot must determine the object's class, its instance within that class, and the 2D pose of the object w.r.t. the reference frame specified on the table
- The preceding steps are repeated until time runs out or 10 objects have been processed
See the Rule Book for further details.
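Since the pose must be reported in the table reference frame, a perception pipeline that estimates poses in the robot's own frame needs a frame conversion. The following is a minimal sketch of that 2D conversion; the function name and argument names are hypothetical, and the table-marker pose is assumed to be known in the same robot frame as the object pose:

```python
import math

def pose_in_table_frame(obj_x, obj_y, obj_theta,
                        table_x, table_y, table_theta):
    """Express an object's 2D pose (given in the robot's frame) in the
    table reference frame marked by the QR code.

    (obj_x, obj_y, obj_theta)     -- object pose in the robot's frame
    (table_x, table_y, table_theta) -- table-marker pose in the same frame
    """
    # Translate so the table-frame origin becomes (0, 0) ...
    dx = obj_x - table_x
    dy = obj_y - table_y
    # ... then rotate by the inverse of the table frame's orientation.
    c, s = math.cos(-table_theta), math.sin(-table_theta)
    x = c * dx - s * dy
    y = s * dx + c * dy
    # Normalise the relative orientation into (-pi, pi].
    theta = math.atan2(math.sin(obj_theta - table_theta),
                       math.cos(obj_theta - table_theta))
    return x, y, theta
```

For example, an object 1 m along the robot's y-axis from a table marker that is itself rotated by +90 degrees lies on the table frame's x-axis, i.e. at (1, 0) with zero relative orientation.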
For each presented object, the robot must produce the result consisting of:
- object class name [string]
- object instance name [string]
- object localization (x [m], y [m], theta [rad])
List of FBM-classes and FBM-instances:
- EM-01 (Aid Tray)
- EM-02 (Coverplates Box)
- Bearing Boxes
  - AX-01 (Bearing Box Type A)
  - AX-16 (Bearing Box Type B)
- Transmission Parts
  - AX-02 (Bearing)
  - AX-09 (Motor with Gearbox)
  - AX-03 (Axis)
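For bookkeeping, the class/instance hierarchy above can be captured in a simple lookup table mapping each instance to its class. This is a hypothetical helper, not part of the refbox interface; the exact strings the referee box expects are defined by the rulebook:

```python
# Instance-to-class lookup built from the list above. For EM-01 and
# EM-02 the class and instance names are assumed to coincide, since the
# list shows no enclosing class for them.
INSTANCE_TO_CLASS = {
    "EM-01": "EM-01",               # Aid Tray
    "EM-02": "EM-02",               # Coverplates Box
    "AX-01": "Bearing Boxes",       # Bearing Box Type A
    "AX-16": "Bearing Boxes",       # Bearing Box Type B
    "AX-02": "Transmission Parts",  # Bearing
    "AX-09": "Transmission Parts",  # Motor with Gearbox
    "AX-03": "Transmission Parts",  # Axis
}
```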
Example of expected result:
 object_class: Transmission Parts
 object_name: AX-09
 object_pose:
   x: 0.1
   y: 0.2
   theta: 1.23
The communication between the refbox and the robot is relatively straightforward:
- The robot sends a "BeaconSignal":https://github.com/mas-group/rockin-refbox/blob/rockin/rockin/msgs/BeaconSignal.proto message at least every second.
- The robot waits for "BenchmarkState":https://github.com/mas-group/rockin-refbox/blob/rockin/rockin/msgs/BenchmarkState.proto messages. The robot is supposed to start its object recognition as soon as _state_ is equal to _RUNNING_. Otherwise, the robot should wait until the value changes from _PAUSED_ to _RUNNING_.
- As soon as the robot has produced the benchmarking data that has to be sent online to the referee box (see the section "List of variables to be logged"), it sends a message of type "BenchmarkFeedback":https://github.com/mas-group/rockin-refbox/blob/rockin/rockin/msgs/BenchmarkFeedback.proto with the required recognition results back to the referee box. The robot should keep doing this until the _state_ variable of the "BenchmarkState":https://github.com/mas-group/rockin-refbox/blob/rockin/rockin/msgs/BenchmarkState.proto messages changes from _RUNNING_ to _PAUSED_, and then continue with step 2.
- The task benchmark ends when the _state_ variable of the "BenchmarkState":https://github.com/mas-group/rockin-refbox/blob/rockin/rockin/msgs/BenchmarkState.proto messages changes to _FINISHED_.
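The steps above amount to a simple polling loop. The sketch below illustrates the control flow only: the `refbox` object and its `send_beacon`, `benchmark_state` and `send_feedback` methods are hypothetical stand-ins for the actual protobuf transport, and `recognise` stands for the team's own perception routine:

```python
import time

# Stand-ins for the state values carried by the BenchmarkState message.
RUNNING, PAUSED, FINISHED = "RUNNING", "PAUSED", "FINISHED"

def benchmark_loop(refbox, recognise):
    """Skeleton of the refbox protocol: beacon at >= 1 Hz, wait for
    RUNNING, send feedback while RUNNING, stop on FINISHED."""
    while True:
        refbox.send_beacon()              # step 1: BeaconSignal
        state = refbox.benchmark_state()  # step 2: latest BenchmarkState
        if state == FINISHED:             # step 4: benchmark is over
            break
        if state != RUNNING:              # wait for PAUSED -> RUNNING
            time.sleep(0.5)
            continue
        result = recognise()              # classify the presented object
        if result is not None:
            refbox.send_feedback(result)  # step 3: BenchmarkFeedback
        time.sleep(0.5)                   # loop fast enough for the
                                          # 1 Hz beacon requirement
```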
== List of variables to be logged ==
The robot is required to log any sensor data used to perform the benchmark (e.g., images, point clouds). The modalities for this are explained in "this document":http://rm.isr.ist.utl.pt/attachments/626/robot_data.txt. Only relevant data is expected to be logged (e.g., the point cloud used to classify an object, or more than one if the algorithm requires multiple point clouds). There are no restrictions on the framerate: for the relevant parts of the benchmark, data can be saved at the rate it is acquired or produced. The offline log may be a rosbag or the corresponding YAML representation, as specified in the document "RoCKIn YAML Data File Specification". Online data has to be sent directly to the referee box.
Online data: In order to send online benchmarking data to the referee box, the robot has to use the "BenchmarkFeedback":https://github.com/mas-group/rockin-refbox/blob/rockin/rockin/msgs/BenchmarkFeedback.proto message.
For this FBM the following variables need to be filled in the "BenchmarkFeedback":https://github.com/mas-group/rockin-refbox/blob/rockin/rockin/msgs/BenchmarkFeedback.proto message:
- object_class_name (type: string)
- object_instance_name (type: string)
- object_pose (type: "Pose3D":https://github.com/mas-group/rockin-refbox/blob/rockin/rockin/msgs/Pose3D.proto)
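Filling these three fields might look as follows. The classes below are plain stand-ins used only for illustration; the real BenchmarkFeedback and Pose3D messages are the generated protobuf types from the rockin-refbox repository, and Pose3D may carry a full 3D position and orientation, of which this FBM only uses x, y and theta:

```python
# Stand-in for the Pose3D protobuf message, reduced to the 2D pose
# (x [m], y [m], theta [rad]) that this FBM requires.
class Pose3D:
    def __init__(self):
        self.x = 0.0
        self.y = 0.0
        self.theta = 0.0

# Stand-in for the BenchmarkFeedback protobuf message.
class BenchmarkFeedback:
    def __init__(self):
        self.object_class_name = ""
        self.object_instance_name = ""
        self.object_pose = Pose3D()

def make_feedback(class_name, instance_name, x, y, theta):
    """Assemble the feedback message for one recognised object."""
    msg = BenchmarkFeedback()
    msg.object_class_name = class_name
    msg.object_instance_name = instance_name
    msg.object_pose.x = x
    msg.object_pose.y = y
    msg.object_pose.theta = theta
    return msg
```

For the example result shown earlier, the call would be `make_feedback("Transmission Parts", "AX-09", 0.1, 0.2, 1.23)`.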
Offline data: RoCKIn requires that robots save some data when executing the benchmarks. The modalities for this are explained in "this document":http://rm.isr.ist.utl.pt/attachments/624/robot_data.txt. The following are the expected ROS topic names and corresponding data types stored in a YAML file or rosbag:
- image [sensor_msgs/Image]: sensor data used to classify the object
- pointcloud [sensor_msgs/PointCloud2]: sensor data used to classify the object
Important! Calibration parameters for cameras must be saved. This must also be done for other sensors (e.g., Kinect) that require calibration, if a calibration procedure has been applied instead of using the default values (e.g., those provided by OpenNI).