FBM1 - “Object Perception”

From RoCKIn Wiki
Revision as of 13:45, 24 June 2015 by BRSU (Talk | contribs) (Benchmark execution)

To be updated for the 2015 RoCKIn Competition

List of objects and associated frames

moved to Section 5.1.2 (Feature Variation) + 5.1.3 (Associated Frames)

Benchmark execution

moved to Section 5.1.8 (List of Variables to be Logged)

List of variables to be logged

The robot is required to log all sensor data used to perform the benchmark (e.g., images, point clouds). The modalities for this are explained in "this document". Only relevant data is expected to be logged (e.g., the point cloud used to classify an object, or more than one if an algorithm requiring multiple point clouds is used). There are no restrictions on the frame rate: for the relevant parts of the benchmark, data can be saved at the rate at which they are acquired or produced. The log may be a rosbag or the corresponding YAML representation, as specified in the document "RoCKIn YAML Data File Specification".

The following are expected ROS topic names and corresponding data types:

  • object_class [std_msgs/String]: the recognized object class
  • object_instance [std_msgs/String]: the recognized object instance
  • object_pose2d [geometry_msgs/Pose2D]: the 2D pose of the recognized object
  • object_pose [geometry_msgs/Pose]: the 3D pose of the recognized object (if available)
  • image [sensor_msgs/Image]: sensor data used to classify the object
  • pointcloud [sensor_msgs/PointCloud2]: sensor data used to classify the object
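As a sketch of how a team might organize its log entries before serializing them (to a rosbag or to the YAML representation), the snippet below validates each record against the expected topic list. Only the topic names and ROS message types come from the list above; the `make_entry` helper and the record layout are illustrative assumptions, not part of the official specification.

```python
import json

# Expected topic names and their ROS message types, as listed above.
EXPECTED_TOPICS = {
    "object_class": "std_msgs/String",
    "object_instance": "std_msgs/String",
    "object_pose2d": "geometry_msgs/Pose2D",
    "object_pose": "geometry_msgs/Pose",
    "image": "sensor_msgs/Image",
    "pointcloud": "sensor_msgs/PointCloud2",
}

def make_entry(topic, stamp, data):
    """Wrap one message in a log record; reject unexpected topic names.

    `stamp` is a timestamp in seconds; `data` is the (already serialized)
    message payload. The record layout here is an assumption -- consult
    the "RoCKIn YAML Data File Specification" for the required format.
    """
    if topic not in EXPECTED_TOPICS:
        raise ValueError("unexpected topic: %s" % topic)
    return {"topic": topic,
            "type": EXPECTED_TOPICS[topic],
            "stamp": stamp,
            "data": data}

if __name__ == "__main__":
    entry = make_entry("object_class", 1435150000.0, "mug")
    print(json.dumps(entry, indent=2))
```

This kind of check catches misspelled topic names before the log is written, rather than during evaluation.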

+Important!+ Calibration parameters for cameras must be saved. The same applies to other sensors (e.g., Kinect) that require calibration, if a calibration procedure has been applied instead of using the default values (e.g., those provided by OpenNI).