
Lab Course Interactive Games and Robots WS 2011/2012

The game scene consists of a human who stands in front of a Kinect and tries to control the Robotino by bending and by rotating the shoulders. The Robotino is tracked by the robot Caesar.
A virtual path defines the level: if the Robotino leaves it, the player loses the game. The player can see the Robotino and the path on a screen. The path contains several checkpoints; reaching the last checkpoint wins the game.

robotino-model

Detection of the Robotino, Execution of the Checkpoint Game Loop

The program is stored in the labigr git branch at ~/fawkes-athome/src/plugins/perception/robotino-model/. It is built with the provided Makefile: run "make" inside that directory.
The robotino-model plugin must run on Caesar.
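
As a minimal sketch, building the plugin from a fresh shell could look as follows (assuming the fawkes-athome checkout exists on Caesar and the labigr branch can be checked out directly):

  # on Caesar: switch to the labigr branch and build the robotino-model plugin
  cd ~/fawkes-athome
  git checkout labigr
  cd src/plugins/perception/robotino-model
  make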

To run the robotino-model plugin, perform the following steps (a consolidated command overview follows the list):

  1. Start roscore in one terminal.
  2. Load the plugins in this order: ./fawkes -p static-transforms,pantilt,openni,openni-data,ros,robotino-model,ros-pcl,ros-tf
    1. If speech output is desired during the game, the "festival" plugin must also be loaded.
  3. To visualize the path, run rviz on another machine (e.g. Vespucci) with "rosrun rviz rviz".
  4. Additionally, an agent is needed to follow the Robotino: run "./skillgui" for "caesar" and also select "skiller" and "lua". The robotino-follow-agent must be set as the default agent in the config file "~/fawkes-athome/cfg/default.sql".
  5. To be safe, also run "./ffptu" on Caesar and check that the value is around 0.7. This is needed to detect the Robotino properly, since it changes the pan of the Kinect.
  6. In rviz, markers are needed that show the path and the Robotino's current position as a red ball; if you also want to see the Robotino cloud, a Point Cloud2 display is needed as well. Both are added from "Displays" (push the "Add" button and select "Markers" and "Point Cloud2").
    In "Global Options" inside "Displays", set "/map" as "Fixed Frame" and "/base_link" as "Target Frame".
    1. In "Markers", set "marker array" as the "Topic".
    2. If you want to visualize the Robotino cloud in Point Cloud2, set clusters_ as the "Topic" for Point Cloud2. Make sure its color is not red, green, black, or purple, because these cause confusion during the game; white is a good choice.
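
The commands from the steps above, summarized as one possible session (terminal assignment is illustrative; run the fawkes tools from the directory containing the binaries):

  # terminal 1 (Caesar): ROS master
  roscore

  # terminal 2 (Caesar): fawkes with the game plugins (append ",festival" for speech output)
  ./fawkes -p static-transforms,pantilt,openni,openni-data,ros,robotino-model,ros-pcl,ros-tf

  # terminal 3 (Caesar): agent control and pan check
  ./skillgui    # select "skiller" and "lua" for caesar
  ./ffptu       # verify that the pan value is around 0.7

  # terminal 4 (another machine, e.g. Vespucci): visualization
  rosrun rviz rviz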

This program is used by Caesar to detect the Robotino and to determine whether the Robotino is inside the level boundaries. Precise localization is critical to obtain reasonable results, so Caesar should be localized before the game starts. It is further recommended to position Caesar so that it starts behind the Robotino; it takes a while to find the robot again if the Robotino moves behind Caesar. Also, Caesar should not stand between the human and the Robotino, or else the player cannot see the robot, which makes it hard to control.

In the visualization, green lines and circles mark the level boundaries; leaving them will eventually cause the player to lose. Yellow circles mark the checkpoints, and the next checkpoint is drawn in purple. The user should move the centroid of the Robotino into it.

human-detect

Human Detection, Pose Recognition, and Interpretation as Robotino Movement Commands

The program "human-detect" analyzes a point cloud provided by a Microsoft Kinect, detects the human cluster, recognizes the human's poses, and interprets them as movement commands that are sent to the Robotino. It is part of the game of the lab course "Interactive Games and/with Robots 2011/12".

The program is stored in the labigr git branch at ~/fawkes-athome/src/plugins/perception/human-detect/. It is built with the provided Makefile: run "make" inside that directory.
For better speed, the human-detect plugin should run on Stoertebeker.
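
Analogous to the robotino-model plugin, a possible build sequence (assuming the labigr branch is already checked out on Stoertebeker):

  # on Stoertebeker: build the human-detect plugin
  cd ~/fawkes-athome/src/plugins/perception/human-detect
  make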

Usage (the commands are summarized after the list):

  1. Activate the Robotino.
  2. Start fawkes on the robot.
  3. (Perform the following steps on another PC, e.g. Stoertebeker.) Connect to fawkes on Robotino via ffplugingui.
  4. Activate the robotino-joystick plugin.
  5. Execute another fawkes with the following plugins: ./fawkes -p static-transforms,openni,openni-data,ros,human-detect,ros-pcl,ros-tf
    1. If you do not want visualization, execute fawkes without the ROS plugins:
      ./fawkes -p static-transforms,openni,openni-data,human-detect
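
A rough overview of the commands from the list above; how fawkes is started on the Robotino itself (step 2) depends on the robot's local setup:

  # on another PC, e.g. Stoertebeker: connect to the fawkes instance on the Robotino
  # and activate the robotino-joystick plugin
  ./ffplugingui

  # on Stoertebeker: second fawkes instance with human detection
  # (drop the ros, ros-pcl, and ros-tf plugins if no visualization is needed)
  ./fawkes -p static-transforms,openni,openni-data,ros,human-detect,ros-pcl,ros-tf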

The human should stand straight in front of the Kinect, about 2-3 meters away. Tight clothes are recommended, since the camera tries to detect the hips as the part with maximal width; a dress may not work. During the first seconds an initialization is performed. During this phase you should keep your arms down at your sides, but not pressed against your body. After this step you can control the Robotino's movement by bending forward (at most 40°) and its rotation by rotating your shoulders (at most 40°).

The camera tries to detect your head, your hips, and your shoulders. Keep your shoulders up, do not let them hang down. First try to find a position in which the Robotino does not move, then start sending commands. Do not bend your knees, because the initialization stores a fixed height for your hips. Do not turn your back towards the Kinect, since the program always assumes that your face is turned towards the camera. Lighting may influence the results. The camera sometimes has difficulties detecting black surfaces, so do not wear a black suit.

"human-detect" supports an optional visualization of its results. Perform the following steps on the machine on which "human-detect" is executed; a short summary of the commands follows the list.

  1. Start "roscore".
  2. Start "rosrun rviz rviz".
  3. In the "Displays" panel of rviz, push the "Add" button and add a Point Cloud2 display.
  4. Set the cloud provided by the program, named "human-detect-poses", as the "Topic" of the Point Cloud2 display. Make sure its color is not black.
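
In short, on the machine running "human-detect" (the display settings from steps 3 and 4 are configured in the rviz GUI):

  # ROS master and visualization
  roscore
  rosrun rviz rviz   # add a Point Cloud2 display and set its Topic to "human-detect-poses"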

The cloud shows the human points (or the whole cloud after filtering, if no human has been found), the bending vector, the shoulders, and the transformed vector which is sent as the Robotino command.

Students

Safoura Rezapour Lakani

Albert Johannes Helligrath

Advisors

Tim Niemueller

Stefan Schiffer