Hand Gesture Command lets a person control the BWI Segbot with hand gestures rather than traditional keyboard input.
Hand Gesture Command requires the following nodes to be running:
- background_people_perception (git clone: https://github.com/utexas-bwi/bwi_experimental)
- fri_object_following_creeper (git clone: https://github.com/JGOOSH/fri_object_following)
After starting these two nodes, you can start the program by running `hand_gesture_command`.
How to Run:
- Download the utexas-bwi library if not already installed (git clone: https://github.com/utexas-bwi/bwi)
- Download background_people_perception (git clone: https://github.com/utexas-bwi/bwi_experimental)
- Download fri_object_following_creeper (git clone: https://github.com/JGOOSH/fri_object_following)
- Download this repo (git clone: https://github.com/JGOOSH/Hand_Gesture_Command/)
- catkin_make
- source devel/setup.bash
- roslaunch bwi_launch segbot_v2.launch
- Localize the robot
- Run background people detection: roslaunch pcl_perception background_people_detection_v2.launch
- Run creeper: rosrun fri_object_following creeper
- Run hand gesture command: rosrun hand_gesture_command hand_gesture_command
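The steps above can be condensed into one shell session. This is a sketch, not part of the repository: it assumes a catkin workspace at `~/catkin_ws` with all four repositories cloned into `src/`, and each `roslaunch`/`rosrun` command blocks, so run each in its own terminal after sourcing the workspace there as well.

```shell
# Assumed layout: a catkin workspace at ~/catkin_ws with repos under src/
cd ~/catkin_ws/src
git clone https://github.com/utexas-bwi/bwi
git clone https://github.com/utexas-bwi/bwi_experimental
git clone https://github.com/JGOOSH/fri_object_following
git clone https://github.com/JGOOSH/Hand_Gesture_Command/

# Build and source the workspace
cd ~/catkin_ws
catkin_make
source devel/setup.bash

# Terminal 1: bring up the version 2 Segbot, then localize the robot
roslaunch bwi_launch segbot_v2.launch

# Terminal 2: background people detection
roslaunch pcl_perception background_people_detection_v2.launch

# Terminal 3: the creeper following node
rosrun fri_object_following creeper

# Terminal 4: hand gesture command
rosrun hand_gesture_command hand_gesture_command
```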
Created by Jamin Goo and Mehrdad Darraji
Hand tracking implementation credit to http://simena86.github.io/blog/2013/08/12/hand-tracking-and-recognition-with-opencv/
Link to working demo: https://www.youtube.com/watch?v=8Fok6OR0DR8
- Note: currently supports Segbot version 2 only.