
A Sample Demo With Baxter


This is a demo with a Baxter humanoid robot in a blocks-world domain. Baxter is asked to pick up or stack blocks. There are at most three blocks, each of a different color.
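In a blocks-world domain the state is usually described with predicates such as on(A, B), on-table(A), and clear(A). The snippet below is only a small illustrative Python encoding of such a state for three blocks; it is not the representation the demo or MIDCA actually uses.

```python
# Illustrative blocks-world state for three colored blocks; not the demo's actual data structure.
# Each predicate is a tuple, so the whole state is just a set of ground facts.
state = {
    ("on-table", "red"),
    ("on-table", "green"),
    ("on", "blue", "red"),      # the blue block sits on the red block
    ("clear", "blue"),
    ("clear", "green"),
}

def is_clear(block, state):
    """A block can only be picked up, or stacked onto, if nothing is on top of it."""
    return ("clear", block) in state
```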

After opening each terminal window, first connect to Baxter: cd ~/ros_ws and then run ./baxter.sh

1- In one window: cd ~/ros_ws/src/baxter_srv, then roslaunch buffer_srv.launch

2- In another window: cd ~/ros_ws/src/baxter_srv, then roslaunch outgoing_srv.launch
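Steps 1 and 2 start the ROS services in baxter_srv that the rest of the demo talks to. The actual service definitions live in that package and are not shown here; as a generic, assumption-laden illustration of the kind of node such a launch file starts, a minimal rospy service using the standard std_srvs/Trigger type looks like this.

```python
#!/usr/bin/env python
# Illustrative only: a minimal ROS service node of the kind a .launch file starts.
# The real buffer/outgoing services in baxter_srv define their own service types and logic.
import rospy
from std_srvs.srv import Trigger, TriggerResponse

def handle_request(req):
    # Reply with a success flag and a message; a real service would do actual work here.
    return TriggerResponse(success=True, message="buffered")

if __name__ == "__main__":
    rospy.init_node("example_buffer_srv")
    rospy.Service("example_buffer", Trigger, handle_request)
    rospy.spin()
```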

3- To move the robot to its initial position: cd ~/ros_ws/src/baxter_srv/scripts
-- To tuck the arms: python tuck_cmds.py
-- To raise the right hand so the camera watches the table: rosrun baxter_examples joint_trajectory_file_playback.py -f initMove2
-- For calibration: python ObjectDetector.py
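tuck_cmds.py and the joint_trajectory_file_playback.py example come from the Baxter SDK. If you ever need to script a similar starting pose yourself, a minimal baxter_interface sketch (assuming the SDK is installed and baxter.sh has been sourced) looks like the following; it is not what the scripts above actually do internally.

```python
#!/usr/bin/env python
# Minimal sketch: enable Baxter and move the right arm to the SDK's neutral pose.
# Assumes the Baxter SDK (baxter_interface) is installed and the workspace is sourced.
import rospy
import baxter_interface

rospy.init_node("init_pose_sketch")
baxter_interface.RobotEnable().enable()   # enable the robot's motors
right = baxter_interface.Limb("right")
right.move_to_neutral()                   # blocking move to the neutral arm pose
```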

4- Run the voice recognition scripts: ~/git/baxter_cog/speech_recog/baxter_sphinx
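baxter_sphinx provides the speech input; the name suggests a CMU Sphinx / pocketsphinx setup. As an assumption, the common ROS pocketsphinx configuration publishes recognized phrases as std_msgs/String on /recognizer/output; a minimal listener for such a topic is sketched below (the topic name is not taken from baxter_sphinx itself).

```python
#!/usr/bin/env python
# Sketch of a listener for recognized speech. The topic name /recognizer/output is an
# assumption based on the usual ROS pocketsphinx setup, not on baxter_sphinx's own code.
import rospy
from std_msgs.msg import String

def on_phrase(msg):
    rospy.loginfo("heard: %s", msg.data)   # e.g. "stack the red block on the green block"

rospy.init_node("speech_listener_sketch")
rospy.Subscriber("/recognizer/output", String, on_phrase)
rospy.spin()
```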

5- Run the grabbing script: python ~/git/MIDCA/examples/_baxter/grabbing.py
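grabbing.py handles the pick-and-place actions on the MIDCA side. For reference, opening and closing Baxter's right gripper through the SDK is as simple as the sketch below; this is a minimal illustration, not the contents of grabbing.py.

```python
#!/usr/bin/env python
# Minimal sketch of gripper control via the Baxter SDK; grabbing.py itself does much more.
import rospy
import baxter_interface

rospy.init_node("gripper_sketch")
gripper = baxter_interface.Gripper("right")
gripper.calibrate()     # the electric gripper must be calibrated once after power-up
gripper.close()         # grasp
rospy.sleep(1.0)
gripper.open()          # release
```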

6- Run the object detection script: python ~/git/baxter_srv/experiment/OD.py
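OD.py performs the color-based block detection. A typical pattern for this on Baxter is to subscribe to a hand camera image topic and convert each frame with cv_bridge; the sketch below shows that generic pattern only, and the topic name and processing stub are assumptions rather than OD.py's actual code.

```python
#!/usr/bin/env python
# Generic sketch: read Baxter's right hand camera and pass each frame to a detector stub.
# The topic name and the detection step are illustrative assumptions.
import rospy
from sensor_msgs.msg import Image
from cv_bridge import CvBridge

bridge = CvBridge()

def on_image(msg):
    frame = bridge.imgmsg_to_cv2(msg, "bgr8")   # ROS Image -> OpenCV BGR array
    # ... run color segmentation / block detection on `frame` here ...

rospy.init_node("od_sketch")
rospy.Subscriber("/cameras/right_hand_camera/image", Image, on_image)
rospy.spin()
```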

7- Run MIDCA: python ~/git/MIDCA/examples/baxter_run_OD.py
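MIDCA drives the demo with its perceive/interpret/evaluate/intend/plan/act cycle, and baxter_run_OD.py wires Baxter-specific phase implementations into the architecture. The loop below is only a schematic of such a cycle written for this page; it is not MIDCA's actual API.

```python
# Schematic sketch of a MIDCA-style perceive/interpret/evaluate/intend/plan/act loop.
# Purely illustrative; baxter_run_OD.py configures the real MIDCA architecture instead.

class DemoAgent(object):
    def __init__(self, perceive, interpret, evaluate, intend, plan, act):
        # Each argument is a callable implementing one phase of the cognitive cycle.
        self.phases = [perceive, interpret, evaluate, intend, plan, act]
        self.state = {"world": {}, "goals": []}

    def run(self, cycles=10):
        for _ in range(cycles):
            for phase in self.phases:
                phase(self.state)   # each phase reads and updates the shared state
```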
