ROS integration for the FLASH MK II. The stack is structured following the TIAGo robot's ROS integration.
The full stack has only been tested on Ubuntu 16.04 and ROS Kinetic!
Most of the code should be compatible with Python3. The only exception is the navigate.py script, which depends on the tf package. If you want to make it Python3-compatible, you also need to build tf with Python3 (see this thread).
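For reference, the tf usage in navigate.py boils down to something like the sketch below (frame names are assumptions, not the actual script); the tf Python bindings wrap a C++ extension that is built against Python 2 on Kinetic, which is why the rebuild is needed:

#!/usr/bin/env python
# Minimal sketch of a tf lookup for navigation; 'map' and 'base_link' frame names are assumptions.
import rospy
import tf

rospy.init_node('tf_lookup_example')
listener = tf.TransformListener()
listener.waitForTransform('map', 'base_link', rospy.Time(0), rospy.Duration(4.0))
# Current robot pose in the map frame (translation, rotation quaternion).
trans, rot = listener.lookupTransform('map', 'base_link', rospy.Time(0))
print(trans, rot)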
In order to install the required Python3 dependencies, we recommend creating a virtual environment and using the provided requirements.txt file (see Building the Stack).
The robot odometry is computed from planar laser scans using the rf2o_laser_odometry package. The official package has been released only for ROS Indigo, so you need to clone the repository and build it within your ROS workspace (use the provided fork):
cd <YOUR_PATH>/kinetic_ws/src
git clone https://github.com/joselpart/rf2o_laser_odometry.git
cd ..
catkin_make
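Once built, you can quickly check that laser odometry is being published with a small subscriber like the one below (the /odom_rf2o topic name is the package default and may differ depending on the launch configuration in flash_odom):

#!/usr/bin/env python
# Quick check that rf2o laser odometry is publishing; the topic name is an assumption.
import rospy
from nav_msgs.msg import Odometry

def callback(msg):
    p = msg.pose.pose.position
    rospy.loginfo('laser odom: x=%.2f y=%.2f', p.x, p.y)

rospy.init_node('odom_check')
rospy.Subscriber('/odom_rf2o', Odometry, callback)
rospy.spin()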
In order to show the battery level in RViz, the topics_rviz_plugin package is used:
cd <YOUR_PATH>/kinetic_ws/src
git clone https://gitlab.com/InstitutMaupertuis/topics_rviz_plugin.git
cd topics_rviz_plugin
git checkout kinetic
cd ../..
catkin_make
In the directory where you want to create your Python3 virtual environment do:
python3 -m venv python3_venv
source python3_venv/bin/activate
Then, clone the flash_robot stack, install the dependencies and build it:
cd <YOUR_PATH>/kinetic_ws/src
git clone https://github.com/BrutusTT/flash_robot.git
pip install -r flash_robot/requirements.txt
cd ..
catkin_make
If you followed the previous instructions, then your ROS workspace should look something like this:
kinetic_ws/
    src/
        CMakeLists.txt
        flash_robot/
            README.md
            requirements.txt
            flash_2dnav/
            flash_behaviors/
            flash_bringup/
            flash_controller/
            flash_description/
            flash_experiments/
            flash_maps/
            flash_odom/
            flash_robot/
        rf2o_laser_odometry/
        topics_rviz_plugin/
    build/
    devel/
This package contains configuration files and launch files for navigation and mapping. We use gmapping for mapping and amcl for localization, available from the navigation stack. You can change the configuration of the planners and the localization algorithm in the config files provided.
If you need to create a new map, then run:
roslaunch flash_2dnav mapping.launch
Then, you can use a joystick to teleoperate the robot around (see flash_controller). Once you are happy with the map, you can save it:
# If you don't provide a directory, it will be saved in the current directory.
rosrun map_server map_saver -f <MAP_NAME>
To navigate in the created map, first change the corresponding parameter in move_base.launch to point to your map. In this file, you can also change the local planner to be used. Then, you can run it and provide goals using the available RViz plugin:
roslaunch flash_2dnav move_base.launch
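Goals can also be sent programmatically instead of through RViz; the RViz "2D Nav Goal" tool simply publishes a geometry_msgs/PoseStamped on /move_base_simple/goal, so a minimal script (the coordinates below are placeholders) looks like this:

#!/usr/bin/env python
# Send a single navigation goal the same way the RViz "2D Nav Goal" tool does.
import rospy
from geometry_msgs.msg import PoseStamped

rospy.init_node('send_goal')
pub = rospy.Publisher('/move_base_simple/goal', PoseStamped, queue_size=1, latch=True)
rospy.sleep(1.0)  # give the publisher time to connect

goal = PoseStamped()
goal.header.frame_id = 'map'
goal.header.stamp = rospy.Time.now()
goal.pose.position.x = 1.0   # placeholder coordinates in the map frame
goal.pose.position.y = 0.5
goal.pose.orientation.w = 1.0
pub.publish(goal)
rospy.sleep(1.0)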
The package also contains a Python script, navigate.py, with scripted waypoints (with respect to the provided maps) and behaviors. It should be run after move_base.launch has been run:
# Make sure your Python3 environment has been deactivated unless you have built tf with Python3!
rosrun flash_2dnav navigate.py
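The actual waypoints and behaviors are scripted inside navigate.py; the waypoint-following part is conceptually similar to the actionlib sketch below (the poses are placeholders, not the scripted waypoints):

#!/usr/bin/env python
# Conceptual sketch of driving through a list of waypoints with move_base (not the actual navigate.py).
import rospy
import actionlib
from move_base_msgs.msg import MoveBaseAction, MoveBaseGoal

waypoints = [(1.0, 0.0), (2.0, 1.5), (0.0, 0.0)]  # placeholder map coordinates

rospy.init_node('waypoint_sketch')
client = actionlib.SimpleActionClient('move_base', MoveBaseAction)
client.wait_for_server()

for x, y in waypoints:
    goal = MoveBaseGoal()
    goal.target_pose.header.frame_id = 'map'
    goal.target_pose.header.stamp = rospy.Time.now()
    goal.target_pose.pose.position.x = x
    goal.target_pose.pose.position.y = y
    goal.target_pose.pose.orientation.w = 1.0
    client.send_goal(goal)
    client.wait_for_result()
    rospy.loginfo('Reached (%.1f, %.1f), state: %s', x, y, client.get_state())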
This package contains various action servers for speech, behaviors, gestures, etc. It also contains a configuration file that defines a series of URBI functions that are loaded onto the robot when the action server starts.
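The action servers in flash_behaviors are used through standard actionlib clients; the exact server names and action definitions are declared in this package, so the snippet below only illustrates the pattern with a hypothetical speech server and goal field:

#!/usr/bin/env python
# Illustrative client only: 'flash_speech', SpeechAction and the 'text' field are hypothetical names,
# check flash_behaviors for the real action servers and message definitions.
import rospy
import actionlib
from flash_behaviors.msg import SpeechAction, SpeechGoal  # hypothetical action type

rospy.init_node('speech_client_example')
client = actionlib.SimpleActionClient('flash_speech', SpeechAction)
client.wait_for_server()
client.send_goal(SpeechGoal(text='Hello, I am FLASH.'))
client.wait_for_result()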
The launch file available here is meant to set up the system to run experiments since it starts all the necessary components.
This package contains the joystick configuration file and a few demo launch files.
This package contains all the interfaces between ROS and URBI and the nodes that publish sensor data and subscribe to commands.
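From the ROS side these interfaces look like ordinary topics. For example, assuming the base accepts standard geometry_msgs/Twist velocity commands on cmd_vel (the topic name is an assumption; check the nodes in this package), driving the robot forward briefly would look like:

#!/usr/bin/env python
# Example velocity command; the cmd_vel topic name is an assumption, see flash_controller for the real interfaces.
import rospy
from geometry_msgs.msg import Twist

rospy.init_node('drive_forward_example')
pub = rospy.Publisher('cmd_vel', Twist, queue_size=1)
rate = rospy.Rate(10)
cmd = Twist()
cmd.linear.x = 0.1  # m/s, slow forward motion

t_end = rospy.Time.now() + rospy.Duration(2.0)
while not rospy.is_shutdown() and rospy.Time.now() < t_end:
    pub.publish(cmd)
    rate.sleep()
pub.publish(Twist())  # stop the robot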
This package contains description files for the flash robot. At the moment, these are dummy files mainly for visualization in RViz.
This package contains code for running the experiment for the human condition. In summary, there is a launch file called experiment_human.launch that runs a node that plays sounds and a node that listens for signals from the PsychoPy process and sends a request to the sound-playing node when a signal arrives:
roslaunch flash_experiments experiment_human.launch
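The topic and message names used by these two nodes are defined inside flash_experiments; the listener node follows the usual pattern of subscribing to the signal topic and forwarding a request to the sound player, roughly as in this hypothetical sketch:

#!/usr/bin/env python
# Illustrative only: topic names and message types are hypothetical, see flash_experiments for the real nodes.
import rospy
from std_msgs.msg import String

def on_signal(msg):
    # Forward a play request to the sound-playing node whenever a PsychoPy signal arrives.
    sound_pub.publish(String(data=msg.data))
    rospy.loginfo('Signal %s received, sound requested', msg.data)

rospy.init_node('signal_listener_sketch')
sound_pub = rospy.Publisher('play_sound', String, queue_size=1)  # hypothetical topic
rospy.Subscriber('psychopy_signal', String, on_signal)           # hypothetical topic
rospy.spin()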
This package is where the maps should be stored.
This package contains a launch file that configures and runs the odometry node.
Metapackage.