Commit: Update install instructions (#141)

* remove client

* update readme

* updating documentation

* update readme

* fixing dependencies

* update hab lab and clean up instructions

* remove some guidance

* update readme here - add some guidance for starting things

* fix path in stretch ovmm

* redo readme for installation

* update detic instructions

* remove workstation code - not good

* try formatting

* reorganization

* move network config up

* fix hab sim version

* add pytorch back into env files - makes installation easier

* making some changes to installation and configuration

* add detic instructions

* add requirements in case we need it

* download the dataset

* episode instructions

* some more refactoring of the instructions

* update commit

* update readme and env

* update commit for hab lab

* update commit and remove codeowners
cpaxton authored Apr 25, 2023
1 parent e0e09fd commit 2e862e0
Showing 12 changed files with 412 additions and 163 deletions.
4 changes: 2 additions & 2 deletions .github/CODEOWNERS
@@ -6,7 +6,7 @@
# @global-owner1 and @global-owner2 will be requested for
# review when someone opens a pull request.
# * @global-owner1 @global-owner2
* @exhaustin @cpaxton @zephirefaith
# * @exhaustin @cpaxton @zephirefaith

# Order is important; the last matching pattern takes the most
# precedence. When someone opens a pull request that only
@@ -57,4 +57,4 @@
# directory in the root of your repository except for the `/apps/github`
# subdirectory, as its owners are left empty.
# /apps/ @octocat
# /apps/github
194 changes: 179 additions & 15 deletions README.md
@@ -1,4 +1,4 @@
# Home Robot
# HomeRobot

[![GitHub license](https://img.shields.io/badge/license-MIT-blue.svg)](https://github.com/facebookresearch/home-robot/blob/main/LICENSE)
[![Python 3.9](https://img.shields.io/badge/python-3.9-blue.svg)](https://www.python.org/downloads/release/python-370/)
@@ -7,38 +7,201 @@
[![Code style: black](https://img.shields.io/badge/code%20style-black-000000.svg)](https://github.com/psf/black)
[![Imports: isort](https://img.shields.io/badge/%20imports-isort-%231674b1?style=flat)](https://timothycrosley.github.io/isort/)

Your open-source robotic mobile manipulation stack
Your open-source robotic mobile manipulation stack!

HomeRobot lets you get started running a range of robotics tasks on a low-cost mobile manipulator, starting with _Open Vocabulary Mobile Manipulation_, or OVMM. OVMM is a challenging task in which, in an unknown environment, a robot must:
- Explore its environment
- Find an object
- Find a receptacle -- a location on which it must place this object
- Put the object down on the receptacle.

## Core Concepts

This package assumes you have a low-cost mobile robot with limited compute -- initially a [Hello Robot Stretch](https://hello-robot.com/) -- and a "workstation" with more GPU compute. Both are assumed to be running on the same network.

In general this is the recommended workflow for hardware robots:
This is the recommended workflow for hardware robots:
- Turn on your robot; for the Stretch, run `stretch_robot_home.py` to get it ready to use.
- From your workstation, SSH into the robot and start a [ROS launch file](http://wiki.ros.org/roslaunch) which brings up necessary low-level control and hardware drivers.
- If desired, run [rviz](http://wiki.ros.org/rviz) on the workstation to see what the robot is seeing.
- Start running your AI code on the workstation!
- Start running your AI code on the workstation. For example, you can run `python projects/stretch_ovmm/eval_episode.py` to run the OVMM task.

We provide connections to useful perception libraries like [Detic](https://github.com/facebookresearch/Detic) and [Contact Graspnet](https://github.com/NVlabs/contact_graspnet), which you can then use as part of your methods.

## Installation & Usage
## Installation

### Preliminary

Installation on a workstation requires [conda](https://docs.conda.io/projects/conda/en/latest/user-guide/install/linux.html) and [mamba](https://mamba.readthedocs.io/en/latest/user_guide/mamba.html).

Installation on a robot assumes Ubuntu 20.04 and [ROS Noetic](http://wiki.ros.org/noetic).

To set up the hardware stack on a Hello Robot Stretch, see the [ROS installation instructions](src/home_robot_hw/install_robot.md) in `home_robot_hw`.

Proper network setup is crucial to getting good performance with HomeRobot. Low-cost mobile robots often lack the onboard GPU compute to run state-of-the-art perception models. Instead, we rely on a client-server architecture, where ROS and low-level controllers run on the robot, and CPU- and GPU-intensive AI code runs on a workstation.

After following the installation instructions, we recommend setting up your `~/.bashrc` on the robot workstation:

```
# Whatever your workstation's IP address is
export WORKSTATION_IP=10.0.0.2
# Whatever your robot's IP address is
export HELLO_ROBOT_IP=10.0.0.6
# Path to the codebase
export HOME_ROBOT_ROOT=/path/to/home-robot
export ROS_IP=$WORKSTATION_IP
export ROS_MASTER_URI=http://$HELLO_ROBOT_IP:11311
# Optional: echo the settings so misconfiguration is easy to spot
echo "Setting ROS_MASTER_URI to $ROS_MASTER_URI"
echo "Setting ROS IP to $ROS_IP"
# Helpful alias - connect to the robot
alias ssh-robot="ssh hello-robot@$HELLO_ROBOT_IP"
```
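
After sourcing your `~/.bashrc`, a quick check that the variables and alias took effect (the IPs above are placeholders for your own network):
```
source ~/.bashrc
echo $ROS_MASTER_URI   # should print http://<your robot's IP>:11311
ssh-robot              # uses the alias defined above
```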

On the robot side, start up the controllers with:
```
roslaunch home_robot_hw startup_stretch_hector_slam.launch
```

### Workstation Instructions

To set up your workstation, follow these instructions:

#### 1. Create Your Environment
```
# Create a conda env - use the version in home_robot_hw if you want to run on the robot
# Otherwise, you can use the version in src/home_robot
mamba env create -n home-robot -f src/home_robot_hw/environment.yml
conda activate home-robot
```

This should install PyTorch; if you run into trouble, you may need to edit the installation to make sure you have the right CUDA version. See the [pytorch install notes](docs/install_pytorch.md) for more.
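
As a quick sanity check that the installed PyTorch build matches your GPU driver, you can run (standard PyTorch calls only):
```
python -c "import torch; print(torch.__version__, torch.version.cuda, torch.cuda.is_available())"
```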

#### 2. Install Home Robot Packages
```
# Install the core home_robot package
pip install -e src/home_robot
# Install home_robot_hw
pip install -e src/home_robot_hw
```

_Testing Real Robot Setup:_ Now you can run a couple of commands to test your connection. If the `roscore` and the robot controllers are running properly, you can run `rostopic list` and should see a list of topics, i.e. streams of information coming from the robot. You can then run rviz to visualize the robot sensor output:

```
rviz -d $HOME_ROBOT_ROOT/src/home_robot_hw/launch/mapping_demo.rviz
```

#### 3. Hardware Testing

Run the hardware manual test to make sure you can control the robot remotely. Ensure the robot has one meter of free space before running the script.

```
python tests/hw_manual_test.py
```

Follow the on-screen instructions. The robot should move through a set of configurations.


#### 4. Install Detic

Install [detectron2](https://detectron2.readthedocs.io/tutorials/install.html):
```
pip install -e src/third_party/detectron2
pip install -r src/home_robot/home_robot/perception/detection/detic/Detic/requirements.txt
```
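
Before downloading weights, it can help to confirm that detectron2 built correctly; this is a plain import check, not a project script:
```
python -c "import detectron2; print(detectron2.__version__)"
```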

Download the Detic checkpoint as per the instructions [on the Detic GitHub page](https://github.com/facebookresearch/Detic):
```bash
cd $HOME_ROBOT_ROOT/src/home_robot/home_robot/perception/detection/detic/Detic/
mkdir models
wget https://dl.fbaipublicfiles.com/detic/Detic_LCOCOI21k_CLIP_SwinB_896b32_4x_ft4x_max-size.pth -O models/Detic_LCOCOI21k_CLIP_SwinB_896b32_4x_ft4x_max-size.pth --no-check-certificate
```

You should be able to run the Detic demo script, as described in the Detic instructions, to verify your installation:
```bash
python demo.py --config-file configs/Detic_LCOCOI21k_CLIP_SwinB_896b32_4x_ft4x_max-size.yaml --input desk.jpg --output out2.jpg --vocabulary custom --custom_vocabulary headphone,webcam,paper,coffe --confidence-threshold 0.3 --opts MODEL.WEIGHTS models/Detic_LCOCOI21k_CLIP_SwinB_896b32_4x_ft4x_max-size.pth
```

#### 5. Run Open Vocabulary Mobile Manipulation on Stretch

You should then be able to run the Stretch OVMM example.

Run a grasping server: either Contact Graspnet or our simple grasp server.
```
# For contact graspnet
cd $HOME_ROBOT_ROOT/src/third_party/contact_graspnet
conda activate contact_graspnet_env
python contact_graspnet/graspnet_ros_server.py --local_regions --filter_grasps
# For simple grasping server
cd $HOME_ROBOT_ROOT
conda activate home-robot
python src/home_robot_hw/home_robot_hw/nodes/simple_grasp_server.py
```

Then you can run the OVMM example script:
```
cd $HOME_ROBOT_ROOT
python projects/stretch_ovmm/eval_episode.py
```

#### 6. Simulation Setup

To set up the simulation stack with Habitat, see the [installation instructions](src/home_robot_sim/README.md) in `home_robot_sim`. You first need to install AI Habitat and the simulation package:
```
# Install habitat sim and update submodules
mamba env update -f src/home_robot_sim/environment.yml
# Install habitat lab on the correct (object rearrange) branch
git submodule update --init --recursive
pip install -e src/third_party/habitat-lab/habitat-lab
pip install -e src/third_party/habitat-lab/habitat-baselines
# Install home robot sim interfaces
pip install -e src/home_robot_sim
```
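
Before downloading assets, it is worth confirming that the simulator imports cleanly. These are plain import checks; the version attributes are assumed to exist in recent habitat releases:
```
python -c "import habitat_sim; print(habitat_sim.__version__)"
python -c "import habitat; print(habitat.__version__)"
```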

And then download the assets as described in the [installation instructions](src/home_robot_sim/README.md#dataset-setup).

To test your installation, you can run:
```
python projects/stretch_ovmm/eval_vectorized.py
```

For more details on the OVMM challenge, see the [Habitat OVMM readme](projects/stretch_ovmm/README.md).

This project contains numerous packages. See individual package docs for corresponding details & instructions.

## Code Contribution

We welcome contributions to HomeRobot.

There are two main classes in HomeRobot that you need to be concerned with:
- *Environments* extend the [abstract Environment class](https://github.com/facebookresearch/home-robot/blob/main/src/home_robot/home_robot/core/abstract_env.py) and provide *observations* of the world, and a way to *apply actions*.
- *Agents* extend the [abstract Agent class](https://github.com/facebookresearch/home-robot/blob/main/src/home_robot/home_robot/core/abstract_agent.py), which takes in an [observation](https://github.com/facebookresearch/home-robot/blob/main/src/home_robot/home_robot/core/interfaces.py#L95) and produces an [action](https://github.com/facebookresearch/home-robot/blob/main/src/home_robot/home_robot/core/interfaces.py#L50).

Generally, new methods will be implemented as Agents.
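
As a rough illustration, a new method might look like the sketch below. This is a hypothetical toy agent, not part of the codebase; the class and enum names follow the interfaces linked above, but check `abstract_agent.py` and `interfaces.py` in your checkout for the exact signatures:

```python
# A hypothetical agent sketch - names assumed from the abstract interfaces above.
from home_robot.core.abstract_agent import Agent
from home_robot.core.interfaces import Action, DiscreteNavigationAction, Observations


class ForwardUntilDoneAgent(Agent):
    """Toy agent: drive forward for a fixed number of steps, then stop."""

    def reset(self) -> None:
        self.steps = 0

    def act(self, obs: Observations) -> Action:
        # obs bundles RGB-D frames, pose, and task information (see interfaces.py)
        self.steps += 1
        if self.steps >= 100:
            return DiscreteNavigationAction.STOP
        return DiscreteNavigationAction.MOVE_FORWARD


# Hypothetical evaluation loop; method names assumed from abstract_env.py,
# verify against your checkout:
#   env.reset(); agent.reset()
#   while not env.episode_over:
#       env.apply_action(agent.act(env.get_observation()))
```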

### Organization

[HomeRobot](https://github.com/facebookresearch/home-robot/) is broken up into three different packages:

| Resource | Description |
| -------- | ----------- |
| [home_robot](src/home_robot) | Core package |
| [home_robot_hw](src/home_robot_hw) | ROS package containing hardware drivers for the Hello Stretch Robot |
| [home_robot_sim](src/home_robot_sim) | Simulation |
| [home_robot_client](src/home_robot_client) | Minimal remote client |
| [home_robot](src/home_robot) | Core package containing agents and interfaces |
| [home_robot_sim](src/home_robot_sim) | OVMM simulation environment based on [AI Habitat](https://aihabitat.org/) |
| [home_robot_hw](src/home_robot_hw) | ROS package containing hardware interfaces for the Hello Robot Stretch |

Entry points:
- To set up the hardware stack with a Hello Stretch Robot, see instructions in `home_robot_hw`.
- To set up the simulation stack with Habitat, see instructions in `home_robot_sim`.
- For the OVMM challenge, see [here](projects/stretch_ovmm/README.md).
The [home_robot](src/home_robot) package contains embodiment-agnostic agent code, such as our [ObjectNav agent](https://github.com/facebookresearch/home-robot/blob/main/src/home_robot/home_robot/agent/objectnav_agent/objectnav_agent.py) (finds objects in scenes) and our [hierarchical OVMM agent](https://github.com/facebookresearch/home-robot/blob/main/src/home_robot/home_robot/agent/ovmm_agent/ovmm_agent.py). These agents can be extended or modified to implement your own solution.

Importantly, agents use a fixed set of [interfaces](https://github.com/facebookresearch/home-robot/blob/main/src/home_robot/home_robot/core/interfaces.py), which each environment implements to provide access to observations and actions, whether on real hardware or in simulation.

## Code Contribution
The [home_robot_sim](src/home_robot_sim) package contains code for the simulation interface.

### Style

We use linters to enforce good code style. The `lint` test will not pass if your code does not conform.

@@ -51,6 +214,7 @@ pre-commit install

To format manually, run: `pre-commit run --show-diff-on-failure --all-files`


## License
Home Robot is MIT licensed. See the [LICENSE](./LICENSE) for details.

27 changes: 27 additions & 0 deletions docs/install_pytorch.md
@@ -0,0 +1,27 @@

## Help Installing PyTorch

### Installing PyTorch

See [here](https://pytorch.org/get-started/locally/) to install PyTorch. Example command:

```
conda install pytorch torchvision torchaudio pytorch-cuda=11.7 -c pytorch -c nvidia
```

### PyTorch3d

To install PyTorch3d, run:

```
conda install pytorch3d -c pytorch3d
```

If this causes trouble, building from source works reliably, but you must make sure you have the correct CUDA version on your workstation:
```
pip install "git+https://github.com/facebookresearch/pytorch3d.git"
```

See the [PyTorch3d installation page](https://github.com/facebookresearch/pytorch3d/blob/main/INSTALL.md) for more information.
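
A minimal check that PyTorch3d imports against your PyTorch build (standard imports only):
```
python -c "import torch, pytorch3d; print(torch.version.cuda, pytorch3d.__version__)"
```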


79 changes: 4 additions & 75 deletions projects/habitat_ovmm/README.md
@@ -1,83 +1,17 @@
## Table of contents
1. [Environment setup](#environment-setup)
2. [Dataset setup](#dataset-setup)
3. [Demo setup](#demo-setup)
4. [DETIC setup](#install-detic)
5. [Run!](#run)

## Environment Setup

On an Ubuntu machine with GPU:
```
conda env create -n home-robot --file=src/home_robot/environment.yml
conda activate home-robot
git clone https://github.com/facebookresearch/habitat-sim
cd habitat-sim
git checkout 7b99db753272079d609b88e00f24ca0ad0ef23aa # latest main forces Python > 3.9
python -m pip install -r requirements.txt
python setup.py install --headless --with-bullet
# (if the above commands runs out of memory)
# python setup.py build_ext --parallel 8 install --headless
cd ..
git clone --branch modular_nav_obj_on_rec https://github.com/facebookresearch/habitat-lab.git
cd habitat-lab
python -m pip install -e ./habitat-baselines
cd habitat-lab
python -m pip install -r requirements.txt
python -m pip install -e .
cd ../..
python -m pip install "git+https://github.com/facebookresearch/pytorch3d.git"
```

On Mac:
```
conda create -n home-robot python=3.10 cmake
conda activate home-robot
conda install -y pytorch torchvision -c pytorch
git clone https://github.com/facebookresearch/habitat-sim
cd habitat-sim
git checkout 7b99db753272079d609b88e00f24ca0ad0ef23aa # latest main forces Python > 3.9
pip install -r requirements.txt
python setup.py install --with-bullet
cd ..
git clone --branch modular_nav_obj_on_rec https://github.com/facebookresearch/habitat-lab.git
cd habitat-lab
pip install -e habitat-baselines
cd habitat-lab
pip install -r requirements.txt
# Not clear if this should have --all or just be a pip install .
python setup.py develop
cd ../..
pip install natsort scikit-image scikit-fmm pandas trimesh scikit-learn
conda install -c pytorch3d pytorch3d
```

**[IMPORTANT]: Add habitat-lab path to PYTHONPATH**:

```
export PYTHONPATH=$PYTHONPATH:/path/to/home-robot-dev/habitat-lab/
```
# Habitat OVMM

## Dataset Setup

### Scene dataset setup

```
cd $HOME_ROBOT_ROOT/data/
# Download the scenes
git clone https://huggingface.co/datasets/osmm/fpss --branch osmm
# Download the objects and metadata
git clone https://huggingface.co/datasets/osmm/objects
```

The Google Scanned Objects and Amazon Berkeley Objects datasets need to be in `data/objects/google_object_dataset` and `data/objects/amazon_berkeley` respectively. These datasets can be downloaded from [here](https://drive.google.com/drive/u/0/folders/1Qs99bMMC7ZpZwksZYDC_IkNqK_IB6ONU). They are also available on Skynet at: `/srv/flash1/aramacha35/habitat-lab/data/objects`.

TODO: Download these using git clone https://huggingface.co/datasets/osmm/objects

### Other instructions

Rough notes; some things were missing for configuring a new environment:
@@ -92,11 +26,6 @@ cd `HOME_ROBOT_ROOT/data/`
git clone https://huggingface.co/datasets/osmm/episodes
```

### Download CLIP embeddings
Download from `https://drive.google.com/file/d/1sSDSKZgYeIPPk8OM4oWhLtAf4Z-zjAVy/view?usp=sharing` and place them under `HOME_ROBOT_ROOT/data/objects` directory.

TODO: Remove this after we start downloading `objects` folder from huggingface.

## Demo setup

Run
2 changes: 1 addition & 1 deletion projects/stretch_ovmm/eval_episode.py
@@ -23,7 +23,7 @@ def main(
goal_recep="chair",
dry_run=False,
):
config_path = "projects/stretch_grasping/configs/agent/floorplanner_eval.yaml"
config_path = "projects/stretch_ovmm/configs/agent/floorplanner_eval.yaml"
config, config_str = get_config(config_path)
config.defrost()
config.NUM_ENVIRONMENTS = 1