Capstone project - Team Hot Wheels


Capstone project of Udacity Self-Driving Car Nanodegree (cf. repo).

capstone_result

Team Hot Wheels

Table of Contents

  • Getting started
  • Usage
  • Approach
  • Results
  • Credits
  • License

Getting started

Prerequisites

  • Unity3D: 3D game engine used for our simulation.

Installation

This is the project repo for the final project of the Udacity Self-Driving Car Nanodegree: Programming a Real Self-Driving Car. For more information about the project, see the project introduction here.

Please use one of the two installation options below: native or Docker.

Native Installation

  • Be sure that your workstation is running Ubuntu 16.04 Xenial Xerus or Ubuntu 14.04 Trusty Tahr. Ubuntu downloads can be found here.

  • If using a Virtual Machine to install Ubuntu, use the following configuration as a minimum:

    • 2 CPUs
    • 2 GB system memory
    • 25 GB of free hard drive space

    The Udacity-provided virtual machine has ROS and Dataspeed DBW already installed, so you can skip the next two steps if you are using it.

  • Follow these instructions to install ROS

  • Install Dataspeed DBW

  • Download the Udacity Simulator.

Docker Installation

Install Docker

Build the Docker container:

```shell
docker build . -t capstone
```

Run the Docker container:

```shell
docker run -p 4567:4567 -v $PWD:/capstone -v /tmp/log:/root/.ros/ --rm -it capstone
```

Runtime dependencies

Install the dependencies to run the Python code on each ROS node:

```shell
cd sdcnd-capstone
pip install -r requirements.txt
wget https://github.com/frgfm/sdcnd-capstone/releases/download/v0.1.0/faster_rcnn_resnet50_coco_finetuned.pb
mv faster_rcnn_resnet50_coco_finetuned.pb ros/src/tl_detector/light_classification/
```

Unity

After installing Unity3D, you will need an environment build to run the simulation. Download the appropriate build for your OS and extract it.

If you encounter an issue with the above builds, please refer to the "Available Game Builds" section of this readme.

Usage

Styx

Now you should be able to build the project and run the styx server:

```shell
cd ros
catkin_make
source devel/setup.sh
roslaunch launch/styx.launch
```

Real world testing

  1. Download the training bag that was recorded on the Udacity self-driving car.
  2. Unzip the file:
```shell
unzip traffic_light_bag_file.zip
```
  3. Play the bag file:
```shell
rosbag play -l traffic_light_bag_file/traffic_light_training.bag
```
  4. Launch your project in site mode:
```shell
cd sdcnd-capstone/ros
roslaunch launch/site.launch
```
  5. Confirm that traffic light detection works on real-life images.

Other library/driver information

Outside of requirements.txt, the simulator grader and Carla use the following driver/library versions:

|               | Simulator | Carla   |
| ------------- | --------- | ------- |
| Nvidia driver | 384.130   | 384.130 |
| CUDA          | 8.0.61    | 8.0.61  |
| cuDNN         | 6.0.21    | 6.0.21  |
| TensorRT      | N/A       | N/A     |
| OpenCV        | 3.2.0-dev | 2.4.8   |
| OpenMP        | N/A       | N/A     |

We are working on a fix to line up the OpenCV versions between the two.

Approach

This project involves an agent (a vehicle on a highway) exposed to a continuous state space and a continuous action space. The environment can be switched to manual mode to give controls to the user; by default, the only accepted inputs are the controls communicated by ROS through a WSGI application.

Environment

This Unity environment gives a large state space with inherent constraints on the agent state.

capstone_start

In manual mode, warnings are thrown if we violate common highway driving rules. The environment constrains the car's position strictly to the values sent by the controller, and exposes both the agent state and sensor measurements to our codebase.

Please refer to this repository for further details.

Goals

  • Respect traffic light stops
  • Comply with speed limits
  • Keep to the lane along the way

In the ros/src folder, you will find the following nodes:

  • tl_detector: responsible for object detection on traffic lights
  • twist_controller: responsible for vehicle controls
  • waypoint_follower: responsible for trajectory following
  • waypoint_loader: loads the waypoints on the map (position of traffic lights)
  • waypoint_updater: selects an appropriate behavior based on tl_detector information.
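Several of these nodes (tl_detector, waypoint_updater) need to locate the car relative to the map waypoints. Here is a minimal sketch of that closest-waypoint search, assuming a simplified (x, y)-tuple interface rather than the actual ROS pose and Lane messages used in the repo:

```python
import math

def closest_waypoint_index(position, waypoints):
    """Return the index of the waypoint closest to `position`.

    `position` is an (x, y) tuple and `waypoints` is a list of (x, y)
    tuples -- a simplification of the ROS messages the nodes exchange.
    """
    return min(
        range(len(waypoints)),
        key=lambda i: math.hypot(waypoints[i][0] - position[0],
                                 waypoints[i][1] - position[1]),
    )
```

In the actual nodes, this lookup runs on every pose update, so the repo may use a more efficient spatial structure (e.g. a KD-tree) than this linear scan.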

Perception

perception

This node is responsible for detecting traffic lights in range and classifying their color for the planning module. For inference speed, we could have selected a single-shot detector such as SSD or YOLOv3, but as a first attempt, we used Faster R-CNN for its detection performance.

faster-rcnn

It is available in the TensorFlow model zoo and is a well-known architecture, used by some self-driving car makers in their own vehicles. Alex Lechner's dataset was used for training, to avoid manually labeling a dataset, along with his instructions for training models on it.
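The node runs the fine-tuned frozen graph on camera frames; the post-processing step that turns raw detections into a single light color can be sketched as below. The class-id mapping and score threshold are assumptions for illustration, not the repo's actual label map or tuned values:

```python
import numpy as np

# Hypothetical class ids; the real label map ships with the training dataset.
CLASS_TO_COLOR = {1: "green", 2: "red", 3: "yellow"}

def classify_light(classes, scores, min_score=0.5):
    """Return the color of the most confident detection above `min_score`,
    or None when no detection is trustworthy enough."""
    classes = np.asarray(classes)
    scores = np.asarray(scores)
    keep = scores >= min_score
    if not keep.any():
        return None
    # Mask out low-score detections before taking the argmax.
    best = int(np.argmax(np.where(keep, scores, -1.0)))
    return CLASS_TO_COLOR.get(int(classes[best]))
```

Returning None on low confidence lets the planner fall back to a conservative behavior instead of acting on a noisy detection.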

capstone_red

capstone_green

For efficiency purposes, we only run model inference when the traffic light is in close range, since the information is not actionable beforehand.
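A sketch of that gating logic, assuming the car and the next light are indexed on the same waypoint list (the `max_gap` threshold is a made-up value for illustration, not the repo's tuned one):

```python
def should_run_inference(car_wp_idx, light_wp_idx, max_gap=100):
    """Only run the detector when the next light is at most `max_gap`
    waypoints ahead of the car; skip lights behind the car entirely."""
    gap = light_wp_idx - car_wp_idx
    return 0 <= gap <= max_gap
```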

Planning

planning

The planning node publishes the waypoints for the vehicle to follow, along with a target velocity for each of them. We had to reduce the default number of lookahead waypoints down to 20 to avoid significant lag in the workspace.
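When a red light is ahead, the target velocities have to taper to zero at the stop line. A deceleration profile of this kind can be sketched with the kinematic relation v = sqrt(2 · a · d); the constants below are illustrative, not the tuned values from the repo:

```python
import math

LOOKAHEAD_WPS = 20   # reduced from the default to avoid workspace lag
MAX_DECEL = 0.5      # m/s^2 -- a hypothetical comfort limit

def decelerate_to_stop(base_velocities, distances_to_stop):
    """Cap each waypoint's target velocity so the car can come to rest
    at the stop line while decelerating at no more than MAX_DECEL."""
    return [
        min(v, math.sqrt(2.0 * MAX_DECEL * max(d, 0.0)))
        for v, d in zip(base_velocities, distances_to_stop)
    ]
```

Waypoints far from the stop line keep their base velocity, while those near it are progressively slowed until the velocity reaches zero at the line.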

Control

controller

Udacity provided an Autoware ROS node that publishes the twist commands for linear and angular velocities.
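Turning those twist commands into throttle actuations is typically done with a PID loop on the velocity error. A minimal sketch, with illustrative gains and a simplified interface rather than the tuned controller from twist_controller:

```python
class PID:
    """Minimal PID controller of the kind used for throttle control."""

    def __init__(self, kp, ki, kd):
        self.kp, self.ki, self.kd = kp, ki, kd
        self.integral = 0.0
        self.last_error = 0.0

    def step(self, error, dt):
        """Return the control output for the current `error` after `dt` seconds."""
        self.integral += error * dt
        derivative = (error - self.last_error) / dt
        self.last_error = error
        return self.kp * error + self.ki * self.integral + self.kd * derivative
```

A production controller would also clamp the integral term and reset state when the driver takes manual control, so the integrator does not wind up while DBW is disabled.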

Results

The previously mentioned implementation yields a smooth driving agent able to navigate the highway environment. For better insight into the perception module, the visualization below includes another window with the output of the object detection model.

capstone_result

The trained object detection model is available for download here.

The full-length lap recording is available for download in the release attachments.

Credits

This implementation is of my own design but draws widely on the following methods and concepts:

License

Distributed under the MIT License. See LICENSE for more information.