This is the project repo for the final project of the Udacity Self-Driving Car Nanodegree: Programming a Real Self-Driving Car. For more information about the project, see the project introduction here (available to students).
This project consists of two parts:
- Program a self-driving car using ROS to navigate autonomously on a highway, detecting traffic lights and accelerating/decelerating accordingly to maintain a smooth and safe driving experience.
- Test the code on a real self-driving car, 'Carla', a Lincoln MKZ running Ubuntu.
Team name: ADAS 2.0
Name | Email
---|---
Anastasios Stathopoulos | stathopoan@gmail.com |
Aruul Mozhi Varman S | aruulmozhivarman@outlook.com |
Francesco Fantauzzi (lead) | Francesco_Fantauzzi@yahoo.com |
This project consists of several ROS nodes implementing functionalities such as traffic light detection, control and waypoint following. The overall architecture is displayed below and illustrates the three basic components: perception, planning, and control.
For each component, a representative node has been implemented, as described below:
This node is responsible for navigating the car along the road, adjusting the velocity at every waypoint ahead based on the traffic light state. It receives data from the topics:
- /base_waypoints, a complete list of waypoints the car will be following.
- /current_pose, the vehicle's current position
- /traffic_waypoint, the locations to stop for red traffic lights
and publishes to the topic /final_waypoints, which is a list of waypoints ahead of the car. The code is located in /ros/src/waypoint_updater/waypoint_updater.py.
This node initially receives all the waypoints of the track and stores them. Every time it receives the current car position, it finds the next closest waypoint to localize itself. If there is no traffic light ahead, it sets the speed of the next waypoints (their number is given by the LOOKAHEAD_WPS variable), making sure the speed limit is not exceeded while keeping acceleration and jerk below the maximum allowed values. If a yellow or red traffic light is near, it decelerates the car as smoothly as possible while respecting the speed, acceleration, and jerk limits. While the light is red, the car stays stopped: an empty list is sent to the /final_waypoints topic. When the light turns green, the car accelerates to the maximum allowed speed until the next red traffic light.
This node is responsible for detecting a yellow or red traffic light ahead and reporting its state. It receives data from the topics:
- /base_waypoints, a complete list of waypoints the car will be following.
- /current_pose, the vehicle's current position
- /image_color, the image taken from a camera in front of the car
and publishes to the topic /traffic_waypoint, which carries the index of the waypoint closest to the red light's stop line when a light is near the car, and -1 otherwise. The code is located in /ros/src/tl_detector/tl_detector.py.
The node initially receives all the traffic light positions and the track waypoints, and stores them after mapping each traffic light position to its nearest waypoint. It then subscribes to the /image_color topic to receive images from a camera at the front of the car. If the nearest light is farther away than a specified distance, the node publishes -1 to the /traffic_waypoint topic. If the light is within that distance, the camera image is processed and its color is classified. If the color is yellow or red, the waypoint index closest to the traffic light is published, indicating that the car must come to a full stop at that waypoint; otherwise -1 is published.
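The publish-or-ignore decision described above can be sketched as follows; the distance threshold is an assumption, the state constants follow the usual styx_msgs/TrafficLight convention, and the image classifier itself is omitted:

```python
STOP_DISTANCE = 50.0  # metres within which a light is considered "near" (illustrative)

# Light states as in the styx_msgs/TrafficLight message
RED, YELLOW, GREEN, UNKNOWN = 0, 1, 2, 4

def waypoint_to_publish(light_wp_idx, distance_to_light, light_state):
    """Return the waypoint index to publish on /traffic_waypoint, or -1."""
    if distance_to_light > STOP_DISTANCE:
        return -1  # light is too far away; no need to classify the image
    if light_state in (RED, YELLOW):
        return light_wp_idx  # the car must come to a full stop at this waypoint
    return -1  # green or unknown: keep driving
```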
This node is responsible for controlling the car in terms of throttle, brake, and steering. Carla is equipped with a drive-by-wire (dbw) system, meaning the throttle, brake, and steering have electronic control. It receives data from the topics:
- /current_velocity, the vehicle's current linear and angular velocities
- /twist_cmd, receive target linear and angular velocities
- /vehicle/dbw_enabled, indicates if the car is under dbw or driver control
and publishes to the topics:
- /vehicle/throttle_cmd, throttle
- /vehicle/brake_cmd, brake
- /vehicle/steering_cmd, steering
The code is located in the files:
/ros/src/twist_controller/twist_controller.py
/ros/src/twist_controller/yaw_controller.py
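A minimal sketch of how such a controller can track the target velocity and split the result into throttle and brake commands. The gains, brake-torque scale, and function names here are illustrative assumptions, not the actual twist_controller.py code:

```python
class SimplePID:
    """Textbook PID controller used to track the target linear velocity."""
    def __init__(self, kp, ki, kd):
        self.kp, self.ki, self.kd = kp, ki, kd
        self.integral = 0.0
        self.prev_error = 0.0

    def step(self, error, dt):
        self.integral += error * dt
        derivative = (error - self.prev_error) / dt if dt > 0.0 else 0.0
        self.prev_error = error
        return self.kp * error + self.ki * self.integral + self.kd * derivative

def throttle_brake(control, max_brake_torque=700.0):
    """Map the PID output to a throttle in [0, 1] and a brake torque in Nm."""
    if control >= 0.0:
        return min(control, 1.0), 0.0  # positive output: accelerate, no brake
    # Negative output: no throttle, brake proportionally up to the torque limit.
    return 0.0, min(-control * max_brake_torque, max_brake_torque)
```

One practical detail: the integrator should be reset whenever /vehicle/dbw_enabled indicates manual control, so error does not accumulate while the safety driver is in charge.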
This project requires a GPU; make sure an NVIDIA GPU is available. Traffic light classification is very demanding and requires significant computational power, so a GPU is needed for a smooth and responsive simulation.
Run the ROS code and open the simulator (see the instructions in the Usage section). For the car to move autonomously, uncheck the "Manual" checkbox.
To turn on the camera and enable traffic light recognition, make sure "Camera" is checked.
After the first time you check "Camera" during a simulation, traffic light recognition takes a few seconds to start working. If the car reaches a traffic light before then, that light is likely to be ignored.
- Be sure that your workstation is running Ubuntu 16.04 Xenial Xerus or Ubuntu 14.04 Trusty Tahr. Ubuntu downloads can be found here.
- If using a Virtual Machine to install Ubuntu, use the following configuration as minimum:
- 2 CPU
- 2 GB system memory
- 25 GB of free hard drive space
The Udacity provided virtual machine has ROS and Dataspeed DBW already installed, so you can skip the next two steps if you are using this.
- Follow these instructions to install ROS
- ROS Kinetic if you have Ubuntu 16.04.
- ROS Indigo if you have Ubuntu 14.04.
- Use this option to install the SDK on a workstation that already has ROS installed: One Line SDK Install (binary)
- Download the Udacity Simulator.
Build the docker container

```bash
docker build . -t capstone
```

Run the docker file

```bash
docker run -p 4567:4567 -v $PWD:/capstone -v /tmp/log:/root/.ros/ --rm -it capstone
```
- Clone the project repository

```bash
git clone https://github.com/udacity/CarND-Capstone.git
```

- Install python dependencies

```bash
cd CarND-Capstone
pip install -r requirements.txt
```

- Make and run styx

```bash
cd ros
catkin_make
source devel/setup.sh
roslaunch launch/styx.launch
```
- Run the simulator
- Download the training bag that was recorded on the Udacity self-driving car (a bag demonstrating the correct predictions in autonomous mode can be found here)
- Unzip the file

```bash
unzip traffic_light_bag_files.zip
```
- Play the bag file

```bash
rosbag play -l traffic_light_bag_files/loop_with_traffic_light.bag
```
- Launch your project in site mode

```bash
cd CarND-Capstone/ros
roslaunch launch/site.launch
```
- Confirm that traffic light detection works on real life images