I present to you: the autonomous car, (almost) entirely written in Python.
Contents:
- Old Version
- New Version
- Hardware Guide
- List of components
- Connecting the wires
- Software Guide
- Device setup
Video
The software communicates with the Raspberry Pi over a WiFi network, using sockets. The sensors, camera, and steering are each implemented as a separate service on its own socket. You can steer the car with the keyboard on your PC and get a live feed from the camera. All of the sensor, steering, and video data can be dumped to files on the PC. All written in Python.
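The one-service-per-socket idea can be sketched as below. This is a minimal illustration, not Tonic's actual wire protocol (which I have not reproduced here): a server answers each request with one newline-terminated reading, and the fake IMU string and port 2204 are just stand-ins matching the example settings later in this README.

```python
import socket
import threading

def run_sensor_service(port, read_sensor, ready):
    """Minimal sketch of a one-service-per-socket server: answer each
    request with one newline-terminated sensor reading."""
    srv = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
    srv.setsockopt(socket.SOL_SOCKET, socket.SO_REUSEADDR, 1)
    srv.bind(("127.0.0.1", port))
    srv.listen(1)
    ready.set()                      # signal that we are accepting clients
    conn, _ = srv.accept()
    with conn:
        while True:
            request = conn.recv(64)  # any request triggers one reading
            if not request:          # client closed the connection
                break
            conn.sendall((read_sensor() + "\n").encode())
    srv.close()

ready = threading.Event()
# Fake IMU readout standing in for real hardware.
threading.Thread(target=run_sensor_service,
                 args=(2204, lambda: "0.00 0.10 9.81", ready),
                 daemon=True).start()
ready.wait()
cli = socket.create_connection(("127.0.0.1", 2204))
cli.sendall(b"read\n")
reading = cli.recv(64).decode().strip()
cli.close()
print(reading)  # prints "0.00 0.10 9.81"
```

Running each sensor behind its own socket like this is what lets the PC side connect to video, steering, and IMU independently.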
To implement SLAM capability that would enable the car to map the environment and navigate.
- Working camera feed
- Working steering system
- Working IMU
- Partial SLAM implemented
- Using only the feed from the camera, it can map the environment to a bird's-eye view
Attention! This is cool! ORB-SLAM working on the camera feed (video):
Careful! This is cool as well: a 2D map from the video feed (video):
The third version was created because of problems with the steering (cheap remote-controlled cars have low-quality gears inside). In the previous version, any manipulation of the circuits was also difficult because of the casing, and it was hard to get a good odometry readout.
So I changed the body of the car as well as the wheels. It looks a bit worse, but is waaaaay more effective ;]
The third version has a working odometry system.
If you would like to help, or have questions regarding the project, see .
- Raspberry Pi Zero W
- [Pololu MinIMU-9 v5] ~20$ - It does not need to be an exact match, but you will have to modify the code to support it.
- Pi Camera with connecting tape ~16$ - You need specifically this connecting tape for the Pi Zero, as it is different than for other Raspberry versions.
- This camera case, or you can 3D-print one.
- Li-Pol Redox 900mAh 20C 3S 11,1V - I could not find that part on Amazon; a battery with similar specs would do, but you might need to change the connector.
- Charger for battery
- L298N - Dual Drive control ~ 7$
-
- Pair of T-DEAN connectors
- for the RPI0 GPIO pins (you may find these already included in some Raspberry Pi Zero sets)
- some connecting cables
- something like this chassis to mount everything on, with a pair of wheels with motors
- something like this photo interrupter
This should be pretty intuitive.
I used hot glue and a zip-tie. Remember to orient it the correct way, and leave the video connector accessible.
Mount the IMU as well. Ideally you want it as close to the camera as possible. I used some bolts and nuts to mount it over the front wheel.
Attach the battery under the chassis. I used a cable tie around the Pi and created a loop underneath the chassis.
I used hot glue. Do not let the cables touch the radiator. Make it as stable as possible.
For reference on the RPI0 pinout, use this website. I will use both the board numbering and BCM. So here is the wiring scheme (IT'S NOT CORRECT - SEE ISSUE #18):
Some components may not match their real counterparts. Don't worry, I will guide you through each component.
This is easy; just lift the plugs a little bit, both at the camera and the Pi. The one on the Raspberry is very easy to break, so be careful, but don't feel bad if you accidentally break it. It just happens ;)
This one is the trickiest.
Steering pins:
Pin | Raspberry BCM | L298N |
---|---|---|
37 | BCM 26 | ENA |
35 | BCM 19 | IN1 |
33 | BCM 13 | IN2 |
31 | BCM 6 | IN3 |
29 | BCM 5 | IN4 |
27 | BCM 0 | ENB |
Power:
Pin | Raspberry BCM | L298N |
---|---|---|
2 | 5v Power | +5V |
6 | Ground | GND |
The colors used in the wiring scheme correspond to the photos.
The wiring of the pins and ground is as follows:
Pin | Raspberry BCM | Encoder |
---|---|---|
36 | BCM 16 | Right Encoder OUT |
34 | Ground | GND |
32 | BCM 12 | Left Encoder OUT |
30 | Ground | GND |
When it comes to voltage, I just split the cable coming from Pin 4 (5V Power) of the RPI0.
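On the Pi, the encoder ticks would be counted in GPIO edge-detect callbacks on BCM 16 and BCM 12 and converted to distance. Here is a minimal sketch of that conversion; the tick count per revolution and the wheel diameter are hypothetical values, so measure your own disc and wheels.

```python
import math

# Hypothetical geometry; measure your own encoder disc and wheels.
TICKS_PER_REV = 20          # slots in the photo-interrupter disc
WHEEL_DIAMETER_M = 0.065    # wheel diameter in metres

def ticks_to_distance(ticks):
    """Distance travelled by one wheel, in metres, from encoder ticks.
    Each tick is one slot of the disc passing the photo interrupter."""
    revolutions = ticks / TICKS_PER_REV
    return revolutions * math.pi * WHEEL_DIAMETER_M

d = ticks_to_distance(40)   # two full wheel revolutions
print(round(d, 4))          # prints 0.4084
```

Averaging the left and right wheel distances gives forward travel; their difference gives an estimate of the turn.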
Be sure not to mix these up; they are a bit confusing.
Pin | Raspberry BCM | IMU |
---|---|---|
1 | 3v3 Power | VDD |
3 | BCM 2 | SDA |
5 | BCM 3 | SCL |
9 | Ground | GND |
Just connect OUT1 and OUT2 to the right motor, and OUT3 and OUT4 to the left. If the steering does not work properly, try swapping the cables on the motor.
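The L298N sets each motor's direction from a pair of IN pins, which is why swapping the motor cables flips the direction. The truth table below is a sketch of that logic, not Tonic's actual steering code: the BCM numbers follow the steering table above (with physical pin 29 taken as BCM 5, since the scheme is flagged as not fully correct in issue #18), and the left/right polarity depends on your wiring. On the Pi you would write these levels with RPi.GPIO and drive ENA/ENB with PWM for speed.

```python
# BCM numbers for the L298N inputs (IN4 on physical pin 29 = BCM 5).
IN1, IN2, IN3, IN4 = 19, 13, 6, 5

def drive_levels(command):
    """Map a steering command to logic levels for IN1..IN4.
    Returns {BCM pin: level}; sketch only, polarity depends on wiring."""
    table = {
        "forward":  (1, 0, 1, 0),
        "backward": (0, 1, 0, 1),
        "left":     (0, 1, 1, 0),   # wheels spin opposite ways to turn
        "right":    (1, 0, 0, 1),
        "stop":     (0, 0, 0, 0),
    }
    a, b, c, d = table[command]
    return {IN1: a, IN2: b, IN3: c, IN4: d}

print(drive_levels("forward"))  # prints {19: 1, 13: 0, 6: 1, 5: 0}
```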
Use a power switch on one of the cables. Make sure that they cannot touch anything else; if you short the circuit on the battery, it might blow up. I also use T-DEAN connectors, and unplug the battery when I'm transporting the robot.
I'm using a 1500 microfarad 16V capacitor. It's polarised, so make sure to orient it properly and put the minus into the ground. It acts as a stabiliser; it's not necessary, but it greatly improves the stability of the circuit.
Of course, Tonic is written almost entirely in Python, so you will need at least Python 3.6. I highly recommend using Anaconda.
For some scripts it will be necessary to use OpenCV. No way around it. If you are using Anaconda, you can simply do:
conda install -c menpo opencv
You are using python, so just do:
pip install -r requirements.txt
Warning: I did not test different configurations; open an issue on GitHub if something is not working for you.
There is some setup required before you can run this:
- Set up the RPI0 and the SLAM docker
- Calibrate the camera
- Calibrate the IMU
- Set up the `settings.yml` correctly
- Set up the desired services on the RPI0 (Video, Steering, Imu, Odometry), or on the laptop (SLAM)
- Run Tonic
For a guide to setting up the RPI0, look here. For how to set up the SLAM docker, see here.
In the `pc` folder there is a `settings.yml` file. Let's take a look:
info:
settings: v1
car:
build: mark2
name: car1
hardware:
camera:
calibration_file:
path: camera_calibration/calib.json
type: json
image:
shape: [320, 240, 3]
server:
ip: '192.168.1.98'
video:
port: 2201
command: "python -u video_streaming.py"
steering:
port: 2203
command: "python3 -u steering_server.py"
imu:
port: 2204
command: "python -u imu_server.py"
odo:
port: 2206
command: "python3 -u odometry.py"
slam:
ip: '127.0.0.1'
port: 2207
master:
port: 2205
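Once parsed (e.g. with `yaml.safe_load` from PyYAML), the file above becomes a nested dict. The helper below is an illustration of how the `server` section resolves addresses, not Tonic's actual code; the dict is inlined here so the example is self-contained.

```python
# Inline stand-in for the parsed settings.yml; in practice you would do
# settings = yaml.safe_load(open("pc/settings.yml")).
settings = {
    "server": {
        "ip": "192.168.1.98",
        "video":    {"port": 2201, "command": "python -u video_streaming.py"},
        "steering": {"port": 2203, "command": "python3 -u steering_server.py"},
        "imu":      {"port": 2204, "command": "python -u imu_server.py"},
        "odo":      {"port": 2206, "command": "python3 -u odometry.py"},
        "slam":     {"ip": "127.0.0.1", "port": 2207},
    },
}

def service_address(name):
    """Resolve (ip, port) for a service; slam may override the shared ip."""
    srv = settings["server"][name]
    return (srv.get("ip", settings["server"]["ip"]), srv["port"])

print(service_address("video"))  # prints ('192.168.1.98', 2201)
print(service_address("slam"))   # prints ('127.0.0.1', 2207)
```

This is why only `slam` carries its own `ip`: every other service lives on the RPI0, so they inherit the shared address.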
We can divide it into three parts:
- `info` - just some information, not really important.
- `hardware` - stores the desired image shape as well as the path to the JSON containing the camera calibration.
- `server` - stores the configuration of the services. The `ip` is the IP of the RPI0; fill it in accordingly. Also fill in the `ip` under `slam` if you are not running SLAM on the same machine as Tonic.
You can find the guide here. You have to be running the services to access them with Tonic.
python run.py -v -s
This should run Tonic with the video feed connected and steering active.
The steering (WASD keys only!) is only active when you have clicked into the window with the feed.
You can run Tonic with any number of options specified in the help menu.
Although, to run SLAM you have to run it with the IMU (`-i`).
To collect all the data from all sensors, you should run:
python run.py -s -v -i -o --dump_video --dump_steering --dump_imu --dump_odo /path/to/folder/to/dump/data/to
Each of the options is described in the help: `python run.py -h`.
Remember to change the folder if you want to record new data, because by default it will overwrite existing data.
There was an attempt to simplify the process of turning everything on; you can see the code used to do that in the file `service_run.py`.
This file will guide you through the usage of the `rpi` folder.
The code was used on Raspberry Pi Zero with WIFI.
It was not tested on any other device.
First of all, you need to be connected to the same network as the device that you will be running Tonic on. There are enough guides on how to connect the RPI to WiFi already, so go and find one. Remember to put the Raspberry's IP into the settings. Before you do that, you should put the device together, as described in the hardware guide.
The recommended workflow is to open as many SSH connections from your machine to the RPI as needed, and in each SSH session run one of the services. Yes, this is not nice, but unifying everything so it would work reliably and be easy to monitor is not a trivial task.
This service requires the Python 2 picamera module, but it should already be on your Raspberry. Remember to enable the camera in the RPI settings!
So the video server is very basic and simple. You can run it with:
python video_streaming.py
And that's it. If the connection is interrupted, you might need to restart the service.
This service is well built and designed. If the steering does not match your hardware, you can easily modify the code to get the desired output.
Nothing more than RPI GPIO and Python 3 is needed.
Just run:
python3 steering_server.py
If the connection is interrupted, you might need to restart the service.
The IMU service uses an external IMU driver, minimu9-ahrs by DavidEGrayson.
This one might be tough, but the guide in the driver's repo got me through. Be patient, make sure that you have enabled the correct settings, and that everything is well soldered and connected. The guide is here. Godspeed!
Internally this service opens a `subprocess.Popen` to call minimu9-ahrs (yeah, I know, that's not the best solution, but I tried wrapping it in Python and it took too much time).
Make sure that minimu9-ahrs is working and accessible for that script, and that it is calibrated!
python imu_server.py
Additionally, if the connection with Tonic/the PC stops, you don't need to reset this service.
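The `subprocess.Popen` approach can be sketched like this. It is a generic illustration, not the service's actual code: since minimu9-ahrs only exists on the Pi, the example substitutes a stand-in command that prints two fake IMU lines, and parses whitespace-separated floats from stdout the same way a wrapper around the real tool could.

```python
import subprocess
import sys

def stream_readings(cmd):
    """Wrap an external tool with subprocess.Popen and yield its
    stdout line by line as lists of floats."""
    proc = subprocess.Popen(cmd, stdout=subprocess.PIPE, text=True)
    try:
        for line in proc.stdout:
            yield [float(x) for x in line.split()]
    finally:
        proc.terminate()

# Stand-in command emitting two fake IMU lines; on the Pi this would be
# something like ["minimu9-ahrs", "--output", "euler"].
fake = [sys.executable, "-c", "print('0.0 0.1 9.8'); print('0.0 0.2 9.8')"]
rows = list(stream_readings(fake))
print(rows)
```

The drawback the author mentions holds here too: the wrapper is only as robust as the external process, so the service must handle the tool exiting or printing unparsable lines.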
This service was the last one that I wrote. It is by far the most engineered one. The goal here was to make some unifying modules for all the services, hence the `server_magement` module.
Nothing more than RPI GPIO and Python 3 is needed.
python3 odometry.py
There was an attempt to unify every service, so that you would not need to manage 4 windows.
You can see the code in the file `master_server.py`.
Create the ORB-SLAM docker using this repo.