
Drone-BCI

Controlling a drone with brain control, a.k.a. awakening the Force, or telekinesis.

first test flight

Note: works on OS X

Materials

  - Emotiv headset
  - 3DR Solo drone and controller
  - A Mac running OS X

Set up and run

Server-side

  1. Put on your Emotiv headset and ensure it's connected
  2. Open the project in Xcode (open OSX_Project/MentalCommand/MentalCommand.xcodeproj) and hit Cmd + R to build and run

Client-side

  1. Clone this repository, then navigate to it in your terminal
  2. Run sudo pip install -r requirements.txt to install dependencies
  3. Run python setup.py install to install local dependencies
  4. Turn on Solo and controller, and connect your computer to its Wifi
  5. Run python drone-control.py

Dev setup instructions

Setting up your Solo development environment

  1. Install solo-cli with pip install -UI git+https://github.com/3drobotics/solo-cli (you may need sudo for pip installs)
  2. Install virtualenv with pip install virtualenv

Preparing your directory

  1. Clone this repository, then navigate to it in your terminal
  2. Run sudo pip install -r requirements.txt to install dependencies
  3. Run python setup.py install to install local dependencies

If you want a SITL (software-in-the-loop simulator), aka a virtual Solo

Get one here

Running code on Solo

  1. Run solo script pack while connected to the internet to bundle your script
  2. Turn on your Solo and connect to its Wifi from your computer
  3. Run solo script run <myscript.py>

Process of making this

Figure out how to talk to Solo

Try out the DroneDirect repo and ensure you can talk to/direct a Solo from there.

The template.py file in the Examples folder is a great place to start. Run it locally, and put some commands in the "your code here" section.

If you're looking for commands, they're well documented in dronedirect/__init__.py in that repo.
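To show the shape of that "your code here" section, here is a runnable sketch. The class and method names below are stand-ins we made up for illustration, not DroneDirect's verified API; consult dronedirect's own source for the real commands.

```python
# Illustrative only: a fake stand-in for the real drone class so this sketch
# runs anywhere. It just records the commands it receives.
class FakeDroneDirect:
    def __init__(self):
        self.log = []  # record of issued commands

    def takeoff(self, alt_m):
        self.log.append(("takeoff", alt_m))

    def translate(self, x=0, y=0, z=0):
        self.log.append(("translate", x, y, z))

    def land(self):
        self.log.append(("land",))


drone = FakeDroneDirect()

# The "your code here" section of template.py would issue commands like:
drone.takeoff(2)       # climb to ~2 m
drone.translate(x=1)   # nudge forward
drone.land()

print(drone.log)
```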

Figure out how to get what you need from the Emotiv headset

We used this Emotiv Objective-C Example.

Set up a UDS (Unix domain socket)

This tutorial for the Python side makes this pretty easy. Test server/client to make sure it works, then integrate the client side into the main Python code.
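On the Python side, a server/client round trip over a Unix domain socket only takes a few lines. A minimal self-contained sketch (the socket path and test message are arbitrary, and both ends run in one process here just to demonstrate the handshake):

```python
import os
import socket
import threading

SOCK_PATH = "/tmp/bci_demo.sock"  # arbitrary path for this demo
if os.path.exists(SOCK_PATH):
    os.remove(SOCK_PATH)

# Server side: bind a Unix domain socket and listen for a connection.
server = socket.socket(socket.AF_UNIX, socket.SOCK_STREAM)
server.bind(SOCK_PATH)
server.listen(1)

def echo_once():
    # Accept one connection and echo back whatever arrives.
    conn, _ = server.accept()
    conn.sendall(conn.recv(1024))
    conn.close()

t = threading.Thread(target=echo_once)
t.start()

# Client side: connect to the same path, send a message, read the echo.
client = socket.socket(socket.AF_UNIX, socket.SOCK_STREAM)
client.connect(SOCK_PATH)
client.sendall(b"push")
reply = client.recv(1024)

t.join()
client.close()
server.close()
os.remove(SOCK_PATH)
print(reply)
```

In the real project the two ends live in separate processes: the Mac app owns the server socket and the drone-control script connects as the client.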

We then sent actions over the server-side UDS from the Mac app (Emotiv-side) packaged as JSON, and parsed the incoming JSON into actions on the client (drone) side.
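The JSON framing can be as simple as the sketch below. The action names ("takeoff", "land") and message shape are our illustration of the pattern, not necessarily the exact protocol this project used:

```python
import json

# Emotiv/Mac side: package a detected mental command as a JSON message.
def encode_action(name):
    return json.dumps({"action": name}).encode()

# Drone side: map incoming action names to handlers. These handlers just
# return strings; the real ones would call into the drone-control API.
HANDLERS = {
    "takeoff": lambda: "taking off",
    "land": lambda: "landing",
}

def handle_message(raw):
    # Parse the incoming JSON and dispatch to the matching drone action.
    msg = json.loads(raw.decode())
    return HANDLERS[msg["action"]]()

result = handle_message(encode_action("takeoff"))
print(result)
```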

Useful links

Credit where credit is due

Much of the drone control piece of this project is adapted from DroneDirect: https://github.com/djnugent/dronedirect