/bfft_formula-student_driverless

Foundation for future development of autonomous features of the Black Forest Formula Team. Includes: CAN data recording in ROS1, visualization in Tableau, an automated data conversion pipeline, and remote control of the Jetson AGX from a Windows PC via SSH over Wi-Fi/LAN

MIT License

BFFT_Logo

Black Forest Formula Team - Formula Student Driverless 2021

This repository lays the foundation for future development of autonomous features of the Black Forest Formula Team, located in Offenburg. This ReadMe gives an overview to get you started; for more information, please refer to the Wiki in this repo. This repository, as well as our subrepositories, is created and maintained by the Black Forest Formula Team at the University of Applied Sciences Offenburg.


Repository organisation


Introduction

This setup and code implementation serves three main purposes:

  1. Visualize data of our first electric racecar during test drives and real races as close to the run as possible (ideally in real time)
  2. Record all relevant incoming data during a test run for later simulation purposes (be it for the control system, temperature simulation, object detection, or others)
  3. Build the foundation for our car to one day drive with autonomous features

Therefore, we decided to use ROS (and soon ROS2) to implement the goals stated above. Right now we are almost ready to fulfill goals no. 1 and no. 2; for goal no. 3 there is still quite a way to go. The progress of our software setup and how it interacts with our hardware can be seen in the image below.


Installation

Preconditions & Current Setup

Software

Additional and mandatory Libraries and Tools

Please visit this Wiki Page to install all tools and libraries you will need for this system to run.

Install ROS Melodic

To get the system running we first have to install ROS1 melodic (and in the future probably ROS2). The needed steps are mentioned here in the Wiki. If you already installed ROS you can skip this step.

Setup ROS Catkin-Workspace and Download needed Packages

Create the folder structure for the catkin workspace. If you already have one, you can use it; you might need to adjust the folder paths accordingly.

mkdir -p ~/catkin_ws/src

Clone packages into src folder inside workspace

Due to the structure of ROS, functionality is organized in packages. The following packages need to be installed to use all features of the system. If you only need to visualize data from the CAN bus, you can skip everything related to Intel's RealSense SDK for the cameras.

More detail on the setup process can be found in the Wiki

  • ros_canopen: Forwards incoming and outgoing CAN messages to and from ROS topics; the interface between our logic and the CAN hardware.
cd ~/catkin_ws/src/
git clone https://github.com/ros-industrial/ros_canopen.git
  • realsense-ros: ROS wrapper for the Intel RealSense cameras (only needed if you use the D455 cameras).
cd ~/catkin_ws/src/
git clone https://github.com/IntelRealSense/realsense-ros.git
sudo apt-get install ros-melodic-ddynamic-reconfigure
  • bfft_can_bus_msgs_to_ros_topic: Decodes incoming CAN messages and publishes them to ROS topics.
cd ~/catkin_ws/src/
git clone https://github.com/Black-Forest-Formula-Team/bfft_can_bus_msgs_to_ros_topic.git
  • bfft_rosbag_data_conversion: Takes data recorded in ROSBAGs (ROS's internal data format) and exports it into CSV files (one per topic). The CSV files are used for data visualization purposes.
cd ~/catkin_ws/src/
git clone https://github.com/Black-Forest-Formula-Team/bfft_rosbag_data_conversion.git

For more input please refer to the Catkin Docs

Build Process

Now we can build the workspace (provided all libraries are installed) with the packages downloaded above.

catkin_make

If a library is missing, install it via sudo apt-get install ros-melodic-libraryname if it's a ROS library, or via pip3 install libraryname if it's a Python 3 library.

Source the setup file to be able to execute ROS commands from every terminal:

echo "source ~/catkin_ws/devel/setup.bash" >> ~/.bashrc
source ~/.bashrc

More detail in the Wiki.


Getting started

Our setup includes the Jetson AGX, two D455 cameras, one IMU from GeneSys (ADMA Slim), as well as several CAN sensors and actuators (for example two motors, inverters, wheel speed sensors, the BMS, ...), as can be seen in the image below.

Intel Camera - Setup

Connect the Intel D455 or a similar Intel camera via USB 3.0 (make sure that you use a USB 3.0 cable). To check that the ROS RealSense SDK works, try: roslaunch realsense2_camera demo_pointcloud.launch. A point cloud displayed in RViz should show up.

Here you can find a little guide with more input on the camera setup.

CAN Bus of Jetson AGX Xavier - Setup

To receive and send CAN bus data between a sensor and the AGX, you need to set up and wire the hardware. This is described in our Wiki here.
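Once the interface is wired and configured, incoming frames can be inspected with candump from can-utils. As a hedged sketch (the interface name and sample frame below are illustrative, not actual car data), a candump output line can be parsed like this:

```python
def parse_candump_line(line):
    """Parse one line of candump output, e.g.
    '  can0  181   [8]  39 30 00 00 00 00 00 00'
    into (interface, arbitration ID, payload bytes)."""
    parts = line.split()
    iface = parts[0]
    can_id = int(parts[1], 16)        # arbitration ID is printed in hex
    dlc = int(parts[2].strip("[]"))   # data length code: number of payload bytes
    data = bytes(int(b, 16) for b in parts[3:3 + dlc])
    return iface, can_id, data

iface, can_id, data = parse_candump_line(
    "  can0  181   [8]  39 30 00 00 00 00 00 00")
print(hex(can_id), data.hex())  # prints: 0x181 3930000000000000
```

This is handy for a quick sanity check of the wiring before involving ROS at all.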

Connecting IMU and sensors via CAN bus

Our Wiki guide for this section can be found here.

Assuming the previous steps worked correctly (setting up the AGX and building the ROS packages in the catkin workspace using catkin_make), it should now be possible to call the following roslaunch command to start listening on CAN0, decoding the CAN messages, and writing them to topics.

roslaunch bfft_CAN_msgs_to_ROS_topic Start_Data_Collection.launch 

If you would like to see data coming in, you can try one of the following as long as the IMU is attached:

rostopic echo /imu/imu_data
rostopic echo /imu/gps_data

It is possible to get a list of all available topics by typing rostopic list.
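The decoding that the launch file performs essentially maps raw CAN payload bytes to physical values via a factor and an offset, as defined in the sensors' CAN databases. The following is a minimal sketch of that idea; the signal layout, CAN ID, and scaling are hypothetical examples, not the IMU's actual definition:

```python
import struct

# Hypothetical signal definition: a 16-bit little-endian raw value in the
# first two payload bytes, scaled to physical units via factor and offset.
ACCEL_X = {"can_id": 0x181, "start_byte": 0, "factor": 0.01, "offset": -100.0}

def decode_signal(can_id, data, signal):
    """Decode one 16-bit unsigned signal from an 8-byte CAN payload."""
    if can_id != signal["can_id"]:
        return None  # frame belongs to another sensor
    raw = struct.unpack_from("<H", data, signal["start_byte"])[0]
    return raw * signal["factor"] + signal["offset"]

# Example frame with raw value 12345: 12345 * 0.01 - 100.0 = 23.45
frame = struct.pack("<H6x", 12345)
print(decode_signal(0x181, frame, ACCEL_X))
```

A node doing this per signal and publishing the results is all that stands between raw CAN frames and the /imu/* topics above.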

How to start recording data into ROSBAGs

rosbag record -a

For more detail have a look at the Wiki Page.


Usage examples

Convenience scripts for AGX ROS remote control via Ethernet/Wifi

  • Scripts to enable remote control of the AGX: Useful if no monitor is available to connect to the AGX, for example when the racecar is standing on the race track. Remote control is needed to start and stop the autonomous system or to copy the recorded data from the AGX to a Windows PC.
cd
git clone https://github.com/Black-Forest-Formula-Team/bfft_scripts.git
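At their core, these scripts wrap SSH calls to the AGX. A hedged sketch of the pattern follows; the user name and host address are assumptions (192.168.55.1 is the AGX's default USB-device address, your Wi-Fi/LAN address will differ), so adjust them to your setup:

```shell
#!/bin/sh
# Hedged sketch of the remote-control pattern: run a script on the AGX via SSH.
# AGX_USER and AGX_HOST are assumptions -- adjust them to your network setup.
AGX_USER="nvidia"
AGX_HOST="192.168.55.1"

remote_cmd() {
    # Print the ssh invocation instead of executing it
    # (drop the echo to actually run it on the AGX).
    echo ssh "${AGX_USER}@${AGX_HOST}" "sh ~/scripts/$1"
}

remote_cmd startROS.sh
remote_cmd stopROS.sh
```

Copying recorded data back works the same way with scp instead of ssh.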

Start autonomous system on Ubuntu (Jetson AGX)

Starts the CAN connection, reads in messages and transforms them into ROS topics, and saves everything as ROSBAGs:

sh ~/scripts/startROS.sh

As you can see in the gif below, starting the bash script launches all relevant ROS nodes in the background. This is a convenient way to start the whole system, and it pays off when controlling the system remotely: starting, stopping, or copying data.

ROSBAG to CSV

Stop autonomous system on Ubuntu (Jetson AGX)

Kills all ROS processes, including ROSBAG recordings, and converts the latest (or a specified) ROSBAG into CSV files (one per topic) so they can be displayed in Tableau or other visualization apps.

sh ~/scripts/stopROS.sh

Convert ROSBAG to CSV file on Ubuntu (Jetson AGX)

For simplicity, this script is called directly from inside the stopROS.sh bash script. As stated before, it converts the latest (or a specified) ROSBAG into CSV files.

sh ~/scripts/rosbagToCSV.sh

ROSBAG to CSV
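The per-topic CSV layout that the conversion produces can be sketched as follows; the topic names, fields, and values below are illustrative, not the exact columns rosbagToCSV.sh emits:

```python
import csv
import os
import tempfile

# Illustrative recorded messages: (topic, timestamp, field dict).
messages = [
    ("/imu/imu_data", 1618387200.10, {"accel_x": 0.12, "accel_y": -0.03}),
    ("/imu/imu_data", 1618387200.20, {"accel_x": 0.15, "accel_y": -0.01}),
    ("/imu/gps_data", 1618387200.15, {"lat": 48.4586, "lon": 7.9411}),
]

def export_per_topic(msgs, out_dir):
    """Write one CSV file per topic, one row per message with its timestamp."""
    by_topic = {}
    for topic, stamp, fields in msgs:
        by_topic.setdefault(topic, []).append((stamp, fields))
    paths = []
    for topic, rows in by_topic.items():
        # "/imu/imu_data" -> "imu_imu_data.csv"
        path = os.path.join(out_dir, topic.strip("/").replace("/", "_") + ".csv")
        with open(path, "w", newline="") as f:
            writer = csv.writer(f)
            writer.writerow(["timestamp"] + sorted(rows[0][1]))
            for stamp, fields in rows:
                writer.writerow([stamp] + [fields[k] for k in sorted(fields)])
        paths.append(path)
    return paths

export_per_topic(messages, tempfile.mkdtemp())
```

One flat, timestamped file per topic is exactly the shape Tableau (or any spreadsheet tool) can ingest directly.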

Start autonomous system from Windows Laptop

To ensure ease of use, all scripts can be executed via a simple double-click on a Windows computer prepared for this purpose (see the Wiki). The only requirement is a direct Wi-Fi connection to the AGX (Link).

To start the system from Windows, double-click the "startROS" program.

Stop autonomous system from Windows Laptop

To stop the system from Windows, just double-click the "stopROS" program.

Copy CSV files with CAN-Data to Windows Laptop

To copy the CSV files to the Windows machine, just double-click the "getData" program.

Display Data in Tableau

To display the data in Tableau, a Tableau Desktop license is required, as well as an installation of Tableau Desktop version 2020.4 or newer. If these requirements are met, the data_visualisation.twb file can be opened. You then only have to update the data once under the "data source" tab to display the last run.


Data Visualization Features

Due to the current state of development of the vehicle as a whole, only values that were already available at the time of creation are visualized in this data_visualization.twb file. This is a key reason we chose Tableau: our data acquisition process and Tableau make it extremely easy to integrate and quickly visualize data from new sensors.

The following visualizations are available in the current version:

  • Position data (displayed as geodata on a map)
  • Angular rate (angular velocity)
  • Acceleration

Using the unknown CAN IDs, we can easily and quickly display new sensors that have not been included so far. Below you can see our current dashboard.

tableau_BFFT_Dashboard


Code Repository Conventions

For our coding conventions please visit the wiki page ROS & Python Conventions!


Feedback

Feel free to send us feedback!

If there's anything you'd like to chat about, please feel free to message us on one of our social media platforms:

Support this project by becoming a sponsor. Your logo will show up on our website with a link to your website. [Become a sponsor]


Our Developers

Dev-Team Vehicle Control Unit & Autonomous Driving in alphabetical order


Release History

  • 0.0.1
    • Initial setup, work in progress

Meta

Distributed under the MIT license. See LICENSE.md for more information.


Contributing to one of our Repos

  1. Fork it (https://github.com/Black-Forest-Formula-Team/bfft_can_bus_msgs_to_ros_topic/fork)
  2. Create your feature branch (git checkout -b feature/fooBar)
  3. Commit your changes (git commit -am 'Add some fooBar')
  4. Push to the branch (git push origin feature/fooBar)
  5. Create a new Pull Request

HSO_Logo