This repository contains code for running a custom interactive LED display, together with some helper tools.
The display uses the Fadecandy LED controller board (https://github.com/scanlime/fadecandy), which supports the Open Pixel Control (OPC) protocol.
To use this software, a Fadecandy server must be running and connected to the display via Fadecandy USB controller boards.
This program was designed to run on a Raspberry Pi alongside the Fadecandy fcserver.
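Open Pixel Control is a small binary protocol: each message is a 4-byte header (channel, command, big-endian data length) followed by the payload, and command 0 sets pixel colors as RGB triples. As a rough illustration (the helper name below is mine, not from this repo), building such a message in Python looks like this:

```python
import struct

def opc_set_pixels(pixels, channel=0):
    """Build an OPC 'set pixel colors' message (command 0).

    Header: channel (1 byte), command (1 byte),
    data length (2 bytes, big-endian), then RGB triples.
    """
    data = bytes(c for rgb in pixels for c in rgb)
    return struct.pack(">BBH", channel, 0, len(data)) + data

# Three pixels: red, green, blue.
msg = opc_set_pixels([(255, 0, 0), (0, 255, 0), (0, 0, 255)])
```

fcserver listens for OPC on TCP port 7890 by default, so a message like this can be sent over a plain TCP socket.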
Building is currently a multi-stage process:
- Build the animations themselves (note: this requires using the Processing GUI, since the CLI is broken, see processing/processing#5468):
  - Open each Processing sketch under `animations/`
  - File -> Export Application...
  - Click Export
  - Make sure each sketch directory `animations/animation_name/` contains `application.linux-armv6hf` and `application.linux-amd64`
- Now run `./build-animations.sh`
- Create the thumbnails by running `./make-thumbnails.sh`. (Note: this will start the server and send actual traffic that is recorded and made into animated GIFs, so don't touch the webapp during this. Details in `src/thumbgen/README.md`.)
- Now build the executables and static files for all platforms (Windows, OSX, Linux, Linux ARM) by running `./build.sh`
- Finally, build the Debian packages with `./build-debian.sh`
The assumption is that the RPi will run in "headless" mode. This means we need to use Xvfb as a virtual screen buffer, since Processing needs a display to generate the animations.
Follow the instructions in README_FADECANDY.md on how to build and install Fadecandy.
- Copy the Debian package to the RPi somehow, for example using SCP:

      scp build/siknas-skylt-server-armhf.deb my-rpi:.  # my-rpi is the IP or hostname of the RPi

- Install the Debian package:

      sudo dpkg -i ./siknas-skylt-server-armhf.deb  # This complains about unmet dependencies
      sudo apt-get -f install                       # Fixes the dependencies
When developing, you can run everything under Docker. See TUTORIAL.md for details.
    # Create and edit src/server/siknas.yaml
    cp src/server/siknas.yaml.example src/server/siknas.yaml

    # Start the server.
    docker-compose up -d

    # (Separate window) Run Xvfb inside the server Docker container.
    ./run_xvfb.sh

    # Surf to http://localhost:8080 (Linux)
    open http://$(docker-machine ip):8080   # OSX
    start http://$(docker-machine ip):8080  # Windows
Example image of the real-world display.
To enable developing and testing animations for the display, a simulator was created in Unity: https://github.com/JoakimSoderberg/OPCSim
- `docs` - Contains some documentation on how the display works.
- `animations/` - Contains Processing sketches that animate the display using OPC.
- `image-gui/` - A .NET C# program used to map real pixel locations to the virtual ones (used to produce the `layout.json` that the Processing sketches use).
- `layouts` - Contains the `layout.json` created using the `image-gui`, and the source image used to do this.
- `scripts/` - A script to re-scale the coordinates in `layout.json`.
- `src/controlpanel/` - A websocket client, written in Golang, that talks to the control panel via a serial port over USB and connects to the server.
- `src/server/` - A server that hosts an OPC proxy, as well as a webserver and websockets server. It forwards to the display the OPC traffic coming from the Processing sketches (which it starts and stops), and also broadcasts that traffic to connected webclients via websockets. The webserver hosts a web page that lets the user choose which Processing sketch to run.
- `src/server/static/` - Hosts a webpage, written in Aurelia, that displays the list of Processing sketches used to animate the display. This webpage also listens to the binary OPC traffic and uses D3 to animate an SVG copy of the real display.
- `src/thumbgen/` - Used to generate GIF thumbnails by looping through and recording all the animations. These are used in the webapp to show a preview of each animation.
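The coordinate re-scaling done by the script in `scripts/` can be sketched in a few lines of Python. This assumes the common OPC layout format, a JSON list of `{"point": [x, y, z]}` objects; the function name is illustrative and not the actual script:

```python
import json

def rescale_layout(src, dest, factor):
    """Multiply every coordinate in an OPC-style layout file by a factor.

    Assumes the layout is a JSON list of objects like
    {"point": [x, y, z]} (the common OPC layout format).
    """
    with open(src) as f:
        layout = json.load(f)
    for entry in layout:
        entry["point"] = [c * factor for c in entry["point"]]
    with open(dest, "w") as f:
        json.dump(layout, f, indent=2)
```

For example, `rescale_layout("layouts/layout.json", "layouts/layout-scaled.json", 0.5)` would halve all coordinates.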
The webapp displays animated GIFs as thumbnails for the animations. To regenerate these, run:

    ./make_thumbnails.sh

Or look at `src/thumbgen/README.md` for details.
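Since the server broadcasts the raw binary OPC traffic to connected web clients, any client (like the webapp's SVG animation) needs to split the byte stream back into individual messages. A sketch of that framing logic in Python, assuming the broadcast is simply concatenated OPC messages (the function name is mine):

```python
import struct

def split_opc_frames(buf):
    """Split concatenated OPC messages into (channel, command, payload)
    tuples, returning the parsed frames and any leftover bytes."""
    frames = []
    while len(buf) >= 4:
        channel, command, length = struct.unpack(">BBH", buf[:4])
        if len(buf) < 4 + length:
            break  # incomplete frame, wait for more data
        frames.append((channel, command, buf[4:4 + length]))
        buf = buf[4 + length:]
    return frames, buf
```

Leftover bytes should be kept and prepended to the next chunk received, since message boundaries need not line up with network reads.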