brilliantlabsAR/frame-codebase

[feature request] Video input over Bluetooth

Closed this issue · 9 comments

This is a discussion to explore all the ways to implement video transfer over Bluetooth towards Frame display.

The two points that could throttle the transfer are likely:

  • Phone <---Bluetooth---> MCU communication
  • MCU <---SPI---> FPGA communication

As well as the amount of processing done on the MCU and FPGA.
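
To get a feel for the numbers, here is a back-of-envelope frame budget: raw frame sizes for the modes discussed below, and the frame rate each would allow at an assumed usable Bluetooth throughput. The 1 Mbps figure is an assumption for illustration, not a measured number for Frame.

```python
# Rough frame budget for uncompressed transfer. The throughput value
# is an assumption; real BLE payload rates depend on PHY, connection
# interval, and MTU.

ASSUMED_THROUGHPUT_BPS = 1_000_000  # assumption: usable payload bits/s

MODES = [
    ("400x400 RGB888", 400 * 400 * 24),
    ("400x400 16-color (4bpp)", 400 * 400 * 4),
    ("200x200 RGB888", 200 * 200 * 24),
    ("200x200 16-color (4bpp)", 200 * 200 * 4),
    ("100x100 RGB888", 100 * 100 * 24),
    ("100x100 16-color (4bpp)", 100 * 100 * 4),
]

for name, bits_per_frame in MODES:
    fps = ASSUMED_THROUGHPUT_BPS / bits_per_frame
    print(f"{name:26s} {bits_per_frame // 8:7d} B/frame  ~{fps:5.1f} fps")
```

Even under this optimistic assumption, only the small indexed-color modes get anywhere near usable frame rates without compression.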

A few solutions that have come up are:

  • JPEG decoder counterpart to the encoder
  • Use the bitmap library to stream data in a format the FPGA can use directly: 16-color mode, updating the palette at every frame
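
As a sketch of the second idea, here is one possible host-side framing: a per-frame RGB palette followed by 4 bpp pixel indices, two pixels per byte. The layout and the `pack_frame` name are assumptions for illustration, not Frame's actual wire format.

```python
# Hypothetical "palette + 4bpp indices" frame layout: 16 RGB entries
# (48 bytes, zero-padded), then packed pixel indices. This is an
# illustrative format, not the real Frame protocol.

def pack_frame(palette: list[tuple[int, int, int]], indices: list[int]) -> bytes:
    """palette: up to 16 RGB triplets; indices: one 0..15 value per pixel."""
    assert len(palette) <= 16 and len(indices) % 2 == 0
    out = bytearray()
    for r, g, b in palette:
        out += bytes((r, g, b))
    out += bytes(48 - len(out))  # pad palette to 16 entries
    for hi, lo in zip(indices[0::2], indices[1::2]):
        out.append((hi << 4) | lo)  # two 4-bit pixels per byte
    return bytes(out)

# A 100x100 frame packs to 48 + 100*100//2 = 5048 bytes.
frame = pack_frame([(255, 0, 0), (0, 0, 0)], [0, 1] * (100 * 100 // 2))
print(len(frame))  # 5048
```

Updating the palette every frame keeps the per-frame overhead at a fixed 48 bytes while letting each frame use its own best 16 colors.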

For reference, here is the absolute maximum achievable on an nRF52840 with Bluetooth 5:
https://www.youtube.com/watch?v=K1ItqEZ2_tw

To illustrate what different resolutions and color modes mean in practice:
None of these have been confirmed to be achievable yet!

400x400 full RGB888:
image_400_888

400x400 16 indexed colors:
image_400_16i

200x200 full RGB888:
image_200_888

200x200 16 indexed colors:
image_200_16i

100x100 full RGB888:
image_100_888

100x100 16 indexed colors:
image_100_16i

Here is what could be done for the Monocle:
output

Not yet tested on the hardware.

This is what could be achieved on actual Monocle hardware:
mpv-shot0008

As seen above, there are still limitations that depend on the complexity of the image.

Deep dive into how JPEG works: Everything You Need to Know About JPEG Tutorial Series
Leaving this here for future reference

Git repo that goes with the video series, used as a reference for the encoder:
https://github.com/dannye/jed

Quick update: the EBR/LUT resources are almost full, so it's unlikely we can support both JPEG decoding and encoding at the same time.

If two bitstreams can both fit on the flash, does that mean it is possible to restart the FPGA at runtime with the alternative bitfile, cycling through features as the API uses them?

I think the LZO-compressed image was still quite large already though...

Some great links here. LZO is probably the best option for fast decompression on the device (API coming soon), but it isn't very useful for images.

Unfortunately, there aren't enough resources left for jpeg decompression on the device.

However, slow frame rates could work. Remember that Frame can only show 16 colors at a time per frame, which reduces the format down to 4bpp images. Combined with some posterization off-device and LZO, it could be possible to get a few FPS: send the compressed bytes over the Bluetooth data channel, decompress, and print to the display using frame.display.bitmap() (details coming soon once the API is done).
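
A minimal off-device sketch of that pipeline: posterize to 16 levels, pack to 4 bpp, then compress. LZO is not in the Python stdlib, so zlib stands in here purely to illustrate that flat-color 4bpp content compresses well; real ratios for LZO on real images will differ.

```python
import zlib

def posterize_to_4bpp(pixels: list[int]) -> bytes:
    """Map 8-bit grayscale pixels to 4-bit indices and pack pairwise."""
    idx = [p >> 4 for p in pixels]          # 256 levels -> 16 levels
    if len(idx) % 2:
        idx.append(0)
    return bytes((a << 4) | b for a, b in zip(idx[0::2], idx[1::2]))

# A synthetic 200x200 frame with large flat regions, like UI graphics.
pixels = [0] * 20000 + [128] * 10000 + [255] * 10000
packed = posterize_to_4bpp(pixels)          # 20000 bytes at 4 bpp
compressed = zlib.compress(packed)
print(len(packed), len(compressed))         # flat content compresses heavily
```

The win comes from the posterization step: once neighboring pixels collapse to the same 4-bit index, long runs appear and any fast byte-oriented compressor can exploit them.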

Keep an eye on issues: https://github.com/brilliantlabsAR/frame-codebase/issues/114 and #81

Some feedback suggests investigating the id RoQ codec (https://multimedia.cx/mirror/idroq.txt), a low-resource, low-complexity codec widely used on early inexpensive, high-volume game consoles.

https://discord.com/channels/963222352534048818/984966420603482182/1240281670939050075
https://github.com/id-Software/Quake-III-Arena/blob/master/code/client/cl_cin.c