waveform80/pistreaming

python websocket code to grab stream

Closed this issue · 4 comments

I need to grab the video stream from the URL so that I can process the video. What should the accompanying code be to retrieve images from the websocket? I am trying something like the following, but it is not working.

from PIL import Image
import websocket
import cStringIO
import base64
import cv2

class WSClient():
    def __init__(self):
        websocket.enableTrace(False)
        self.ws = websocket.WebSocketApp("ws://192.168.0.105:8084",
                                         on_message=self.on_message,
                                         on_error=self.on_error,
                                         on_close=self.on_close)
        self.ws.on_open = self.on_open
        self.ws.run_forever()

    def on_message(self, ws, message):
        print "here"
        # image_string = cStringIO.StringIO(base64.b64decode(message))
        image = Image.open(message)
        image.show()
        cv2.imshow("asd",image)
        # print 'a'+image_string

    def on_error(self, ws, error):
        print error

    def on_close(self, ws):
        print "connection closed"

    def on_open(self, ws):
        print "connected"


if __name__ == "__main__":
    client = WSClient()

pistreaming isn't an MJPEG streaming solution (in fact that was something I explicitly set out to avoid); it uses MPEG1 (not great, but at least it's a proper video format). However, that does mean it's not suitable for opening with PIL (which only handles image formats, not video formats). If you want to process each frame your choices are:

  1. On the Pi: process the stream of YUV frames going to ffmpeg (see the sketch after this list)
  2. On the client: decode the MPEG1 stream from the websocket and process each frame
  3. Use something else that's MJPEG based (the latest picamera docs include a simple MJPEG streaming recipe)
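
For option 1, the server records from the camera in 'yuv' format to a file-like object whose output feeds ffmpeg, so you can intercept each raw frame in that object's write() method. Below is a minimal, self-contained sketch of the idea rather than pistreaming's actual classes: the ProcessingOutput name, the 640x480 resolution and the Y-plane-only processing are illustrative assumptions.

import numpy as np
import picamera

WIDTH, HEIGHT = 640, 480              # 640x480 already matches the 32x16 alignment picamera pads raw YUV frames to
FRAME_SIZE = WIDTH * HEIGHT * 3 // 2  # size of one YUV420 frame in bytes

class ProcessingOutput(object):
    """File-like object handed to start_recording(); write() receives raw YUV420 data."""
    def __init__(self, downstream=None):
        self.downstream = downstream  # e.g. ffmpeg's stdin, if you still want to stream as well
        self.buffer = b''

    def write(self, data):
        self.buffer += data
        while len(self.buffer) >= FRAME_SIZE:
            frame = self.buffer[:FRAME_SIZE]
            self.buffer = self.buffer[FRAME_SIZE:]
            # Grab just the Y (luma) plane as a grayscale numpy array; cheap enough to analyse on the Pi
            y = np.frombuffer(frame, dtype=np.uint8,
                              count=WIDTH * HEIGHT).reshape(HEIGHT, WIDTH)
            # ... run your per-frame processing on y here ...
            if self.downstream is not None:
                self.downstream.write(frame)

    def flush(self):
        if self.downstream is not None:
            self.downstream.flush()

with picamera.PiCamera(resolution=(WIDTH, HEIGHT), framerate=24) as camera:
    camera.start_recording(ProcessingOutput(), 'yuv')
    camera.wait_recording(30)
    camera.stop_recording()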

OK. How do I fetch the images from the link, to replace cv2.VideoCapture()?

Sorry, not sure what you mean by "fetch images from the link"? Do you mean you want to decode the MPEG1 stream from the websocket on the client side?

If so, you'll need something like OpenCV's VideoCapture, although looking at the docs for that class it doesn't look like there's an option for opening a stream of bytes (e.g. from a pipe), which is rather disappointing.

You could try creating a FIFO, dumping the stream to it, and opening it as a file with VideoCapture (you'll probably need to give the FIFO a sensible extension). Whether this works will depend on whether VideoCapture wants to seek() the file; if so, you're out of luck and will need to find another way. You'll also want to strip off JSMPEG's 8-byte header before writing to the FIFO, as that's the first thing sent to any new websocket connection.
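
Something like the following might serve as a starting point on the client. It's a rough sketch, assuming the websocket-client package, a /tmp/pistream.mpg FIFO and an OpenCV build that will read a growing file without seeking; if VideoCapture insists on seeking, this simply won't work and you'll need another decoder.

import os
import threading

import cv2
import websocket

WS_URL = 'ws://192.168.0.105:8084'
FIFO = '/tmp/pistream.mpg'   # a sensible extension gives VideoCapture a hint about the format

def pump(url, fifo_path):
    """Receive binary websocket messages and append them to the FIFO."""
    state = {'first': True}
    fifo = open(fifo_path, 'wb')   # blocks until VideoCapture opens the FIFO for reading

    def on_message(ws, message):
        data = message
        if state['first']:
            data = data[8:]        # drop JSMPEG's 8-byte magic/width/height header
            state['first'] = False
        fifo.write(data)
        fifo.flush()

    websocket.WebSocketApp(url, on_message=on_message).run_forever()

if not os.path.exists(FIFO):
    os.mkfifo(FIFO)

writer = threading.Thread(target=pump, args=(WS_URL, FIFO))
writer.daemon = True
writer.start()

cap = cv2.VideoCapture(FIFO)
while True:
    ok, frame = cap.read()
    if not ok:
        continue
    cv2.imshow('pistreaming', frame)
    if cv2.waitKey(1) & 0xFF == ord('q'):
        break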

Closing as answered - do feel free to re-open if you need more detail!