thaytan/gst-rpicamsrc

ERROR: from element /GstPipeline:pipeline0/GstRpiCamSrc:src: Internal data stream error.

joshuarlowry opened this issue · 8 comments

I am working on setting up FruitNanny to watch our dog. https://ivadim.github.io/2017-08-21-fruitnanny/

System Information:

uname -r
4.9.59-v7+

lsb_release -a
No LSB modules are available.
Distributor ID: Raspbian
Description: Raspbian GNU/Linux 9.3 (stretch)
Release: 9.3
Codename: stretch

Base Image:
2017-11-29-raspbian-stretch-lite.zip on a Raspberry Pi 2.
Pi Noir Camera V2.1

Issue:
I'm having an issue running a stream after performing the setup.

When I run the test command from the docs, it successfully creates a file.

gst-launch-1.0 rpicamsrc bitrate=1000000 ! filesink location=test.h264
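The output of that command is a raw H.264 byte-stream, which many players won't open directly. As a sketch, it can be wrapped into an MP4 container for playback (mp4mux comes with gst-plugins-good; the filenames are just examples):

```shell
# Wrap the raw byte-stream in an MP4 container so ordinary players can open it.
gst-launch-1.0 filesrc location=test.h264 ! h264parse ! mp4mux ! filesink location=test.mp4
```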

When I try to create a stream it fails. See the result below.

gst-launch-1.0 -v rpicamsrc name=src preview=0 exposure-mode=night fullscreen=0 bitrate=1000000 annotation-mode=time+date annotation-text-size=20 ! video/x-h264,width=960,height=540,framerate=8/1 ! queue max-size-bytes=0 max-size-buffers=0 ! h264parse ! rtph264pay config-interval=1 pt=96 ! queue ! udpsink host=127.0.0.1 port=5004  sync=false

Console messages:

Setting pipeline to PAUSED ...
Pipeline is live and does not need PREROLL ...
/GstPipeline:pipeline0/GstRpiCamSrc:src.GstPad:src: caps = video/x-h264, width=(int)960, height=(int)540, framerate=(fraction)8/1, stream-format=(string)byte-stream, alignment=(string)nal, profile=(string)baseline
/GstPipeline:pipeline0/GstCapsFilter:capsfilter0.GstPad:src: caps = video/x-h264, width=(int)960, height=(int)540, framerate=(fraction)8/1, stream-format=(string)byte-stream, alignment=(string)nal, profile=(string)baseline
/GstPipeline:pipeline0/GstQueue:queue0.GstPad:sink: caps = video/x-h264, width=(int)960, height=(int)540, framerate=(fraction)8/1, stream-format=(string)byte-stream, alignment=(string)nal, profile=(string)baseline
/GstPipeline:pipeline0/GstQueue:queue0.GstPad:src: caps = video/x-h264, width=(int)960, height=(int)540, framerate=(fraction)8/1, stream-format=(string)byte-stream, alignment=(string)nal, profile=(string)baseline
/GstPipeline:pipeline0/GstCapsFilter:capsfilter0.GstPad:sink: caps = video/x-h264, width=(int)960, height=(int)540, framerate=(fraction)8/1, stream-format=(string)byte-stream, alignment=(string)nal, profile=(string)baseline
/GstPipeline:pipeline0/GstH264Parse:h264parse0.GstPad:sink: caps = video/x-h264, width=(int)960, height=(int)540, framerate=(fraction)8/1, stream-format=(string)byte-stream, alignment=(string)nal, profile=(string)baseline
Setting pipeline to PLAYING ...
New clock: GstSystemClock
ERROR: from element /GstPipeline:pipeline0/GstRpiCamSrc:src: Internal data stream error.
Additional debug info:
gstbasesrc.c(2950): gst_base_src_loop (): /GstPipeline:pipeline0/GstRpiCamSrc:src:
streaming stopped, reason error (-5)
Execution ended after 0:00:00.734776907
Setting pipeline to PAUSED ...
Setting pipeline to READY ...
Setting pipeline to NULL ...
Freeing pipeline ...
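"Internal data stream error" is a generic message; GStreamer can be asked for more detail per debug category. A sketch, assuming the element registers its debug category under the name rpicamsrc:

```shell
# Raise the debug level for the source element (5 = DEBUG) while keeping
# everything else at warnings (2), and capture stderr to a log file.
GST_DEBUG=2,rpicamsrc:5 gst-launch-1.0 rpicamsrc bitrate=1000000 ! fakesink 2> rpicamsrc.log

# Failures at the firmware level are printed by the MMAL libraries themselves:
grep -i mmal rpicamsrc.log
```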

Does it work to record to a file if you use all the same rpicamsrc parameters? I wonder if you're running into the startup timeout bug. If so, an edit like 4e5729b but with a larger timeout should help.

Below are several commands with the resulting console output. How can I tell if I am running into the timeout bug?

Command

gst-launch-1.0 -v rpicamsrc name=src preview=0 exposure-mode=night fullscreen=0 bitrate=1000000 annotation-mode=time+date annotation-text-size=20 ! video/x-h264,width=960,height=540,framerate=8/1 ! queue max-size-bytes=0 max-size-buffers=0 ! h264parse ! rtph264pay config-interval=1 pt=96 ! queue ! filesink location=~/test.h264

Console:

Setting pipeline to PAUSED ...
Pipeline is live and does not need PREROLL ...
/GstPipeline:pipeline0/GstRpiCamSrc:src.GstPad:src: caps = video/x-h264, width=(int)960, height=(int)540, framerate=(fraction)8/1, stream-format=(string)byte-stream, alignment=(string)nal, profile=(string)baseline
/GstPipeline:pipeline0/GstCapsFilter:capsfilter0.GstPad:src: caps = video/x-h264, width=(int)960, height=(int)540, framerate=(fraction)8/1, stream-format=(string)byte-stream, alignment=(string)nal, profile=(string)baseline
/GstPipeline:pipeline0/GstQueue:queue0.GstPad:sink: caps = video/x-h264, width=(int)960, height=(int)540, framerate=(fraction)8/1, stream-format=(string)byte-stream, alignment=(string)nal, profile=(string)baseline
/GstPipeline:pipeline0/GstCapsFilter:capsfilter0.GstPad:sink: caps = video/x-h264, width=(int)960, height=(int)540, framerate=(fraction)8/1, stream-format=(string)byte-stream, alignment=(string)nal, profile=(string)baseline
/GstPipeline:pipeline0/GstH264Parse:h264parse0.GstPad:sink: caps = video/x-h264, width=(int)960, height=(int)540, framerate=(fraction)8/1, stream-format=(string)byte-stream, alignment=(string)nal, profile=(string)baseline
Setting pipeline to PLAYING ...
New clock: GstSystemClock
mmal: mmal_vc_component_enable: failed to enable component: ENOSPC
ERROR: from element /GstPipeline:pipeline0/GstRpiCamSrc:src: Internal data stream error.
Additional debug info:
gstbasesrc.c(2950): gst_base_src_loop (): /GstPipeline:pipeline0/GstRpiCamSrc:src:
streaming stopped, reason error (-5)
Execution ended after 0:00:00.161164569
Setting pipeline to PAUSED ...
Setting pipeline to READY ...
Setting pipeline to NULL ...
Freeing pipeline ...

Command - removed the video/x-h264 caps filter

gst-launch-1.0 -v rpicamsrc name=src preview=0 exposure-mode=night fullscreen=0 bitrate=1000000 annotation-mode=time+date annotation-text-size=20 ! queue max-size-bytes=0 max-size-buffers=0 ! h264parse ! rtph264pay config-interval=1 pt=96 ! queue ! filesink location=~/test.h264

Console

Setting pipeline to PAUSED ...
Pipeline is live and does not need PREROLL ...
/GstPipeline:pipeline0/GstRpiCamSrc:src.GstPad:src: caps = video/x-h264, width=(int)1920, height=(int)1080, framerate=(fraction)30/1, stream-format=(string)byte-stream, alignment=(string)nal, profile=(string)baseline
/GstPipeline:pipeline0/GstQueue:queue0.GstPad:src: caps = video/x-h264, width=(int)1920, height=(int)1080, framerate=(fraction)30/1, stream-format=(string)byte-stream, alignment=(string)nal, profile=(string)baseline
/GstPipeline:pipeline0/GstH264Parse:h264parse0.GstPad:sink: caps = video/x-h264, width=(int)1920, height=(int)1080, framerate=(fraction)30/1, stream-format=(string)byte-stream, alignment=(string)nal, profile=(string)baseline
/GstPipeline:pipeline0/GstQueue:queue0.GstPad:sink: caps = video/x-h264, width=(int)1920, height=(int)1080, framerate=(fraction)30/1, stream-format=(string)byte-stream, alignment=(string)nal, profile=(string)baseline
Setting pipeline to PLAYING ...
New clock: GstSystemClock
mmal: mmal_vc_component_enable: failed to enable component: ENOSPC
ERROR: from element /GstPipeline:pipeline0/GstRpiCamSrc:src: Internal data stream error.
Additional debug info:
gstbasesrc.c(2950): gst_base_src_loop (): /GstPipeline:pipeline0/GstRpiCamSrc:src:
streaming stopped, reason error (-5)
Execution ended after 0:00:00.161781145
Setting pipeline to PAUSED ...
Setting pipeline to READY ...
Caught SIGSEGV
#0  0x76c6b4cc in __waitpid (pid=21964, stat_loc=0x7e832a18, options=0)
#1  0x76c9e2a8 in g_on_error_stack_trace ()
#2  0x00015050 in ?? ()
Spinning.  Please run 'gdb gst-launch-1.0 21952' to continue debugging, Ctrl-C to quit, or Ctrl-\ to dump core.
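The mmal_vc_component_enable: ... ENOSPC line above is the firmware failing to allocate a component, which on the Pi commonly points at too little memory reserved for the GPU. A sketch of checking and raising it (vcgencmd ships with Raspbian; 128 MB is a common suggestion, not a verified minimum):

```shell
# Report how much RAM is currently reserved for the GPU.
vcgencmd get_mem gpu

# If it reports 64M or less, reserve more by adding the following line to
# /boot/config.txt and rebooting:
#   gpu_mem=128
```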

Just wiped my RPI2 and started from scratch.

I was able to run the following and it seems to be streaming, but I don't know how to test it.

gst-launch-1.0 -v rpicamsrc name=src preview=0 exposure-mode=night fullscreen=0 bitrate=1000000 annotation-mode=time+date annotation-text-size=20 ! udpsink host=127.0.0.1 port=5004  sync=false

Console:

Setting pipeline to PAUSED ...
Pipeline is live and does not need PREROLL ...
/GstPipeline:pipeline0/GstRpiCamSrc:src.GstPad:src: caps = video/x-h264, width=(int)1920, height=(int)1080, framerate=(fraction)30/1, stream-format=(string)byte-stream, alignment=(string)nal, profile=(string)baseline
/GstPipeline:pipeline0/GstUDPSink:udpsink0.GstPad:sink: caps = video/x-h264, width=(int)1920, height=(int)1080, framerate=(fraction)30/1, stream-format=(string)byte-stream, alignment=(string)nal, profile=(string)baseline
Setting pipeline to PLAYING ...
New clock: GstSystemClock

ctrl+c

^Chandling interrupt.
Interrupt: Stopping pipeline ...
Execution ended after 0:02:15.895917312
Setting pipeline to PAUSED ...
Setting pipeline to READY ...
Setting pipeline to NULL ...
Freeing pipeline ...
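One way to confirm the stream is actually flowing without a video client is to watch for the UDP packets themselves. A sketch using tcpdump, here on the loopback interface since the sink targets 127.0.0.1:

```shell
# Print the first five UDP packets arriving on port 5004; steady output
# while the pipeline runs means data is being sent.
sudo tcpdump -i lo -c 5 udp port 5004
```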

This one seems to stream too...

gst-launch-1.0 -v rpicamsrc name=src preview=0 exposure-mode=night fullscreen=0 bitrate=1000000 annotation-mode=time+date annotation-text-size=20 ! queue max-size-bytes=0 max-size-buffers=0 ! h264parse ! rtph264pay config-interval=1 pt=96 ! queue ! udpsink host=127.0.0.1 port=5004  sync=false

console

Setting pipeline to PAUSED ...
Pipeline is live and does not need PREROLL ...
/GstPipeline:pipeline0/GstRpiCamSrc:src.GstPad:src: caps = video/x-h264, width=(int)1920, height=(int)1080, framerate=(fraction)30/1, stream-format=(string)byte-stream, alignment=(string)nal, profile=(string)baseline
/GstPipeline:pipeline0/GstQueue:queue0.GstPad:src: caps = video/x-h264, width=(int)1920, height=(int)1080, framerate=(fraction)30/1, stream-format=(string)byte-stream, alignment=(string)nal, profile=(string)baseline
/GstPipeline:pipeline0/GstH264Parse:h264parse0.GstPad:sink: caps = video/x-h264, width=(int)1920, height=(int)1080, framerate=(fraction)30/1, stream-format=(string)byte-stream, alignment=(string)nal, profile=(string)baseline
/GstPipeline:pipeline0/GstQueue:queue0.GstPad:sink: caps = video/x-h264, width=(int)1920, height=(int)1080, framerate=(fraction)30/1, stream-format=(string)byte-stream, alignment=(string)nal, profile=(string)baseline
Setting pipeline to PLAYING ...
New clock: GstSystemClock
/GstPipeline:pipeline0/GstH264Parse:h264parse0.GstPad:src: caps = video/x-h264, width=(int)1920, height=(int)1080, framerate=(fraction)30/1, stream-format=(string)avc, alignment=(string)au, profile=(string)baseline, parsed=(boolean)true, level=(string)4, codec_data=(buffer)01428028ffe1000f2742802895a01e0089f9601e244d4001000528ce013720
/GstPipeline:pipeline0/GstRtpH264Pay:rtph264pay0.GstPad:src: caps = application/x-rtp, media=(string)video, clock-rate=(int)90000, encoding-name=(string)H264, packetization-mode=(string)1, profile-level-id=(string)428028, sprop-parameter-sets=(string)"J0KAKJWgHgCJ+WAeJE1A\,KM4BNyA\=", payload=(int)96, ssrc=(uint)3321967658, timestamp-offset=(uint)809134821, seqnum-offset=(uint)16843, a-framerate=(string)30
/GstPipeline:pipeline0/GstQueue:queue1.GstPad:sink: caps = application/x-rtp, media=(string)video, clock-rate=(int)90000, encoding-name=(string)H264, packetization-mode=(string)1, profile-level-id=(string)428028, sprop-parameter-sets=(string)"J0KAKJWgHgCJ+WAeJE1A\,KM4BNyA\=", payload=(int)96, ssrc=(uint)3321967658, timestamp-offset=(uint)809134821, seqnum-offset=(uint)16843, a-framerate=(string)30
/GstPipeline:pipeline0/GstRtpH264Pay:rtph264pay0.GstPad:sink: caps = video/x-h264, width=(int)1920, height=(int)1080, framerate=(fraction)30/1, stream-format=(string)avc, alignment=(string)au, profile=(string)baseline, parsed=(boolean)true, level=(string)4, codec_data=(buffer)01428028ffe1000f2742802895a01e0089f9601e244d4001000528ce013720
/GstPipeline:pipeline0/GstQueue:queue1.GstPad:src: caps = application/x-rtp, media=(string)video, clock-rate=(int)90000, encoding-name=(string)H264, packetization-mode=(string)1, profile-level-id=(string)428028, sprop-parameter-sets=(string)"J0KAKJWgHgCJ+WAeJE1A\,KM4BNyA\=", payload=(int)96, ssrc=(uint)3321967658, timestamp-offset=(uint)809134821, seqnum-offset=(uint)16843, a-framerate=(string)30
/GstPipeline:pipeline0/GstUDPSink:udpsink0.GstPad:sink: caps = application/x-rtp, media=(string)video, clock-rate=(int)90000, encoding-name=(string)H264, packetization-mode=(string)1, profile-level-id=(string)428028, sprop-parameter-sets=(string)"J0KAKJWgHgCJ+WAeJE1A\,KM4BNyA\=", payload=(int)96, ssrc=(uint)3321967658, timestamp-offset=(uint)809134821, seqnum-offset=(uint)16843, a-framerate=(string)30
/GstPipeline:pipeline0/GstRtpH264Pay:rtph264pay0: timestamp = 809134821
/GstPipeline:pipeline0/GstRtpH264Pay:rtph264pay0: seqnum = 16843

ctrl+c

^Chandling interrupt.
Interrupt: Stopping pipeline ...
Execution ended after 0:00:53.708912257
Setting pipeline to PAUSED ...
Setting pipeline to READY ...
Setting pipeline to NULL ...
Freeing pipeline ...

Those last runs look like they're working. You're directing traffic to 127.0.0.1, though. If you want to actually watch it, you'll probably want to send it to another machine and have a client listening there on port 5004.
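A receiving pipeline on the client machine might look like the sketch below. The caps on udpsrc are an assumption and must match the sender's payloader settings (H264, pt=96); avdec_h264 comes from gst-libav:

```shell
# Receive the RTP/H.264 stream on port 5004, depayload, decode, and display.
gst-launch-1.0 udpsrc port=5004 \
    caps="application/x-rtp,media=(string)video,clock-rate=(int)90000,encoding-name=(string)H264,payload=(int)96" \
  ! rtph264depay ! h264parse ! avdec_h264 ! autovideosink sync=false
```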

I managed to successfully stream to my desktop after changing the IP. I did not realize that host=127.0.0.1 was directing the traffic back to the Pi itself.

I'm a little surprised that the original command is breaking.

gst-launch-1.0 -v rpicamsrc name=src preview=0 exposure-mode=night fullscreen=0 bitrate=1000000 annotation-mode=time+date annotation-text-size=20 ! video/x-h264,width=960,height=540,framerate=8/1 ! queue max-size-bytes=0 max-size-buffers=0 ! h264parse ! rtph264pay config-interval=1 pt=96 ! queue ! udpsink host=192.168.1.224 port=5004 sync=false

...removing the video/x-h264 seems to "fix" it...

Works

gst-launch-1.0 -v rpicamsrc name=src preview=0 exposure-mode=night fullscreen=0 bitrate=1000000 annotation-mode=time+date annotation-text-size=20 ! queue max-size-bytes=0 max-size-buffers=0 ! h264parse ! rtph264pay config-interval=1 pt=96 ! queue ! udpsink host=192.168.1.224 port=5004 sync=false

Also works using raspivid, even with the video/x-h264 caps filter in place:

raspivid -n -t 0 -rot 180 -w 960 -h 540 -fps 30 -b 6000000 -o - | gst-launch-1.0 -e -vvvv fdsrc ! video/x-h264,width=960,height=540,framerate=8/1 ! queue max-size-bytes=0 max-size-buffers=0 ! h264parse ! rtph264pay config-interval=1 pt=96 ! queue ! udpsink host=192.168.1.224 port=5004
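To separate camera problems from pipeline problems, the same RTP tail can also be fed from a software source instead of the camera. A sketch, assuming x264enc (from gst-plugins-ugly) is installed; 192.168.1.224 is the desktop address from the commands above, and the x264enc bitrate is in kbit/s:

```shell
# Substitute videotestsrc + x264enc for rpicamsrc; if this streams but the
# camera pipeline does not, the fault is on the capture side.
gst-launch-1.0 videotestsrc ! video/x-raw,width=960,height=540,framerate=8/1 \
  ! x264enc tune=zerolatency bitrate=1000 ! h264parse \
  ! rtph264pay config-interval=1 pt=96 \
  ! udpsink host=192.168.1.224 port=5004 sync=false
```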

I'm not sure why it fails for you with the video/x-h264 filter in place. It works for me here either way.

gst-launch-1.0 -v rpicamsrc name=src preview=0 exposure-mode=night fullscreen=0 bitrate=1000000 annotation-mode=time+date annotation-text-size=20 ! video/x-h264,width=960,height=540,framerate=8/1 ! queue max-size-bytes=0 max-size-buffers=0 ! h264parse ! rtph264pay config-interval=1 pt=96 ! queue ! udpsink host=192.168.1.224 port=5004 sync=false

runs fine.
Note: with the byte and buffer limits disabled, the queue falls back to its time limit of 1 second of video. If the network is too slow to transmit the data, the pipeline can buffer up to 1 second of it, which shows up as latency on the receiver side.
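The 1-second figure comes from queue's default max-size-time of 1000000000 ns. Worst-case buffering can be reduced by lowering that limit explicitly; the 200 ms value below is an arbitrary example, not a recommendation from this thread:

```shell
# queue's default time limit, converted from nanoseconds to seconds:
echo $((1000000000 / 1000000000))
# To cap buffering at ~200 ms instead of 1 s, set the time limit explicitly:
#   ... ! queue max-size-bytes=0 max-size-buffers=0 max-size-time=200000000 ! ...
```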

Thanks for all of the help. I'm going to keep playing, but I feel like this can be closed.