Possible to include screencasting as well?
I'm loving this demo. I'm wondering if there's an opportunity for a more one-way type of connection, where the phone would broadcast its camera, a screencast of the website, and audio. The viewer would only receive the phone's camera, the screencast, and audio, and would broadcast audio back.
It should be possible to add screen capture and make that available as another type of video source. However, this has limited use on an iOS device as video capture is paused whenever you switch to another app.
That's good to know! I think for our immediate use case the user would be confined to the app more or less, so they wouldn't switch to other apps. For context, we are working on a tool to screencast InVision, Marvel, and Framer prototypes to a user research dashboard. Right now our app is somewhat limited on all other AV features beyond screencasting a webview, so we were looking at a WebRTC option.
You are welcome to dig deeper into this, but it's unfortunately not a use case we will be able to prioritise near term.
I think the GStreamer camera source element we use on OS X and iOS (called avfvideosrc) supports screen capture. I've never tried that functionality, though, and I don't know what is needed to get it working on iOS. I would expect some kind of prompt or declaration that the application is going to capture the screen, so that the user is aware of it.
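For anyone who wants to experiment with this, a minimal sketch on OS X might look like the pipeline below. It assumes avfvideosrc exposes a `capture-screen` property on your GStreamer build (check with `gst-inspect-1.0 avfvideosrc`); whether this works on iOS at all is untested.

```shell
# Sketch only: capture the screen via avfvideosrc and show it in a preview
# window. In the demo you would replace autovideosink with the encoding and
# WebRTC elements used for the camera source.
gst-launch-1.0 avfvideosrc capture-screen=true ! videoconvert ! autovideosink
```

If that pipeline runs, wiring the same source into the existing sender pipeline as an alternative video source is the next step to try.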