cordova-rtc/cordova-plugin-iosrtc

telling when a stream is loaded

yocontra opened this issue · 18 comments

With the browser, we render the stream to a video element (just a document.createElement('video') in memory) and wait for it to start playing.

With iosrtc, the video never starts playing since it isn't in the DOM.

What's the best way to determine when a video stream from iosrtc is ready to show?

Tried oncanplay instead of onplaying; it still never fires.
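For context, this is roughly the in-memory pattern described above (a sketch; waitForStream and the surrounding names are just illustrative):

```js
// Sketch of the browser-only approach: attach the stream to an in-memory
// <video> and wait for a playback event. Under iosrtc these events never
// fire because the element is not in the DOM.
function waitForStream(stream) {
  return new Promise(function (resolve) {
    var video = document.createElement('video');        // never appended to the DOM
    video.oncanplay = function () { resolve(video); };  // never fires under iosrtc
    video.src = URL.createObjectURL(stream);
    video.play();
  });
}
```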

ibc commented

The video canplay event does fire once its attached MediaStream is playing. Sure.

This is not a bug so please let's continue in the mailing list.

ibc commented

NOTE: the video element must be inserted into the DOM before the MediaStream is attached to it, otherwise it is not "observed" by the plugin and nothing works (including video events, which are artificially emitted by the plugin).
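A minimal sketch of that requirement (stream is assumed to come from getUserMedia):

```js
// Insert the <video> element into the DOM first, THEN attach the MediaStream,
// so the plugin can observe the element and emit its media events.
var video = document.createElement('video');
document.body.appendChild(video);         // must happen before attaching the stream
video.addEventListener('canplay', function () {
  // the stream is ready to be shown
});
video.src = URL.createObjectURL(stream);  // 'stream' obtained via getUserMedia
```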

ibc commented

Or you can wait for the RTCPeerConnection to emit the iceconnectionstatechange event, check that its iceConnectionState becomes "connected" or "completed" and, at that moment, insert the video element into the DOM and then attach the MediaStream to it.
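A sketch of that approach (pc and remoteStream are placeholders assumed to exist already; this only applies when there is an RTCPeerConnection):

```js
// Wait until ICE is connected/completed, then insert the element and attach the stream.
pc.oniceconnectionstatechange = function () {
  if (pc.iceConnectionState === 'connected' || pc.iceConnectionState === 'completed') {
    var video = document.createElement('video');
    document.body.appendChild(video);               // insert into the DOM first
    video.src = URL.createObjectURL(remoteStream);  // then attach the stream
  }
};
```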

I'd rather not use the mailing list (and I'm sure most users would agree) because it is a source of email spam, hard to get set up, and requires a Google+ account. Why not have the discussion on GitHub, where users already are?

Also, the URL in the README is set to Spanish and has to be changed manually.

Also, this is outside the context of a P2P connection, so the ICE connection change events aren't available - this is just displaying the user's camera feed back to themselves.

ibc commented

The link to the Google Group is fixed in master, thanks. That said, you do not need a Google+ (the Google social network) account, just a Google account (which does not require a Gmail address). Any mailing list requires you to provide an email address. This is no different.

Regarding the usage of the GitHub "Issues" section: I do not want to mix real issues with basic or complex questions. Yours is interesting because it exposes a real (and IMHO unavoidable) limitation of the plugin (more in my next comment).

ibc commented

this is outside the context of a P2P connection, so the ICE connection change events aren't available - this is just displaying the user's camera feed back to themselves

You are right. However, we cannot do magic here. The plugin observes the whole DOM, and when a video element is inserted it monitors its src property. If the src changes and a MediaStream is attached to it (via video.src = URL.createObjectURL()), the magic begins and the plugin builds the native UIView and so on.

If you just create the video element in memory, the plugin is not aware of it and cannot handle it when you attach the MediaStream to it (unless I add a new extra API, cordova.plugins.iosrtc.observeVideo(video), which I don't like because the aim of the plugin is to expose the W3C WebRTC API).

Why don't you check whether the MediaStream has a video track and only show the video element in that case? I understand you prefer your current and 100% valid approach, but we cannot do magic in the plugin.
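The check being suggested, as a sketch (stream and video are placeholders for the local MediaStream and its element):

```js
// Show the video element only if the stream actually contains a video track.
if (stream.getVideoTracks().length > 0) {
  video.src = URL.createObjectURL(stream);
  video.style.display = 'block';
} else {
  video.style.display = 'none';
}
```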

No, you're putting words in my mouth. I never said I expect your plugin to work with off-screen video elements. Please quit conflating the actual question with that - that's just our approach in browsers; I don't expect it to work with iosrtc, which is why I opened this ticket asking for an alternate solution.

I'm simply asking: how do you tell when a MediaStream is ready to be shown (i.e. loaded, initialized, has data flowing, however you want to say it) in iosrtc?

Your answer so far: Check if the MediaStream has a video track.

That doesn't really solve the problem, since the stream may report that it has a video track while no data is actually flowing yet. Cameras can take a second or two to turn on, and we are trying to show a loading indicator to the user until their camera has actually started.

@contra, maybe the MediaStreamTrackState live would be what you are looking for. It looks like those states are forwarded through the plugin... I'm at the same point and will be trying it tomorrow...

ibc commented

@calebboyd I'm not sure that MediaStreamTrackState live is the way to know when a track becomes active (in the sense that it can be rendered). There are just two states, "live" and "ended". From the spec:

live
The track is active (the track's underlying media source is making a best-effort attempt to provide data in real time).

AFAIR a track is "live" from the beginning, regardless of whether it is actually rendering audio/video yet.

The proper way to know when the track is ready is by waiting for the canplay event of the video element its MediaStream is attached to, but for that the plugin must handle the video element, and for that, the video element must be inserted into the DOM before the MediaStream is attached to it (which is the limitation causing the issue reported here, am I right?).
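For illustration, the track state being discussed (a sketch; stream is a placeholder):

```js
// readyState is 'live' as soon as the track exists, even before any frames
// arrive, so it cannot be used as a "media is flowing" signal.
var track = stream.getVideoTracks()[0];
console.log(track.readyState);  // 'live' immediately
track.onended = function () {
  console.log('track ended');   // the only other transition: 'live' -> 'ended'
};
```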

@ibc Doesn't the MediaStream API have started and active events according to the spec? These have nothing to do with the DOM; it's the underlying stream from the hardware that they correspond to. There are plenty of ways to solve this problem without rendering to the DOM, so I don't know why that keeps being brought up.

ibc commented

You don't want the MediaStream active event. From the spec:

A MediaStream object is said to be active when it has at least one MediaStreamTrack that has not ended. A MediaStream that does not have any tracks or only has tracks that are ended is inactive.

The fact that a track is "active" does not mean that its media is ready to be locally rendered, only that the track is alive.

Also note that the plugin does not emit "active" for MediaStream objects. Such an event would only be fired if a MediaStream is "inactive" (all its tracks are inactive) and new live tracks are added to it so it becomes "active" again. But... when the MediaStream becomes "inactive" it is garbage collected by the plugin (that's the only way to automatically release it).

ibc commented

@contra as you can see, there is no event in MediaStreamTrack to tell you when its media is ready to be rendered: http://w3c.github.io/mediacapture-main/#idl-def-MediaStreamTrack

Instead, you should check the "canplay" event of the <video> element to which such a track is attached (but then the already explained plugin limitation comes into play: the video needs to already be inserted in the DOM).

I may add an extra API, cordova.plugins.iosrtc.observeVideo(video), to make your use case possible. Would it work for you?

If that is truly the only solution here, one could monkey-patch document.createElement in order to determine when a detached video element is created. Care would just need to be taken that it is cleaned up properly...
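Something like this rough sketch (createdVideos and the bookkeeping are only illustrative, not a plugin API):

```js
// Intercept document.createElement so detached <video> elements can be tracked
// (and later handed to the plugin / cleaned up).
var createdVideos = [];
var originalCreateElement = document.createElement;
document.createElement = function (tagName) {
  var element = originalCreateElement.apply(document, arguments);
  if (String(tagName).toLowerCase() === 'video') {
    createdVideos.push(element); // remember detached video elements for later handling
  }
  return element;
};
```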

Monkey-patching document.createElement or observeVideo are both fine for me. I don't really care how I have to do it, I just need to be able to do it somehow 🌴

ibc commented

@contra please try master branch and the added API and let me know if that works for you.
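Usage looks roughly like this (a sketch based on the observeVideo(video) call mentioned earlier in the thread; stream is a placeholder):

```js
// Register the detached <video> with the plugin, then attach the stream and
// wait for canplay as usual.
var video = document.createElement('video');  // never inserted into the DOM
cordova.plugins.iosrtc.observeVideo(video);   // tell the plugin to handle this element
video.oncanplay = function () {
  // camera feed is ready: hide the loading indicator and show the video
};
video.src = URL.createObjectURL(stream);
```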

Works perfectly!