How to stream audio data to the speakers on a Node server?
tinamore opened this issue · 16 comments
Hi,
Your project is very interesting.
I wonder how to output the audio to the speaker on the Node server?
Example:
const Speaker = require('speaker'); // https://www.npmjs.com/package/speaker
const speaker = new Speaker();
...
socket.on('stream', function(packet){
    console.log(packet);
    packet.pipe(speaker); // this is only an example...
    socket.broadcast.emit('stream', packet);
});
Thanks
Hiya,
Actually it's possible if you can decode the stream and pipe it to the speaker library. But some modern browsers (like Chrome) only support recording Opus in the WebM container, so I don't think it's possible, because I couldn't find any WebM decoder in the npm registry.
An Opus decoder does exist on npm, so the main problem is decoding the WebM with NodeJS.
But you can output the stream to a file and then use ffmpeg, or play the file directly.
var fs = require('fs');

// Event listener ('io' is the socket.io server instance)
io.on('connection', function(socket){
    var chunks = [];

    /* Presenter */
    socket.on('bufferHeader', function(packet){
        chunks.push(packet.data); // The header must be at the first index
    });

    // Save the received buffer
    socket.on('stream', function(packet){
        if(chunks.length !== 0 && chunks.length <= 20){
            chunks.push(packet[0]);

            // Write to file after ~2sec (if one chunk duration is 100ms)
            if(chunks.length === 20){
                fs.writeFile("test.webm", Buffer.concat(chunks), "binary", function(){
                    // Exit the process after the write has completed
                    process.exit();
                });
            }
        }
    });
});
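For a quick check you can play the saved file with ffmpeg's ffplay, either from the command line or spawned from NodeJS. A minimal sketch, assuming ffplay is installed and on the PATH, and that the file name matches the test.webm from the example above:
var spawn = require('child_process').spawn;

// Play the recorded file with ffplay (part of ffmpeg) without opening a video window
var player = spawn('ffplay', ['-nodisp', '-autoexit', 'test.webm'], { stdio: 'inherit' });

player.on('close', function(code){
    console.log('ffplay exited with code ' + code);
});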
But if the browser supports recording in audio/ogg
MediaRecorder.isTypeSupported('audio/ogg;codecs="opus"') === true
MediaRecorder.isTypeSupported('audio/ogg;codecs="vorbis"') === true
You can change the Presenter mimeType
var presenter = new ScarletsMediaPresenter(...);
presenter.options.mimeType = 'audio/ogg;codecs="opus"';
And on the server, you need to:
- Combine every data chunk with the buffer header (with Buffer.concat and require('stream').PassThrough; see the sketch after this list)
- Then decode it with an ogg decoder and an opus decoder
- And pipe it to speaker
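A minimal sketch of the combine step (bufferHeader and chunk are assumed to be the Buffers received from the bufferHeader and stream socket events):
var stream = require('stream');

// Prepend the header so the decoders receive a complete ogg stream
function withHeader(bufferHeader, chunk){
    var bufferStream = new stream.PassThrough();
    bufferStream.end(Buffer.concat([bufferHeader, chunk]));
    return bufferStream; // pipe this into the ogg decoder
}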
I'm doing a little experiment: streaming audio directly from the microphone of Chrome on an Android phone (using SFMediaStream) to a Raspberry Pi speaker (no browser on the receiving end, only NodeJS).
I understand the steps you described for the server about the ogg decoder and opus decoder, but I'm not good at programming in NodeJS.
Could you write a few lines of code that would work?
Thanks.
I have learned a lot on the Internet, but there's really no way to stream the microphone from the browser to NodeJS.
WebRTC can be used (a technology with very good sound quality), but it only supports browser to browser.
I found that your library SFMediaStream can capture the microphone, convert it to raw data and stream it to the server via socket.io. Only the last step is missing: sending the raw data to the speaker.
Hmm, I'm not sure Chrome can record in the ogg format.
But if it's available, maybe this can work:
// Event listener
io.on('connection', function(socket){
    var decoder = null;

    /* Presenter */
    socket.on('bufferHeader', function(packet){
        decoder = opusDecoder(packet.data);
    });

    // Decode and play the received buffer
    socket.on('stream', function(packet){
        if(decoder !== null)
            decoder.write(packet[0]);
    });
});
function opusDecoder(headerBuffer){
    var opus = require('node-opus');
    var ogg = require('ogg');
    var stream = require('stream');
    var Speaker = require('speaker');
    var speaker = new Speaker();

    return {
        write:function(chunk){
            // A new ogg decoder is created for every chunk, because each chunk
            // is decoded together with the buffer header
            var decoder = new ogg.Decoder();
            decoder.on('stream', function(oggStream){
                var opusDecoder = new opus.Decoder();
                opusDecoder.on('format', function(format){
                    if(!format.signed || format.bitDepth !== 16)
                        throw new Error('unexpected format: ' + JSON.stringify(format));

                    // Send the decoded PCM data to the speaker
                    opusDecoder.pipe(speaker);
                });
                opusDecoder.on('error', console.error);
                oggStream.pipe(opusDecoder);
            });

            // Prepend the header so every chunk is a valid ogg stream
            var bufferStream = new stream.PassThrough();
            bufferStream.end(Buffer.concat([headerBuffer, chunk]));
            bufferStream.pipe(decoder);
        }
    };
}
I found this link: https://kbumsik.io/opus-media-recorder/
It seems Chrome supports:
audio/wave is supported
audio/wav is supported
audio/ogg is supported
audio/ogg;codecs=opus is supported
audio/webm is supported
audio/webm;codecs=opus is supported
Oh, I have just changed the Presenter mimeType
...
presenterMedia = new ScarletsMediaPresenter({
    audio:{
        channelCount:1,
        echoCancellation: false
    }
}, 100);
presenterMedia.options.mimeType = 'audio/ogg;codecs="opus"';
...
Then Chrome shows this console error: DOMException: Failed to construct 'MediaRecorder': Failed to initialize native MediaRecorder the type provided (audio/ogg;codecs="opus") is not supported.
I will explore this issue. Thanks for your support.
Yeah, maybe initializing the polyfill before you start recording could work.
The polyfill seems to solve many problems, but I haven't tried it yet.
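For reference, the basic setup from the opus-media-recorder README looks roughly like this; treat it as an untested sketch, and note that the polyfill normally also expects workerOptions with paths to its .wasm files, which SFMediaStream doesn't pass when it constructs the MediaRecorder internally:
// Assuming OpusMediaRecorder.umd.js and encoderWorker.umd.js are already loaded
// via script tags (see the opus-media-recorder README).
// This must run before the presenter starts recording.
window.MediaRecorder = OpusMediaRecorder;

var presenterMedia = new ScarletsMediaPresenter({
    audio:{
        channelCount:1,
        echoCancellation: false
    }
}, 100);
presenterMedia.options.mimeType = 'audio/ogg;codecs="opus"';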
I'm doing a little experiment: streaming audio directly from the microphone of Chrome on an Android phone (using SFMediaStream) to a Raspberry Pi speaker (no browser on the receiving end, only NodeJS).
Hello,
I'm doing the same project now, have you found a solution?
@ashkmn I just updated the example, but sadly the opus decoder dependency on NodeJS was not able to decode the stream. Currently the example defaults to writing to a file instead of to the speaker.
I'm curious whether there is a media player that can play a file that is still being written, or maybe I should concat every buffer starting from the header so the opus decoder won't complain about corrupted data. But the speaker library doesn't have media seek, so it would always play from the beginning.
Well, maybe you can set an interval in NodeJS to rename the written file after some seconds and start writing to a new file, so you can play the renamed file with a media player. It's a bit crazy on file I/O, but it's the only solution for now.
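A rough sketch of that idea (the segment file names, the 2-second interval, and keeping the header for each new segment are just assumptions for illustration):
var fs = require('fs');

var chunks = [];   // filled from the 'bufferHeader' and 'stream' socket events
var counter = 0;

// Every 2 seconds, flush the collected chunks into a new numbered file
setInterval(function(){
    if(chunks.length < 2) return; // need at least the header and one chunk

    var buffer = Buffer.concat(chunks);
    chunks = [chunks[0]]; // keep the header for the next segment

    var fileName = 'segment-' + (counter++) + '.webm';
    fs.writeFile(fileName, buffer, function(err){
        if(err) return console.error(err);
        console.log(fileName + ' is ready to be played with a media player');
    });
}, 2000);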
Instead of using the browser's default media recorder, I just used @tinamore's suggestion and now it's working with the audio/wav mimeType.
@StefansArya Hi! Thanks for your work on this example. I tried to test it but I'm stuck.
With Chrome, it works well with the stream-to-file function, but with stream to the server's speakers I got this:
DOMException: Failed to construct 'MediaRecorder': Failed to initialize native MediaRecorder the type provided (audio/wav) is not supported.
With Firefox, both stream to file and stream to the server's speakers gave me the same error:
TypeError: MediaRecorder constructor: Argument 1 does not implement interface AudioNode.
mediaGranted https://192.168.1.35:8000/dist/SFMediaStream.js:795
promise callback*ScarletsMediaPresenter/scope.startRecording https://192.168.1.35:8000/dist/SFMediaStream.js:866
start https://192.168.1.35:8000/browser/presenter.js:30
scarletsframe 0.28.0
Have you got any idea how to make it work?
Thanks !
@katarpilar Hi, thanks for notifying me.
It seems the example wasn't working when switching to "Stream to speaker" after the page was loaded. I have fixed it; could you try it in your browser?
@StefansArya It works great now, thanks :D
However, I found another problem; maybe I can open an issue for it, but I need to test with another machine first.
When the stream starts with the default parameters, I get an audio "pop" every second (each audio chunk?).
Maybe it's related to the speaker library, since I don't get it when I record to a file and play it.
@katarpilar you're free to open a new issue as long as it isn't a duplicate of another open issue. But it's still related to this issue, so it's OK to continue our discussion here. Actually that "pop" also happens on my PC, but currently I don't have a solution for it.
A wav file is not optimized for streaming, but the node-speaker library doesn't seem to support webm. I have tried to decode the webm into wav in realtime for that library to use, but it doesn't seem to work perfectly. So using the polyfill in the browser was the last solution at that time.
I think you can reduce that "pop" by increasing the latency, but yeah, that's not the best idea. Another alternative solution is using WebRTC, since I also wanted to add it to this library in the past.
@StefansArya Yeah, I noticed you commented out some lines in speaker.js to decode opus chunks in real time and I tried to test them, but all I get is the node process segfaulting when it tries to decode the stream :/
I still have some hope for using ogg instead of the webm container, but with Chrome we need to use OpusMediaRecorder, and when I try to switch to it I get "TypeError: Failed to execute 'fetch' on 'WorkerGlobalScope': Failed to parse URL from /OggOpusEncoder.wasm".
I don't understand why, probably because I'm a big noob in JS ^^ but I tried to load the wasm file as specified here https://github.com/kbumsik/opus-media-recorder , without success.