Fork: Discord-video-experiment
This project implements the custom Discord UDP protocol for sending media. Since Discord is likely to change their custom protocol, this library may break at any point. An effort will be made to keep this library up to date with the latest Discord protocol, but this is not guaranteed.
For better stability it is recommended to use the WebRTC protocol instead, since Discord is forced to adhere to its spec, which means the non-signaling code is guaranteed to work.
- Playing VP8 or H.264 video in a voice channel (Go Live, or webcam video)
- Playing Opus audio in a voice channel
What I implemented and what I did not.
- VP8
- VP9
- H.264
- RTP (sending of real-time data; see the header sketch after this list)
- RTX (retransmission)
- Regular Voice Connection
- Go live
- Figure out RTP header extensions (Discord-specific). Discord seems to use the one-byte RTP header extension format (https://www.rfc-editor.org/rfc/rfc8285.html#section-4.2); see the sketch after the extension list below.
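As a rough illustration of the RTP part, the sketch below lays out the 12-byte RTP fixed header from RFC 3550 that precedes each media payload. The function name and parameters are illustrative, not this library's API:

// Illustrative only: build the 12-byte RTP fixed header (RFC 3550) that precedes
// each media payload. The field layout is standard RTP, not library-specific code.
function buildRtpHeader(
    payloadType: number, // 7-bit payload type negotiated for the codec
    sequence: number,    // 16-bit sequence number, incremented per packet
    timestamp: number,   // media timestamp (90 kHz clock for video)
    ssrc: number,        // synchronization source identifier for this stream
    marker = false       // set on the last packet of a video frame
): Buffer {
    const header = Buffer.alloc(12);
    header[0] = 2 << 6;                                        // version 2, no padding/extension/CSRCs
    header[1] = (marker ? 0x80 : 0x00) | (payloadType & 0x7f); // marker bit + payload type
    header.writeUInt16BE(sequence & 0xffff, 2);
    header.writeUInt32BE(timestamp >>> 0, 4);
    header.writeUInt32BE(ssrc >>> 0, 8);
    return header;
}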
Extensions supported by Discord (taken from the WebRTC SDP exchange):
"a=extmap:1 urn:ietf:params:rtp-hdrext:ssrc-audio-level"
"a=extmap:2 http://www.webrtc.org/experiments/rtp-hdrext/abs-send-time"
"a=extmap:3 http://www.ietf.org/id/draft-holmer-rmcat-transport-wide-cc-extensions-01"
"a=extmap:4 urn:ietf:params:rtp-hdrext:sdes:mid"
"a=extmap:5 http://www.webrtc.org/experiments/rtp-hdrext/playout-delay"
"a=extmap:6 http://www.webrtc.org/experiments/rtp-hdrext/video-content-type"
"a=extmap:7 http://www.webrtc.org/experiments/rtp-hdrext/video-timing"
"a=extmap:8 http://www.webrtc.org/experiments/rtp-hdrext/color-space"
"a=extmap:10 urn:ietf:params:rtp-hdrext:sdes:rtp-stream-id"
"a=extmap:11 urn:ietf:params:rtp-hdrext:sdes:repaired-rtp-stream-id"
"a=extmap:13 urn:3gpp:video-orientation"
"a=extmap:14 urn:ietf:params:rtp-hdrext:toffset"
FFmpeg is required to use this package. If you are on Linux, you can easily install FFmpeg from your distribution's package manager.
If you are on Windows, you can download it from the official FFmpeg website: https://ffmpeg.org/download.html
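If you want to verify from your script that FFmpeg is reachable, a quick sanity check like the one below can help. This is only an illustration, not part of the library:

import { spawnSync } from "node:child_process";

// Quick sanity check that the ffmpeg binary is on the PATH before streaming
const ffmpegCheck = spawnSync("ffmpeg", ["-version"], { encoding: "utf8" });
if (ffmpegCheck.error) {
    console.error("ffmpeg was not found on your PATH:", ffmpegCheck.error.message);
} else {
    console.log(ffmpegCheck.stdout.split("\n")[0]); // e.g. "ffmpeg version 6.x ..."
}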
Install the package alongside its peer dependency discord.js-selfbot-v13:
npm install @dank074/discord-video-stream@latest
npm install discord.js-selfbot-v13@latest
Create a new client, and patch its events to listen for voice gateway events:
import { Client } from "discord.js-selfbot-v13";
import { Streamer, streamLivestreamVideo } from '@dank074/discord-video-stream';
const streamer = new Streamer(new Client());
await streamer.client.login('TOKEN HERE');
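Optionally, you can wait for the client's ready event before joining a voice channel. This is a usage sketch built on the standard discord.js client event, not an API specific to this library:

// Wait for the selfbot client to be ready before joining voice (standard discord.js event)
streamer.client.on("ready", () => {
    console.log(`--- ${streamer.client.user?.tag} is ready ---`);
});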
Make client join a voice channel and create a stream:
await streamer.joinVoice("GUILD ID HERE", "CHANNEL ID HERE");
const udp = await streamer.createStream();
Start sending media over the udp connection:
udp.mediaConnection.setSpeaking(true);
udp.mediaConnection.setVideoStatus(true);
try {
    const res = await streamLivestreamVideo("DIRECT VIDEO URL OR READABLE STREAM HERE", udp);
    console.log("Finished playing video " + res);
} catch (e) {
    console.log(e);
} finally {
    udp.mediaConnection.setSpeaking(false);
    udp.mediaConnection.setVideoStatus(false);
}
example/src/config.json:

{
    "token": "SELF TOKEN HERE",
    "acceptedAuthors": ["USER_ID_HERE"]
}
- Configure your config.json with your accepted author ids and your self token
- Generate the js files with npm run build
- Start the program with npm run start
- Join a voice channel
- Start streaming with commands:
for Go Live:
$play-live <Direct video link>
or for cam:
$play-cam <Direct video link>
or for screen sharing using Puppeteer:
$play-screen <website url>
for example:
$play-live http://commondatastorage.googleapis.com/gtv-videos-bucket/sample/BigBuckBunny.mp4
- Can I stream on an existing voice connection (cam) and in a Go Live connection simultaneously?
Yes, just send the media packets over both UDP connections. The voice gateway expects you to signal when a user turns on their camera, so make sure you signal using client.signalVideo(guildId, channelId, true) before you start sending cam media packets; see the sketch at the end of this section.
- Does this library work with bot tokens?
No, Discord blocks video from bots, which is why this library uses a selfbot library as a peer dependency. You must use a user token.
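Regarding the cam question above, the following is a minimal, hypothetical sketch of the signaling order described there. It assumes the patched client exposes signalVideo as referenced in the FAQ; it is not a definitive implementation:

// Hypothetical sketch: signal the camera on before sending webcam media over the
// existing voice connection, and signal it off again when done.
const guildId = "GUILD ID HERE";
const channelId = "CHANNEL ID HERE";

await streamer.joinVoice(guildId, channelId);
streamer.client.signalVideo(guildId, channelId, true);  // tell the gateway the cam is on

// ... send cam media packets over the existing voice connection here ...

streamer.client.signalVideo(guildId, channelId, false); // cam off when finished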