
I would like to:

  1. Set up Node.js as a WebRTC peer (e.g. one that a web browser can connect to)

  2. Decode video frames in real time on the server side (e.g. streamed from the browser's webcam)

What is the easiest way to accomplish this? I have seen many similar questions, but have not encountered any obvious answers.

Is this possible with just Node, or must one use a gateway such as Janus as well?

Thanks!

2 Answers


In the end, the solution was to run a Janus server alongside Node. I wrote a custom plugin for Janus to handle the WebRTC frames and pass them to my Node server as needed.




If you require real-time video, be aware that implementing DTLS, SRTP, and codec handling yourself is not trivial.

If you don't require real-time delivery, you might want to give the MediaStreamRecorder API a try and send the data from the ondataavailable event via WebSocket to your Node server. Or capture frames from a canvas, which is shown here, and send them to the server as JPEG images.

2 Comments

Unfortunately I need real-time access to the stream (I edited the original question to clarify). Thanks for the suggestion though.
You might want to try github.com/ibc/mediasoup, which is written in a mix of Node and C++.
