What is the algorithm of modern streaming?
Hello.
I am currently planning the implementation details of a future service in which one person will be able to stream video from their camera to many viewers directly from the browser, through a server written in Node.js.
I have a rough idea of how it should work, but just in case, I would like to hear constructive criticism from other developers on my implementation sketch.
1. First, I get the video stream in the streamer's browser from their webcam using navigator.getUserMedia.
2. I then render the data from the MediaStream object into a video tag.
3. Next, I draw the image from the video tag onto a canvas tag through its context, so that I can later pull out each frame as a whole, not mixed with data from adjacent frames.
4. Then, at some interval (for example, 30 times per second), I pull one whole frame from the canvas as base64 text and send it to the server via socket.io.
5. On the server, I receive each frame and forward it to all clients who are viewers, which in turn display these frames, for example, in the same kind of canvas.
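The client side of the steps above could be sketched roughly like this (a minimal sketch, not a recommendation: `io` is assumed to be the socket.io client, the signaling event name `frame` is hypothetical, and the modern `navigator.mediaDevices.getUserMedia` is used in place of the deprecated `navigator.getUserMedia`):

```javascript
// Minimal sketch of the capture loop described in steps 1-5 (browser-side).
// Assumes the socket.io client is loaded as `io`; event name 'frame' is hypothetical.
function startNaiveStream(serverUrl) {
  const socket = io(serverUrl);
  const video = document.createElement('video');
  const canvas = document.createElement('canvas');
  const ctx = canvas.getContext('2d');

  navigator.mediaDevices.getUserMedia({ video: true }).then((stream) => {
    video.srcObject = stream;   // step 2: render the MediaStream into <video>
    video.play();
    video.onloadedmetadata = () => {
      canvas.width = video.videoWidth;
      canvas.height = video.videoHeight;
      // step 4: ~30 times per second, grab one whole frame and send it
      setInterval(() => {
        ctx.drawImage(video, 0, 0);  // step 3: copy the current frame onto the canvas
        // toDataURL with 'image/jpeg' at least compresses each frame spatially,
        // unlike sending raw pixel data as base64
        const frame = canvas.toDataURL('image/jpeg', 0.6);
        socket.emit('frame', frame);
      }, 1000 / 30);
    };
  });
}
```

Note that even this sketch sends each frame independently: there is no temporal compression across frames, which is exactly what the answers below criticize.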
I am wondering whether I need to compress the frames somehow before sending them to the server, or whether the fact that they arrive as a base64 string already means they are compressed. If more compression is needed, how?
In general, I would like to hear your opinion on this algorithm. How are such tasks usually solved properly?
Single frame: 1920 px * 1080 px * 24 bpp = 49'766'400 bit
Base64: 49'766'400 bit / 6 * 8 = 66'355'200 bit
Whole stream: 66'355'200 bit * 30 fps = 1'990'656'000 bps
That is, you will need a bandwidth of ~ 2 gigabits per second.
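The arithmetic above can be checked directly (base64 carries 6 payload bits per 8-bit character, hence the 8/6 blow-up):

```javascript
// Reproducing the answer's arithmetic: raw 1080p frames, base64-encoded, 30 fps.
const frameBits = 1920 * 1080 * 24;      // one uncompressed frame, in bits
const base64Bits = (frameBits / 6) * 8;  // base64: 6 payload bits per 8-bit character
const streamBps = base64Bits * 30;       // 30 frames per second

console.log(frameBits);  // 49766400
console.log(base64Bits); // 66355200
console.log(streamBps);  // 1990656000  (~2 Gbit/s)
```

For comparison, a typical 1080p30 WebRTC stream compressed with VP8 or H.264 fits in a few Mbit/s, i.e. hundreds of times less.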
Video codec developers don't eat their bread for nothing.
You ask a question, but when you get an answer that not only disagrees with your assumption but contradicts it, you dig in as if you had real grounds to believe your assumption is correct. You are contradicting yourself...
Understand some points for yourself:
1. Transferring video data to a server requires optimizing the process toward reducing the amount of transmitted data. For this, traditionally and for good reason, video codecs are used, which compress the data not only spatially (within one frame) but also temporally (across a sequence of frames).
2. If you think that "everyone has fast Internet, so you don't have to worry about compression", then logic is alien to you and you can stop reading here; if you don't think so, read on.
3. You suggest that base64 might indicate the data is already compressed, yet you also accept that base64 adds about 30% overhead. These two don't fit together: base64 doesn't compress anything; on the contrary, it increases the volume.
4. If you are going to pack an uncompressed bitmap into base64, school arithmetic shows that one second of video at 1024 × 768 (pixels) × 3 (RGB bytes) × 30 (frames/sec) × 1.3 (base64 overhead) will weigh a hell of a lot.
5. If the above has convinced you, I can offer a different approach: read, for example, this article https://blog.cloudboost.io/how-to-run-node-js-apps... and think about how to run a Node.js application in the user's browser, so that this application can use, for example, handbrake https://github.com/75lb/handbrake-js
True, I have "vague doubts" that video compression for streaming in a client's browser will really be comparable in performance to the codecs used by native applications.
6. Your counter-analogy about taking work calls after hours (from the comments) doesn't hold, because the problem is not the formal purpose of the browser versus a streaming client, but their very real capabilities. And no, the browser is not the "same kind" of application as a streaming client.
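The "school arithmetic" from point 4 above, worked out explicitly:

```javascript
// Point 4 worked out: one second of uncompressed 1024x768 RGB video, base64-encoded.
const bytesPerFrame = 1024 * 768 * 3;              // 3 bytes (RGB) per pixel
const rawBytesPerSecond = bytesPerFrame * 30;      // 30 frames per second
const base64BytesPerSecond = rawBytesPerSecond * 1.3; // ~30% base64 overhead

console.log(bytesPerFrame);                        // 2359296
console.log(rawBytesPerSecond);                    // 70778880
console.log(Math.round(base64BytesPerSecond));     // 92012544  (~92 MB/s, ~736 Mbit/s)
```

So even at this modest resolution, the uncompressed-plus-base64 scheme needs roughly 92 megabytes of upstream bandwidth per second.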
You are overcomplicating things... If you want to stream directly from the browser, use WebRTC on the client, and on the server use any software that can handle WebRTC.
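The WebRTC approach on the publisher side could look roughly like this (a minimal sketch, assuming the server speaks WebRTC; `signaling` is a hypothetical channel, e.g. a WebSocket, and the STUN server URL is just an example):

```javascript
// Minimal sketch of a WebRTC publisher (browser-side).
// How offers/answers/ICE candidates reach the server is left abstract:
// `signaling` is a hypothetical object with a send() method, e.g. a WebSocket.
function startWebRtcPublish(signaling) {
  const pc = new RTCPeerConnection({
    iceServers: [{ urls: 'stun:stun.l.google.com:19302' }], // example STUN server
  });

  navigator.mediaDevices.getUserMedia({ video: true, audio: true }).then(async (stream) => {
    // Hand the raw tracks to the peer connection; the browser's built-in
    // codecs (VP8/VP9/H.264, Opus) do the compression, not our JavaScript.
    stream.getTracks().forEach((track) => pc.addTrack(track, stream));

    const offer = await pc.createOffer();
    await pc.setLocalDescription(offer);
    signaling.send(JSON.stringify({ type: 'offer', sdp: pc.localDescription.sdp }));
  });

  pc.onicecandidate = (e) => {
    if (e.candidate) {
      signaling.send(JSON.stringify({ type: 'candidate', candidate: e.candidate }));
    }
  };

  return pc;
}
```

The key difference from the canvas scheme in the question: compression and transport are handled by the browser's native media stack, so there is no per-frame base64 traffic at all.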
Alternatively, you can take the stream directly from the camera and use software like Open Broadcaster (OBS) to broadcast via RTMP to any media server or service that supports this protocol (which is almost all of them). This is described in general terms here.