How to stream a queue of JPGs using WebRTC and does it make sense?
Hello!
I have a work task: form a time-lapse video stream from JPG frames captured by an ESP32 camera at a fixed interval. The first thing that came to mind was WebRTC. I briefly studied the technology and the related protocols, and found RTP payload type 26 (JPEG Compressed Video over RTP) in the specs, which suggests it is possible to stream video frame by frame, loading new JPGs and forming RTP packets. Question: is this a sensible approach? Or is it better to assemble video from the JPEGs on the fly with ffmpeg and stream that? I am reluctant to take the ffmpeg route, because ffmpeg consumes a lot of resources, and we don't have many to spare.
The backend uses Pion, a WebRTC framework written in Go.
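For reference, the frame-by-frame RTP approach the question mentions is specified in RFC 2435: each RTP packet carrying JPEG data (static payload type 26) starts with an 8-byte "main JPEG header" in place of the usual JFIF headers. Below is a minimal sketch in Go of building just that header; the function name is my own, and a real packetizer would also fragment the scan data, handle restart markers, and optionally send quantization tables in-band.

```go
package main

import (
	"encoding/binary"
	"fmt"
)

// jpegPayloadHeader builds the 8-byte main JPEG header that RFC 2435
// prepends to every fragment of a frame sent as RTP payload type 26.
// width and height are in pixels and must be multiples of 8.
func jpegPayloadHeader(fragmentOffset uint32, jpegType, q uint8, width, height int) []byte {
	h := make([]byte, 8)
	// Bytes 0-3: type-specific byte (0) plus a 24-bit fragment offset
	// into the JPEG scan data; masking keeps byte 0 at zero.
	binary.BigEndian.PutUint32(h[0:4], fragmentOffset&0x00FFFFFF)
	// Byte 4: type (0 = 4:2:2 chroma subsampling, 1 = 4:2:0).
	h[4] = jpegType
	// Byte 5: Q factor; values 128-255 mean quantization tables are
	// carried in-band in the first packet of the frame.
	h[5] = q
	// Bytes 6-7: frame width and height divided by 8.
	h[6] = uint8(width / 8)
	h[7] = uint8(height / 8)
	return h
}

func main() {
	// First fragment of a 640x480, 4:2:0 frame with in-band tables.
	hdr := jpegPayloadHeader(0, 1, 255, 640, 480)
	fmt.Println(hdr) // [0 0 0 0 1 255 80 60]
}
```

This is only the header; whether it is advisable at all is a separate question, since browsers do not negotiate JPEG as a WebRTC codec, so payload type 26 mostly matters for plain-RTP receivers.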
If we're talking about real-time, use MJPEG: the frames are plain JPEGs, it plays everywhere, and it's probably not hard to implement even on a weak processor, since essentially all you need to do is form the correct header and concatenate the ready-made JPEGs.
The goal is not clear. You should proceed from the end goal, not from the details of one possible implementation.