Node.js
Victor P., 2021-08-20 17:30:35

How to deal with broadcasting video from a Raspberry Pi?

Hello,
I found a library on GitHub for broadcasting video from a Raspberry Pi:
https://github.com/131/h264-live-player

I unpacked it on the Raspberry Pi, ran node server-rpi.js without any changes, and in a browser on my home computer at 192.168.0.102:8080 got a great live stream with no delays.
[screenshot: live stream in the browser]

Then I needed to embed this live stream in my website. On the same page, besides the video broadcast, there should be other control buttons and information panels. I'm developing the site in Visual Studio in C#, with Angular on the front end (but that part is not essential). I made an iframe pointing at the page described above:

<iframe [src]="getVideoUrl" style="width: 1000px; height: 600px;"></iframe>
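For the [src] binding to work at all, the URL has to be marked as a trusted resource, otherwise Angular throws an "unsafe value used in a resource URL context" error. A minimal sketch of how getVideoUrl could be produced (the component name and inline template here are assumptions for illustration, not taken from the original project):

import { Component } from '@angular/core';
import { DomSanitizer, SafeResourceUrl } from '@angular/platform-browser';

@Component({
  selector: 'app-robo-video-frame',
  template: '<iframe [src]="getVideoUrl" style="width: 1000px; height: 600px;"></iframe>',
})
export class RoboVideoFrameComponent {
  // Angular blocks raw strings in a resource URL context,
  // so the URL is wrapped via DomSanitizer before binding.
  public getVideoUrl: SafeResourceUrl;

  constructor(sanitizer: DomSanitizer) {
    this.getVideoUrl = sanitizer.bypassSecurityTrustResourceUrl('http://192.168.0.102:8080/');
  }
}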


The three buttons were drawn inside the iframe,
[screenshot: the three buttons rendered in the iframe]

but when I click [Start video], there is no broadcast, and the console shows errors like these:
[screenshot: browser console errors]

stderr: increasing TOTAL_MEMORY to 67108864 to be compliant with the asm.js spec
stderr: pre-main prep time: 1 ms
creatingTextures: size: (960, 540)


I couldn't figure them out.

Then I found an npm package for this project: https://www.npmjs.com/package/h264-live-player
It took a bit of work to get it going under Angular, but I managed. This is what I had to add to package.json:
[screenshot: additions to package.json]

In the HTML I added something like this:
<div class="col-md-9 text-center">
    <button type="button" (click)="wsavc.playStream()">Start Video</button>
    <button type="button" (click)="wsavc.stopStream()">Stop Video</button>
    <button type="button" (click)="wsavc.disconnect()">Disconnect</button>
    <br />
    <canvas #roboVideo></canvas>
</div>


The TS has this (note: the #roboVideo canvas comes in via @ViewChild as an ElementRef, and the player needs the raw canvas element, not the wrapper):

import { ViewChild, ElementRef } from '@angular/core';
var WSAvcPlayer = require('h264-live-player/wsavc/index.js');

  @ViewChild('roboVideo') roboVideo: ElementRef<HTMLCanvasElement>;
  public wsavc: any;

  ngAfterViewInit() {
    const uri = "ws://192.168.0.102:8080";
    // pass the native canvas element, not the Angular ElementRef
    this.wsavc = new WSAvcPlayer(this.roboVideo.nativeElement, "webgl", 1, 35);
    this.wsavc.connect(uri);
  }
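One small addition worth making (a sketch on my part, using only the disconnect() method the buttons above already call): tear the connection down when the component is destroyed, so the Pi isn't streaming into a dead page:

  ngOnDestroy() {
    // close the WebSocket when navigating away from the page
    if (this.wsavc) {
      this.wsavc.disconnect();
    }
  }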


When the page loads, the same errors appear, plus a few more:
[screenshot: console errors on page load]

After that, I got tired of fiddling with Angular and downloaded the GitHub project to my home computer. That is, the broadcast runs on the Raspberry Pi, while the HTML page is served from the home computer. The only thing I changed was the connection string, pointing it at the Raspberry Pi's IP address:
[screenshot: connection string changed to the Raspberry Pi's IP]

But it didn't start in this variant either.
[screenshot: errors in this variant]

What are these errors and how can I fix them? The main puzzle is that everything works in a separate browser window but doesn't work in an iframe; how can that be?
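One thing worth ruling out here (my assumption; it isn't visible in the screenshots): browsers refuse plain ws:// connections from a page served over https, and an http iframe inside an https site is blocked as mixed content, so the embedded copy can fail while the directly opened http page works. A tiny guard makes any mismatch visible:

  // pick the WebSocket scheme matching how the page itself was served;
  // if this logs wss:// you need TLS on the Pi, or serve the page over http
  const scheme = location.protocol === 'https:' ? 'wss://' : 'ws://';
  const uri = scheme + '192.168.0.102:8080';
  console.log('connecting to', uri);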

Also, if possible: what training materials or articles should I read, and what should I search for on Google, to understand this code? How do I learn to build the same thing? The code is still hard for me to understand; I even ran into pieces of minified code (that's where the error messages come from) and couldn't make sense of them.
I tried writing to the author, but the project is over 5 years old and he never replied at the email listed in the contacts.

1 answer
Armenian Radio, 2021-08-20
@Jeer

1) Read Harutyunyan's articles from cover to cover:
2) Build up a picture of what a codec, a media container, and a transport are, and how they relate to each other.
You need to understand what the first, the second, and the third are (no, this is not a canteen menu), and how one affects the others.
For example, you need to understand that the codec must be tuned for realtime, namely by cranking its keyframe rate up to the maximum; otherwise the user will wait half a minute for an I-frame.
You need to understand that, depending on the platform and browser, you get restrictions on the codec, container, and transport: an old iPhone only understands h264 baseline, packed into an H264 Annex B stream and delivered over HLS (and that is just one link option).
That is, whatever comes out of the codec for a live broadcast has to be put into the right container and pushed through the right transport.
The first two tasks are handled by ffmpeg, the third by the broadcast server (a small sketch follows below).
The more devices you need to support, the more broadcast configurations and transcoded streams you need (if you want to broadcast in several resolutions, make ffmpeg do decode - scale - encode). If you want it viewable on a Linux box, encode in VP8/VP9.
A full answer covering all the options would take a whole book; it took me two months just to set up corporate streaming video calls with all the required wishes taken into account.
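To make the codec-tuning point concrete, here is a minimal Node/TypeScript sketch of driving ffmpeg for a realtime h264 stream (the capture device and flag values are illustrative assumptions, not a drop-in config):

import { spawn } from 'child_process';

// Encode camera input into a raw H264 Annex B stream tuned for realtime:
// -tune zerolatency drops lookahead/B-frame buffering, and a short GOP
// (-g 30, i.e. one keyframe per second at 30 fps) means a newly connected
// viewer does not wait half a minute for an I-frame.
const ffmpeg = spawn('ffmpeg', [
  '-f', 'v4l2', '-i', '/dev/video0',  // capture device (assumption)
  '-c:v', 'libx264',
  '-profile:v', 'baseline',           // widest decoder compatibility
  '-tune', 'zerolatency',
  '-g', '30',
  '-f', 'h264',                       // raw Annex B output
  'pipe:1',
]);

// the broadcast server's job: fan chunks out to connected clients
ffmpeg.stdout.on('data', (chunk: Buffer) => {
  // push chunk to WebSocket clients / an HLS segmenter here
});
ffmpeg.stderr.on('data', (line: Buffer) => console.error(line.toString()));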
