JavaScript
Andrey Medvedev, 2021-02-21 15:40:55

Why doesn't WebRTC getDisplayMedia() capture the sound of a remote stream?

Hi everyone! I have a native WebRTC video-conferencing application built on peerjs. I'm trying to implement recording with MediaRecorder and have run into a problem. I capture the desktop stream like this:

let desktopStream;
const chooseScreen = document.querySelector('.chooseScreenBtn');
chooseScreen.onclick = async () => {
    // Ask the browser for a screen/window/tab capture, requesting audio as well
    desktopStream = await navigator.mediaDevices.getDisplayMedia({ video: true, audio: true });
};
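
One hypothetical first step is to inspect which tracks actually ended up in the capture, since MediaRecorder can only ever record the tracks present in the stream it is given:

// List what the capture actually contains; the participants' remote audio
// will only be recorded if it shows up here as a track of desktopStream
desktopStream.getTracks().forEach(track => {
    console.log(track.kind, track.label, track.readyState);
});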


Next, I successfully render the resulting desktopStream in a <video> element:

const videoElement = document.querySelector('.videoElement');
videoElement.srcObject = desktopStream;
videoElement.muted = false;
videoElement.onloadedmetadata = () => { videoElement.play(); };


Suppose I capture desktopStream on an active conference page where everyone can already see and hear each other.

To test desktopStream, I open a YouTube video in another tab and can see and hear it in the videoElement just fine, but the audio from the other conference participants never makes it into this desktopStream. As a result, MediaRecorder, which takes the target stream as its first parameter, records a video with the desktop audio but without any sound from the participants. I don't even know where to start digging; can anyone give some good advice? If more code is needed, I'll gladly post it.
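
The recording side looks roughly like this (a minimal sketch; the exact options and the 'video/webm' container are illustrative):

const recorder = new MediaRecorder(desktopStream, { mimeType: 'video/webm' });
const chunks = [];
recorder.ondataavailable = (event) => {
    if (event.data.size > 0) chunks.push(event.data);
};
recorder.onstop = () => {
    // The blob only ever contains what was in desktopStream:
    // the desktop video and its audio, without the participants' sound
    const blob = new Blob(chunks, { type: 'video/webm' });
    // ...save or upload the blob
};
recorder.start();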
