webrtc

TURN-Server for RTCConfiguration

Submitted by ◇◆丶佛笑我妖孽 on 2021-01-29 16:04:20
Question: I'm wondering whether it is possible to establish a WebRTC connection between two clients that have different TURN servers configured. Does the specification require the configurations to be identical? What happens when they each specify different servers? Will they choose one at random, or is the connection impossible? Answer 1: The configurations don't have to be identical. If both peers require a relay connection, the TURN servers will communicate directly with each other (via their allocation ports). If…
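The answer above can be sketched in code. This is a minimal illustration, not taken from the original answer: the server URLs, usernames, and credentials are made-up placeholders, and `buildConfig` is a hypothetical helper.

```javascript
// Hypothetical helper: build an RTCConfiguration with a single TURN server.
function buildConfig(turnUrl, username, credential) {
  return { iceServers: [{ urls: turnUrl, username, credential }] };
}

// Each peer uses its own relay; the two configurations never need to match.
const configA = buildConfig("turn:turn-a.example.com:3478", "alice", "secretA");
const configB = buildConfig("turn:turn-b.example.com:3478", "bob", "secretB");

// In the browser, each side creates its connection independently:
// const pcA = new RTCPeerConnection(configA); // peer A uses relay A
// const pcB = new RTCPeerConnection(configB); // peer B uses relay B
// ICE signals each relay's allocation address as an ordinary relay candidate,
// so the two TURN servers end up exchanging media directly.
```

The key point is that a TURN server's allocation address travels inside the candidate, so neither peer ever needs to know about the other's relay configuration.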

Event listener for when blocked camera and microphone access is allowed

Submitted by 百般思念 on 2021-01-29 14:40:44
Question: With getUserMedia, media access may be blocked by the user. navigator.mediaDevices.getUserMedia({ audio: true }) .then((stream) => { ... }) .catch(() => { this.usermedia_blocked = true; }); When the setting is changed to "always allow", I want to continue with the subsequent task. But how can I detect that "always allow & done" has been clicked? Appear.in starts the video call after "always allow & done" is clicked, and I want to do the same thing. Answer 1: appear.in developer here. What we do is poll…
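The polling approach the answer starts to describe can be sketched with a generic poll helper. The helper below is an assumption about the shape of that approach, not appear.in's actual code; the `getUserMedia` usage in the trailing comment is the hypothetical browser-side check.

```javascript
// Resolve once check() returns true; give up after maxTries attempts.
function pollUntil(check, intervalMs = 500, maxTries = 20) {
  return new Promise((resolve, reject) => {
    let tries = 0;
    const timer = setInterval(async () => {
      if (await check()) {
        clearInterval(timer);
        resolve(true);
      } else if (++tries >= maxTries) {
        clearInterval(timer);
        reject(new Error("timed out waiting for permission"));
      }
    }, intervalMs);
  });
}

// In the browser (hypothetical usage): keep retrying getUserMedia until the
// user flips the permission to "always allow", then continue the call setup.
// pollUntil(() =>
//   navigator.mediaDevices.getUserMedia({ audio: true })
//     .then(() => true, () => false)
// ).then(startCall);
```

Retrying `getUserMedia` itself avoids depending on the Permissions API, which at the time of the question was not uniformly available.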

How to share screen remotely in a video/audio call?

Submitted by 喜夏-厌秋 on 2021-01-29 14:19:40
Question: I am building a video call app with a screen-sharing feature: users can share their screen during the call. I'm using a WebRTC SDK, but it only supports screen sharing when the call starts, not while the call is ongoing. One can tick the screen-sharing option and start the call, but cannot start screen sharing mid-call. I added a button on the CallActivity screen which, on click, calls the MediaProjection class of Android to…
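The question is about Android, but the mid-call track swap it is reaching for maps onto the web API's `RTCRtpSender.replaceTrack()`, which swaps the outgoing video without renegotiation. The sketch below is a web-API analogy, not the SDK's code; `findVideoSender` is a hypothetical helper.

```javascript
// Hypothetical helper: locate the RTCRtpSender currently carrying video,
// so its track can be swapped mid-call.
function findVideoSender(senders) {
  return senders.find(s => s.track && s.track.kind === "video") || null;
}

// In the browser (hypothetical usage), switching to screen share mid-call:
// const sender = findVideoSender(pc.getSenders());
// const screen = await navigator.mediaDevices.getDisplayMedia({ video: true });
// await sender.replaceTrack(screen.getVideoTracks()[0]); // no renegotiation
```

On Android the analogous move is to keep the same video track/sender and switch its capturer from the camera source to a MediaProjection-based screen capturer.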

onIceCandidate is never called after setRemoteDescription - WebRTC - Android

Submitted by 对着背影说爱祢 on 2021-01-29 13:00:45
Question: I followed a flow that explains how to connect with WebRTC, but I'm stuck: after I get the SDP offer, I'm supposed to call setRemoteDescription() and then receive the onIceCandidate callback, but the callback never fires. I can show a piece of my code if you need it. Thanks for helping. Answer 1: First things first: I do not know much about WebRTC on Android, but I imagine it is very similar to the web API, so I used the standard JS in my flow below. Regarding the onicecandidate trigger: the…
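The usual explanation for this symptom is that ICE candidate gathering starts after `setLocalDescription()`, not after `setRemoteDescription()`. The sketch below models that rule with a deliberately simplified predicate (an assumption for illustration, not the real browser state machine):

```javascript
// Simplified model: onicecandidate can only start firing once a local
// description has been applied. Setting only the remote description
// never starts candidate gathering.
function willGatherCandidates(pc) {
  return pc.localDescription != null;
}

// Correct answering flow in the browser:
// await pc.setRemoteDescription(offer);   // no candidates yet
// const answer = await pc.createAnswer();
// await pc.setLocalDescription(answer);   // gathering starts; onicecandidate fires
```

So the likely fix on Android is the same as on the web: create and set the local answer after applying the remote offer, and only then expect onIceCandidate.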

Why does the front end of a WebRTC publishing/playback platform fail to fetch the video stream list?

Submitted by 做~自己de王妃 on 2021-01-29 10:18:42
In our earlier WebRTC work we implemented WebRTC-based video publishing and playback, but the solution still has rough edges, so we are continuing to optimize it. A few days ago we retested WebRTC publishing: we opened the webrtc client page and published a camera stream, and this time it failed with: Error in render: "TypeError: Cannot read property 'length' of undefined" Found in xxxxxx. Analysis: the producers value comes from the server, and the front end is built with Vue; the render node errors only when the producers property is missing. We added a console.log at the source of the render data and reran it in the browser (screenshots omitted). The log showed no video id, yet the Vue code reads producers.length; since the source data lacks that property, the render throws. Solution: fix it at the data source by checking whether the producers property exists before rendering. Source: oschina Link: https://my.oschina.net/u/4619556/blog/4933269
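The fix the post describes (guard the server-supplied `producers` field before reading `.length`) can be sketched as a small helper; the function name is a hypothetical illustration, not the post's actual code:

```javascript
// Guard against the server omitting `producers`: return 0 instead of
// throwing "Cannot read property 'length' of undefined".
function producerCount(data) {
  return Array.isArray(data.producers) ? data.producers.length : 0;
}

// In a Vue template one would bind to producerCount(serverData) (or use
// `v-if="serverData.producers"`) instead of reading producers.length directly.
```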

Adaptive rate control for WebRTC: Does resolution change as well?

Submitted by 眉间皱痕 on 2021-01-28 05:06:07
Question: It seems obvious that WebRTC uses its own rate control (GCC) to control the encoder bitrate, but I couldn't find any information about resolution changes. Does WebRTC (or another realtime video system like Hangouts or Skype) change only the bitrate, and not the resolution, during live ingest? If it doesn't change the resolution, why not? According to the bitrate/quality/resolution curve shown below, using only a single fixed resolution to cover bitrate changes seems not…
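Conceptually, an adaptive sender does step resolution down when the target bitrate drops (in libwebrtc this is quality/resolution scaling). The ladder below is a toy illustration of the idea: the thresholds and rungs are invented for this sketch and are not libwebrtc's actual numbers.

```javascript
// Toy resolution ladder (assumed thresholds, purely illustrative):
// pick a capture/encode resolution from the rate controller's target bitrate.
function pickResolution(kbps) {
  if (kbps >= 1500) return "1280x720";
  if (kbps >= 600) return "640x480";
  if (kbps >= 250) return "320x240";
  return "160x120";
}
```

The rationale matches the curve the question refers to: at low bitrates a smaller resolution encoded cleanly looks better than a large resolution starved of bits.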

WebRTC cannot record screen

Submitted by 我的梦境 on 2021-01-28 04:58:15
Question: I'm trying to make a screen-sharing app using WebRTC. I have code that can capture and share the video stream from the camera; I need to modify it to get the video via the MediaProjection API instead. Based on this post I modified my code to use org.webrtc.ScreenCapturerAndroid, but no video output is shown, only a black screen. If I use the camera, everything works fine (I can see the camera output on screen). Could someone please check my code and maybe point me in the right direction? I have been stuck on…

Convert WebM/H.264 to MP4/H.264 efficiently with ffmpeg.js

Submitted by 五迷三道 on 2021-01-28 00:12:59
Question: Following on from the answer here: Recording cross-platform (H.264?) videos using WebRTC MediaRecorder - how can one use ffmpeg.js to efficiently unwrap a WebM H.264 video and re-wrap it into an MP4 container? I'm looking through the docs: https://github.com/Kagami/ffmpeg.js?files=1 However, I don't see any examples of the above (or perhaps I'm searching for the wrong terminology). This operation will be performed in the browser (Chrome) prior to uploading as a Blob - I could use a web…
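The efficient version of this operation is a stream copy: remux the H.264 bitstream from WebM into MP4 without re-encoding, i.e. the equivalent of `ffmpeg -i in.webm -c copy out.mp4`. Below, the argument list is factored into a testable helper; the commented ffmpeg.js invocation follows the Kagami/ffmpeg.js MEMFS calling convention but is a sketch, not verified against a specific build.

```javascript
// Build the ffmpeg argument list for a container remux: "-c copy" copies the
// H.264 stream verbatim instead of re-encoding it, so it is fast and lossless.
function remuxArgs(input, output) {
  return ["-i", input, "-c", "copy", output];
}

// With ffmpeg.js in the browser (hypothetical usage; `bytes` is a Uint8Array
// of the recorded WebM, e.g. from the MediaRecorder Blob):
// const result = ffmpeg({
//   MEMFS: [{ name: "in.webm", data: bytes }],
//   arguments: remuxArgs("in.webm", "out.mp4"),
// });
// const mp4 = new Blob([result.MEMFS[0].data], { type: "video/mp4" });
```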

WebRTC - Differentiate between a temporary and a permanent disconnect or failure

Submitted by 此生再无相见时 on 2021-01-27 18:15:06
Question: UPDATE: It seems I can use myPeerConnection.getStats() as described here to measure the bytes sent or received. If they keep increasing, we are still connected and the disconnected ICE state can be treated as temporary; otherwise it is permanent. But now I am confused about which bytes I should measure. There are inbound-rtp, outbound-rtp, remote-inbound-rtp and remote-outbound-rtp. I want to make sure that both sides are actually receiving data from each other. So what should I measure from…
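The comparison the question describes can be reduced to a pure function over two stats snapshots. This is a sketch of the approach, with assumed snapshot shapes: `bytesReceived` would come from an inbound-rtp entry (data we are receiving) and `bytesSent` from an outbound-rtp entry; remote-inbound-rtp additionally reports what the peer has received from us.

```javascript
// Given two snapshots taken some interval apart, decide whether media is
// still flowing in both directions. If so, treat a "disconnected" ICE state
// as temporary; if counters have stalled, treat it as permanent.
function isStillFlowing(prev, curr) {
  return curr.bytesReceived > prev.bytesReceived &&
         curr.bytesSent > prev.bytesSent;
}

// In the browser (hypothetical usage):
// const report = await myPeerConnection.getStats();
// report.forEach(s => {
//   if (s.type === "inbound-rtp") snapshot.bytesReceived = s.bytesReceived;
//   if (s.type === "outbound-rtp") snapshot.bytesSent = s.bytesSent;
// });
```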

Why does the “on-air” indicator stay on when both audio and video MediaStreamTracks have their enabled parameter set to false?

Submitted by 我是研究僧i on 2021-01-27 15:50:38
Question: The W3C article on the media flow lifecycle (http://www.w3.org/TR/mediacapture-streams/#life-cycle-and-media-flow) says: When all tracks connected to a source are muted or disabled, the "on-air" or "recording" indicator for that source can be turned off; when the track is no longer muted or disabled, it must be turned back on. Is the mentioned behaviour implemented in Chrome or not? Answer 1: Update: Firefox now supports turning off the camera light on mute. Edits in bold. I gather from the subject that you…
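The distinction at play is between disabling tracks (`track.enabled = false`, which per the answer turns the light off in Firefox but, at the time of the question, not in Chrome) and stopping them (`track.stop()`, which releases the device everywhere). The helper below is an illustrative sketch, not code from the question; it is exercised here against a stubbed stream object.

```javascript
// Disable (or re-enable) every track on a stream. Disabled tracks send
// silence/black frames but keep the device open, so some browsers keep
// the "on-air" light on.
function setStreamEnabled(stream, enabled) {
  stream.getTracks().forEach(t => { t.enabled = enabled; });
  return stream.getTracks().every(t => t.enabled === enabled);
}

// To reliably turn the indicator off in every browser, release the device:
// stream.getTracks().forEach(t => t.stop());
// ...and call getUserMedia() again when the user unmutes.
```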