webrtc

How to get the number of socket.io clients in a room?

Submitted by 狂风中的少年 on 2020-12-11 04:34:39
Question: My socket.io version is 1.3.5 and I want to get the number of clients in a particular room. This is my code:

socket.on('create or join', function (numClients, room) {
  socket.join(room);
});

I use this code to get the clients in a room:

console.log('Number of clients', io.sockets.clients(room));

Answer 1: To get the number of clients in a room you can do the following:

function NumClientsInRoom(namespace, room) {
  var clients = io.nsps[namespace].adapter.rooms[room];
  return Object.keys(clients).length;
}

This
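The answer above is truncated. As a sketch for socket.io 1.x, assuming `io.nsps[namespace].adapter.rooms[room]` is still a plain object keyed by socket id (as it was in that version), a defensive variant of the helper might look like this; `numClientsInRoom` is a hypothetical name, not part of socket.io's API:

```javascript
// Sketch for socket.io 1.x: rooms[room] is a plain object keyed by socket id.
// numClientsInRoom is a hypothetical helper, not a socket.io API.
function numClientsInRoom(io, namespace, room) {
  var nsp = io.nsps[namespace];
  var clients = nsp && nsp.adapter.rooms[room];
  // An unknown namespace or an empty/unknown room yields 0 instead of a crash.
  return clients ? Object.keys(clients).length : 0;
}
```

Note that in socket.io 3.x/4.x the adapter changed shape: `io.sockets.adapter.rooms.get(room)` returns a `Set` of socket ids, so `.size` replaces `Object.keys(...).length`.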

Android WebRTC client with pre-encoded H.264 video stream

Submitted by 大憨熊 on 2020-12-08 14:56:43
Question: I have a video stream source that sends bytes of H.264-encoded video. I'd like to build an application with Android's WebRTC classes to send this video stream to a WebRTC peer. The built-in classes seem to support only raw video sources, not video already processed by a codec. I simply need to create an offer with only one video codec/bitrate configuration. For my use case, I don't need to autoscale the bandwidth usage, nor offer any codecs other than the original H.264 stream of bytes.
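No answer excerpt is included for this entry. One common way to force a single codec in the offer, independent of the capture path, is to filter the SDP before applying it. The sketch below is a hypothetical helper (not part of any WebRTC API) that keeps only H.264 payload types in the video m-line; it assumes the usual `a=rtpmap`/`a=fmtp`/`a=rtcp-fb` layout:

```javascript
// Hypothetical SDP filter: keep only H.264 payload types in the video m-line.
function keepOnlyH264(sdp) {
  var lines = sdp.split('\r\n');
  // Collect payload types that a=rtpmap maps to H264.
  var h264 = new Set();
  lines.forEach(function (l) {
    var m = l.match(/^a=rtpmap:(\d+) H264\//);
    if (m) h264.add(m[1]);
  });
  return lines
    .map(function (l) {
      // Rewrite the video m-line to list only H.264 payload types.
      var m = l.match(/^m=video (\S+) (\S+) (.*)$/);
      if (m) return 'm=video ' + m[1] + ' ' + m[2] + ' ' + Array.from(h264).join(' ');
      return l;
    })
    .filter(function (l) {
      // Drop codec attribute lines that refer to non-H.264 payload types.
      var m = l.match(/^a=(rtpmap|fmtp|rtcp-fb):(\d+)/);
      return !(m && !h264.has(m[2]));
    })
    .join('\r\n');
}
```

On Android the same munging would be applied to the `SessionDescription` string between `createOffer` and `setLocalDescription`; the helper itself is plain string processing, so it ports directly.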

How to use WebRTC for low-latency live streaming on the TSINGSEE青犀视频 cloud-edge-device intelligent video analysis platform?

Submitted by 房东的猫 on 2020-12-08 11:47:31
At present, TSINGSEE青犀视频's cloud-edge-device intelligent video analysis platforms all support low-latency live streaming. In our tests, the lowest-latency protocols were ws-FLV and RTMP, with a best-case latency of about 1 s. RTMP, which most domestic vendors currently use, is optimized on the server side compared with HLS: the RTMP server no longer slices the stream into segments but forwards each frame individually, so CDN distribution latency is very small.

The image above shows the stream playback interface of the national-standard (GB28181) video platform EasyGBS, which can output streams over three different protocols. FLV is quite common in low-latency live streaming, and RTMP can also reach low latency; both are worth studying.

As requirements for live streaming keep rising, we need to explore even lower-latency solutions, and WebRTC is precisely the emerging path of this technology. That is why the TSINGSEE青犀视频 R&D team keeps testing WebRTC.

During testing we found that the standard WebRTC ingest path has various limitations: it does not support the AAC audio codec or the 44.1 kHz sample rate commonly used in live streaming; it does not support video features such as B-frames or H.265; multi-slice encoding can produce corrupted frames on weak networks; and WebRTC connection setup takes too long, which hurts the instant-start experience. We are therefore looking for a more efficient, more compatible ingest protocol so that WebRTC can be used for live streaming.

Advantages of standard WebRTC ingest:
Apart from the HTTP connection-setup request, it fully complies with the WebRTC specification.
Standard endpoints can connect easily.
Prototypes can be implemented quickly.
Standard
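The codec limitations described above can be confirmed at runtime in a browser via `RTCRtpSender.getCapabilities(kind)`. The sketch below is a hypothetical helper around that call; in a non-browser environment the capabilities object has to be supplied (or mocked) by the caller:

```javascript
// Hypothetical helper: report whether a codec MIME type appears in a
// capabilities object shaped like the result of RTCRtpSender.getCapabilities().
function supportsCodec(capabilities, mimeType) {
  return !!capabilities && capabilities.codecs.some(function (c) {
    return c.mimeType.toLowerCase() === mimeType.toLowerCase();
  });
}
// In a browser: supportsCodec(RTCRtpSender.getCapabilities('audio'), 'audio/opus')
// is expected to be true, while 'audio/aac' is expected to be false,
// matching the AAC limitation described above.
```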

How to enable Bitcode for WebRTC iOS framework?

Submitted by 谁都会走 on 2020-12-08 08:00:32
Question: How can I compile the WebRTC iOS framework with Bitcode enabled? Currently I have to disable Bitcode in my project because of the WebRTC framework.

Answer 1: You will need to build it yourself. Something like:

# Clone the depot tools
git clone https://chromium.googlesource.com/chromium/tools/depot_tools.git
# Add the tools to the path
export PATH=$PATH:"`pwd`/depot_tools"
# Download the WebRTC source code
mkdir webrtc_ios
cd webrtc_ios
# This will take some time
fetch --nohooks webrtc_ios
gclient sync
# Then build the framework with Bitcode enabled (flag name as of
# that era's tooling; verify against your checkout)
cd src
python tools_webrtc/ios/build_ios_libs.py --bitcode

iOS screen sharing (using ReplayKit) using WebRTC in swift

Submitted by 泪湿孤枕 on 2020-12-08 03:57:23
Question: I have successfully implemented ReplayKit. SampleHandler.swift:

class SampleHandler: RPBroadcastSampleHandler {
    override func broadcastStarted(withSetupInfo setupInfo: [String : NSObject]?) { }
    override func processSampleBuffer(_ sampleBuffer: CMSampleBuffer, with sampleBufferType: RPSampleBufferType) {
        switch sampleBufferType {
        case RPSampleBufferType.video:
            break
        case RPSampleBufferType.audioApp:
            break
        case RPSampleBufferType.audioMic:
            break
        @unknown default:
            return
        }
    }
}

Question: How I

Will ICE negotiations between peers behind two symmetric NAT's result in requiring two TURN servers?

Submitted by 不想你离开。 on 2020-12-06 13:12:17
Question: I read RFC 5766 (TURN) and RFC 8445 (ICE), but I feel there is a bit of a disconnect between how TURN can be used and how ICE actually utilizes relay candidates. The TURN RFC describes the use of one single TURN server to ferry data between a client and a peer: the transport address on the TURN server accepts data flow from a client via TURN messages, whereas the relayed transport address accepts data flow from peer(s) via UDP. This sounds great: one TURN server and bidirectional data flow.
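To the scenario in the title: with both peers behind symmetric NATs, each peer gathers relay candidates from its own TURN server, and ICE connectivity checks can end up selecting a pair of two relayed candidates, so media may indeed traverse two TURN servers. A minimal client-side configuration sketch follows; the server URLs and credentials are placeholders, not real endpoints:

```javascript
// Hypothetical RTCPeerConnection configuration: this client supplies its own
// TURN server; the remote peer may independently supply a different one.
var rtcConfig = {
  iceServers: [
    { urls: 'turn:turn-a.example.com:3478', username: 'user', credential: 'secret' },
    // Same hypothetical server over TCP/443 as a fallback for restrictive networks.
    { urls: 'turn:turn-a.example.com:443?transport=tcp', username: 'user', credential: 'secret' }
  ],
  // 'relay' restricts ICE to relayed candidates; useful for testing the
  // TURN path, usually omitted in production.
  iceTransportPolicy: 'relay'
};
// In a browser: var pc = new RTCPeerConnection(rtcConfig);
```

Because candidates are exchanged over signaling, a relayed candidate from peer A can be paired with a relayed candidate from peer B even though neither client ever configured the other's TURN server.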