libjingle

Generating libjingle VS solution and projects

二次信任 posted on 2019-12-11 13:16:13
Question: I'm trying to generate the .sln and .vcproj files from a checkout of the libjingle svn repository. I set all the environment variables defined in the README, but I keep getting this error:

```
C:\src>hammer --mode=all --vsproj
scons: *** No SConstruct file found.
File "c:\src\libjingle\scons-local\scons-local-2.1.0\SCons\Script\Main.py", line 904, in _main
```

There is in fact no such file, but I can't find any info on how or where to create it. Is this problem related to the Python or SCons installation, or is something

Can't add remote session description in WebRTC Android client

旧时模样 posted on 2019-12-08 07:27:04
Question: This is the response from the server:

```
{
  "rtcid": "wKAm8eeyI-mQ5dsslkhu",
  "msgType": "offer",
  "senderrtcid": "53wp_LP5CYDie3eIlkhw",
  "msgData": {
    "type": "offer",
    "sdp": "v=0\r\no=- 951920257545056255 2 IN IP4 127.0.0.1\r\ns=-\r\nt=0 0\r\na=group:BUNDLE audio video\r\na=msid-semantic: WMS OfkjcHABgxUkHlk8mfJ8ayYZdCHqdpQGFSTM\r\nm=audio 1 RTP/SAVPF 111 103 104 0 8 106 105 13 126\r\nc=IN IP4 0.0.0.0\r\na=rtcp:1 IN IP4 0.0.0.0\r\na=ice-ufrag:CF4q+RW54gQVPaz0\r\na=ice-pwd:hEIbgX4MME6cPkZKGih7bjQM\r\na=ice-options
```
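The rest of the question is cut off, but for context: applying an offer like this in an Android libjingle client goes through PeerConnection.setRemoteDescription. A minimal sketch, assuming the libjingle_peerconnection Java bindings and org.json for parsing (RemoteOfferHelper and applyRemoteOffer are hypothetical names):

```java
import org.json.JSONException;
import org.json.JSONObject;
import org.webrtc.PeerConnection;
import org.webrtc.SdpObserver;
import org.webrtc.SessionDescription;

public final class RemoteOfferHelper {
    // Sketch: apply the "msgData" payload shown above as the remote description.
    static void applyRemoteOffer(PeerConnection pc, JSONObject msgData)
            throws JSONException {
        SessionDescription remoteSdp = new SessionDescription(
                SessionDescription.Type.fromCanonicalForm(msgData.getString("type")),
                msgData.getString("sdp"));
        pc.setRemoteDescription(new SdpObserver() {
            @Override public void onSetSuccess() { /* create and send the answer next */ }
            @Override public void onSetFailure(String error) { /* a malformed or truncated SDP fails here */ }
            @Override public void onCreateSuccess(SessionDescription sdp) {}
            @Override public void onCreateFailure(String error) {}
        }, remoteSdp);
    }
}
```

One hedged note: a frequent cause of this failure mode is passing the whole JSON message as the SDP instead of just the "sdp" string from "msgData".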

How do I record media stream data as H.264 (MP4) in WebRTC Android?

倖福魔咒の posted on 2019-12-08 00:19:35
Question: Please help me! I used the example at https://github.com/pchab/AndroidRTC to stream video and audio from one Android device to another. In this example they used two libraries, libjingle_peerConnection and a Socket.IO client, but I don't know how to save the streamed data in H.264 format.

Source: https://stackoverflow.com/questions/30685276/how-do-record-media-stream-data-as-h-264mp4-in-webrtc-android
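The PeerConnection Java API of that era does not hand the app the encoded H.264 bitstream directly, so the usual workaround is to tap decoded frames from the remote track and re-encode them on device. A rough sketch of the frame tap, assuming the newer org.webrtc VideoSink API (the libjingle build used by AndroidRTC exposes VideoRenderer.Callbacks instead); the MediaCodec/MediaMuxer wiring is elided:

```java
import org.webrtc.VideoFrame;
import org.webrtc.VideoSink;

// Sketch: tap decoded remote frames so they can be re-encoded to H.264
// and muxed into an MP4 (MediaCodec + MediaMuxer wiring not shown).
public final class RecordingSink implements VideoSink {
    @Override
    public void onFrame(VideoFrame frame) {
        frame.retain();                                   // keep the buffer alive while in use
        VideoFrame.I420Buffer i420 = frame.getBuffer().toI420();
        // ... feed the i420 planes into a MediaCodec configured for "video/avc",
        // then write its output samples to a MediaMuxer producing an .mp4 ...
        i420.release();
        frame.release();
    }
}

// Usage (assumed names): remoteVideoTrack.addSink(new RecordingSink());
```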

Stream from Android app to browser client app via socket.io-client and libjingle

那年仲夏 posted on 2019-12-06 12:39:02
Question: So I'm trying to connect Android to a browser via WebRTC, using socket.io and libjingle, with the server running on Node.js. The issue I'm facing is weird. When one client is Android (native app) and the other is an iPad (native app), everything works fine. When one client is an iPad (native app) and the other is the web app, everything works fine. But when one client is Android (native app) and the other is a web page, everything works fine except that audio and video are not streamed to that end. The following are the two major classes I've used for this purpose (PS: the method makeOffer(View v) is called by the button). MainActivity
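The two classes themselves are cut off in this capture. As a hedged illustration only (OfferSketch and its fields are placeholders, not the poster's code): when the remote peer is a browser, a frequent culprit for media flowing in one direction only is an offer that doesn't mandate receiving audio/video.

```java
import android.view.View;
import org.webrtc.MediaConstraints;
import org.webrtc.PeerConnection;
import org.webrtc.SdpObserver;

public class OfferSketch {
    PeerConnection peerConnection;  // assumed: created earlier
    SdpObserver sdpObserver;        // assumed: forwards the offer over socket.io

    // Wired to the button, as in the question. The mandatory constraints
    // ask the remote (browser) side to send its audio and video back.
    public void makeOffer(View v) {
        MediaConstraints sdpConstraints = new MediaConstraints();
        sdpConstraints.mandatory.add(
                new MediaConstraints.KeyValuePair("OfferToReceiveAudio", "true"));
        sdpConstraints.mandatory.add(
                new MediaConstraints.KeyValuePair("OfferToReceiveVideo", "true"));
        peerConnection.createOffer(sdpObserver, sdpConstraints);
    }
}
```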

Import Objective-C Framework (CocoaPod) into Swift?

生来就可爱ヽ(ⅴ<●) posted on 2019-12-04 03:33:44
I'm trying to import the libjingle_peerconnection framework into my Xcode project, but for some reason I can't import the Objective-C header with `import RTCICEServer` in Swift source files. I have attempted to use header files, etc. What am I doing wrong?

```ruby
# Uncomment this line to define a global platform for your project
# platform :ios, '8.0'
# Uncomment this line if you're using Swift
use_frameworks!

target 'VideoRTCTest' do
  pod "libjingle_peerconnection"
end

target 'VideoRTCTestTests' do
end

target 'VideoRTCTestUITests' do
end
```

Answer (SwiftArchitect): Bridge

1. Create a xxx-Bridging-Header. Add a

Native Android WebRTC application development

半腔热情 posted on 2019-12-03 18:34:27
Question: I am trying to create an Android application for video chat and messaging using the WebRTC native APIs. I have been through several links and found that most of the documentation for Android is vague, especially if you don't know where to start. I followed these links:

https://webrtc.org/native-code/android/#
https://www.chromium.org/developers/how-tos/android-build-instructions

But the links above don't make much sense to me, because I want to develop an Android application
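As a concrete starting point, the prebuilt libjingle_peerconnection Java bindings are usually easier than a full Chromium build. A minimal bootstrap sketch; exact signatures vary across libjingle/WebRTC revisions, so treat this as orientation rather than the definitive API:

```java
import android.content.Context;
import java.util.Collections;
import java.util.List;
import org.webrtc.MediaConstraints;
import org.webrtc.PeerConnection;
import org.webrtc.PeerConnectionFactory;

public final class WebRtcBootstrap {
    // Sketch: the usual entry point. `observer` receives ICE candidates and
    // remote streams; exchanging them with the peer (signaling) is the app's job.
    static PeerConnection create(Context context, PeerConnection.Observer observer) {
        // Signature varies across revisions; this matches the libjingle-era bindings.
        PeerConnectionFactory.initializeAndroidGlobals(
                context, /* audio */ true, /* video */ true, /* hwAccel */ true, null);
        PeerConnectionFactory factory = new PeerConnectionFactory();
        List<PeerConnection.IceServer> iceServers = Collections.singletonList(
                new PeerConnection.IceServer("stun:stun.l.google.com:19302"));
        return factory.createPeerConnection(iceServers, new MediaConstraints(), observer);
    }
}
```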

Create a WebRTC VideoTrack with a “custom” Capturer on Android with libjingle

空扰寡人 posted on 2019-12-03 15:29:54
Question: How can I use a "custom" video capturer to create a VideoTrack and provide frames? The classic approach to creating a VideoTrack is:

```java
// 1 - Get a VideoCapturer instance
VideoCapturer capturer = VideoCapturer.create(name);
// 2 - Create a VideoSource
VideoSource videoSource = peerconnectionFactory.createVideoSource(capturer, videoConstraints);
// 3 - Create a VideoTrack using the video source
VideoTrack videoTrack = peerconnectionFactory.createVideoTrack("Label", videoSource);
// 4 - Add the track to the MediaStream
```

I was wondering if there is a way to change step one. Instead of using the native Capturer

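The question is cut off here, but for context: the libjingle-era API offered no public hook for replacing step one, while later org.webrtc Android revisions turn VideoCapturer into an interface an app can implement and push frames into itself. A rough sketch under that assumption (CustomCapturer and pushFrame are illustrative names, not library API):

```java
import android.content.Context;
import org.webrtc.CapturerObserver;
import org.webrtc.SurfaceTextureHelper;
import org.webrtc.VideoCapturer;
import org.webrtc.VideoFrame;

// Sketch: a capturer that forwards externally produced frames into the
// VideoSource created from it (newer org.webrtc API, not libjingle's).
public class CustomCapturer implements VideoCapturer {
    private CapturerObserver observer;

    @Override
    public void initialize(SurfaceTextureHelper helper, Context context,
                           CapturerObserver observer) {
        this.observer = observer;  // the hook back into the VideoSource
    }

    @Override
    public void startCapture(int width, int height, int framerate) {
        observer.onCapturerStarted(true);
    }

    // Call this from your own frame producer (file reader, decoder, renderer, ...).
    public void pushFrame(VideoFrame.I420Buffer buffer, long timestampNs) {
        VideoFrame frame = new VideoFrame(buffer, 0 /* rotation */, timestampNs);
        observer.onFrameCaptured(frame);
        frame.release();
    }

    @Override public void stopCapture() { observer.onCapturerStopped(); }
    @Override public void changeCaptureFormat(int width, int height, int framerate) {}
    @Override public void dispose() {}
    @Override public boolean isScreencast() { return false; }
}
```

In those newer revisions the wiring also changes: the VideoSource is created first and the capturer is initialized with the source's CapturerObserver, rather than passing the capturer into createVideoSource as in the steps above.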