replaykit

Saving video from CMSampleBuffer while streaming using ReplayKit

牧云@^-^@ submitted on 2019-12-03 08:03:45
Question: I'm streaming the content of my app to my RTMP server using RPBroadcastSampleHandler. One of the methods is

    override func processSampleBuffer(_ sampleBuffer: CMSampleBuffer, with sampleBufferType: RPSampleBufferType) {
        switch sampleBufferType {
        case .video:
            streamer.appendSampleBuffer(sampleBuffer, withType: .video)
            captureOutput(sampleBuffer)
        case .audioApp:
            streamer.appendSampleBuffer(sampleBuffer, withType: .audio)
            captureAudioOutput(sampleBuffer)
        case .audioMic:
            ()
        }
    }

And the captureOutput method is

    self.lastSampleTime = CMSampleBufferGetPresentationTimeStamp(sampleBuffer);
    // Append …
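
A common way to persist those buffers while also streaming them is an AVAssetWriter. The sketch below is a minimal illustration under that assumption, not the asker's code; the BufferRecorder type, its property names, and the output settings are placeholders.

    import AVFoundation

    final class BufferRecorder {
        private var assetWriter: AVAssetWriter?
        private var videoInput: AVAssetWriterInput?

        func start(outputURL: URL) throws {
            let writer = try AVAssetWriter(outputURL: outputURL, fileType: .mp4)
            let settings: [String: Any] = [
                AVVideoCodecKey: AVVideoCodecType.h264,
                AVVideoWidthKey: 720,
                AVVideoHeightKey: 1280
            ]
            let input = AVAssetWriterInput(mediaType: .video, outputSettings: settings)
            input.expectsMediaDataInRealTime = true
            writer.add(input)
            assetWriter = writer
            videoInput = input
        }

        // Called from processSampleBuffer(_:with:) for .video buffers.
        func append(_ sampleBuffer: CMSampleBuffer) {
            guard let writer = assetWriter, let input = videoInput else { return }
            if writer.status == .unknown {
                writer.startWriting()
                writer.startSession(atSourceTime: CMSampleBufferGetPresentationTimeStamp(sampleBuffer))
            }
            if writer.status == .writing, input.isReadyForMoreMediaData {
                input.append(sampleBuffer)
            }
        }

        func finish(completion: @escaping () -> Void) {
            videoInput?.markAsFinished()
            assetWriter?.finishWriting(completionHandler: completion)
        }
    }

startSession(atSourceTime:) is keyed off the first buffer's presentation timestamp, which appears to be what lastSampleTime tracks in the excerpt above.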

ReplayKit: startRecording() completion handler is never entered

旧巷老猫 submitted on 2019-12-01 06:09:37
Question: Problem description: The startRecording() completion handler is never entered, even though the "Allow screen recording in $AppName" pop-up was shown. The pop-up itself is shown occasionally. This also happens when I remove the app, restart the device, and do a clean/build of the project. I'm using an iPad Air 2 with iOS 11 and Xcode 9. Research: This problem seemed to be an issue in earlier versions as well; see here: replaykit startrecording sometimes never enters …
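
For context, a minimal call site for the API in question looks roughly like the following sketch; the RecordingController wrapper and the logging are illustrative, not taken from the report.

    import ReplayKit

    final class RecordingController {
        private let recorder = RPScreenRecorder.shared()

        func beginRecording() {
            guard recorder.isAvailable else {
                print("Screen recording is not available")
                return
            }
            recorder.isMicrophoneEnabled = true
            recorder.startRecording { error in
                // In the reported problem, this closure never runs.
                if let error = error {
                    print("startRecording failed: \(error.localizedDescription)")
                } else {
                    print("Recording started")
                }
            }
        }
    }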

ReplayKit, startCaptureWithHandler() not sending CMSampleBufferRef of Video type in captureHandler

我们两清 submitted on 2019-11-30 03:58:05
Question: I've implemented an RPScreenRecorder, which records the screen as well as mic audio. After multiple recordings are completed, I stop the recording and merge the audio with the video using AVMutableComposition, and then merge all the videos to form a single video. For screen recording and getting the video and audio files, I am using

    - (void)startCaptureWithHandler:(nullable void (^)(CMSampleBufferRef sampleBuffer, RPSampleBufferType bufferType, NSError * _Nullable error))captureHandler completionHandler …
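
That Objective-C method imports into Swift as startCapture(handler:completionHandler:). A sketch of the handler's shape, with placeholder logging only, is:

    import ReplayKit

    func startCapture() {
        RPScreenRecorder.shared().startCapture(handler: { sampleBuffer, bufferType, error in
            if let error = error {
                print("Capture error: \(error.localizedDescription)")
                return
            }
            switch bufferType {
            case .video:
                // In the reported problem, this case is never reached.
                print("Got a video sample buffer")
            case .audioApp:
                print("Got an app audio sample buffer")
            case .audioMic:
                print("Got a mic audio sample buffer")
            @unknown default:
                break
            }
        }, completionHandler: { error in
            if let error = error {
                print("startCapture failed: \(error.localizedDescription)")
            }
        })
    }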

Using WebRTC to send an iOS devices’ screen capture using ReplayKit

孤街醉人 submitted on 2019-11-29 11:39:41
Question: We would like to use WebRTC to send an iOS device's screen capture using ReplayKit. ReplayKit has a processSampleBuffer callback which delivers a CMSampleBuffer, but this is where we are stuck: we can't seem to get the CMSampleBuffer sent to the connected peer. We have tried creating a pixelBuffer from the sampleBuffer and then creating an RTCVideoFrame. We also extracted the RTCVideoSource from RTCPeerConnectionFactory, then used an RTCVideoCapturer and streamed it to the localVideoSource. Any idea what we are doing wrong?

    var peerConnectionFactory: RTCPeerConnectionFactory?
    override func …
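
One approach that is often suggested, sketched here under the assumption of the Google WebRTC Objective-C framework (ScreenShareSource and its properties are hypothetical names), is to wrap the buffer's CVPixelBuffer in an RTCCVPixelBuffer, build an RTCVideoFrame, and push it into the RTCVideoSource through its RTCVideoCapturerDelegate conformance:

    import CoreMedia
    import WebRTC

    final class ScreenShareSource {
        private let factory = RTCPeerConnectionFactory()
        private(set) lazy var videoSource: RTCVideoSource = factory.videoSource()
        private lazy var videoCapturer = RTCVideoCapturer(delegate: videoSource)

        // Call this from processSampleBuffer(_:with:) for .video buffers.
        func send(_ sampleBuffer: CMSampleBuffer) {
            guard let pixelBuffer = CMSampleBufferGetImageBuffer(sampleBuffer) else { return }
            let rtcPixelBuffer = RTCCVPixelBuffer(pixelBuffer: pixelBuffer)
            let seconds = CMTimeGetSeconds(CMSampleBufferGetPresentationTimeStamp(sampleBuffer))
            let frame = RTCVideoFrame(buffer: rtcPixelBuffer,
                                      rotation: ._0,
                                      timeStampNs: Int64(seconds * 1_000_000_000))
            // RTCVideoSource conforms to RTCVideoCapturerDelegate, so the frame
            // can be handed straight back to it.
            videoSource.capturer(videoCapturer, didCapture: frame)
        }
    }

The videoSource then backs an RTCVideoTrack created from the same factory and added to the peer connection, so the pushed frames reach the remote peer.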

How to forward Screen Capture in iOS 11 Control Center to your App?

99封情书 submitted on 2019-11-29 06:55:36
Question: I saw that TeamViewer allows iOS screen capturing by leveraging the Screen Recorder feature in the iOS 11 Control Center. As seen here: How is that possible? I checked out ReplayKit, but couldn't find any feature that would hook up to Control Center like that. Answer 1: You need to add a Broadcast Upload Extension to your app. (In Xcode, File > New Target, select "Broadcast Upload Extension".) Once the extension is installed (alongside your app), a force-touch on the screen recording icon in Control Center will give the option of using your broadcast extension instead of the default ( …
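
For reference, the Broadcast Upload Extension target that Xcode generates is built around an RPBroadcastSampleHandler subclass. A trimmed sketch follows; the comments only mark where your own upload code would go, and nothing here is a real upload API.

    import ReplayKit

    class SampleHandler: RPBroadcastSampleHandler {

        override func broadcastStarted(withSetupInfo setupInfo: [String: NSObject]?) {
            // Called when the user starts the broadcast from Control Center.
        }

        override func processSampleBuffer(_ sampleBuffer: CMSampleBuffer,
                                          with sampleBufferType: RPSampleBufferType) {
            switch sampleBufferType {
            case .video:
                // Forward the video buffer to your uploader here.
                break
            case .audioApp, .audioMic:
                break
            @unknown default:
                break
            }
        }

        override func broadcastFinished() {
            // Flush any pending data and clean up.
        }
    }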

Screen recording when my iOS app is in background with ReplayKit

人盡茶涼 submitted on 2019-11-27 07:26:49
Question: I have tried a Broadcast Extension; I added that extension via a target. But I don't know how to record when my app is in the background. I am trying to record in two ways. Via Control Center: I can see my app's target name; after selecting that name I start recording by clicking Start Broadcast and then clicking Stop Record, but my video does not get stored either in the Camera Roll or in my app. Ref link: ReplayKit's RPSystemBroadcastPickerView not showing preferredExtension. Via my app: Once …
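
For the "via my app" route, the system broadcast picker can be embedded with RPSystemBroadcastPickerView (iOS 12+). A sketch follows; the bundle identifier string is a hypothetical placeholder for your own extension's identifier.

    import ReplayKit
    import UIKit

    func addBroadcastPicker(to view: UIView) {
        let picker = RPSystemBroadcastPickerView(frame: CGRect(x: 0, y: 0, width: 60, height: 60))
        // Hypothetical identifier; set this to your Broadcast Upload Extension's
        // bundle identifier so the picker pre-selects it.
        picker.preferredExtension = "com.example.MyApp.BroadcastExtension"
        picker.showsMicrophoneButton = true
        view.addSubview(picker)
    }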