Question
We want to share the screen (screenshots) from an iPad to a browser. At the moment we take screenshots and send them over a WebRTC DataChannel, but that requires too much bandwidth.
Sending 5 frames per second, fully compressed and scaled down, still requires about 1.5-2 MB/s of upload speed.
We need to use some form of video encoding so we can lower the bandwidth requirements and let WebRTC handle flow control depending on connection speed.
AVAssetWriter takes images and converts them to a .mov file, but it doesn't let us get a stream out of it.
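One route worth noting (not from the original post, purely a sketch) is the hardware H.264 encoder exposed by VideoToolbox on iOS 8+: feed it the screenshot pixel buffers and receive compressed sample buffers in a callback, which can then be packetized and sent over the DataChannel. A minimal sketch, where the bitrate and the callback wiring are assumptions:

```objectivec
#import <VideoToolbox/VideoToolbox.h>

// Called by VideoToolbox with each compressed H.264 frame.
static void didCompressFrame(void *refCon, void *frameRefCon,
                             OSStatus status, VTEncodeInfoFlags flags,
                             CMSampleBufferRef sampleBuffer) {
    if (status != noErr || sampleBuffer == NULL) return;
    // Extract the NAL units from sampleBuffer here and hand them to the
    // DataChannel sender (packetization omitted in this sketch).
}

static VTCompressionSessionRef makeSession(int32_t width, int32_t height) {
    VTCompressionSessionRef session = NULL;
    VTCompressionSessionCreate(kCFAllocatorDefault, width, height,
                               kCMVideoCodecType_H264,
                               NULL, NULL, NULL,
                               didCompressFrame, NULL, &session);
    if (session == NULL) return NULL;
    // Real-time mode with a capped bitrate (500 kbit/s is an assumption).
    VTSessionSetProperty(session, kVTCompressionPropertyKey_RealTime,
                         kCFBooleanTrue);
    VTSessionSetProperty(session, kVTCompressionPropertyKey_AverageBitRate,
                         (__bridge CFTypeRef)@(500 * 1000));
    VTCompressionSessionPrepareToEncodeFrames(session);
    return session;
}

// Per screenshot: encode a CVPixelBufferRef instead of a JPEG.
static void encodeFrame(VTCompressionSessionRef session,
                        CVPixelBufferRef pixelBuffer, CMTime pts) {
    VTCompressionSessionEncodeFrame(session, pixelBuffer, pts,
                                    kCMTimeInvalid, NULL, NULL, NULL);
}
```

Because successive screenshots share most of their content, a delta-coded H.264 stream at 5 fps should land far below the 1.5-2 MB/s cost of independent JPEGs.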
Any ideas for us? Pretty stuck at the moment, all ideas appreciated.
Thanks for suggesting that this is a duplicate, but that doesn't help me much. I already have a working solution; it's just not good enough.
Edit:
UIGraphicsBeginImageContextWithOptions(view.frame.size, NO, 0.7); // Scaling is slow, but that's not the problem; the network is
[view drawViewHierarchyInRect:view.bounds afterScreenUpdates:NO];
UIImage *image = UIGraphicsGetImageFromCurrentImageContext();
UIGraphicsEndImageContext();
NSData *data = UIImageJPEGRepresentation(image, 0.0); // Compress a lot: 0.0 is maximum compression, 1.0 is minimum
NSString *base64Content = [data base64EncodedStringWithOptions:NSDataBase64EncodingEndLineWithLineFeed];
And then I send that base64 data over a WebRTC DataChannel in 16 KB chunks, as suggested by the docs.
dc.send(...)
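For the chunked send itself, sending binary directly rather than base64 (which adds roughly 33% overhead) is worth considering. A minimal Objective-C sketch of the 16 KB split; `sendChunk` is a hypothetical block that would wrap the actual DataChannel send call:

```objectivec
#import <Foundation/Foundation.h>

static const NSUInteger kChunkSize = 16 * 1024; // 16 KB, per the WebRTC docs

// Split an encoded frame into <=16 KB chunks and hand each to the sender.
// sendChunk is hypothetical; it would wrap the DataChannel's send method.
static void sendInChunks(NSData *payload, void (^sendChunk)(NSData *chunk)) {
    for (NSUInteger offset = 0; offset < payload.length; offset += kChunkSize) {
        NSUInteger len = MIN(kChunkSize, payload.length - offset);
        sendChunk([payload subdataWithRange:NSMakeRange(offset, len)]);
    }
}
```

The receiver then has to reassemble the chunks, so each frame needs some framing (e.g. a length prefix) before splitting.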
Answer 1:
I would compress the screenshots with a JavaScript encoder, e.g. MPEG, and then transcode that stream server-side to VP8 for WebRTC.
However, it may not work properly on older iOS devices (e.g. iPads from 2010-2011) due to low CPU resources; even if you can encode the stream, it may be choppy and not suitable for smooth playback.
Source: https://stackoverflow.com/questions/37346884/streaming-screenshots-over-webrtc-as-a-video-stream-from-ios