Playing a stream of video data using QTKit on Mac OS X


Question


I've been playing with QTKit for a couple of days and I can successfully record video data from the camera to a file using a QTCaptureSession, a QTCaptureDeviceInput, and so on.
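For reference, that kind of capture setup looks roughly like this (a sketch only; the device choice, variable names, and error handling are illustrative, not the actual code):

```objc
#import <QTKit/QTKit.h>

// Build a capture session fed by the default video camera.
QTCaptureSession *session = [[QTCaptureSession alloc] init];

QTCaptureDevice *camera =
    [QTCaptureDevice defaultInputDeviceWithMediaType:QTMediaTypeVideo];
NSError *error = nil;
if (![camera open:&error]) {
    NSLog(@"Could not open camera: %@", error);
}

QTCaptureDeviceInput *cameraInput =
    [[QTCaptureDeviceInput alloc] initWithDevice:camera];
if (![session addInput:cameraInput error:&error]) {
    NSLog(@"Could not add camera input: %@", error);
}

[session startRunning];
```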

However, what I want to do is send the data to another location, either over the network or to a different object within the same app (it doesn't matter), and then play the video data as if it were a stream.

I have a QTCaptureMovieFileOutput, and I am passing nil as the file URL so that it doesn't actually record the data to a file (I'm only interested in the data contained in the QTSampleBuffer that is available via the delegate callback).
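The output configuration described there would be along these lines (a sketch, assuming `session` is the capture session from above and `self` implements the QTCaptureFileOutput delegate methods):

```objc
// A movie file output whose delegate receives the compressed sample
// buffers; recording to a nil URL means nothing is written to disk.
QTCaptureMovieFileOutput *movieOutput = [[QTCaptureMovieFileOutput alloc] init];
[movieOutput setDelegate:self];

NSError *error = nil;
if (![session addOutput:movieOutput error:&error]) {
    NSLog(@"Could not add movie output: %@", error);
}

// No file URL: the data only shows up in the delegate callback.
[movieOutput recordToOutputFileURL:nil];
```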

I have set a QTCompressionOptions object on the output specifying H264 Video and High Quality AAC Audio compression.
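Setting those options per connection typically looks like this; the identifier strings here are stock QTKit presets, so treat the exact names as an assumption if a different preset is wanted:

```objc
// Apply H.264 video and high-quality AAC audio compression to the
// output's video and audio connections respectively.
NSEnumerator *connectionEnumerator =
    [[movieOutput connections] objectEnumerator];
QTCaptureConnection *connection;
while ((connection = [connectionEnumerator nextObject])) {
    NSString *mediaType = [connection mediaType];
    QTCompressionOptions *options = nil;

    if ([mediaType isEqualToString:QTMediaTypeVideo]) {
        options = [QTCompressionOptions compressionOptionsWithIdentifier:
                      @"QTCompressionOptions240SizeH264Video"];
    } else if ([mediaType isEqualToString:QTMediaTypeSound]) {
        options = [QTCompressionOptions compressionOptionsWithIdentifier:
                      @"QTCompressionOptionsHighQualityAACAudio"];
    }

    if (options) {
        [movieOutput setCompressionOptions:options forConnection:connection];
    }
}
```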

Each time I receive a callback, I append the data from the sample buffer to an NSMutableData object I keep as an instance variable.
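That callback amounts to something like this (the `movieData` ivar name is illustrative):

```objc
// Delegate callback from the movie file output: append the raw bytes
// of each sample buffer to an NSMutableData instance variable.
- (void)captureOutput:(QTCaptureFileOutput *)captureOutput
    didOutputSampleBuffer:(QTSampleBuffer *)sampleBuffer
           fromConnection:(QTCaptureConnection *)connection
{
    [movieData appendBytes:[sampleBuffer bytesForAllSamples]
                    length:[sampleBuffer lengthForAllSamples]];
}
```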

The problem I have is that no 'player' object in the QTKit framework seems capable of playing a 'stream' of video data. Am I correct in this assumption?

I tried creating a QTMovie object (to play in a QTMovieView) from my data instance variable, but I get an error saying that the data is not a movie.
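That failing attempt presumably looked something like the following; note that QTMovie's data initializers expect the contents of a complete movie file (container and all), which a bare concatenation of sample buffers is not (the `movieData` and `movieView` names are illustrative):

```objc
// Trying to play the accumulated data directly.
NSError *error = nil;
QTMovie *movie = [QTMovie movieWithData:movieData error:&error];
if (movie) {
    [movieView setMovie:movie];   // movieView: a hypothetical QTMovieView outlet
} else {
    NSLog(@"Not a movie: %@", error);
}
```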

Am I approaching this issue from the wrong angle?

Previously I was using a QTCaptureVideoPreviewOutput, which passes a CVImageBufferRef for each video frame. I was converting these frames into NSImages to display in a view.
While this gave the impression of streaming, it was slow and processor-hungry.
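For comparison, that earlier per-frame approach is roughly the following; converting every frame to an NSImage on the CPU is what makes it expensive (names are illustrative):

```objc
// Delegate callback from a QTCaptureVideoPreviewOutput (or a
// QTCaptureDecompressedVideoOutput): wrap each CVImageBufferRef in a
// CIImage, then an NSImage, and push it to an image view.
- (void)captureOutput:(QTCaptureOutput *)captureOutput
  didOutputVideoFrame:(CVImageBufferRef)videoFrame
     withSampleBuffer:(QTSampleBuffer *)sampleBuffer
       fromConnection:(QTCaptureConnection *)connection
{
    CIImage *ciImage = [CIImage imageWithCVImageBuffer:videoFrame];
    NSCIImageRep *rep = [NSCIImageRep imageRepWithCIImage:ciImage];
    NSImage *image = [[[NSImage alloc] initWithSize:[rep size]] autorelease];
    [image addRepresentation:rep];

    // imageView is a hypothetical NSImageView; update it on the main thread.
    [imageView performSelectorOnMainThread:@selector(setImage:)
                                withObject:image
                             waitUntilDone:NO];
}
```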

How have other people conquered the streaming video problem?


Answer 1:


Seems to me like you'd need to make a GL texture and then load the data into it on a per-frame basis. Everything about QTMovie seems to be based on pre-existing files, as far as my little mind can tell.
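One way to act on that suggestion is Core Video's OpenGL texture cache, which uploads each captured CVImageBufferRef to a GL texture without a round trip through NSImage. A sketch only; the context handling and function names are illustrative, not the answerer's code:

```objc
#import <Cocoa/Cocoa.h>
#import <CoreVideo/CoreVideo.h>
#import <OpenGL/gl.h>

static CVOpenGLTextureCacheRef textureCache = NULL;

// Create the texture cache once, tied to the view's OpenGL context.
static void CreateTextureCache(NSOpenGLContext *glContext,
                               NSOpenGLPixelFormat *pixelFormat)
{
    CVOpenGLTextureCacheCreate(kCFAllocatorDefault, NULL,
                               (CGLContextObj)[glContext CGLContextObj],
                               (CGLPixelFormatObj)[pixelFormat CGLPixelFormatObj],
                               NULL, &textureCache);
}

// Per frame: wrap the CVImageBufferRef in a GL texture and draw it.
static void DrawFrame(CVImageBufferRef frame)
{
    CVOpenGLTextureRef texture = NULL;
    CVOpenGLTextureCacheCreateTextureFromImage(kCFAllocatorDefault, textureCache,
                                               frame, NULL, &texture);

    GLenum target = CVOpenGLTextureGetTarget(texture);
    glEnable(target);
    glBindTexture(target, CVOpenGLTextureGetName(texture));
    // ... draw a textured quad covering the view here ...

    CFRelease(texture);
    CVOpenGLTextureCacheFlush(textureCache, 0);
}
```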



Source: https://stackoverflow.com/questions/3854505/playing-a-stream-of-video-data-using-qtkit-on-mac-os-x
