qtkit

How can I obtain raw data from a CVImageBuffer object

Submitted by 血红的双手 on 2020-01-24 05:00:28
Question: I'm trying to use Cocoa to grab images from a webcam. I'm able to get the image in RGBA format using QTKit and the didOutputVideoFrame delegate call, converting the CVImageBuffer to a CIImage and then to an NSBitmapImageRep. I know my camera natively captures in YUV; what I want is to get the YUV data directly from the CVImageBuffer and process the YUV frame before displaying it. My question is: how can I get the YUV data from the CVImageBuffer? Thanks. Answer 1: You might be able to create
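The answer above is cut off mid-sentence. As a hedged illustration of one way to reach the raw bytes (a sketch, not the original answer's code): lock the CVImageBuffer as a CVPixelBuffer and read its base address. This assumes the capture output has been asked for a YUV pixel format such as kCVPixelFormatType_422YpCbCr8 ('2vuy'), for example via QTCaptureDecompressedVideoOutput's setPixelBufferAttributes:; otherwise you get whatever format the camera and driver negotiated.

```objc
// Sketch: inside captureOutput:didOutputVideoFrame:withSampleBuffer:fromConnection:
// imageBuffer is the CVImageBufferRef handed to the delegate.
CVPixelBufferRef pixelBuffer = (CVPixelBufferRef)imageBuffer;
CVPixelBufferLockBaseAddress(pixelBuffer, 0);

uint8_t *baseAddress = CVPixelBufferGetBaseAddress(pixelBuffer);
size_t bytesPerRow   = CVPixelBufferGetBytesPerRow(pixelBuffer);
size_t width         = CVPixelBufferGetWidth(pixelBuffer);
size_t height        = CVPixelBufferGetHeight(pixelBuffer);

// For '2vuy' each row holds width * 2 bytes of interleaved Cb Y0 Cr Y1.
for (size_t row = 0; row < height; row++) {
    uint8_t *rowStart = baseAddress + row * bytesPerRow;
    // ... process the YUV bytes for this row here ...
}

CVPixelBufferUnlockBaseAddress(pixelBuffer, 0);
```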

QTKit strange error

Submitted by 允我心安 on 2019-12-25 08:54:15
Question: Just a simple piece of code (the file 1.mp3 plays fine when opened in iTunes): - (void)applicationDidFinishLaunching:(NSNotification *)aNotification { NSError *outError = nil; QTMovie *newMovie = [QTMovie movieWithURL:[NSURL URLWithString:@"/Users/Alex/1.mp3"] error:&outError]; if (newMovie) { //[newMovie setAttribute:[NSNumber numberWithBool:YES] forKey:QTMovieEditableAttribute]; [self setMovie:newMovie]; } [movie play]; gives me the error Error Domain=NSOSStatusErrorDomain Code=-2000 UserInfo
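The error text is cut off above. One likely culprit, offered as a guess rather than a confirmed diagnosis: URLWithString: expects a full URL including a scheme, so a bare POSIX path like /Users/Alex/1.mp3 yields a URL QuickTime cannot resolve. A minimal sketch of the usual fix, assuming the file really exists at that path:

```objc
NSError *outError = nil;
// Build a file URL from the POSIX path instead of using URLWithString:
// (alternatively, use +[QTMovie movieWithFile:error:]).
QTMovie *newMovie = [QTMovie movieWithURL:[NSURL fileURLWithPath:@"/Users/Alex/1.mp3"]
                                    error:&outError];
if (newMovie) {
    [self setMovie:newMovie];
    [newMovie play];
} else {
    NSLog(@"Could not open movie: %@", outError);
}
```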

Get RTSP stream from live555 and decode with AVFoundation

Submitted by 纵然是瞬间 on 2019-12-24 06:37:45
Question: I need to get video frames from an IP camera using RTSP. To get the RTSP stream I use live555. The problem is that I can't find a way to decode the incoming video frames with AVFoundation (I can't use ffmpeg). Is there a way to use AVFoundation for video decoding? If so, how do I do it? Source: https://stackoverflow.com/questions/17585369/get-rtsp-stream-from-live555-and-decode-with-avfoundation
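No answer is included in this excerpt. As a hedged sketch only: the decoding itself is usually done with VideoToolbox, the lower-level decode API underneath AVFoundation, rather than with AVFoundation classes directly. The H.264 parameter sets and NAL units pulled out of the live555 RTP stream are wrapped in Core Media objects and fed to a VTDecompressionSession. Here spsData, ppsData and naluData are hypothetical NSData objects you would fill from the stream yourself, and VideoToolbox expects NAL units with 4-byte length prefixes (AVCC layout), not Annex-B start codes.

```objc
#import <Foundation/Foundation.h>
#import <VideoToolbox/VideoToolbox.h>

// Called once per decoded frame with a CVPixelBuffer ready for display.
static void DecodeCallback(void *refCon, void *frameRefCon, OSStatus status,
                           VTDecodeInfoFlags flags, CVImageBufferRef imageBuffer,
                           CMTime pts, CMTime duration) {
    if (status == noErr && imageBuffer != NULL) {
        // Hand the decoded CVPixelBuffer to your rendering code here.
    }
}

// spsData/ppsData come from the stream's SPS/PPS NAL units; naluData is one
// coded frame with a 4-byte length prefix (AVCC layout, not Annex-B).
static void DecodeOneNALUnit(NSData *spsData, NSData *ppsData, NSData *naluData) {
    // 1. Build a format description from the parameter sets.
    const uint8_t *paramSets[2] = { spsData.bytes, ppsData.bytes };
    const size_t paramSizes[2]  = { spsData.length, ppsData.length };
    CMVideoFormatDescriptionRef format = NULL;
    CMVideoFormatDescriptionCreateFromH264ParameterSets(kCFAllocatorDefault, 2,
        paramSets, paramSizes, 4, &format);

    // 2. Create a decompression session (in real code, create it once and reuse it).
    VTDecompressionOutputCallbackRecord callback = { DecodeCallback, NULL };
    VTDecompressionSessionRef session = NULL;
    VTDecompressionSessionCreate(kCFAllocatorDefault, format, NULL, NULL,
                                 &callback, &session);

    // 3. Wrap the NAL unit in a CMSampleBuffer and submit it for decoding.
    CMBlockBufferRef block = NULL;
    CMBlockBufferCreateWithMemoryBlock(kCFAllocatorDefault, (void *)naluData.bytes,
        naluData.length, kCFAllocatorNull, NULL, 0, naluData.length, 0, &block);
    CMSampleBufferRef sample = NULL;
    const size_t sampleSizes[1] = { naluData.length };
    CMSampleBufferCreate(kCFAllocatorDefault, block, true, NULL, NULL, format,
                         1, 0, NULL, 1, sampleSizes, &sample);
    VTDecompressionSessionDecodeFrame(session, sample, 0, NULL, NULL);
}
```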

Cocoa QTKit and recording movies

Submitted by 旧时模样 on 2019-12-21 06:39:18
Question: I'm new to QTKit and was looking for some feedback on the following code, which I am attempting to use to display the camera's image and record movies. - (void)initializeMovie { NSLog(@"Hi!"); QTCaptureSession* mainSession = [[QTCaptureSession alloc] init]; QTCaptureDevice* deviceVideo = [QTCaptureDevice defaultInputDeviceWithMediaType:@"QTMediaTypeVideo"]; QTCaptureDevice* deviceAudio = [QTCaptureDevice defaultInputDeviceWithMediaType:@"QTMediaTypeSound"]; NSError* error;
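The code excerpt above is cut off. One point worth noting (an observation, not taken from the original answers): QTMediaTypeVideo and QTMediaTypeSound are QTKit string constants, so passing the literals @"QTMediaTypeVideo" / @"QTMediaTypeSound" will not match any device. A hedged sketch of a minimal preview-and-record setup, assuming a QTCaptureView outlet named captureView and a placeholder output path:

```objc
- (void)initializeMovie {
    NSError *error = nil;
    QTCaptureSession *session = [[QTCaptureSession alloc] init];

    // Use the QTKit constants, not string literals.
    QTCaptureDevice *videoDevice =
        [QTCaptureDevice defaultInputDeviceWithMediaType:QTMediaTypeVideo];
    if (![videoDevice open:&error]) {
        NSLog(@"Could not open video device: %@", error);
        return;
    }

    QTCaptureDeviceInput *videoInput =
        [[QTCaptureDeviceInput alloc] initWithDevice:videoDevice];
    if (![session addInput:videoInput error:&error]) {
        NSLog(@"Could not add video input: %@", error);
        return;
    }

    // Live preview in a QTCaptureView (assumed IBOutlet).
    [captureView setCaptureSession:session];

    // Record to a movie file (path is a placeholder).
    QTCaptureMovieFileOutput *fileOutput = [[QTCaptureMovieFileOutput alloc] init];
    [session addOutput:fileOutput error:&error];
    [fileOutput recordToOutputFileURL:[NSURL fileURLWithPath:@"/tmp/capture.mov"]];

    [session startRunning];
}
```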

QTMovieCurrentSizeAttribute and QTMovieSizeDidChangeNotification replacements

Submitted by 纵饮孤独 on 2019-12-13 13:12:58
Question: Does anyone know the correct way to replace the old QTMovieCurrentSizeAttribute and QTMovieSizeDidChangeNotification tasks? I'm trying to clean out old deprecated code. I've found that QTMovieNaturalSizeDidChangeNotification is not a replacement for QTMovieSizeDidChangeNotification. Likewise, QTMovieNaturalSizeAttribute is not a replacement for QTMovieCurrentSizeAttribute. Natural size refers to the QTMovie's native resolution, while current size refers to the resolution at which a QTMovie is
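No answer is included in this excerpt. For reference, a hedged sketch of the non-deprecated pieces that do exist: reading the natural size and observing its change notification, plus asking the view for the size at which the movie is actually being drawn (movieView is an assumed QTMovieView outlet). This only illustrates the distinction the question draws; it is not a drop-in replacement for the current-size APIs.

```objc
// Native (encoded) resolution of the movie:
NSSize naturalSize =
    [[movie attributeForKey:QTMovieNaturalSizeAttribute] sizeValue];

// Observe changes to the natural size:
[[NSNotificationCenter defaultCenter]
    addObserver:self
       selector:@selector(naturalSizeDidChange:)
           name:QTMovieNaturalSizeDidChangeNotification
         object:movie];

// The size the movie is currently displayed at has to come from whatever is
// drawing it, e.g. the QTMovieView:
NSSize displayedSize = [movieView movieBounds].size;
NSLog(@"natural %@ vs displayed %@",
      NSStringFromSize(naturalSize), NSStringFromSize(displayedSize));
```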

Mirroring CIImage/NSImage

Submitted by 烈酒焚心 on 2019-12-13 05:14:59
Question: Currently I have the following: CIImage *img = [CIImage imageWithCVImageBuffer: imageBuffer]; NSCIImageRep *imageRep = [NSCIImageRep imageRepWithCIImage:img]; NSImage *image = [[[NSImage alloc] initWithSize: [imageRep size]] autorelease]; [image addRepresentation:imageRep]; This works perfectly; I can use the NSImage, and when written to a file the image is exactly how I need it to be. However, I'm pulling this image from the user's iSight using QTKit, so I need to be able to flip this image
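The question is cut off; per the title it is about mirroring the image. A hedged sketch of one way to do it at the CIImage stage, applying a horizontal flip with imageByApplyingTransform: before building the NSImage:

```objc
CIImage *img = [CIImage imageWithCVImageBuffer:imageBuffer];

// Mirror horizontally: flip across the vertical axis, then translate back
// into positive coordinates so the origin stays at (0, 0).
CGRect extent = [img extent];
CGAffineTransform flip = CGAffineTransformMakeScale(-1.0, 1.0);
flip = CGAffineTransformTranslate(flip, -extent.size.width, 0);
CIImage *mirrored = [img imageByApplyingTransform:flip];

NSCIImageRep *imageRep = [NSCIImageRep imageRepWithCIImage:mirrored];
NSImage *image = [[[NSImage alloc] initWithSize:[imageRep size]] autorelease];
[image addRepresentation:imageRep];
```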

How can I capture iSight frames with Python in Snow Leopard?

Submitted by 空扰寡人 on 2019-12-12 08:09:11
Question: I have the following PyObjC script: from Foundation import NSObject import QTKit error = None capture_session = QTKit.QTCaptureSession.alloc().init() print 'capture_session', capture_session device = QTKit.QTCaptureDevice.defaultInputDeviceWithMediaType_(QTKit.QTMediaTypeVideo) print 'device', device, type(device) success = device.open_(error) print 'device open success', success, error if not success: raise Exception(error) capture_device_input = QTKit.QTCaptureDeviceInput.alloc()

Can't call methods on objects in PyObjC

Submitted by 爷，独闯天下 on 2019-12-11 10:21:42
Question: When I call setDelegate_ within my PyObjC code I get an AttributeError: 'tuple' object has no attribute 'setDelegate_'. My code looks like this: def createMovie(self): attribs = NSMutableDictionary.dictionary() attribs['QTMovieFileNameAttribute'] = '<My Filename>' movie = QTMovie.alloc().initWithAttributes_error_(attribs, objc.nil) movie.setDelegate_(self) Edit: I found out that I can't use any instance methods with the movie object. Answer 1: From your comment, it looks like QTMovie.alloc()
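The answer above is cut off. A likely explanation, offered as a guess consistent with the error: PyObjC bridges Objective-C methods whose last parameter is an NSError ** out-parameter into Python methods that return a (result, error) tuple, so initWithAttributes_error_ hands back (movie, error) rather than a QTMovie, and the tuple has no setDelegate_. For comparison, the underlying Objective-C call looks like this (a sketch; the filename placeholder is kept from the question):

```objc
NSMutableDictionary *attribs = [NSMutableDictionary dictionary];
[attribs setObject:@"<My Filename>" forKey:QTMovieFileNameAttribute];

NSError *error = nil;
// The error comes back through the out-parameter; PyObjC turns this pattern
// into a tuple return, so the Python call would be unpacked as:
//     movie, error = QTMovie.alloc().initWithAttributes_error_(attribs, None)
QTMovie *movie = [[QTMovie alloc] initWithAttributes:attribs error:&error];
[movie setDelegate:self];
```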

Playing a stream of video data using QTKit on Mac OS X

Submitted by 扶醉桌前 on 2019-12-11 06:34:03
Question: I've been playing with QTKit for a couple of days and I'm successfully able to record video data to a file from the camera using a QTCaptureSession, a QTCaptureDeviceInput, etc. However, what I want to do is send the data to another location, either over the network or to a different object within the same app (it doesn't matter), and then play the video data as if it were a stream. I have a QTCaptureMovieFileOutput and I am passing nil as the file URL so that it doesn't actually record the
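The question is cut off above, and no answer is included. One common alternative (a hedged sketch, not the original poster's approach): instead of QTCaptureMovieFileOutput, attach a QTCaptureDecompressedVideoOutput and handle each frame in its delegate callback, then compress/packetize it and pass it to the network or to another object; sendFrame: is a hypothetical method standing in for that step.

```objc
// Setup: captureSession is the existing, already-configured QTCaptureSession.
- (void)attachFrameOutput {
    NSError *error = nil;
    QTCaptureDecompressedVideoOutput *frameOutput =
        [[QTCaptureDecompressedVideoOutput alloc] init];
    [frameOutput setDelegate:self];
    if (![captureSession addOutput:frameOutput error:&error]) {
        NSLog(@"Could not add frame output: %@", error);
    }
}

// Delegate callback, invoked once per captured frame.
- (void)captureOutput:(QTCaptureOutput *)captureOutput
  didOutputVideoFrame:(CVImageBufferRef)videoFrame
     withSampleBuffer:(QTSampleBuffer *)sampleBuffer
       fromConnection:(QTCaptureConnection *)connection
{
    // Compress/packetize videoFrame and hand it to the network layer or to
    // another object in the app.
    [self sendFrame:videoFrame];
}
```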

Using QTKit for recording audio

Submitted by 倖福魔咒の on 2019-12-10 11:06:27
Question: It looks like using Core Audio to record audio is overly complicated, while QTKit is basic and down to earth. However, all of the examples I have seen integrate video and audio together. Does someone have, or know of, an example of using QTKit for recording audio? Answer 1: Here is an example of using QTKit for recording audio. Answer 2: In order to capture audio only, you have to either select the default device that supports sound, or disable the video connections on muxed devices. // Get the default sound
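Answer 2 is cut off at the start of its code. A hedged sketch along the lines it begins to describe (the variable names and output path are assumptions, not the original answer's code):

```objc
- (void)startAudioRecording {
    NSError *error = nil;
    QTCaptureSession *session = [[QTCaptureSession alloc] init];

    // Get the default sound device (e.g. the built-in microphone) and open it.
    QTCaptureDevice *audioDevice =
        [QTCaptureDevice defaultInputDeviceWithMediaType:QTMediaTypeSound];
    if (![audioDevice open:&error]) {
        NSLog(@"Could not open audio device: %@", error);
        return;
    }

    QTCaptureDeviceInput *audioInput =
        [[QTCaptureDeviceInput alloc] initWithDevice:audioDevice];
    [session addInput:audioInput error:&error];

    QTCaptureMovieFileOutput *fileOutput = [[QTCaptureMovieFileOutput alloc] init];
    [session addOutput:fileOutput error:&error];

    // On muxed devices, disable any video connections so only audio is written.
    for (QTCaptureConnection *connection in [fileOutput connections]) {
        if ([[connection mediaType] isEqualToString:QTMediaTypeVideo])
            [connection setEnabled:NO];
    }

    [session startRunning];
    [fileOutput recordToOutputFileURL:
        [NSURL fileURLWithPath:@"/tmp/audio-only.mov"]];  // placeholder path
}
```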