AVAssetWriter

Adding filters to video with AVFoundation (OSX) - how do I write the resulting image back to AVWriter?

假装没事ソ submitted on 2020-01-10 02:07:16
Question: Setting the scene: I am working on a video processing app that runs from the command line to read in, process, and then export video. I'm working with four tracks: lots of clips that I append into a single track to make one video (let's call this the ugcVideoComposition); clips with alpha, positioned on a second track, which layer instructions composite on export so they play back over the top of the ugcVideoComposition; a music audio track; and an audio track for the …
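The title asks how to write a filtered frame back out through the asset writer. A minimal sketch of that step, assuming a CIContext, an AVAssetWriterInputPixelBufferAdaptor attached to the video input, and a filtered CIImage are already in hand (the function and parameter names below are illustrative, not from the question):

import AVFoundation
import CoreImage

// Render the filtered CIImage into a pixel buffer from the adaptor's pool and
// append it at the frame's presentation time.
func write(filtered image: CIImage,
           at time: CMTime,
           with adaptor: AVAssetWriterInputPixelBufferAdaptor,
           context: CIContext) {
    guard let pool = adaptor.pixelBufferPool else { return }

    var pixelBuffer: CVPixelBuffer?
    CVPixelBufferPoolCreatePixelBuffer(kCFAllocatorDefault, pool, &pixelBuffer)
    guard let buffer = pixelBuffer else { return }

    // Draw the Core Image result into the buffer the writer will consume.
    context.render(image, to: buffer)

    if adaptor.assetWriterInput.isReadyForMoreMediaData {
        adaptor.append(buffer, withPresentationTime: time)
    }
}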

How to draw text into CIImage?

牧云@^-^@ submitted on 2020-01-06 06:52:13
问题 How to draw into CIImage (or maybe into CVPixelBuffer , but I guess it easier to add text to CIImage )? not to UIImage I record video (.mp4 file) using AVAssetWriter and CMSampleBuffer (from video, audio inputs). While recording I want to add text on the video frames, I'm already converting CMSampleBuffer to CIImage , now I need somehow add text to CIImage and convert back to CVPixelBuffer . I didn't really find any simple examples in Swift how to add (draw) text to CIImage or add anything
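One hedged way to do this (a sketch assuming iOS 11+, where the CIAttributedTextImageGenerator filter is available; the helper name is illustrative): render the string into its own CIImage and composite it over the frame.

import CoreImage
import UIKit

// Overlays `text` onto a video frame represented as a CIImage.
func overlay(text: String, on frame: CIImage) -> CIImage {
    let attributed = NSAttributedString(
        string: text,
        attributes: [.font: UIFont.boldSystemFont(ofSize: 36),
                     .foregroundColor: UIColor.white])

    // CIAttributedTextImageGenerator renders an attributed string into a CIImage.
    guard let generator = CIFilter(name: "CIAttributedTextImageGenerator") else { return frame }
    generator.setValue(attributed, forKey: "inputText")
    generator.setValue(2.0, forKey: "inputScaleFactor")
    guard let textImage = generator.outputImage else { return frame }

    // Nudge the text away from the corner, then composite it over the frame.
    let positioned = textImage.transformed(by: CGAffineTransform(translationX: 40, y: 40))
    return positioned.composited(over: frame)
}

The result can then be rendered back into a CVPixelBuffer with CIContext's render(_:to:), as in the first sketch above.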

AVAssetWriter failing to encode video with AVVideoProfileLevelKey

こ雲淡風輕ζ submitted on 2020-01-03 20:05:56
Question: Can anyone help me out with this problem? I am able to encode a video using AVAssetWriter with the following output settings: NSDictionary *videoCompressionSettings = [NSDictionary dictionaryWithObjectsAndKeys: AVVideoCodecH264, AVVideoCodecKey, [NSNumber numberWithInteger:dimensions.width], AVVideoWidthKey, [NSNumber numberWithInteger:dimensions.height], AVVideoHeightKey, [NSDictionary dictionaryWithObjectsAndKeys: [NSNumber numberWithInteger:bitsPerSecond], AVVideoAverageBitRateKey, [NSNumber …
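For comparison, a minimal Swift sketch (with illustrative width, height, and bitrate values) of settings where AVVideoProfileLevelKey sits inside AVVideoCompressionPropertiesKey; a profile/level that cannot accommodate the requested dimensions or bitrate is a common reason the writer fails once encoding starts.

import AVFoundation

// Illustrative dimensions and bitrate; the key point is that
// AVVideoProfileLevelKey lives inside AVVideoCompressionPropertiesKey and must
// be compatible with the requested width, height, and bitrate.
let compressionProperties: [String: Any] = [
    AVVideoAverageBitRateKey: 2_000_000,
    AVVideoProfileLevelKey: AVVideoProfileLevelH264HighAutoLevel,
    AVVideoMaxKeyFrameIntervalKey: 30
]

let outputSettings: [String: Any] = [
    AVVideoCodecKey: AVVideoCodecType.h264,
    AVVideoWidthKey: 1920,
    AVVideoHeightKey: 1080,
    AVVideoCompressionPropertiesKey: compressionProperties
]

let videoInput = AVAssetWriterInput(mediaType: .video, outputSettings: outputSettings)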

Record video with AVAssetWriter: first frames are black

蹲街弑〆低调 submitted on 2020-01-03 13:37:23
Question: I am recording video (the user can also switch to audio only) with AVAssetWriter. I start the recording when the app is launched, but the first frames are black (or very dark). This also happens when I switch from audio to video. It feels like the AVAssetWriter and/or AVAssetWriterInput are not yet ready to record. How can I avoid this? I don't know if this is useful info, but I also use a GLKView to display the video. func start_new_record(){ do{ try self.file_writer=AVAssetWriter(url: …
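A common cause (an assumption here, since the posted code is cut off) is starting the writer's session at time zero before the first real frame arrives, which leaves an empty or dark stretch at the head of the file. A sketch of starting the session at the first buffer's timestamp instead:

import AVFoundation

// Assumes writer.startWriting() has already been called and `input` is attached
// to the writer. The session starts at the timestamp of the first real sample
// buffer, so no empty time range precedes the visible frames.
var sessionStarted = false

func append(_ sampleBuffer: CMSampleBuffer,
            to writer: AVAssetWriter,
            input: AVAssetWriterInput) {
    let pts = CMSampleBufferGetPresentationTimeStamp(sampleBuffer)

    if !sessionStarted {
        writer.startSession(atSourceTime: pts)
        sessionStarted = true
    }

    if input.isReadyForMoreMediaData {
        input.append(sampleBuffer)
    }
}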

AVAssetWriter corrupting the video trimmed by AVAssetExportSession

十年热恋 submitted on 2020-01-03 04:45:13
Question: I am trying to trim and then compress a video file. For trimming I am using AVAssetExportSession; for compression I am using AVAssetWriter. If I use either piece of code on its own, everything works fine, but if I trim and then feed the trimmed output into the compression step, I get a compressed but corrupt video. - (void)viewDidLoad { [super viewDidLoad]; [self trimVideo]; } Trimming code: -(void)trimVideo { AVAsset *anAsset = [[AVURLAsset alloc]initWithURL:[self.asset valueForProperty:ALAssetPropertyAssetURL] …
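Without the full code this is only an assumption, but a frequent cause of this symptom is kicking off the AVAssetWriter pass before AVAssetExportSession has finished writing the trimmed file. A sketch of sequencing the two steps (compressVideo(at:) stands in for the writer pass and is hypothetical):

import AVFoundation

// Hypothetical stand-in for the AVAssetWriter compression pass.
func compressVideo(at url: URL) {
    // ... AVAssetReader / AVAssetWriter re-encode would go here ...
}

// Trim first, and only start compressing once the export session reports
// .completed; reading the trimmed file earlier can yield a corrupt result.
func trimThenCompress(asset: AVAsset, timeRange: CMTimeRange, trimmedURL: URL) {
    guard let export = AVAssetExportSession(asset: asset,
                                            presetName: AVAssetExportPresetPassthrough) else { return }
    export.outputURL = trimmedURL
    export.outputFileType = .mov
    export.timeRange = timeRange

    export.exportAsynchronously {
        guard export.status == .completed else {
            print("Trim failed:", export.error?.localizedDescription ?? "unknown error")
            return
        }
        compressVideo(at: trimmedURL)
    }
}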

iPhone image to video issue in video speed

99封情书 submitted on 2020-01-02 22:59:22
Question: I have done image-to-video conversion on iPhone (of course I got the code from Stack Overflow questions). But the problem is that the recorded video plays far too fast: it is over within 2 seconds even though I have around 2250 frames. I know the problem is with the frame rate, but I don't know how to make it correct. My code is below: NSArray *paths = NSSearchPathForDirectoriesInDomains(NSDocumentDirectory, NSUserDomainMask, YES); NSString *documentsDirectoryPath = [paths objectAtIndex:0]; NSString …
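The excerpt cuts off before the timing code, so this is a sketch of the usual fix rather than a diagnosis of the posted code: each appended frame needs a presentation time derived from its index and the target frame rate, otherwise all frames collapse into a very short duration.

import AVFoundation

// Target frame rate; 2250 frames at 30 fps yields 75 seconds of video.
let fps: Int32 = 30

// Presentation time for frame `index`: frame 0 -> 0 s, frame 1 -> 1/30 s, ...
func presentationTime(forFrame index: Int) -> CMTime {
    return CMTime(value: Int64(index), timescale: fps)
}

// Inside the writing loop, each image is appended with its own timestamp:
//   adaptor.append(pixelBuffer, withPresentationTime: presentationTime(forFrame: i))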

How to set expected framerate to AVAssetWriterInput

戏子无情 submitted on 2020-01-02 07:41:30
Question: I have an app that encodes videos in different ways and saves them to the Photos library: it can cut a specific time range, add pictures, text, etc. Everything works perfectly until I try to encode 120+ fps video. The problem is that the video comes out slow-motion, which is not a goal I'm pursuing at all. I found out about a property for AVAssetWriterInput called AVVideoExpectedSourceFrameRateKey, but the problem is that when I try to apply this parameter to my AVAssetWriterInput, …
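The excerpt ends before showing where the key is applied, so the following is an assumption about the likely issue: AVVideoExpectedSourceFrameRateKey is accepted inside the AVVideoCompressionPropertiesKey dictionary of the input's output settings, not as a top-level key. A sketch with illustrative dimensions and bitrate:

import AVFoundation

// AVVideoExpectedSourceFrameRateKey goes inside AVVideoCompressionPropertiesKey;
// width, height, and bitrate here are illustrative.
let highFrameRateSettings: [String: Any] = [
    AVVideoCodecKey: AVVideoCodecType.h264,
    AVVideoWidthKey: 1920,
    AVVideoHeightKey: 1080,
    AVVideoCompressionPropertiesKey: [
        AVVideoExpectedSourceFrameRateKey: 120,
        AVVideoAverageBitRateKey: 10_000_000
    ]
]

let input = AVAssetWriterInput(mediaType: .video, outputSettings: highFrameRateSettings)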

Output Video Size Huge Using HEVC Encoder on iOS

大憨熊 submitted on 2020-01-02 07:30:12
Question: I have a project that currently uses the H.264 encoder to record video on iOS. I wanted to try the new HEVC encoder in iOS 11 to reduce file sizes, but have found that using the HEVC encoder causes file sizes to balloon enormously. Here's a project on GitHub that shows the issue: it simultaneously writes frames from the camera to files using the H.264 and H.265 (HEVC) encoders, and the resulting file sizes are printed to the console. The AVFoundation classes are set up like this: class …
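The linked project's settings are cut off here, so the following is only a sketch of one thing worth comparing: whether the HEVC input is given explicit compression properties (for example an average bitrate), since an unconstrained encoder configuration can produce much larger files than the tuned H.264 path.

import AVFoundation

// Illustrative HEVC settings with an explicit average bitrate; without
// compression properties the encoder falls back to its own defaults.
let hevcSettings: [String: Any] = [
    AVVideoCodecKey: AVVideoCodecType.hevc,          // iOS 11+
    AVVideoWidthKey: 1920,
    AVVideoHeightKey: 1080,
    AVVideoCompressionPropertiesKey: [
        AVVideoAverageBitRateKey: 4_000_000
    ]
]

let hevcInput = AVAssetWriterInput(mediaType: .video, outputSettings: hevcSettings)

// AVCaptureVideoDataOutput can also suggest settings for a given codec:
//   let suggested = videoDataOutput.recommendedVideoSettings(
//       forVideoCodecType: .hevc, assetWriterOutputFileType: .mov)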

CMSampleBuffer from OpenGL for video output with AVAssetWriter

▼魔方 西西 submitted on 2020-01-01 11:46:28
Question: I need to get a CMSampleBuffer for the OpenGL frame. I'm using this: int s = 1; UIScreen * screen = [UIScreen mainScreen]; if ([screen respondsToSelector:@selector(scale)]){ s = (int)[screen scale]; } const int w = viewController.view.frame.size.width/2; const int h = viewController.view.frame.size.height/2; const NSInteger my_data_length = 4*w*h*s*s; // allocate array and read pixels into it. GLubyte * buffer = malloc(my_data_length); glReadPixels(0, 0, w*s, h*s, GL_RGBA, GL_UNSIGNED_BYTE, …
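A sketch of the next step, assuming the glReadPixels output is already in memory: the raw bytes can be wrapped in a CVPixelBuffer and handed to an AVAssetWriterInputPixelBufferAdaptor, which avoids building a CMSampleBuffer by hand (function and parameter names here are illustrative):

import CoreVideo

// Wraps width x height RGBA bytes (e.g. the glReadPixels output) in a
// CVPixelBuffer that an AVAssetWriterInputPixelBufferAdaptor can append.
// Assumes the bytes match the chosen pixel format; plain GL_RGBA data would
// need its channels swapped for kCVPixelFormatType_32BGRA.
func pixelBuffer(from bytes: UnsafeMutableRawPointer,
                 width: Int, height: Int) -> CVPixelBuffer? {
    var buffer: CVPixelBuffer?
    let status = CVPixelBufferCreateWithBytes(
        kCFAllocatorDefault,
        width, height,
        kCVPixelFormatType_32BGRA,
        bytes,
        width * 4,          // bytes per row
        nil, nil,           // release callback: caller keeps ownership of `bytes`
        nil,
        &buffer)
    return status == kCVReturnSuccess ? buffer : nil
}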

Possible for AVAssetWriter to write files with transparency?

前提是你 submitted on 2019-12-30 11:23:09
Question: Every file I write with AVAssetWriter has a black background if the images I include do not fill the entire render area. Is there any way to write with transparency? Here's the method I use to get the pixel buffer: - (CVPixelBufferRef)pixelBufferFromCGImage:(CGImageRef)image { CGSize size = self.renderSize; NSDictionary *options = [NSDictionary dictionaryWithObjectsAndKeys: [NSNumber numberWithBool:YES], kCVPixelBufferCGImageCompatibilityKey, [NSNumber numberWithBool:YES], …
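One point that frames any answer: H.264 has no alpha channel, so uncovered areas are written as opaque black regardless of the pixel buffer's contents. If a newer deployment target is acceptable, HEVC with alpha (iOS 13+ / macOS 10.15+, which postdates this question) can carry transparency; a minimal sketch with illustrative dimensions:

import AVFoundation

// H.264 cannot store an alpha channel, so uncovered areas are written as opaque
// black. HEVC with alpha (iOS 13+ / macOS 10.15+) can preserve transparency.
let alphaSettings: [String: Any] = [
    AVVideoCodecKey: AVVideoCodecType.hevcWithAlpha,
    AVVideoWidthKey: 1280,
    AVVideoHeightKey: 720
]

let alphaInput = AVAssetWriterInput(mediaType: .video, outputSettings: alphaSettings)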