AVAssetImageGenerator returns sometimes same image from 2 successive frames

Frontend · unresolved · 3 answers · 1896 views
北海茫月 asked on 2021-02-06 07:26

I'm currently extracting every frame from a video with AVAssetImageGenerator, but sometimes it returns almost the same image twice in succession (they do not h…

3 Answers
  • 2021-02-06 08:00

    Please see the following properties of AVAssetImageGenerator. Set kCMTimeZero for both properties to get frame-accurate results.

    /* The actual time of the generated images will be within the range [requestedTime-toleranceBefore, requestedTime+toleranceAfter] and may differ from the requested time for efficiency.
       Pass kCMTimeZero for both toleranceBefore and toleranceAfter to request frame-accurate image generation; this may incur additional decoding delay.
       Default is kCMTimePositiveInfinity. */
    @property (nonatomic) CMTime requestedTimeToleranceBefore NS_AVAILABLE(10_7, 5_0);
    @property (nonatomic) CMTime requestedTimeToleranceAfter NS_AVAILABLE(10_7, 5_0);
    

    Before I set kCMTimeZero for both properties, I got the same image back for different requested times, just as you describe. Try the following code:

    self.imageGenerator = [AVAssetImageGenerator assetImageGeneratorWithAsset:myAsset];
    self.imageGenerator.requestedTimeToleranceBefore = kCMTimeZero;
    self.imageGenerator.requestedTimeToleranceAfter = kCMTimeZero;
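
    To see why loose tolerances can produce duplicates, here is a toy model in plain C (not AVFoundation's actual algorithm): assume that with a non-zero tolerance the generator returns the nearest sync frame inside the allowed range `[requested - toleranceBefore, requested + toleranceAfter]`, while with kCMTimeZero tolerances it must honour the requested time exactly. The `sync_interval` value and the snap-to-nearest rule are illustrative assumptions only.

    ```c
    #include <math.h>
    #include <stdio.h>

    /* Toy model: with loose tolerances, snap to the nearest multiple of
       sync_interval (a stand-in for "nearest cheap-to-decode sync frame");
       with zero tolerance, return exactly the requested time. */
    static double resolved_time(double requested, double sync_interval, int frame_accurate) {
        if (frame_accurate)
            return requested;                   /* both tolerances == kCMTimeZero */
        return round(requested / sync_interval) * sync_interval;
    }

    int main(void) {
        /* Sync frames every 1.0 s; two requests only 0.4 s apart. */
        printf("loose tolerance: %.1f and %.1f -> same frame twice\n",
               resolved_time(0.8, 1.0, 0), resolved_time(1.2, 1.0, 0));
        printf("zero tolerance:  %.1f and %.1f -> distinct frames\n",
               resolved_time(0.8, 1.0, 1), resolved_time(1.2, 1.0, 1));
        return 0;
    }
    ```

    Under this model, 0.8 s and 1.2 s both resolve to the 1.0 s sync frame when the tolerance is loose, which is exactly the duplicate-image symptom; zero tolerance forces two distinct decodes.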
    
  • 2021-02-06 08:01

    I was having the same issue as you, but much more pronounced: the duplication happened whenever the interval between two frames was under 1.0 second, and I realised it depended on the timescale I was using to generate the CMTime values.

    Before

    CMTime requestTime = CMTimeMakeWithSeconds(imageTime, 1);
    

    After

    CMTime requestTime = CMTimeMakeWithSeconds(imageTime, playerItem.asset.duration.timescale);
    

    ... and Boom, no more duplication :)
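
    The rounding effect behind this can be sketched in plain C. Here `time_value` is a hypothetical stand-in that models only the `value = round(seconds * timescale)` part of CMTimeMakeWithSeconds (the real CoreMedia function also carries flags and an epoch): with a timescale of 1, requests less than a second apart collapse onto the same value, while a typical asset timescale such as 600 keeps them distinct.

    ```c
    #include <math.h>
    #include <stdint.h>
    #include <stdio.h>

    /* Toy model of CMTimeMakeWithSeconds: only the rational value,
       rounded to the nearest integer at the given timescale. */
    static int64_t time_value(double seconds, int32_t timescale) {
        return (int64_t)llround(seconds * timescale);
    }

    int main(void) {
        /* Frames requested every 0.4 s. With timescale 1, successive
           requests collapse onto the same integer second. */
        printf("timescale 1:   %lld %lld %lld\n",
               (long long)time_value(0.4, 1),
               (long long)time_value(0.8, 1),
               (long long)time_value(1.2, 1));   /* 0 1 1 -> 0.8 s and 1.2 s collide */

        /* With the asset's own timescale (600 is common for QuickTime
           movies), each request maps to a distinct value. */
        printf("timescale 600: %lld %lld %lld\n",
               (long long)time_value(0.4, 600),
               (long long)time_value(0.8, 600),
               (long long)time_value(1.2, 600)); /* 240 480 720 */
        return 0;
    }
    ```

    With timescale 1, any two requests inside the same second land on the same CMTime value, so the generator quite reasonably hands back the same frame.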

    So maybe you can try increasing the timescale, perhaps doubling it, in your code:

    NSValue * time = [NSValue valueWithCMTime:CMTimeMakeWithSeconds(i*frameDuration, composition.frameDuration.timescale*2)]; // *2 at the end
    

    For future reference, here is my code:

        playerItem = [AVPlayerItem playerItemWithURL:item.movieUrl];
        imageGenerator = [[AVAssetImageGenerator alloc] initWithAsset:playerItem.asset];
        imageGenerator.requestedTimeToleranceAfter = kCMTimeZero;
        imageGenerator.requestedTimeToleranceBefore = kCMTimeZero;
    
        float duration = item.duration;
        float interval = item.interval;
    
        NSLog(@"\nItem info:\n%f \n%f", duration,interval);
    
        NSString *srcPath = nil;
        NSString *zipPath = nil;
    
        srcPath = [item.path stringByAppendingPathComponent:@"info.json"];
        zipPath = [NSString stringWithFormat:@"/%@/info.json",galleryID];
    
        [zip addFileToZip:srcPath newname:zipPath level:0];
    
        NSTimeInterval frameNum = item.duration / item.interval;
        for (int i=0; i<=frameNum; i++)
        {
            NSArray* cachePathArray = NSSearchPathForDirectoriesInDomains(NSCachesDirectory, NSUserDomainMask, YES);
            NSString* cachePath = [cachePathArray lastObject];
    
            srcPath = [cachePath stringByAppendingPathComponent:@"export-tmp.jpg"];
            zipPath = [NSString stringWithFormat:@"/%@/%d.jpg",galleryID,i];
    
            float imageTime = ( i * interval );
    
            NSError *error = nil;
            CMTime requestTime = CMTimeMakeWithSeconds(imageTime, playerItem.asset.duration.timescale);
            CMTime actualTime;
    
            CGImageRef imageRef = [imageGenerator copyCGImageAtTime:requestTime actualTime:&actualTime error:&error];
    
            if (error == nil) {
                float req = ((float)requestTime.value/requestTime.timescale);
                float real = ((float)actualTime.value/actualTime.timescale);
                float diff = fabsf(req-real);
    
                NSLog(@"copyCGImageAtTime: %.2f, %.2f, %f",req,real,diff);
            }
            else
            {
                NSLog(@"copyCGImageAtTime: error: %@",error.localizedDescription);
            }
    
    
    
            // consider using CGImageDestination -> http://stackoverflow.com/questions/1320988/saving-cgimageref-to-a-png-file
            UIImage *img = [UIImage imageWithCGImage:imageRef];
            CGImageRelease(imageRef);  // CGImageRef won't be released by ARC
    
    
    
            [UIImageJPEGRepresentation(img, 1.0) writeToFile:srcPath atomically:YES]; // JPEG quality runs 0.0-1.0, not 0-100

            if (srcPath != nil && zipPath != nil)
            {
                [zip addFileToZip:srcPath newname:zipPath level:0]; // 0 = no compression. everything is a jpg image
                unlink([srcPath UTF8String]);
            }
        }
    
  • 2021-02-06 08:01

    I was using a slightly different way of calculating the CMTime request, and it seemed to work. Here is the code (assuming iOS):

    -(void)extractImagesFromMovie {
    
    // set the asset
        NSString* path = [[NSBundle mainBundle] pathForResource:@"myMovie" ofType:@"MOV"];
        NSURL* movURL = [NSURL fileURLWithPath:path];
    
    NSMutableDictionary* myDict = [NSMutableDictionary dictionaryWithObjectsAndKeys:[NSNumber numberWithBool:YES] , 
                                     AVURLAssetPreferPreciseDurationAndTimingKey , 
                                    [NSNumber numberWithInt:0],
                                    AVURLAssetReferenceRestrictionsKey, nil];
    
    AVURLAsset* movie = [[AVURLAsset alloc] initWithURL:movURL options:myDict];
    
    
    // set the generator
    AVAssetImageGenerator* generator = [AVAssetImageGenerator assetImageGeneratorWithAsset:movie];
    generator.requestedTimeToleranceBefore = kCMTimeZero;
    generator.requestedTimeToleranceAfter = kCMTimeZero;
    
    // look for the video track
    AVAssetTrack* videoTrack;
    bool foundTrack = NO;
    
    for (AVAssetTrack* track in movie.tracks) {
    
        if ([track.mediaType isEqualToString:AVMediaTypeVideo]) { // AVMediaTypeVideo == @"vide"
            if (foundTrack) { NSLog (@"Error - - - more than one video track"); return; }
            else {
                videoTrack = track;
                foundTrack = YES;
            }
        }
    }
    if (foundTrack == NO) { NSLog (@"Error - - No Video Tracks at all"); return; }
    
    // set the number of frames in the movie
    int frameRate = videoTrack.nominalFrameRate; // nominalFrameRate is a float; this truncates non-integer rates such as 29.97
    float value = movie.duration.value;
    float timeScale = movie.duration.timescale;
    float totalSeconds = value / timeScale;
    int totalFrames = totalSeconds * frameRate;
    
    NSLog (@"total frames %d", totalFrames);
    
    int timeValuePerFrame = movie.duration.timescale / frameRate;
    
    NSMutableArray* allFrames = [NSMutableArray new];
    
    // get each frame
    for (int k=0; k< totalFrames; k++) {
    
        int timeValue = timeValuePerFrame * k;
        CMTime frameTime;
        frameTime.value = timeValue;
        frameTime.timescale = movie.duration.timescale;
        frameTime.flags = movie.duration.flags;
        frameTime.epoch = movie.duration.epoch;
    
        CMTime gotTime;
    
        CGImageRef myRef = [generator copyCGImageAtTime:frameTime actualTime:&gotTime error:nil];
        if (myRef != NULL) {
            [allFrames addObject:[UIImage imageWithCGImage:myRef]];
            CGImageRelease(myRef); // copyCGImageAtTime returns a +1 reference that must be released
        }

        if (gotTime.value != frameTime.value) NSLog (@"requested %lld got %lld for k %d", frameTime.value, gotTime.value, k);
    
    }
    
    NSLog (@"got %lu images in the array", (unsigned long)[allFrames count]);
    // do something with images here...
    }
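
    One caveat with this approach: `timeValuePerFrame = timescale / frameRate` is integer division, so it is exact only when the timescale divides evenly by the frame rate. A small plain-C sketch (no CoreMedia, values chosen purely for illustration) shows how the truncation error accumulates otherwise:

    ```c
    #include <stdint.h>
    #include <stdio.h>

    /* Integer per-frame step, as in the answer above. */
    int main(void) {
        /* Exact case: 600 / 30 == 20, so frame k sits exactly at k/30 s. */
        int exact_step = 600 / 30;
        printf("timescale 600, 30 fps: step = %d units (exact)\n", exact_step);

        /* Inexact case: 1000 / 30 truncates 33.33 down to 33, and the
           error accumulates across the loop: after 30 frames the request
           sits at 990/1000 s instead of 1000/1000 s. */
        int drift_step = 1000 / 30;
        int64_t after_30 = (int64_t)drift_step * 30;
        printf("timescale 1000, 30 fps: step = %d units, 30 frames -> %lld/1000 s\n",
               drift_step, (long long)after_30);
        return 0;
    }
    ```

    This is one reason QuickTime assets commonly use a timescale of 600, which is divisible by 24, 25, and 30; with other combinations the requested times slowly drift off the true frame grid.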
    