AVPlayer streaming progress

Front-end · Open · 6 answers · 1384 views
死守一世寂寞 2020-11-30 17:42

I'm successfully using AVPlayer to stream audio from a server, and what I want to do now is show a custom UISlider that displays the buffering progress of the stream.

6 Answers
  • 2020-11-30 18:11

    This method returns the buffered time interval for your UISlider:

    public var bufferAvail: TimeInterval {

        // Require a current item that is ready to play and has
        // at least one loaded time range; otherwise there is
        // nothing meaningful to report.
        guard let item = player.currentItem,
              item.status == .readyToPlay,
              let timeRange = item.loadedTimeRanges.first?.timeRangeValue else {
            // Matches the original behaviour: NaN when nothing is loaded.
            return CMTimeGetSeconds(.invalid)
        }

        let startTime = CMTimeGetSeconds(timeRange.start)
        let loadedDuration = CMTimeGetSeconds(timeRange.duration)
        return startTime + loadedDuration
    }
    
  • 2020-11-30 18:13

    This should work well:

    Objective-C:

    - (CMTime)availableDuration
    {
        NSValue *range = self.player.currentItem.loadedTimeRanges.firstObject;
        if (range != nil){
            return CMTimeRangeGetEnd(range.CMTimeRangeValue);
        }
        return kCMTimeZero;
    }
    

    Swift version:

    func availableDuration() -> CMTime
    {
        if let range = self.player?.currentItem?.loadedTimeRanges.first {
            return CMTimeRangeGetEnd(range.timeRangeValue)
        }
        return .zero
    }
    

    To log the current value you can use CMTimeShow([self availableDuration]); in Objective-C, or CMTimeShow(availableDuration()) in Swift.
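    The answers here compute a buffered value but don't show the UISlider wiring the question asks about. A minimal sketch of one way to do it, assuming a `player` property, a `progressSlider` outlet, a stored `timeObserver` token, and the `availableDuration()` function above (all names are assumptions about the surrounding view controller):

```swift
import AVFoundation
import UIKit

// Poll the buffered duration twice a second and push it into the slider.
// `player`, `progressSlider`, `timeObserver`, and `availableDuration()`
// are assumed to exist on the enclosing view controller.
let interval = CMTime(seconds: 0.5, preferredTimescale: CMTimeScale(NSEC_PER_SEC))
timeObserver = player.addPeriodicTimeObserver(forInterval: interval,
                                              queue: .main) { [weak self] _ in
    guard let self = self,
          let item = self.player.currentItem,
          item.duration.isNumeric else { return } // skip live/indefinite streams
    let buffered = CMTimeGetSeconds(self.availableDuration())
    let total = CMTimeGetSeconds(item.duration)
    self.progressSlider.value = Float(buffered / total) // 0.0 ... 1.0
}
```

    Remember to call `player.removeTimeObserver(timeObserver)` when tearing the view down, or the observer keeps firing.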

  • 2020-11-30 18:15

    The selected answer may cause problems if the returned array is empty. Here's a fixed version:

    - (NSTimeInterval) availableDuration
    {
        NSArray *loadedTimeRanges = [[_player currentItem] loadedTimeRanges];
        if ([loadedTimeRanges count])
        {
            CMTimeRange timeRange = [[loadedTimeRanges objectAtIndex:0] CMTimeRangeValue];
            Float64 startSeconds = CMTimeGetSeconds(timeRange.start);
            Float64 durationSeconds = CMTimeGetSeconds(timeRange.duration);
            NSTimeInterval result = startSeconds + durationSeconds;
            return result;
        }
        return 0;
    }
    
  • 2020-11-30 18:17

    I am just working on this, and so far have the following:

    - (NSTimeInterval) availableDuration
    {
      NSArray *loadedTimeRanges = [[self.player currentItem] loadedTimeRanges];
      CMTimeRange timeRange = [[loadedTimeRanges objectAtIndex:0] CMTimeRangeValue];
      Float64 startSeconds = CMTimeGetSeconds(timeRange.start);
      Float64 durationSeconds = CMTimeGetSeconds(timeRange.duration);
      NSTimeInterval result = startSeconds + durationSeconds;
      return result;
    }
    
  • 2020-11-30 18:23

    The code from Suresh Kansujiya, in Objective-C:

    NSTimeInterval bufferAvail;
    
    if (player.currentItem != nil) {
    
        AVPlayerItem *item = player.currentItem;
        if (item.status == AVPlayerItemStatusReadyToPlay) {
            NSArray *timeRangeArray = item.loadedTimeRanges;
            CMTimeRange aTimeRange = [[timeRangeArray objectAtIndex:0] CMTimeRangeValue];
            Float64 startTime = CMTimeGetSeconds(aTimeRange.start);
            Float64 loadedDuration = CMTimeGetSeconds(aTimeRange.duration);
    
            bufferAvail = startTime + loadedDuration;
    
            NSLog(@"%@ - %f", [self class], bufferAvail);
        } else {
            NSLog(@"%@ - %f", [self class], CMTimeGetSeconds(kCMTimeInvalid)); }
    }
    else {
        NSLog(@"%@ - %f", [self class], CMTimeGetSeconds(kCMTimeInvalid));
    }
    
  • 2020-11-30 18:31

    Personally, I do not agree that loadedTimeRanges will always contain exactly one range.

    According to the documentation:

    The array contains NSValue objects containing a CMTimeRange value indicating the time ranges for which the player item has media data readily available. The time ranges returned may be discontinuous.

    So this may have values similar to:

    [(start1, end1), (start2, end2)]

    From my experience with the hls.js framework in the desktop web world, the holes between these time ranges can be very small or large depending on a number of factors, e.g. seeking, discontinuities, etc.

    So to correctly get the total buffer length, you need to loop through the array and sum the duration of each range.

    If you are looking for the buffer value ahead of the current playhead, you need to find the time range that contains the current time (i.e. one that starts before it and ends after it).

    public extension AVPlayerItem {
    
        public func totalBuffer() -> Double {
            // Sum only each range's duration; adding `start` as well
            // would overcount whenever a range does not begin at zero.
            return self.loadedTimeRanges
                .map({ $0.timeRangeValue })
                .reduce(0, { acc, cur in
                    return acc + CMTimeGetSeconds(cur.duration)
                })
        }
    
        public func currentBuffer() -> Double {
            let currentTime = self.currentTime()
    
            guard let timeRange = self.loadedTimeRanges.map({ $0.timeRangeValue })
                .first(where: { $0.containsTime(currentTime) }) else { return -1 }
    
            return CMTimeGetSeconds(timeRange.end) - currentTime.seconds
        }
    
    }
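    As an alternative to polling, the extension above can be driven by key-value observation of loadedTimeRanges. A sketch, in which `item` and the `rangesObservation` property that retains the observation token are assumptions:

```swift
import AVFoundation

// Fire whenever the buffered ranges change. `item` is the AVPlayerItem
// being played; `rangesObservation` must be kept alive (e.g. as a
// stored property) for the observation to remain active.
rangesObservation = item.observe(\.loadedTimeRanges, options: [.new]) { item, _ in
    let ahead = item.currentBuffer() // seconds buffered past the playhead
    let total = item.totalBuffer()   // seconds buffered across all ranges
    print("buffered ahead: \(ahead)s, total: \(total)s")
}
```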
    