[iOS]AVPlayerItemVideoOutput.hasNewPixelBufferForItemTime doesn't work correctly

轮回少年 2021-02-08 05:16

It's my first question here, so don't be too harsh.

I\'m playing video from the net using AVPlayer. I output the current frame using AVPlayerItemVideoOutput

3 Answers
  • 2021-02-08 06:06

    Make sure that AVPlayerItem.status equals AVPlayerItemStatusReadyToPlay before calling the - (void)addOutput:(AVPlayerItemOutput *)output method on the AVPlayerItem.

    Reference: Renaud's reply on this page:

    I got the same problem with my implementation. After trying the solutions proposed here, I think I finally found the reliable way to do things.

    The AVPlayerItemVideoOutput must be created AFTER the AVPlayerItem status is ready to play.

    So

    1. Create player & player item, dispatch queue and display link

    2. Register observer for AVPlayerItem status key

    3. On AVPlayerItemStatusReadyToPlay, create the AVPlayerItemVideoOutput and start the display link (see the sketch below)

    Thanks to all for the inspiration

    Renaud
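
    A minimal sketch of that order of operations, using KVO on the status key (ivar names such as _videoOutput and _displayLink are assumptions, not part of Renaud's reply):

    static void *PlayerItemStatusContext = &PlayerItemStatusContext;

    - (void)setupPlayerWithURL:(NSURL *)url {
        // 1. Player, player item and display link -- but no video output yet.
        _playerItem = [AVPlayerItem playerItemWithURL:url];
        _player = [AVPlayer playerWithPlayerItem:_playerItem];
        _displayLink = [CADisplayLink displayLinkWithTarget:self
                                                   selector:@selector(displayLinkDidFire:)];
        [_displayLink addToRunLoop:[NSRunLoop mainRunLoop] forMode:NSRunLoopCommonModes];
        _displayLink.paused = YES;

        // 2. Observe the item's status key.
        [_playerItem addObserver:self
                      forKeyPath:@"status"
                         options:NSKeyValueObservingOptionNew
                         context:PlayerItemStatusContext];
    }

    - (void)observeValueForKeyPath:(NSString *)keyPath ofObject:(id)object
                            change:(NSDictionary *)change context:(void *)context {
        if (context != PlayerItemStatusContext) {
            [super observeValueForKeyPath:keyPath ofObject:object change:change context:context];
            return;
        }
        // 3. Only once the item is ready to play, create and attach the output.
        if (_playerItem.status == AVPlayerItemStatusReadyToPlay && _videoOutput == nil) {
            NSDictionary *attrs = @{(id)kCVPixelBufferPixelFormatTypeKey : @(kCVPixelFormatType_32BGRA)};
            _videoOutput = [[AVPlayerItemVideoOutput alloc] initWithPixelBufferAttributes:attrs];
            [_playerItem addOutput:_videoOutput];
            _displayLink.paused = NO;
            [_player play];
        }
    }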

  • 2021-02-08 06:14

    I noticed that AVPlayerItemVideoOutput somehow "jams" when playing HLS multi-bitrate playlists. When the player switches to a higher bitrate, the track ID of the player item's video track changes; the output delivers a few more pixel buffers, but after that hasNewPixelBufferForItemTime: always returns NO.

    I have spent days on this problem. By accident I noticed that if I send the app to the background and then back to the foreground, the video plays normally at the higher bitrate, but that is not a real solution.

    Finally I found a workaround: I keep a counter of failed pixel-buffer requests, and after 100 failures I remove the current output from the player item and then add the same instance back.

    if (failedCount > 100) {
        failedCount = 0;
        [_playerItem removeOutput:_output];
        [_playerItem addOutput:_output];
    }
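
    In context, the counter might live in the display-link callback, roughly like this (the displayLinkDidFire: method and the failedCount ivar are assumed names, not from the answer):

    - (void)displayLinkDidFire:(CADisplayLink *)link {
        CMTime itemTime = [_output itemTimeForHostTime:CACurrentMediaTime()];
        if ([_output hasNewPixelBufferForItemTime:itemTime]) {
            failedCount = 0;   // reset on every successful frame
            CVPixelBufferRef buffer =
                [_output copyPixelBufferForItemTime:itemTime itemTimeForDisplay:NULL];
            // ... render the buffer ...
            if (buffer) {
                CVBufferRelease(buffer);
            }
        } else if (++failedCount > 100) {
            // Detach and re-attach the same output to "unjam" it after an
            // HLS bitrate switch, as described above.
            failedCount = 0;
            [_playerItem removeOutput:_output];
            [_playerItem addOutput:_output];
        }
    }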
    
  • 2021-02-08 06:15

    The code below (from the previous answer) did not solve my problem; I still got nothing from [AVPlayerItemVideoOutput hasNewPixelBufferForItemTime:]:

    if (failedCount > 100) {
        failedCount = 0;
        [_playerItem removeOutput:_output];
        [_playerItem addOutput:_output];
    }
    

    Finally, after testing my code for a whole day, I found a way to solve it.

    #pragma mark - AVPlayerItemOutputPullDelegate
    - (void)outputMediaDataWillChange:(AVPlayerItemOutput *)sender {
        // Still no pixel buffer 0.1 s into the item: the output is stuck, rebuild it.
        if (![self.videoOutput hasNewPixelBufferForItemTime:CMTimeMake(1, 10)]) {
            [self configVideoOutput];
        }
        // Resume the display link so rendering continues with the (new) output.
        [self.displayLink setPaused:NO];
    }
    

    Check [AVPlayerItemVideoOutput hasNewPixelBufferForItemTime:] when outputMediaDataWillChange: is called, and recreate your AVPlayerItemVideoOutput if there is still no new pixel buffer at 0.1 s into the item.

    The code in [self configVideoOutput] simply creates a new AVPlayerItemVideoOutput and replaces the current videoOutput property with it (a sketch follows below).

    Why 0.1s?

    I tested and experimented many times and found that the first one or two frames may never yield a pixel buffer, so at 1/30 s or 2/30 s (for a 30 fps video) there may legitimately be no frame and no pixel buffer yet. But if there is still no pixel buffer after 0.1 s, the video output is probably broken or something else is wrong with it, so we need to recreate it.
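
    One plausible shape for that helper (the videoOutput and playerItem properties, the delegate queue, and the 0.03 s advance interval are assumptions, not posted by the answerer):

    - (void)configVideoOutput {
        // Throw away the stuck output and attach a freshly created one.
        if (self.videoOutput) {
            [self.playerItem removeOutput:self.videoOutput];
        }
        NSDictionary *attrs = @{(id)kCVPixelBufferPixelFormatTypeKey : @(kCVPixelFormatType_32BGRA)};
        self.videoOutput = [[AVPlayerItemVideoOutput alloc] initWithPixelBufferAttributes:attrs];
        [self.videoOutput setDelegate:self queue:self.videoOutputQueue];
        [self.videoOutput requestNotificationOfMediaDataChangeWithAdvanceInterval:0.03];
        [self.playerItem addOutput:self.videoOutput];
    }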
