I am currently working with MPMoviePlayerController and analysing metrics for video playback, specifically adaptive bitrates.
As part of this, I am trying to understand what the indicatedBitrate and observedBitrate values in the access log represent, since the observedBitrate figures look surprisingly large.
There is a simple answer to this: the indicatedBitrate of an MPMovieAccessLogEvent (or AVPlayerItemAccessLogEvent for AVPlayer) is the bitrate advertised by the current playlist, i.e. the average bitrate required to play the stream.
However, the observedBitrate is NOT averaged - it is the instantaneous bitrate (or download speed) that the player achieved while downloading a particular chunk of video.
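To see both values side by side, here is a minimal sketch for the AVPlayer path, assuming iOS/Swift and an AVPlayerItem that is already playing. The indicatedBitrate and observedBitrate properties (reported in bits per second) are the real AVFoundation ones; the logger class wrapped around them is purely illustrative:

```swift
import AVFoundation

/// Illustrative logger: prints the two bitrates each time the player item
/// appends a new access log entry.
final class BitrateLogger {
    private var observer: NSObjectProtocol?

    func attach(to item: AVPlayerItem) {
        observer = NotificationCenter.default.addObserver(
            forName: .AVPlayerItemNewAccessLogEntry,
            object: item,
            queue: .main
        ) { notification in
            guard let item = notification.object as? AVPlayerItem,
                  let event = item.accessLog()?.events.last else { return }

            // Bitrate the current variant playlist says the stream needs...
            print("indicatedBitrate: \(event.indicatedBitrate) b/s")
            // ...vs. the throughput actually achieved while fetching media.
            print("observedBitrate:  \(event.observedBitrate) b/s")
        }
    }

    deinit {
        if let observer = observer {
            NotificationCenter.default.removeObserver(observer)
        }
    }
}
```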
Example: Playing a playlist with a 1000 Kb/s stream, in chunks of 10 seconds each, so each chunk holds about 10,000 Kb of data. The device can download at over 10,000 Kb/s over WiFi, so it takes less than 1 second to fetch each chunk; therefore the player is downloading at over 10,000 Kb/s while each chunk is in flight. I'd expect the player to return (approximately) these values:
indicatedBitrate: 1000 Kb/s
observedBitrate: 10,000 Kb/s
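To make the arithmetic behind those numbers explicit, here is the same example as a tiny Swift sketch; the figures are the illustrative ones above, not real measurements:

```swift
// Illustrative numbers from the example above, not real measurements.
let indicatedKbps    = 1_000.0   // variant playlist bitrate: 1000 Kb/s
let chunkDurationSec = 10.0      // each chunk holds 10 seconds of video
let downloadTimeSec  = 1.0       // assumed time to fetch one chunk over WiFi

let chunkSizeKb  = indicatedKbps * chunkDurationSec   // 10,000 Kb per chunk
let observedKbps = chunkSizeKb / downloadTimeSec      // ~10,000 Kb/s while downloading

print("indicatedBitrate ~ \(indicatedKbps) Kb/s, observedBitrate ~ \(observedKbps) Kb/s")
```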
I'd been mystified by these large values myself, but I think this explains it.
This is just for illustration - the absolute values are not very meaningful on their own, since we don't really know how long it takes to download a chunk, or indeed how big each chunk is. What the observedBitrate really tells you is how well the player is keeping up with the bitrate needed to play the stream: if the observedBitrate is 10x the indicatedBitrate, the player is only using 10% of the available time to download each chunk. This ratio can be used as a quality-of-service indicator.
For example, if the observedBitrate is less than the indicatedBitrate then it is very likely that the player will stall due to buffering, but as long as it is greater, all is well and the stream is likely to play smoothly.
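Here is a sketch of that check, again for the AVPlayer path and under the same assumptions as the earlier snippet; the function names and the bare 1.0 threshold are just illustrative, not a definitive heuristic:

```swift
import AVFoundation

/// Ratio of observed to indicated bitrate for the most recent access log
/// event; > 1 means the player is downloading faster than the stream needs.
func downloadHeadroom(for item: AVPlayerItem) -> Double? {
    guard let event = item.accessLog()?.events.last,
          event.indicatedBitrate > 0, event.observedBitrate > 0 else { return nil }
    return event.observedBitrate / event.indicatedBitrate
}

func checkPlaybackHealth(of item: AVPlayerItem) {
    guard let headroom = downloadHeadroom(for: item) else { return }
    if headroom < 1.0 {
        print("Warning: downloading slower than the stream bitrate; a buffering stall is likely")
    } else {
        print("Headroom \(headroom)x: the stream should keep playing smoothly")
    }
}
```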