I'm trying to get the following MediaExtractor example to work:
http://bigflake.com/mediacodec/ - ExtractMpegFramesTest.java (requires 4.1, API 16)
The way SurfaceTexture works makes this a bit tricky to get right.
The docs say the frame-available callback "is called on an arbitrary thread". The SurfaceTexture class has a bit of code that does the following when initializing (line 318):
    if (this thread has a looper) {
        handle events on this thread
    } else if (there's a "main" looper) {
        handle events on the main UI thread
    } else {
        no events for you
    }
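
In plain Java terms, that selection is roughly the following (a sketch of the described behavior using the public Looper API, not the framework's actual source):

    // Sketch only: which Looper would end up receiving the frame-available events.
    android.os.Looper looper = android.os.Looper.myLooper();   // non-null only if this thread has a Looper (i.e. called Looper.prepare())
    if (looper == null) {
        looper = android.os.Looper.getMainLooper();            // otherwise fall back to the main/UI looper
    }
    // If looper is still null here, onFrameAvailable() is never delivered at all.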
The frame-available events are delivered to your app through the usual Looper / Handler mechanism. That mechanism is just a message queue, which means the thread needs to be sitting in the Looper event loop waiting for them to arrive. The trouble is, if you're sleeping in awaitNewImage(), you're not watching the Looper queue. So the event arrives, but nobody sees it. Eventually awaitNewImage() times out, and the thread returns to watching the event queue, where it immediately discovers the pending "new frame" message.
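
For context, the wait in awaitNewImage() and the notify in onFrameAvailable() are the standard monitor pattern; a trimmed-down sketch (names approximate the test code, details may differ):

    private SurfaceTexture mSurfaceTexture;                  // the texture the decoder renders into
    private final Object mFrameSyncObject = new Object();    // guards mFrameAvailable
    private boolean mFrameAvailable;

    // SurfaceTexture.OnFrameAvailableListener callback; runs on whichever thread
    // the Looper rules above selected.
    @Override
    public void onFrameAvailable(SurfaceTexture st) {
        synchronized (mFrameSyncObject) {
            mFrameAvailable = true;
            mFrameSyncObject.notifyAll();
        }
    }

    // Called by the decoding thread; blocks until the callback fires or the wait times out.
    public void awaitNewImage() {
        final int TIMEOUT_MS = 2500;
        synchronized (mFrameSyncObject) {
            while (!mFrameAvailable) {
                try {
                    mFrameSyncObject.wait(TIMEOUT_MS);
                    if (!mFrameAvailable) {
                        throw new RuntimeException("frame wait timed out");
                    }
                } catch (InterruptedException ie) {
                    throw new RuntimeException(ie);
                }
            }
            mFrameAvailable = false;
        }
        mSurfaceTexture.updateTexImage();   // latch the new frame into the GL texture
    }

If the thread blocked in that wait() is the same one the Looper would deliver onFrameAvailable() on, the notify can't happen until the wait gives up, so every frame runs out the clock.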
So the trick is to make sure that frame-available events arrive on a different thread from the one sitting in awaitNewImage(). In the ExtractMpegFramesTest example, this is done by running the test in a newly-created thread (see the ExtractMpegFramesWrapper class), which does not have a Looper. (For some reason the thread that executes CTS tests has a looper.) The frame-available events arrive on the main UI thread.
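
That wrapper is just a plain Thread with no Looper of its own; condensed, it looks roughly like this (details may differ from the current version of the test):

    // Runs the test body on a Looper-less thread, so SurfaceTexture's frame-available
    // callbacks fall back to the main looper instead of landing on this thread.
    private static class ExtractMpegFramesWrapper implements Runnable {
        private Throwable mThrowable;
        private ExtractMpegFramesTest mTest;

        private ExtractMpegFramesWrapper(ExtractMpegFramesTest test) {
            mTest = test;
        }

        @Override
        public void run() {
            try {
                mTest.extractMpegFrames();
            } catch (Throwable th) {
                mThrowable = th;
            }
        }

        // Start the thread, wait for it to finish, and re-throw anything it threw.
        public static void runTest(ExtractMpegFramesTest test) throws Throwable {
            ExtractMpegFramesWrapper wrapper = new ExtractMpegFramesWrapper(test);
            Thread th = new Thread(wrapper, "codec test");
            th.start();
            th.join();
            if (wrapper.mThrowable != null) {
                throw wrapper.mThrowable;
            }
        }
    }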
Update (for "edit 3"): I'm a bit sad that ignoring the "size" field helped, but pre-4.3 it's hard to predict how devices will behave.
If you just want to display the frame, pass the Surface you get from the SurfaceView or TextureView into the MediaCodec decoder configure() call. Then you don't have to mess with SurfaceTexture at all -- the frames will be displayed as you decode them. See the two "Play video" activities in Grafika for examples.
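
A minimal sketch of that path (surfaceView, videoFilePath, and videoTrackIndex are placeholders; error handling omitted):

    MediaExtractor extractor = new MediaExtractor();
    extractor.setDataSource(videoFilePath);            // local file path
    extractor.selectTrack(videoTrackIndex);            // the track whose MIME type starts with "video/"
    MediaFormat format = extractor.getTrackFormat(videoTrackIndex);

    // Hand the view's Surface straight to the decoder; no SurfaceTexture involved.
    Surface surface = surfaceView.getHolder().getSurface();
    MediaCodec decoder = MediaCodec.createDecoderByType(format.getString(MediaFormat.KEY_MIME));
    decoder.configure(format, surface, null, 0);
    decoder.start();
    // ...then feed input buffers from the extractor; calling releaseOutputBuffer(index, true)
    // sends each decoded frame to the Surface for display.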
If you really want to go through a SurfaceTexture, you need to change CodecOutputSurface to render to a window surface rather than a pbuffer. (The off-screen rendering is done so we can use glReadPixels() in a headless test.)
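
The relevant difference is in the EGL setup: instead of an off-screen pbuffer, create a window surface from the Surface you want to render into. A rough sketch with EGL14, assuming eglDisplay, eglConfig, surface, width, and height are already set up (the API 16 version of the test uses the equivalent EGL10 calls, but the shape is the same):

    // Off-screen pbuffer, as in the headless test, so glReadPixels() has something to read back:
    int[] pbufferAttribs = { EGL14.EGL_WIDTH, width, EGL14.EGL_HEIGHT, height, EGL14.EGL_NONE };
    EGLSurface offscreenSurface = EGL14.eglCreatePbufferSurface(eglDisplay, eglConfig, pbufferAttribs, 0);

    // On-screen window surface, for rendering into a SurfaceView/TextureView Surface instead:
    int[] windowAttribs = { EGL14.EGL_NONE };
    EGLSurface windowSurface = EGL14.eglCreateWindowSurface(eglDisplay, eglConfig, surface, windowAttribs, 0);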