I am trying to use `MediaCodec` to save a series of images, stored as byte arrays in a file, to a video file. I have tested these images on a `SurfaceView` (playing them in series) and I can see them fine.
I think you have the right general idea. Some things to be aware of:

- Not all devices support `COLOR_FormatYUV420SemiPlanar`. Some only accept planar. (Android 4.3 introduced CTS tests to ensure that the AVC codec supports one or the other.)
- Queueing an input buffer does not immediately produce an output buffer; some codecs accumulate several frames of input before generating any output. Make sure your loop handles the case where there is no output yet (e.g. your `inputBuffers[].clear()` will blow up if the index is still -1).
- Don't submit the last frame of data and the end-of-stream flag in the same `queueInputBuffer` call. The data in that frame may be discarded. Always send EOS with a zero-length buffer.

The output of the codecs is generally pretty "raw", e.g. the AVC codec emits an H.264 elementary stream rather than a "cooked" .mp4 file. Many players won't accept this format. If you can't rely on the presence of `MediaMuxer` you will need to find another way to cook the data (search around on stackoverflow for ideas).
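To make the "raw" point concrete: an H.264 elementary stream is just a series of NAL units separated by Annex-B start codes, with no container metadata (duration, tracks, seek tables), which is why many players reject it. A minimal sketch of scanning a byte array for those start codes (the class/method names and sample bytes are mine, not from the original post):

```java
import java.util.ArrayList;
import java.util.List;

public class NalScanner {
    // Returns the offsets of Annex-B start codes (00 00 01 or 00 00 00 01)
    // in a raw H.264 elementary stream. Each start code marks the beginning
    // of a NAL unit (SPS, PPS, slice data, ...).
    static List<Integer> findStartCodes(byte[] stream) {
        List<Integer> offsets = new ArrayList<>();
        for (int i = 0; i + 3 < stream.length; i++) {
            if (stream[i] == 0 && stream[i + 1] == 0
                    && (stream[i + 2] == 1
                        || (stream[i + 2] == 0 && stream[i + 3] == 1))) {
                offsets.add(i);
                i += (stream[i + 2] == 1) ? 2 : 3;  // skip past the start code
            }
        }
        return offsets;
    }

    public static void main(String[] args) {
        // Fabricated example: an SPS and one slice, each preceded by a start code.
        byte[] stream = {0, 0, 0, 1, 0x67, 0x42, 0, 0, 1, 0x65, (byte) 0x88};
        System.out.println(findStartCodes(stream));  // [0, 6]
    }
}
```

A muxer's job is essentially to strip these start codes and wrap the NAL units in .mp4 boxes, which is what `MediaMuxer` does for you on 4.3+.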
It's certainly not expected that the mediaserver process would crash.
You can find some examples and links to the 4.3 CTS tests here.
Update: As of Android 4.3, `MediaCodec` and `Camera` have no `ByteBuffer` formats in common, so at the very least you will need to fiddle with the chroma planes. However, that sort of problem manifests very differently (as shown in the images for this question).
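"Fiddling with the chroma planes" typically means converting the Camera's NV21 output (Y plane followed by interleaved V/U) to the NV12 layout (interleaved U/V) that semi-planar encoders expect. A sketch of that swap as plain byte-array code (the method name is mine):

```java
public class ChromaSwap {
    // Convert an NV21 frame to NV12 by swapping each interleaved chroma pair.
    // The Y plane (width * height bytes) is identical in both layouts.
    static byte[] nv21ToNv12(byte[] nv21, int width, int height) {
        byte[] nv12 = new byte[nv21.length];
        int ySize = width * height;
        System.arraycopy(nv21, 0, nv12, 0, ySize);  // copy Y plane unchanged
        for (int i = ySize; i + 1 < nv21.length; i += 2) {
            nv12[i] = nv21[i + 1];      // U comes from the V slot
            nv12[i + 1] = nv21[i];      // V comes from the U slot
        }
        return nv12;
    }

    public static void main(String[] args) {
        // Tiny 2x2 frame: 4 Y bytes, then one V/U pair (V=5, U=6).
        byte[] nv21 = {10, 20, 30, 40, 5, 6};
        byte[] nv12 = nv21ToNv12(nv21, 2, 2);
        System.out.println(java.util.Arrays.toString(nv12));  // [10, 20, 30, 40, 6, 5]
    }
}
```

Note this only covers the NV21-to-NV12 case; a fully planar (I420) target additionally requires de-interleaving the chroma into two separate planes.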
The image you added looks like video, but with stride and/or alignment issues. Make sure your pixels are laid out correctly. In the CTS EncodeDecodeTest, the `generateFrame()` method (line 906) shows how to encode both planar and semi-planar YUV420 for `MediaCodec`.
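In the same spirit as `generateFrame()` (but not the CTS code itself; names and values here are illustrative), a sketch of how the two YUV420 chroma layouts differ for a solid-color frame:

```java
public class Yuv420Layouts {
    // Fill a YUV420 frame with one color, laying the chroma out either
    // planar (I420: all U bytes, then all V bytes) or semi-planar
    // (NV12: U/V bytes interleaved). Frame size is width * height * 3/2.
    static byte[] solidFrame(int width, int height,
                             byte y, byte u, byte v, boolean semiPlanar) {
        byte[] frame = new byte[width * height * 3 / 2];
        int ySize = width * height;
        for (int i = 0; i < ySize; i++) frame[i] = y;
        if (semiPlanar) {
            for (int i = ySize; i < frame.length; i += 2) {
                frame[i] = u;                    // interleaved U
                frame[i + 1] = v;                // interleaved V
            }
        } else {
            int uSize = ySize / 4;
            for (int i = 0; i < uSize; i++) {
                frame[ySize + i] = u;            // U plane
                frame[ySize + uSize + i] = v;    // V plane
            }
        }
        return frame;
    }

    public static void main(String[] args) {
        // 4x2 frame, Y=1, U=2, V=3: same bytes, different chroma ordering.
        System.out.println(java.util.Arrays.toString(
                solidFrame(4, 2, (byte) 1, (byte) 2, (byte) 3, false)));
        // [1, 1, 1, 1, 1, 1, 1, 1, 2, 2, 3, 3]   (planar)
        System.out.println(java.util.Arrays.toString(
                solidFrame(4, 2, (byte) 1, (byte) 2, (byte) 3, true)));
        // [1, 1, 1, 1, 1, 1, 1, 1, 2, 3, 2, 3]   (semi-planar)
    }
}
```

This sketch assumes the buffer stride equals the width; if the codec reports a larger stride, each row must be padded out to that stride or you get exactly the skewed-image artifacts described above.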
The easiest way to avoid the format issues is to move the frames through a `Surface` (like the CameraToMpegTest sample), but unfortunately that's not possible in Android 4.1.