How to use ByteBuffer in the MediaCodec context in Android

太阳男子 2021-02-04 09:23

So far I am able to set up a MediaCodec to encode a video stream. The aim is to save my user-generated artwork into a video file.

I use Android Bitmap objects of the user-generated artwork as the source frames.

1 Answer
  • 2021-02-04 09:35

    The exact layout of the ByteBuffer is determined by the codec for the input format you've chosen. Not all devices support all possible input formats (e.g. some AVC encoders require planar 420 YUV, others require semi-planar). Older versions of Android (<= API 17) didn't really provide a portable way to software-generate video frames for MediaCodec.
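The runtime check the answer describes can be sketched in plain Java. The integer constants below mirror the values of `MediaCodecInfo.CodecCapabilities` (19 for `COLOR_FormatYUV420Planar`, 21 for `COLOR_FormatYUV420SemiPlanar`), and the `reported` array is a stand-in for what `CodecCapabilities.colorFormats` would list on a real device:

```java
public class ColorFormatPicker {
    // Values mirror MediaCodecInfo.CodecCapabilities constants.
    static final int COLOR_FormatYUV420Planar = 19;     // I420: Y plane, then U plane, then V plane
    static final int COLOR_FormatYUV420SemiPlanar = 21; // NV12: Y plane, then interleaved UV

    /** Returns the first recognized 420 format, or -1 if neither is supported. */
    static int pickColorFormat(int[] supportedFormats) {
        for (int format : supportedFormats) {
            if (format == COLOR_FormatYUV420Planar || format == COLOR_FormatYUV420SemiPlanar) {
                return format;
            }
        }
        return -1;
    }

    public static void main(String[] args) {
        // Stand-in for CodecCapabilities.colorFormats on a semi-planar-only encoder.
        int[] reported = {2130708361 /* COLOR_FormatSurface */, 21};
        System.out.println("chosen=" + pickColorFormat(reported));
    }
}
```

On-device you would obtain the real list via `MediaCodecInfo.getCapabilitiesForType(mimeType).colorFormats` and then fill your input buffers in whichever layout the chosen constant implies.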

    In Android 4.3 (API 18), you have two options. First, MediaCodec now accepts input from a Surface, which means anything you can draw with OpenGL ES can be recorded as a movie. See, for example, the EncodeAndMuxTest sample.
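    The Surface path can be sketched roughly as follows. This is an Android-only configuration fragment (it will not run off-device); the MIME type, resolution, bitrate, and GOP values are illustrative assumptions, and error handling is omitted:

    ```java
    // Configure an AVC encoder and obtain an input Surface (API 18+).
    MediaFormat format = MediaFormat.createVideoFormat("video/avc", 640, 480);
    format.setInteger(MediaFormat.KEY_COLOR_FORMAT,
            MediaCodecInfo.CodecCapabilities.COLOR_FormatSurface);
    format.setInteger(MediaFormat.KEY_BIT_RATE, 2_000_000);
    format.setInteger(MediaFormat.KEY_FRAME_RATE, 30);
    format.setInteger(MediaFormat.KEY_I_FRAME_INTERVAL, 5);

    MediaCodec encoder = MediaCodec.createEncoderByType("video/avc");
    encoder.configure(format, null, null, MediaCodec.CONFIGURE_FLAG_ENCODE);
    Surface inputSurface = encoder.createInputSurface(); // must be called after configure()
    encoder.start();
    // Render each frame with OpenGL ES into inputSurface, then drain the
    // encoder's output buffers into a MediaMuxer.
    ```

    Because the encoder pulls frames straight off the Surface, this path sidesteps the color-format question entirely, which is why it is the more portable option on API 18+.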

    Second, you still have the option of using software-generated YUV 420 buffers, but now they're more likely to work because there are CTS tests that exercise them. You still have to do runtime detection of planar vs. semi-planar, but there are really only two layouts. See the buffer-to-buffer variants of the EncodeDecodeTest for an example.
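The two layouts differ only in how chroma is stored after the Y plane. A minimal plain-Java sketch of filling a frame in either layout is below; it assumes BT.601 full-range conversion (some encoders expect video range) and ignores per-device stride and alignment requirements:

```java
public class YuvFiller {
    /**
     * Converts packed ARGB pixels to YUV 4:2:0 in either I420 (planar)
     * or NV12 (semi-planar) byte order. width and height must be even.
     */
    static byte[] argbToYuv420(int[] argb, int width, int height, boolean semiPlanar) {
        byte[] yuv = new byte[width * height * 3 / 2];
        int uvBase = width * height;          // chroma data starts after the Y plane
        int vBase = uvBase + uvBase / 4;      // V plane offset (planar layout only)
        for (int y = 0; y < height; y++) {
            for (int x = 0; x < width; x++) {
                int c = argb[y * width + x];
                int r = (c >> 16) & 0xff, g = (c >> 8) & 0xff, b = c & 0xff;
                // BT.601 full-range RGB -> YUV (integer approximation)
                yuv[y * width + x] = (byte) ((77 * r + 150 * g + 29 * b) >> 8);
                if ((y & 1) == 0 && (x & 1) == 0) { // one chroma sample per 2x2 block
                    int u = ((-43 * r - 85 * g + 128 * b) >> 8) + 128;
                    int v = ((128 * r - 107 * g - 21 * b) >> 8) + 128;
                    int idx = (y / 2) * (width / 2) + (x / 2);
                    if (semiPlanar) {            // NV12: interleaved U,V after Y plane
                        yuv[uvBase + 2 * idx] = (byte) u;
                        yuv[uvBase + 2 * idx + 1] = (byte) v;
                    } else {                     // I420: full U plane, then V plane
                        yuv[uvBase + idx] = (byte) u;
                        yuv[vBase + idx] = (byte) v;
                    }
                }
            }
        }
        return yuv;
    }
}
```

In a real pipeline you would pick `semiPlanar` based on the color format the codec reported, copy the result into the `ByteBuffer` returned by `dequeueInputBuffer()`, and submit it with `queueInputBuffer()` along with a presentation timestamp. A `Bitmap`'s pixels can be pulled into the `argb` array with `Bitmap.getPixels()`.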
