stagefright

Access StageFright.so directly to decode H.264 stream from the JNI layer in Android

泄露秘密 submitted on 2019-11-27 15:17:45
Question: Is there a way to access libstagefright.so directly to decode an H.264 stream from the JNI layer on Android 2.3 or above? Answer 1: If your objective is to decode an elementary H.264 stream, your code has to ensure that the stream is extracted, that the codec-specific data (primarily the SPS and PPS) is provided to the codec, and that frame data along with timestamps is fed to it. Across Android versions, the most common interface would be OMXCodec, which is an abstraction over an …
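Since the question targets Android 2.3, OMXCodec would be the native route; on Android 4.1 and later the same codec-specific-data step can be done from Java through the public MediaCodec API. A minimal sketch, assuming the SPS and PPS have already been parsed out of the stream (the byte arrays, resolution, and class name here are placeholders, not from the original question):

    import java.nio.ByteBuffer;
    import android.media.MediaCodec;
    import android.media.MediaFormat;
    import android.view.Surface;

    public class AvcDecoderSetup {
        // sps/pps are placeholder arrays; in practice they are parsed out of the
        // Annex-B stream (NAL types 7 and 8) before the decoder is configured.
        public static MediaCodec createDecoder(byte[] sps, byte[] pps, Surface surface)
                throws java.io.IOException {
            MediaFormat format = MediaFormat.createVideoFormat("video/avc", 1280, 720);
            format.setByteBuffer("csd-0", ByteBuffer.wrap(sps)); // codec-specific data: SPS
            format.setByteBuffer("csd-1", ByteBuffer.wrap(pps)); // codec-specific data: PPS
            MediaCodec decoder = MediaCodec.createDecoderByType("video/avc");
            decoder.configure(format, surface, null, 0);
            decoder.start();
            return decoder;
        }
    }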

How to create a stagefright plugin

二次信任 submitted on 2019-11-27 11:23:33
Question: I have a task which involves integrating a video decoder into Stagefright (Android's multimedia framework). I searched and found the following about creating a new plugin for Stagefright: to add support for a new format, you need to develop a new Extractor class if the container is not supported yet, develop a new Decoder class that implements the interface needed by the Stagefright core to read the data, and associate the mime type of the files to read with your new Decoder in the OMXCodec …
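The plugin itself lives in native framework code, so there is no Java equivalent of the Extractor/Decoder classes; what can be shown from the application side is a hedged sketch for verifying, after integration, that a decoder for the new mime type is actually registered. The mime string and class name here are placeholders:

    import android.media.MediaCodecInfo;
    import android.media.MediaCodecList;

    public class CodecProbe {
        // "video/x-custom" would be a placeholder mime type standing in for whatever
        // the new Decoder was registered under inside the framework.
        public static boolean hasDecoderFor(String mimeType) {
            for (int i = 0; i < MediaCodecList.getCodecCount(); i++) {
                MediaCodecInfo info = MediaCodecList.getCodecInfoAt(i);
                if (info.isEncoder()) continue;
                for (String type : info.getSupportedTypes()) {
                    if (type.equalsIgnoreCase(mimeType)) return true;
                }
            }
            return false;
        }
    }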

How to use hardware accelerated video decoding on Android?

守給你的承諾、 submitted on 2019-11-27 09:09:54
Question: I need hardware-accelerated H.264 decoding for a research project, to test a self-defined protocol. Searching the web, I found a few ways to perform hardware-accelerated video decoding on Android: use ffmpeg with libstagefright (see the overview of libstagefright), or use libstagefright in the OS directly, as described here; or use OpenMAX on a specific hardware platform, as described here for Samsung devices and here for the Qualcomm Snapdragon series. Some people mentioned PVPlayer, and some people say …
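Independent of the native options listed above, a hedged application-level sketch for locating a (likely) hardware decoder through the public MediaCodecList API follows; the "OMX.google." prefix test is only a naming convention for software codecs, not a documented guarantee:

    import android.media.MediaCodecInfo;
    import android.media.MediaCodecList;

    public class HwDecoderFinder {
        // Heuristic only: decoders whose names start with "OMX.google." are the
        // platform's software codecs; vendor (usually hardware) codecs use other
        // prefixes such as "OMX.qcom." or "OMX.Exynos.". Treat the result as a
        // best-effort guess, not a guarantee of hardware acceleration.
        public static String findHardwareDecoder(String mimeType) {
            for (int i = 0; i < MediaCodecList.getCodecCount(); i++) {
                MediaCodecInfo info = MediaCodecList.getCodecInfoAt(i);
                if (info.isEncoder()) continue;
                for (String type : info.getSupportedTypes()) {
                    if (type.equalsIgnoreCase(mimeType)
                            && !info.getName().startsWith("OMX.google.")) {
                        return info.getName(); // pass to MediaCodec.createByCodecName()
                    }
                }
            }
            return null;
        }
    }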

How to dump YUV from OMXCodec decoding output

余生颓废 submitted on 2019-11-27 07:32:47
Question: I'd like to dump YUV data from the OMXCodec decoding output, which is of type MediaBuffer. Accessing the data() pointer is not possible: if I try, a crash happens due to the check below. frameworks/av/media/libstagefright/MediaBuffer.cpp:119 CHECK(mGraphicBuffer == NULL) failed. Please let me know how to extract YUV data from this MediaBuffer. Answer 1: From the MediaBuffer, I feel that the following should be functional. I haven't tried it yet and have worked with rg2's …
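The failed CHECK means the MediaBuffer wraps a GraphicBuffer, typically because the decoder was given a native window to render into. A hedged alternative at the application level is to configure the decoder without a Surface, so that output arrives in ordinary ByteBuffers that can be written out as raw YUV. Sketch only: it assumes the decoder is already configured (with a null Surface) and started, the output path is up to the caller, and the exact pixel layout depends on the reported output color format:

    import java.io.FileOutputStream;
    import java.io.IOException;
    import java.nio.ByteBuffer;
    import android.media.MediaCodec;

    public class YuvDumper {
        // Assumes "decoder" was configured with a null Surface so that decoded
        // frames arrive as CPU-accessible ByteBuffers instead of GraphicBuffers.
        public static void drainToFile(MediaCodec decoder, FileOutputStream out)
                throws IOException {
            MediaCodec.BufferInfo info = new MediaCodec.BufferInfo();
            int index = decoder.dequeueOutputBuffer(info, 10000 /* us */);
            while (index >= 0) {
                ByteBuffer frame = decoder.getOutputBuffers()[index];
                byte[] yuv = new byte[info.size];
                frame.position(info.offset);
                frame.get(yuv);
                out.write(yuv);                             // raw YUV, one frame
                decoder.releaseOutputBuffer(index, false);  // false: do not render
                index = decoder.dequeueOutputBuffer(info, 10000);
            }
        }
    }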

How to use MediaCodec without MediaExtractor for H264

ぃ、小莉子 submitted on 2019-11-27 03:06:27
I need to use MediaCodec without the MediaExtractor, and I'm reading the file using a FileInputStream. Currently it is not working; it shows a greenish, scrambled image on the screen. This is the whole source code:

    FileInputStream in = new FileInputStream("/sdcard/sample.ts");
    String mimeType = "video/avc";
    MediaCodec decoder = MediaCodec.createDecoderByType(mimeType);
    MediaFormat format = MediaFormat.createVideoFormat(mimeType, 1920, 1080);
    byte[] header_sps = { 0, 0, 0, 1, 103, 100, 0, 40, -84, 52, -59, 1, -32, 17, 31, 120, 11, 80, 16, 16, 31, 0, 0, 3, 3, -23, 0, 0, -22, 96, -108 };
    byte[ …
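One likely cause of the scrambled image is that /sdcard/sample.ts is an MPEG-TS container, not a raw H.264 elementary stream, so the file bytes cannot be queued to the codec as-is; the stream has to be demuxed into Annex-B access units first, with one access unit per queueInputBuffer call. A hedged sketch of that feed loop, where AccessUnitSource and nextAccessUnit() are hypothetical helpers standing in for the demuxing step:

    import java.nio.ByteBuffer;
    import android.media.MediaCodec;

    public class ElementaryStreamFeeder {
        // Sketch: assumes the bytes have already been demuxed out of the .ts
        // container into a raw Annex-B H.264 elementary stream, and that
        // nextAccessUnit() returns exactly one access unit (start code + NAL
        // units for one frame), or null at end of stream.
        public static void feed(MediaCodec decoder, AccessUnitSource source) {
            long ptsUs = 0;
            byte[] au;
            while ((au = source.nextAccessUnit()) != null) {
                int index;
                do {
                    index = decoder.dequeueInputBuffer(10000 /* us */);
                } while (index < 0);                       // wait for a free input buffer
                ByteBuffer buf = decoder.getInputBuffers()[index];
                buf.clear();
                buf.put(au);
                decoder.queueInputBuffer(index, 0, au.length, ptsUs, 0);
                ptsUs += 1000000 / 30;                     // assume ~30 fps timestamps
            }
        }

        public interface AccessUnitSource {
            byte[] nextAccessUnit();                       // hypothetical helper
        }
    }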

FFmpeg on Android

心不动则不痛 submitted on 2019-11-25 22:59:37
Question: I have FFmpeg compiled (libffmpeg.so) on Android. Now I have to either build an application like RockPlayer or use the existing Android multimedia framework to invoke FFmpeg. Do you have steps / procedures / code / examples for integrating FFmpeg on Android / Stagefright? Can you please guide me on how I can use this library for multimedia playback? I have a requirement where I already have audio and video transport streams, which I need to feed to FFmpeg and get decoded / rendered. How can …
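The usual glue between an application and a self-built libffmpeg.so is a small JNI bridge. A minimal sketch of the Java side only, where the library name and the native method are placeholders that depend on how the .so was actually built:

    public class FfmpegBridge {
        static {
            // "ffmpeg" is a placeholder; load whatever .so name the NDK build
            // actually produced (e.g. a single combined libffmpeg.so).
            System.loadLibrary("ffmpeg");
        }

        // Hypothetical native entry point implemented in C via the NDK; it would
        // open the given transport stream with the FFmpeg APIs and hand decoded
        // frames back to Java (or render them natively).
        public static native int decodeTransportStream(String path);
    }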