stagefright

QOMX_COLOR_FormatYUV420PackedSemiPlanar64x32Tile2m8ka converter

烂漫一生 submitted on 2019-12-02 04:44:37
I need to handle YUV data from hardware decoder output on Android. I'm using a Nexus 4, and the decoder's output format is QOMX_COLOR_FormatYUV420PackedSemiPlanar64x32Tile2m8ka. But I need YUV420 planar data, so the output has to be converted. Could you share a conversion function, or any other way to do this? Source: https://stackoverflow.com/questions/21797923/qomx-color-formatyuv420packedsemiplanar64x32tile2m8ka-converter
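
No converter for this format ships with the SDK, so the conversion has to be done by hand. Below is a minimal sketch of it in C, based on the tile layout this Qualcomm format is known to use: the frame is stored as 64x32-byte tiles in a swizzled order (the tile_pos index function is the widely circulated one, also found in VLC's omxil qcom converter), with an NV12-style interleaved chroma plane that starts after the luma plane, rounded up to an 8 KiB tile group. The function and buffer names are mine, and the sketch assumes frame dimensions that are multiples of the tile size plus tight destination strides; a production converter must also clip partial edge tiles.

    #include <stddef.h>
    #include <stdint.h>
    #include <string.h>

    #define TILE_W 64
    #define TILE_H 32
    #define TILE_SIZE (TILE_W * TILE_H)        /* 2048 bytes per tile */
    #define TILE_GROUP_SIZE (4 * TILE_SIZE)    /* tiles are paged in 8 KiB groups */

    /* Maps a tile's (x, y) grid coordinates to its index in the tiled buffer;
     * tiles are stored in an interleaved zig-zag order, not raster order. */
    static size_t tile_pos(size_t x, size_t y, size_t w, size_t h)
    {
        size_t flim = x + (y & ~(size_t)1) * w;
        if (y & 1)
            flim += (x & ~(size_t)3) + 2;
        else if ((h & 1) == 0 || y != (h - 1))
            flim += (x + 2) & ~(size_t)3;
        return flim;
    }

    void tiled_nv12_to_yuv420p(const uint8_t *src,
                               uint8_t *dst_y, uint8_t *dst_u, uint8_t *dst_v,
                               size_t width, size_t height)
    {
        size_t tile_w        = width / TILE_W;
        size_t tile_w_align  = (tile_w + 1) & ~(size_t)1; /* rows padded to even tile counts */
        size_t tile_h_luma   = height / TILE_H;
        size_t tile_h_chroma = (height / 2 + TILE_H - 1) / TILE_H;

        /* The chroma plane starts after the luma plane, rounded up to a group. */
        size_t luma_size = tile_w_align * tile_h_luma * TILE_SIZE;
        if (luma_size % TILE_GROUP_SIZE)
            luma_size = (luma_size / TILE_GROUP_SIZE + 1) * TILE_GROUP_SIZE;

        for (size_t ty = 0; ty < tile_h_luma; ty++) {
            for (size_t tx = 0; tx < tile_w; tx++) {
                const uint8_t *sy = src +
                    tile_pos(tx, ty, tile_w_align, tile_h_luma) * TILE_SIZE;
                const uint8_t *sc = src + luma_size +
                    tile_pos(tx, ty / 2, tile_w_align, tile_h_chroma) * TILE_SIZE;
                if (ty & 1)            /* two luma tile rows share one chroma tile */
                    sc += TILE_SIZE / 2;

                /* Copy 32 luma rows of 64 bytes into the planar Y plane. */
                for (size_t r = 0; r < TILE_H; r++)
                    memcpy(dst_y + (ty * TILE_H + r) * width + tx * TILE_W,
                           sy + r * TILE_W, TILE_W);

                /* De-interleave 16 rows of packed CbCr into planar U and V. */
                for (size_t r = 0; r < TILE_H / 2; r++) {
                    const uint8_t *cc = sc + r * TILE_W;
                    size_t off = (ty * TILE_H / 2 + r) * (width / 2) + tx * TILE_W / 2;
                    for (size_t i = 0; i < TILE_W / 2; i++) {
                        dst_u[off + i] = cc[2 * i];
                        dst_v[off + i] = cc[2 * i + 1];
                    }
                }
            }
        }
    }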

How to register a OMX core for adding a new decoder

白昼怎懂夜的黑 submitted on 2019-12-01 10:47:59
I'm referring to the post "Android: How to integrate a decoder to multimedia framework". Following it, I have registered my new decoder (which is currently not supported by Android) in media_codecs.xml. Step 2 of that post requires me to perform OMX core registration. However, since I'm really new to this topic, I'm not able to follow step 2. I have working code for the decoder in C, and it is already ported to Android. So I would be grateful if anybody could provide a step-by-step guide to performing OMX core registration for a decoder that is currently not supported by Android.
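
For context on what step 2 amounts to: Stagefright's OMXMaster loads the vendor OMX core by dlopen()ing libstagefrighthw.so and resolving a createOMXPlugin symbol that returns an OMXPluginBase. A minimal skeleton is sketched below; the OMXPluginBase interface comes from frameworks/native/include/media/hardware/OMXPluginBase.h (it may differ slightly between Android releases), while the component name "OMX.mycompany.mycodec.decoder" and role "video_decoder.mycodec" are hypothetical placeholders that must match what you declared in media_codecs.xml.

    // Skeleton of a vendor OMX core, built as libstagefrighthw.so.
    #include <string.h>
    #include <media/hardware/OMXPluginBase.h>
    #include <utils/String8.h>
    #include <utils/Vector.h>

    namespace android {

    struct MyOMXPlugin : public OMXPluginBase {
        virtual OMX_ERRORTYPE makeComponentInstance(
                const char *name, const OMX_CALLBACKTYPE *callbacks,
                OMX_PTR appData, OMX_COMPONENTTYPE **component) {
            if (strcmp(name, "OMX.mycompany.mycodec.decoder")) {
                return OMX_ErrorComponentNotFound;
            }
            // TODO: allocate an OMX_COMPONENTTYPE whose function pointers
            // (SendCommand, EmptyThisBuffer, FillThisBuffer, ...) wrap the
            // ported C decoder, store it in *component, return OMX_ErrorNone.
            return OMX_ErrorInsufficientResources;
        }

        virtual OMX_ERRORTYPE destroyComponentInstance(
                OMX_COMPONENTTYPE *component) {
            // Tear down whatever makeComponentInstance() created.
            return OMX_ErrorNone;
        }

        virtual OMX_ERRORTYPE enumerateComponents(
                OMX_STRING name, size_t size, OMX_U32 index) {
            if (index > 0) return OMX_ErrorNoMore;
            strncpy(name, "OMX.mycompany.mycodec.decoder", size);
            return OMX_ErrorNone;
        }

        virtual OMX_ERRORTYPE getRolesOfComponent(
                const char *name, Vector<String8> *roles) {
            roles->clear();
            roles->push(String8("video_decoder.mycodec"));
            return OMX_ErrorNone;
        }
    };

    // OMXMaster dlopen()s libstagefrighthw.so and resolves this symbol.
    extern "C" OMXPluginBase *createOMXPlugin() {
        return new MyOMXPlugin;
    }

    }  // namespace android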

Why am I getting “Unsupported format” errors, reading H.264 encoded rtsp streams with the Android MediaPlayer?

你说的曾经没有我的故事 submitted on 2019-12-01 03:36:06
I am trying to show H.264-encoded RTSP video on an Android device. The stream is coming from a Raspberry Pi, using vlc to encode /dev/video1, which is a "Pi NoIR Camera Board": vlc-wrapper -vvv v4l2:///dev/video1 --v4l2-width $WIDTH --v4l2-height $HEIGHT --v4l2-fps ${FPS}.0 --v4l2-chroma h264 --no-audio --no-osd --sout "#rtp{sdp=rtsp://:8000/pi.sdp}" :demux=h264 > /tmp/vlc-wrapper.log 2>&1 I am using very minimal Android code right now: final MediaPlayer mediaPlayer = new MediaPlayer(); mediaPlayer.setDisplay(holder); try { mediaPlayer.setDataSource(url); mediaPlayer.prepare(); and getting a …

FFMpeg Android Stagefright SIGSEGV error (h264 decode)

只谈情不闲聊 submitted on 2019-11-29 19:57:25
I need to decode an H.264 file to YUV on Android 2.3+. As I understand it, I need to communicate with Stagefright, as it's the only way now that access via OpenMAX IL implementations has been closed off. I have used FFmpeg 0.10 (and tried 0.9/0.9.1) for this, compiled with NDK7 (and also tried NDK6b, with the same result): ffmpeg version 0.10 Copyright (c) 2000-2012 the FFmpeg developers built on Jan 28 2012 14:42:37 with gcc 4.4.3 configuration: --target-os=linux --cross-prefix=arm-linux …
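
For reference, FFmpeg of that era (0.10; removed again in FFmpeg 3.0) shipped a libstagefright_h264 wrapper decoder, enabled at configure time with --enable-libstagefright-h264, which is what actually routes decoding through Stagefright rather than FFmpeg's own software h264 decoder. A minimal sketch of selecting it explicitly with the 0.10-era API (error handling trimmed):

    // Sketch: open FFmpeg 0.10's libstagefright_h264 wrapper decoder.
    // Assumes FFmpeg was configured with --enable-libstagefright-h264;
    // a NULL lookup means the wrapper was not compiled into this build.
    extern "C" {
    #include <libavcodec/avcodec.h>
    }

    AVCodecContext *open_stagefright_h264(void) {
        avcodec_register_all();

        AVCodec *codec = avcodec_find_decoder_by_name("libstagefright_h264");
        if (!codec)
            return NULL;                 // wrapper missing from this build

        AVCodecContext *ctx = avcodec_alloc_context3(codec);
        if (avcodec_open2(ctx, codec, NULL) < 0) {
            av_free(ctx);
            return NULL;
        }
        return ctx;
    }

    // Per packet, the 0.10-era decode call is:
    //   avcodec_decode_video2(ctx, frame, &got_frame, &pkt);
    // where frame comes from avcodec_alloc_frame(); got_frame != 0 means
    // frame now holds a decoded YUV picture.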

Using MediaCodec asynchronously to decode and render a Video File

谁都会走 submitted on 2019-11-29 18:25:45
Recently I started toying around with the Android MediaCodec class to render video frames from a native C++ application. I was able to successfully decode and render both audio and video streams using the synchronous approach [queueInputBuffer and dequeueInputBuffer]. Android has a good reference example of how to do it in a native C++ application, e.g. SimplePlayer.cpp. Now I have started implementing the asynchronous approach, using callbacks and feeding the input streams to the codec in those callbacks [onInputBufferAvailable / onOutputBufferAvailable]. I was …
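
For the native side specifically: the NDK's AMediaCodec gained an asynchronous mode in API 28 via AMediaCodec_setAsyncNotifyCallback. A minimal sketch of the wiring is below; the DecoderState struct is illustrative, and filling the input buffer from your stream is left as a comment.

    // Sketch: asynchronous AMediaCodec decoding from native code (NDK, API 28+).
    #include <android/native_window.h>
    #include <media/NdkMediaCodec.h>

    struct DecoderState {
        AMediaCodec *codec;
        // extractor / stream source, EOS flags, etc.
    };

    static void onInputAvailable(AMediaCodec *codec, void *userdata, int32_t index) {
        // A free input buffer: copy the next access unit into it and queue it.
        size_t cap;
        uint8_t *buf = AMediaCodec_getInputBuffer(codec, index, &cap);
        // ... fill `buf` with up to `cap` bytes from the stream, set size/pts ...
        AMediaCodec_queueInputBuffer(codec, index, 0 /*offset*/, 0 /*size*/,
                                     0 /*ptsUs*/, 0 /*flags*/);
        (void)buf; (void)userdata;
    }

    static void onOutputAvailable(AMediaCodec *codec, void *userdata,
                                  int32_t index, AMediaCodecBufferInfo *info) {
        // Releasing with `true` renders the frame to the configured surface.
        AMediaCodec_releaseOutputBuffer(codec, index, true /*render*/);
        (void)userdata; (void)info;
    }

    static void onFormatChanged(AMediaCodec *codec, void *userdata,
                                AMediaFormat *format) { /* react to resizes */ }

    static void onError(AMediaCodec *codec, void *userdata, media_status_t err,
                        int32_t actionCode, const char *detail) { /* recover */ }

    media_status_t startAsyncDecoder(DecoderState *s, AMediaFormat *format,
                                     ANativeWindow *surface) {
        s->codec = AMediaCodec_createDecoderByType("video/avc");
        AMediaCodecOnAsyncNotifyCallback cb = {
            onInputAvailable, onOutputAvailable, onFormatChanged, onError
        };
        // Must be installed before start(); afterwards all buffer traffic
        // happens in the callbacks instead of a dequeue loop.
        AMediaCodec_setAsyncNotifyCallback(s->codec, cb, s);
        AMediaCodec_configure(s->codec, format, surface, NULL /*crypto*/, 0);
        return AMediaCodec_start(s->codec);
    }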

Access StageFright.so directly to decode H.264 stream from JNIlayer in Android

こ雲淡風輕ζ submitted on 2019-11-29 00:17:01
Is there a way to access libstagefright.so directly to decode an H.264 stream from the JNI layer on Android 2.3 or above? If your objective is to decode an elementary H.264 stream, then your code will have to ensure that the stream is extracted, that the codec-specific data (primarily the SPS and PPS) is provided to the codec, and that frame data is provided to the codec along with timestamps. Across all Android versions, the most common interface would be OMXCodec, which is an abstraction over an underlying OMX component. In Gingerbread (Android 2.3) and ICS (Android 4.0), if you would like to create …
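
To make the shape of that path concrete, here is a rough sketch of decoding through OMXCodec. Note this is a private platform API (you build against AOSP headers and link libstagefright, with no compatibility guarantees), and MyH264Source stands in for your own MediaSource implementation that hands out one access unit per read() and carries the SPS/PPS in its MetaData.

    // Sketch: decoding via the private OMXCodec API (Android 2.3-4.x).
    #include <binder/ProcessState.h>
    #include <media/stagefright/MediaBuffer.h>
    #include <media/stagefright/MediaDefs.h>
    #include <media/stagefright/MediaSource.h>
    #include <media/stagefright/MetaData.h>
    #include <media/stagefright/OMXClient.h>
    #include <media/stagefright/OMXCodec.h>

    using namespace android;

    sp<MediaSource> createDecoder(const sp<MediaSource> &source) {
        // OMX runs over Binder IPC, so the process needs a thread pool.
        ProcessState::self()->startThreadPool();

        OMXClient client;
        if (client.connect() != OK) return NULL;

        // source->getFormat() must report MEDIA_MIMETYPE_VIDEO_AVC plus the
        // width/height and codec-specific data (SPS/PPS).
        return OMXCodec::Create(
                client.interface(),
                source->getFormat(),
                false,              // createEncoder
                source,
                NULL,               // matchComponentName: let stagefright pick
                0);                 // flags
    }

    void drainDecoder(const sp<MediaSource> &decoder) {
        decoder->start();
        MediaBuffer *out;
        while (decoder->read(&out) == OK) {
            // out->data() + out->range_offset() is a decoded frame in the
            // component's native color format (often a tiled/semiplanar YUV).
            out->release();
        }
        decoder->stop();
    }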

How to create a stagefright plugin

自作多情 submitted on 2019-11-28 18:29:57
I have a task which involves integrating a video decoder into Stagefright (Android's multimedia framework). I searched and found the following about creating a new plugin for Stagefright. To add support for a new format, you need to: develop a new Extractor class, if the container is not supported yet; develop a new Decoder class that implements the interface needed by the Stagefright core to read the data; and associate the MIME type of the files to be read with your new Decoder in the OMXCodec.cpp file, in the kDecoderInfo array (sketched below). static const CodecInfo kDecoderInfo[] = { {MEDIA_MIMETYPE_AUDIO …
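
For that last step, kDecoderInfo is a simple MIME-type-to-component table that OMXCodec walks in order. The self-contained sketch below mirrors its structure; the struct here is a stand-in for the one defined in the Gingerbread-era OMXCodec.cpp, and MEDIA_MIMETYPE_VIDEO_MYCODEC, the OMX component name, and MyVideoDecoder are hypothetical placeholders for the decoder being added.

    // Sketch of the kDecoderInfo mapping in OMXCodec.cpp (Gingerbread era).
    struct CodecInfo {
        const char *mime;    // MIME type as it appears in the metadata
        const char *codec;   // OMX component name, or a software decoder class
    };

    #define MEDIA_MIMETYPE_VIDEO_MYCODEC "video/x-mycodec"   // hypothetical

    static const CodecInfo kDecoderInfo[] = {
        { "audio/mpeg", "MP3Decoder" },                      // existing entries ...
        { "video/avc",  "OMX.qcom.video.decoder.avc" },
        { "video/avc",  "AVCDecoder" },
        // New rows: preferred hardware OMX component first, software fallback after.
        { MEDIA_MIMETYPE_VIDEO_MYCODEC, "OMX.vendor.video.decoder.mycodec" },
        { MEDIA_MIMETYPE_VIDEO_MYCODEC, "MyVideoDecoder" },
    };

    // A software entry additionally needs FACTORY_CREATE(MyVideoDecoder) and a
    // FACTORY_REF(MyVideoDecoder) row in the same file, so that
    // InstantiateSoftwareCodec() knows how to construct it from a MediaSource.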

Android - Include native StageFright features in my own project

淺唱寂寞╮ submitted on 2019-11-28 16:45:24
I am currently developing an application that needs to record audio, encode it as AAC, stream it, and do the same in reverse: receive a stream, decode AAC, and play audio. I successfully recorded AAC (wrapped in an MP4 container) using MediaRecorder, and successfully up-streamed audio using the AudioRecord class. But I need to be able to encode the audio as I stream it, and none of these classes seem to help me do that. I researched a bit and found that most people who have this problem end up using a native library like ffmpeg. But I was wondering, since Android already includes …
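
For what it's worth, since API 16 the platform exposes exactly this capability through MediaCodec, and from native code through the NDK's AMediaCodec wrapper (API 21+), so ffmpeg is not required for AAC. A minimal sketch of creating the encoder follows; the 64 kbps bitrate is an arbitrary example, and the PCM feed/drain loop is only described in the trailing comment.

    // Sketch: create an AAC encoder with the NDK AMediaCodec API (API 21+),
    // to compress PCM from AudioRecord while streaming.
    #include <media/NdkMediaCodec.h>
    #include <media/NdkMediaFormat.h>

    AMediaCodec *createAacEncoder(int sampleRate, int channelCount) {
        AMediaFormat *fmt = AMediaFormat_new();
        AMediaFormat_setString(fmt, AMEDIAFORMAT_KEY_MIME, "audio/mp4a-latm");
        AMediaFormat_setInt32(fmt, AMEDIAFORMAT_KEY_SAMPLE_RATE, sampleRate);
        AMediaFormat_setInt32(fmt, AMEDIAFORMAT_KEY_CHANNEL_COUNT, channelCount);
        AMediaFormat_setInt32(fmt, AMEDIAFORMAT_KEY_BIT_RATE, 64000);
        AMediaFormat_setInt32(fmt, AMEDIAFORMAT_KEY_AAC_PROFILE, 2 /* AAC-LC */);

        AMediaCodec *codec = AMediaCodec_createEncoderByType("audio/mp4a-latm");
        AMediaCodec_configure(codec, fmt, NULL /*surface*/, NULL /*crypto*/,
                              AMEDIACODEC_CONFIGURE_FLAG_ENCODE);
        AMediaFormat_delete(fmt);
        AMediaCodec_start(codec);
        return codec;
    }

    // Feed PCM via AMediaCodec_dequeueInputBuffer()/queueInputBuffer(), and
    // pull compressed frames from AMediaCodec_dequeueOutputBuffer(); each
    // output buffer is one raw AAC frame to packetize for the stream.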

How to use hardware accelerated video decoding on Android?

风格不统一 submitted on 2019-11-28 15:21:25
I need hardware-accelerated H.264 decoding for a research project, to test a self-defined protocol. From searching the web, I have found a few ways to perform hardware-accelerated video decoding on Android: use ffmpeg with libstagefright (overview of libstagefright), or use libstagefright in the OS directly, like here; or use OpenMax on a specific hardware platform, like here for Samsung devices and here for the Qualcomm Snapdragon series. Some people mentioned PVPlayer; some people "say" libstagefright is the only way, while the Qualcomm guys have obviously succeeded. Currently I am not sure which …
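
On API 16+ the portable route is MediaCodec, which fronts the same hardware OMX codecs that libstagefright uses; from native code the NDK wrapper (API 21+) avoids private APIs entirely. A minimal synchronous H.264 decode loop is sketched below; getNextAccessUnit is a hypothetical callback that supplies one encoded frame of the self-defined protocol at a time.

    // Sketch: synchronous hardware H.264 decoding with NDK AMediaCodec.
    #include <android/native_window.h>
    #include <media/NdkMediaCodec.h>
    #include <media/NdkMediaFormat.h>

    void decodeLoop(ANativeWindow *surface, int width, int height,
                    size_t (*getNextAccessUnit)(uint8_t *dst, size_t cap,
                                                int64_t *ptsUs)) {
        AMediaFormat *fmt = AMediaFormat_new();
        AMediaFormat_setString(fmt, AMEDIAFORMAT_KEY_MIME, "video/avc");
        AMediaFormat_setInt32(fmt, AMEDIAFORMAT_KEY_WIDTH, width);
        AMediaFormat_setInt32(fmt, AMEDIAFORMAT_KEY_HEIGHT, height);

        AMediaCodec *codec = AMediaCodec_createDecoderByType("video/avc");
        AMediaCodec_configure(codec, fmt, surface, NULL /*crypto*/, 0);
        AMediaFormat_delete(fmt);
        AMediaCodec_start(codec);

        for (;;) {
            // Feed one access unit whenever an input buffer is free.
            ssize_t in = AMediaCodec_dequeueInputBuffer(codec, 10000 /*us*/);
            if (in >= 0) {
                size_t cap;
                uint8_t *buf = AMediaCodec_getInputBuffer(codec, in, &cap);
                int64_t pts = 0;
                size_t n = getNextAccessUnit(buf, cap, &pts);
                if (n == 0) break;                    // end of stream
                AMediaCodec_queueInputBuffer(codec, in, 0, n, pts, 0);
            }

            // Drain and render any decoded frames.
            AMediaCodecBufferInfo info;
            ssize_t out = AMediaCodec_dequeueOutputBuffer(codec, &info, 0);
            if (out >= 0) {
                AMediaCodec_releaseOutputBuffer(codec, out, true /*render*/);
            }
        }
        AMediaCodec_stop(codec);
        AMediaCodec_delete(codec);
    }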
