Buffering URI on OpenSL, Android


Question


I have been working with OpenSL ES for a few weeks now. I'm trying to access the buffer while playing a file on the sdcard via SL_DATALOCATOR_URI as a source. I want to write a few effects of my own, and I need the buffer for that.

Currently the code creates two audio players. One reads the file into a buffer, the other writes the buffer to the output. When I test the code with the microphone (recorder), everything is fine. Sound in/out works as expected.
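
For context, the first player (the one that decodes the URI into a buffer queue) is set up roughly like this; the sketch uses placeholder names and drops all result checks, so it only shows the shape, not the exact code from the gists:

#include <SLES/OpenSLES.h>
#include <SLES/OpenSLES_Android.h>

/* Sketch only: audio player with a URI source and an Android simple buffer
 * queue sink (decode to PCM, available since API 14). engineEngine,
 * decoderCallback and the variables below are placeholders. */
static SLObjectItf decoderObject;
static SLAndroidSimpleBufferQueueItf decoderQueue;

static void createDecoder(SLEngineItf engineEngine, const char *uri,
                          slAndroidSimpleBufferQueueCallback decoderCallback)
{
    SLDataLocator_URI loc_uri = { SL_DATALOCATOR_URI, (SLchar *) uri };
    SLDataFormat_MIME mime = { SL_DATAFORMAT_MIME, NULL, SL_CONTAINERTYPE_UNSPECIFIED };
    SLDataSource src = { &loc_uri, &mime };

    SLDataLocator_AndroidSimpleBufferQueue loc_bq =
            { SL_DATALOCATOR_ANDROIDSIMPLEBUFFERQUEUE, 2 };
    SLDataFormat_PCM pcm = { SL_DATAFORMAT_PCM, 2, SL_SAMPLINGRATE_44_1,
            SL_PCMSAMPLEFORMAT_FIXED_16, SL_PCMSAMPLEFORMAT_FIXED_16,
            SL_SPEAKER_FRONT_LEFT | SL_SPEAKER_FRONT_RIGHT, SL_BYTEORDER_LITTLEENDIAN };
    SLDataSink sink = { &loc_bq, &pcm };

    const SLInterfaceID ids[1] = { SL_IID_ANDROIDSIMPLEBUFFERQUEUE };
    const SLboolean req[1] = { SL_BOOLEAN_TRUE };

    (*engineEngine)->CreateAudioPlayer(engineEngine, &decoderObject, &src, &sink, 1, ids, req);
    (*decoderObject)->Realize(decoderObject, SL_BOOLEAN_FALSE);
    (*decoderObject)->GetInterface(decoderObject, SL_IID_ANDROIDSIMPLEBUFFERQUEUE, &decoderQueue);
    (*decoderQueue)->RegisterCallback(decoderQueue, decoderCallback, NULL);
    /* The buffers are enqueued and SetPlayState(SL_PLAYSTATE_PLAYING) is called elsewhere. */
}

The second player is the usual PCM player with an Android simple buffer queue source and an output mix sink; its callback enqueues the next chunk of the shared buffer.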

When I replace the recorder with a URI audio player, the queue goes haywire. The streaming does not respect the thread locks (it runs asynchronously, as I understand it), the buffer callbacks are not fired correctly, and seconds slip away.
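
The "thread lock" mentioned in the logs below is just a pthread mutex/condition pair, roughly like this (placeholder names, simplified sketch):

#include <pthread.h>

/* Sketch of the wait/notify pattern behind the "Wait thread lock" /
 * "Notify thread lock" log lines; a real version would also guard the
 * wait with a flag to handle spurious wakeups. */
static pthread_mutex_t lock = PTHREAD_MUTEX_INITIALIZER;
static pthread_cond_t  cond = PTHREAD_COND_INITIALIZER;

static void waitThreadLock(void)   /* blocks until a queue callback fires */
{
    pthread_mutex_lock(&lock);
    pthread_cond_wait(&cond, &lock);
    pthread_mutex_unlock(&lock);
}

static void notifyThreadLock(void) /* called from the buffer queue callback */
{
    pthread_mutex_lock(&lock);
    pthread_cond_signal(&cond);
    pthread_mutex_unlock(&lock);
}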

I've put logs in every method, so the output looks something like this:

V/PDecoder( 1292): Position : 15023
V/PDecoder( 1292): Position : 16044
V/PDecoder( 1292): Position : 17043
V/PDecoder Native PL1( 1292): bqPlayerCallback
V/PDecoder Native PL1( 1292): Notify thread lock
V/PDecoder Native PL1( 1292): android_AudioIn 32768
V/PDecoder Native PL1( 1292): Wait thread lock
V/PDecoder Native PL1( 1292): android_AudioOut 32768
V/PDecoder Native PL1( 1292): android_AudioIn 32768
V/PDecoder Native PL1( 1292): android_AudioOut 32768
V/PDecoder Native PL1( 1292): Wait thread lock
V/PDecoder Native PL1( 1292): bqRecorderCallback
V/PDecoder Native PL1( 1292): Notify thread lock
V/PDecoder( 1708): Position : 18041
V/PDecoder( 1708): Position : 19040
V/PDecoder( 1708): Position : 20038

Seconds go by before the queue callbacks are even fired.

So the question is: how can I fix this? Is there an audioplayer > buffer > output solution for URI playback? What am I doing wrong? If someone can point me in the right direction, it would be greatly appreciated.

The code is a little long to paste here, so here are the gists:

  • H file gist
  • C file gist

Answer 1:


After losing myself in the code I gave in the question, I decided to write it again, as cleanly as possible.

I found out that I was not locking the URI player after all. I'm adding the final working code at the end of the answer. The code is good for playing a local file or a URL, but it needs to run in a thread started from Java, or you'll block the GUI thread.

PS: The buffer lives on the stack, so you might want to move it to the heap and save the pointer in the struct. Also, the play, pause, and destroy methods are not finished. If you want to use the code, you can easily implement these functions.
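
For example, something along these lines (hypothetical names, not the ones in the gist):

#include <stdlib.h>

/* Sketch: keep the PCM buffer on the heap inside a player state struct
 * instead of on the stack. Names are hypothetical. */
typedef struct {
    short  *buffer;      /* heap-allocated PCM buffer */
    size_t  bufferSize;  /* capacity in samples */
} PlayerState;

static PlayerState *createPlayerState(size_t samples)
{
    PlayerState *st = calloc(1, sizeof(PlayerState));
    if (st == NULL) return NULL;
    st->buffer = malloc(samples * sizeof(short));
    st->bufferSize = samples;
    return st;
}

static void destroyPlayerState(PlayerState *st)
{
    if (st) { free(st->buffer); free(st); }
}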

Bonus: the code also includes a simple way to call Java instance methods (without the dreaded *env passed in from the Java side). If you need it, look at JNI_OnLoad, then the playStatusCallback() and callPositionChanged() methods.
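
The idea is roughly the following; the class/method names and the (I)V signature below are placeholders, so check the gist for the real ones:

#include <jni.h>

/* Sketch: cache the JavaVM in JNI_OnLoad, then call back into a Java
 * instance method from a native (audio) thread. */
static JavaVM *javaVM;
static jobject callbackObject;          /* global ref to the Java instance */
static jmethodID positionChangedMethod;

jint JNI_OnLoad(JavaVM *vm, void *reserved)
{
    javaVM = vm;
    return JNI_VERSION_1_6;
}

/* Call this once from a normal JNI entry point to cache the target object. */
static void cacheCallback(JNIEnv *env, jobject instance)
{
    callbackObject = (*env)->NewGlobalRef(env, instance);
    jclass cls = (*env)->GetObjectClass(env, instance);
    positionChangedMethod = (*env)->GetMethodID(cls, "positionChanged", "(I)V");
}

/* Callable from any native thread: attach, call, detach. */
static void callPositionChanged(int positionMs)
{
    JNIEnv *env;
    if ((*javaVM)->AttachCurrentThread(javaVM, &env, NULL) != JNI_OK) return;
    (*env)->CallVoidMethod(env, callbackObject, positionChangedMethod, positionMs);
    (*javaVM)->DetachCurrentThread(javaVM);
}

Caching the JavaVM in JNI_OnLoad is what lets the buffer queue callback thread attach itself and call back into Java without an env passed from the Java side.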

The code is a little long to paste here, so here is the gist:

  • H and C Files in a single Gist



Answer 2:


Emrah, this is precisely the problem I've been having with my project right now. I've been following this blog:

http://audioprograming.wordpress.com/2012/10/29/lock-free-audio-io-with-opensl-es-on-android/

which is the circular buffer implementation of this, from the same blog:

http://audioprograming.wordpress.com/2012/03/03/android-audio-streaming-with-opensl-es-and-the-ndk/

In any case, upon studying the code it looks like he has his own versions of your opensl-native .h and .c files, named opensl_io. He also has another class, opensl_example, that has an inbuffer and an outbuffer with a bit of simple processing in between. It seems like his recorder object fills the inbuffer of the opensl_example class, and the outbuffer feeds his audioplayer object, which plays to the sink. From the sound of it, you were doing that as well.
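
As far as I can tell, the opensl_io circular buffer boils down to something like this simplified single-reader/single-writer sketch (my own rough version, not the blog's exact code):

#include <stdlib.h>

/* Simplified single-producer / single-consumer circular buffer, in the
 * spirit of the blog's opensl_io; not the exact implementation. */
typedef struct {
    short *data;
    int    size;      /* capacity in samples */
    int    readPos;
    int    writePos;
} CircularBuffer;

/* Write up to count samples; returns how many were actually written. */
static int cbWrite(CircularBuffer *cb, const short *src, int count)
{
    int written = 0;
    while (written < count) {
        int next = (cb->writePos + 1) % cb->size;
        if (next == cb->readPos) break;          /* buffer full */
        cb->data[cb->writePos] = src[written++];
        cb->writePos = next;
    }
    return written;
}

/* Read up to count samples; returns how many were actually read. */
static int cbRead(CircularBuffer *cb, short *dst, int count)
{
    int read = 0;
    while (read < count) {
        if (cb->readPos == cb->writePos) break;  /* buffer empty */
        dst[read++] = cb->data[cb->readPos];
        cb->readPos = (cb->readPos + 1) % cb->size;
    }
    return read;
}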

Basically, I'm trying to replace the recorder object with an input stream from a file, since I need access to chunks of the buffer from the file if I want to, for example, process each chunk differently during the stream. You are using an SL_DATALOCATOR_URI data source built from the UTF-8 converted URI, which is what I'm trying to do now, but I'm not exactly sure how to get a stream from it.

Right now, the blog example takes the audio input from the recorder object as a stream, puts it into the circular buffers as they fill, and runs it through processing to the output. I'm trying to replace the source of the recorder buffers with my chunks of buffer from the mp3. Again, it sounds like your code does precisely that. The blog's example is particularly complicated for me to alter because I'm not entirely sure how SWIG works. But since you're using JNI, it might be easier.

Can you advise me on how yours works? Do you simply call StartPDecoderNative and then DecodeOn from Java with the URI string as a parameter?




Answer 3:


OK, I tried running the .c and .h code with a simple Java MainActivity that calls both those functions, in that order, on a button click.

Also, it looks like you need a positionchanged method in Java. What are you running in there? I can comment out the jmethod part and the music still plays, so that's working. Is it for seeking?

Finally, maybe I'm just having a bit of trouble understanding it, but which buffer are you doing the processing on, and where does it reside? Is it the outbuffer? If I just wanted to, say, apply an FFT, or more simply a scalar multiplication, to the output sound, would I just multiply the outbuffer by it before playing it out to the final sink?
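
In other words, is it as simple as doing something like this to the 16-bit PCM buffer right before it gets enqueued (just my guess at the shape of it)?

/* Guess: apply a scalar gain to the 16-bit PCM output buffer before enqueueing.
 * outBuffer and bufferSamples are placeholders for whatever the real code uses. */
static void applyGain(short *outBuffer, int bufferSamples, float gain)
{
    int i;
    for (i = 0; i < bufferSamples; i++) {
        float s = outBuffer[i] * gain;
        if (s > 32767.0f)  s = 32767.0f;    /* clip to the 16-bit range */
        if (s < -32768.0f) s = -32768.0f;
        outBuffer[i] = (short) s;
    }
}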



Source: https://stackoverflow.com/questions/24758121/buffering-uri-on-opensl-android
