AudioRecord and AudioTrack latency

Submitted by 老子叫甜甜 on 2019-11-30 13:53:06

Android's AudioTrack/AudioRecord classes have high latency because of their minimum buffer sizes. According to Google, those buffer sizes exist to minimize audio dropouts when garbage collection occurs (a questionable decision in my opinion; you can optimize your own memory management).

What you want to do is use OpenSL ES, which is available from Android 2.3 (API level 9). It provides native APIs for streaming audio. Here are some docs: http://mobilepearls.com/labs/native-android-api/opensles/index.html

Just a thought, but shouldn't you be reading less than mBufferSize?

My first instinct was to suggest initializing AudioTrack in static mode rather than streaming mode, since static mode has notably smaller latency. However, static mode is more appropriate for short sounds that fit entirely in memory than for sound you are capturing from elsewhere. But just as a wild guess: what if you set AudioTrack to static mode and feed it discrete chunks of your input audio?
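A minimal sketch of the static-mode idea, for illustration only. All names here (the chunk buffer, its size, the format parameters) are hypothetical assumptions, not taken from the question:

```java
// Hypothetical sketch: play one captured chunk via a static-mode AudioTrack.
// Assumes `chunk` holds 16-bit mono PCM at 44100 Hz, filled elsewhere by AudioRecord.
short[] chunk = new short[4410]; // e.g. ~100 ms of captured audio (illustrative size)

AudioTrack track = new AudioTrack(
        AudioManager.STREAM_MUSIC,
        44100,                          // sample rate in Hz
        AudioFormat.CHANNEL_OUT_MONO,
        AudioFormat.ENCODING_PCM_16BIT,
        chunk.length * 2,               // buffer size in bytes (2 bytes per sample)
        AudioTrack.MODE_STATIC);        // static mode: the whole sound lives in memory

track.write(chunk, 0, chunk.length);    // in static mode, write the data before play()
track.play();
// To play the next chunk you must stop, reload, and call play() again,
// which is why static mode suits short one-shot sounds rather than a live stream.
```

This illustrates why the suggestion is a long shot: the reload-per-chunk cycle works against continuous capture.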

If you want tighter control over audio, I'd recommend taking a look at OpenSL ES for Android. The learning curve will be a bit steeper, but you get much more fine-grained control and lower latency.

As mSparks pointed out, streaming should be made using smaller read size: you don't need to read the full buffer to stream data!

// Read a small chunk instead of waiting for the full buffer
int read = mRecorder.read(mBuffer, 0, 256); // or any other small chunk size
if (read > 0) {
    mPlayer.write(mBuffer, 0, read);
}

This will drastically reduce your latency. If the sample rate is 44100 Hz and you are in MONO configuration, then with 256-sample reads your latency will be no less than 1000 * 256 / 44100 ≈ 5.8 ms. (256/44100 converts samples to seconds; multiplying by 1000 gives milliseconds.) The remaining problem is the internal implementation of the player, which you have no control over from Java. Hope this helps someone :)
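The latency arithmetic above can be sketched as a small helper (plain Java, no Android dependencies; the method name is my own):

```java
public class LatencyCalc {
    // Minimum per-read latency in milliseconds for a mono PCM stream:
    // (samples per read / sample rate in Hz) * 1000
    static double chunkLatencyMs(int samplesPerRead, int sampleRateHz) {
        return 1000.0 * samplesPerRead / sampleRateHz;
    }

    public static void main(String[] args) {
        // 256-sample reads at 44100 Hz, mono
        System.out.printf("%.1f ms%n", chunkLatencyMs(256, 44100)); // prints "5.8 ms"
    }
}
```

Halving the read size halves this floor, at the cost of more read/write calls per second.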
