android, AudioRecord.read() --> bufferoverflow, how to handle the buffer?

Submitted by 元气小坏坏 on 2019-12-11 07:06:27

Question


For a university project my professor wants me to write an Android application; it would be my first one. I have some Java experience, but I am new to Android programming, so please be gentle with me.

First I create an Activity with only two buttons: one starts an AsyncTask and one stops it (stopping just sets the boolean "isRecording" to false). Everything else is handled in the AsyncTask, whose source code is attached below.

It runs quite okay, but after a while I see buffer overflow messages in LogCat, and after that it crashes with an uncaught exception. I have already figured out why it crashes, so the uncaught exception is not the point of this question.

03-07 11:34:02.474: INFO/buffer 247:(558): 40
03-07 11:34:02.484: WARN/AudioFlinger(33): RecordThread: buffer overflow
03-07 11:34:02.484: INFO/MutantAudioRecorder:doInBackground()(558): isRecoding
03-07 11:34:02.484: INFO/MutantAudioRecorder:doInBackground()(558): isRecoding
03-07 11:34:02.494: WARN/AudioFlinger(33): RecordThread: buffer overflow
03-07 11:34:02.494: INFO/buffer 248:(558): -50
  1. I log the buffer contents as you can see, but I suspect I made a mistake configuring the AudioRecord. Can anybody tell me why I get the buffer overflow?

  2. The next question would be: how should I handle the buffer? I have the sample values inside it and want to display them as a graphical spectrogram on the screen. Does anyone have experience with this and can give me a hint on how to go on?

Thanks in advance for your help.

Source code of the AsyncTask:

package nomihodai.audio;

import android.media.AudioFormat;
import android.media.AudioRecord;
import android.os.AsyncTask;
import android.util.Log;



public class MutantAudioRecorder extends AsyncTask<Void, Void, Void> {

    private boolean isRecording = false;
    public AudioRecord audioRecord = null;
    public int mSamplesRead;
    public int buffersizebytes;
    public int buflen;
    public int channelConfiguration = AudioFormat.CHANNEL_CONFIGURATION_MONO;
    public int audioEncoding = AudioFormat.ENCODING_PCM_16BIT;
    public static short[] buffer;
    public static final int SAMPLESPERSEC = 8000;


    @Override
    protected Void doInBackground(Void... params) {

        while(isRecording) {

            audioRecord.startRecording();
            mSamplesRead = audioRecord.read(buffer, 0, buffersizebytes);

            if(!readerT.isAlive())
                readerT.start();

            Log.i("MutantAudioRecorder:doInBackground()", "isRecoding");
        }

        readerT.stop();

        return null;
    }


    Thread readerT = new Thread() {
        public void run() {
            for(int i = 0; i < 256; i++) {
                Log.i("buffer " + i + ": ", Short.toString(buffer[i]));
            }
        }
    };


    @Override
    public void onPostExecute(Void unused) {
        Log.i("MutantAudioRecorder:onPostExecute()", "try to release the audio hardware");

        audioRecord.release();

        Log.i("MutantAudioRecorder:onPostExecute()", "released...");
    }


    public void setRecording(boolean rec) {
        this.isRecording = rec;

        Log.i("MutantAudioRecorder:setRecording()", "isRecoding set to " + rec);
    }


    @Override
    protected void onPreExecute() {

        buffersizebytes = AudioRecord.getMinBufferSize(SAMPLESPERSEC, channelConfiguration, audioEncoding);
        buffer = new short[buffersizebytes];
        buflen = buffersizebytes/2;

        Log.i("MutantAudioRecorder:onPreExecute()", "buffersizebytes: " + buffersizebytes
                                                    + ", buffer: " + buffer.length
                                                    + ", buflen: " + buflen);

        audioRecord = new AudioRecord(android.media.MediaRecorder.AudioSource.MIC,
                SAMPLESPERSEC,
                channelConfiguration,
                audioEncoding,
                buffersizebytes);

        if(audioRecord != null)
            Log.i("MutantAudioRecorder:onPreExecute()", "audiorecord object created");
        else
            Log.i("MutantAudioRecorder:onPreExecute()", "audiorecord NOT created");
    }

}


Answer 1:


Are you running some live analysis on the recorded audio bytes?

Since the recording buffer has a limited size, once your analysis runs slower than the recording rate, data piles up in the buffer; the recorded bytes keep arriving, and the buffer overflows.

Try using one thread for recording and another for processing the recorded bytes. There is open-source sample code for this approach: http://musicg.googlecode.com/files/musicg_android_demo.zip
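The split the answer suggests can be sketched with a producer/consumer queue: one thread fills audio buffers (here a stub stands in for `AudioRecord.read()`, since this sketch runs off-device), another drains and processes them, so a slow consumer never stalls the capture loop. The class and method names below are illustrative, not part of any Android API.

```java
import java.util.concurrent.ArrayBlockingQueue;
import java.util.concurrent.BlockingQueue;

public class RecorderPipeline {
    static final int BUF_SHORTS = 160; // 20 ms of mono 8 kHz audio

    // Moves 'chunks' buffers from a producer thread to a consumer thread
    // and returns how many buffers the consumer processed.
    public static int process(int chunks) {
        // Bounded queue: the producer blocks only when the consumer is far behind.
        BlockingQueue<short[]> queue = new ArrayBlockingQueue<>(32);
        final int[] consumed = {0};

        Thread producer = new Thread(() -> {
            for (int c = 0; c < chunks; c++) {
                short[] buf = new short[BUF_SHORTS]; // AudioRecord.read() would fill this
                try {
                    queue.put(buf);
                } catch (InterruptedException e) {
                    return;
                }
            }
        });

        Thread consumer = new Thread(() -> {
            for (int c = 0; c < chunks; c++) {
                try {
                    queue.take(); // analyze/draw the samples here
                    consumed[0]++;
                } catch (InterruptedException e) {
                    return;
                }
            }
        });

        producer.start();
        consumer.start();
        try {
            producer.join();
            consumer.join();
        } catch (InterruptedException e) {
            Thread.currentThread().interrupt();
        }
        return consumed[0];
    }

    public static void main(String[] args) {
        System.out.println("chunks consumed: " + process(100));
    }
}
```

On Android, the producer would be the thread calling `audioRecord.read()` in a loop; the consumer would be whatever analysis or drawing you do with the samples.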




Answer 2:


As we discussed in the chat room, decoding the audio data and displaying it on the screen should be straightforward. You mentioned that the audio buffer holds 8000 samples per second, each sample is 16 bits, and the audio is mono.

Displaying this should be straightforward. Treat each sample as a vertical offset in your view. You need to scale the range -32k to +32k to the vertical height of your view. Starting at the left edge of the view, draw one sample per column. When you reach the right edge, wrap around again (erasing the previous line as necessary).
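The -32k to +32k scaling described above can be sketched as a small mapping function (`sampleToY` is a hypothetical helper, not an Android API):

```java
public class WaveformMap {
    // Map a 16-bit PCM sample (-32768..32767) to a row index 0..height-1,
    // so that silence (0) lands near the vertical middle of the view.
    public static int sampleToY(short sample, int height) {
        int shifted = sample + 32768;                     // shift into 0..65535
        return (int) ((long) shifted * (height - 1) / 65535); // scale into 0..height-1
    }

    public static void main(String[] args) {
        // For a 480-pixel-tall view: min sample -> row 0, max sample -> row 479.
        System.out.println(sampleToY((short) 0, 480));
    }
}
```

In the drawing loop you would call this once per sample, painting one point (or line segment) per column as you sweep left to right.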

This will end up drawing each sample as a single pixel, which may not look very nice. You can also draw a line between adjacent samples. You can play around with line widths, colors and so on to get the best effect.

One last note: You'll be drawing 8000 times per second, plus more to blank out the previous samples. You may need to take some shortcuts to make sure the framerate can keep up with the audio. You may need to skip samples.
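The sample-skipping shortcut mentioned above amounts to simple decimation: keep every step-th sample so the draw loop has fewer points to paint. A minimal sketch (the `Decimate` class is illustrative; note that skipping without low-pass filtering can alias, which is usually acceptable for a rough visual display):

```java
public class Decimate {
    // Keep every 'step'-th sample so drawing can keep up with 8 kHz input.
    public static short[] decimate(short[] samples, int step) {
        short[] out = new short[(samples.length + step - 1) / step];
        for (int i = 0, j = 0; i < samples.length; i += step, j++) {
            out[j] = samples[i];
        }
        return out;
    }

    public static void main(String[] args) {
        // One second of 8 kHz audio, keeping every 10th sample -> 800 columns.
        System.out.println(decimate(new short[8000], 10).length);
    }
}
```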



Source: https://stackoverflow.com/questions/5218856/android-audiorecord-read-bufferoverflow-how-to-handle-the-buffer
