Question
I'm new to Android programming and audio visualization. I want to create a simple audio visualizer using the MediaPlayer and Visualizer classes. My problem is that I don't know what waveform data really is. Must I use it to visualize audio?
I'm using the code below. Its problem is that it only visualizes audio for the first 10-12 seconds of the file; after that, I'm unable to capture any more data. Where did I go wrong?
public void attachVisualizer() {
    Visualizer vis = new Visualizer(mPlayer.getAudioSessionId());
    vis.setCaptureSize(Visualizer.getCaptureSizeRange()[0]);
    vis.setDataCaptureListener(new Visualizer.OnDataCaptureListener() {
        public void onWaveFormDataCapture(Visualizer visualizer, byte[] bytes, int samplingRate) {
            int sum = 0;
            for (int i = 0; i < bytes.length; i++) {
                sum += bytes[i];
            }
            if (sum > 8000) {
                // Do something which uses mPlayer.getCurrentPosition() in mathematics
            }
        }

        public void onFftDataCapture(Visualizer visualizer, byte[] fft, int samplingRate) {}
    }, Visualizer.getMaxCaptureRate(), true, false);
    vis.setEnabled(true);
}
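For reference, `onWaveFormDataCapture` delivers unsigned 8-bit PCM samples packed into Java's signed `byte` type (silence sits around the unsigned value 128), so summing the raw signed bytes, as in the loop above, mixes positive and negative values. A minimal plain-Java sketch of recovering the unsigned amplitude instead (the class and method names are illustrative, not part of the Android API):

```java
public class WaveformMath {
    // Convert Visualizer waveform bytes (unsigned 8-bit PCM stored in
    // signed bytes) into a mean amplitude around the 128 midpoint.
    public static double meanAmplitude(byte[] waveform) {
        long sum = 0;
        for (byte b : waveform) {
            int unsigned = b & 0xff;         // recover 0..255
            sum += Math.abs(unsigned - 128); // distance from silence
        }
        return (double) sum / waveform.length;
    }

    public static void main(String[] args) {
        // Silence: all samples at the 128 midpoint -> amplitude 0.
        byte[] silence = {(byte) 128, (byte) 128, (byte) 128};
        System.out.println(meanAmplitude(silence)); // prints 0.0

        // A loud buffer swinging between the 0 and 255 extremes.
        byte[] loud = {0, (byte) 255};
        System.out.println(meanAmplitude(loud)); // prints 127.5
    }
}
```

Thresholding on an amplitude like this tends to behave more consistently than a raw signed sum, since the signed bytes can cancel each other out.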
EDIT
Another question on my mind: how do I measure the length of time contained in a given audio segment?
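One way to approach the segment-length question: MediaPlayer reports positions in milliseconds via `getCurrentPosition()` (and the whole track's length via `getDuration()`), so a segment's length is just the difference of two position snapshots. A plain-Java sketch of that arithmetic (the helper class is illustrative):

```java
public class SegmentTimer {
    // Given start/end positions in milliseconds (as returned by
    // MediaPlayer.getCurrentPosition()), format the segment length
    // as minutes:seconds.millis.
    public static String segmentLength(int startMs, int endMs) {
        int lengthMs = endMs - startMs;
        int seconds = lengthMs / 1000;
        return String.format("%d:%02d.%03d",
                seconds / 60, seconds % 60, lengthMs % 1000);
    }

    public static void main(String[] args) {
        // A segment running from 12.5 s to 95.25 s into the track.
        System.out.println(segmentLength(12500, 95250)); // prints 1:22.750
    }
}
```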
Answer 1:
I do this:
visualizer = new Visualizer(0);
visualizer.setEnabled(false);
visualizer.setCaptureSize(Visualizer.getCaptureSizeRange()[0]);
visualizer.setDataCaptureListener(
        new Visualizer.OnDataCaptureListener() {
            public void onWaveFormDataCapture(Visualizer visualizer,
                    byte[] bytes, int samplingRate) {
                eqview.setVSWaveForm(bytes);
            }

            public void onFftDataCapture(Visualizer visualizer,
                    byte[] bytes, int samplingRate) {
                fftview.setVSFftData(bytes);
            }
        }, Visualizer.getMaxCaptureRate(), true, true);
visualizer.setEnabled(true);
For the visualizer view, I found this code online; I did not write it:
package app.util;

import android.content.Context;
import android.graphics.Canvas;
import android.graphics.Color;
import android.graphics.Paint;
import android.graphics.Rect;
import android.util.AttributeSet;
import android.view.View;

/***
 * @author yokmama
 */
public class VisualizerView extends View {
    private byte[] mBytes;
    private float[] mPoints;
    private Rect mRect = new Rect();
    //SharedPreferences prefs;
    private Paint mForePaint = new Paint();

    public VisualizerView(Context context, AttributeSet attrs) {
        super(context, attrs);
        //prefs = PreferenceManager.getDefaultSharedPreferences(context);
        init();
    }

    private void init() {
        mBytes = null;
        //int colorchosen = prefs.getInt("COLOR_PREFERENCE_KEY", Color.WHITE);
        mForePaint.setStrokeWidth(1);
        //mForePaint.setAntiAlias(true);
        mForePaint.setColor(Color.WHITE);
        //mForePaint.setMaskFilter(new BlurMaskFilter(1, Blur.INNER));
    }

    // Call this with each buffer from onWaveFormDataCapture; it stores
    // the bytes and schedules a redraw.
    public void updateVisualizer(byte[] bytes) {
        mBytes = bytes;
        invalidate();
    }

    @Override
    protected void onDraw(Canvas canvas) {
        super.onDraw(canvas);
        if (mBytes == null) {
            return;
        }
        // Four floats (x0, y0, x1, y1) per line segment between samples.
        if (mPoints == null || mPoints.length < mBytes.length * 4) {
            mPoints = new float[mBytes.length * 4];
        }
        mRect.set(0, 0, getWidth(), getHeight());
        for (int i = 0; i < mBytes.length - 1; i++) {
            // x spreads the samples evenly across the view width;
            // y re-centers each unsigned 8-bit sample around the midline.
            mPoints[i * 4] = mRect.width() * i / (mBytes.length - 1);
            mPoints[i * 4 + 1] = mRect.height() / 2
                    + ((byte) (mBytes[i] + 128)) * (mRect.height() / 2) / 128;
            mPoints[i * 4 + 2] = mRect.width() * (i + 1) / (mBytes.length - 1);
            mPoints[i * 4 + 3] = mRect.height() / 2
                    + ((byte) (mBytes[i + 1] + 128)) * (mRect.height() / 2) / 128;
        }
        canvas.drawLines(mPoints, mForePaint);
        //canvas.drawPoints(mPoints, mForePaint);
    }
}
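The key trick in the `onDraw` loop is the y-coordinate arithmetic: `(byte) (mBytes[i] + 128)` re-centers the unsigned 8-bit sample around zero (silence, unsigned 128, becomes 0; the extremes become -128 and +127), and that centered value is then scaled to half the view height. A plain-Java sketch of just that mapping for a single sample (the view height is an arbitrary example value):

```java
public class WaveformMapping {
    // Reproduces the y-coordinate arithmetic from VisualizerView.onDraw
    // for one waveform byte, given a view height in pixels.
    public static int sampleToY(byte sample, int height) {
        // (byte) (sample + 128) re-centers the unsigned 8-bit sample:
        // silence (unsigned 128) maps to 0, extremes to -128 / +127.
        int centered = (byte) (sample + 128);
        return height / 2 + centered * (height / 2) / 128;
    }

    public static void main(String[] args) {
        int height = 256;
        // Silence ((byte) 128, i.e. unsigned 128) lands on the midline.
        System.out.println(sampleToY((byte) 128, height)); // prints 128
        // The minimum sample (unsigned 0) lands at the top of the view.
        System.out.println(sampleToY((byte) 0, height)); // prints 0
        // The maximum sample (unsigned 255) lands at the bottom.
        System.out.println(sampleToY((byte) 255, height)); // prints 255
    }
}
```

To wire this view up to the answer's listener, the natural call is `updateVisualizer(bytes)` from inside `onWaveFormDataCapture` (the answer's `eqview.setVSWaveForm` presumably plays the same role in the author's own view class).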
Source: https://stackoverflow.com/questions/11081356/how-to-work-with-onwaveformdatacapture-result-of-visualizer-object