Trying to understand buffers with regard to NAudio in C#


Disclaimer: I don't have that much experience with NAudio.


It kind of depends on what you want to do with the audio data.

If you simply want to store or dump the data (be it to a file or just to the console), then you don't need a BufferedWaveProvider. Just do whatever you want to do directly in the event handler sourceStream_DataAvailable(). But keep in mind that you receive the data as raw bytes; how many bytes actually constitute a single frame (a.k.a. sample) of the recorded audio depends on the wave format:

var bytesPerFrame = sourceStream.WaveFormat.BitsPerSample / 8
                  * sourceStream.WaveFormat.Channels;

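For illustration, here is a minimal sketch of that "just dump it" route, writing the raw bytes straight to a WAV file with NAudio's WaveFileWriter. The waveWriter field and the file name are assumptions for this example, not part of the original code:

NAudio.Wave.WaveFileWriter waveWriter;

// Create the writer once the wave format is known, e.g. in startRecording():
// waveWriter = new NAudio.Wave.WaveFileWriter("recording.wav", sourceStream.WaveFormat);

void sourceStream_DataAvailable(object sender, NAudio.Wave.WaveInEventArgs e)
{
    // e.Buffer holds the raw bytes; only the first e.BytesRecorded bytes are valid.
    waveWriter.Write(e.Buffer, 0, e.BytesRecorded);
}
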
If you want to analyze the data (Fourier analysis with an FFT, for instance), then I suggest using NAudio's ISampleProvider. This interface hides all the raw-byte and bit-depth handling and lets you access the data frame by frame in a straightforward manner.

First create an ISampleProvider from your BufferedWaveProvider like so:

var samples = waveBuffer.ToSampleProvider();

You can then read sample frames with the Read() method. Make sure to check whether data is actually available first, using the BufferedBytes property of your BufferedWaveProvider:

while (true)
{
    // Number of complete frames currently sitting in the buffer.
    var bufferedFrames = waveBuffer.BufferedBytes / bytesPerFrame;

    if (bufferedFrames < 1)
        continue;

    var frames = new float[bufferedFrames];
    samples.Read(frames, 0, bufferedFrames);

    DoSomethingWith(frames);
}
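
DoSomethingWith() above is just a placeholder. As one hypothetical example, it could compute the RMS level of each block of frames (the ISampleProvider delivers samples as floats in the range -1.0 to 1.0):

static void DoSomethingWith(float[] frames)
{
    // Root-mean-square level of this block of samples.
    double sumOfSquares = 0;
    foreach (var frame in frames)
        sumOfSquares += frame * frame;

    var rms = Math.Sqrt(sumOfSquares / frames.Length);
    Console.WriteLine($"RMS level: {rms:F4}");
}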

Because you want to do two things at once -- recording and analyzing audio data concurrently -- you should use two separate threads for this.
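
A minimal sketch of that split, reusing the waveBuffer, samples, and bytesPerFrame variables from above (the cancellation handling is just illustrative):

var cts = new System.Threading.CancellationTokenSource();

// NAudio keeps raising DataAvailable on its own recording thread.
sourceStream.StartRecording();

// The analysis loop runs concurrently on a thread-pool thread.
var analysisTask = System.Threading.Tasks.Task.Run(() =>
{
    while (!cts.Token.IsCancellationRequested)
    {
        var bufferedFrames = waveBuffer.BufferedBytes / bytesPerFrame;
        if (bufferedFrames < 1)
        {
            System.Threading.Thread.Sleep(10); // avoid spinning at full CPU
            continue;
        }

        var frames = new float[bufferedFrames];
        samples.Read(frames, 0, bufferedFrames);
        DoSomethingWith(frames);
    }
});

// To stop later: cts.Cancel(); sourceStream.StopRecording();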

There is a small GitHub project that uses NAudio for DTMF analysis of recorded audio data. You might want to have a look at it to get some ideas on how to bring it all together. The file DtmfDetector\Program.cs there is a good starting point.


For a quick start that should give you "more coherent" output, try the following:

Add this field to your class:

ISampleProvider samples;

Add this line to your method startRecording():

samples = waveBuffer.ToSampleProvider();

Extend sourceStream_DataAvailable() like so:

void sourceStream_DataAvailable(object sender, NAudio.Wave.WaveInEventArgs e)
{
    waveBuffer.AddSamples(e.Buffer, 0, e.BytesRecorded);
    waveBuffer.DiscardOnBufferOverflow = true;

    // A frame covers one sample per channel, so its size depends on the wave format.
    var bytesPerFrame = sourceStream.WaveFormat.BitsPerSample / 8
                      * sourceStream.WaveFormat.Channels;
    var bufferedFrames = waveBuffer.BufferedBytes / bytesPerFrame;

    // Read the buffered data as float samples and dump them for inspection.
    var frames = new float[bufferedFrames];
    samples.Read(frames, 0, bufferedFrames);

    foreach (var frame in frames)
        Debug.WriteLine(frame);
}