NAudio

How to play an MP3 stream in C#

Question: I want to play an MP3 stream in my C# application. I have a server application that captures wave audio, converts it into MP3, and writes it to a network stream. The client then reads this stream to play the MP3. I have tried NAudio with the following code example, but it results in an exception: using (WaveStream blockAlignedStream = new BlockAlignReductionStream( WaveFormatConversionStream.CreatePcmStream( new Mp3FileReader(ms)))) { using (WaveOut waveOut = new WaveOut(WaveCallbackInfo
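Below is a minimal sketch of one common way to play MP3 data with NAudio once it has been buffered into a MemoryStream (here called ms, as in the question); Mp3StreamPlayer is an illustrative name. For a live network stream the usual pattern is instead to decode Mp3Frame objects as they arrive and feed the PCM into a BufferedWaveProvider.

using System.IO;
using System.Threading;
using NAudio.Wave;

class Mp3StreamPlayer
{
    // Plays MP3 data that is already fully buffered in memory.
    public static void Play(MemoryStream ms)
    {
        ms.Position = 0;
        using (var mp3 = new Mp3FileReader(ms))
        using (var pcm = WaveFormatConversionStream.CreatePcmStream(mp3))
        using (var aligned = new BlockAlignReductionStream(pcm))
        using (var waveOut = new WaveOutEvent())
        {
            waveOut.Init(aligned);
            waveOut.Play();
            while (waveOut.PlaybackState == PlaybackState.Playing)
            {
                Thread.Sleep(100); // let playback finish
            }
        }
    }
}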

How to calculate FFT using NAudio in realtime (ASIO out)

Question: I am programming a Guitar (violin) Hero clone as a final project for this school year. The idea is to take input from my electric violin, analyse it via FFT, do some logic and drawing, and output it through the speakers, perhaps with some steps running in parallel threads. I already have ASIO low-latency input/output implemented, but I am having great trouble implementing a realtime FFT. This is the code that sets up asioOut along with the sampleAggregator. The sample aggregator should store samples that are added each
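Since the aggregator code is cut off above, here is a minimal sketch of the usual approach, assuming mono float samples arrive one at a time from the ASIO input callback; the SampleAggregator class and FftCalculated event are illustrative names, while FastFourierTransform and Complex come from NAudio.Dsp.

using System;
using NAudio.Dsp; // FastFourierTransform, Complex

class SampleAggregator
{
    private readonly Complex[] fftBuffer;
    private readonly int fftLength; // must be a power of two, e.g. 1024
    private int fftPos;

    public event Action<Complex[]> FftCalculated;

    public SampleAggregator(int fftLength = 1024)
    {
        this.fftLength = fftLength;
        fftBuffer = new Complex[fftLength];
    }

    // Call for every incoming sample, e.g. from the AudioAvailable handler.
    public void Add(float sample)
    {
        fftBuffer[fftPos].X = (float)(sample * FastFourierTransform.HammingWindow(fftPos, fftLength));
        fftBuffer[fftPos].Y = 0f;
        fftPos++;
        if (fftPos >= fftLength)
        {
            fftPos = 0;
            // In-place FFT; the second argument is log2 of the buffer length.
            FastFourierTransform.FFT(true, (int)Math.Log(fftLength, 2.0), fftBuffer);
            FftCalculated?.Invoke(fftBuffer);
        }
    }
}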

How can I use LAME to encode a WAV to an MP3 in C#

Question: I am currently using NAudio to capture the sound, and it only creates a WAV file. I am looking for a way to encode it to an MP3 before saving the file. I found LAME, but whenever I try to add the lame_enc.dll file it says "A reference could not be added. Please make sure the file is accessible, and that it is a valid assembly or COM component". Any help would be appreciated. Answer 1: Easiest way in .NET 4.0: use the Visual Studio NuGet Package Manager Console: Install-Package NAudio.Lame. Code snippet:
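The snippet itself is cut off above, so here is a minimal sketch of the usual NAudio.Lame pattern; WavToMp3 is an illustrative helper name, while WaveFileReader, LameMP3FileWriter and LAMEPreset come from the NAudio and NAudio.Lame packages.

using NAudio.Wave;
using NAudio.Lame;

static class Mp3Encoder
{
    // Re-encodes an existing WAV file as MP3 using the LAME "standard" VBR preset.
    public static void WavToMp3(string wavPath, string mp3Path)
    {
        using (var reader = new WaveFileReader(wavPath))
        using (var writer = new LameMP3FileWriter(mp3Path, reader.WaveFormat, LAMEPreset.STANDARD))
        {
            reader.CopyTo(writer);
        }
    }
}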

WaveMixerStream32 and IWaveProvider

Question: Is there any way with NAudio to link a WaveMixerStream32 with WaveProviders rather than WaveStreams? I am streaming multiple network streams using a BufferedWaveProvider. There doesn't seem to be an easy way to convert it into a WaveStream. Cheers! Luke Answer 1: It's fairly simple to convert an IWaveProvider to a WaveStream. An IWaveProvider is just a simplified WaveStream that doesn't support repositioning and has an unknown length. You can create an adapter like this: public class
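The adapter in the answer is cut off above; the following is a minimal sketch of the idea, with WaveProviderToWaveStream as an illustrative name. Length and Position are faked because the underlying provider cannot report a length or be repositioned.

using System;
using NAudio.Wave;

public class WaveProviderToWaveStream : WaveStream
{
    private readonly IWaveProvider source;
    private long position;

    public WaveProviderToWaveStream(IWaveProvider source)
    {
        this.source = source;
    }

    public override WaveFormat WaveFormat => source.WaveFormat;

    // The real length is unknown, so report an arbitrary large value.
    public override long Length => int.MaxValue;

    public override long Position
    {
        get => position;
        set => throw new NotSupportedException("Repositioning is not supported.");
    }

    public override int Read(byte[] buffer, int offset, int count)
    {
        int read = source.Read(buffer, offset, count);
        position += read;
        return read;
    }
}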

How to get Exact Time of a MIDI event

Question: I'm trying to read a MIDI file and I want to determine the exact time of a NoteOn event from it in C#. I tried to use absolute time, but the output was something like 256632. What is this number? This is the line of my code that returns the time: (note as NoteOnEvent).AbsoluteTime Answer 1: A MIDI file only contains incremental times, included as a variable-length value of between 1 and 4 bytes before each MIDI event. The library you are using is being helpful in providing you with the AbsoluteTime
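AbsoluteTime is expressed in MIDI ticks, so converting it to seconds needs the file's ticks-per-quarter-note and the tempo. A minimal sketch of that conversion with NAudio.Midi follows, assuming a single tempo for the whole file (a file with tempo changes would need the time accumulated segment by segment); NoteOnTimeInSeconds is an illustrative name.

using System;
using System.Linq;
using NAudio.Midi;

static class MidiTiming
{
    // Returns the time in seconds of the first NoteOn event on track 0.
    public static double NoteOnTimeInSeconds(string midiPath)
    {
        var midi = new MidiFile(midiPath, false);
        int ticksPerQuarter = midi.DeltaTicksPerQuarterNote;

        // The default MIDI tempo is 500000 microseconds per quarter note (120 BPM).
        var tempo = midi.Events[0].OfType<TempoEvent>().FirstOrDefault();
        double microsecondsPerQuarter = tempo?.MicrosecondsPerQuarterNote ?? 500000;

        var firstNoteOn = midi.Events[0].OfType<NoteOnEvent>().First();
        return firstNoteOn.AbsoluteTime * microsecondsPerQuarter / ticksPerQuarter / 1000000.0;
    }
}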

Playing byte[] using NAudio

Question: How can I convert an audio file to byte[] and play it using NAudio? (Code will be appreciated.) Also, related to the question above, how can I play an audio file stored as a resource using NAudio? For the second question I have this code: IWavePlayer waveOutDevice; IWaveProvider provider; public void PlaySound(byte[] sound) { waveOutDevice = new WaveOutEvent(); provider = new RawSourceWaveStream(new MemoryStream(sound), new WaveFormat()); if (waveOutDevice != null) waveOutDevice.Stop(); waveOutDevice.Init(provider)
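A minimal sketch of one way to do this, assuming the byte array contains a complete WAV file (for example read with File.ReadAllBytes or copied out of an embedded resource stream); BytePlayer is an illustrative name. For raw PCM bytes, RawSourceWaveStream with an explicit WaveFormat matching the data would be used instead.

using System.IO;
using NAudio.Wave;

class BytePlayer
{
    private IWavePlayer waveOutDevice;
    private WaveStream reader;

    // Plays a byte[] that holds a complete WAV file.
    public void PlaySound(byte[] sound)
    {
        Stop();
        reader = new WaveFileReader(new MemoryStream(sound));
        waveOutDevice = new WaveOutEvent();
        waveOutDevice.Init(reader);
        waveOutDevice.Play();
    }

    public void Stop()
    {
        waveOutDevice?.Stop();
        waveOutDevice?.Dispose();
        reader?.Dispose();
        waveOutDevice = null;
        reader = null;
    }
}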

Convert 32 bit float audio to 16 bit byte array?

Question: I am getting audio using the NAudio library, which returns the audio data as a 32-bit float[]. I'm trying to find a way to convert this to a 16-bit byte[] for playback. private void sendData(float[] samples) { Buffer.BlockCopy(samples, 0, byteArray, 0, samples.Length); byte[] encoded = codec.Encode(byteArray, 0, byteArray.Length); waveProvider.AddSamples(byteArray, 0, byteArray.Length); s.Send(encoded, SocketFlags.None); } The audio being sent to waveProvider comes out as static; I don't think
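Copying the float buffer with Buffer.BlockCopy only reinterprets the raw bytes; each sample has to be scaled and truncated to a 16-bit integer instead. A minimal sketch of that conversion, with FloatTo16BitPcm as an illustrative helper name:

using System;

static class PcmConversion
{
    // Converts 32-bit float samples (-1.0 .. +1.0) into little-endian 16-bit PCM bytes.
    public static byte[] FloatTo16BitPcm(float[] samples)
    {
        var bytes = new byte[samples.Length * 2];
        for (int i = 0; i < samples.Length; i++)
        {
            float clamped = Math.Max(-1f, Math.Min(1f, samples[i]));
            short value = (short)(clamped * short.MaxValue);
            bytes[2 * i] = (byte)(value & 0xFF);
            bytes[2 * i + 1] = (byte)((value >> 8) & 0xFF);
        }
        return bytes;
    }
}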

Easiest way to read 2-channel samples into array from WaveStream

Question: I've been struggling with this for quite some time now and I couldn't find a working solution. I have a WAV file (16-bit PCM, 44 kHz, 2 channels) and I want to extract the samples into two arrays, one for each of the two channels. As far as I know a direct method for this does not exist in the NAudio library, so I tried to run the following code to read a few interleaved samples, but the buffer array stays empty (just a bunch of zeros): using (WaveFileReader pcm = new WaveFileReader(@"file.wav")) { byte[
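Rather than decoding raw bytes by hand, NAudio's sample-frame API can split the channels directly. A minimal sketch, with ReadStereoSamples as an illustrative name; SampleCount and ReadNextSampleFrame are WaveFileReader members:

using NAudio.Wave;

static class StereoReader
{
    // Reads a 16-bit stereo WAV file into separate left/right float arrays.
    public static void ReadStereoSamples(string path, out float[] left, out float[] right)
    {
        using (var reader = new WaveFileReader(path))
        {
            long frames = reader.SampleCount; // samples per channel
            left = new float[frames];
            right = new float[frames];
            for (long i = 0; i < frames; i++)
            {
                // One float per channel, already scaled to -1.0 .. +1.0.
                float[] frame = reader.ReadNextSampleFrame();
                left[i] = frame[0];
                right[i] = frame[1];
            }
        }
    }
}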

How to convert a float between -1.0 and +1.0 to dB(A) of sound pressure? [duplicate]

Question (already answered in "How can I calculate audio dB level?", 7 answers; closed 4 years ago): Using WaveFileReader wfr = new WaveFileReader(file); float[] d = wfr.ReadNextSampleFrame(); I get a float array d. When iterating through d using foreach (float s in d) I get floats between -1.0 and +1.0. How do I convert them to dB(A) of sound pressure? EDIT: I solved it using double db = 20 * Math.Log10(Math.Abs(s)); Answer 1: The dB(A) scale is a measure of relative air pressure. In this
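A minimal sketch of that logarithmic conversion in context; note that it gives dB relative to full scale (dBFS), not calibrated dB(A) sound pressure, which would also require a known microphone reference level. PrintSampleLevels is an illustrative name.

using System;
using NAudio.Wave;

static class LevelMeter
{
    // Prints each sample's level in dBFS (0 dB = full scale, more negative = quieter).
    public static void PrintSampleLevels(string file)
    {
        using (var wfr = new WaveFileReader(file))
        {
            float[] frame;
            while ((frame = wfr.ReadNextSampleFrame()) != null)
            {
                foreach (float s in frame)
                {
                    double db = 20 * Math.Log10(Math.Max(Math.Abs(s), 1e-10)); // avoid log(0)
                    Console.WriteLine($"{db:F1} dBFS");
                }
            }
        }
    }
}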

NoDriver calling acmFormatSuggest on Azure

Question: I am using NAudio to get MP3 file information as well as to merge two or more MP3 files. It works fine on localhost, but when I publish the site to Azure it throws the error "NoDriver calling acmFormatSuggest". Answer 1: I assume that you are trying to use something that is not installed on the machine in Azure; in your case it is the ACM MP3 decoder. On a client Windows installation it can be pre-installed, but I do not think that server Windows has it. Also, I suspect that something like that will not be allowed