NAudio

Concatenate wave files at 5-second intervals

血红的双手。 Submitted on 2019-12-13 03:12:27
Question: I have a series of wave files with individual words, each lasting about one second. I want to use C# to concatenate them into one large file at exactly five-second intervals. This will save me from having to put the big file through a sound editor and record the start time for each word. I know how to concatenate files using NAudio and WaveFileWriter.Write. Is there a way to either insert silence of a certain length, or to append one file at a certain point in another file? I…
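A minimal sketch of one way to do this with NAudio, assuming every input file shares the same format and that the file names shown are placeholders: copy each word into the output with WaveFileWriter and then pad with zero-filled (silent) bytes up to the next five-second boundary.

using NAudio.Wave;

// Placeholder input/output names; every input is assumed to share one WaveFormat.
string[] inputs = { "word1.wav", "word2.wav", "word3.wav" };
WaveFileWriter writer = null;
try
{
    foreach (var file in inputs)
    {
        using (var reader = new WaveFileReader(file))
        {
            // Create the writer lazily so it adopts the format of the first file
            if (writer == null)
                writer = new WaveFileWriter("combined.wav", reader.WaveFormat);

            // Copy this word's audio data
            long written = 0;
            var buffer = new byte[reader.WaveFormat.AverageBytesPerSecond];
            int read;
            while ((read = reader.Read(buffer, 0, buffer.Length)) > 0)
            {
                writer.Write(buffer, 0, read);
                written += read;
            }

            // Pad with zero bytes (silence) up to the next 5-second boundary
            long slotBytes = 5L * reader.WaveFormat.AverageBytesPerSecond;
            long padding = slotBytes - (written % slotBytes);
            if (padding != slotBytes)
            {
                var silence = new byte[padding];
                writer.Write(silence, 0, silence.Length);
            }
        }
    }
}
finally
{
    writer?.Dispose();
}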

How to decode RTP packets and save them as a .wav file

↘锁芯ラ Submitted on 2019-12-12 21:25:25
Question: I am trying to develop an application in which a SIP call is established and then I capture the RTP audio packets. Since they are encoded, I need to decode them and save them as a .wav file. I tried using NAudio but it didn't work. Is there any solution, using NAudio or any other library, to solve this problem? The code I used is as follows; data is the byte array holding the RTP packet data. System.IO.MemoryStream stream = new System.IO.MemoryStream(data); RawSourceWaveStream rsws = new…
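If the RTP payload is G.711 mu-law (PCMU, payload type 0) with a plain 12-byte RTP header and no extensions, a hedged sketch along these lines decodes each payload byte with NAudio's MuLawDecoder and writes the resulting 16-bit samples to a WAV file; the method name and the packets collection are illustrative only.

using System;
using System.Collections.Generic;
using NAudio.Codecs;
using NAudio.Wave;

// "packets" stands for whatever collection the captured RTP packets end up in.
static void SaveRtpAsWav(IEnumerable<byte[]> packets, string outputPath)
{
    // G.711 mu-law decodes to 8 kHz, 16-bit, mono PCM
    using (var writer = new WaveFileWriter(outputPath, new WaveFormat(8000, 16, 1)))
    {
        const int rtpHeaderLength = 12;   // assumes no CSRC list or header extension
        foreach (var packet in packets)
        {
            for (int i = rtpHeaderLength; i < packet.Length; i++)
            {
                short pcm = MuLawDecoder.MuLawToLinearSample(packet[i]);
                writer.Write(BitConverter.GetBytes(pcm), 0, 2);
            }
        }
    }
}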

AcmNotPossible calling acmStreamOpen, NAudio

别说谁变了你拦得住时间么 Submitted on 2019-12-12 20:03:47
Question: I am trying to convert PCM S16 LE (araw), mono, sample rate 22050 Hz, 16 bits per sample, to PCM mu-law (PCM MU-LAW), mono, sample rate 8000 Hz, 8 bits per sample. Both WaveFormat.CreateMuLawFormat(8000,1) and a more generic WaveFormat.CreateCustomFormat, where I specified the same WaveFormatEncoding as the source stream, throw the same exception: AcmNotPossible calling acmStreamOpen. Am I missing something here? Any leads will be greatly appreciated. Answer 1: The ACM mu-law encoder expects its…
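The ACM codec typically refuses to resample and change encoding in one step, so a common workaround (sketched below under that assumption, with placeholder file names) is a two-stage conversion: resample the 22050 Hz 16-bit PCM to 8000 Hz 16-bit PCM first, then convert that result to mu-law.

using NAudio.Wave;

// Stage 1: resample to 8 kHz / 16-bit PCM; stage 2: convert that to mu-law.
using (var source = new WaveFileReader("input.wav"))
using (var resampled = new WaveFormatConversionStream(new WaveFormat(8000, 16, 1), source))
using (var muLaw = new WaveFormatConversionStream(WaveFormat.CreateMuLawFormat(8000, 1), resampled))
{
    WaveFileWriter.CreateWaveFile("output.wav", muLaw);
}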

Convert PCM to MP3/OGG

一个人想着一个人 Submitted on 2019-12-12 09:12:23
Question: I need to convert a continuous stream of PCM, or encoded audio (ADPCM, uLaw, Opus), into MP3/OGG format so that it can be streamed to a browser (using HTML's audio tag). I have the "stream-mp3/ogg-using-audio-tag" part working; now I need to develop the conversion layer. I have two questions: How can I convert PCM into MP3/OGG using NAudio and/or some other C# library/framework? I assume there is a code snippet or two in the NAudio demo app that may be doing this, but I haven't been able to…
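For the MP3 half, one option in the NuGet NAudio package (1.7 and later) is the Media Foundation encoder. This is a sketch under the assumptions that the source is a finished WAV file (placeholder name) and that the machine has the Media Foundation MP3 encoder (Windows 8 or later); OGG output and true live-stream encoding would need a different library, for example the separate NAudio.Lame package.

using NAudio.MediaFoundation;
using NAudio.Wave;

// Encode a finished PCM WAV file to MP3 with the Media Foundation encoder.
MediaFoundationApi.Startup();
using (var pcm = new WaveFileReader("input.wav"))
{
    MediaFoundationEncoder.EncodeToMp3(pcm, "output.mp3", 128000);
}
MediaFoundationApi.Shutdown();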

Can't fade out with FadeInOutSampleProvider

╄→гoц情女王★ Submitted on 2019-12-12 06:37:08
Question: I'm trying to use NAudio's FadeInOutSampleProvider to fade a sample in and fade it out. The fade-in works fine, but instead of fading out gradually I get abrupt silence from where the fade-out should begin. What is the correct way to fade out with FadeInOutSampleProvider? Here's how I'm trying to do it: IWaveProvider waveSource; // initialised by reading a WAV file // The ISampleProvider will be the underlying source for the following operations ISampleProvider sampleSource = waveSource…
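One likely cause, hedged: FadeInOutSampleProvider cannot schedule a fade-out in advance. BeginFadeOut starts fading from wherever the provider's Read has got to when it is called, and once the fade duration has elapsed it outputs pure silence. A workaround sketch (not necessarily the original answer's approach; the file name and timings are placeholders) triggers BeginFadeOut during playback just before the end:

using System.Threading;
using NAudio.Wave;
using NAudio.Wave.SampleProviders;

using (var reader = new AudioFileReader("input.wav"))   // placeholder source
using (var output = new WaveOutEvent())
{
    var fade = new FadeInOutSampleProvider(reader, true); // true = start silent
    fade.BeginFadeIn(2000);                                // 2 s fade-in from the start

    output.Init(fade);
    output.Play();

    // Let most of the clip play, then start a 2 s fade-out near the end
    int fadeOutStartMs = (int)reader.TotalTime.TotalMilliseconds - 2000;
    Thread.Sleep(fadeOutStartMs);
    fade.BeginFadeOut(2000);

    Thread.Sleep(2000);                                    // wait for the fade-out to finish
    output.Stop();
}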

NAudio AudioMeterInformation works only if “control mmsys.cpl sounds” is open

醉酒当歌 Submitted on 2019-12-12 03:55:42
Question: I'm trying to capture the sound of the mic (DataFlow.Capture), but AudioMeterInformation.PeakValues only works while the sound properties dialog is open (control mmsys.cpl sounds). Working example. But when I close the sound properties dialog... My code: private void calculateChannels(Object source, ElapsedEventArgs e) { dev = devEnum.GetDefaultAudioEndpoint(DataFlow.Capture, Role.Multimedia); try { double currentLeftChannel = 100 - (dev.AudioMeterInformation.PeakValues[0] * 100); double currentRightChannel = 100 - (dev…
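A hedged explanation: the endpoint peak meter only updates while some client holds an active audio session on the device, which is exactly what the Sounds dialog does while it is open. The sketch below keeps a WASAPI capture session running on the default capture device so that AudioMeterInformation keeps reporting live values; the percentage formula from the question is kept only for illustration.

using NAudio.CoreAudioApi;
using NAudio.Wave;

var enumerator = new MMDeviceEnumerator();
var mic = enumerator.GetDefaultAudioEndpoint(DataFlow.Capture, Role.Multimedia);

// An open capture session is what makes the endpoint's meter "live"
var capture = new WasapiCapture(mic);
capture.DataAvailable += (s, e) => { /* audio is discarded; only the session matters */ };
capture.StartRecording();

// The meter now updates continuously, e.g. from the question's timer callback:
double currentLeftChannel = 100 - (mic.AudioMeterInformation.PeakValues[0] * 100);
double currentRightChannel = 100 - (mic.AudioMeterInformation.PeakValues[1] * 100);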

How to convert from WMA to MP3 using NAudio

感情迁移 Submitted on 2019-12-12 03:35:05
Question: Please note: this is NOT a duplicate of the other question linked; that one uses classes I couldn't find, as detailed in my question below. I'm trying to convert WMA files to MP3. I need a solution that I can integrate into my code base, not one that relies on an external resource, so using ffmpeg isn't an option. I've been trying NAudio, but without any success. One problem is that there seem to be two versions of NAudio around, and neither seems complete. The one you get from NuGet doesn't include the…
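A sketch of one route that works with the classes in the NuGet NAudio package (1.7+), assuming the machine has Media Foundation codecs for WMA decoding and MP3 encoding (the MP3 encoder needs Windows 8 or later); file names are placeholders. NAudio.Lame is an alternative MP3 writer if Media Foundation is not available.

using NAudio.MediaFoundation;
using NAudio.Wave;

MediaFoundationApi.Startup();
using (var reader = new MediaFoundationReader("input.wma"))
{
    // Re-encode the decoded WMA audio as MP3 at roughly 192 kbps
    MediaFoundationEncoder.EncodeToMp3(reader, "output.mp3", 192000);
}
MediaFoundationApi.Shutdown();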

Reduce delay when mapping mic and speaker audio using NAudio in C#

佐手、 Submitted on 2019-12-12 02:36:31
Question: Hello, I am trying to map the system mic audio to the external sound card's speaker, and the external sound card's mic audio to the system speaker, using the code: public void MapForManualCall() { try { if (db.getResultOnQuery("SELECT [Value] FROM [dbo].[SystemProperties] where property='RecordingEnabled'").Rows[0][0].ToString().Equals("YES")) { SystemMic = new NAudio.Wave.WaveInEvent(); SystemMic.DeviceNumber = 0; SystemMic.WaveFormat = new NAudio.Wave.WaveFormat(44100, NAudio.Wave.WaveIn.GetCapabilities(SystemMic…
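A hedged sketch of one low-latency passthrough path (one direction only; the reverse direction is symmetrical): keep the WaveIn buffers short and ask WaveOut for a small desired latency. The device numbers, format, and timings below are placeholders, not values from the question's database-driven setup.

using NAudio.Wave;

var micIn = new WaveInEvent
{
    DeviceNumber = 0,                     // placeholder: system mic
    WaveFormat = new WaveFormat(44100, 16, 1),
    BufferMilliseconds = 20,              // smaller capture buffers => less delay
    NumberOfBuffers = 3
};

var buffer = new BufferedWaveProvider(micIn.WaveFormat)
{
    DiscardOnBufferOverflow = true        // drop audio rather than letting delay build up
};
micIn.DataAvailable += (s, e) => buffer.AddSamples(e.Buffer, 0, e.BytesRecorded);

var speakerOut = new WaveOutEvent
{
    DeviceNumber = 1,                     // placeholder: external sound card speaker
    DesiredLatency = 60,                  // target playback latency in ms
    NumberOfBuffers = 3
};
speakerOut.Init(buffer);
speakerOut.Play();
micIn.StartRecording();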

Using NAudio to achieve fade out and fade in for a series of 44 kHz 16-bit two-channel wave files

不想你离开。 Submitted on 2019-12-12 02:28:11
Question: I have a series of 44 kHz 16-bit two-channel uncompressed wave files (read from resources) and want to apply fade-out and fade-in effects to create a stream from the sequence of all the WAV files. Reading the resources and getting the 16-bit wave stream works correctly. The target format is also shown as correct, but I keep getting AcmNotPossible as the exception in the wave format conversion step below. What am I doing wrong? String ResToPlay2 = NameSpaceString + ".Resources." + inWave2 + "…
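For fading, no ACM conversion should be needed at all, so a hedged alternative sketch (resource names are placeholders; ConcatenatingSampleProvider needs NAudio 1.8 or later) converts each resource to an ISampleProvider, wraps it in FadeInOutSampleProvider, and chains the results. The fade-out caveat from the FadeInOutSampleProvider question above still applies: it has to be triggered while the audio is being read.

using System.Collections.Generic;
using System.Reflection;
using System.Threading;
using NAudio.Wave;
using NAudio.Wave.SampleProviders;

var assembly = Assembly.GetExecutingAssembly();
var faded = new List<ISampleProvider>();
foreach (var name in new[] { "MyApp.Resources.Word1.wav", "MyApp.Resources.Word2.wav" })
{
    var reader = new WaveFileReader(assembly.GetManifestResourceStream(name));
    var fade = new FadeInOutSampleProvider(reader.ToSampleProvider(), true); // start silent
    fade.BeginFadeIn(500);   // 0.5 s fade-in; a fade-out must be triggered while playing
    faded.Add(fade);
}

// Chain the faded clips into one continuous stream (all inputs share one format)
var sequence = new ConcatenatingSampleProvider(faded);
using (var output = new WaveOutEvent())
{
    output.Init(sequence);
    output.Play();
    while (output.PlaybackState == PlaybackState.Playing) Thread.Sleep(100);
}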

NAudio: recording and playing audio

大憨熊 Submitted on 2019-12-12 01:56:20
Question: I'm using NAudio 1.7 after I gave up on the WaveIn P/Invoke approach... Anyway, I'm making a VoIP application, and the sample code I found used WaveFileWriter to write to disk. I don't want that, so I used the MemoryStream overload instead. The problem is that when I try to play the stream with the SoundPlayer class after I stop the recording, it just doesn't play and the code carries on; but if I save it as shown below, I can play it in VLC. However, if I try to load it from the file itself, it doesn't play either,…
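A hedged sketch of the in-memory round trip, assuming the issue is the usual one: WaveFileWriter only patches the RIFF header lengths when it is disposed, and disposing it normally closes the underlying stream as well. So the MemoryStream is wrapped in NAudio's IgnoreDisposeStream and rewound before it is handed to SoundPlayer; the format and event wiring below are illustrative.

using System.IO;
using System.Media;
using NAudio.Utils;
using NAudio.Wave;

var memory = new MemoryStream();
var waveIn = new WaveInEvent { WaveFormat = new WaveFormat(8000, 16, 1) };
// IgnoreDisposeStream keeps the MemoryStream open when the writer is disposed
var writer = new WaveFileWriter(new IgnoreDisposeStream(memory), waveIn.WaveFormat);

waveIn.DataAvailable += (s, e) => writer.Write(e.Buffer, 0, e.BytesRecorded);
waveIn.RecordingStopped += (s, e) =>
{
    writer.Dispose();        // finalises the RIFF header; memory stays open
    memory.Position = 0;     // rewind before playback
    new SoundPlayer(memory).Play();
};

waveIn.StartRecording();
// ... later, e.g. from a Stop button:
// waveIn.StopRecording();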