wav

How can I convert a WAV from stereo to mono in Python?

自古美人都是妖i submitted on 2019-12-03 17:25:09
Question: I don't want to use any other apps (like sox) - I want to do this in pure Python. Installing needed Python libs is fine.

Answer 1: If the WAV file is PCM-encoded then you can use wave. Open the source and destination files, read samples, average the channels, and write them out.

Answer 2: I maintain an open source library, pydub, which makes this pretty simple:

from pydub import AudioSegment

sound = AudioSegment.from_wav("/path/to/file.wav")
sound = sound.set_channels(1)
sound.export("/output/path.wav", format="wav")
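Answer 1's wave-based approach could look roughly like the sketch below (a minimal sketch, assuming 16-bit PCM stereo input on a little-endian machine; the file names are placeholders):

import wave
import array

# Read all interleaved 16-bit samples from the stereo source.
with wave.open("stereo.wav", "rb") as src:
    params = src.getparams()          # expects nchannels=2, sampwidth=2
    samples = array.array("h", src.readframes(src.getnframes()))

# Interleaving is L, R, L, R, ...; average each pair into one mono sample.
mono = array.array("h", ((samples[i] + samples[i + 1]) // 2
                         for i in range(0, len(samples), 2)))

# Write the mono result with the same sample rate.
with wave.open("mono.wav", "wb") as dst:
    dst.setnchannels(1)
    dst.setsampwidth(2)
    dst.setframerate(params.framerate)
    dst.writeframes(mono.tobytes())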

Decoding a WAV File Header

我的未来我决定 submitted on 2019-12-03 16:21:09
I'm trying to understand the header of a WAV file. I've opened an example file and got this:

5249 4646 e857 1400 5741 5645 666d 7420 1000 0000 0100 0200 44ac 0000 10b1 0200 0400 1000

I've been reading this data representation tutorial. I understand that 52 is one byte and represents the ASCII letter R. I understand up to the 1000 0000. Why does that represent decimal 16? The tutorial says that the value at that position is always 0x10. How does 1000 0000 equate to 0x10? Also, when reading the file, will a program know whether to expect a number or ASCII? Presumably it'll check against a
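The source of the confusion is byte order: multi-byte integers in a WAV header are stored little-endian, so the four bytes 10 00 00 00 read as 0x00000010 = 16 (the size of the fmt chunk). A minimal sketch of decoding exactly these header fields with Python's struct module (the file name is a placeholder; the comments show the values from the hex dump above):

import struct

with open("example.wav", "rb") as f:
    header = f.read(36)

# "<" = little-endian, which is how RIFF/WAV stores all numeric fields.
riff_id, riff_size, wave_id = struct.unpack("<4sI4s", header[0:12])  # b"RIFF", 0x001457e8, b"WAVE"
fmt_id, fmt_size = struct.unpack("<4sI", header[12:20])              # b"fmt ", 16
audio_format, num_channels = struct.unpack("<HH", header[20:24])     # 1 (PCM), 2
sample_rate, byte_rate = struct.unpack("<II", header[24:32])         # 44100, 176400
block_align, bits_per_sample = struct.unpack("<HH", header[32:36])   # 4, 16

print(fmt_size, audio_format, num_channels, sample_rate, bits_per_sample)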

Play a wav file with Haskell

末鹿安然 submitted on 2019-12-03 16:06:48
Question: Is there a simple, direct way to play a WAV file from Haskell using some library, and possibly such that I can play many sounds at once? I'm aware of OpenAL but I'm not writing some advanced audio synthesis program, I just want to play some sounds for a little play thing. Ideally the API might be something like:

readWavFile :: FilePath -> IO Wave
playWave :: Wave -> IO ()
playWaveNonBlocking :: Wave -> IO ()

I'm this close to merely launching mplayer or something. Or trying to cat the wav directly

Generate wav tone in PHP

一个人想着一个人 submitted on 2019-12-03 15:04:07
I would like to generate a sine tone in PHP. But to construct my WAV I need to give the values in bytes, and I don't know how to do that. Here is the code I have:

$freqOfTone = 440;
$sampleRate = 44100;
$samplesCount = 80000;
$amplitude = 0.25 * 32768;
$w = 2 * pi() * $freqOfTone / $sampleRate;
//$dataArray = new
$text = "RIFF"
    ."80036"
    ."WAVE"
    ."fmt "
    ."16"
    ."1"
    ."1"
    ."44100"
    ."44100"
    ."1"
    ."8"
    ."data"
    ."80000";
for ($n = 0; $n < $samplesCount; $n++) {
    $text .= (int)($amplitude * sin($n * $w));
}
$myfile = fopen("sine.wav", "w") or die("Unable to open file!");
fwrite($myfile, $text);
fclose($myfile);
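For comparison only (not PHP, and not the asker's code): the missing piece is that both the header fields and the samples have to be written as packed binary, not as decimal text. A minimal Python sketch of the same sine tone, where the standard wave module writes the header and struct packs each 16-bit sample, may make the idea concrete (parameter values mirror the question; the output name is a placeholder):

import math
import struct
import wave

freq = 440.0
sample_rate = 44100
num_samples = 80000
amplitude = 0.25 * 32767   # stay inside the signed 16-bit range

with wave.open("sine.wav", "wb") as w:
    w.setnchannels(1)
    w.setsampwidth(2)        # 2 bytes per sample = 16-bit PCM
    w.setframerate(sample_rate)
    frames = b"".join(
        struct.pack("<h", int(amplitude * math.sin(2 * math.pi * freq * n / sample_rate)))
        for n in range(num_samples)
    )
    w.writeframes(frames)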

Getting error while converting wav to amr using ffmpeg

假如想象 submitted on 2019-12-03 13:09:59
Question: I am using ffmpeg to convert AMR to WAV and WAV to AMR. It successfully converts AMR to WAV but not vice versa. Even though ffmpeg supports the AMR encoder and decoder, it gives this error:

ffmpeg -i testwav.wav audio.amr
Error while opening encoder for output stream #0.0 - maybe incorrect parameters such as bit_rate, rate, width or height

Answer 1: You can try setting the sample rate and bit rate. AMR supports only an 8000 Hz sample rate and 4.75k, 5.15k, 5.9k, 6.7k, 7.4k, 7.95k, 10.2k or 12.2k bit rates: ffmpeg
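Spelled out as a full command, the answer's advice would look something like ffmpeg -i testwav.wav -ar 8000 -ab 12.2k audio.amr (8000 Hz is the rate AMR-NB requires, and 12.2k is one of the listed bit rates). Note that on newer ffmpeg builds the bit-rate flag is written -b:a rather than -ab, and the build must include an AMR-NB encoder such as libopencore_amrnb for the conversion to work.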

Best way to play wav files in the browser?

半腔热情 submitted on 2019-12-03 11:45:52
I have no choice but to play wav files directly in the browser (server-side encoding to mp3 isn't an option, unfortunately). What's the best way to do this? I'd really like to take advantage of the HTML5 audio tag but my target audience includes many, many teens using IE6. As far as I'm aware flash isn't an option, but speedy playback really is critical. Thanks.

Nowadays, the best way is probably just to use the HTML5 <audio> tag. In the past, you might have done it like this:

Background:
<embed src="bgsound.wav" hidden="true" autostart="true" loop="1">

On Click:
<a href="success.wav">Play
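For reference, a minimal example of the modern approach would be something like <audio src="success.wav" controls preload="auto"></audio> (the file name is a placeholder). Browsers without <audio> support, such as the IE6 mentioned in the question, simply won't play it, so one of the older fallbacks above is still needed for them.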

Creating a .wav File in C#

不打扰是莪最后的温柔 submitted on 2019-12-03 11:43:07
Question: As an excuse to learn C#, I have been trying to code a simple project: creating audio files. To start, I want to make sure that I can write files that meet the WAVE format. I have researched the format online (for example, here), but whenever I try to play back a file, it won't open correctly. Here is my code. Is something missing or incorrect?

uint numsamples = 44100;
ushort numchannels = 1;
ushort samplelength = 1; // in bytes
uint samplerate = 22050;
FileStream f = new FileStream("a.wav",
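For reference (not the asker's C#, just an illustration of the layout the file needs), the canonical 44-byte PCM WAVE header can be sketched like this in Python with struct, using the same parameters as the question:

import struct

num_samples = 44100
num_channels = 1
sample_len = 1            # bytes per sample (8-bit PCM)
sample_rate = 22050

data_size = num_samples * num_channels * sample_len
byte_rate = sample_rate * num_channels * sample_len
block_align = num_channels * sample_len

header = b"RIFF" + struct.pack("<I", 36 + data_size) + b"WAVE"
header += b"fmt " + struct.pack("<IHHIIHH",
                                16,                  # fmt chunk size
                                1,                   # audio format: PCM
                                num_channels,
                                sample_rate,
                                byte_rate,
                                block_align,
                                sample_len * 8)      # bits per sample
header += b"data" + struct.pack("<I", data_size)

with open("a.wav", "wb") as f:
    f.write(header + bytes([128]) * data_size)       # 8-bit PCM is unsigned, so 0x80 is silence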

Get length of .wav from sox output

痞子三分冷 submitted on 2019-12-03 11:31:22
Question: I need to get the length of a .wav file. Using:

sox output.wav -n stat

gives:

Samples read: 449718
Length (seconds): 28.107375
Scaled by: 2147483647.0
Maximum amplitude: 0.999969
Minimum amplitude: -0.999969
Midline amplitude: 0.000000
Mean norm: 0.145530
Mean amplitude: 0.000291
RMS amplitude: 0.249847
Maximum delta: 1.316925
Minimum delta: 0.000000
Mean delta: 0.033336
RMS delta: 0.064767
Rough frequency: 660
Volume adjustment: 1.000

How do I use grep or some other method to only output the
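One thing to watch for: sox's stat effect prints its report to stderr, so it has to be redirected before grep can see it, e.g. sox output.wav -n stat 2>&1 | grep "Length", and the number itself can then be pulled out of that line with awk or sed. Depending on the sox version installed, soxi -D output.wav (or sox --i -D output.wav) may also print just the duration in seconds directly.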

Is it correct to assume that floating-point samples in a WAV or AIFF file will be normalized?

僤鯓⒐⒋嵵緔 submitted on 2019-12-03 10:38:49
Say I have a program that reads a .WAV or .AIFF file, and the file's audio is encoded as floating-point sample-values. Is it correct for my program to assume that any well-formed (floating-point-based) .WAV or .AIFF file will contain sample values only in the range [-1.0f,+1.0f]? I couldn't find anything in the WAV or AIFF specifications that addresses this point. And if that is not a valid assumption, how can one know what the full dynamic range of the audio in the file was intended to be? (I could read the entire file and find out what the file's actual minimum and maximum sample values are,
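As a practical aside, the "read the whole file and check" fallback the question mentions is easy to sketch in Python (assuming SciPy is available; scipy.io.wavfile returns float32 WAV data exactly as stored, without rescaling):

from scipy.io import wavfile

rate, data = wavfile.read("float32_file.wav")   # file name is a placeholder
print(data.dtype, float(data.min()), float(data.max()))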

Reading a single channel from a multi-channel wav file

浪尽此生 submitted on 2019-12-03 10:13:02
I need to extract the samples of a single channel from a wav file that will contain up to 12 (11.1 format) channels. I know that within a normal stereo file samples are interleaved, first left, and then right, like so:

[1st L] [1st R] [2nd L] [2nd R]...

So, to read the left channel I'd do this:

for (var i = 0; i < myByteArray.Length; i += (bitDepth / 8) * 2)
{
    // Get bytes and convert to actual samples.
}

And to get the right channel I'd simply do for (var i = (bitDepth / 8)... . But, what order is used for files with more than 2 channels? Microsoft have created a standard that covers up to 18
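The same stride idea generalizes to N channels: within each frame the channels appear in order, so sample i of channel k starts at byte (i * N + k) * (bitDepth / 8). A minimal Python sketch of pulling one channel out of interleaved 16-bit PCM (the channel index and file name are placeholders, not from the question):

import wave
import array

CHANNEL = 3   # zero-based index of the channel to extract

with wave.open("multichannel.wav", "rb") as src:
    n_channels = src.getnchannels()            # e.g. 12 for 11.1
    assert src.getsampwidth() == 2             # sketch assumes 16-bit PCM
    samples = array.array("h", src.readframes(src.getnframes()))

# Frames are interleaved ch0, ch1, ..., chN-1, ch0, ch1, ...
channel = samples[CHANNEL::n_channels]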