Noise in background when generating sine wave in Java

Submitted anonymously (unverified) on 2019-12-03 01:48:02

Question:

I'm getting a slight distortion (it sounds like buzzing) in the background when I run the following code. Because of its subtle nature, it makes me believe there is some sort of aliasing going on with the byte casting.

AudioFormat = PCM_SIGNED 44100.0 Hz, 16 bit, stereo, 4 bytes/frame, big-endian

Note: code assumes (for now) that the data is in big endian.

import javax.sound.sampled.AudioFormat;
import javax.sound.sampled.SourceDataLine;

public static void playFreq(AudioFormat audioFormat, double frequency, SourceDataLine sourceDataLine)
{
    System.out.println(audioFormat);
    double sampleRate = audioFormat.getSampleRate();
    int sampleSizeInBytes = audioFormat.getSampleSizeInBits() / 8;
    int channels = audioFormat.getChannels();

    byte audioBuffer[] = new byte[(int) Math.pow(2.0, 19.0) * channels * sampleSizeInBytes];

    for ( int i = 0; i < audioBuffer.length; i += sampleSizeInBytes * channels )
    {
        int wave = (int) (Math.sin(2.0 * Math.PI * frequency * i / sampleRate) > 0 ? 127 : -127);

        if ( channels == 1 )
        {
            if ( sampleSizeInBytes == 1 )
            {
                audioBuffer[i] = (byte) (wave);
            }
            else if ( sampleSizeInBytes == 2 )
            {
                audioBuffer[i]   = (byte) (wave);
                audioBuffer[i+1] = (byte) (wave >>> 8);
            }
        }
        else if ( channels == 2 )
        {
            if ( sampleSizeInBytes == 1 )
            {
                audioBuffer[i]   = (byte) (wave);
                audioBuffer[i+1] = (byte) (wave);
            }
            else if ( sampleSizeInBytes == 2 )
            {
                audioBuffer[i]   = (byte) (wave);
                audioBuffer[i+1] = (byte) (wave >>> 8);

                audioBuffer[i+2] = (byte) (wave);
                audioBuffer[i+3] = (byte) (wave >>> 8);
            }
        }
    }

    sourceDataLine.write(audioBuffer, 0, audioBuffer.length);
}
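Rather than assuming a byte order, the format can be queried directly. The following is a minimal sketch (the class name is mine) using the standard javax.sound.sampled API to confirm the byte order and frame size of the format printed above:

```java
import javax.sound.sampled.AudioFormat;

public class FormatCheck {
    public static void main(String[] args) {
        // Mirror of the format in the question:
        // 44100 Hz, 16-bit, stereo, signed, big-endian
        AudioFormat fmt = new AudioFormat(44100.0f, 16, 2, true, true);
        System.out.println(fmt.isBigEndian());  // true for this format
        System.out.println(fmt.getFrameSize()); // 4 bytes/frame (2 channels * 2 bytes)
    }
}
```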

Answer 1:

Your comments say that the code assumes big-endian.

Technically you're actually outputting in little-endian; however, it doesn't seem to matter because, through a lucky quirk, your most significant byte is always 0.

EDIT: to explain that further - when your value is at its maximum value of 127, you should be writing (0x00, 0x7f), but the actual output from your code is (0x7f, 0x00) which is 32512. This happens to be near the proper 16 bit maximum value of 32767, but with the bottom 8 bits all zero. It would be better to always use 32767 as the maximum value, and then discard the bottom 8 bits if required.
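To make the byte-order point concrete, here is a small sketch (the class and method names are mine, not from the question) that decodes a pair of bytes the way a big-endian 16-bit line would, first for the question's (lsb, msb) ordering and then for proper big-endian packing of the full-scale value:

```java
public class EndianDemo {
    // Decode two bytes as one big-endian signed 16-bit sample,
    // the way a PCM_SIGNED big-endian line interprets them.
    static int decodeBigEndian(byte b0, byte b1) {
        return (short) ((b0 << 8) | (b1 & 0xFF));
    }

    public static void main(String[] args) {
        int wave = 127; // the question's maximum sample value

        // The question's code writes (0x7f, 0x00); a big-endian line
        // reads that as 0x7f00 = 32512, with the bottom 8 bits zero.
        System.out.println(decodeBigEndian((byte) wave, (byte) (wave >>> 8))); // 32512

        // Proper big-endian packing of the full 16-bit maximum 32767:
        int full = 32767;
        System.out.println(decodeBigEndian((byte) (full >> 8), (byte) full)); // 32767
    }
}
```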

This means that even though you're outputting 16-bit data, the effective resolution is only 8 bit. This seems to account for the lack of sound quality.

I've made a version of your code that just dumps the raw data to a file, and can't see anything otherwise wrong with the bit shifting itself. There's no unexpected changes of sign or missing bits, but there is a buzz consistent with 8 bit sample quality.

Also, for what it's worth, your math will be easier if you calculate the wave equation based on sample counts, and then worry about byte offsets separately:

int samples = 2 << 18; // 2^19 samples, matching the original buffer size
byte[] audioBuffer = new byte[samples * channels * sampleSizeInBytes];
int j = 0;

for (int i = 0; i < samples; i++) {
    int wave = (int) (32767.0 * Math.sin(2.0 * Math.PI * frequency * i / sampleRate));
    byte msb = (byte) (wave >> 8);
    byte lsb = (byte) wave;

    for (int c = 0; c < channels; c++) {
        audioBuffer[j++] = msb; // big-endian: most significant byte first
        if (sampleSizeInBytes > 1) {
            audioBuffer[j++] = lsb;
        }
    }
}


Answer 2:

I assume you are calling this code repeatedly to play a long sound.

Is there a chance that the wave you are generating is not getting to complete a full period before it is written?

If the wave gets "cut-off" before it completes a full period and then the next wave is written to the output, you will certainly hear something strange and I assume that may be what is causing the buzzing.

For example:

        /-------\              /-------\              /-------\
   -----/         \       -----/         \       -----/         \
              \                      \                      \
               \-----                 \-----                 \-----

Notice the disconnect between parts of this wave. That might be causing the buzzing.
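One common way to avoid that kind of discontinuity (a sketch of the general technique, not code from either answer; all names are mine) is to carry the wave's phase across buffer fills instead of restarting at zero on every call, so each buffer resumes exactly where the previous one ended:

```java
public class PhaseContinuousTone {
    // Phase accumulator carried across calls, so consecutive
    // buffers join without a jump in the waveform.
    private double phase = 0.0;

    // Fill a mono 8-bit buffer with a sine tone, continuing
    // from wherever the previous call left off.
    public void fill(byte[] buffer, double frequency, double sampleRate) {
        double phaseStep = 2.0 * Math.PI * frequency / sampleRate;
        for (int i = 0; i < buffer.length; i++) {
            buffer[i] = (byte) (127.0 * Math.sin(phase));
            phase += phaseStep;
            if (phase > 2.0 * Math.PI) {
                phase -= 2.0 * Math.PI; // keep the accumulator from growing
            }
        }
    }
}
```

With this, calling fill repeatedly and writing each buffer to the line produces one continuous sine instead of a sequence of truncated periods.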


