Question
I've been writing a short program to redirect audio from the line in/mic to the speakers. I'm just learning most of this, but my labors have yielded what seems to be almost a working model of what I want. However, when I print the TargetDataLine
buffer, it prints all 0s, as if it were connected and streaming, but I can't hear my input. At this point, I've studied most of the javax.sound.sampled
package content and what was available online in forums, tutorials, and other people's code, and due to the disappointing lack of published audio code, I think my study resources have all but run out. So if anyone has any advice or resources whatsoever, it would be greatly appreciated. I don't think you'll need any of the other code, but if you do, just ask. This code compiles without errors or warnings on my machine using Eclipse version 4.3.0.v20130605.
Here's a method index of the class to save you from reading most of the 200 lines of code.
class Stream extends Thread {
vars
...
Stream()
setProcessingBuffer()
setRenderingBuffer()
bytesToString()
play()
pause()
run() (Override)
}
Code:
package moshi;
import javax.sound.sampled.AudioFormat;
import javax.sound.sampled.AudioSystem;
import javax.sound.sampled.Line;
import javax.sound.sampled.LineUnavailableException;
import javax.sound.sampled.Mixer;
import javax.sound.sampled.SourceDataLine;
import javax.sound.sampled.TargetDataLine;
/**
 * @author KN
 * @version 1.0 Build 1 October 26, 2013
 *
 *          This concurrent process sets up and controls the streaming of audio
 *          data via input and output buffers.
 *
 * @see {@link Thread}, {@link AudioSystem}, {@link TargetDataLine},
 *      {@link SourceDataLine}
 */
public class Stream extends Thread {

    /** The {@link AudioFormat} used in encoding/decoding streaming audio data */
    public final static AudioFormat audioFormat = new AudioFormat(48000, 16, 2, true, true);

    /**
     * {@link String} describing the name of the audio device to be used.
     * <p>
     * Example: "Line In", "Microphone"
     */
    private static String INPUT = "Mic";

    /**
     * {@link String} describing the name of the audio device to be used.
     * <p>
     * Example: "Speakers", "Line Out"
     */
    private static String OUTPUT = "Speakers";

    /**
     * {@link #PROCESSING_BUFFER} is a buffer used for receiving audio data
     *
     * @see TargetDataLine
     */
    private static TargetDataLine PROCESSING_BUFFER;

    /**
     * {@link #RENDERING_BUFFER} is a buffer used for writing audio data
     *
     * @see SourceDataLine
     */
    private static SourceDataLine RENDERING_BUFFER;

    /** {@link Integer} specifying the buffer sizes in bytes */
    private static int BUFFER_SIZE = 2048;

    /** {@link Byte[]} for holding raw audio data */
    private static byte[] READ_BUFFER = new byte[Stream.BUFFER_SIZE];

    /**
     * Initiates the audio hardware read/write buffers into
     * {@link TargetDataLine}s and {@link SourceDataLine}s respectively.
     *
     * @see {@link TargetDataLine}, {@link SourceDataLine}
     */
    public Stream() {
        setProcessingBuffer();
        setRenderingBuffer();
    }

    /**
     * Queries input Lines and stores the {@link TargetDataLine} at
     * {@link #PROCESSING_BUFFER}
     *
     * @see {@link AudioSystem}, {@link Line}, {@link TargetDataLine},
     *      {@link Mixer}
     */
    private void setProcessingBuffer() {
        final Mixer.Info[] mixerInfos = AudioSystem.getMixerInfo();
        for (final Mixer.Info info : mixerInfos) {
            final Mixer mixer = AudioSystem.getMixer(info);
            final Line.Info[] targetLineInfos = mixer.getTargetLineInfo();
            for (final Line.Info targetLineInfo : targetLineInfos) {
                if (targetLineInfo.getLineClass() == javax.sound.sampled.TargetDataLine.class
                        && info.getName().startsWith(Stream.INPUT)) {
                    try {
                        Stream.PROCESSING_BUFFER = (TargetDataLine) mixer.getLine(targetLineInfo);
                        System.out.println(targetLineInfo.getLineClass() + ": " + info.getName() + " ["
                                + Stream.PROCESSING_BUFFER + "] ");
                    } catch (LineUnavailableException e) {
                        e.printStackTrace();
                    }
                }
            }
        }
    }

    /**
     * Queries output Lines and stores the {@link SourceDataLine} at
     * {@link #RENDERING_BUFFER}
     *
     * @see {@link AudioSystem}, {@link Line}, {@link SourceDataLine},
     *      {@link Mixer}
     */
    private void setRenderingBuffer() {
        final Mixer.Info[] mixerInfos = AudioSystem.getMixerInfo();
        for (Mixer.Info info : mixerInfos) {
            final Mixer mixer = AudioSystem.getMixer(info);
            final Line.Info[] sourceLineInfos = mixer.getSourceLineInfo();
            for (final Line.Info sourceLineInfo : sourceLineInfos) {
                if (sourceLineInfo.getLineClass() == javax.sound.sampled.SourceDataLine.class
                        && info.getName().startsWith(Stream.OUTPUT)) {
                    try {
                        Stream.RENDERING_BUFFER = (SourceDataLine) mixer.getLine(sourceLineInfo);
                        System.out.println(sourceLineInfo.getLineClass() + ": " + info.getName() + " ["
                                + Stream.RENDERING_BUFFER + "]");
                    } catch (LineUnavailableException e) {
                        e.printStackTrace();
                    }
                }
            }
        }
    }

    /**
     * Takes in an array of bytes and returns a String object representation of
     * the data
     *
     * @param array
     *            The byte array to be converted
     * @return The string object representation of a byte array
     */
    private static String bytesToString(byte[] array) {
        String toString = "";
        for (byte currentByte : array) {
            toString += currentByte;
        }
        return toString;
    }

    /**
     * Opens buffers {@link #PROCESSING_BUFFER} and {@link #RENDERING_BUFFER}
     * for reading/writing
     */
    public static void play() {
        try {
            if (!Stream.PROCESSING_BUFFER.isOpen()) {
                Stream.PROCESSING_BUFFER.open(Stream.audioFormat, Stream.BUFFER_SIZE);
            }
            if (!Stream.RENDERING_BUFFER.isOpen()) {
                Stream.RENDERING_BUFFER.open(Stream.audioFormat, Stream.BUFFER_SIZE);
                Stream.RENDERING_BUFFER.start();
            }
            while (Stream.RENDERING_BUFFER.isOpen()) {
                Stream.PROCESSING_BUFFER.read(Stream.READ_BUFFER, 0, Stream.BUFFER_SIZE);
                System.out.println(Stream.bytesToString(Stream.READ_BUFFER));
                Stream.RENDERING_BUFFER.write(Stream.READ_BUFFER, 0, Stream.BUFFER_SIZE);
            }
        } catch (Exception e) {
            e.printStackTrace();
        }
    }

    /**
     * Stops buffers {@link #PROCESSING_BUFFER} and {@link #RENDERING_BUFFER}
     * from reading/writing
     */
    public static void pause() {
        if (Stream.PROCESSING_BUFFER.isOpen()) {
            Stream.PROCESSING_BUFFER.close();
        }
        if (Stream.RENDERING_BUFFER.isOpen()) {
            Stream.RENDERING_BUFFER.stop();
            Stream.RENDERING_BUFFER.close();
        }
    }

    /** {@inheritDoc} */
    @Override
    public void run() {
    }
}
Output:
interface javax.sound.sampled.TargetDataLine: Microphone (Realtek High Defini [com.sun.media.sound.DirectAudioDevice$DirectTDL@2f57d162]
interface javax.sound.sampled.SourceDataLine: Speakers (Realtek High Definition Audio) [com.sun.media.sound.DirectAudioDevice$DirectSDL@79b7d13e]
0000000000000000000000000000000000000000000000000000000000000...... And a lot more of that
Answer 1:
Let's look at:
Stream.PROCESSING_BUFFER.read(Stream.READ_BUFFER, 0, Stream.BUFFER_SIZE);
If all is well, this will read data until the buffer is full. But in your case all is not well, and in fact it is reading nothing, as you can tell by checking the value returned by read:
int numRead = Stream.PROCESSING_BUFFER.read(Stream.READ_BUFFER, 0, Stream.BUFFER_SIZE);
numRead is 0, and nothing has been put into Stream.READ_BUFFER.
To be sure you output only the data that was actually read, you need to use:
Stream.RENDERING_BUFFER.write(Stream.READ_BUFFER, 0, numRead);
The reason you are reading nothing is that you have not started the TargetDataLine. You need:
if (!Stream.PROCESSING_BUFFER.isOpen()) {
    Stream.PROCESSING_BUFFER.open(Stream.audioFormat, Stream.BUFFER_SIZE);
    Stream.PROCESSING_BUFFER.start();
}
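To see why honoring the return value of read matters, here is a minimal sketch that needs no audio hardware. It uses plain java.io streams (a hypothetical CopyLoop class, not part of the question's code), since InputStream.read and OutputStream.write follow the same contract as the DataLine calls: read reports how many bytes it actually delivered, and you must write exactly that many.

```java
import java.io.ByteArrayInputStream;
import java.io.ByteArrayOutputStream;
import java.io.IOException;
import java.io.InputStream;
import java.io.OutputStream;

public class CopyLoop {

    // Copies input to output, writing only the bytes actually read on each
    // pass. This mirrors the audio fix: write(buffer, 0, numRead), never
    // write(buffer, 0, BUFFER_SIZE), or you emit stale/zero bytes.
    static void copy(InputStream in, OutputStream out, int bufferSize) throws IOException {
        byte[] buffer = new byte[bufferSize];
        int numRead;
        while ((numRead = in.read(buffer, 0, buffer.length)) != -1) {
            out.write(buffer, 0, numRead);
        }
    }

    public static void main(String[] args) throws IOException {
        // Deliberately not a multiple of the 2048-byte buffer, so the last
        // read is partial -- the case where ignoring numRead corrupts output.
        byte[] data = new byte[5000];
        for (int i = 0; i < data.length; i++) {
            data[i] = (byte) i;
        }
        ByteArrayOutputStream out = new ByteArrayOutputStream();
        copy(new ByteArrayInputStream(data), out, 2048);
        System.out.println(out.size()); // 5000: every byte read was written, no more
    }
}
```

With a TargetDataLine that was never start()ed, the equivalent numRead would be 0 on every pass, which is exactly why the question's loop printed all zeros.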
Source: https://stackoverflow.com/questions/19610222/java-audio-data-streaming-0s