Question
I want to create a live audio stream from one device to a Node server, which can then broadcast that live feed to several front ends.
I have searched extensively for this and have really hit a wall, so I'm hoping somebody out there can help.
I am able to get my audio input from the `window.navigator.getUserMedia` API.
```javascript
getAudioInput() {
  const constraints = {
    video: false,
    audio: { deviceId: this.state.deviceId ? { exact: this.state.deviceId } : undefined },
  };

  window.navigator.getUserMedia(
    constraints,
    this.initializeRecorder,
    this.handleError
  );
}
```
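(As an aside, the callback form of `getUserMedia` is deprecated; a rough sketch of the same request using the promise-based `navigator.mediaDevices.getUserMedia`, with the constraints split into a separate helper, might look like this:)

```javascript
// Build the getUserMedia constraints; pure, so it can be checked in isolation.
function buildAudioConstraints(deviceId) {
  return {
    video: false,
    audio: { deviceId: deviceId ? { exact: deviceId } : undefined },
  };
}

// Promise-based capture (modern replacement for the deprecated callback API).
// Resolves with a MediaStream, or rejects if the user denies permission.
function getAudioInput(deviceId) {
  return navigator.mediaDevices.getUserMedia(buildAudioConstraints(deviceId));
}
```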
This then passes the stream to the `initializeRecorder` function, which uses the `AudioContext` API to create a media stream source via `createMediaStreamSource`.
```javascript
initializeRecorder = (stream) => {
  const audioContext = window.AudioContext;
  const context = new audioContext();
  const audioInput = context.createMediaStreamSource(stream);
  const bufferSize = 2048;
  // create a javascript node
  const recorder = context.createScriptProcessor(bufferSize, 1, 1);
  // specify the processing function
  recorder.onaudioprocess = this.recorderProcess;
  // connect stream to our recorder
  audioInput.connect(recorder);
  // connect our recorder to the previous destination
  recorder.connect(context.destination);
}
```
In my `recorderProcess` function, I now have an `AudioProcessingEvent` object which I can stream.
Currently I am emitting the audio event as a stream via a socket connection like so:
```javascript
recorderProcess = (e) => {
  const left = e.inputBuffer.getChannelData(0);
  this.socket.emit('stream', this.convertFloat32ToInt16(left));
}
```
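The `convertFloat32ToInt16` helper is not shown in the question; a common sketch of such a conversion, which scales Web Audio's Float32 samples (in the range [-1, 1]) to the Int16 range with clamping, is roughly:

```javascript
// Hypothetical implementation of the convertFloat32ToInt16 helper used above.
// Web Audio samples are Float32 in [-1, 1]; clamp each one and scale to Int16.
function convertFloat32ToInt16(float32Array) {
  const int16 = new Int16Array(float32Array.length);
  for (let i = 0; i < float32Array.length; i++) {
    const s = Math.max(-1, Math.min(1, float32Array[i]));
    int16[i] = s < 0 ? s * 0x8000 : s * 0x7fff;
  }
  return int16.buffer; // an ArrayBuffer, ready to send over the socket
}
```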
Is this the best or only way to do this? Is there a better way, such as using `fs.createReadStream` and then posting to an endpoint via Axios? As far as I can tell, that will only work with a file, as opposed to a continuous live stream?
Server
I have a very simple socket server running on top of Express. Currently I listen for the `stream` event and then emit that same input back out:
```javascript
io.on('connection', (client) => {
  client.on('stream', (stream) => {
    client.emit('stream', stream);
  });
});
```
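Note that `client.emit` sends the chunk back only to the socket that produced it; to reach the other front ends, a broadcast is needed. A minimal sketch, wrapped in a function so the relay logic can be exercised against any object with the same shape as a socket.io server (`attachRelay` is a hypothetical name, not part of socket.io):

```javascript
// Sketch: relay each incoming audio chunk to every client except the sender.
// `io` is expected to be a socket.io server instance (or anything shape-compatible).
function attachRelay(io) {
  io.on('connection', (client) => {
    client.on('stream', (chunk) => {
      // broadcast.emit sends to all connected sockets except `client` itself
      client.broadcast.emit('stream', chunk);
    });
  });
}
```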
Not sure how scalable this is but if you have a better suggestion, I'm very open to it.
Client
Now this is where I am really stuck:
On my client, I am listening for the `stream` event and want to play the stream as audio output in my browser. I have a function that receives the event, but I am stuck as to how I can use the `ArrayBuffer` object that is being returned.
```javascript
retrieveAudioStream = () => {
  this.socket.on('stream', (buffer) => {
    // ... how can I listen to the buffer as audio
  })
}
```
- Is the way I am streaming audio the best / only way I can upload to the node server?
- How can I listen to the `ArrayBuffer` object that is being returned on my client side?
Answer 1:
- Is the way I am streaming audio the best / only way I can upload to the node server?
Not really the best, but I have seen worse. It's not the only way either; using WebSockets is considered fine here, since you want things to be "live" rather than sending an HTTP POST request every 5 seconds.
- How can I listen to the arrayBuffer object that is being returned on my client side?
You can try `BaseAudioContext.decodeAudioData` to listen to the data streamed; the example in the docs is pretty simple.
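One caveat: `decodeAudioData` expects a complete encoded audio file (e.g. WAV, MP3, Ogg), while the question streams raw Int16 PCM. For that case, a minimal sketch (assuming the received `ArrayBuffer` holds the Int16 samples produced by the question's `convertFloat32ToInt16`, and that the sender's sample rate matches the receiving context, which is not guaranteed in practice) converts back to Float32 and plays it through an `AudioBuffer`:

```javascript
// Pure helper: raw Int16 PCM back to Float32 samples in [-1, 1].
function int16ToFloat32(arrayBuffer) {
  const int16 = new Int16Array(arrayBuffer);
  const float32 = new Float32Array(int16.length);
  for (let i = 0; i < int16.length; i++) {
    float32[i] = int16[i] < 0 ? int16[i] / 0x8000 : int16[i] / 0x7fff;
  }
  return float32;
}

// Sketch: play one received chunk through an existing AudioContext.
// Assumes mono audio and a matching sample rate on both ends.
function playChunk(context, arrayBuffer) {
  const samples = int16ToFloat32(arrayBuffer);
  const audioBuffer = context.createBuffer(1, samples.length, context.sampleRate);
  audioBuffer.copyToChannel(samples, 0);
  const source = context.createBufferSource();
  source.buffer = audioBuffer;
  source.connect(context.destination);
  source.start();
}
```

Playing back-to-back chunks this way tends to produce audible gaps; scheduling each `source.start()` against a running clock helps, but that is beyond a sketch.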
From the code snippets you provide, I assume you want to build something from scratch to learn how things work.
In that case, you can try the MediaStream Recording API along with a WebSocket server that sends the chunks to X clients so they can reproduce the audio, etc.
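A rough sketch of that recording side, assuming `stream` and `socket` exist as in the question (`startRecording` is a hypothetical name):

```javascript
// Sketch: record the MediaStream in timeslices and ship each encoded
// chunk over the socket. MediaRecorder produces containerized audio
// (e.g. webm/opus), not raw PCM.
function startRecording(stream, socket) {
  const recorder = new MediaRecorder(stream);
  recorder.ondataavailable = (e) => {
    if (e.data.size > 0) socket.emit('stream', e.data); // e.data is a Blob
  };
  recorder.start(250); // fire dataavailable roughly every 250 ms
  return recorder;
}
```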
It would also make sense to invest time in the WebRTC API, to learn how to stream from one client to another.
Also take a look at the links below for some useful information.
- (stackoverflow) Get live streaming audio from NodeJS server to clients
- (github) video-conference-webrtc
- twitch.tv tech stack article
- rtc.io
Source: https://stackoverflow.com/questions/50532474/how-to-create-a-live-media-stream-with-javascript