Question
I have a .NET Core console application which acts as both a UDP server and a UDP client:
- as a UDP client, it receives audio packets;
- as a UDP server, it sends out each received packet.
Here's sample code from the console app:
using System;
using System.Net;
using System.Net.Sockets;

static UdpClient udpListener = new UdpClient();
static IPEndPoint endPoint = new IPEndPoint(IPAddress.Parse("192.168.1.230"), 6980);
static IAudioSender audioSender = new UdpAudioSender(new IPEndPoint(IPAddress.Parse("192.168.1.230"), 65535));

static void Main(string[] args)
{
    // Allow rebinding the port and start listening for audio packets.
    udpListener.Client.SetSocketOption(SocketOptionLevel.Socket, SocketOptionName.ReuseAddress, true);
    udpListener.Client.Bind(endPoint);
    try
    {
        udpListener.BeginReceive(new AsyncCallback(recv), null);
    }
    catch (Exception)
    {
        throw; // rethrow without resetting the stack trace ("throw e;" would lose it)
    }
    Console.WriteLine("Press enter to dispose the running service");
    Console.ReadLine();
}

private static void recv(IAsyncResult res)
{
    // Complete the pending receive, forward the packet, and queue the next receive.
    byte[] received = udpListener.EndReceive(res, ref endPoint);
    OnAudioCaptured(received); // hands the bytes to audioSender (definition omitted in the post)
    udpListener.BeginReceive(new AsyncCallback(recv), null);
}
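(For anyone reproducing this setup: a quick way to confirm that packets are actually leaving the console app is a throwaway Node.js dgram listener bound to the target port. This is just a debugging aid of my own, not part of the pipeline; run it instead of FFmpeg, since both cannot bind the same port at once.)

// debug-listener.js - minimal sketch to verify UDP packets arrive on port 65535
const dgram = require("dgram");

const socket = dgram.createSocket("udp4");
socket.on("message", (msg, rinfo) => {
  // Each datagram should be a chunk of raw s16le audio from the console app.
  console.log(`received ${msg.length} bytes from ${rinfo.address}:${rinfo.port}`);
});
socket.bind(65535, () => console.log("listening on udp://0.0.0.0:65535"));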
On the other side, I have a Node.js API application, which is supposed to execute an FFmpeg command as a child process and do the following:
- receive the audio packets as input from the console app's UDP server;
- convert the received bytes into WebM;
- pipe the result out into the response.
Finally, on the client side, I should have an audio element whose source is http://localhost:3000.
For now, I can only execute this FFmpeg command:
ffmpeg -f s16le -ar 48000 -ac 2 -i 'udp://192.168.1.230:65535' output.wav
which does the following:
- receives the UDP packets as input;
- converts the received bytes into the output.wav audio file.
How would I execute a child process in the Node.js server which receives the UDP packets and pipes the result out into the response as WebM?
Answer 1:
Huge thanks to Brad, who directed me to the solution.
In the Node.js server I had to do the following:
- execute FFmpeg as a child process which receives the UDP packets as input;
- pipe each chunk of the child process's output out into the response.
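The key change from the command in the question is the output: instead of writing output.wav, FFmpeg writes the encoded WebM to stdout (the trailing "-"), so the server can stream it to the client as it is produced:
ffmpeg -f s16le -ar 48000 -ac 2 -i udp://192.168.1.230:65535 -b:a 128k -f webm -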
Here's the source code of the Node.js server:
var http = require("http");
var port = 8888;
var host = "localhost";
var children = require("child_process");

http
  .createServer(function (req, res) {
    // Spawn FFmpeg: read raw s16le PCM from the UDP socket, encode to WebM,
    // and write the result to stdout ("-").
    // ffmpeg -f s16le -ar 48000 -ac 2 -i 'udp://192.168.1.230:65535' -b:a 128k -f webm -
    var ffm = children.spawn(
      "ffmpeg",
      "-f s16le -ar 48000 -ac 2 -i udp://192.168.1.230:65535 -b:a 128k -f webm -".split(" ")
    );
    res.writeHead(200, { "Content-Type": "audio/webm;codecs=vorbis" });
    // Forward each encoded chunk to the HTTP response as soon as FFmpeg emits it.
    ffm.stdout.on("data", (data) => {
      console.log(data);
      res.write(data);
    });
  })
  .listen(port, host);

console.log("Server running at http://" + host + ":" + port + "/");
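(A small refinement worth considering, not part of the original answer: piping FFmpeg's stdout straight into the response handles backpressure for you, and killing the child when the client disconnects avoids leaking one FFmpeg process per request. A minimal sketch under those assumptions:)

var http = require("http");
var children = require("child_process");

http
  .createServer(function (req, res) {
    var ffm = children.spawn(
      "ffmpeg",
      "-f s16le -ar 48000 -ac 2 -i udp://192.168.1.230:65535 -b:a 128k -f webm -".split(" ")
    );
    res.writeHead(200, { "Content-Type": "audio/webm;codecs=vorbis" });
    // pipe() respects backpressure instead of unconditionally calling res.write().
    ffm.stdout.pipe(res);
    // Stop the encoder when the listener goes away.
    req.on("close", function () {
      ffm.kill("SIGKILL");
    });
  })
  .listen(8888, "localhost");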
Client-side:
<audio src="http://localhost:8888" type='audio/webm; codecs="vorbis"' controls preload="none"></audio>
Source: https://stackoverflow.com/questions/62006071/ffmpeg-how-to-play-live-audio-in-the-browser-from-received-udp-packets-using-f