How can I find the response time (latency) of a client in NodeJS with sockets (socket.io)?

Backend · open · 6 answers · 980 views
Asked by 故里飘歌 on 2021-01-30 07:11

I'm trying to create a multiplayer game with NodeJS and I want to synchronize the actions between clients.

What would be the best way to find the latency (the time that a message takes to travel between the server and a client) for each connected client?

6 Answers
  • 2021-01-30 07:49

    Here's my really quick and dirty script to test the ping: just head to http://yourserver:8080 in your browser and watch the console (an ssh terminal in my case).

    var http = require('http');
    var io = require('socket.io');
    
    var server = http.createServer(function (req, res) {
      res.writeHead(200, {'Content-Type': 'text/html'});
      res.write('<html>\n');
      res.write('  <head>\n');
      res.write('    <title>Node Ping</title>\n');
      res.write('    <script src="/socket.io/socket.io.js"></script>\n');
      res.write('    <script>\n');
      res.write('        var socket = new io.Socket();\n');
      res.write('        socket.on("connect",function(){ });\n');
      res.write('        socket.on("message",function(){ socket.send(1); });\n');
      res.write('        socket.connect();\n');
      res.write('    </script>\n');
      res.write('  </head>\n');
      res.write('  <body>\n');
      res.write('    <h1>Node Ping</h1>\n');
      res.write('  </body>\n');
      res.write('</html>\n');
      res.end();
    });
    server.listen(8080);
    
    console.log('Server running at http://127.0.0.1:8080/');
    
    var socket = io.listen(server);
    
    socket.on('connection',function(client){
      var start = new Date().getTime();
      client.send(1);
      client.on('message',function(message){
        console.log( new Date().getTime() - start ); // round-trip time in ms
        start = new Date().getTime();                // restart the timer
        client.send(1);                              // ping the client again
      });
      client.on('disconnect',function(){});
    });
    

    I'm very curious about this because my pings seem pretty high (200-400 ms round trip) on large VPS boxes with dedicated resources, both in California and New Jersey (I'm on the East Coast). I'm betting there's just a lot of latency on the VPS boxes because they're serving so much traffic?

    The thing that gets me is that a regular ping from the Linux terminal, from the same client to the same server, is 11 ms on average, a factor of 10 lower. Am I doing something wrong, or is something slow in node.js/socket.io/websockets?

  • 2021-01-30 07:56

    After reading all these answers...

    ...I still wasn't satisfied. I visited the official docs and, well, well, well - the solution is already built in.

    You just need to hook into it - check out mine:

    Client

    // (Connect to socket).
    
    var latency = 0;
    
    socket.on('pong', function(ms) {
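        // 'pong' is the built-in heartbeat event in socket.io 1.x/2.x; ms is the measured latency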
        latency = ms;
    
        console.log(latency);
    });
    
    // Do cool things, knowing the latency...
    

    Server

    var server = require('http').Server(app);
    
    // "socket.io": "^1.7.1"
    // Set pingInterval to whatever you want - 'pong' gets emitted for you!
    var io = require('socket.io')(server, {pingInterval: 5000});
    
  • 2021-01-30 08:01

    Overview:

    After the socket.io connection has been established, you create a new Date object on the client; let's call it startTime. This is your initial time, taken just before making a request to the server. You then emit a ping event from the client (the naming convention is entirely up to you). Meanwhile the server is listening for the ping event, and when it receives it, it immediately emits a pong event back. The client catches the pong event and at that point takes another timestamp with Date.now(). So you now have two timestamps: one taken before the request to the server, and one taken after the server replied. Subtract startTime from the current time and you have the latency (the round-trip time).

    Client

    var socket = io.connect('http://localhost');
    var startTime;
    
    setInterval(function() {
      startTime = Date.now();
      socket.emit('ping');
    }, 2000);
    
    socket.on('pong', function() {
      var latency = Date.now() - startTime;
      console.log(latency);
    });
    

    Server

    io.sockets.on('connection', function (socket) {
      socket.on('ping', function() {
        socket.emit('pong');
      });
    });
    

    Also available as a Github Gist.

  • 2021-01-30 08:09

    Read first: due to repeated questions about why this is supposed to work, let me clarify a bit.

    • The client callback function is executed on the client, which is why it has access to the closure, including the start variable containing the timestamp. This is the ack() argument in socket.io.
    • The server naturally cannot call an arbitrary function on the client, let alone access that function's closure. But socket.io lets you define an acknowledgement callback: it appears to be executed by the server, while in reality the server only passes the callback's arguments back through the web socket, and the client then invokes the callback locally.

    What happens below (please do check the example code!):

    1. Client stores current timestamp 1453213686429 in start
    2. Client sends a ping event to the server and is waiting for an answer
    3. Server responds to the ping event with “Please call your callback with empty arguments”
    4. Client receives the response and calls clientCallback with empty arguments (Check the demo code if you want to see arguments)
    5. clientCallback again takes the current timestamp on the client, e.g. 1453213686449, and knows that 20 ms have passed since it sent the request.

    Imagine the druid (client) holding a stopwatch and pushing the button when the messenger (event) starts running, and pushing it again when the messenger arrives with his scroll (function arguments). The druid then reads the scroll and adds the ingredient names to his potion recipe and brews the potion. (callback)

    Okay, forget the previous paragraph, I guess you got the point.


    Although the question has already been answered, here is a short implementation for checking the RTT with socket.io:

    Client

    var start = Date.now();
    socket.emit( 'ping', function clientCallback() {
        console.log( 'Websocket RTT: ' + (Date.now() - start) + ' ms' );
    } );
    

    Server

    socket.on( 'ping', function ( fn ) {
        fn(); // Simply execute the callback on the client
    } );
    

    Demo Code

    Demo code as a node module: socketIO-callback.tgz. Set it up and run it with

    npm install
    node callback.js
    

    and then navigate to http://localhost:5060

  • 2021-01-30 08:11

    What I usually do is send a timestamp with every request (a small sketch follows the steps):

    1. On the client, create a new Date() and send timestamp: date.getTime() to the server with every JSON request.
    2. On the server, upon receiving a request, put a processed: (new Date()).getTime() field in the object.
    3. Handle the request.
    4. In the response, include the timestamp from the request and a new processed field: processed: (new Date()).getTime() - req.processed, which now contains the number of milliseconds it took to process the request.
    5. On the client, when receiving the response, take the timestamp (the same one that was sent in step 1), subtract it from the current time, then subtract the processing time (processed), and there is your "real" ping time in milliseconds.
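
    A minimal sketch of those five steps with socket.io; the 'request'/'response' event names and the message fields are illustrative choices for this example, not part of any library API:

    // Client (sketch): attach a timestamp to every request
    var socket = io.connect('http://localhost');

    socket.emit('request', { timestamp: Date.now(), payload: 'do something' });

    socket.on('response', function (msg) {
      // total round trip minus the time the server spent handling the request
      var ping = Date.now() - msg.timestamp - msg.processed;
      console.log('network-only ping: ' + ping + ' ms');
    });

    // Server (sketch): echo the timestamp back and report the processing time
    io.sockets.on('connection', function (socket) {
      socket.on('request', function (msg) {
        var received = Date.now();
        // ... handle the request ...
        socket.emit('response', {
          timestamp: msg.timestamp,          // same value the client sent (step 1)
          processed: Date.now() - received   // server handling time (steps 2-4)
        });
      });
    });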

    I think you should always include the time for both the request and the response in the ping time, even if the actual communication is one-way, because that is the standard meaning of "ping time" and "latency". If the communication is one-way and the real latency is only half the measured ping time, that's just a "good thing".

  • 2021-01-30 08:13

    I'm going to assume you are using WebSockets or Socket.IO since you are implementing a game where latency matters (and you tagged it as such).

    I would think the server should probably measure and keep track of this for each client.

    You probably want to implement some sort of ping action that the server can request of the client: as soon as the client receives the request, it sends a response back to the server. The server then divides the round-trip time by two and updates the latency for that client. You probably want the server to do this periodically with each client, and to average the last several measurements, so that you don't get strange behavior from sudden but temporary spikes (a rough sketch follows below).

    Then, when there is a message from one client that needs to be sent (or broadcast) to another client, the server can add client1's latency to client2's latency and communicate this as the latency offset to client2 as part of the message. client2 will then know that the event on client1 happened that many milliseconds ago.
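
    A rough server-side sketch of that idea, assuming socket.io's acknowledgement callbacks; the 'latency-ping' event name and the five-sample window are arbitrary choices for illustration:

    // Server: ping each client periodically and keep a small rolling average
    var SAMPLES = 5;

    io.sockets.on('connection', function (socket) {
      var samples = [];
      socket.latency = 0;

      var timer = setInterval(function () {
        var start = Date.now();
        // The client only has to invoke the acknowledgement callback
        socket.emit('latency-ping', function () {
          samples.push((Date.now() - start) / 2);   // one-way estimate: half the round trip
          if (samples.length > SAMPLES) samples.shift();
          socket.latency = samples.reduce(function (a, b) { return a + b; }, 0) / samples.length;
        });
      }, 5000);

      socket.on('disconnect', function () { clearInterval(timer); });
    });

    // Client: acknowledge immediately
    socket.on('latency-ping', function (ack) { ack(); });

    When relaying an event from client1 to client2, the server can then attach socket1.latency + socket2.latency as the latency offset described above.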

    An additional reason to do this on the server is that some browsers' JavaScript timestamps are inaccurate: http://ejohn.org/blog/accuracy-of-javascript-time/. I suspect node.js timestamps are just as accurate as (or more accurate than) Chrome's V8, which is one of the few accurate ones.
