Question
I need to measure the one-way latency between two applications that communicate over a LAN and report the result to a data collection server.
The client application sends the data using multicast; it then passes through two servers, and the last server is the endpoint of this test, like so:
Agent -> multicast cloud -> server 1 -> server 2
I thought about using NTP (or PTP on a LAN) to synchronize "agent" and "server 2", but I wonder what the right algorithm is to implement this and what its precision would be.
How can I perform this measurement (using C#), and what would its precision be?
UPDATE: Note that the data is processed between the agent and server 2, so the measurement is not purely network latency.
Answer 1:
The underlying problem is synchronizing the clocks of two or more machines.
Synchronization (of clocks) between two remote computers
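For reference, here is a minimal sketch of estimating the local clock offset against an NTP server in C#, using the standard SNTP four-timestamp formula. The server name (pool.ntp.org), timeout, and class/method names are assumptions for illustration, not something prescribed by the answer:

```csharp
using System;
using System.Net;
using System.Net.Sockets;

static class NtpOffset
{
    // Estimate how far the local clock is from an NTP server's clock.
    public static TimeSpan EstimateOffset(string server = "pool.ntp.org")
    {
        var request = new byte[48];
        request[0] = 0x1B; // LI = 0, VN = 3, Mode = 3 (client request)

        using var udp = new UdpClient(server, 123);
        udp.Client.ReceiveTimeout = 3000;

        DateTime t1 = DateTime.UtcNow;            // client transmit time
        udp.Send(request, request.Length);
        var remote = new IPEndPoint(IPAddress.Any, 0);
        byte[] response = udp.Receive(ref remote);
        DateTime t4 = DateTime.UtcNow;            // client receive time

        DateTime t2 = NtpTimestamp(response, 32); // server receive time
        DateTime t3 = NtpTimestamp(response, 40); // server transmit time

        // Standard NTP offset estimate: ((t2 - t1) + (t3 - t4)) / 2
        return TimeSpan.FromTicks(((t2 - t1).Ticks + (t3 - t4).Ticks) / 2);
    }

    static DateTime NtpTimestamp(byte[] data, int offset)
    {
        // NTP timestamps: 32-bit seconds + 32-bit fraction since 1900-01-01 UTC.
        ulong seconds  = ((ulong)data[offset]     << 24) | ((ulong)data[offset + 1] << 16)
                       | ((ulong)data[offset + 2] << 8)  | data[offset + 3];
        ulong fraction = ((ulong)data[offset + 4] << 24) | ((ulong)data[offset + 5] << 16)
                       | ((ulong)data[offset + 6] << 8)  | data[offset + 7];
        double ms = seconds * 1000.0 + fraction * 1000.0 / 0x100000000L;
        return new DateTime(1900, 1, 1, 0, 0, 0, DateTimeKind.Utc).AddMilliseconds(ms);
    }
}
```

In practice you would average several such measurements (or simply rely on the OS NTP/PTP daemon) rather than trusting a single query.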
Once you have this, you simply add a "construction time" to the packet itself when it is created/sent. The one-way latency from start to finish is the "arrival time" minus the "construction time", as sketched below.
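A minimal sketch of that idea over UDP multicast, assuming both clocks are already synchronized. The multicast group 239.0.0.1, port 5000, and the ASCII encoding of the tick count are illustrative assumptions:

```csharp
using System;
using System.Net;
using System.Net.Sockets;
using System.Text;

static class OneWayLatency
{
    // Agent side: stamp the packet with the send ("construction") time in UTC ticks.
    public static void SendOnce()
    {
        using var udp = new UdpClient();
        var group = new IPEndPoint(IPAddress.Parse("239.0.0.1"), 5000);

        long sendTicks = DateTime.UtcNow.Ticks;
        byte[] payload = Encoding.ASCII.GetBytes(sendTicks.ToString());
        udp.Send(payload, payload.Length, group);
    }

    // Server 2 side: one-way latency = arrival time - embedded construction time.
    // Only meaningful if both machines' clocks are synchronized (e.g. NTP/PTP).
    public static void ReceiveOnce()
    {
        using var udp = new UdpClient(5000);
        udp.JoinMulticastGroup(IPAddress.Parse("239.0.0.1"));

        var remote = new IPEndPoint(IPAddress.Any, 0);
        byte[] payload = udp.Receive(ref remote);

        long sendTicks = long.Parse(Encoding.ASCII.GetString(payload));
        long arrivalTicks = DateTime.UtcNow.Ticks;

        TimeSpan oneWay = TimeSpan.FromTicks(arrivalTicks - sendTicks);
        Console.WriteLine($"One-way latency: {oneWay.TotalMilliseconds:F3} ms");
    }
}
```

The precision is bounded by the clock synchronization, not by the code: expect on the order of milliseconds with NTP on a LAN, and microseconds with PTP and suitable hardware.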
If you cannot trust the intermediate nodes, then you would have the agent register with the server that it expects to send a packet at a specific time. The one-way latency from start to finish is then the "arrival time" minus that registered send time.
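A minimal sketch of that variant, assuming the registration message arrives over a separate, trusted channel; the class, method names, and sequence-number keying are hypothetical:

```csharp
using System;
using System.Collections.Generic;

// Server 2 keeps the send times the agent registered in advance, so the
// latency calculation does not depend on a timestamp carried in the packet.
static class ExpectedArrival
{
    static readonly Dictionary<int, DateTime> registeredSendTimes = new();

    // Called when the agent's registration ("I will send packet N at time T") arrives.
    public static void Register(int sequence, DateTime sendTimeUtc) =>
        registeredSendTimes[sequence] = sendTimeUtc;

    // Called when packet N actually arrives at server 2.
    public static TimeSpan? OnArrival(int sequence, DateTime arrivalUtc) =>
        registeredSendTimes.TryGetValue(sequence, out var sendTime)
            ? (TimeSpan?)(arrivalUtc - sendTime)
            : null;
}
```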
Source: https://stackoverflow.com/questions/15177697/how-to-can-i-implement-one-way-latency-measurment-using-c