How can I implement one-way latency measurement using C#?

Submitted by 梦想的初衷 on 2019-12-11 00:31:53

Question


I need to measure one-way latency between two applications that communicate over a LAN, and report the result to a data collection server.

The client application sends the data using multicast; it then passes through two servers, and the last server is the end point of this test, like so:

Agent -> multicast cloud -> server 1 -> server 2

I thought about using NTP (or PTP on a LAN) to synchronize "agent" and "server 2", but I wonder what the right algorithm to implement this is and what its precision would be.

How can I perform this measurement (using C#), and what would its precision be?

UPDATE: note that the data is processed between the agent and server 2, so the measurement is not purely a network measurement.


Answer 1:


The underlying problem is synchronizing the clocks between two or more machines:

Synchronization (of clocks) between two remote computers
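
For a rough sense of what clock synchronization involves, here is a minimal SNTP query in C# that estimates the offset between the local clock and an NTP server. This is only a sketch, not the method prescribed by the answer: "pool.ntp.org" is a placeholder, and for real measurements you would run a proper NTP/PTP daemon on both machines and let the OS discipline the clock.

```csharp
using System;
using System.Net;
using System.Net.Sockets;

class SntpClient
{
    // "pool.ntp.org" is an assumption; use your LAN's NTP server instead.
    public static TimeSpan QueryClockOffset(string server = "pool.ntp.org")
    {
        var request = new byte[48];
        request[0] = 0x1B; // LI = 0, Version = 3, Mode = 3 (client)

        using var udp = new UdpClient(server, 123);
        udp.Client.ReceiveTimeout = 3000;

        DateTime t1 = DateTime.UtcNow;          // client transmit time
        udp.Send(request, request.Length);
        var remote = new IPEndPoint(IPAddress.Any, 0);
        byte[] reply = udp.Receive(ref remote);
        DateTime t4 = DateTime.UtcNow;          // client receive time

        DateTime t2 = ReadTimestamp(reply, 32); // server receive time
        DateTime t3 = ReadTimestamp(reply, 40); // server transmit time

        // Standard NTP offset formula; assumes a symmetric network path.
        return TimeSpan.FromTicks(((t2 - t1) + (t3 - t4)).Ticks / 2);
    }

    static DateTime ReadTimestamp(byte[] data, int offset)
    {
        // NTP timestamps are big-endian: 32-bit seconds + 32-bit fraction,
        // counted from 1900-01-01 UTC.
        ulong seconds = 0, fraction = 0;
        for (int i = 0; i < 4; i++) seconds  = (seconds  << 8) | data[offset + i];
        for (int i = 0; i < 4; i++) fraction = (fraction << 8) | data[offset + 4 + i];
        double ms = seconds * 1000.0 + fraction * 1000.0 / 0x100000000L;
        return new DateTime(1900, 1, 1, 0, 0, 0, DateTimeKind.Utc).AddMilliseconds(ms);
    }
}
```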

Once you have this, you simply add a "construction time" to the packet itself when it is created/sent. The one-way latency from start to finish is the "arrival time" minus the "construction time".
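
A minimal sketch of that idea, assuming the two machines' clocks are already synchronized: the agent embeds DateTime.UtcNow.Ticks in the packet, and the receiver subtracts it from its own clock on arrival. The multicast group and port below are placeholders.

```csharp
using System;
using System.Net;
using System.Net.Sockets;

class OneWayLatency
{
    const string Group = "239.0.0.1"; // placeholder multicast group
    const int Port = 5000;            // placeholder port

    // Agent side: stamp the packet with its construction time.
    public static void Send()
    {
        using var udp = new UdpClient();
        byte[] packet = BitConverter.GetBytes(DateTime.UtcNow.Ticks);
        udp.Send(packet, packet.Length, new IPEndPoint(IPAddress.Parse(Group), Port));
    }

    // Server 2 side: latency = arrival time minus construction time.
    public static void Receive()
    {
        using var udp = new UdpClient(Port);
        udp.JoinMulticastGroup(IPAddress.Parse(Group));
        var remote = new IPEndPoint(IPAddress.Any, 0);
        byte[] packet = udp.Receive(ref remote);

        long sentTicks = BitConverter.ToInt64(packet, 0);
        TimeSpan latency = DateTime.UtcNow - new DateTime(sentTicks, DateTimeKind.Utc);
        Console.WriteLine($"One-way latency: {latency.TotalMilliseconds:F3} ms");
    }
}
```

On the precision question: the result is bounded by the clock synchronization, not by the code. NTP typically achieves on the order of a millisecond on a LAN, while PTP with hardware support can reach microseconds, so sub-millisecond one-way measurements realistically require PTP.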

If you cannot trust the intermediate nodes, then you would have to have the agent register with the server that it expects to send a packet at a specific time. The one-way latency from start to finish is then the "arrival time" minus the registered "send time".
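
A sketch of that registration variant, with an in-memory dictionary standing in for the trusted side channel (in practice this would be an RPC or TCP message to the collection server); all names here are hypothetical:

```csharp
using System;
using System.Collections.Generic;

class RegisteredSendLatency
{
    // Stands in for the trusted channel between agent and server 2.
    static readonly Dictionary<int, long> ExpectedSendTicks = new();

    // Agent side: announce in advance when a given packet will be sent.
    public static void Register(int packetId, DateTime sendTimeUtc) =>
        ExpectedSendTicks[packetId] = sendTimeUtc.Ticks;

    // Server 2 side: latency = arrival time minus the registered send time,
    // so nothing inside the packet itself needs to be trusted.
    public static TimeSpan OnArrival(int packetId)
    {
        DateTime arrival = DateTime.UtcNow;
        var sent = new DateTime(ExpectedSendTicks[packetId], DateTimeKind.Utc);
        return arrival - sent;
    }
}
```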



Source: https://stackoverflow.com/questions/15177697/how-to-can-i-implement-one-way-latency-measurment-using-c
