I want to measure bandwidth using C#. Here's what I did. Comments and suggestions are welcome.
- Find the maximum UDP payload (on my test bed, it's 1472 bytes)
- Create non-compressible data of 1472 bytes
- Send this data from the server to the client multiple times (in my test, 5000 packets)
- The client starts a stopwatch when the first packet arrives
- When all data has been sent, the server sends a notification to the client stating that all data has been sent
- The client stops the stopwatch (a simplified sketch of this client side follows the list)
- I calculate bandwidth as (total packets sent (5000) * MTU (1500 bytes)) / elapsed time
- I notice that some packets are lost: at best about 20% loss, at worst about 40%. I did not account for this when calculating the bandwidth. I suspect the client's network device experiences buffer overruns. Do I need to take this factor into account?
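A simplified sketch of the client side described above, assuming the server signals the end of the run with a short marker datagram (the port number and end-of-data check are placeholders for illustration, not exact code):

```csharp
using System;
using System.Diagnostics;
using System.Net;
using System.Net.Sockets;

class UdpBandwidthClient
{
    const int Port = 11000;        // assumed port
    const int PayloadSize = 1472;  // max UDP payload observed on the test bed

    static void Main()
    {
        using var udp = new UdpClient(Port);
        var remote = new IPEndPoint(IPAddress.Any, 0);
        var sw = new Stopwatch();
        long bytesReceived = 0;
        int packets = 0;

        while (true)
        {
            byte[] data = udp.Receive(ref remote);

            if (!sw.IsRunning)
                sw.Start();                 // start timing on the first packet

            if (data.Length < PayloadSize)  // assumption: a short datagram marks "all data sent"
                break;

            bytesReceived += data.Length;
            packets++;
        }

        sw.Stop();

        // Throughput based on what actually arrived (UDP payload only),
        // which sidesteps the question of how to count lost packets.
        double seconds = sw.Elapsed.TotalSeconds;
        double mbps = bytesReceived * 8 / seconds / 1_000_000;
        Console.WriteLine($"{packets} packets, {bytesReceived} bytes in {seconds:F3} s -> {mbps:F2} Mbps");
    }
}
```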
If you guys have any suggestions or comments, feel free to share them.
Thanks.
To measure bandwidth, I would use TCP instead of UDP. When you use UDP, all the datagrams may leave your network card very fast (at 100 Mbps) and get queued at the slowest link in the chain (e.g. a 512 kbps cable modem/router). If that queue buffer fills up, it's likely that datagrams will be discarded, so your test is not very reliable.

I would use TCP and do some math to convert TCP speed (KB/s) into throughput (Mbps) (I think TCP overhead is around 8%).
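For example, a minimal sketch of the TCP approach on the receiving side: the client connects, reads a known amount of data, and times the transfer. Host, port, and transfer size here are placeholders, and the 8% overhead factor is just the rough figure mentioned above:

```csharp
using System;
using System.Diagnostics;
using System.Net.Sockets;

class TcpBandwidthClient
{
    static void Main()
    {
        const string host = "192.168.0.10"; // assumed server address
        const int port = 11000;             // assumed port
        const long totalBytes = 10_000_000; // amount the server is expected to send

        using var client = new TcpClient(host, port);
        using NetworkStream stream = client.GetStream();

        var buffer = new byte[64 * 1024];
        long received = 0;
        var sw = Stopwatch.StartNew();

        while (received < totalBytes)
        {
            int n = stream.Read(buffer, 0, buffer.Length);
            if (n == 0) break;   // server closed the connection early
            received += n;
        }

        sw.Stop();

        double seconds = sw.Elapsed.TotalSeconds;
        double payloadMbps = received * 8 / seconds / 1_000_000;
        // Rough wire-rate estimate using the ~8% TCP/IP overhead figure.
        double wireMbps = payloadMbps * 1.08;
        Console.WriteLine($"{received} bytes in {seconds:F3} s -> {payloadMbps:F2} Mbps payload (~{wireMbps:F2} Mbps on the wire)");
    }
}
```

Because TCP retransmits lost segments, the measurement reflects data that actually got through, so you don't have to decide how to account for dropped packets the way you do with UDP.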
Source: https://stackoverflow.com/questions/4167278/how-to-calculate-bandwidth-using-c-sharp