Hello. I am doing some testing of multicast UDP video streams. I am playing back 8 streams while capturing with Wireshark. When I generate an IO graph with a tick interval of 1 sec, I get the expected average of just under 140 Mbps (8 streams at ~17 Mbps each).

What I don't understand is why the bandwidth seems to increase every time I decrease the tick interval. If I set the tick interval to 0.001 sec, I see spikes of well over 1,000,000 bits per tick. If I understand the relationship between bits and tick interval correctly, that works out to 1,000,000,000 bits/second, or 1 Gbps. If that is incorrect, could somebody please set me straight? If it is correct, how is that possible, since the video streams really shouldn't be going anywhere near that high, and to boot my NIC is only 1 Gbps. I'm very confused here. Thanks!

Mike

asked 26 Dec '12, 11:57 acedreds
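Your arithmetic is right, by the way: bits per tick divided by the tick interval in seconds gives bits per second, so 1,000,000 bits inside a 0.001 s tick is an instantaneous rate of 1 Gbps. A quick sketch of that conversion in Python, just to spell it out (the helper name is mine, and the numbers are the ones from the question, not measured values):

    # Convert a per-tick bit count from the IO graph into an average rate in bits/s.
    def tick_to_bps(bits_in_tick, tick_interval_s):
        return bits_in_tick / tick_interval_s

    print(tick_to_bps(140_000_000, 1.0))    # 140000000.0  -> ~140 Mbps with 1 s ticks
    print(tick_to_bps(1_000_000, 0.001))    # 1000000000.0 -> 1 Gbps over that one millisecond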
Can you please post screenshots for the tick intervals 1, 0.1 and 0.001?
BTW: What is your Wireshark version and OS (wireshark -v)?
Mike, remember that at any instant there are really only two utilization states: 0% utilized and 100% utilized. Either a packet is on the wire, or it's not. So as you make your averaging interval smaller, the measured peak utilization will go up; a 1 ms window that happens to catch a run of back-to-back packets sees the full line rate of the NIC, even though the long-term average is far lower. I can't speak for the validity of the graph in Wireshark, but the fact that utilization goes up when you average it over a smaller time slot is perfectly normal and expected. Google for "network microburst" and you will get a more detailed answer.
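If it helps, here is a toy Python sketch of that effect. The traffic model is purely an assumption for illustration (1500-byte packets sent back-to-back at 1 Gbps line rate, in bursts whose long-term average works out to ~140 Mbps), not your actual streams, but binning the same packet arrivals at smaller and smaller intervals shows the reported peak climbing toward line rate:

    # Toy microburst illustration: the same packet arrivals, averaged over
    # smaller and smaller bins, show higher and higher peak rates.
    PACKET_BITS = 1500 * 8        # assumed on-the-wire packet size
    LINE_RATE = 1_000_000_000     # 1 Gbps NIC: one packet takes 12 us to serialize
    AVG_RATE = 140_000_000        # assumed long-term average, ~140 Mbps

    def burst_timestamps(duration_s):
        """Packets leave back-to-back at line rate in bursts of 100, then the
        sender idles long enough that the long-term average stays at AVG_RATE."""
        ts, t = [], 0.0
        burst_len = 100                          # packets per burst (assumed)
        serialization = PACKET_BITS / LINE_RATE  # time one packet occupies the wire
        cycle = burst_len * PACKET_BITS / AVG_RATE
        while t < duration_s:
            ts.extend(t + i * serialization for i in range(burst_len))
            t += cycle
        return ts

    def peak_rate(timestamps, bin_s):
        """Bin packet arrivals and return the highest per-bin rate in bits/s."""
        bins = {}
        for t in timestamps:
            bins[int(t / bin_s)] = bins.get(int(t / bin_s), 0) + 1
        return max(bins.values()) * PACKET_BITS / bin_s

    packets = burst_timestamps(2.0)
    for interval in (1.0, 0.1, 0.01, 0.001):
        print(f"tick {interval}s: peak {peak_rate(packets, interval) / 1e6:.1f} Mbps")

On this synthetic traffic the peak sits around 140 Mbps with 1 s ticks and climbs to roughly 1 Gbps with 1 ms ticks, which is the same pattern you are seeing in the IO graph: the streams never exceed the NIC's line rate, they just occupy it 100% for very short stretches.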