
Tick interval question


Hello. I am testing multicast UDP video streams. I am playing back 8 streams while capturing with Wireshark. When I generate an IO graph with a tick interval of 1 second, I get the expected average of just under 140 Mbps (8 streams at ~17 Mbps each). As I experiment with the tick interval, though, I don't understand why the bandwidth seems to increase with every decrease in the tick interval. If I set the tick interval to 0.001, I see spikes well over 1,000,000 bits. If I understand the relationship between bits and tick interval correctly, that equals 1,000,000,000 bits/second, or 1 Gbps. If this is incorrect, could somebody please set me straight? If it is correct, how is that possible, since the video streams really shouldn't be going anywhere near that high, and, to boot, my NIC is only 1 Gbps? I'm very confused here.

Thanks! Mike

asked 26 Dec '12, 11:57

acedreds
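For reference, the conversion Mike is doing is just bits-per-tick divided by tick duration. A minimal Python sketch, using the figures from the question:

```python
# Convert a per-tick bit count from the IO graph into a rate in bits/second:
# rate = bits observed during one tick / tick duration.
def bits_per_second(bits_in_tick: float, tick_interval_s: float) -> float:
    return bits_in_tick / tick_interval_s

# The spike from the question: 1,000,000 bits in a 0.001 s tick.
print(bits_per_second(1_000_000, 0.001))  # 1000000000.0, i.e. 1 Gbps
```

So the arithmetic is right; the surprising part is explained in the comments below.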

Can you please post screenshots for the tick intervals of 1, 0.1, and 0.001?

BTW: What is your Wireshark version and OS (wireshark -v)?

(27 Dec '12, 01:34) Kurt Knochner ♦

Mike, remember, there are only two utilization states: zero percent utilized and 100% utilized. Either a packet is on the wire or it isn't. So as you make your averaging interval smaller, the measured peak utilization goes up. I can't speak to the validity of the graph in Wireshark, but the fact that utilization rises when you average it over a smaller time slot is perfectly normal and expected. Google "network microburst" and you will get a more detailed answer.

(02 Jan '13, 16:26) hansangb
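To illustrate hansangb's point, here is a minimal Python sketch of the binning effect. This is not Wireshark's IO Graph code; the packet timestamps and sizes below are invented for illustration, and it assumes (as the IO graph appears to) that each frame's bits are counted in the tick containing its timestamp:

```python
# Sketch of the "microburst" effect: the same packets, binned at ever
# smaller tick intervals, produce ever higher peak rates.
from collections import defaultdict

def peak_rate(packets, tick_s):
    """Credit each packet's bits to the tick containing its timestamp,
    then return the busiest tick's rate in bits/second."""
    bins = defaultdict(int)
    for ts, bits in packets:
        bins[int(ts / tick_s)] += bits
    return max(bins.values()) / tick_s

# A burst of ten 12,000-bit (1500-byte) packets sent back to back at
# 1 Gbps wire speed (one packet every 12 microseconds), then silence
# for the rest of the second.
burst = [(i * 12e-6, 12_000) for i in range(10)]

for tick in (1.0, 0.1, 0.001, 0.00001):
    print(f"tick={tick}s  peak={peak_rate(burst, tick) / 1e6:.2f} Mbit/s")
# tick=1.0s     peak=0.12 Mbit/s      <- the long-term average
# tick=0.1s     peak=1.20 Mbit/s
# tick=0.001s   peak=120.00 Mbit/s
# tick=1e-05s   peak=1200.00 Mbit/s   <- "faster" than a 1 Gbps NIC
```

The last line is Mike's puzzle in miniature: once the tick is shorter than one packet's serialization time, all of that packet's bits land in a single tick, so the computed rate can even exceed the NIC's line rate.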