If your time column is configured to display "seconds since beginning of capture" and you look at the timestamp (call it T) of the last packet in the capture file, then T is the total elapsed time from the first packet to the last. (Note: You can also see this if you look at Statistics -> Summary -> Elapsed; however, that currently only displays whole-second resolution, so it won't be as accurate.)
The last packet's frame number (call it F) also indicates the number of packets in the capture file, so the average packet rate for the capture is simply F/T packets/second. The stats are displayed as Rate(ms), i.e., packets per millisecond, so the first stat, which applies to all packets, is F/T/1000. All the other stats are computed the same way from the count (call it C) of packets matching that particular stat, i.e., C/T/1000.
For example, in one capture file I happened to be looking at, the elapsed time T was 14.891 seconds. During that time the total number of packets captured F was 771; therefore, the average packet rate is 771 packets/14.891 seconds = 51.776 packets/second. Wireshark displays the Rate stats in milliseconds, so for the overall rate, it shows 0.051776 packets/ms. Within the trace, one particular stat's count C showed that there were 332 packets having a packet length of 40-79 bytes, so the rate for those packets is 332/14.891/1000 = 0.022295 packets/ms.
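If you want to check the arithmetic yourself, here is a minimal Python sketch (not Wireshark code; the variables T, F, and C simply mirror the example numbers above):

```python
# Reproduce Wireshark's Rate(ms) arithmetic for the example capture.

T = 14.891   # elapsed time in seconds (timestamp of the last packet)
F = 771      # frame number of the last packet = total packet count
C = 332      # count of packets matching one stat (length 40-79 bytes)

overall_rate_ms = F / T / 1000   # packets per millisecond, all packets
stat_rate_ms    = C / T / 1000   # packets per millisecond, matching packets

print(f"overall: {overall_rate_ms:.6f} packets/ms")   # ~0.051776
print(f"40-79B:  {stat_rate_ms:.6f} packets/ms")      # ~0.022295
```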
answered 22 Sep '13, 16:50 by cmaynard ♦♦
Thank you for this in-depth answer. It would be really good to add this to either the wiki or the docs, especially the README.stats_tree document.