I am trying to learn how jitter and mean jitter are calculated. I downloaded an example pcap file named aaa.pcap from http://wiki.wireshark.org/SampleCaptures#head-6f6128a524888c86ee322aa7cbf0d7b7a8fdf353. This pcap file is the one used as the example in the RTP_statistics section of the Wireshark wiki to demonstrate the jitter calculation. When I use RTP Stream Analysis, it shows information about every RTP packet, including jitter. My question is: if Max jitter is 7.80 ms, why is Mean jitter 18.02 ms? How is Mean jitter = 18.02 ms derived?

asked 03 Feb '15, 09:14 by Antibes (accept rate: 0%)

> My question is: if Max jitter is 7.80 ms, why is Mean jitter 18.02 ms?

To my understanding, the mean value should not be larger than the max value. This looks like a bug to me. Please file a bug report at https://bugs.wireshark.org

Regards,
Kurt

answered 09 Feb '15, 15:32 by Kurt Knochner ♦ (accept rate: 15%)
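For context on why mean > max is suspicious: Wireshark's per-packet jitter column follows the interarrival jitter estimator of RFC 3550 (section 6.4.1), and the mean jitter is simply the average of those per-packet values, so it can never exceed the maximum. Below is a minimal sketch of that estimator; the function name, packet data, and clock rate are invented for illustration and are not Wireshark's actual code.

```python
def rtp_jitter_series(arrivals, rtp_timestamps, clock_rate):
    """Per-packet jitter estimates in seconds, per RFC 3550 section 6.4.1.

    arrivals       -- wall-clock arrival times of each packet (seconds)
    rtp_timestamps -- RTP timestamp field of each packet (clock ticks)
    clock_rate     -- RTP clock rate, e.g. 8000 Hz for G.711 audio
    """
    jitter = 0.0
    series = []
    for i in range(1, len(arrivals)):
        # D(i-1, i): change in relative transit time between consecutive
        # packets (arrival spacing minus the spacing implied by timestamps)
        transit_delta = (arrivals[i] - arrivals[i - 1]) - \
            (rtp_timestamps[i] - rtp_timestamps[i - 1]) / clock_rate
        # RFC 3550 smoothing: move 1/16 of the way toward |D|
        jitter += (abs(transit_delta) - jitter) / 16.0
        series.append(jitter)
    return series

# Hypothetical stream: packets nominally 20 ms apart (160 ticks at 8 kHz)
# with some arrival-time wobble.
arrivals = [0.000, 0.021, 0.039, 0.062, 0.080]
timestamps = [0, 160, 320, 480, 640]
series = rtp_jitter_series(arrivals, timestamps, 8000)
max_jitter = max(series)
mean_jitter = sum(series) / len(series)
```

Since the mean of the series is mathematically bounded by its maximum, a display showing Mean jitter (18.02 ms) above Max jitter (7.80 ms) cannot come from averaging the per-packet column, which supports the bug-report suggestion above.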

powered by OSQA