What is jitter and how do we measure it?

  • Jitter is the variation (variance) across successive latency measurements. A network with constant latency has no variation, and therefore no jitter. Zero jitter means every measurement returned exactly the same result; anything above zero is the amount by which the results varied. A lower jitter value is better and indicates a more stable connection.
  • We measure latency 10 times and report the minimum as the latency result.
  • Jitter is the variance of these 10 latency values, expressed in milliseconds.
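The procedure above can be sketched in a few lines of Python. This is a minimal sketch, not the actual test implementation: `measure_latency_ms` is a hypothetical stand-in for a real ping/RTT probe, and the spread is computed with `statistics.pstdev` (one common way to express jitter in milliseconds; `statistics.pvariance` would give milliseconds squared).

```python
import random
import statistics

def measure_latency_ms():
    # Hypothetical probe: stands in for a real round-trip-time measurement.
    return 20.0 + random.uniform(0.0, 3.0)

def latency_and_jitter(samples=10):
    results = [measure_latency_ms() for _ in range(samples)]
    latency = min(results)               # reported latency: the minimum of the samples
    jitter = statistics.pstdev(results)  # spread of the samples, in milliseconds
    return latency, jitter

latency, jitter = latency_and_jitter()
print(f"latency = {latency:.2f} ms, jitter = {jitter:.2f} ms")
```

A perfectly stable connection would return identical samples, giving a jitter of exactly 0 ms.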

Jitter Result    Acceptability
   <   1 ms      Excellent
   <   5 ms      Extremely Good
   <  20 ms      Very Good
   <  50 ms      Good
   <  80 ms      Good to Fair
   < 100 ms      Fair
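The table above maps directly to a small lookup function. This is an illustrative sketch: the thresholds are taken from the table, while the "Poor" label for values of 100 ms and above is an assumption, since the table does not rate that range.

```python
def jitter_rating(jitter_ms):
    # Upper bounds (exclusive) and labels, taken from the acceptability table.
    thresholds = [
        (1, "Excellent"),
        (5, "Extremely Good"),
        (20, "Very Good"),
        (50, "Good"),
        (80, "Good to Fair"),
        (100, "Fair"),
    ]
    for limit, label in thresholds:
        if jitter_ms < limit:
            return label
    return "Poor"  # assumed label: 100 ms and above is not rated in the table

print(jitter_rating(12.5))  # → Very Good
```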
