Jitter in audio or video streams is indeed measured in milliseconds (ms), representing the variation in the time it takes for data to travel from the source to the destination. The acceptable level of jitter before it causes audible or visible distortion varies by application and standard; a commonly cited threshold is around 30 ms.
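For illustration, here is a minimal sketch of how that time-variation figure is commonly estimated in practice, loosely following the smoothed interarrival-jitter estimator defined for RTP in RFC 3550. The function name and the timestamp values are hypothetical, purely for the example:

```python
def interarrival_jitter_ms(send_times_ms, recv_times_ms):
    """Running jitter estimate (RFC 3550 style) from matched
    send/receive timestamps, all in milliseconds."""
    jitter = 0.0
    prev_transit = None
    for sent, received in zip(send_times_ms, recv_times_ms):
        transit = received - sent  # one-way transit time for this packet
        if prev_transit is not None:
            d = abs(transit - prev_transit)  # variation between packets
            jitter += (d - jitter) / 16.0    # smoothed running estimate
        prev_transit = transit
    return jitter

# Hypothetical timestamps: packets sent every 20 ms, arriving
# with slightly varying network delay.
send_times = [0, 20, 40, 60, 80]
recv_times = [50, 72, 89, 113, 130]
print(f"estimated jitter: {interarrival_jitter_ms(send_times, recv_times):.2f} ms")
```

The key point is that jitter is derived from the *differences* between successive transit times, not from the absolute delay itself.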
How would the aforementioned jitter test, measured in dB, be translated into a time delay? Thanks.