Analysis of jitter due to call-level fluctuations
Article written by: Mandjes, Michel
Abstract: In communication networks used by constant bit rate applications, call-level dynamics (i.e. calls entering and leaving) lead to fluctuations in the load, and therefore also in the delay (‘jitter’). By intentionally delaying the packets at the destination, one can transform the ‘perturbed’ packet stream back into the original periodic stream; in other words, there is a trade-off between jitter and delay, in that jitter can be removed at the expense of delay. As a consequence, for streaming applications in which the packet delay should remain below some predefined threshold, it is desirable that the jitter remain small. This paper presents a set of procedures to compute the jitter due to call-level variations. We consider a network resource shared by a fluctuating set of constant bit rate applications (modelled as periodic sources). As a first step, we study the call-level dynamics: supposing that a tagged call sees n_0 calls when entering the system, we compute the probability that at the end of its duration (consisting of, say, i packets) n_i calls are present, of which n_{0,i} stem from the original n_0 calls. As a second step, we show how to compute the jitter for given n_0, n_i, and n_{0,i}; in this analysis, generalised ballot problems have to be solved, for which we find an iterative exact solution as well as explicit approximations and bounds. Then, as a final step, the (packet-level) results of the second step are weighted by the (call-level) probabilities of the first step, resulting in the probability distribution of the jitter experienced within the call duration (see the sketch following this record). An explicit Gaussian approximation is proposed. Extensive numerical experiments validate the accuracy of the approximations and bounds.
Language: English
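As an illustration of the final weighting step described in the abstract, the jitter distribution follows by conditioning on the call-level state and averaging the packet-level results over the call-level probabilities. A minimal sketch of this law-of-total-probability step, assuming the notation J for the jitter experienced within the call duration and N_0, N_i, N_{0,i} for the call counts (these symbols are ours, introduced for illustration; the paper's own notation may differ):

\[
\mathbb{P}(J \le x) \;=\; \sum_{n_i}\sum_{n_{0,i}}
\underbrace{\mathbb{P}\bigl(N_i = n_i,\; N_{0,i} = n_{0,i} \,\bigm|\, N_0 = n_0\bigr)}_{\text{call-level probabilities (first step)}}
\;\cdot\;
\underbrace{\mathbb{P}\bigl(J \le x \,\bigm|\, n_0,\; n_i,\; n_{0,i}\bigr)}_{\text{packet-level result (second step)}}
\]

The explicit Gaussian approximation mentioned in the abstract presumably replaces this exact mixture by a single normal law; its precise parametrisation is given in the paper itself.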