End-to-end uplink delay jitter in LTE systems

2021 
For a single LTE interface on a mobile bonding router, we study the end-to-end delay jitter experienced by constant bit rate (CBR) traffic under uplink (synchronous, non-adaptive) hybrid automatic repeat request (HARQ)-controlled, transport block (TB)-based scheduling. The qualitative behavior of the delay jitter is studied experimentally, and we observe that the delay jitter is not a function of the aggregate CBR traffic generation rate alone; it depends separately on the CBR burst size (in bytes) and the inter-burst generation interval. We explain this behavior using an analytical model that explicitly accounts for LTE's HARQ and TB mechanisms. The qualitative behavior of the jitter is then used to design an end-to-end adaptation algorithm that achieves a desired level of delay jitter. We then experimentally study the impact of system parameters (RSSI, Cell ID, device location, RSRQ, RSRP, and, importantly, the average TB size) on delay jitter performance. Applying a standard machine-learning classification approach, we find that the average TB size acts as a sufficient statistic for determining the delay jitter. The adaptation algorithm is then modified to achieve better delay jitter performance under significant changes such as a serving cell handover.
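The abstract's central observation, that two CBR streams with the same aggregate rate but different (burst size, inter-burst period) pairs can see different jitter, can be reproduced with a toy simulation. The sketch below is not the paper's analytical model: the transport block size, the 10% block error rate, and the 8 ms synchronous HARQ round trip (the standard FDD uplink value) are illustrative assumptions, and HARQ is reduced to a grant-blocking delay.

```python
import random
import statistics

def simulate_jitter(burst_bytes, period_ms, tb_bytes, bler=0.1,
                    harq_rtt_ms=8, sim_ms=20_000, seed=0):
    """Toy TTI-level uplink model: a CBR burst of `burst_bytes` arrives every
    `period_ms`; in each 1 ms TTI one grant of `tb_bytes` drains the queue,
    and a failed transport block (probability `bler`) blocks the grant for
    the synchronous HARQ round trip before it is retried. Returns the
    standard deviation of the per-burst completion delay (the jitter)."""
    rng = random.Random(seed)
    queue, delays = [], []          # queue holds [arrival_ms, bytes_left]
    blocked_until = 0               # grant busy with a HARQ retransmission
    for t in range(sim_ms):
        if t % period_ms == 0:
            queue.append([t, burst_bytes])
        if t < blocked_until or not queue:
            continue
        if rng.random() < bler:     # TB decoding failure -> synchronous retx
            blocked_until = t + harq_rtt_ms
            continue
        room = tb_bytes             # fill one transport block from the queue
        while queue and room > 0:
            sent = min(room, queue[0][1])
            queue[0][1] -= sent
            room -= sent
            if queue[0][1] == 0:
                delays.append(t - queue.pop(0)[0])
    return statistics.pstdev(delays)

# Same aggregate rate (400 kbit/s), different burst granularity:
print(simulate_jitter(burst_bytes=500, period_ms=10, tb_bytes=600))
print(simulate_jitter(burst_bytes=5000, period_ms=100, tb_bytes=600))
```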
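The end-to-end adaptation algorithm is not specified in the abstract, but its stated goal (achieving a desired jitter level by exploiting the burst-size/period dependence) suggests a control loop of the following shape. Everything here is hypothetical: `measure_jitter` is an assumed probe of the current path, and shrinking bursts as the corrective direction is an illustrative choice, not the paper's derived rule.

```python
def adapt_burst_shape(measure_jitter, target_ms, burst_bytes, period_ms,
                      min_burst_bytes=200, shrink=1.25, max_rounds=20):
    """Hypothetical adaptation loop: hold the aggregate CBR rate fixed while
    trading burst size against inter-burst period until the measured
    end-to-end jitter meets `target_ms`."""
    rate = burst_bytes / period_ms              # bytes per ms, kept constant
    for _ in range(max_rounds):
        if measure_jitter(burst_bytes, period_ms) <= target_ms:
            break                               # jitter target met
        burst_bytes = max(min_burst_bytes, burst_bytes / shrink)
        period_ms = burst_bytes / rate          # preserve the aggregate rate
    return burst_bytes, period_ms
```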
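The claim that the average TB size is a sufficient statistic suggests comparing classifier accuracy on all radio features against the TB-size feature alone. The sketch below shows that comparison methodology on synthetic placeholder data (the labeling rule is invented, so it trivially favors TB size); it is not the paper's dataset, feature ranges, or model.

```python
import numpy as np
from sklearn.model_selection import cross_val_score
from sklearn.tree import DecisionTreeClassifier

rng = np.random.default_rng(0)
n = 500
# One row per measurement window; all values are synthetic placeholders.
rssi = rng.uniform(-100, -60, n)            # dBm
rsrp = rng.uniform(-120, -80, n)            # dBm
rsrq = rng.uniform(-20, -5, n)              # dB
cell = rng.integers(0, 5, n)                # serving cell index
avg_tb = rng.uniform(100, 3000, n)          # average TB size in bytes
X_all = np.column_stack([rssi, rsrp, rsrq, cell, avg_tb])
# Invented labeling rule for illustration: small average TBs => high jitter.
y = (avg_tb < 800).astype(int)

for name, X in [("all features", X_all),
                ("avg TB size only", avg_tb.reshape(-1, 1))]:
    acc = cross_val_score(DecisionTreeClassifier(max_depth=3), X, y, cv=5)
    print(f"{name}: mean accuracy {acc.mean():.2f}")
```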