Reduction of Padding Overhead for RLNC Media Distribution With Variable Size Packets

2019 
Random linear network coding (RLNC) can enhance the reliability of multimedia transmissions over lossy communication channels. However, RLNC has been designed for equal-size packets, while many efficient multimedia compression schemes, such as variable bitrate (VBR) video compression, produce unequal packet sizes. Padding the unequal-size packets with zeros to the maximum packet size creates an overhead on the order of 20%–50% or more for typical VBR videos. Previous padding overhead reduction approaches have focused on packing the unequal-size packets into fixed-size packets, e.g., through packet bundling or chaining and fragmentation. We introduce an alternative padding reduction approach based on coding macro-symbols (MSs), whereby an MS is a fixed-size part of a packet. In particular, we introduce a new class of RLNC, namely MS RLNC, which conducts RLNC across columns of MSs instead of the conventional RLNC across columns of complete packets of equal size. Judiciously arranging the source packets into columns of MSs, e.g., by shifting the source packets horizontally relative to each other, supports favorable MS RLNC coding properties. We specify the MS RLNC encoding and decoding mechanisms and analyze their complexity for a range of specific MS arrangement strategies within the class of MS RLNC. We conduct a comprehensive padding overhead evaluation with long VBR video frame size traces, encompassing both the previous approaches of packing the unequal-size packets into fixed-size packets and the novel MS RLNC approaches. We find that for small RLNC generation sizes, which support low network transport delays, MS RLNC achieves the lowest padding overheads, while for large generation sizes, both the previous packing approaches and the novel MS RLNC approaches effectively reduce the padding overhead.
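To make the padding problem concrete, the following Python sketch (a minimal illustration, not the scheme specified in the paper) compares the zero-padding overhead of conventional RLNC, which pads every packet in a generation to the maximum packet length, against a simplified macro-symbol arrangement in which each packet is assumed to be padded only up to the next MS boundary. The packet lengths and the 64-byte MS size are hypothetical.

import math

def conventional_padding_overhead(packet_lengths):
    """Conventional RLNC: pad every packet to the longest packet in the generation."""
    l_max = max(packet_lengths)
    payload = sum(packet_lengths)
    padding = sum(l_max - l for l in packet_lengths)
    return padding / payload

def ms_padding_overhead(packet_lengths, ms_size):
    """Simplified MS arrangement (assumption): pad each packet only to the next
    macro-symbol boundary, so the padding per packet is at most ms_size - 1 bytes."""
    payload = sum(packet_lengths)
    padding = sum(math.ceil(l / ms_size) * ms_size - l for l in packet_lengths)
    return padding / payload

# Hypothetical VBR-like packet lengths (bytes) for one RLNC generation.
generation = [1450, 320, 980, 210, 1500, 760, 450, 1100]

print(f"Conventional RLNC padding overhead: {conventional_padding_overhead(generation):.1%}")
print(f"Simplified MS arrangement (64-byte MS): {ms_padding_overhead(generation, 64):.1%}")

For this hypothetical generation, padding to the maximum packet length wastes far more bytes than padding each packet to a macro-symbol boundary, which is the intuition behind coding across columns of MSs rather than across complete equal-size packets.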