Achieving low latency networks through high performance computing

2014 
We define latency as the time taken to deliver a unit of data from one point in the system to another. Low-latency networks are networks whose systems, architecture, hardware, and protocols are designed to minimize this latency. Latency matters because many applications, such as voice transmission, networked gaming, video transmission, and interactive sessions, depend critically on it. The components of latency include:

- Hardware: every piece of hardware has its own advantages and limitations; for example, some devices use a fixed packet size while others support variable sizes.
- Routers and switches: each network component follows its own queuing and congestion-control strategies, and traditionally, packet processing depends on the rate of incoming packets.
- System latency: packets travel back and forth between the application and the network interface, and this transfer contributes to latency; in the worst case, interruptions by the system can introduce unbounded delay.
- OS latency: processing packets in the operating system takes time, as the OS de-multiplexes packets and dispatches them to their respective destinations.
- Application latency: the application needs a sufficient amount of CPU resources to perform its task.
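The definition of latency above can be made concrete with a small measurement sketch. This is a hypothetical illustration, not from the source: it times a round trip over a loopback UDP socket using Python's standard library, so the measured delay reflects mainly the system and OS latency components rather than router or link delays.

```python
import socket
import threading
import time

def echo_server(sock):
    # Echo a single datagram back to its sender.
    data, addr = sock.recvfrom(1024)
    sock.sendto(data, addr)

# Loopback sockets; a real measurement would span actual network hops.
server = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
server.bind(("127.0.0.1", 0))  # port 0: let the OS pick a free port
client = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)

t = threading.Thread(target=echo_server, args=(server,))
t.start()

payload = b"x" * 64
start = time.perf_counter()
client.sendto(payload, server.getsockname())
client.recvfrom(1024)                 # wait for the echo
rtt = time.perf_counter() - start     # round-trip time in seconds
t.join()
server.close()
client.close()

print(f"round-trip latency: {rtt * 1e6:.1f} microseconds")
```

One-way latency is roughly half the round-trip time here only because both directions traverse the same loopback path; on asymmetric real networks that approximation does not hold.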