Towards Verifiable Performance Measurement over In-the-Cloud Middleboxes

2019 
In-the-cloud middleboxes have drawn widespread attention recently, along with the rapid advancement of network function virtualization (NFV). Despite well-known benefits such as reduced hardware and maintenance costs, deploying middleboxes in a remote environment raises new performance and security concerns, owing to the opacity of the untrusted cloud and vulnerable software implementations. One essential requirement for enterprise customers is to monitor performance compliance while ensuring that packets are faithfully processed by remote middleboxes. In this paper, we propose a practical scheme towards verifiable performance measurement over in-the-cloud middleboxes. It employs “sample and replay” to achieve both performance measurement and packet processing attestation. It estimates performance by collecting receipts in a tunable way, while coping with the dynamic traffic changes made by middleboxes. In particular, our sampling is stateful: it captures a sequence of packets that share the same middlebox state, enabling correct local replay. More importantly, the scheme ensures high-confidence packet processing attestation by requiring middleboxes to bind execution assurances to packets using commitment messages, and by employing a delayed verification procedure to defeat potentially biased results against the selected samples. To demonstrate the feasibility and efficiency of our scheme, we implement a prototype consisting of various types of middleboxes on Click and conduct extensive experiments on Amazon EC2 with real traces. The experimental results show that our scheme imposes only marginal per-packet processing delay across various middleboxes and incurs negligible throughput degradation.
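The abstract describes binding execution assurances to packets via commitment messages and verifying sampled traffic by local replay. The sketch below is a minimal, hypothetical illustration of that idea only; the function names (make_receipt, verify_sample), the pre-shared key, and the fixed state digest are assumptions for exposition and do not reflect the paper's actual protocol.

```python
# Hypothetical sketch of commitment-based receipts and delayed verification
# by local replay; not the paper's protocol, an illustration of the idea only.
import hashlib
import hmac

SHARED_KEY = b"middlebox-attestation-key"  # assumed pre-shared key


def packet_digest(packet: bytes) -> bytes:
    return hashlib.sha256(packet).digest()


def make_receipt(in_pkt: bytes, out_pkt: bytes, state_digest: bytes) -> bytes:
    """Middlebox side: commit to the input packet, the processed output,
    and a digest of the relevant middlebox state with a keyed MAC."""
    msg = packet_digest(in_pkt) + packet_digest(out_pkt) + state_digest
    return hmac.new(SHARED_KEY, msg, hashlib.sha256).digest()


def verify_sample(in_pkts, receipts, state_digest, local_middlebox):
    """Verifier side (delayed verification): replay the sampled packet
    sequence through a trusted local copy of the middlebox and check
    that each receipt matches the locally recomputed commitment."""
    for in_pkt, receipt in zip(in_pkts, receipts):
        expected_out = local_middlebox(in_pkt)  # local replay
        expected = make_receipt(in_pkt, expected_out, state_digest)
        if not hmac.compare_digest(expected, receipt):
            return False  # processing was not faithful
    return True
```

For illustration, `local_middlebox` stands in for a trusted local replica that processes a sampled packet sequence starting from the same captured state; the stateful sampling described in the abstract is what makes such a replay meaningful.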