COINN: Crypto/ML Codesign for Oblivious Inference via Neural Networks

2021 
We introduce COINN, an efficient, accurate, and scalable framework for oblivious deep neural network (DNN) inference in the two-party setting. In our system, DNN inference is performed without revealing the client's private inputs to the server or revealing the server's proprietary DNN weights to the client. To speed up secure inference while maintaining high accuracy, we make three interlinked innovations in the plaintext and ciphertext domains: (i) we develop a new domain-specific low-bit quantization scheme tailored for high-efficiency ciphertext computation, (ii) we construct novel techniques for increasing data re-use in secure matrix multiplication, allowing us to gain significant performance boosts through factored operations, and (iii) we propose customized cryptographic protocols that complement our optimized DNNs in the ciphertext domain. By co-optimizing these components, COINN brings an unprecedented level of efficiency to the setting of oblivious DNN inference, achieving an end-to-end runtime speedup of 4.7×–14.4× over the state of the art. We demonstrate the scalability of our proposed methods by optimizing complex DNNs with over 100 layers and performing oblivious inference in the billion-operation regime for the challenging ImageNet dataset. Our framework is available at https://github.com/ACESLabUCSD/COINN.git.
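To illustrate why low-bit quantization suits ciphertext computation, the sketch below (not COINN's actual protocol; all function names and parameters are hypothetical) shows how symmetric uniform quantization turns a floating-point dot product into integer multiply-adds with a single rescale at the end. This is the kind of arithmetic that is cheap to evaluate under secret sharing, since the inner loop involves only small-integer operations.

```python
# Illustrative sketch only -- not COINN's actual quantization scheme or
# secure protocol. It shows the general idea that low-bit quantization
# reduces a dot product to integer multiply-adds plus one final rescale.

def quantize(values, bits=4):
    """Symmetric uniform quantization to signed `bits`-bit integers.

    Returns the integer codes and the scale needed to dequantize.
    """
    qmax = 2 ** (bits - 1) - 1                    # e.g. 7 for 4-bit signed
    scale = max(abs(v) for v in values) / qmax or 1.0
    return [round(v / scale) for v in values], scale

def quantized_dot(q_w, scale_w, q_x, scale_x):
    """Integer-only inner product, then a single floating-point rescale."""
    acc = sum(a * b for a, b in zip(q_w, q_x))    # small-integer arithmetic
    return acc * scale_w * scale_x                # one rescale at the end

# Toy weights and activations (hypothetical values for illustration).
w = [0.12, -0.53, 0.88, 0.07]
x = [1.0, 2.0, -1.0, 0.5]

q_w, s_w = quantize(w)
q_x, s_x = quantize(x)
approx = quantized_dot(q_w, s_w, q_x, s_x)
exact = sum(a * b for a, b in zip(w, x))
print(abs(approx - exact) < 0.2)                  # low-bit result stays close
```

In a secure-computation setting, the integer accumulation would be carried out on secret-shared values, which is far cheaper at low bit-widths than emulating floating-point arithmetic in the ciphertext domain.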