Low Area-Overhead Low-Entropy Masking Scheme (LEMS) Against Correlation Power Analysis Attack

2019 
The low-entropy masking scheme (LEMS) is a cost-security tradeoff solution that ensures a certain level of security with much lower overhead than a full-entropy masking scheme (FEMS). However, most existing LEMSs are based on a look-up table (LUT) and limited to first order, which leaves them vulnerable to classical higher-order correlation power analysis (CPA) attacks and other special types of attack (e.g., collision attacks). This paper proposes a new type of LEMS for block ciphers whose S-box consists of power functions and an affine function. First, a low-masking-complexity algorithm for evaluating S-boxes, named LUT-AC, is developed by fully utilizing the properties of a hybrid addition chain (AC). Next, an LEMS for block ciphers is proposed; it provides two masking modes to realize various cost-security tradeoffs. Owing to the "masked invariant property" of the LUT-AC, the masking complexity of the proposed LEMS is $O(d)$, whereas under an FEMS it is $O(d^2)$. Compared with existing LEMSs, the proposed LEMS has the following advantages: higher security in terms of masking entropy, resistance against collision attacks, and scalability to higher-order schemes. Based on the proposed algorithm, an architecture without any nonlinear multiplication for evaluating AES is developed by replacing the LUT with seven scalar multiplications. Different LEMSs built on this architecture are developed, and their area overheads are evaluated by implementing the schemes in a 65 nm CMOS process. The security of the first-order LEMS with rotation mode is verified by performing CPA on the SAKURA-G FPGA board. The experimental success rates show that the proposed first-order LEMS resists CPA without revealing the correct subkey for up to 100 000 power traces, whereas the unprotected scheme is broken with 1100 traces.
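To illustrate the kind of S-box evaluation the abstract refers to, the sketch below computes the AES power function $x \mapsto x^{254} = x^{-1}$ in GF($2^8$) via a standard addition chain. This is not the paper's LUT-AC architecture (which replaces the LUT with seven scalar multiplications); it only shows the general principle that squarings are GF(2)-linear, so in a masked implementation they commute with additive masks and only the few genuine multiplications are costly to protect. All function names are illustrative.

```python
# Illustrative sketch: AES field inversion via an addition chain.
# Not the paper's LUT-AC; function names are hypothetical.

AES_POLY = 0x11B  # AES field polynomial x^8 + x^4 + x^3 + x + 1

def gf_mul(a, b):
    """Carry-less multiplication modulo the AES field polynomial."""
    r = 0
    while b:
        if b & 1:
            r ^= a
        a <<= 1
        if a & 0x100:
            a ^= AES_POLY
        b >>= 1
    return r

def gf_sq(a):
    """Squaring in GF(2^8); linear over GF(2), so it commutes with
    an additive (Boolean) mask: (x ^ m)^2 = x^2 ^ m^2."""
    return gf_mul(a, a)

def ac_inverse(x):
    """x^254 = x^{-1} via an addition chain (2, 3, 12, 15, 240, 252, 254).
    Only four nonlinear multiplications are needed; the rest are squarings."""
    x2   = gf_sq(x)             # x^2
    x3   = gf_mul(x2, x)        # x^3   (nonlinear mult)
    x12  = gf_sq(gf_sq(x3))     # x^12
    x15  = gf_mul(x12, x3)      # x^15  (nonlinear mult)
    x240 = x15
    for _ in range(4):          # x^240 = (x^15)^16
        x240 = gf_sq(x240)
    x252 = gf_mul(x240, x12)    # x^252 (nonlinear mult)
    return gf_mul(x252, x2)     # x^254 (nonlinear mult)
```

Since $x^{255} = 1$ for every nonzero $x$, `gf_mul(x, ac_inverse(x))` returns 1 for all nonzero inputs, and `ac_inverse(0)` returns 0, matching the AES convention.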