λ-Scaled-attention: A novel fast attention mechanism for efficient modeling of protein sequences

2022 
This paper proposes the λ-scaled attention technique for fast and efficient modeling of protein sequences, aimed at addressing these two problems. The technique was then used to develop a protein sequence model, whose results showed significant improvements over its counterpart based on the standard attention technique (+1.68% for BP and +5.27% for MF) and over the state-of-the-art multi-segment-based ProtVecGen-Plus approach (+4.70% for BP and +5.30% for MF), where BP and MF denote the biological process and molecular function categories of protein function. Further, fast convergence (converging in half the number of epochs) and efficient learning (captured by a low difference between the training and validation losses) were observed during training.
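
The excerpt does not give the exact formulation of the λ-scaled attention mechanism, but the general shape of such a technique can be sketched. Below is a minimal sketch in Python/NumPy, assuming λ acts as a scaling factor on the attention logits in place of the standard 1/sqrt(d_k) factor; the function name lambda_scaled_attention and the scalar lam are hypothetical stand-ins for illustration, not the paper's actual method or API.

    # Minimal sketch of a lambda-scaled attention step. ASSUMPTION: `lam`
    # replaces the usual 1/sqrt(d_k) logit scaling; the paper's exact
    # definition of lambda is not given in this excerpt.
    import numpy as np

    def softmax(x, axis=-1):
        x = x - x.max(axis=axis, keepdims=True)  # subtract max for numerical stability
        e = np.exp(x)
        return e / e.sum(axis=axis, keepdims=True)

    def lambda_scaled_attention(Q, K, V, lam):
        """Attention with logits scaled by `lam` (hypothetical stand-in).

        Q, K, V: (seq_len, d_k) query/key/value matrices.
        lam: scalar scaling factor applied to the raw attention scores.
        """
        logits = lam * (Q @ K.T)             # scaled attention scores
        weights = softmax(logits, axis=-1)   # attention distribution per query
        return weights @ V                   # weighted sum of value vectors

    # Example usage: lam = 1/sqrt(d_k) recovers standard scaled
    # dot-product attention for comparison.
    rng = np.random.default_rng(0)
    Q = rng.standard_normal((8, 16))
    K = rng.standard_normal((8, 16))
    V = rng.standard_normal((8, 16))
    out = lambda_scaled_attention(Q, K, V, lam=1.0 / np.sqrt(16))
    print(out.shape)  # (8, 16)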