Personal authentication and recognition of aerial input Hiragana using deep neural network

2021 
We use Leap Motion and a deep neural network to perform personal authentication and character recognition for all hiragana characters written in the air. Leap Motion detects the index finger and stores its trajectory as time-series data. The input data were preprocessed by linear interpolation to unify the data length. For classification, the accuracy of a Long Short-Term Memory (LSTM) network was compared with that of a Support Vector Machine (SVM). In character recognition, the SVM and LSTM achieved F-measures of 97.25% and 98.18%, respectively. In personal authentication, the SVM achieved an accuracy of 92.45%, a False Acceptance Rate (FAR) of 0.73%, and a False Rejection Rate (FRR) of 41.59%, while the LSTM achieved an accuracy of 96.13%, a FAR of 1.73%, and an FRR of 14.55%. Overall, the LSTM outperformed the SVM.
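The length-unification step described above (linear interpolation of a variable-length fingertip trajectory to a fixed number of samples) could be sketched as follows. This is a minimal illustration, not the authors' code; the function name `resample_trajectory` and the target length of 64 samples are assumptions for the example.

```python
import numpy as np

def resample_trajectory(points, target_len=64):
    """Linearly interpolate a variable-length (N, 3) fingertip
    trajectory to a fixed number of samples.

    `target_len` is a hypothetical choice; the paper only states
    that data lengths were unified by linear interpolation.
    """
    points = np.asarray(points, dtype=float)
    # Normalized sample positions of the original and target sequences.
    src = np.linspace(0.0, 1.0, len(points))
    dst = np.linspace(0.0, 1.0, target_len)
    # Interpolate each coordinate axis (x, y, z) independently.
    return np.stack(
        [np.interp(dst, src, points[:, k]) for k in range(points.shape[1])],
        axis=1,
    )
```

After this step, every trajectory has the same shape (here, 64 × 3), so the sequences can be fed uniformly to either the SVM (after flattening) or the LSTM.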