Self-attention implicit function networks for 3D dental data completion

2021 
Abstract While complete dental models are crucial for digital dentistry, current technologies mostly focus on the 3D dental crown and overlook the dental gum, which is important for applications in orthodontics and prosthodontics. To reconstruct complete dental models with visually realistic geometry from given crown data, we propose to combine the implicit function representation with the self-attention mechanism. Recent studies have shown that the implicit function is an effective 3D representation for shape completion. However, existing methods struggle with dental models that have complex shapes and fine details, because the convolution and linear operations adopted in their networks are either inefficient at modeling long-range dependencies or unable to preserve the detailed geometry of the shapes. Therefore, we introduce self-attention into the implicit function network for the first time and use it to effectively capture non-local features at different levels. Extensive ablation studies were conducted to validate the effectiveness of our method. Quantitative and qualitative comparisons demonstrate that the features extracted by our network are more expressive and thus lead to better dental model completion and reconstruction results.
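The abstract does not include an implementation, but the two building blocks it combines, self-attention over point features and an implicit (occupancy) decoder queried at 3D coordinates, can be sketched in a few lines. The following is a minimal NumPy sketch under stated assumptions: all weights are random, the function names and dimensions are hypothetical, and the real network would be a trained deep model, not this toy.

```python
import numpy as np

def self_attention(feats, d_k=16, seed=0):
    """Single-head scaled dot-product self-attention over per-point features.

    feats: (N, C) array of features for N crown points. Returns (N, C) features
    augmented with non-local context via a residual connection. Weights are
    random here for illustration only.
    """
    rng = np.random.default_rng(seed)
    N, C = feats.shape
    Wq = rng.standard_normal((C, d_k)) / np.sqrt(C)
    Wk = rng.standard_normal((C, d_k)) / np.sqrt(C)
    Wv = rng.standard_normal((C, C)) / np.sqrt(C)
    Q, K, V = feats @ Wq, feats @ Wk, feats @ Wv
    scores = Q @ K.T / np.sqrt(d_k)              # (N, N) pairwise affinities
    attn = np.exp(scores - scores.max(axis=1, keepdims=True))
    attn /= attn.sum(axis=1, keepdims=True)      # softmax over all points
    return feats + attn @ V                      # residual non-local features

def implicit_decoder(query_xyz, shape_code, seed=1):
    """Tiny implicit function: map (xyz, shape code) -> occupancy in (0, 1).

    query_xyz: (M, 3) query coordinates; shape_code: (C,) pooled global code.
    A trained model would use a deeper conditioned MLP; this is a sketch.
    """
    rng = np.random.default_rng(seed)
    code = np.broadcast_to(shape_code, (query_xyz.shape[0], shape_code.shape[0]))
    x = np.concatenate([query_xyz, code], axis=1)
    W1 = rng.standard_normal((x.shape[1], 64)) / np.sqrt(x.shape[1])
    W2 = rng.standard_normal((64, 1)) / 8.0
    h = np.maximum(x @ W1, 0)                    # ReLU hidden layer
    return 1.0 / (1.0 + np.exp(-(h @ W2)))       # sigmoid occupancy

# Toy run: 128 crown points with 32-dim features, 10 occupancy queries.
pts = np.random.default_rng(2).standard_normal((128, 32))
attended = self_attention(pts)
code = attended.mean(axis=0)                     # pooled global shape code
occ = implicit_decoder(np.random.default_rng(3).standard_normal((10, 3)), code)
```

The key design point mirrored here is that attention makes every point's feature depend on all other points (long-range dependencies), while the implicit decoder can be queried at arbitrary resolution, which is what allows detailed surface reconstruction.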