Learning a deep fully connected neural network from a single image for edit propagation

2020 
We introduce a simple but effective deep fully connected neural network (FNN) that solves edit propagation as a multiclass pixel-level classification task. We construct the feature space from three-dimensional normalized RGB vectors (or one-dimensional grayscale values) together with spatial coordinates. Our deep FNN-based model consists of four stages: color feature extraction, spatial feature extraction, feature combination, and classifier estimation. We train the model using only the features from the regions labeled by the user's strokes in a single image. The method then directly outputs the edit propagation result after a single forward pass, without any refinement step. It automatically determines the importance of each image feature across the whole image by jointly considering the feature vectors from the user strokes. Extensive experiments demonstrate that the proposed algorithm achieves superior performance over state-of-the-art methods while remaining simple and efficient.
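The following is a minimal PyTorch sketch of the four-stage pipeline the abstract describes: two FNN branches for color and spatial features, concatenation as feature combination, and a classifier head, trained only on stroke-labeled pixels and then applied to every pixel in one forward pass. Layer widths, depths, the learning rate, and the two-class setting are illustrative assumptions, not the paper's reported configuration.

```python
import torch
import torch.nn as nn

class EditPropagationFNN(nn.Module):
    """Fully connected network with separate color and spatial branches,
    a feature-combination stage, and a per-pixel classifier.
    All sizes below are assumptions for illustration."""

    def __init__(self, color_dim=3, spatial_dim=2, hidden=64, num_classes=2):
        super().__init__()
        # Color feature extraction: normalized RGB (or 1-D grayscale) input.
        self.color_branch = nn.Sequential(
            nn.Linear(color_dim, hidden), nn.ReLU(),
            nn.Linear(hidden, hidden), nn.ReLU(),
        )
        # Spatial feature extraction: normalized (x, y) pixel coordinates.
        self.spatial_branch = nn.Sequential(
            nn.Linear(spatial_dim, hidden), nn.ReLU(),
            nn.Linear(hidden, hidden), nn.ReLU(),
        )
        # Feature combination and classifier estimation.
        self.head = nn.Sequential(
            nn.Linear(2 * hidden, hidden), nn.ReLU(),
            nn.Linear(hidden, num_classes),
        )

    def forward(self, color, coords):
        combined = torch.cat(
            [self.color_branch(color), self.spatial_branch(coords)], dim=-1
        )
        return self.head(combined)  # per-pixel class logits


# Train only on the pixels covered by the user's strokes, then propagate
# the edit to the whole image with one forward pass (no refinement step).
model = EditPropagationFNN()
opt = torch.optim.Adam(model.parameters(), lr=1e-3)
loss_fn = nn.CrossEntropyLoss()

# Dummy stroke data: N labeled pixels (colors and coords normalized to [0, 1]).
stroke_color = torch.rand(256, 3)
stroke_xy = torch.rand(256, 2)
stroke_label = torch.randint(0, 2, (256,))

for _ in range(200):
    opt.zero_grad()
    loss = loss_fn(model(stroke_color, stroke_xy), stroke_label)
    loss.backward()
    opt.step()

# Propagation: classify all H*W pixels of the image at once.
H, W = 64, 64
ys, xs = torch.meshgrid(
    torch.linspace(0, 1, H), torch.linspace(0, 1, W), indexing="ij"
)
all_xy = torch.stack([xs, ys], dim=-1).reshape(-1, 2)
all_color = torch.rand(H * W, 3)  # placeholder for the real image's pixel colors
mask = model(all_color, all_xy).argmax(dim=-1).reshape(H, W)
```

Because the network sees only stroke pixels during training, the concatenated color and spatial branches are what let it weigh each feature's importance when generalizing the labels to unlabeled pixels.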