A vegetable disease recognition model for complex background based on region proposal and progressive learning

2021 
Abstract Deep-learning-based vegetable disease recognition has achieved reasonable success under constrained conditions, but a model trained on images with simple backgrounds can suffer a sharp drop in accuracy in actual production environments. Disease images captured in production settings typically have complex backgrounds containing elements that resemble disease features or symptoms, which makes recognition considerably harder. To address this problem, we propose PRP-Net, a vegetable disease recognition model for complex backgrounds based on region proposal and progressive learning. The model locates regions of interest in images of diseased leaves with a region proposal network; the located regions are then cropped and enlarged and fed to a finer-scale network for progressive learning. The region proposal component guides the model, in a weakly supervised manner, to focus on the regions of interest in complex-background disease images, avoiding the expense of manually annotating key regions. The progressive learning network allows the model to learn global features and fine local features in a progressive manner. On a self-collected image dataset of six vegetable diseases with complex backgrounds, the model achieves an average recognition accuracy of 98.26%, which is 4.46 percentage points higher than the original region proposal framework (RA-CNN), and it also outperforms each feature extraction network used alone. This study provides useful ideas and methodological concepts for recognizing vegetable diseases under complex backgrounds.
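
For illustration, below is a minimal PyTorch sketch of the coarse-to-fine idea described in the abstract: a coarse-scale backbone classifies the full image and predicts a region of interest in a weakly supervised way, the region is cropped and enlarged, and a finer-scale backbone classifies the enlarged region. The class names (ProposalHead, PRPSketch), the ResNet-18 backbones, and the spatial-transformer-style differentiable crop are assumptions made for this sketch; the paper's actual architecture and crop mechanism may differ.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F
from torchvision import models


class ProposalHead(nn.Module):
    """Weakly supervised region proposal: predicts one square box (cx, cy, half-size)
    in normalized image coordinates from the coarse-scale feature vector."""
    def __init__(self, in_features):
        super().__init__()
        self.fc = nn.Linear(in_features, 3)

    def forward(self, feats):
        box = torch.sigmoid(self.fc(feats))          # values in (0, 1)
        cx, cy = box[:, 0], box[:, 1]
        half = 0.1 + 0.4 * box[:, 2]                 # half-size kept in [0.1, 0.5]
        return cx, cy, half


def crop_and_zoom(images, cx, cy, half, out_size):
    """Differentiable crop-and-enlarge via an affine sampling grid, so gradients
    from the finer-scale classifier can reach the proposal head."""
    b, c = images.size(0), images.size(1)
    zeros = torch.zeros_like(cx)
    # Map the normalized box to grid_sample's [-1, 1] coordinate convention.
    row_x = torch.stack([2 * half, zeros, 2 * cx - 1], dim=1)
    row_y = torch.stack([zeros, 2 * half, 2 * cy - 1], dim=1)
    theta = torch.stack([row_x, row_y], dim=1)       # (b, 2, 3)
    grid = F.affine_grid(theta, size=(b, c, *out_size), align_corners=False)
    return F.grid_sample(images, grid, mode="bilinear", align_corners=False)


class PRPSketch(nn.Module):
    """Coarse scale: classify the whole image and propose a region of interest.
    Finer scale: enlarge that region and classify it again (progressive learning)."""
    def __init__(self, num_classes=6):
        super().__init__()
        coarse = models.resnet18(weights=None)
        feat_dim = coarse.fc.in_features
        self.coarse_backbone = nn.Sequential(*list(coarse.children())[:-1])
        self.coarse_fc = nn.Linear(feat_dim, num_classes)
        self.propose = ProposalHead(feat_dim)
        self.fine = models.resnet18(weights=None)
        self.fine.fc = nn.Linear(self.fine.fc.in_features, num_classes)

    def forward(self, x):
        feats = torch.flatten(self.coarse_backbone(x), 1)
        logits_coarse = self.coarse_fc(feats)
        cx, cy, half = self.propose(feats)
        x_fine = crop_and_zoom(x, cx, cy, half, out_size=x.shape[-2:])
        logits_fine = self.fine(x_fine)
        return logits_coarse, logits_fine


if __name__ == "__main__":
    model = PRPSketch(num_classes=6)
    imgs = torch.randn(2, 3, 224, 224)
    coarse, fine = model(imgs)
    print(coarse.shape, fine.shape)   # torch.Size([2, 6]) torch.Size([2, 6])
```

In a sketch of this kind, the two outputs would typically be supervised jointly (for example, a cross-entropy loss on each scale), so the proposal head is trained only from the classification signal, which is what "weakly supervised" refers to here.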