Learning compatibility knowledge for outfit recommendation with complementary clothing matching

2022 
Abstract With the rapid development of mobile networks and e-commerce, clothing recommendation has achieved considerable success in recent years. Fashion outfit matching has become an essential component of the shopping experience, helping to select and present items to individuals as personalized fashion recommendations. However, guiding complementary clothing matching is an arduous task due to the complexity and subjectivity of fashion items. Several existing solutions attempt to discover a set of visual cues to establish matching relations, but they are prone to mismatches because it is hard to capture all of the potential semantic information from the appearance of clothes alone. To make full use of both the visual characteristics of clothing products and their related description information, we propose a complementary clothing matching method with compatibility knowledge, abbreviated as CCMCK. For visual compatibility, we adopt a graph neural network to model the visual relationships between items. To generate outfits that satisfy the requirement of fashion compatibility, we propose a matching scheme under a compatibility constraint and recommend compatible items based on multi-modal compatibility. Finally, we conduct a qualitative investigation on the fill-in-the-blank and fashion outfit compatibility tasks to evaluate the proposed method.
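The abstract does not give implementation details, but the general idea it describes (fusing visual and description embeddings, propagating them over an item graph with a graph neural network, and scoring pairwise compatibility to fill a partial outfit) can be illustrated with a minimal sketch. Everything below is an assumption for illustration, not the authors' CCMCK model: the class name `CompatibilitySketch`, the feature dimensions, the single message-passing step, and the toy adjacency matrix are all hypothetical.

```python
# Minimal sketch (assumed, not the CCMCK implementation) of multi-modal
# item compatibility scoring with one graph-propagation step.
import torch
import torch.nn as nn


class CompatibilitySketch(nn.Module):
    def __init__(self, visual_dim=512, text_dim=300, hidden_dim=128):
        super().__init__()
        # Project each modality into a shared hidden space (assumed fusion scheme).
        self.visual_proj = nn.Linear(visual_dim, hidden_dim)
        self.text_proj = nn.Linear(text_dim, hidden_dim)
        # One graph-convolution-style propagation step over the item graph.
        self.propagate = nn.Linear(hidden_dim, hidden_dim)
        # Pairwise compatibility scorer over concatenated node states.
        self.scorer = nn.Sequential(
            nn.Linear(2 * hidden_dim, hidden_dim),
            nn.ReLU(),
            nn.Linear(hidden_dim, 1),
        )

    def forward(self, visual_feats, text_feats, adjacency):
        # visual_feats: (num_items, visual_dim); text_feats: (num_items, text_dim)
        # adjacency: (num_items, num_items) row-normalized item graph.
        h = torch.relu(self.visual_proj(visual_feats) + self.text_proj(text_feats))
        # Aggregate neighbor information (one message-passing step).
        h = torch.relu(self.propagate(adjacency @ h))
        return h

    def pair_score(self, h, i, j):
        # Compatibility score for items i and j; higher means more compatible.
        return self.scorer(torch.cat([h[i], h[j]], dim=-1))


# Usage example: score candidates against a partial outfit (fill-in-the-blank style).
if __name__ == "__main__":
    num_items = 6
    model = CompatibilitySketch()
    visual = torch.randn(num_items, 512)   # placeholder for CNN image features
    text = torch.randn(num_items, 300)     # placeholder for description embeddings
    adj = torch.ones(num_items, num_items) / num_items  # toy fully connected graph
    h = model(visual, text, adj)
    partial_outfit, candidates = [0, 1], [2, 3, 4, 5]
    scores = {c: sum(model.pair_score(h, c, o).item() for o in partial_outfit)
              for c in candidates}
    best = max(scores, key=scores.get)
    print("candidate scores:", scores, "-> picked item", best)
```

In this sketch, the fill-in-the-blank task reduces to ranking candidate items by their summed pairwise compatibility with the items already in the outfit; outfit-level compatibility could likewise be approximated by averaging pairwise scores within the outfit.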