ServiceBERT: A Pre-trained Model for Web Service Tagging and Recommendation

2021 
Pre-trained models have shown significant value on a number of natural language processing (NLP) tasks. However, there is still a lack of corresponding work in the field of service computing that effectively utilizes the rich knowledge accumulated in the Web service ecosystem. In this paper, we propose ServiceBERT, which learns domain knowledge of the Web service ecosystem to support service intelligence tasks such as Web API tagging and Mashup-oriented API recommendation. ServiceBERT is built on a Transformer-based neural architecture. In addition to the masked language modeling (MLM) objective, we introduce the replaced token detection (RTD) objective to make pre-training more efficient. Finally, we apply contrastive learning during pre-training to learn noise-invariant representations at the sentence level. Comprehensive experiments on two service-related tasks demonstrate that ServiceBERT outperforms a variety of representative methods.
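The sentence-level contrastive objective mentioned above can be illustrated with a minimal InfoNCE-style loss using in-batch negatives. This is a generic sketch of contrastive learning, not the paper's exact formulation; the function name, the temperature value, and the use of numpy are illustrative assumptions.

```python
import numpy as np

def info_nce_loss(anchors, positives, temperature=0.05):
    """Contrastive (InfoNCE-style) loss with in-batch negatives.

    anchors, positives: (batch, dim) arrays; row i of `positives` is a
    noise-augmented view of row i of `anchors`. All other rows in the
    batch act as negatives, pushing unrelated representations apart.
    Illustrative sketch only -- not ServiceBERT's exact objective.
    """
    # L2-normalize so dot products become cosine similarities
    a = anchors / np.linalg.norm(anchors, axis=1, keepdims=True)
    p = positives / np.linalg.norm(positives, axis=1, keepdims=True)
    sim = (a @ p.T) / temperature          # (batch, batch) similarity matrix
    # Softmax cross-entropy with the diagonal (true pair) as the target
    sim -= sim.max(axis=1, keepdims=True)  # numerical stability
    log_prob = sim - np.log(np.exp(sim).sum(axis=1, keepdims=True))
    return -np.mean(np.diag(log_prob))
```

Training with this objective pulls each sentence embedding toward its augmented view and away from the other sentences in the batch, which is what yields the noise-invariant sentence representations described in the abstract.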