Relational Pretrained Transformers towards Democratizing Data Preparation [Vision]

2020 
Can AI help automate human-easy but computer-hard data preparation tasks (for example, data cleaning, data integration, and information extraction), which currently heavily involve data scientists, practitioners, and crowd workers? We envision that human-easy data preparation for relational data can be automated. To this end, we first identify the desiderata for computers to achieve near-human intelligence for data preparation: computers need a deep-learning architecture (or model) that can read and understand millions of tables; computers require unsupervised learning to perform self-learning without labeled data and to gain knowledge from existing tasks and previous experience; and computers need few-shot learning that can adapt to new tasks with only a few examples. Our proposal is called Relational Pretrained Transformers (RPTs), a general framework for various data preparation tasks, which typically consists of the following models/methods: (1) the transformer, a general and powerful deep-learning model that can read tables, texts, and images; (2) masked language modeling for self-learning, and collaborative training for transferring knowledge and experience; and (3) pattern-exploiting training that better interprets a task from a few examples. We further present concrete RPT architectures for three classical data preparation tasks, namely data cleaning, entity resolution, and information extraction. We demonstrate RPTs with some initial yet promising results. Last but not least, we identify activities that will unleash a series of research opportunities to push forward the field of data preparation.
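To make the masked-language-model idea concrete, below is a minimal, hypothetical sketch of tuple-level masking: a relational tuple is serialized into "attribute: value" text, one attribute value is masked, and a pretrained sequence-to-sequence transformer is asked to recover it. The serialization format, the choice of Hugging Face's `transformers` library, and the use of T5 with its sentinel token are illustrative assumptions, not the authors' actual RPT implementation.

```python
# Illustrative sketch only: serialize a tuple, mask one attribute value,
# and let a pretrained seq2seq transformer fill in the blank.
# The "attribute: value" serialization and the choice of T5 are
# assumptions made for this example, not the RPT paper's design.
from transformers import T5Tokenizer, T5ForConditionalGeneration

tokenizer = T5Tokenizer.from_pretrained("t5-small")
model = T5ForConditionalGeneration.from_pretrained("t5-small")

# A relational tuple flattened into attribute-value pairs, with the
# headquarters value replaced by T5's sentinel token <extra_id_0>.
tuple_text = (
    "name: Apple Inc. ; founded: 1976 ; "
    "headquarters: <extra_id_0> ; industry: consumer electronics"
)

inputs = tokenizer(tuple_text, return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=8)

# Decodes the model's guess for the masked attribute value.
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```

Under this framing, the same masked-value objective would serve data cleaning directly: a corrupted or missing cell is treated as the masked span, and the model predicts its value from the rest of the tuple.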