Domain-Specific Language Model Pretraining for Biomedical Natural Language Processing
2022
Pretraining large neural language models, such as BERT, has led to impressive gains on many natural language processing (NLP) tasks. However, most pretraining efforts focus on general-domain corpora...
References: 27
Citations: 8