Multi-Granularity Structural Knowledge Distillation for Language Model Compression.
2022
Chang Liu
Chongyang Tao
Jiazhan Feng
Dongyan Zhao