Pre-Trained Multilingual Sequence-to-Sequence Models: A Hope for Low-Resource Language Translation?
2022
En-Shiun Annie Lee
Sarubi Thillainathan
Shravan Nayak
Surangika Ranathunga
David Ifeoluwa Adelani
Ruisi Su
Arya D. McCarthy