Mini Pinyin: A modified miniature language for studying language learning and incremental sentence processing.

2020 
Artificial grammar learning (AGL) paradigms are used extensively to characterise the (neuro)cognitive bases of language learning. However, despite their effectiveness in characterising the capacity to learn complex structured sequences, AGL paradigms lack ecological validity and typically do not account for cross-linguistic differences in sentence comprehension. Here, we describe a new modified miniature language paradigm – Mini Pinyin – that mimics natural language: it is based on an existing language (Mandarin Chinese) and includes both structure and meaning. Mini Pinyin contains a number of cross-linguistic elements, including varying word orders and classifier-noun rules. To evaluate the effectiveness of Mini Pinyin, 76 monolingual native English speakers (mean age = 24.9; 26 female) completed a learning phase followed by a sentence acceptability judgement task. Generalised mixed effects modelling revealed that participants attained a moderate degree of accuracy on the judgement task, with performance ranging from 25% to 100% accuracy depending on the word order of the sentence. Further, sentences compatible with the canonical English word order were learned more efficiently than non-canonical word orders. We controlled for inter-individual differences in statistical learning ability, which accounted for ~20% of the variance in performance on the sentence judgement task. We provide stimuli and statistical analysis scripts as open-source resources and discuss how future research can utilise this paradigm to study the neurobiological basis of language learning. Mini Pinyin affords a convenient tool for advancing language learning research by building on the parameters of traditional AGL and existing miniature language paradigms.