Detailed Information

Cited 0 times in Web of Science; cited 0 times in Scopus

An L2 Neural Language Model of Adaptation to Dative Alternation in English

Authors
최선주 (Choi, Sun Joo); 박명관 (Park, Myung Kwan)
Issue Date
Feb-2022
Publisher
현대영미어문학회 (The Modern British and American Language and Literature Association of Korea)
Keywords
neural language model; syntactic priming; adaptation; dative alternation; learning rate; 신경망 언어모델; 통사점화; 적응(학습); 여격구문; 학습률
Citation
현대영미어문학 (The Journal of Modern British and American Language and Literature), v.40, no.1, pp. 143-159
Pages
17
Indexed
KCI
Journal Title
현대영미어문학 (The Journal of Modern British and American Language and Literature)
Volume
40
Number
1
Start Page
143
End Page
159
URI
https://scholarworks.dongguk.edu/handle/sw.dongguk/3623
DOI
10.21084/jmball.2022.02.40.1.143
ISSN
1229-3814
2713-5349
Abstract
Neural(-network) language models (NLMs) have recently been shown to adapt not only to lexical items but also to abstract syntactic structures. In this study, we provide further evidence for this thesis by showing that a cumulative syntactic priming paradigm applied to an L2 LSTM (Long Short-Term Memory) language model (LM) enhances its ability to track abstract properties of sentences, compared to a non-cumulative priming paradigm. Furthermore, we investigate the effect of the learning rate on adaptation. In so doing, we probe how much of the enhancement is due to the adaptation of the L2 NLM’s syntactic representations. We report the performance of the L2 LSTM LM in an adaptation experiment focusing on dative alternation in English and confirm that it adapts to both lexical items and syntactic structures, just as L1 NLMs do.
Files in This Item
There are no files associated with this item.
Appears in
Collections
College of Humanities > Division of English Language & Literature > 1. Journal Articles


Items in ScholarWorks are protected by copyright, with all rights reserved, unless otherwise indicated.

Related Researcher

Park, Myung Kwan
College of Humanities (Division of English Language and Literature)
