Detailed Information

Cited 0 times in Web of Science; cited 1 time in Scopus

An L2 Neural Language Model of Adaptation

Authors
Choi, Sunjoo; Park, Myung-Kwan
Issue Date
May-2022
Publisher
Korean Association for the Study of English Language and Linguistics (KASELL; 한국영어학회)
Keywords
adaptation; learning rate; neural language model; perplexity; surprisal; syntactic priming
Citation
Korean Journal of English Language and Linguistics (영어학), v.22, pp. 547-562
Pages
16
Indexed
SCOPUS
KCI
Journal Title
Korean Journal of English Language and Linguistics (영어학)
Volume
22
Start Page
547
End Page
562
URI
https://scholarworks.dongguk.edu/handle/sw.dongguk/3820
DOI
10.15738/kjell.22..202205.547
ISSN
1598-1398
2586-7474
Abstract
In recent years, the increasing capacity of neural language models (NLMs) has led to a surge of research into how they represent syntactic structure. A wide range of methods has been used to probe the linguistic knowledge that NLMs acquire. In the present study, using the syntactic priming paradigm, we explore the extent to which an L2 LSTM NLM is susceptible to syntactic priming, the phenomenon whereby the syntactic structure of a sentence makes the same structure more probable in a follow-up sentence. In line with previous work by van Schijndel and Linzen (2018), we provide further evidence on this question by showing that the L2 LM adapts to abstract syntactic properties of sentences as well as to lexical items. At the same time, we report that adding a simple adaptation method to the L2 LSTM NLM does not always improve its predictions of human reading times relative to its non-adaptive counterpart. © 2022 KASELL All rights reserved.
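The adaptation paradigm the abstract describes can be sketched in miniature: score a target sentence, update the model on a structurally similar prime, and check whether the target's perplexity (computed from per-word surprisal) drops. The sketch below is a minimal illustration only: it substitutes a toy count-based bigram model for the paper's LSTM, and the class name, example sentences, and count-increment `adapt` step (standing in for a gradient update scaled by a learning rate) are assumptions of this sketch, not the authors' implementation.

```python
import math
from collections import defaultdict

class AdaptiveBigramLM:
    """Toy add-alpha bigram LM; `adapt` plays the role of the
    gradient-based adaptation step an LSTM NLM would take."""

    def __init__(self, vocab, alpha=1.0):
        self.vocab = set(vocab) | {"<s>", "</s>"}
        self.alpha = alpha  # add-alpha smoothing constant
        self.counts = defaultdict(lambda: defaultdict(float))
        self.totals = defaultdict(float)

    def prob(self, prev, word):
        # Smoothed conditional probability P(word | prev)
        v = len(self.vocab)
        return (self.counts[prev][word] + self.alpha) / (self.totals[prev] + self.alpha * v)

    def surprisal(self, sentence):
        # Per-word surprisal in bits: -log2 P(w | prev)
        words = ["<s>"] + sentence.split() + ["</s>"]
        return [-math.log2(self.prob(p, w)) for p, w in zip(words, words[1:])]

    def perplexity(self, sentence):
        # Perplexity = 2 ** (mean per-word surprisal)
        s = self.surprisal(sentence)
        return 2 ** (sum(s) / len(s))

    def adapt(self, sentence, lr=1.0):
        # Update bigram counts on one sentence; lr stands in for
        # the learning rate of the gradient-based update.
        words = ["<s>"] + sentence.split() + ["</s>"]
        for p, w in zip(words, words[1:]):
            self.counts[p][w] += lr
            self.totals[p] += lr

if __name__ == "__main__":
    vocab = "the ball was kicked by boy song sung girl".split()
    lm = AdaptiveBigramLM(vocab)
    prime = "the ball was kicked by the boy"   # passive prime
    target = "the song was sung by the girl"   # passive target
    print("before adaptation:", lm.perplexity(target))
    lm.adapt(prime, lr=1.0)
    print("after adaptation: ", lm.perplexity(target))  # lower: priming effect
```

Under this setup, adapting on a passive prime lowers the perplexity of a passive target even though the two sentences share few content words, which is the abstract-syntactic adaptation effect the study measures; the learning rate controls how strongly each prime shifts the model.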
Files in This Item
There are no files associated with this item.
Appears in
Collections
College of Humanities > Division of English Language & Literature > 1. Journal Articles


Items in ScholarWorks are protected by copyright, with all rights reserved, unless otherwise indicated.

Related Researcher

Park, Myung Kwan
College of Humanities (Division of English Language and Literature)
