Detailed Information


L2 영어 교과서를 ‘학습’한 L2-신경망 언어 모델의 문법 일반화 양상 (Grammatical Generalizations in Neural Language Models Trained on L2 Textbooks)

Full metadata record
dc.contributor.author: 구건우
dc.contributor.author: 박명관 (Park, Myung Kwan)
dc.date.accessioned: 2023-04-27T12:40:54Z
dc.date.available: 2023-04-27T12:40:54Z
dc.date.issued: 2022-03
dc.identifier.issn: 1226-3206
dc.identifier.uri: https://scholarworks.dongguk.edu/handle/sw.dongguk/3495
dc.description.abstract: Recent studies employing state-of-the-art neural network language models (NLMs) have reported human-like performance in ‘understanding’ various linguistic phenomena, particularly on the Benchmark of Linguistic Minimal Pairs (BLiMP), a challenge dataset of sentence pairs used to evaluate the knowledge of NLMs on major grammatical phenomena in English (Warstadt et al., 2020). Adopting this methodology, this paper assesses the level of linguistic knowledge acquired by L2-NLMs trained on English textbooks published in Korea (alias the K-English datasets) and compares it with the corresponding levels of English native speakers and L1-NLMs. Assuming that an NLM is also a language learner, we used BLiMP to evaluate the grammaticality rating performance of L2-NLMs based on the Generative Pre-trained Transformer 2 (GPT-2) and Long Short-Term Memory (LSTM) architectures. In conclusion, this study demonstrates that the L2-NLMs attained a substantially lower level of grammatical generalization than both their L1 counterparts and English native speakers. The results imply that the K-English training datasets are not robust enough for L2-NLMs to make substantial grammatical generalizations.
dc.format.extent: 17
dc.language: Korean
dc.language.iso: KOR
dc.publisher: 현대문법학회 (The Society of Modern Grammar)
dc.title: L2 영어 교과서를 ‘학습’한 L2-신경망 언어 모델의 문법 일반화 양상
dc.title.alternative: Grammatical Generalizations in Neural Language Models Trained on L2 Textbooks
dc.type: Article
dc.publisher.location: 대한민국 (Republic of Korea)
dc.identifier.doi: 10.14342/smog.2022.113.121
dc.identifier.bibliographicCitation: 현대문법연구 (Studies in Modern Grammar), no. 113, pp. 121-137
dc.citation.title: 현대문법연구 (Studies in Modern Grammar)
dc.citation.number: 113
dc.citation.startPage: 121
dc.citation.endPage: 137
dc.identifier.kciid: ART002828883
dc.description.isOpenAccess: N
dc.description.journalRegisteredClass: kci
dc.subject.keywordAuthor: 언어학적 일반화
dc.subject.keywordAuthor: 신경망 언어 모델
dc.subject.keywordAuthor: LSTM
dc.subject.keywordAuthor: GPT-2
dc.subject.keywordAuthor: L2-신경망 언어 모델
dc.subject.keywordAuthor: linguistic generalization
dc.subject.keywordAuthor: neural language model
dc.subject.keywordAuthor: LSTM
dc.subject.keywordAuthor: GPT-2
dc.subject.keywordAuthor: L2-language models
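The evaluation described in the abstract is a forced-choice test: an NLM is scored correct on a BLiMP minimal pair when it assigns higher probability to the grammatical sentence than to its ungrammatical twin. A minimal sketch of that scoring scheme, using a toy add-alpha bigram model as a stand-in for the paper's actual GPT-2/LSTM models (the corpus, the example pair, and the smoothing choice here are illustrative assumptions, not the study's data):

```python
import math
from collections import Counter

def train_bigram(corpus):
    """Count unigram contexts and bigrams over sentences padded with <s>/</s>."""
    unigrams, bigrams = Counter(), Counter()
    for sent in corpus:
        toks = ["<s>"] + sent.split() + ["</s>"]
        unigrams.update(toks[:-1])                 # context counts
        bigrams.update(zip(toks[:-1], toks[1:]))   # transition counts
    return unigrams, bigrams

def logprob(sent, unigrams, bigrams, alpha=1.0, vocab=None):
    """Add-alpha-smoothed bigram log-probability of a full sentence."""
    V = len(vocab)
    toks = ["<s>"] + sent.split() + ["</s>"]
    lp = 0.0
    for a, b in zip(toks[:-1], toks[1:]):
        lp += math.log((bigrams[(a, b)] + alpha) / (unigrams[a] + alpha * V))
    return lp

# Toy training corpus (illustrative; the study trains on the K-English datasets).
corpus = ["the dog barks", "the dogs bark", "a dog barks"]
uni, bi = train_bigram(corpus)
vocab = {w for s in corpus for w in s.split()} | {"<s>", "</s>"}

# BLiMP-style forced choice: correct iff the grammatical member of each
# minimal pair receives the higher log-probability.
pairs = [("the dog barks", "the dog bark")]  # (grammatical, ungrammatical)
correct = sum(
    logprob(good, uni, bi, vocab=vocab) > logprob(bad, uni, bi, vocab=vocab)
    for good, bad in pairs
)
print(f"accuracy: {correct / len(pairs):.2f}")  # prints "accuracy: 1.00"
```

In the study itself, the same accuracy metric is computed per BLiMP phenomenon with real NLMs, so L1- and L2-trained models can be compared against each other and against native-speaker judgments.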
Files in This Item
There are no files associated with this item.
Appears in Collections
College of Humanities > Division of English Language & Literature > 1. Journal Articles


Items in ScholarWorks are protected by copyright, with all rights reserved, unless otherwise indicated.

Related Researcher

Park, Myung Kwan
College of Humanities (Division of English Language and Literature)
