Detailed Information


부정극어 허가 처리 평가: 영어 교과서를 학습한 L2 신경망 언어 모델을 활용하여 (An assessment of processing negative polarity items by an L2 neural language model trained on English textbooks)

Full metadata record
DC Field / Value
dc.contributor.author: 구건우
dc.contributor.author: 이재민
dc.contributor.author: 박명관
dc.date.accessioned: 2023-04-27T10:40:46Z
dc.date.available: 2023-04-27T10:40:46Z
dc.date.issued: 2022-07
dc.identifier.issn: 1598-1886
dc.identifier.issn: 2713-6817
dc.identifier.uri: https://scholarworks.dongguk.edu/handle/sw.dongguk/2872
dc.description.abstract: The GRNN (Gulordava et al. 2018) neural language model (NLM) can be viewed as a language learner in that it is trained on sentences and, like humans, 'predicts' the next word given a sequence of words. Recent studies employing NLMs have reported human-like performance in 'understanding' various linguistic phenomena. Building on this work, this paper assesses the level of linguistic knowledge that an NLM can acquire from a collection of English textbooks published in Korea over the last two decades. We applied a psycholinguistic experimental method to compare an L2-GRNN to an L1-GRNN, focusing on the learning and processing of negative polarity item (NPI) licensing in English. The L1-GRNN, trained on a Wikipedia dataset, has been reported to attain the linguistic knowledge that native speakers of English have; the L2-GRNN was trained on learning materials for Korean learners of English. Analysis of the data extracted from the two models on NPI processing showed that the overall performance of the L2-GRNN fell far behind that of the L1-GRNN. In conclusion, this study demonstrates that the L2-GRNN attained a far lower level of grammatical knowledge of NPI licensing than its L1 counterpart. This result implies that the L2 dataset, representing English textbooks published in Korea, is not sufficient for the L2-GRNN to acquire the substantial grammatical knowledge that the L1-GRNN, as well as native speakers of English, has attained.
dc.format.extent: 23
dc.language: Korean
dc.language.iso: KOR
dc.publisher: 서강대학교 언어정보연구소
dc.title: 부정극어 허가 처리 평가: 영어 교과서를 학습한 L2 신경망 언어 모델을 활용하여
dc.title.alternative: An assessment of processing negative polarity items by an L2 neural language model trained on English textbooks
dc.type: Article
dc.publisher.location: Republic of Korea
dc.identifier.doi: 10.29211/soli.2022.46..004
dc.identifier.bibliographicCitation: 언어와 정보 사회, v.46, pp. 103-125
dc.citation.title: 언어와 정보 사회
dc.citation.volume: 46
dc.citation.startPage: 103
dc.citation.endPage: 125
dc.identifier.kciid: ART002864758
dc.description.isOpenAccess: N
dc.description.journalRegisteredClass: kci
dc.subject.keywordAuthor: neural language model
dc.subject.keywordAuthor: NPI licensing
dc.subject.keywordAuthor: training data
dc.subject.keywordAuthor: L2 textbook assessment
dc.subject.keywordAuthor: 신경망 언어 모델 (neural language model)
dc.subject.keywordAuthor: 부정극어 허가 (NPI licensing)
dc.subject.keywordAuthor: 훈련 데이터 (training data)
dc.subject.keywordAuthor: L2 교재 평가 (L2 textbook assessment)
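The abstract describes a psycholinguistic paradigm in which an NLM's next-word predictions are compared across NPI-licensing contexts. As a minimal sketch of the kind of contrast being measured, the snippet below computes surprisal for the NPI "ever" in a licensed versus an unlicensed context; the probabilities are invented for illustration and are not values from the paper or from GRNN.

```python
import math

# Hypothetical next-word probabilities P("ever" | context), invented for
# illustration. A model with knowledge of NPI licensing should assign "ever"
# higher probability after a negative licensor ("No student has ever ...")
# than after a non-negative subject ("*Some student has ever ...").
p_ever = {
    "No student has": 0.08,    # licensed: negation precedes the NPI
    "Some student has": 0.002,  # unlicensed: no negative licensor
}

def surprisal(p: float) -> float:
    """Surprisal in bits: -log2 P(word | context)."""
    return -math.log2(p)

licensed = surprisal(p_ever["No student has"])
unlicensed = surprisal(p_ever["Some student has"])

# Lower surprisal in the licensed context indicates sensitivity to NPI licensing.
assert licensed < unlicensed
print(f"licensed: {licensed:.2f} bits, unlicensed: {unlicensed:.2f} bits")
```

In the study's design, such per-context surprisal differences extracted from the L1- and L2-trained models are what license the conclusion that one model has attained more grammatical knowledge of NPI licensing than the other.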
Files in This Item
There are no files associated with this item.
Appears in Collections
College of Humanities > Division of English Language & Literature > 1. Journal Articles



Related Researcher: Park, Myung Kwan, College of Humanities (Division of English Language and Literature)
