Decoding BERT’s Internal Processing of Garden-Path Structures through Attention Maps
- Authors
- Lee, Jonghyun; Shin, Jeong-Ah
- Issue Date
- Jun-2023
- Publisher
- 한국영어학회 (Korean Association for the Study of English Language and Linguistics, KASELL)
- Keywords
- attention map; garden-path structure; Natural Language Processing; Psycholinguistics; Transformers
- Citation
- 영어학 (Korean Journal of English Language and Linguistics), v.23, pp. 461-481
- Pages
- 21
- Indexed
- SCOPUS; KCI
- Journal Title
- 영어학 (Korean Journal of English Language and Linguistics)
- Volume
- 23
- Start Page
- 461
- End Page
- 481
- URI
- https://scholarworks.dongguk.edu/handle/sw.dongguk/20601
- DOI
- 10.15738/kjell.23..202306.461
- ISSN
- 1598-1398
2586-7474
- Abstract
- Recent advances in deep neural language models such as BERT have produced remarkable performance on natural language processing tasks, yet how these models process language internally remains poorly understood. This study examines BERT's attention maps to uncover its internal processing of garden-path sentences. The analysis focuses on BERT's use of linguistic cues, namely transitivity, plausibility, and the presence of a comma, and evaluates its capacity to reanalyze initial misinterpretations. The results show that BERT exhibits human-like syntactic processing: it attends to the presence of a comma, is sensitive to transitivity, and reanalyzes misinterpretations, although it initially lacks sensitivity to plausibility. By concentrating on attention maps, the present study offers insight into the inner workings of BERT and contributes to a deeper understanding of how advanced neural language models acquire and process complex linguistic structures. © 2023 KASELL. All rights reserved.
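An attention map, in the sense the abstract uses, is the row-normalized score matrix a Transformer head computes over a sentence's tokens: entry [i, j] quantifies how strongly token i attends to token j, and cue words such as a comma can be read off as columns receiving high weight. The following is a minimal NumPy sketch of that computation, not the paper's method: it uses toy random projections rather than BERT's trained weights, and the garden-path sentence is an illustrative example, not one of the study's stimuli.

```python
import numpy as np

def attention_map(Q, K, V):
    """Scaled dot-product attention for a single head.

    Returns the head's output and its attention map: a row-stochastic
    matrix whose entry [i, j] is the weight token i places on token j.
    """
    d_k = Q.shape[-1]
    scores = Q @ K.T / np.sqrt(d_k)               # similarity of queries and keys
    scores -= scores.max(axis=-1, keepdims=True)  # stabilize the softmax
    weights = np.exp(scores)
    weights /= weights.sum(axis=-1, keepdims=True)  # softmax over keys (rows sum to 1)
    return weights @ V, weights

# Toy setup: one head over the tokens of an illustrative garden-path
# sentence. Real BERT-base applies 12 layers x 12 heads of this operation.
tokens = "While the man hunted the deer ran into the woods".split()
rng = np.random.default_rng(0)
d = 8
X = rng.normal(size=(len(tokens), d))  # stand-in token embeddings
_, attn = attention_map(X @ rng.normal(size=(d, d)),
                        X @ rng.normal(size=(d, d)),
                        X @ rng.normal(size=(d, d)))
# Each row of `attn` is one token's attention distribution over the sentence.
```

In practice, the analogous maps for BERT itself can be obtained per layer and head from libraries that expose attention weights at inference time, and then inspected at the disambiguating region of each garden-path item.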
- Files in This Item
- There are no files associated with this item.
- Appears in Collections
- College of Humanities > Division of English Language & Literature > 1. Journal Articles

Items in ScholarWorks are protected by copyright, with all rights reserved, unless otherwise indicated.