Parameter-Efficient 12-Lead ECG Reconstruction from a Single Lead
- Authors
- Lee, Junseok; Yoo, Yeonho; Kim, Jinkyu; Lim, Dosun; Yang, Gyeongsik; Yoo, Chuck
- Issue Date
- 2026
- Publisher
- Springer Science and Business Media Deutschland GmbH
- Keywords
- ECG reconstruction; Frequency-based segment partitioning; mEcgNet; Parameter-efficient model; Wearable IoT device
- Citation
- Medical Image Computing and Computer Assisted Intervention – MICCAI 2025, v.15961 LNCS, pp. 431–441
- Pages
- 11
- Indexed
- SCOPUS
- Journal Title
- Medical Image Computing and Computer Assisted Intervention – MICCAI 2025
- Volume
- 15961 LNCS
- Start Page
- 431
- End Page
- 441
- URI
- https://scholarworks.dongguk.edu/handle/sw.dongguk/61728
- DOI
- 10.1007/978-3-032-04937-7_41
- ISSN
- 0302-9743
1611-3349
- Abstract
- With the rise of wearable IoT devices such as smartwatches and smart rings, ECG signals have become more accessible, making cardiovascular monitoring a practical reality. However, analyzing ECG signals for complex conditions, such as bundle branch blocks and myocardial infarction, requires multi-lead ECG data. Although various deep learning models for ECG reconstruction have been proposed, they are computationally expensive and unsuitable for resource-constrained wearable IoT devices. To address this challenge, we propose mEcgNet, a parameter-efficient model for reconstructing 12-lead ECG signals from a single lead. mEcgNet introduces a modular deep learning architecture for parameter efficiency and separates the single lead-I signal into multiple frequency segments to improve accuracy. Our experiments demonstrate that mEcgNet reduces the number of parameters and the inference time by ∼23.1× and ∼5.4×, respectively, compared to existing state-of-the-art models, while also reducing the reconstruction error by ∼22.1%, demonstrating its accuracy and efficiency. © 2025 Elsevier B.V., All rights reserved.
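The abstract mentions separating the lead-I signal into multiple frequency segments before reconstruction. The paper's exact partitioning scheme is not given in this record, but a minimal sketch of the general idea — splitting a signal into band-limited components via FFT masking, with hypothetical band edges chosen for illustration — might look like:

```python
import numpy as np

def partition_by_frequency(x, fs, band_edges):
    """Split a 1-D signal into band-limited segments via FFT masking.

    band_edges like [0.5, 8, 40] yields segments covering
    [0, 0.5), [0.5, 8), [8, 40), and [40, fs/2] Hz.
    The segments sum back to the original signal.
    """
    X = np.fft.rfft(x)
    freqs = np.fft.rfftfreq(len(x), d=1.0 / fs)
    edges = [0.0] + list(band_edges) + [fs / 2 + 1]
    segments = []
    for lo, hi in zip(edges[:-1], edges[1:]):
        mask = (freqs >= lo) & (freqs < hi)
        segments.append(np.fft.irfft(X * mask, n=len(x)))
    return segments

# Toy lead-I-like signal: a slow "heartbeat" component plus higher-frequency detail.
fs = 500                       # assumed sampling rate (Hz)
t = np.arange(fs * 2) / fs
x = np.sin(2 * np.pi * 1.2 * t) + 0.3 * np.sin(2 * np.pi * 25 * t)

segs = partition_by_frequency(x, fs, [0.5, 8, 40])  # band edges are illustrative
# Since the masks partition the spectrum, the segments reassemble the signal.
assert np.allclose(sum(segs), x, atol=1e-8)
```

Each band-limited segment could then be fed to a separate lightweight module, which is consistent with the modular, parameter-efficient architecture the abstract describes; the actual mEcgNet design should be taken from the paper itself.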
- Files in This Item
- There are no files associated with this item.
- Appears in Collections
- ETC > 1. Journal Articles

Items in ScholarWorks are protected by copyright, with all rights reserved, unless otherwise indicated.