Transformer with multi-head attention mechanism for bearing fault detection
| DC Field | Value | Language |
|---|---|---|
| dc.contributor.author | Kumar, Prashant | - |
| dc.contributor.author | Raouf, Izaz | - |
| dc.contributor.author | Azad, Muhammad Muzammil | - |
| dc.contributor.author | Kim, Heung Soo | - |
| dc.date.accessioned | 2026-03-17T07:00:19Z | - |
| dc.date.available | 2026-03-17T07:00:19Z | - |
| dc.date.issued | 2026-05 | - |
| dc.identifier.issn | 1568-4946 | - |
| dc.identifier.issn | 1872-9681 | - |
| dc.identifier.uri | https://scholarworks.dongguk.edu/handle/sw.dongguk/63999 | - |
| dc.description.abstract | Timely fault detection of bearings is vital to ensure the uninterrupted functioning of rotating machinery in industry. Research on fault detection strategies for bearings has accelerated due to developments in artificial intelligence and computation, although managing large amounts of intricate sensor data remains complex. Traditional approaches rely on handcrafted features and shallow learning models, which may struggle to capture complex fault patterns in high-dimensional sensor data. This work designs an innovative bearing fault detection method using a transformer model with a multi-head attention mechanism. The transformer architecture, renowned for its efficacy in natural language processing applications, is tailored to handle sequential sensor input efficiently without requiring explicitly identified features. The multi-head attention mechanism lets the model focus on different segments of the input sequence, capturing both the local and global dependencies that are essential for fault recognition. The model was evaluated on two real-world benchmark bearing datasets, namely the Case Western Reserve University (CWRU) bearing dataset and the Society for Machinery Failure Prevention Technology (MFPT) bearing dataset, and compared with deep learning baselines. This work also eliminates the need for extensive data preprocessing and achieves superior fault detection accuracy. The results reveal the efficacy of the transformer model with multi-head attention in accurately detecting bearing faults, showcasing its potential for real-time condition monitoring in industrial scenarios. These findings demonstrate transformers' contribution to predictive maintenance advancement, guiding the industry toward scalable, real-time monitoring that reduces downtime and increases sustainability. Future work could integrate edge computing for on-device use and adapt the approach to multi-fault scenarios. © 2026 Elsevier B.V. | - |
| dc.format.extent | 10 | - |
| dc.language | English | - |
| dc.language.iso | ENG | - |
| dc.publisher | Elsevier Ltd | - |
| dc.title | Transformer with multi-head attention mechanism for bearing fault detection | - |
| dc.type | Article | - |
| dc.publisher.location | Netherlands | - |
| dc.identifier.doi | 10.1016/j.asoc.2026.114980 | - |
| dc.identifier.scopusid | 2-s2.0-105032360996 | - |
| dc.identifier.wosid | 001715262800001 | - |
| dc.identifier.bibliographicCitation | Applied Soft Computing, v.194, pp 1 - 10 | - |
| dc.citation.title | Applied Soft Computing | - |
| dc.citation.volume | 194 | - |
| dc.citation.startPage | 1 | - |
| dc.citation.endPage | 10 | - |
| dc.type.docType | Article | - |
| dc.description.isOpenAccess | Y | - |
| dc.description.journalRegisteredClass | scie | - |
| dc.description.journalRegisteredClass | scopus | - |
| dc.relation.journalResearchArea | Computer Science | - |
| dc.relation.journalWebOfScienceCategory | Computer Science, Artificial Intelligence | - |
| dc.relation.journalWebOfScienceCategory | Computer Science, Interdisciplinary Applications | - |
| dc.subject.keywordAuthor | Bearing fault | - |
| dc.subject.keywordAuthor | Multi-head attention | - |
| dc.subject.keywordAuthor | Transformer | - |
| dc.subject.keywordAuthor | Vibration | - |
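The abstract describes multi-head attention attending to different segments of a sequential vibration signal to capture local and global dependencies. The paper's actual architecture and dimensions are not given in this record, so the following is only a minimal numpy sketch of scaled dot-product multi-head self-attention over a windowed signal; all names, shapes, and the random stand-in weights are illustrative assumptions.

```python
import numpy as np

def multi_head_attention(x, num_heads, rng):
    """Scaled dot-product multi-head self-attention.

    x: (seq_len, d_model) array, e.g. embedded samples of one vibration window.
    """
    seq_len, d_model = x.shape
    d_head = d_model // num_heads
    # Random projections stand in for learned parameters (illustration only).
    Wq, Wk, Wv, Wo = (rng.standard_normal((d_model, d_model)) / np.sqrt(d_model)
                      for _ in range(4))

    def project(W):
        # Project, then split the model dimension into heads: (heads, seq, d_head).
        return (x @ W).reshape(seq_len, num_heads, d_head).transpose(1, 0, 2)

    q, k, v = project(Wq), project(Wk), project(Wv)
    # Each head scores every position against every other: (heads, seq, seq).
    scores = q @ k.transpose(0, 2, 1) / np.sqrt(d_head)
    # Numerically stable softmax over the key axis.
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)
    heads = weights @ v                                   # (heads, seq, d_head)
    # Concatenate heads back to (seq, d_model) and mix with the output projection.
    concat = heads.transpose(1, 0, 2).reshape(seq_len, d_model)
    return concat @ Wo

rng = np.random.default_rng(0)
window = rng.standard_normal((128, 64))  # e.g. 128 time steps, 64-dim embedding
out = multi_head_attention(window, num_heads=8, rng=rng)
print(out.shape)  # prints (128, 64)
```

Because every head computes its own attention weights over the full sequence, some heads can specialize in nearby samples (local fault signatures) while others relate distant segments (global periodicity), which is the property the abstract attributes to the mechanism.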
