Cited 5 times
Two-dimensional attention-based multi-input LSTM for time series prediction
| DC Field | Value | Language |
|---|---|---|
| dc.contributor.author | Kim, Eun Been | - |
| dc.contributor.author | Park, Jung Hoon | - |
| dc.contributor.author | Lee, Yung-Seop | - |
| dc.contributor.author | Lim, Changwon | - |
| dc.date.accessioned | 2023-04-27T19:40:44Z | - |
| dc.date.available | 2023-04-27T19:40:44Z | - |
| dc.date.issued | 2021-01 | - |
| dc.identifier.issn | 2287-7843 | - |
| dc.identifier.issn | 2383-4757 | - |
| dc.identifier.uri | https://scholarworks.dongguk.edu/handle/sw.dongguk/5523 | - |
| dc.description.abstract | Time series prediction is an area of great interest to many people. Algorithms for time series prediction are widely used in many fields such as stock price, temperature, energy and weather forecasting; in addition, classical models as well as recurrent neural networks (RNNs) have been actively developed. Since the attention mechanism was introduced to neural network models, many new models with improved performance have been developed; in addition, models using attention twice have also recently been proposed, resulting in further performance improvements. In this paper, we consider time series prediction by introducing attention twice to an RNN model. The proposed model introduces H-attention and T-attention over output values and time step information to select useful information. We conduct experiments on stock price, temperature and energy data and confirm that the proposed model outperforms existing models. | - |
| dc.format.extent | 19 | - |
| dc.language | English | - |
| dc.language.iso | ENG | - |
| dc.publisher | KOREAN STATISTICAL SOC | - |
| dc.title | Two-dimensional attention-based multi-input LSTM for time series prediction | - |
| dc.type | Article | - |
| dc.publisher.location | Republic of Korea | - |
| dc.identifier.doi | 10.29220/CSAM.2021.28.1.039 | - |
| dc.identifier.scopusid | 2-s2.0-85102135897 | - |
| dc.identifier.wosid | 000616531100003 | - |
| dc.identifier.bibliographicCitation | COMMUNICATIONS FOR STATISTICAL APPLICATIONS AND METHODS, v.28, no.1, pp 39 - 57 | - |
| dc.citation.title | COMMUNICATIONS FOR STATISTICAL APPLICATIONS AND METHODS | - |
| dc.citation.volume | 28 | - |
| dc.citation.number | 1 | - |
| dc.citation.startPage | 39 | - |
| dc.citation.endPage | 57 | - |
| dc.type.docType | Article | - |
| dc.identifier.kciid | ART002682666 | - |
| dc.description.isOpenAccess | Y | - |
| dc.description.journalRegisteredClass | scopus | - |
| dc.description.journalRegisteredClass | esci | - |
| dc.description.journalRegisteredClass | kciCandi | - |
| dc.relation.journalResearchArea | Mathematics | - |
| dc.relation.journalWebOfScienceCategory | Statistics & Probability | - |
| dc.subject.keywordPlus | REPRESENTATIONS | - |
| dc.subject.keywordPlus | MODELS | - |
| dc.subject.keywordAuthor | recurrent neural network | - |
| dc.subject.keywordAuthor | correlation | - |
| dc.subject.keywordAuthor | attention | - |
| dc.subject.keywordAuthor | time series | - |
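The abstract describes applying attention twice to an RNN: an H-attention over the hidden (output) features and a T-attention over the time steps. As a rough illustration of that idea only, the sketch below weights a matrix of LSTM hidden states along both axes before pooling it into a context vector. The scoring functions and the `two_dim_attention` helper are hypothetical simplifications, not the paper's exact formulation.

```python
import numpy as np

def softmax(x, axis=-1):
    # Numerically stable softmax along the given axis
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def two_dim_attention(H):
    """Toy two-dimensional attention over H, a (T, d) matrix of
    LSTM hidden states across T time steps.

    'H-attention' weights the d hidden features and 'T-attention'
    weights the T time steps; both sets of weights are combined
    into one context vector. The mean-activation scores here are
    placeholders standing in for learned scoring networks."""
    # H-attention: one weight per hidden feature, shape (d,)
    h_weights = softmax(H.mean(axis=0))
    # T-attention: one weight per time step, shape (T,)
    t_weights = softmax(H.mean(axis=1))
    # Weight H along both axes, then pool over time, shape (d,)
    context = (t_weights[:, None] * H * h_weights[None, :]).sum(axis=0)
    return context, h_weights, t_weights
```

In the actual model the two attention scores would be produced by trained layers and fed into the LSTM's prediction head; this sketch only shows how the two weightings compose.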
Items in ScholarWorks are protected by copyright, with all rights reserved, unless otherwise indicated.
