A Continuous Music Recommendation Method Considering Emotional Change
| DC Field | Value | Language |
|---|---|---|
| dc.contributor.author | Baek, Se In | - |
| dc.contributor.author | Lee, Yong Kyu | - |
| dc.date.accessioned | 2025-07-21T07:30:10Z | - |
| dc.date.available | 2025-07-21T07:30:10Z | - |
| dc.date.issued | 2025-06 | - |
| dc.identifier.issn | 2076-3417 | - |
| dc.identifier.uri | https://scholarworks.dongguk.edu/handle/sw.dongguk/58758 | - |
| dc.description.abstract | Music, movies, books, pictures, and other media can change a user's emotions, which are important factors in recommending appropriate items. As users' emotions change over time, the content they select may vary accordingly. Existing emotion-based content recommendation methods primarily recommend content based on the user's current emotional state. In this study, we propose a continuous music recommendation method that adapts to a user's changing emotions. Based on Thayer's emotion model, emotions were classified into four areas, and music and user emotion vectors were created by analyzing the relationships between valence, arousal, and each emotion using a multiple regression model. Based on the user's emotional history data, a personalized mental model (PMM) was created using a Markov chain. The PMM was used to predict future emotions and generate user emotion vectors for each period. A recommendation list was created by calculating the similarity between music emotion vectors and user emotion vectors. To prove the effectiveness of the proposed method, the accuracies of the music emotion analysis, user emotion prediction, and music recommendation results were evaluated. In the experiments, the PMM and a modified mental model (MMM) were used to predict user emotions and generate recommendation lists. The accuracy of the music emotion analysis was 87.26%, and the accuracy of user emotion prediction was 86.72%, an improvement of 13.68% compared with the MMM. Additionally, the balanced accuracy of the music recommendation was 79.31%, an improvement of 26.88% compared with the MMM. The proposed method can recommend content that is suitable for users. | - |
| dc.format.extent | 34 | - |
| dc.language | English | - |
| dc.language.iso | ENG | - |
| dc.publisher | MDPI | - |
| dc.title | A Continuous Music Recommendation Method Considering Emotional Change | - |
| dc.type | Article | - |
| dc.publisher.location | Switzerland | - |
| dc.identifier.doi | 10.3390/app15137222 | - |
| dc.identifier.scopusid | 2-s2.0-105010314092 | - |
| dc.identifier.wosid | 001526196700001 | - |
| dc.identifier.bibliographicCitation | Applied Sciences, v.15, no.13, pp 1 - 34 | - |
| dc.citation.title | Applied Sciences | - |
| dc.citation.volume | 15 | - |
| dc.citation.number | 13 | - |
| dc.citation.startPage | 1 | - |
| dc.citation.endPage | 34 | - |
| dc.type.docType | Article | - |
| dc.description.isOpenAccess | Y | - |
| dc.description.journalRegisteredClass | scie | - |
| dc.description.journalRegisteredClass | scopus | - |
| dc.relation.journalResearchArea | Chemistry | - |
| dc.relation.journalResearchArea | Engineering | - |
| dc.relation.journalResearchArea | Materials Science | - |
| dc.relation.journalResearchArea | Physics | - |
| dc.relation.journalWebOfScienceCategory | Chemistry, Multidisciplinary | - |
| dc.relation.journalWebOfScienceCategory | Engineering, Multidisciplinary | - |
| dc.relation.journalWebOfScienceCategory | Materials Science, Multidisciplinary | - |
| dc.relation.journalWebOfScienceCategory | Physics, Applied | - |
| dc.subject.keywordPlus | EXPRESSIONS | - |
| dc.subject.keywordPlus | MODELS | - |
| dc.subject.keywordAuthor | emotional changes | - |
| dc.subject.keywordAuthor | emotion prediction | - |
| dc.subject.keywordAuthor | multiple regression analysis | - |
| dc.subject.keywordAuthor | Markov chain | - |
| dc.subject.keywordAuthor | personalized mental model | - |
| dc.subject.keywordAuthor | cosine similarity | - |
| dc.subject.keywordAuthor | recommendation system | - |
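The abstract describes a pipeline of three steps: estimate a personalized mental model (a Markov transition matrix) from the user's emotion history, propagate the current emotional state forward to predict a future emotion distribution, and rank tracks by cosine similarity between music emotion vectors and the predicted user emotion vector. The sketch below illustrates that pipeline under stated assumptions; it is not the authors' implementation. The quadrant labels, the Laplace smoothing, and all function names (`transition_matrix`, `predict_distribution`, `recommend`) are illustrative choices, not details taken from the paper.

```python
import numpy as np

# Assumed labels for the four quadrants of Thayer's emotion model
# (the paper classifies emotions into four areas; exact labels are not given here).
STATES = ["exuberance", "anxiety", "contentment", "depression"]

def transition_matrix(history, n_states=4):
    """Estimate a Markov transition matrix (the PMM) from a user's
    emotion-state history, given as a list of state indices.
    Laplace smoothing (the +1 initialization) is an assumption."""
    counts = np.ones((n_states, n_states))
    for prev, nxt in zip(history, history[1:]):
        counts[prev, nxt] += 1
    return counts / counts.sum(axis=1, keepdims=True)

def predict_distribution(pmm, current_state, steps=1):
    """Predict the user's emotion distribution `steps` periods ahead
    by repeatedly applying the transition matrix."""
    dist = np.zeros(pmm.shape[0])
    dist[current_state] = 1.0
    for _ in range(steps):
        dist = dist @ pmm
    return dist

def cosine(a, b):
    """Cosine similarity between two emotion vectors."""
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

def recommend(music_vectors, user_vector, top_k=3):
    """Rank tracks by cosine similarity between each music emotion
    vector and the (predicted) user emotion vector."""
    scores = {track: cosine(vec, user_vector)
              for track, vec in music_vectors.items()}
    return sorted(scores, key=scores.get, reverse=True)[:top_k]
```

In this reading, the predicted distribution over the four quadrants serves directly as the user emotion vector for the target period, and the recommendation list is the top-k tracks by similarity; how the paper derives music emotion vectors via multiple regression on valence and arousal is not reproduced here.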
