Detailed Information

Cited 1 time in Web of Science · Cited 1 time in Scopus

Reactive virtual agent learning for NUI-based HRI applications

Full metadata record
DC Field	Value	Language
dc.contributor.author	Jin, Daxing	-
dc.contributor.author	Cho, Seoungjae	-
dc.contributor.author	Sung, Yunsick	-
dc.contributor.author	Cho, Kyungeun	-
dc.contributor.author	Um, Kyhyun	-
dc.date.accessioned	2024-09-25T03:00:48Z	-
dc.date.available	2024-09-25T03:00:48Z	-
dc.date.issued	2016-12	-
dc.identifier.issn	1380-7501	-
dc.identifier.issn	1573-7721	-
dc.identifier.uri	https://scholarworks.dongguk.edu/handle/sw.dongguk/23455	-
dc.description.abstract	The natural user interface (NUI) has been investigated in a variety of application software fields. This paper proposes an approach to generating virtual agents that can support users of NUI-based applications through human-robot interaction (HRI) learning in a virtual environment. Conventional HRI learning is carried out by repeating processes that are time-consuming, complicated, and dangerous because of certain features of robots. Therefore, a method is needed to train virtual agents that interact with virtual humans imitating human movements in a virtual environment. The resulting virtual agent can then be applied to NUI-based interactive applications after the interaction learning is completed. The proposed method was applied to a model of a typical house in a virtual environment, with a virtual human performing daily-life activities such as washing, eating, and watching TV. The results show that the virtual agent can predict a human's intent, identify actions that are helpful to the human, and provide services 16% faster than a virtual agent trained using traditional Q-learning.	-
dc.format.extent	14	-
dc.language	English	-
dc.language.iso	ENG	-
dc.publisher	SPRINGER	-
dc.title	Reactive virtual agent learning for NUI-based HRI applications	-
dc.type	Article	-
dc.publisher.location	Netherlands	-
dc.identifier.doi	10.1007/s11042-014-2048-5	-
dc.identifier.scopusid	2-s2.0-84901729002	-
dc.identifier.wosid	000388121700002	-
dc.identifier.bibliographicCitation	MULTIMEDIA TOOLS AND APPLICATIONS, v.75, no.23, pp 15157-15170	-
dc.citation.title	MULTIMEDIA TOOLS AND APPLICATIONS	-
dc.citation.volume	75	-
dc.citation.number	23	-
dc.citation.startPage	15157	-
dc.citation.endPage	15170	-
dc.type.docType	Article	-
dc.description.isOpenAccess	N	-
dc.description.journalRegisteredClass	scie	-
dc.description.journalRegisteredClass	scopus	-
dc.relation.journalResearchArea	Computer Science	-
dc.relation.journalResearchArea	Engineering	-
dc.relation.journalWebOfScienceCategory	Computer Science, Information Systems	-
dc.relation.journalWebOfScienceCategory	Computer Science, Software Engineering	-
dc.relation.journalWebOfScienceCategory	Computer Science, Theory & Methods	-
dc.relation.journalWebOfScienceCategory	Engineering, Electrical & Electronic	-
dc.subject.keywordAuthor	Natural user interface	-
dc.subject.keywordAuthor	Natural user experience	-
dc.subject.keywordAuthor	Human-robot interaction	-
dc.subject.keywordAuthor	Virtual agent learning	-
Files in This Item
There are no files associated with this item.
Appears in Collections
College of Advanced Convergence Engineering > Department of Computer Science and Artificial Intelligence > 1. Journal Articles

Items in ScholarWorks are protected by copyright, with all rights reserved, unless otherwise indicated.

Related Researcher

Cho, Kyung Eun
College of Advanced Convergence Engineering (Department of Computer Science and Artificial Intelligence)