Which LSTM Type is Better for Interaction Force Estimation?
- Authors
- Cho, Hyeon; Kim, Hyungho; Ko, Dae-Kwan; Lim, Soo-Chul; Hwang, Wonjun
- Issue Date
- Nov-2019
- Publisher
- IEEE
- Citation
- 2019 7TH INTERNATIONAL CONFERENCE ON ROBOT INTELLIGENCE TECHNOLOGY AND APPLICATIONS (RITA), pp 61 - 66
- Pages
- 6
- Indexed
- SCOPUS
- Journal Title
- 2019 7TH INTERNATIONAL CONFERENCE ON ROBOT INTELLIGENCE TECHNOLOGY AND APPLICATIONS (RITA)
- Start Page
- 61
- End Page
- 66
- URI
- https://scholarworks.dongguk.edu/handle/sw.dongguk/8636
- DOI
- 10.1109/RITAPP.2019.8932854
- Abstract
- Touch, one of the five main human senses, is the first sense to develop as a human being forms. Tactile information includes pressure, temperature, and the texture of objects, and it helps a person interact with the surrounding environment. Pressure, one kind of tactile information, is used in fields such as medicine, beauty, and mobile devices. However, humans perceive the real world through multi-modal senses such as sound and vision. In this paper, we study interaction force estimation using a haptic sensor and video. Interaction force estimation through video analysis is a cross-modal approach, applicable for example as a software haptic-feedback method that can provide haptic feedback for remote control of a robot arm by predicting the interaction force even in the absence of a haptic sensor. We compare and analyze three types of deep neural networks for predicting the interaction force. In particular, the best model for the stacked CNN-LSTM structure is selected through a detailed analysis of how structural changes to the LSTM affect the video regression problem. The best model achieves an average error of MSE 0.1306, RMSE 0.2740, and MAE 0.1878.
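For reference, the three error metrics reported above (MSE, RMSE, MAE) can be computed from a true and a predicted force trace as sketched below. The force values and the `regression_errors` helper are illustrative assumptions, not the authors' code or data.

```python
import math

def regression_errors(y_true, y_pred):
    """Compute MSE, RMSE, and MAE between true and predicted force values."""
    n = len(y_true)
    mse = sum((t - p) ** 2 for t, p in zip(y_true, y_pred)) / n
    mae = sum(abs(t - p) for t, p in zip(y_true, y_pred)) / n
    return {"MSE": mse, "RMSE": math.sqrt(mse), "MAE": mae}

# Hypothetical interaction-force traces (in arbitrary units),
# standing in for sensor ground truth vs. network prediction.
true_force = [0.0, 0.5, 1.2, 2.0, 1.4]
pred_force = [0.1, 0.4, 1.0, 2.3, 1.5]
print(regression_errors(true_force, pred_force))
```

Note that RMSE is simply the square root of MSE, so the paper's reported 0.2740 RMSE is consistent with its 0.1306 MSE (sqrt(0.1306) ≈ 0.3614 would hold only for a single aggregate; averaging per-sequence RMSE values, as is common, gives a different number than sqrt of the averaged MSE).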
- Files in This Item
- There are no files associated with this item.
- Appears in Collections
- College of Engineering > Department of Mechanical, Robotics and Energy Engineering > 1. Journal Articles