Cited 134 times
Deep Learning-Based Gaze Detection System for Automobile Drivers Using a NIR Camera Sensor
| DC Field | Value | Language |
|---|---|---|
| dc.contributor.author | Naqvi, Rizwan Ali | - |
| dc.contributor.author | Arsalan, Muhammad | - |
| dc.contributor.author | Batchuluun, Ganbayar | - |
| dc.contributor.author | Yoon, Hyo Sik | - |
| dc.contributor.author | Park, Kang Ryoung | - |
| dc.date.accessioned | 2024-08-08T03:30:46Z | - |
| dc.date.available | 2024-08-08T03:30:46Z | - |
| dc.date.issued | 2018-02 | - |
| dc.identifier.issn | 1424-8220 | - |
| dc.identifier.issn | 1424-3210 | - |
| dc.identifier.uri | https://scholarworks.dongguk.edu/handle/sw.dongguk/16968 | - |
| dc.description.abstract | A paradigm shift is required to prevent the increasing number of automobile accident deaths, which are mostly due to the inattentive behavior of drivers. Knowledge of the gaze region can provide valuable information regarding a driver's point of attention. Accurate and inexpensive gaze classification systems in cars can improve safe driving. However, monitoring real-time driving behaviors and conditions presents some challenges: dizziness due to long drives, extreme lighting variations, reflections on glasses, and occlusions. Past studies on gaze detection in cars have been based chiefly on head movements. The margin of error in gaze detection increases when drivers gaze at objects by moving their eyes without moving their heads. To solve this problem, a pupil center corneal reflection (PCCR)-based method has been considered. However, accurately detecting the pupil center and the corneal reflection center becomes more error-prone in a car environment owing to varying ambient light, reflections on the surface of glasses, and motion and optical blurring of the captured eye images. In addition, existing PCCR-based methods require initial user calibration, which is difficult to perform in a car environment. To address this issue, we propose a deep learning-based gaze detection method using a near-infrared (NIR) camera sensor that considers driver head and eye movement and does not require any initial user calibration. The proposed system is evaluated on our self-constructed database as well as on the open Columbia Gaze dataset (CAVE-DB). The proposed method demonstrated greater accuracy than previous gaze classification methods. | - |
| dc.language | English | - |
| dc.language.iso | ENG | - |
| dc.publisher | MDPI | - |
| dc.title | Deep Learning-Based Gaze Detection System for Automobile Drivers Using a NIR Camera Sensor | - |
| dc.type | Article | - |
| dc.publisher.location | Switzerland | - |
| dc.identifier.doi | 10.3390/s18020456 | - |
| dc.identifier.scopusid | 2-s2.0-85041704473 | - |
| dc.identifier.wosid | 000427544000139 | - |
| dc.identifier.bibliographicCitation | SENSORS, v.18, no.2 | - |
| dc.citation.title | SENSORS | - |
| dc.citation.volume | 18 | - |
| dc.citation.number | 2 | - |
| dc.type.docType | Article | - |
| dc.description.isOpenAccess | Y | - |
| dc.description.journalRegisteredClass | scie | - |
| dc.description.journalRegisteredClass | scopus | - |
| dc.relation.journalResearchArea | Chemistry | - |
| dc.relation.journalResearchArea | Engineering | - |
| dc.relation.journalResearchArea | Instruments & Instrumentation | - |
| dc.relation.journalWebOfScienceCategory | Chemistry, Analytical | - |
| dc.relation.journalWebOfScienceCategory | Engineering, Electrical & Electronic | - |
| dc.relation.journalWebOfScienceCategory | Instruments & Instrumentation | - |
| dc.subject.keywordPlus | REAL-TIME EYE | - |
| dc.subject.keywordPlus | VISUAL-ATTENTION | - |
| dc.subject.keywordPlus | TRACKING | - |
| dc.subject.keywordPlus | ROBUST | - |
| dc.subject.keywordPlus | POSE | - |
| dc.subject.keywordPlus | PERFORMANCE | - |
| dc.subject.keywordPlus | IRIS | - |
| dc.subject.keywordAuthor | eye gaze tracking | - |
| dc.subject.keywordAuthor | driver attention | - |
| dc.subject.keywordAuthor | NIR camera sensor | - |
| dc.subject.keywordAuthor | deep learning | - |
| dc.subject.keywordAuthor | user calibration | - |
