PSS-net: Parallel semantic segmentation network for detecting marine animals in underwater scene
| DC Field | Value | Language |
|---|---|---|
| dc.contributor.author | Kim, Yu Hwan | - |
| dc.contributor.author | Park, Kang Ryoung | - |
| dc.date.accessioned | 2023-04-27T09:40:40Z | - |
| dc.date.available | 2023-04-27T09:40:40Z | - |
| dc.date.issued | 2022-09 | - |
| dc.identifier.issn | 2296-7745 | - |
| dc.identifier.uri | https://scholarworks.dongguk.edu/handle/sw.dongguk/2536 | - |
| dc.description.abstract | Marine scene segmentation is a core technology in marine biology and autonomous underwater vehicle research. However, it is challenging because the underwater environment differs substantially from that of conventional road-traffic segmentation. There are two major challenges. The first is the difficulty of locating objects under seawater owing to the relatively low-light environment. The second is segmenting marine animals with protective colors. To address these challenges, previous research proposed simultaneously segmenting the foreground and the background through a simple modification of a conventional model; however, this approach offers limited gains in segmentation accuracy. Therefore, we propose a parallel semantic segmentation network in which a model and a loss are employed to locate the foreground and the background separately. The proposed method reinforces the training task of locating the foreground and the background by adding an attention technique to the parallel model. Furthermore, the final segmentation is performed by aggregating the two feature maps obtained by separately locating the foreground and the background. The test results using an open dataset for marine animal segmentation reveal that the proposed method achieves 87%, 97.3%, 88%, 95.2%, and 0.029 in mean intersection over union, structure similarity, weighted F-measure, enhanced-alignment measure, and mean absolute error, respectively. These findings confirm that the proposed method is more accurate than state-of-the-art methods. The proposed model and code are publicly available via GitHub(1). | - |
| dc.format.extent | 13 | - |
| dc.language | English | - |
| dc.language.iso | ENG | - |
| dc.publisher | FRONTIERS MEDIA SA | - |
| dc.title | PSS-net: Parallel semantic segmentation network for detecting marine animals in underwater scene | - |
| dc.type | Article | - |
| dc.publisher.location | Switzerland | - |
| dc.identifier.doi | 10.3389/fmars.2022.1003568 | - |
| dc.identifier.scopusid | 2-s2.0-85138532173 | - |
| dc.identifier.wosid | 000861088500001 | - |
| dc.identifier.bibliographicCitation | Frontiers in Marine Science, v.9, pp. 1 - 13 | - |
| dc.citation.title | Frontiers in Marine Science | - |
| dc.citation.volume | 9 | - |
| dc.citation.startPage | 01 | - |
| dc.citation.endPage | 13 | - |
| dc.type.docType | Article | - |
| dc.description.isOpenAccess | Y | - |
| dc.description.journalRegisteredClass | scie | - |
| dc.description.journalRegisteredClass | scopus | - |
| dc.relation.journalResearchArea | Environmental Sciences & Ecology | - |
| dc.relation.journalResearchArea | Marine & Freshwater Biology | - |
| dc.relation.journalWebOfScienceCategory | Environmental Sciences | - |
| dc.relation.journalWebOfScienceCategory | Marine & Freshwater Biology | - |
| dc.subject.keywordAuthor | detecting marine animal | - |
| dc.subject.keywordAuthor | underwater scene | - |
| dc.subject.keywordAuthor | protective colors | - |
| dc.subject.keywordAuthor | PSS-net | - |
| dc.subject.keywordAuthor | attention technique | - |
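
The abstract above outlines the core mechanism: two parallel branches, each with its own attention module and loss, locate the foreground and the background separately, and their feature maps are aggregated for the final mask. Below is a minimal PyTorch sketch of that idea. The layer sizes, the `SpatialAttention` block, the `ParallelSegNet` and `parallel_loss` names, and the unweighted loss sum are all illustrative assumptions, not the authors' published PSS-net architecture.

```python
# Minimal sketch of a two-branch "parallel" segmentation design: one branch
# is supervised to locate the foreground (marine animals), the other the
# background, and their feature maps are fused for the final prediction.
# All module choices here are assumptions for illustration only.
import torch
import torch.nn as nn

class ConvBlock(nn.Module):
    def __init__(self, in_ch, out_ch):
        super().__init__()
        self.block = nn.Sequential(
            nn.Conv2d(in_ch, out_ch, 3, padding=1),
            nn.BatchNorm2d(out_ch),
            nn.ReLU(inplace=True),
        )
    def forward(self, x):
        return self.block(x)

class SpatialAttention(nn.Module):
    """Simple spatial attention: reweights features by a learned saliency map."""
    def __init__(self, ch):
        super().__init__()
        self.att = nn.Sequential(nn.Conv2d(ch, 1, 1), nn.Sigmoid())
    def forward(self, x):
        return x * self.att(x)

class ParallelSegNet(nn.Module):
    def __init__(self, in_ch=3, feat=32):
        super().__init__()
        self.stem = ConvBlock(in_ch, feat)          # shared low-level features
        self.fg_branch = nn.Sequential(ConvBlock(feat, feat), SpatialAttention(feat))
        self.bg_branch = nn.Sequential(ConvBlock(feat, feat), SpatialAttention(feat))
        self.fg_head = nn.Conv2d(feat, 1, 1)        # foreground logits
        self.bg_head = nn.Conv2d(feat, 1, 1)        # background logits
        self.fuse = nn.Conv2d(2 * feat, 1, 1)       # aggregate the two feature maps

    def forward(self, x):
        shared = self.stem(x)
        f_fg = self.fg_branch(shared)
        f_bg = self.bg_branch(shared)
        # Final mask comes from the aggregated foreground/background features.
        final = self.fuse(torch.cat([f_fg, f_bg], dim=1))
        return self.fg_head(f_fg), self.bg_head(f_bg), final

def parallel_loss(fg_logit, bg_logit, final, mask):
    # Each branch gets its own supervision; the background target is the
    # complement of the foreground mask. Equal loss weights are assumed.
    bce = nn.functional.binary_cross_entropy_with_logits
    return bce(fg_logit, mask) + bce(bg_logit, 1.0 - mask) + bce(final, mask)

if __name__ == "__main__":
    net = ParallelSegNet()
    img = torch.randn(2, 3, 64, 64)
    mask = torch.randint(0, 2, (2, 1, 64, 64)).float()
    fg, bg, out = net(img)
    print(parallel_loss(fg, bg, out, mask).item())
```

Supervising the background branch with the complement of the foreground mask is what makes the two training tasks genuinely parallel rather than redundant; the fusion layer then reconciles disagreements between the two branches when producing the final segmentation.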
