Journal of the Korean Society of Civil Engineers

  1. ์ •ํšŒ์› ยท ์ธํ•˜๋Œ€ํ•™๊ต ์Šค๋งˆํŠธ์‹œํ‹ฐ๊ณตํ•™์ „๊ณต ์„์‚ฌ๊ณผ์ • (Inha University ยท toyoro1@inha.edu)
  2. ์ธํ•˜๋Œ€ํ•™๊ต ๊ณต๊ฐ„์ •๋ณด๊ณตํ•™๊ณผ ์„์‚ฌ๊ณผ์ • (Inha University ยท smkim@inha.edu)
  3. ์ข…์‹ ํšŒ์› ยท ๊ต์‹ ์ €์ž ยท ์ธํ•˜๋Œ€ํ•™๊ต ์Šค๋งˆํŠธ์‹œํ‹ฐ๊ณตํ•™์ „๊ณต ยท ๊ณต๊ฐ„์ •๋ณด๊ณตํ•™๊ณผ ๋ถ€๊ต์ˆ˜ (Corresponding Author ยท Inha University ยท schong@inha.ac.kr)



์‹ค๋‚ด ์•ผ๊ฐ„ ์ˆœ์ฐฐ ๋กœ๋ด‡, ์ €์กฐ๋„ ์˜์ƒ๊ฐ•ํ™”, YOLOv8n-seg, ์‚ฌ๋žŒ ํƒ์ง€
Indoor night patrol robot, Low-light image enhancement, YOLOv8n-seg, Human detection

1. ์„œ ๋ก 

์‹ค๋‚ด ๋ณด์•ˆ๊ณผ ๋ฐฉ๋ฒ”์€ ๊ฑด๋ฌผ ๋‚ด๋ถ€์˜ ์ธ์ ยท๋ฌผ์  ์ž์›์„ ๋ณดํ˜ธํ•˜๊ณ  ์นจ์ž…, ์ ˆ๋„ ๋“ฑ์˜ ๋ฒ”์ฃ„๋ฅผ ์˜ˆ๋ฐฉํ•˜๋Š” ํ™œ๋™์œผ๋กœ, ์ „ํ†ต์ ์œผ๋กœ ๊ฒฝ๋น„ ์ธ๋ ฅ๊ณผ CCTV์— ์˜์กดํ•˜์—ฌ ์šด์˜๋˜๊ณ  ์žˆ๋‹ค. ํ•˜์ง€๋งŒ ๊ฒฝ๋น„ ์ธ๋ ฅ์˜ ๊ณ ๋ นํ™”์™€ ์ธ๋ ฅ ๋ถ€์กฑ์€ ๋Œ€ํ˜• ๊ฑด๋ฌผ์˜ ๋„“๊ณ  ๋ณต์žกํ•œ ๊ณต๊ฐ„์—์„œ ํšจ๊ณผ์ ์ธ ์ˆœ์ฐฐ๊ณผ ์‹ ์†ํ•œ ๋Œ€์‘์„ ์–ด๋ ต๊ฒŒ ํ•˜๊ณ  ์žˆ๋‹ค(Park and Bae, 2015; Kwak, 2014). ๋˜ํ•œ, CCTV๋Š” ๊ณ ์ •๋œ ์œ„์น˜์™€ ์ œํ•œ๋œ ์‹œ์•ผ๊ฐ์œผ๋กœ ์ธํ•ด ๊ฐ์‹œ ์‚ฌ๊ฐ์ง€๋Œ€๊ฐ€ ๋ฐœ์ƒํ•˜๋Š” ํ•œ๊ณ„๊ฐ€ ์žˆ๋‹ค(Lee et al., 2015). ์ „ํ†ต์ ์ธ ๋ฐฉ๋ฒ” ๋ฐ ๋ณด์•ˆ ์ฒด๊ณ„์˜ ๋‹จ์ ์„ ๋ณด์™„ํ•˜๊ณ  ํšจ์œจ์ ์ธ ์šด์˜์„ ์œ„ํ•ด ์ตœ๊ทผ์—๋Š” ์‹ค๋‚ด ๋กœ๋ด‡์ด ๋„์ž…๋˜์–ด ์šด์˜๋˜๊ณ  ์žˆ๋‹ค. ์‹ค๋‚ด ๋กœ๋ด‡์€ ์ง€์†์ ์ธ ์ด๋™๊ณผ ์‹ค์‹œ๊ฐ„ ์ฆ๊ฑฐ ์ˆ˜์ง‘์ด ๊ฐ€๋Šฅํ•˜๋ฏ€๋กœ, ๋Œ€๊ทœ๋ชจ ์‹ค๋‚ด ๊ณต๊ฐ„์—์„œ ๊ฐ€์ด๋“œ ์—ญํ•  ๋ฐ ๋ณด์•ˆ ์ˆœ์ฐฐ(Lopez et al., 2017), ์ธ๋ช… ํ”ผํ•ด๋ฅผ ์ตœ์†Œํ™”ํ•˜๊ธฐ ์œ„ํ•œ ์œ ๋…๊ฐ€์Šค ํƒ์ง€(Yousif and El-Medany, 2022), ๊ฑด๋ฌผ ์‹œ์„ค๋ฌผ์˜ ์œ ์ง€๊ด€๋ฆฌ(Lรณpez et al., 2013) ๋“ฑ์˜ ์ˆœ์ฐฐ ์—…๋ฌด๋ฅผ ์œ„ํ•ด ํ™œ์šฉ๋˜๊ณ  ์žˆ๋‹ค.

์ˆœ์ฐฐ ๋กœ๋ด‡์€ ์ฃผํ–‰ํ™˜๊ฒฝ๊ณผ ์ž„๋ฌด์— ๋”ฐ๋ผ ๊ถค๋„ํ˜• ๋กœ๋ด‡, ๋ฐ”ํ€ดํ˜• ๋กœ๋ด‡, 4์กฑ ๋ณดํ–‰ ๋กœ๋ด‡ ๋“ฑ ๋‹ค์–‘ํ•œ ํ˜•ํƒœ๋กœ ๊ฐœ๋ฐœ๋˜๊ณ  ์žˆ๋‹ค. ๊ถค๋„ํ˜• ๋กœ๋ด‡์€ ์ง€๋ฉด๊ณผ์˜ ์ ‘์ง€ ๋ฉด์ ์ด ๋„“์–ด ๋ชจ๋ž˜์™€ ์ง„ํ™, ํ—˜์ค€ํ•œ ์ง€ํ˜•์—์„œ ์•ˆ์ •์ ์ธ ์ฃผํ–‰์ด ๊ฐ€๋Šฅํ•˜๋ฏ€๋กœ, ๊ฑด์„ค ํ˜„์žฅ๊ณผ ์žฌ๋‚œ ์žฌํ•ด ์ง€์—ญ ๋“ฑ์˜ ์‹ค์™ธ ์ˆœ์ฐฐ ์ž„๋ฌด๋ฅผ ์œ„ํ•ด ๊ฐœ๋ฐœ๋˜๊ณ  ์žˆ์œผ๋ฉฐ(Hamid et al., 2022), ๋ฐ”ํ€ดํ˜• ๋กœ๋ด‡์€ ํ‰ํƒ„ํ•œ ์ง€๋ฉด์—์„œ ๋น ๋ฅธ ์ฃผํ–‰์ด ๊ฐ€๋Šฅํ•˜๋ฏ€๋กœ, ๋„๋กœ ์‹œ์„ค๋ฌผ ๊ด€๋ฆฌ์™€ ๊ณตํ•ญ๊ณผ ์ฃผ์ฐจ์žฅ ๋“ฑ๊ณผ ๊ฐ™์€ ๋„“์€ ์‹ค๋‚ด ๊ณต๊ฐ„์—์„œ ์ˆœ์ฐฐ ์ž„๋ฌด์— ํ™œ์šฉ๋˜๊ณ  ์žˆ๋‹ค(Chen, 2022). ๋ฐ˜๋ฉด, 4์กฑ ๋ณดํ–‰ ๋กœ๋ด‡์€ ๊ถค๋„ํ˜• ๋ฐ ๋ฐ”ํ€ดํ˜•๊ณผ ๋‹ฌ๋ฆฌ ๋†’์€ ์žฅ์• ๋ฌผ ํšŒํ”ผ์„ฑ ๋Šฅ๋ ฅ๊ณผ ์ง€ํ˜• ์ ์‘์„ฑ์„ ๊ฐ€์ง„๋‹ค. ํŠนํžˆ, ์‹ค๋‚ด ๊ณต๊ฐ„์—์„œ 4์กฑ ๋ณดํ–‰ ๋กœ๋ด‡์€ ๊ณ„๋‹จ์„ ํ†ตํ•œ ์ธต๊ฐ„ ์ด๋™์ด ๊ฐ€๋Šฅํ•˜๋ฏ€๋กœ ๋Œ€ํ˜• ์‹œ์„ค๋ฌผ๊ณผ ๊ณ ์ธต ๊ฑด๋ฌผ์—์„œ ์ˆœ์ฐฐ๊ณผ ๊ฐ์‹œ ์ž„๋ฌด์— ํšจ๊ณผ์ ์œผ๋กœ ์šด์˜๋˜๊ณ  ์žˆ๋‹ค(Kim et al., 2022).

๋กœ๋ด‡ ์ˆœ์ฐฐ์„ ์œ„ํ•ด ๊ด‘ํ•™ ์นด๋ฉ”๋ผ๋ฟ๋งŒ ์•„๋‹ˆ๋ผ ์ ์™ธ์„ , ์—ดํ™”์ƒ, ๊นŠ์ด ์นด๋ฉ”๋ผ, LiDAR(Light Detection And Ranging) ๋“ฑ์˜ ์„ผ์„œ๊ฐ€ ์ถ”๊ฐ€๋กœ ํƒ‘์žฌ๋˜๊ณ  ์žˆ๋‹ค(Yoo and Shin, 2024; Chang et al., 2022). ํ•˜์ง€๋งŒ, ๋‹ค์–‘ํ•œ ์„ผ์„œ๋ฅผ ํƒ‘์žฌํ•  ๊ฒฝ์šฐ ๋†’์€ ๋น„์šฉ๊ณผ ์ „๋ ฅ ์†Œ๋ชจ๊ฐ€ ๋ฐœ์ƒํ•˜๋ฉฐ, ๋Œ€์šฉ๋Ÿ‰์˜ ๋ฐ์ดํ„ฐ ์ฒ˜๋ฆฌ๋ฅผ ์œ„ํ•œ ๊ณ ์„ฑ๋Šฅ ์—ฐ์‚ฐ์žฅ์น˜์™€ ์„ผ์„œ ๊ฐ„ ๋™๊ธฐํ™”๊ฐ€ ํ•„์š”ํ•˜๋‹ค. ๋ฐ˜๋ฉด ๊ด‘ํ•™ ์นด๋ฉ”๋ผ๋Š” ๊ฒฝ์ œ์ ์ด๊ณ  ํšจ์œจ์ ์ธ ์„ผ์„œ๋กœ, ๊ตฌ๋งค ๋น„์šฉ๊ณผ ์ „๋ ฅ ์†Œ๋ชจ๊ฐ€ ์ƒ๋Œ€์ ์œผ๋กœ ๋‚ฎ๋‹ค. ๋˜ํ•œ, ์ˆœ์ฐฐ ๋กœ๋ด‡ ์ฃผ๋ณ€ ํ™˜๊ฒฝ์˜ ์ƒ‰์ƒ๊ณผ ์งˆ๊ฐ ๋“ฑ ํ’๋ถ€ํ•œ ์‹œ๊ฐ ์ •๋ณด๋ฅผ ์‹ค์‹œ๊ฐ„์œผ๋กœ ์ˆ˜์ง‘ํ•  ์ˆ˜ ์žˆ์–ด, ์›๊ฒฉ ์กฐ์ข…์ž์˜ ๋กœ๋ด‡ ์กฐ์ž‘์€ ๋ฌผ๋ก , ์ˆœ์ฐฐ ๊ตฌ์—ญ์˜ ํšจ์œจ์ ์ธ ํŒŒ์•…๊ณผ ์‹ ์†ํ•˜๊ณ  ์ •ํ™•ํ•œ ์˜์‚ฌ ๊ฒฐ์ •์„ ์ง€์›ํ•  ์ˆ˜ ์žˆ๋‹ค.

์ตœ๊ทผ์—๋Š” ์ˆœ์ฐฐ ๋กœ๋ด‡์ด ์ „์†กํ•˜๋Š” ์˜์ƒ์œผ๋กœ๋ถ€ํ„ฐ ๊ด€์‹ฌ ๊ฐ์ฒด๋ฅผ ์‹ ์†ํ•˜๊ณ  ์ •ํ™•ํ•˜๊ฒŒ ์‹๋ณ„ํ•˜๊ธฐ ์œ„ํ•ด ๋”ฅ๋Ÿฌ๋‹ ๊ธฐ๋ฐ˜์˜ ๊ฐ์ฒด ํƒ์ง€ ๋ฐ ๋ถ„ํ•  ๊ธฐ๋ฒ•์ด ๊ฐœ๋ฐœ๋˜๊ณ  ์žˆ๋‹ค. ํŠนํžˆ, ์ˆœ์ฐฐ ๋กœ๋ด‡์˜ ์‚ฌ๋žŒ ํƒ์ง€๋Š” ์‹ค๋‚ด ๋ฐฉ๋ฒ”, ์•ˆ์ „ ๋ชจ๋‹ˆํ„ฐ๋ง, ๊ธด๊ธ‰ ์ƒํ™ฉ ๋Œ€์‘ ๋“ฑ ๋‹ค์–‘ํ•œ ๋ถ„์•ผ์—์„œ ํ•„์ˆ˜์ ์ธ ๊ธฐ๋Šฅ์ด๋‹ค. ์นจ์ž…์ž๋ฅผ ํƒ์ง€ํ•˜์—ฌ ๊ฑด๋ฌผ์˜ ๋ณด์•ˆ์„ ์œ ์ง€ํ•˜๊ณ (Choi et al., 2022; Banerjee et al., 2024), ๊ฑด์„ค ํ˜„์žฅ์—์„œ ๊ทผ๋กœ์ž์˜ ๋ณดํ˜ธ์žฅ๋น„๋ฅผ ์‹๋ณ„ํ•˜์—ฌ ์•ˆ์ „์„ฑ์„ ๊ฐ•ํ™”ํ•˜๋ฉฐ(Lee and Chien, 2020), ๋‚™์ƒ ์‚ฌ๊ณ ๋ฅผ ๊ฐ์ง€ํ•ด ์‹ ์†ํ•œ ๋Œ€์‘์„ ๊ฐ€๋Šฅํ•˜๊ฒŒ ํ•œ๋‹ค(Lafuente-Arroyo et al., 2022). ํ•˜์ง€๋งŒ, ์ˆœ์ฐฐ ๋กœ๋ด‡์€ ์ฃผ๋กœ ๊ฑด๋ฌผ ์ถœ์ž…์ด ์ œํ•œ๋˜๋Š” ์•ผ๊ฐ„ ์‹œ๊ฐ„๋Œ€์— ํ™œ์šฉ๋˜๊ณ  ์žˆ๋‹ค. ์‹ค๋‚ด๋Š” ์กฐ๋ช… ์‚ฌ์šฉ ์—ฌ๋ถ€์™€ ์„ค์น˜ ์œ„์น˜์— ๋”ฐ๋ผ ์กฐ๋„ ํ™˜๊ฒฝ์ด ๋Š์ž„์—†์ด ๋ณ€ํ•˜๋ฏ€๋กœ, ์ˆœ์ฐฐ ์˜์ƒ์˜ ์ผ๊ด€๋œ ๋ฐ๊ธฐ์™€ ์ƒ‰์ƒ์„ ์œ ์ง€ํ•˜๊ธฐ ์–ด๋ ต๋‹ค. ํŠนํžˆ, ์กฐ๋ช…์ด ์•ฝํ•œ ํ™˜๊ฒฝ์—์„œ ์ดฌ์˜๋œ ์ˆœ์ฐฐ ์˜์ƒ์€ ์–ด๋‘ก๊ณ  ๋Œ€๋น„๊ฐ€ ๋‚ฎ์œผ๋ฉฐ ๋งŽ์€ ๋…ธ์ด์ฆˆ๋ฅผ ํฌํ•จํ•˜๋ฏ€๋กœ, ๊ฐ์ฒด ํƒ์ง€ ๋ฐ ๋ถ„ํ•  ๊ธฐ๋ฒ•์˜ ์„ฑ๋Šฅ์„ ์ €ํ•˜์‹œ์ผœ ์•ผ๊ฐ„ ์ˆœ์ฐฐ ๋กœ๋ด‡์˜ ์„ฑ๋Šฅ์„ ๊ฐ์†Œ์‹œํ‚ฌ ์ˆ˜ ์žˆ๋‹ค(Chen et al., 2023).

์ €์กฐ๋„ ์˜์ƒ๊ฐ•ํ™” ๊ธฐ๋ฒ•์€ ์กฐ๋„๊ฐ€ ๋‚ฎ์€ ํ™˜๊ฒฝ์—์„œ ์ดฌ์˜๋œ ์˜์ƒ์˜ ๋ฐ๊ธฐ, ๋Œ€๋น„, ์ƒ‰์ƒ์„ ๊ฐœ์„ ํ•˜์—ฌ ์‹œ๊ฐ์  ํ’ˆ์งˆ์„ ๋†’์ด๊ณ , ์˜์ƒ ๋ถ„์„ ๋ฐ ์ฒ˜๋ฆฌ ์„ฑ๋Šฅ์„ ํ–ฅ์ƒํ•˜๋Š” ๊ธฐ์ˆ ์ด๋‹ค(Jingchun et al., 2024). ์ €์กฐ๋„ ์˜์ƒ๊ฐ•ํ™” ๊ธฐ๋ฒ•์„ ์‚ฌ์šฉํ•˜๋ฉด ์–ด๋‘์šด ํ™˜๊ฒฝ์—์„œ ์ดฌ์˜ํ•œ ์‚ฌ๋žŒ ์˜์ƒ ๋ฐ์ดํ„ฐ๋ฅผ ๋ณ„๋„๋กœ ํ•™์Šตํ•˜์ง€ ์•Š์•„๋„, ๊ฐ•ํ™”๋œ ์˜์ƒ์„ ํ†ตํ•ด ์ˆœ์ฐฐ ๋กœ๋ด‡์˜ ์‚ฌ๋žŒ ํƒ์ง€ ๋ฐ ๋ถ„ํ•  ์„ฑ๋Šฅ์„ ํ–ฅ์ƒ์‹œํ‚ฌ ๊ฒƒ์œผ๋กœ ๊ธฐ๋Œ€๋œ๋‹ค. ์ด์— ๋ณธ ์—ฐ๊ตฌ๋Š” ์•ผ๊ฐ„ ์ˆœ์ฐฐ ๋กœ๋ด‡์˜ ํšจ์œจ์ ์ธ ์›๊ฒฉ ์กฐ์ž‘๊ณผ ์‚ฌ๋žŒ ํƒ์ง€ ์„ฑ๋Šฅ ํ–ฅ์ƒ์„ ๋ชฉํ‘œ๋กœ, ์ˆœ์ฐฐ ์˜์ƒ์˜ ์‹œ์ธ์„ฑ์„ ์‹ค์‹œ๊ฐ„์œผ๋กœ ๊ฐœ์„ ํ•  ์ˆ˜ ์žˆ๋Š” ์ €์กฐ๋„ ์˜์ƒ๊ฐ•ํ™” ๊ธฐ๋ฒ• ํ‰๊ฐ€ ๋ฐฉ๋ฒ•์„ ์ œ์•ˆํ•˜๊ณ ์ž ํ•œ๋‹ค. ๋ณธ ๋…ผ๋ฌธ์€ ๋‹ค์Œ๊ณผ ๊ฐ™์ด ๊ตฌ์„ฑ๋˜์–ด ์žˆ๋‹ค. 2์žฅ์—์„œ๋Š” ์—ฐ๊ตฌ ๋ฐฉ๋ฒ•์„ ์ œ์‹œํ•˜์˜€๋‹ค. ์ €์กฐ๋„ ์˜์ƒ๊ฐ•ํ™” ๊ธฐ๋ฒ•๊ณผ YOLOv8n-seg ๋ชจ๋ธ์„ ์†Œ๊ฐœํ•˜์˜€๊ณ  ์„ฑ๋Šฅ ๋ถ„์„์„ ์œ„ํ•œ ํ‰๊ฐ€ ์ง€ํ‘œ๋ฅผ ์„ค๋ช…ํ•˜์˜€๋‹ค. 3์žฅ์—์„œ๋Š” ๊ฑด๋ฌผ ์‹ค๋‚ด์—์„œ ์•ผ๊ฐ„ ์ˆœ์ฐฐ ์˜์ƒ์˜ ์‹œ์ธ์„ฑ ๊ฐœ์„ ๊ณผ ์‚ฌ๋žŒ ํƒ์ง€ ๊ฒฐ๊ณผ๋ฅผ ํ™•์ธํ•˜์˜€์œผ๋ฉฐ, 4์žฅ์—์„œ๋Š” ๊ฒฐ๋ก  ๋ฐ ํ–ฅํ›„ ์—ฐ๊ตฌ ๋ฐฉํ–ฅ์„ ์ œ์‹œํ•˜์˜€๋‹ค.

2. ์—ฐ๊ตฌ ๋ฐฉ๋ฒ• ๋ฐ ๋‚ด์šฉ

2.1 ๊ฐœ์š”

๊ฑด๋ฌผ ๋‚ด๋ถ€์˜ ์•ผ๊ฐ„ ์กฐ๋„ ํ™˜๊ฒฝ์—์„œ ์ €์กฐ๋„ ์˜์ƒ๊ฐ•ํ™” ๊ธฐ๋ฒ• ์ ์šฉ์— ๋”ฐ๋ฅธ ์ˆœ์ฐฐ ์˜์ƒ์˜ ์‹œ์ธ์„ฑ ๊ฐœ์„ ๊ณผ ์‚ฌ๋žŒ ํƒ์ง€ ์„ฑ๋Šฅ ํ–ฅ์ƒ ํšจ๊ณผ๋ฅผ ๋ถ„์„ํ•˜๊ธฐ ์œ„ํ•œ ์—ฐ๊ตฌ ํ๋ฆ„์€ Fig. 1๊ณผ ๊ฐ™๋‹ค. ๋จผ์ €, ์‹คํ—˜ ํ™˜๊ฒฝ ๊ตฌ์ถ• ๋‹จ๊ณ„์—์„œ๋Š” ๊ฑด๋ฌผ ๋‚ด๋ถ€์˜ ์•ผ๊ฐ„ ์กฐ๋„ ํ™˜๊ฒฝ์—์„œ ์‹ค๋‚ด ์ˆœ์ฐฐ ์˜์ƒ์„ ์ทจ๋“ํ•˜์˜€์œผ๋ฉฐ, ์‹ค์‹œ๊ฐ„ ์‚ฌ๋žŒ ํƒ์ง€๋ฅผ ์œ„ํ•ด ๊ฒฝ๋Ÿ‰ํ™”๋œ ๊ตฌ์กฐ๋กœ ๋กœ๋ด‡์˜ ์ œํ•œ๋œ ์—ฐ์‚ฐ ์ž์›์—์„œ๋„ ๋น ๋ฅธ ์ถ”๋ก ์„ ์ง€์›ํ•˜๋Š” YOLOv8n-seg ๋ชจ๋ธ์„ ์„ ์ •ํ•˜์˜€๋‹ค. ์ €์กฐ๋„ ์˜์ƒ๊ฐ•ํ™” ๊ธฐ๋ฒ• ํ‰๊ฐ€ ๋‹จ๊ณ„์—์„œ๋Š” ์‹ค๋‚ด ์ €์กฐ๋„ ํ™˜๊ฒฝ์— ๊ฐ•์ธํ•œ ์˜์ƒ๊ฐ•ํ™” ๊ธฐ๋ฒ•์„ ์„ ์ •ํ•˜์˜€๋‹ค. ๋‹ค์–‘ํ•œ ์ €์กฐ๋„ ์˜์ƒ๊ฐ•ํ™” ๊ธฐ๋ฒ•(GLADNet, KinD, TBEFN, LLFormer, EnlightenGAN, Zero-DCE)์„ ์‹ค๋‚ด ์ˆœ์ฐฐ ์˜์ƒ์— ์ ์šฉํ•˜์˜€๊ณ , ์œก์•ˆ์— ์˜ํ•œ ์ •์„ฑ์  ํ‰๊ฐ€์™€ ํ•จ๊ป˜ ์ •๋Ÿ‰์ ์ธ ์˜์ƒ ํ‰๊ฐ€ ์ง€ํ‘œ๋ฅผ ์‚ฐ์ถœํ•˜์—ฌ ์›๋ณธ ์˜์ƒ๊ณผ ๋น„๊ต ๋ถ„์„ํ•˜์˜€๋‹ค. ๋งˆ์ง€๋ง‰์œผ๋กœ, ์ €์กฐ๋„ ์˜์ƒ์˜ ์‹œ์ธ์„ฑ ๊ฐœ์„ ์— ๋”ฐ๋ฅธ ์‚ฌ๋žŒ ํƒ์ง€ ์„ฑ๋Šฅ ํ–ฅ์ƒ์„ ํ‰๊ฐ€ํ•˜๊ธฐ ์œ„ํ•ด, YOLOv8n-seg ๋ชจ๋ธ์˜ ํƒ์ง€ ์ •ํ™•๋„์™€ ํƒ์ง€ ์†๋„๋ฅผ ๋ถ„์„ํ•˜์—ฌ ์ˆœ์ฐฐ ๋กœ๋ด‡์˜ ์‹ค์‹œ๊ฐ„ ์‚ฌ๋žŒ ํƒ์ง€ ๊ฐ€๋Šฅ์„ฑ์„ ๊ฒ€ํ† ํ•˜์˜€๋‹ค.

Fig. 1. Research Flow


2.2 ์ €์กฐ๋„ ์˜์ƒ๊ฐ•ํ™” ๊ธฐ๋ฒ• ๋ฐ ํ‰๊ฐ€ ์ง€ํ‘œ

์‹ค๋‚ด ์•ผ๊ฐ„ ์ˆœ์ฐฐ ๋กœ๋ด‡์ด ์ดฌ์˜ํ•œ ์˜์ƒ์˜ ์ƒ‰์ƒ ๋ฐ ๋ฐ๊ธฐ๋ฅผ ๋ณต์›ํ•˜๊ณ  ๊ฐ์ฒด ํƒ์ง€ ์„ฑ๋Šฅ์„ ํ–ฅ์ƒ์‹œํ‚ค๊ธฐ ์œ„ํ•ด GLADNet, KinD, TBEFN, LLFormer, EnlightenGAN, Zero-DCE๋ฅผ ์‚ฌ์šฉํ•˜์˜€๋‹ค. GLADNet์€ ์ธ์ฝ”๋”-๋””์ฝ”๋” ๋„คํŠธ์›Œํฌ๋ฅผ ํ†ตํ•ด ์˜์ƒ์˜ ์ „์—ญ์ ์ธ ์กฐ๋„๋ฅผ ์ถ”์ •ํ•˜๊ณ , ์†์‹ค๋œ ์˜์ƒ์˜ ์„ธ๋ถ€ ์ •๋ณด๋ฅผ ์žฌ๊ตฌ์„ฑํ•จ์œผ๋กœ์จ ์ €์กฐ๋„ ์˜์ƒ์˜ ํ’ˆ์งˆ์„ ๊ฐœ์„ ํ•œ๋‹ค(Wang et al., 2018). KinD๋Š” Retinex ์›๋ฆฌ์— ๊ธฐ๋ฐ˜ํ•œ ๋ชจ๋ธ๋กœ, ์ž…๋ ฅ ์˜์ƒ์„ ๋ฐ˜์‚ฌ ์„ฑ๋ถ„๊ณผ ์กฐ๋ช… ์„ฑ๋ถ„์œผ๋กœ ๋ถ„ํ•ดํ•œ ํ›„, ๊ฐ ์„ฑ๋ถ„์„ ๊ฐ•ํ™”ํ•˜๊ณ  ๋‹ค์‹œ ๊ฒฐํ•ฉํ•˜์—ฌ ์ €์กฐ๋„ ์˜์ƒ์„ ๊ฐœ์„ ํ•œ๋‹ค(Zhang et al., 2019). TBEFN์€ ๋‘ ๊ฐœ์˜ ๋ธŒ๋žœ์น˜์—์„œ ๋‹ค์–‘ํ•œ ์กฐ๋ช… ์ˆ˜์ค€ ๊ฐ„์˜ ๋ณ€ํ™˜ ๊ด€๊ณ„๋ฅผ ๊ฐ๊ฐ ์ถ”์ •ํ•˜์—ฌ ๊ฐ•ํ™” ์˜์ƒ์„ ์ƒ์„ฑํ•˜๊ณ , ๊ฒฐํ•ฉํ•จ์œผ๋กœ์จ ์ €์กฐ๋„ ์˜์ƒ์„ ๊ฐœ์„ ํ•œ๋‹ค(Lu and Zhang, 2020). LLFormer๋Š” ๊ณ„์ธต์  ์ธ์ฝ”๋”-๋””์ฝ”๋” ๊ตฌ์กฐ์˜ ํŠธ๋žœ์Šคํฌ๋จธ๋ฅผ ์ด์šฉํ•˜์—ฌ ์ €์กฐ๋„ ์˜์ƒ์„ ๊ฐœ์„ ํ•œ๋‹ค(Wang et al., 2023). EnlightenGAN์€ GAN(Generative Adversarial Networks) ๊ธฐ๋ฐ˜์˜ ์ €์กฐ๋„ ์˜์ƒ๊ฐ•ํ™” ๊ธฐ๋ฒ•์œผ๋กœ ์•ž์„  ๊ธฐ๋ฒ•๋“ค๊ณผ ๋‹ฌ๋ฆฌ ํ•™์Šต์„ ์œ„ํ•ด ์ €์กฐ๋„ ์˜์ƒ๊ณผ ์ผ๋ฐ˜ ์กฐ๋„ ์˜์ƒ ์Œ์ด ํ•„์š” ์—†๋‹ค. ์–ดํ…์…˜ U-net ์ƒ์„ฑ์ž๋ฅผ ํ†ตํ•ด ์–ด๋‘์šด ์˜์—ญ์„ ๊ฐ•์กฐํ•˜๊ณ , ์ „์—ญ-๊ตญ์†Œ ํŒ๋ณ„์ž๋ฅผ ์‚ฌ์šฉํ•˜์—ฌ ๋‹ค์–‘ํ•œ ์กฐ๋ช… ์กฐ๊ฑด์„ ๊ฐ€์ง„ ์ž…๋ ฅ ์˜์ƒ ํ’ˆ์งˆ์„ ํšจ๊ณผ์ ์œผ๋กœ ๊ฐœ์„ ํ•œ๋‹ค(Jiang et al., 2021). Zero-DCE๋Š” ๋ชจ๋ธ ํ•™์Šต ์‹œ ์ฐธ์กฐ ์˜์ƒ์„ ํ•„์š”๋กœ ํ•˜์ง€ ์•Š์œผ๋ฉฐ, ์ž…๋ ฅ ์˜์ƒ์œผ๋กœ๋ถ€ํ„ฐ ์ฑ„๋„๋ณ„ ํ”ฝ์…€ ๋‹จ์œ„์˜ ์ตœ์  ๊ณก์„ ์„ ์ถ”์ •ํ•˜๊ณ  ๋ฐ˜๋ณต์ ์ธ ํ”ฝ์…€ ๋‹จ์œ„์˜ ์กฐ์ •์„ ์ˆ˜ํ–‰ํ•˜์—ฌ ์ €์กฐ๋„ ์˜์ƒ์„ ๊ฐœ์„ ํ•œ๋‹ค(Guo et al., 2020).
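์ฐธ๊ณ ๋กœ, Zero-DCE์˜ ํ•ต์‹ฌ์ธ ๊ณก์„  ์กฐ์ •์€ ํ”ฝ์…€๊ฐ’ x์— ๋Œ€ํ•ด LE(x) = x + α·x(1−x)๋ฅผ ๋ฐ˜๋ณต ์ ์šฉํ•˜๋Š” ์—ฐ์‚ฐ์ด๋‹ค. ์•„๋ž˜๋Š” ์‹ค์ œ ๋ชจ๋ธ์ด ํ”ฝ์…€ยท์ฑ„๋„๋ณ„๋กœ ์ถ”์ •ํ•˜๋Š” ๊ณก์„  ํŒŒ๋ผ๋ฏธํ„ฐ α๋ฅผ ์ „์—ญ ์ƒ์ˆ˜๋กœ ๊ฐ€์ •ํ•œ ๊ฐ„๋‹จํ•œ NumPy ์Šค์ผ€์น˜์ด๋ฉฐ, ์›๋ฆฌ ์ดํ•ด๋ฅผ ์œ„ํ•œ ์˜ˆ์‹œ์ผ ๋ฟ ์‹ค์ œ ๊ตฌํ˜„๊ณผ๋Š” ๋‹ค๋ฅด๋‹ค.

```python
import numpy as np

def zero_dce_curve(img, alpha, iterations=8):
    """Zero-DCE ์Šคํƒ€์ผ์˜ ๊ณก์„  ์กฐ์ • LE(x) = x + alpha*x*(1-x)๋ฅผ ๋ฐ˜๋ณต ์ ์šฉํ•œ๋‹ค.

    img: [0, 1] ๋ฒ”์œ„๋กœ ์ •๊ทœํ™”๋œ ์˜์ƒ ๋ฐฐ์—ด, alpha: ๊ณก์„  ํŒŒ๋ผ๋ฏธํ„ฐ(-1~1).
    ์‹ค์ œ Zero-DCE๋Š” alpha๋ฅผ ํ”ฝ์…€ยท์ฑ„๋„๋ณ„๋กœ DCE-Net์ด ์ถ”์ •ํ•˜์ง€๋งŒ,
    ์—ฌ๊ธฐ์„œ๋Š” ์„ค๋ช…์„ ์œ„ํ•ด ์ „์—ญ ์ƒ์ˆ˜๋กœ ๊ฐ€์ •ํ•œ๋‹ค.
    """
    x = img.astype(np.float64)
    for _ in range(iterations):
        x = x + alpha * x * (1.0 - x)   # x=0, x=1์€ ๊ณ ์ •์ , ์ค‘๊ฐ„ ๋ฐ๊ธฐ๋งŒ ์ฆํญ
    return np.clip(x, 0.0, 1.0)

dark = np.full((2, 2), 0.1)             # ์–ด๋‘์šด ํ”ฝ์…€ ์˜ˆ์‹œ
bright = zero_dce_curve(dark, alpha=0.6)
print(bright[0, 0] > 0.1)               # ์–ด๋‘์šด ์˜์—ญ์˜ ๋ฐ๊ธฐ๊ฐ€ ์ฆ๊ฐ€ํ•œ๋‹ค
```

alpha > 0์ด๋ฉด (0, 1) ๊ตฌ๊ฐ„์˜ ํ”ฝ์…€๊ฐ’์ด ๋ฐ˜๋ณต๋งˆ๋‹ค ๋ฐ์•„์ง€๋˜, 0๊ณผ 1์€ ๊ทธ๋Œ€๋กœ ์œ ์ง€๋˜๋ฏ€๋กœ ํฌํ™” ์—†์ด ์ €์กฐ๋„ ์˜์—ญ๋งŒ ๋ณด์ •๋œ๋‹ค.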

์ €์กฐ๋„ ์˜์ƒ๊ฐ•ํ™” ๊ธฐ๋ฒ•์˜ ํ’ˆ์งˆ ๊ฐœ์„  ํšจ๊ณผ๋ฅผ ์ •๋Ÿ‰์ ์œผ๋กœ ํ‰๊ฐ€ํ•˜๊ธฐ ์œ„ํ•ด ์˜์ƒ ํ‰๊ฐ€ ์ง€ํ‘œ๋ฅผ ์‚ฌ์šฉํ•˜์˜€๋‹ค. ์˜์ƒ ํ’ˆ์งˆ ํ‰๊ฐ€ ๋ฐฉ๋ฒ•์—๋Š” ์ „์ฒด ์ฐธ์กฐ(Full-Reference), ๋ฌด ์ฐธ์กฐ(No-Reference), ๋ถ€๋ถ„ ์ฐธ์กฐ(Reduced-Reference) ๋ฐฉ์‹์ด ์žˆ๋‹ค(Bosse et al., 2017). ์ „์ฒด ์ฐธ์กฐ ๋ฐฉ์‹์€ ๊ธฐ์ค€ ์˜์ƒ์„ ์ฐธ์กฐํ•˜์—ฌ ๋Œ€์ƒ ์˜์ƒ์˜ ํ’ˆ์งˆ์„ ํ‰๊ฐ€ํ•˜๋ฉฐ, ๋ฌด ์ฐธ์กฐ ๋ฐฉ์‹์€ ๊ธฐ์ค€ ์˜์ƒ ์—†์ด ๋Œ€์ƒ ์˜์ƒ๋งŒ์„ ์ด์šฉํ•˜์—ฌ ํ’ˆ์งˆ์„ ํ‰๊ฐ€ํ•œ๋‹ค. ๋ถ€๋ถ„ ์ฐธ์กฐ ๋ฐฉ์‹์€ ์ „์ฒด ์ฐธ์กฐ ๋ฐ ๋ฌด ์ฐธ์กฐ ๋ฐฉ์‹์˜ ์ค‘๊ฐ„ ๋‹จ๊ณ„๋กœ, ๊ธฐ์ค€ ์˜์ƒ์˜ ์ผ๋ถ€ ์ •๋ณด๋ฅผ ์ด์šฉํ•˜์—ฌ ๋Œ€์ƒ ์˜์ƒ์˜ ํ’ˆ์งˆ์„ ํ‰๊ฐ€ํ•œ๋‹ค. ์•ผ๊ฐ„ ์ˆœ์ฐฐ ๋กœ๋ด‡์˜ ๊ฒฝ์šฐ ๊ธฐ์ค€ ์˜์ƒ์„ ์ทจ๋“ํ•˜๊ธฐ ์–ด๋ ค์šฐ๋ฏ€๋กœ, ๋ณธ ์—ฐ๊ตฌ์—์„œ๋Š” ๊ธฐ์ค€ ์˜์ƒ์ด ํ•„์š”์—†๋Š” ๋ฌด ์ฐธ์กฐ ๋ฐฉ์‹์˜ BRISQUE (Blind/Referenceless Image Spatial Quality Evaluator)์™€ NIQE(Natural Image Quality Evaluator) ์ง€ํ‘œ๋ฅผ ์ด์šฉํ•˜์˜€๋‹ค. BRISQUE์™€ NIQE๋Š” ์™œ๊ณก์ด ์—†๋Š” ๊นจ๋—ํ•œ ์˜์ƒ์˜ ํ†ต๊ณ„์  ํŠน์„ฑ๊ณผ ๋Œ€์ƒ ์˜์ƒ์˜ ํ†ต๊ณ„์  ํŠน์„ฑ์„ ๋น„๊ตํ•˜์—ฌ ์˜์ƒ์˜ ํ’ˆ์งˆ์„ ํ‰๊ฐ€ํ•˜๋ฉฐ, ๊ฐ’์ด ์ž‘์„์ˆ˜๋ก ์–‘์งˆ์˜ ์˜์ƒ์ž„์„ ์˜๋ฏธํ•œ๋‹ค. BRISQUE๋Š” ์˜์ƒ์— MSCN(Mean Subtraction and Contrast Normalization) ์ฒ˜๋ฆฌ๋ฅผ ํ†ตํ•ด ๊ณ„์‚ฐ๋œ ํŠน์„ฑ์„ ์‚ฌ์ „ ํ•™์Šต๋œ SVR(Support Vector Regressor)์— ์ž…๋ ฅํ•˜์—ฌ ์˜์ƒ์˜ ํ’ˆ์งˆ์„ ์˜ˆ์ธกํ•œ๋‹ค(Mittal et al., 2012a). NIQE๋Š” MSCN ์ฒ˜๋ฆฌ ํ›„, ์ผ์ •ํ•œ ํฌ๊ธฐ์˜ ํŒจ์น˜๋กœ๋ถ€ํ„ฐ ํ†ต๊ณ„์  ํŠน์„ฑ์ธ ๋ฒกํ„ฐ์™€ ๊ณต๋ถ„์‚ฐ์„ ๊ณ„์‚ฐํ•œ๋‹ค. ์ดํ›„ ์ž์—ฐ ์˜์ƒ์—์„œ ๋„์ถœ๋œ ๋ฒกํ„ฐ ๋ฐ ๊ณต๋ถ„์‚ฐ๊ณผ์˜ ์œ ์‚ฌ๋„๋ฅผ ๊ณ„์‚ฐํ•˜์—ฌ ์ž…๋ ฅ ์˜์ƒ์˜ ํ’ˆ์งˆ์„ ์˜ˆ์ธกํ•œ๋‹ค(Mittal et al., 2012b).

2.3 YOLOv8n-seg ๊ธฐ๋ฐ˜ ๊ฐ์ฒด ํƒ์ง€ ๋ฐ ํ‰๊ฐ€ ์ง€ํ‘œ

YOLO๋Š” ๋Œ€ํ‘œ์ ์ธ ๋‹จ์ผ ๋‹จ๊ณ„ ๊ฒ€์ถœ๊ธฐ(one-stage detector)๋กœ, ์˜์ƒ์—์„œ ๊ฐ์ฒด์˜ ์œ„์น˜์™€ ์ข…๋ฅ˜๋ฅผ ๋™์‹œ์— ๊ณ„์‚ฐํ•˜์—ฌ ์‹ค์‹œ๊ฐ„ ๊ฐ์ฒด ํƒ์ง€์— ์ ํ•ฉํ•˜๋‹ค. YOLO ๋ชจ๋ธ์˜ ๋„คํŠธ์›Œํฌ๋Š” ๋ฐฑ๋ณธ(backbone), ๋„ฅ(neck), ํ—ค๋“œ(head)์˜ ์„ธ ๊ฐ€์ง€ ์š”์†Œ๋กœ ๊ตฌ์„ฑ๋œ๋‹ค. ๋ฐฑ๋ณธ์€ ํ•ฉ์„ฑ๊ณฑ ์—ฐ์‚ฐ์„ ํ†ตํ•ด ์ž…๋ ฅ ์˜์ƒ์—์„œ ๋‹ค์–‘ํ•œ ์Šค์ผ€์ผ์˜ ํŠน์ง•์„ ์ถ”์ถœํ•˜๊ณ , ๋„ฅ์€ ๋‹ค์–‘ํ•œ ์Šค์ผ€์ผ์˜ ๊ฐ์ฒด ํƒ์ง€๋ฅผ ์œ„ํ•ด ํŠน์ง•์„ ๋ณ‘ํ•ฉํ•œ๋‹ค. ํ—ค๋“œ๋Š” ์ตœ์ข…์ ์œผ๋กœ ๊ฐ์ฒด์˜ ์œ„์น˜์™€ ํด๋ž˜์Šค ํ™•๋ฅ ์„ ์˜ˆ์ธกํ•œ๋‹ค. YOLOv8์€ YOLOv5์™€ ์œ ์‚ฌํ•œ ๋ฐฑ๋ณธ์„ ํ™œ์šฉํ•˜๋ฉฐ, ๋„คํŠธ์›Œํฌ ๊ตฌ์กฐ๋ฅผ ๊ฐœ์„ ํ•˜๊ณ  ์•ต์ปค ํ”„๋ฆฌ(anchor-free) ๋ฐฉ์‹์„ ์‚ฌ์šฉํ•จ์œผ๋กœ์จ ๊ธฐ์กด ๋ชจ๋ธ์˜ ํƒ์ง€ ์†๋„์™€ ์ •ํ™•๋„๋ฅผ ํฌ๊ฒŒ ํ–ฅ์ƒ์‹œ์ผฐ๋‹ค(Terven et al., 2023). YOLOv8์€ ํ™œ์šฉ ๋ชฉ์ ์— ๋”ฐ๋ผ YOLOv8, YOLOv8-seg, YOLOv8-pose, YOLOv8-obb, YOLOv8-cls๋กœ ๊ตฌ๋ถ„๋˜๋ฉฐ, ๊ฐ ๋ชจ๋ธ์€ ํƒ์ง€, ๋ถ„ํ• , ํฌ์ฆˆ, ๋ฐฉํ–ฅ ํƒ์ง€, ๋ถ„๋ฅ˜์— ํŠนํ™”๋˜์–ด ์žˆ๋‹ค(Ultralytics, 2023). ๋ณธ ์—ฐ๊ตฌ์—์„œ๋Š” ๊ฐ์ฒด์˜ ์œ„์น˜, ํ˜•ํƒœ, ํฌ๊ธฐ๋ฅผ ์ •ํ™•ํ•˜๊ฒŒ ํŒŒ์•…ํ•˜๊ธฐ ์œ„ํ•ด, ๊ฐ์ฒด๋ฅผ ๋ฐ”์šด๋”ฉ ๋ฐ•์Šค ํ˜•ํƒœ๋กœ ํƒ์ง€ํ•˜๊ณ  ๋ถ„๋ฅ˜ํ•˜๋ฉฐ ๋งˆ์Šคํฌ๋ฅผ ์ƒ์„ฑํ•  ์ˆ˜ ์žˆ๋Š” YOLOv8-seg ๋ชจ๋ธ ์ค‘, ์ˆœ์ฐฐ ๋กœ๋ด‡์˜ ์ œํ•œ๋œ ์—ฐ์‚ฐ ์„ฑ๋Šฅ์„ ๊ณ ๋ คํ•˜์—ฌ ํŒŒ๋ผ๋ฏธํ„ฐ ๊ฐœ์ˆ˜์™€ ์—ฐ์‚ฐ๋Ÿ‰์ด ์ ์€ YOLOv8n-seg ๋ชจ๋ธ์„ ์‚ฌ์šฉํ•˜์˜€๋‹ค.

๋ณธ ์—ฐ๊ตฌ์—์„œ๋Š” YOLOv8n-seg ๋ชจ๋ธ์˜ ์„ฑ๋Šฅ ํ‰๊ฐ€ ์ง€ํ‘œ๋กœ ํƒ์ง€ ์ •ํ™•๋„ ์ธก๋ฉด์—์„œ F1 score์™€ AP(Average Precision)๋ฅผ ์‚ฌ์šฉํ•˜์˜€๊ณ , ํƒ์ง€ ์†๋„ ์ธก๋ฉด์—์„œ FPS(Frames Per Second)๋ฅผ ์‚ฌ์šฉํ•˜์˜€๋‹ค. F1 score๋Š” ์ •๋ฐ€๋„(precision)์™€ ์žฌํ˜„์œจ(recall)์˜ ์กฐํ™” ํ‰๊ท ์œผ๋กœ ์ข…ํ•ฉ์ ์ธ ๋ชจ๋ธ ์„ฑ๋Šฅ์„ ๋ณด์—ฌ์ค€๋‹ค(Eq. (1)). ์ •๋ฐ€๋„๋Š” ๋ชจ๋ธ์ด ์–‘์„ฑ์œผ๋กœ ์˜ˆ์ธกํ•œ ๊ฒƒ ์ค‘ ์‹ค์ œ ์–‘์„ฑ์ธ ๊ฒƒ์˜ ๋น„์œจ์ด๋ฉฐ, ์žฌํ˜„์œจ์€ ์‹ค์ œ ์–‘์„ฑ์ธ ๊ฒƒ ์ค‘ ๋ชจ๋ธ์ด ์–‘์„ฑ์œผ๋กœ ์ •ํ™•ํ•˜๊ฒŒ ์˜ˆ์ธกํ•œ ๊ฒƒ์˜ ๋น„์œจ์ด๋‹ค. F1 score๋Š” 0์—์„œ 1 ์‚ฌ์ด์˜ ๊ฐ’์„ ๊ฐ€์ง€๋ฉฐ, 1์— ๊ฐ€๊นŒ์šธ์ˆ˜๋ก ํƒ์ง€ ์ •ํ™•๋„๊ฐ€ ๋†’๋‹ค๋Š” ๊ฒƒ์„ ์˜๋ฏธํ•œ๋‹ค.

(1)
$F1\;Score=2\times\dfrac{Precision\times Recall}{Precision+Recall}$

where,

$Precision=\dfrac{True\;Positive}{True\;Positive+False\;Positive}$,

$Recall=\dfrac{True\;Positive}{True\;Positive+False\;Negative}$
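Eq. (1)์˜ ๊ณ„์‚ฐ ๊ณผ์ •์€ ๋‹ค์Œ๊ณผ ๊ฐ™์ด ํ™•์ธํ•  ์ˆ˜ ์žˆ๋‹ค(TP, FP, FN ๊ฐ’์€ ์˜ˆ์‹œ๋ฅผ ์œ„ํ•œ ๊ฐ€์ •์ด๋‹ค).

```python
def f1_score(tp, fp, fn):
    """Eq. (1)์— ๋”ฐ๋ผ ์ •๋ฐ€๋„, ์žฌํ˜„์œจ, F1 score๋ฅผ ๊ณ„์‚ฐํ•œ๋‹ค."""
    precision = tp / (tp + fp)   # ์–‘์„ฑ ์˜ˆ์ธก ์ค‘ ์‹ค์ œ ์–‘์„ฑ์˜ ๋น„์œจ
    recall = tp / (tp + fn)      # ์‹ค์ œ ์–‘์„ฑ ์ค‘ ์–‘์„ฑ์œผ๋กœ ์˜ˆ์ธก๋œ ๋น„์œจ
    return 2 * precision * recall / (precision + recall)

# ์˜ˆ: TP=80, FP=20, FN=10์ด๋ฉด precision=0.8, recall=8/9
print(round(f1_score(80, 20, 10), 3))  # → 0.842
```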

AP๋Š” ๊ฐ์ฒด ํƒ์ง€ ๋ชจ๋ธ์˜ ์„ฑ๋Šฅ์„ ํ‰๊ฐ€ํ•˜๋Š” ์ง€ํ‘œ๋กœ IoU(Intersection over Union) ์ž„๊ณ„๊ฐ’์„ ์„ค์ •ํ•˜์—ฌ ์‚ฐ์ถœ๋œ๋‹ค. IoU๋Š” ๋ชจ๋ธ์ด ์˜ˆ์ธกํ•œ ์˜์—ญ(predict)๊ณผ ์‹ค์ œ ์˜์—ญ(true) ๊ฐ„์˜ ์ค‘์ฒฉ๋„๋ฅผ ๋‚˜ํƒ€๋‚ด๋ฉฐ, ์˜ˆ์ธก ์˜์—ญ๊ณผ ์‹ค์ œ ์˜์—ญ์˜ ํ•ฉ์ง‘ํ•ฉ ์ค‘ ์˜ˆ์ธก ์˜์—ญ๊ณผ ์‹ค์ œ ์˜์—ญ์˜ ๊ต์ง‘ํ•ฉ์˜ ๋น„์œจ์ด๋‹ค(Eq. (2)).

(2)
$IoU(Predict,\;True)=\dfrac{|Predict\cap True|}{|Predict\cup True|}$

AP50์€ IoU ์ž„๊ณ„๊ฐ’์ด 0.5์ผ ๋•Œ ์ธก์ •ํ•œ ๊ฐ’์ด๋ฉฐ, AP50-95๋Š” IoU ์ž„๊ณ„๊ฐ’์„ 0.5์—์„œ 0.95๊นŒ์ง€ 0.05 ๊ฐ„๊ฒฉ์œผ๋กœ ์ฆ๊ฐ€์‹œํ‚ค๋ฉด์„œ ๊ณ„์‚ฐํ•œ AP ๊ฐ’์˜ ํ‰๊ท ์ด๋‹ค. AP50๊ณผ AP50-95๋Š” 0์—์„œ 1 ์‚ฌ์ด์˜ ๊ฐ’์„ ๊ฐ€์ง€๋ฉฐ, ๊ฐ’์ด ํด์ˆ˜๋ก ๋ชจ๋ธ์˜ ํƒ์ง€ ์ •ํ™•๋„๊ฐ€ ๋†’๋‹ค๋Š” ๊ฒƒ์„ ์˜๋ฏธํ•œ๋‹ค.

FPS๋Š” 1์ดˆ๋‹น ์ฒ˜๋ฆฌํ•  ์ˆ˜ ์žˆ๋Š” ์˜์ƒ ํ”„๋ ˆ์ž„์˜ ๊ฐœ์ˆ˜๋ฅผ ๋‚˜ํƒ€๋‚ด๋ฉฐ, FPS ๊ฐ’์ด ํด์ˆ˜๋ก ํƒ์ง€ ์†๋„๊ฐ€ ๋น ๋ฅด๋‹ค๋Š” ๊ฒƒ์„ ์˜๋ฏธํ•œ๋‹ค. FPS๋Š” ์‘์šฉ ๋ถ„์•ผ์— ๋”ฐ๋ผ ์š”๊ตฌ๋˜๋Š” ์ฒ˜๋ฆฌ ์†๋„๊ฐ€ ๋‹ค์–‘ํ•˜๋ฉฐ, ์˜์ƒ ๊ฐ์‹œ ๋ถ„์•ผ์—์„œ๋Š” ์ผ๋ฐ˜์ ์œผ๋กœ 10~15 FPS๊ฐ€ ์‚ฌ์šฉ๋œ๋‹ค(IPVM, 2021).
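FPS๋Š” ํ”„๋ ˆ์ž„๋‹น ์ฒ˜๋ฆฌ ์‹œ๊ฐ„(์ดˆ)์˜ ์—ญ์ˆ˜๋กœ ํ™˜์‚ฐ๋œ๋‹ค. ์˜ˆ๋ฅผ ๋“ค์–ด Table 6์˜ ์ฒ˜๋ฆฌ ์‹œ๊ฐ„์€ ๋‹ค์Œ๊ณผ ๊ฐ™์ด FPS๋กœ ๋ณ€ํ™˜๋œ๋‹ค.

```python
def to_fps(runtime_s):
    """ํ”„๋ ˆ์ž„๋‹น ์ฒ˜๋ฆฌ ์‹œ๊ฐ„(์ดˆ)์„ FPS๋กœ ํ™˜์‚ฐํ•œ๋‹ค."""
    return round(1.0 / runtime_s, 2)

print(to_fps(0.011))  # → 90.91 (YOLOv8n-seg ๋‹จ๋…)
print(to_fps(0.072))  # → 13.89 (KinD + YOLOv8n-seg)
print(to_fps(0.381))  # → 2.62 (LLFormer + YOLOv8n-seg)
```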

3. ๊ฒฐ๊ณผ ๋ฐ ๋ถ„์„

3.1 ์‹คํ—˜ ํ™˜๊ฒฝ ๊ตฌ์ถ•

๋ณธ ์‹คํ—˜์€ Intel(R) Core(TM) i9-12900F CPU@ 2.40 GHz, 128GB RAM, NVIDIA GeForce RTX 2060 ๊ทธ๋ž˜ํ”ฝ ์นด๋“œ, Windows 10(64bit) ํ™˜๊ฒฝ์—์„œ ์ˆ˜ํ–‰๋˜์—ˆ๋‹ค. ๋จผ์ €, ๋ฐ์ดํ„ฐ ๋ผ๋ฒจ๋ง ํ”Œ๋žซํผ์ธ ๋กœ๋ณดํ”Œ๋กœ์šฐ(Roboflow, 2025)์—์„œ 13,306์žฅ์˜ ์ธ๋ฌผ ์‚ฌ์ง„์„ ์ˆ˜์ง‘ํ•˜๊ณ  ํ•™์Šต ๋ฐ์ดํ„ฐ๋ฅผ ๊ตฌ์ถ•ํ•˜์˜€๋‹ค. ์ˆ˜์ง‘ํ•œ ๋ฐ์ดํ„ฐ๋ฅผ 7:3 ๋น„์œจ๋กœ ๋‚˜๋ˆ„์–ด YOLOv8n-seg ๋ชจ๋ธ์˜ ํ•™์Šต๊ณผ ๊ฒ€์ฆ์— ํ™œ์šฉํ•˜์˜€๋‹ค. ๋ชจ๋ธ ํ•™์Šต ํ•˜์ดํผ ํŒŒ๋ผ๋ฏธํ„ฐ๋กœ ์˜์ƒ ํฌ๊ธฐ๋Š” 640 ร— 640, ๋ฐฐ์น˜ ์‚ฌ์ด์ฆˆ๋Š” 16, ์—ํฌํฌ๋Š” 200ํšŒ๋ฅผ ์„ค์ •ํ•˜์˜€๋‹ค. ํ•™์Šต๋œ ๋ชจ๋ธ์„ ํ‰๊ฐ€ํ•œ ๊ฒฐ๊ณผ, F1-score๋Š” 0.82, AP50์€ 0.85, AP50-95๋Š” 0.61๋กœ ์ธก์ •๋˜์—ˆ๋‹ค.

๊ฐ ์‹คํ—˜ ๊ณต๊ฐ„์—์„œ ๊ด‘ํ•™ ์˜์ƒ์„ ์ทจ๋“ํ•˜๊ธฐ ์œ„ํ•ด 4์กฑ ๋ณดํ–‰ ๋กœ๋ด‡์— ์žฅ์ฐฉ๋œ ๊ด‘ํ•™ ์นด๋ฉ”๋ผ์™€ LED ์กฐ๋ช…์„ ์ด์šฉํ•˜์˜€๋‹ค(Fig. 2). Table 1์€ ์˜์ƒ ์ทจ๋“์— ์‚ฌ์šฉ๋œ ์นด๋ฉ”๋ผ์˜ ์‚ฌ์–‘์„ ๋ณด์—ฌ์ฃผ๋ฉฐ, ์นด๋ฉ”๋ผ์˜ ๋…ธ์ถœ์‹œ๊ฐ„(exposure time), ์ด๋“๊ฐ’(gain), ํ™”์ดํŠธ ๋ฐธ๋Ÿฐ์Šค(white balance)๋Š” ๊ฐ ์‹คํ—˜ ๊ณต๊ฐ„์— ๋”ฐ๋ผ ์ž๋™์œผ๋กœ ์ตœ์ ํ™”๋œ ๊ฐ’์„ ์‚ฌ์šฉํ•˜์˜€๋‹ค. ์‹ค๋‚ด ๊ณต๊ฐ„์€ ์กฐ๋ช…์˜ ์œ„์น˜์™€ ์‚ฌ์šฉ ์—ฌ๋ถ€์— ๋”ฐ๋ผ ์ƒ์ดํ•œ ์กฐ๋„ ํ™˜๊ฒฝ์ด ํ˜•์„ฑ๋˜๋ฉฐ, ๋ฐ์€ ์กฐ๋ช… ํ™˜๊ฒฝ(bright light), ํ๋ฆฟํ•œ ์กฐ๋ช… ํ™˜๊ฒฝ(dim light), ๋ถ€๋ถ„ ์กฐ๋ช… ํ™˜๊ฒฝ(partial light), ์กฐ๋ช…์ด ๊ฑฐ์˜ ์—†๋Š” ํ™˜๊ฒฝ(dark)์œผ๋กœ ๊ตฌ๋ถ„ํ•˜์˜€๋‹ค(Fig. 3).

Fig. 3(a)๋Š” ์‹ค๋‚ด ์กฐ๋ช…์„ ๋ชจ๋‘ ์‚ฌ์šฉํ•˜์—ฌ ๊ฐ€์žฅ ๋ฐ์€ ํ™˜๊ฒฝ์œผ๋กœ, ์‚ฌ๋žŒ์„ ์ธ์‹ํ•˜๊ธฐ ์šฉ์ดํ•˜๋‹ค. Fig. 3(b)๋Š” Fig. 3(a)๋ณด๋‹ค ์กฐ๋„๊ฐ€ ๋‚ฎ์ง€๋งŒ, ๋ฒฝ์— ๋ฐ˜์‚ฌ๋œ LED ์กฐ๋ช…์œผ๋กœ ์ธํ•ด ๊ทผ๊ฑฐ๋ฆฌ์— ์žˆ๋Š” ์‚ฌ๋žŒ์„ ์‰ฝ๊ฒŒ ๊ด€์ฐฐํ•  ์ˆ˜ ์žˆ๋‹ค. Fig. 3(c)๋Š” ์–ด๋‘ก์ง€๋งŒ, ๋ถ€๋ถ„์ ์œผ๋กœ ๋ฐ์€ ๊ณต๊ฐ„์ด ์กด์žฌํ•˜๋ฉฐ, ์‚ฌ๋žŒ์˜ ํ˜•ํƒœ๋ฅผ ํŒŒ์•…ํ•  ์ˆ˜ ์žˆ์œผ๋‚˜ ๊ฒฝ๊ณ„๊ฐ€ ํ๋ฆฟํ•˜๋‹ค. Fig. 3(d)๋Š” ๊ฐ€์žฅ ์–ด๋‘์šฐ๋ฉด์„œ ์‚ฌ๋žŒ์˜ ํฌ๊ธฐ๊ฐ€ ์ž‘๊ฒŒ ๋‚˜ํƒ€๋‚˜, ํ˜•ํƒœ์™€ ์œ„์น˜๋ฅผ ๋ช…ํ™•ํ•˜๊ฒŒ ํŒ๋ณ„ํ•˜๊ธฐ ์–ด๋ ต๋‹ค.

Fig. 2. A Quadruped Robot Equipped with a Camera


Fig. 3. Test Sites. (a) Bright Light, (b) Dim Light, (c) Partial Light, (d) Dark


Table 1. Specifications of the Camera

Camera Model: ZED 2i
Sensor Type: Dual 1/3" 4MP CMOS
Channel: R / G / B
Pixel Size: 2 μm × 2 μm
Resolution (W × H): 672 × 376

3.2 ์ €์กฐ๋„ ์˜์ƒ์˜ ์‹œ์ธ์„ฑ ๊ฐœ์„  ํ‰๊ฐ€

Table 2๋Š” ์ €์กฐ๋„ ์˜์ƒ๊ฐ•ํ™” ๊ธฐ๋ฒ• ์ ์šฉ ๊ฒฐ๊ณผ๋ฅผ ๋ณด์—ฌ์ค€๋‹ค. ์›๋ณธ ์˜์ƒ์€ ์กฐ๋„๊ฐ€ ๋‚ฎ์•„์ง์— ๋”ฐ๋ผ ๋ฐ๊ธฐ์™€ ๋Œ€๋น„๊ฐ€ ๊ฐ์†Œํ•˜์—ฌ ์‚ฌ๋žŒ์„ ์‹๋ณ„ํ•˜๊ธฐ ์–ด๋ ค์› ์œผ๋ฉฐ, ์ผ๋ถ€ ์–ด๋‘์šด ์˜์—ญ์—์„œ๋Š” ๋ฏธ์„ธํ•œ ๋ถ‰์€์ƒ‰ ๋…ธ์ด์ฆˆ๊ฐ€ ๊ด€์ฐฐ๋˜์—ˆ๋‹ค. ์ €์กฐ๋„ ์˜์ƒ๊ฐ•ํ™” ๊ธฐ๋ฒ•์„ ์ ์šฉํ•œ ๊ฒฐ๊ณผ, ์–ด๋‘์šด ์˜์—ญ์˜ ํ”ฝ์…€๊ฐ’์ด ์ฆํญ๋˜์–ด ๋‹ค์ˆ˜์˜ ์˜์ƒ์—์„œ ๋ถ‰์€์ƒ‰ ๋…ธ์ด์ฆˆ๊ฐ€ ๋‘๋“œ๋Ÿฌ์ง€๊ฒŒ ๋‚˜ํƒ€๋‚ฌ์œผ๋‚˜, ๋ฐ๊ธฐ์™€ ๋Œ€๋น„๊ฐ€ ๊ฐœ์„ ๋˜์–ด ๊ฐ€์‹œ๊ฑฐ๋ฆฌ๊ฐ€ ์ฆ๊ฐ€ํ•จ์— ๋”ฐ๋ผ ์‚ฌ๋žŒ ์ธ์‹์ด ์ˆ˜์›”ํ•ด์กŒ๋‹ค. GLADNet์€ ์˜์ƒ์˜ ์ƒ‰์ƒ๊ณผ ๋ฐ๊ธฐ๋ฅผ ๋™์‹œ์— ๊ฐœ์„ ํ•˜๋Š” ๋ฐ ํšจ๊ณผ์ ์ด์—ˆ์œผ๋‚˜, ์–ด๋‘์šด ์˜์ƒ์˜ ๋ถ‰์€์ƒ‰ ๋…ธ์ด์ฆˆ๋ฅผ ์ฆํญ์‹œ์ผฐ๋‹ค. KinD๋Š” ๋‹ค๋ฅธ ๊ธฐ๋ฒ•๋ณด๋‹ค ๋…ธ์ด์ฆˆ๊ฐ€ ์ ๊ณ , ๋ฐ๊ธฐ์™€ ์ƒ‰์ƒ์„ ๊ท ํ˜• ์žˆ๊ฒŒ ๊ฐœ์„ ํ•˜์˜€๋‹ค. TBEFN์€ ์–ด๋‘์šด ์˜์ƒ์˜ ๋ฐ๊ธฐ์™€ ๋Œ€๋น„๋ฅผ ๊ณ ๋ฅด๊ฒŒ ํ–ฅ์ƒ์‹œ์ผฐ์œผ๋ฉฐ, ๋ฐ์€ ์˜์ƒ์˜ ํ”ฝ์…€๊ฐ’ ๋˜ํ•œ ํฌ๊ฒŒ ์ฆ๊ฐ€์‹œ์ผฐ๋‹ค. LLFormer๋Š” ์ƒ‰์ƒ ๋ณต์› ํšจ๊ณผ๊ฐ€ ๋›ฐ์–ด๋‚ฌ์œผ๋‚˜, ๋ฐ๊ธฐ ๊ฐœ์„ ์— ํ•œ๊ณ„๊ฐ€ ์žˆ๋Š” ๊ฒƒ์œผ๋กœ ๊ด€์ฐฐ๋˜์—ˆ๋‹ค. EnlightenGAN์€ ์ €์กฐ๋„ ์˜์ƒ์˜ ๋ฐ๊ธฐ๋ฅผ ์ฆ๊ฐ€์‹œ์ผฐ์œผ๋‚˜, ์ง€๋‚˜์น˜๊ฒŒ ๊ฐ•ํ™”ํ•˜์—ฌ ์ƒ‰์ƒ์„ ์™œ๊ณก์‹œ์ผฐ๋‹ค. Zero-DCE๋Š” ์‹œ์ธ์„ฑ์„ ํ–ฅ์ƒ์‹œ์ผฐ์œผ๋‚˜, ๋ฐ๊ธฐ์™€ ๋Œ€๋น„๋ฅผ ๊ณผ๋„ํ•˜๊ฒŒ ์ฆ๊ฐ€์‹œ์ผœ ๋ถ€์ž์—ฐ์Šค๋Ÿฌ์šด ์˜์ƒ์„ ์ƒ์„ฑํ•˜์˜€๋‹ค.

Table 3์€ ์ €์กฐ๋„ ๊ฐ•ํ™” ์˜์ƒ์˜ ํ‰๊ท  ์˜์ƒ ํ‰๊ฐ€ ์ง€ํ‘œ๋ฅผ ๋ณด์—ฌ์ค€๋‹ค. BRISQUE์™€ NIQE์—์„œ Zero-DCE๋ฅผ ์ œ์™ธํ•œ ๋ชจ๋“  ๊ฐ•ํ™” ์˜์ƒ์˜ ํ’ˆ์งˆ์ด ๊ฐœ์„ ๋œ ๊ฒƒ์œผ๋กœ ๋‚˜ํƒ€๋‚ฌ๋‹ค. BRISQUE์—์„œ ์›๋ณธ ์˜์ƒ์€ 29.04์ธ ๋ฐ˜๋ฉด, TBEFN ์˜์ƒ์€ 20.24๋กœ ๊ฐ€์žฅ ๋‚ฎ์€ ๊ฐ’์ด ์‚ฐ์ถœ๋˜์–ด ์˜์ƒ ํ’ˆ์งˆ์ด ๊ฐ€์žฅ ํฌ๊ฒŒ ๊ฐœ์„ ๋œ ๊ฒƒ์œผ๋กœ ๋‚˜ํƒ€๋‚ฌ๋‹ค. EnlightenGAN๊ณผ KinD ์˜์ƒ์€ ๊ฐ๊ฐ 20.73๊ณผ 21.81๋กœ TBEFN์— ์ด์–ด ์šฐ์ˆ˜ํ•œ ์˜์ƒ ํ’ˆ์งˆ์„ ๋ณด์˜€๋‹ค. ํ•˜์ง€๋งŒ, ์œก์•ˆ ๋ถ„์„์—์„œ ๋ถ€์ž์—ฐ์Šค๋Ÿฌ์šด ์˜์ƒ์ด ๊ด€์ฐฐ๋œ EnlightenGAN์ด BRISQUE์—์„œ ๋‘ ๋ฒˆ์งธ๋กœ ์šฐ์ˆ˜ํ•œ ๊ธฐ๋ฒ•์œผ๋กœ ๋‚˜ํƒ€๋‚ฌ๋‹ค. ์ด๋Š” BRISQUE๊ฐ€ ์‚ฌ๋žŒ์˜ ์‹œ๊ฐ์  ํ’ˆ์งˆ ํŒ๋‹จ๊ณผ ๋‹ค์†Œ ์ฐจ์ด๊ฐ€ ์žˆ์Œ์„ ๋ณด์—ฌ์ค€๋‹ค. ํ•œํŽธ, NIQE์—์„œ๋Š” KinD ์˜์ƒ์ด 5.27๋กœ ๊ฐ€์žฅ ๋‚ฎ์€ ๊ฐ’์ด ์‚ฐ์ถœ๋˜์–ด ์˜์ƒ ํ’ˆ์งˆ์ด ๊ฐ€์žฅ ํšจ๊ณผ์ ์œผ๋กœ ๊ฐœ์„ ๋œ ๊ฒƒ์œผ๋กœ ๋‚˜ํƒ€๋‚ฌ๋‹ค. LLFormer์™€ TBEFN ์˜์ƒ์€ ๊ฐ๊ฐ 6.10, 6.79๋กœ ๊ณ„์‚ฐ๋˜์–ด KinD ์˜์ƒ์— ์ด์–ด ์˜์ƒ ํ’ˆ์งˆ์ด ๊ณ ๋ฅด๊ฒŒ ๊ฐœ์„ ๋œ ๊ฒƒ์œผ๋กœ ๋ถ„์„๋˜์—ˆ๋‹ค.

์ €์กฐ๋„ ์˜์ƒ์˜ ์‹œ์ธ์„ฑ ๊ฐœ์„ ์— ๋”ฐ๋ฅธ ์‚ฌ๋žŒ ํƒ์ง€ ์„ฑ๋Šฅ ํ–ฅ์ƒ์„ ํ‰๊ฐ€ํ•˜๊ธฐ ์œ„ํ•ด KinD, TBEFN, LLFormer ์˜์ƒ์„ ์„ ์ •ํ•˜์˜€๋‹ค. KinD์™€ TBEFN์€ ์œก์•ˆ ๋ถ„์„๊ณผ ํ‰๊ฐ€ ์ง€ํ‘œ์—์„œ ๋ชจ๋‘ ์šฐ์ˆ˜ํ•œ ๊ฒฐ๊ณผ๋ฅผ ๋ณด์ด๋ฉฐ, ๋‹ค์–‘ํ•œ ์ €์กฐ๋„ ํ™˜๊ฒฝ์—์„œ ๊ฐ•์ธํ•œ ์„ฑ๋Šฅ์„ ๋‚˜ํƒ€๋ƒˆ๋‹ค. LLFormer๋Š” ๋‘ ๊ธฐ๋ฒ•์— ์ด์–ด ์šฐ์ˆ˜ํ•œ ์˜์ƒ ํ’ˆ์งˆ ์ง€ํ‘œ๊ฐ€ ์‚ฐ์ถœ๋˜์—ˆ์œผ๋ฉฐ, ์œก์•ˆ ๋ถ„์„์—์„œ ์ €์กฐ๋„ ์˜์ƒ์˜ ์ƒ‰์ƒ์„ ํšจ๊ณผ์ ์œผ๋กœ ๋ณต์›ํ•˜๋Š” ํŠน์ง•์„ ๋ณด์˜€๋‹ค.

Table 2. Low-Light Image Enhancement Results


Table 3. Image Quality Metrics for the Raw Image and Enhanced Images

Methods         BRISQUE (rank)   NIQE (rank)
Raw             29.04 (6)         9.68 (6)
GLADNet         27.27 (5)         7.42 (4)
KinD            21.81 (3)         5.27 (1)
TBEFN           20.24 (1)         6.79 (3)
LLFormer        22.73 (4)         6.10 (2)
EnlightenGAN    20.73 (2)         7.60 (5)
Zero-DCE        29.48 (7)        10.39 (7)

3.3 ์ €์กฐ๋„ ๊ฐ•ํ™” ์˜์ƒ ๊ธฐ๋ฐ˜ ์‚ฌ๋žŒ ํƒ์ง€ ์„ฑ๋Šฅ ํ‰๊ฐ€

์ €์กฐ๋„ ๊ฐ•ํ™” ์˜์ƒ์˜ ์‹œ์ธ์„ฑ ๊ฐœ์„ ์— ๋”ฐ๋ฅธ ์‚ฌ๋žŒ ํƒ์ง€ ์„ฑ๋Šฅ ํ–ฅ์ƒ์„ ํ‰๊ฐ€ํ•˜๊ธฐ ์œ„ํ•ด KinD, TBEFN, LLFormer ์˜์ƒ์—์„œ F1-score, AP50, AP50-95๋ฅผ ๋น„๊ต ๋ถ„์„ํ•˜์˜€๋‹ค. Table 4๋Š” ์›๋ณธ ์˜์ƒ๊ณผ ๊ฐ•ํ™” ์˜์ƒ์—์„œ์˜ YOLOv8n-seg ๋ชจ๋ธ ์ ์šฉ ๊ฒฐ๊ณผ๋ฅผ ๋ณด์—ฌ์ค€๋‹ค. F1 score์—์„œ ์›๋ณธ ์˜์ƒ์€ 72.7 %์ธ ๋ฐ˜๋ฉด, KinD์™€ LLFormer์—์„œ๋Š” ๊ฐ๊ฐ 96.3 %์™€ 92.3 %๊ฐ€ ์‚ฐ์ถœ๋˜์–ด ํƒ์ง€ ์ •ํ™•๋„๊ฐ€ ํฌ๊ฒŒ ํ–ฅ์ƒ๋˜์—ˆ์Œ์„ ๋ณด์—ฌ์ค€๋‹ค. ํ•˜์ง€๋งŒ, TBEFN์€ 72.7 %๋กœ, ์›๋ณธ ์˜์ƒ๊ณผ ๋™์ผํ•œ ์„ฑ๋Šฅ์„ ๋ณด์—ฌ์ฃผ์—ˆ๋‹ค. AP50์—์„œ KinD๋Š” 96.4 %๊ฐ€ ์‚ฐ์ถœ๋˜์–ด ๊ฐ€์žฅ ๋†’์€ ์ •ํ™•๋„๋ฅผ ๋ณด์˜€๊ณ , LLFormer๋Š” 92.8 %๊ฐ€ ์‚ฐ์ถœ๋˜์–ด ๋‘ ๋ฒˆ์งธ๋กœ ๋†’์€ ์ •ํ™•๋„๋ฅผ ๋ณด์˜€๋‹ค. ๋ฐ˜๋ฉด, TBEFN์€ ํƒ์ง€ ์„ฑ๋Šฅ์ด ์œ ์ง€๋œ ๊ฒƒ์œผ๋กœ ๋‚˜ํƒ€๋‚ฌ๋‹ค. AP50-95์—์„œ KinD์™€ LLFormer๋Š” 69.5 %, 64.3 %๋กœ ๋†’์€ ํƒ์ง€ ์ •ํ™•๋„๊ฐ€ ์‚ฐ์ถœ๋˜์—ˆ์œผ๋‚˜, TBEFN์€ 60.5 %๋กœ ์›๋ณธ ์˜์ƒ(61.2 %)๋ณด๋‹ค ๋‚ฎ์€ ํƒ์ง€ ์ •ํ™•๋„๊ฐ€ ์‚ฐ์ถœ๋˜์—ˆ๋‹ค.

Table 5๋Š” YOLOv8n-seg ๋ชจ๋ธ ์ ์šฉ ๊ฒฐ๊ณผ๋ฅผ ์‹œ๊ฐ์ ์œผ๋กœ ๋ณด์—ฌ์ค€๋‹ค. ์›๋ณธ ์˜์ƒ์˜ ๊ฒฝ์šฐ ๋ฐ์€ ํ™˜๊ฒฝ์—์„œ๋Š” ์‚ฌ๋žŒ์˜ ๋ฐ”์šด๋”ฉ ๋ฐ•์Šค์™€ ๋งˆ์Šคํฌ๋ฅผ ์ •ํ™•ํžˆ ํƒ์ง€ํ•  ์ˆ˜ ์žˆ์—ˆ์œผ๋‚˜, ์กฐ๋„๊ฐ€ ๋‚ฎ์•„์งˆ์ˆ˜๋ก ๊ฐ์ฒด ๊ฐ„ ๊ฒฝ๊ณ„๊ฐ€ ํ๋ฆฟํ•˜์—ฌ ์‚ฌ๋žŒ์„ ์ •ํ™•ํžˆ ํƒ์ง€ํ•˜์ง€ ๋ชปํ•˜์˜€๋‹ค. ๋ฐ˜๋ฉด, KinD์™€ LLFormer๋Š” ์ €์กฐ๋„ ์˜์ƒ์˜ ์‹œ์ธ์„ฑ์„ ๊ฐœ์„ ํ•จ์œผ๋กœ์จ ๋‚ฎ์€ ์กฐ๋„ ํ™˜๊ฒฝ์—์„œ๋„ ๋†’์€ ํƒ์ง€ ์„ฑ๋Šฅ์„ ๋‚˜ํƒ€๋ƒˆ๊ณ , ์‹ค์ œ ์‚ฌ๋žŒ์˜ ํ˜•์ƒ๊ณผ ์œ ์‚ฌํ•œ ๋งˆ์Šคํฌ๋ฅผ ์ƒ์„ฑํ•˜์˜€๋‹ค. ํŠนํžˆ, ๊ฐ€์žฅ ์–ด๋‘์šด ํ™˜๊ฒฝ์—์„œ ์›๋ณธ ์˜์ƒ์€ ์‚ฌ๋žŒ์„ ํƒ์ง€ํ•˜์ง€ ๋ชปํ•˜์˜€์œผ๋‚˜, KinD์™€ LLFormer ์˜์ƒ์€ ์˜์ž์™€ ์ค‘์ฒฉ๋œ ์‚ฌ๋žŒ์„ ์ œ์™ธํ•œ ๋ชจ๋“  ์‚ฌ๋žŒ์„ ์ •ํ™•ํ•˜๊ฒŒ ํƒ์ง€ํ•˜์˜€๋‹ค. ํ•˜์ง€๋งŒ, TBEFN ์˜์ƒ์€ ๋‚ฎ์€ ํƒ์ง€ ์„ฑ๋Šฅ์ด ๋‚˜ํƒ€๋‚ฌ๋‹ค. ๋ฐ์€ ํ™˜๊ฒฝ์—์„œ๋Š” ๊ณผ๋„ํ•œ ๊ฐ•ํ™”๋กœ ์ธํ•ด ์˜์ƒ์˜ ์„ธ๋ถ€ ์ •๋ณด๊ฐ€ ์†์‹ค๋˜์–ด ์‚ฌ๋žŒ ํ•œ ๋ช…์„ ํƒ์ง€ํ•˜์ง€ ๋ชปํ•˜์˜€๊ณ , ๊ฐ€์žฅ ์–ด๋‘์šด ํ™˜๊ฒฝ์—์„œ๋Š” ์‹œ์ธ์„ฑ์ด ๊ฐœ์„ ๋˜์—ˆ์Œ์—๋„ ์‚ฌ๋žŒ ํƒ์ง€์— ์‹คํŒจํ•˜์˜€๋‹ค. ์ด๋Š” ์˜์ƒ์˜ ์‹œ์ธ์„ฑ์„ ๊ฐœ์„ ํ•˜๋Š” ๊ณผ์ •์—์„œ ์†์‹ค๋œ ์˜์ƒ์˜ ์„ธ๋ถ€ ์ •๋ณด๊ฐ€ YOLOv8n-seg ๋ชจ๋ธ์˜ ํƒ์ง€ ์„ฑ๋Šฅ์— ๋ถ€์ •์ ์ธ ์˜ํ–ฅ์„ ๋ฏธ์นœ ๊ฒƒ์œผ๋กœ ํŒ๋‹จ๋˜๋ฉฐ, ๊ธฐ๋ฒ•์— ๋”ฐ๋ผ ํƒ์ง€ ์„ฑ๋Šฅ์ด ํฌ๊ฒŒ ๋‹ฌ๋ผ์งˆ ์ˆ˜ ์žˆ์Œ์„ ๋ณด์—ฌ์ค€๋‹ค.

์‹คํ—˜ ๊ฒฐ๊ณผ KinD๊ฐ€ ๋ชจ๋“  ์ง€ํ‘œ(F1 score, AP50, AP50-95)์—์„œ ๊ฐ€์žฅ ์šฐ์ˆ˜ํ•œ ์„ฑ๋Šฅ์„ ๋‚˜ํƒ€๋ƒˆ๊ณ , LLFormer๊ฐ€ ๋‘ ๋ฒˆ์งธ๋กœ ์šฐ์ˆ˜ํ•œ ํƒ์ง€ ์„ฑ๋Šฅ์„ ๋‚˜ํƒ€๋ƒˆ๋‹ค. ๋‘ ๊ธฐ๋ฒ•์€ ๊ฑด๋ฌผ ๋‚ด๋ถ€์˜ ์•ผ๊ฐ„ ์กฐ๋„ ํ™˜๊ฒฝ์—์„œ YOLOv8n-seg ๋ชจ๋ธ์˜ ํƒ์ง€ ์„ฑ๋Šฅ์„ ํšจ๊ณผ์ ์œผ๋กœ ๊ฐœ์„ ํ•œ ๊ฒƒ์œผ๋กœ ํ™•์ธ๋˜์—ˆ๋‹ค. ํ•˜์ง€๋งŒ, TBEFN์€ ํ‰๊ฐ€ ์ง€ํ‘œ์—์„œ ์›๋ณธ ์˜์ƒ๊ณผ ๋™์ผํ•˜๊ฑฐ๋‚˜ ๋‚ฎ์€ ๊ฐ’์„ ๊ธฐ๋กํ•˜์—ฌ YOLOv8n-seg ๋ชจ๋ธ์˜ ์„ฑ๋Šฅ์ด ์ €ํ•˜๋œ ๊ฒƒ์œผ๋กœ ๋ถ„์„๋˜์—ˆ๋‹ค.

์•ผ๊ฐ„ ์ˆœ์ฐฐ ์˜์ƒ์—์„œ ์‹ค์‹œ๊ฐ„ ์‚ฌ๋žŒ ํƒ์ง€ ๊ฐ€๋Šฅ์„ฑ์„ ๊ฒ€ํ† ํ•˜๊ธฐ ์œ„ํ•ด ์‹œ์ธ์„ฑ๊ณผ ํƒ์ง€ ์„ฑ๋Šฅ์ด ํฌ๊ฒŒ ํ–ฅ์ƒ๋œ KinD์™€ LLFormer ์˜์ƒ์„ ๋Œ€์ƒ์œผ๋กœ ๊ฐ ์˜์ƒ๊ฐ•ํ™” ๊ธฐ๋ฒ•์˜ ์‹œ์ธ์„ฑ ๊ฐœ์„ ๊ณผ YOLOv8n-seg ๋ชจ๋ธ์˜ ์‚ฌ๋žŒ ํƒ์ง€ ๊ณผ์ •์„ ๋ชจ๋‘ ํฌํ•จํ•œ ์ „์ฒด ์ฒ˜๋ฆฌ ์†๋„๋ฅผ ๋ถ„์„ํ•˜์˜€๋‹ค(Table 6). ์ €์กฐ๋„ ์˜์ƒ๊ฐ•ํ™” ๊ธฐ๋ฒ•์ด ์ ์šฉ๋˜์ง€ ์•Š์€ ๊ฒฝ์šฐ, ์˜์ƒ๋‹น 0.011์ดˆ๊ฐ€ ์†Œ์š”๋˜๋ฉฐ 90.91 FPS์˜ ํƒ์ง€ ์†๋„๋ฅผ ๋ณด์˜€๋‹ค. LLFormer ๊ธฐ๋ฒ•์€ 2.62 FPS์˜ ์ถ”๋ก  ์†๋„๋ฅผ ๋ณด์ด๋ฉฐ, ์ˆœ์ฐฐ ๋กœ๋ด‡์˜ ์‹ค์‹œ๊ฐ„ ํƒ์ง€์— ์ ํ•ฉํ•˜์ง€ ์•Š์€ ๊ฒƒ์œผ๋กœ ๋ถ„์„๋˜์—ˆ๋‹ค. ๋ฐ˜๋ฉด, ๊ฐ€์žฅ ์šฐ์ˆ˜ํ•œ ํƒ์ง€ ์„ฑ๋Šฅ์„ ๋ณด์˜€๋˜ KinD ๊ธฐ๋ฒ•์€ 13.89 FPS์˜ ํƒ์ง€ ์†๋„๋ฅผ ๋ณด์—ฌ ์ˆœ์ฐฐ ๋กœ๋ด‡์˜ ์‹ค์‹œ๊ฐ„ ํƒ์ง€์— ํ™œ์šฉํ•  ์ˆ˜ ์žˆ์„ ๊ฒƒ์œผ๋กœ ํŒ๋‹จ๋œ๋‹ค.

Table 4. Results of YOLOv8n-seg Using Different Image Enhancement Methods

Methods                     F1 Score (%)   AP50 (%)   AP50-95 (%)
YOLOv8n-seg                 72.7           78.6       61.2
KinD + YOLOv8n-seg          96.3           96.4       69.5
TBEFN + YOLOv8n-seg         72.7           78.6       60.5
LLFormer + YOLOv8n-seg      92.3           92.8       64.3

Table 5. Human Detection Results of the YOLOv8n-seg with Image Enhancement Methods


Table 6. Comparison of Speed for Human Detection

Methods                     Run Time (s)   Frames Per Second (FPS)
YOLOv8n-seg                 0.011          90.91
KinD + YOLOv8n-seg          0.072          13.89
LLFormer + YOLOv8n-seg      0.381           2.62

4. ๊ฒฐ ๋ก 

๋ณธ ์—ฐ๊ตฌ์—์„œ๋Š” ๋‹ค์–‘ํ•œ ์ €์กฐ๋„ ์˜์ƒ๊ฐ•ํ™” ๊ธฐ๋ฒ•(GLADNet, KinD, TBEFN, LLFormer, EnlightenGAN, Zero-DCE)์„ ์•ผ๊ฐ„ ์ˆœ์ฐฐ ์˜์ƒ์— ์ ์šฉํ•˜์—ฌ ์‹œ์ธ์„ฑ ๊ฐœ์„  ํšจ๊ณผ์™€ YOLOv8n-seg ๋ชจ๋ธ์˜ ์‚ฌ๋žŒ ํƒ์ง€ ์„ฑ๋Šฅ ํ–ฅ์ƒ์„ ๋ถ„์„ํ•˜์˜€๋‹ค. ์ €์กฐ๋„ ๊ฐ•ํ™” ์˜์ƒ์€ ๋ฐ๊ธฐ์™€ ๋Œ€๋น„๊ฐ€ ์ฆ๊ฐ€ํ•จ์— ๋”ฐ๋ผ ์‹œ์ธ์„ฑ์ด ๊ฐœ์„ ๋˜์–ด ์‚ฌ๋žŒ์˜ ์œ„์น˜์™€ ํ˜•ํƒœ๋ฅผ ๋ช…ํ™•ํ•˜๊ฒŒ ํŒŒ์•…ํ•  ์ˆ˜ ์žˆ์—ˆ๊ณ , YOLOv8n-seg ๋ชจ๋ธ์˜ ํƒ์ง€ ์ •ํ™•๋„๊ฐ€ ์ฆ๊ฐ€ํ•˜์˜€๋‹ค.

๊ฑด๋ฌผ ๋‚ด๋ถ€์˜ ์•ผ๊ฐ„ ์กฐ๋„ ํ™˜๊ฒฝ์—์„œ ์ˆœ์ฐฐ ์˜์ƒ์˜ ์ƒ‰์ƒ๊ณผ ๋ฐ๊ธฐ๋ฅผ ๊ณ ๋ฅด๊ฒŒ ๊ฐœ์„ ํ•œ ๊ธฐ๋ฒ•์€ KinD, TBEFN, LLFormer๋กœ ๋ถ„์„๋˜์—ˆ๋‹ค. KinD๋Š” ์œก์•ˆ ๋ถ„์„๊ณผ ํ‰๊ฐ€ ์ง€ํ‘œ์—์„œ ์ €์กฐ๋„ ์˜์ƒ์˜ ์ƒ‰์ƒ๊ณผ ๋ฐ๊ธฐ๋ฅผ ๊ฐ€์žฅ ํšจ๊ณผ์ ์œผ๋กœ ๋ณต์›ํ•˜์˜€๋‹ค. TBEFN์€ KinD์™€ ์œ ์‚ฌํ•˜๊ฒŒ ์˜์ƒ ํ’ˆ์งˆ์„ ํฌ๊ฒŒ ๊ฐœ์„ ํ•˜์˜€๊ณ , LLFormer๋Š” ์ƒ๋Œ€์ ์œผ๋กœ ํ’ˆ์งˆ ๊ฐœ์„  ํšจ๊ณผ๊ฐ€ ๋‚ฎ์•˜์œผ๋‚˜, ์ƒ‰์ƒ ๋ณต์› ํšจ๊ณผ๊ฐ€ ๋›ฐ์–ด๋‚œ ํŠน์ง•์ด ์žˆ์—ˆ๋‹ค. KinD, TBEFN, LLFormer ์˜์ƒ์„ ๋Œ€์ƒ์œผ๋กœ YOLOv8n-seg ๋ชจ๋ธ์„ ์ ์šฉํ•˜์˜€๊ณ , KinD์™€ LLFormer ์˜์ƒ์—์„œ ํƒ์ง€ ์„ฑ๋Šฅ์ด ํฌ๊ฒŒ ํ–ฅ์ƒ๋œ ๊ฒƒ์œผ๋กœ ๋‚˜ํƒ€๋‚ฌ๋‹ค. ํŠนํžˆ, KinD๋Š” ์ €์กฐ๋„ ์˜์ƒ์˜ ์‹œ์ธ์„ฑ์„ ์‹ค์‹œ๊ฐ„์œผ๋กœ ๊ฐœ์„ ํ•˜๊ณ , ํƒ์ง€ ์„ฑ๋Šฅ์„ ๊ฐ€์žฅ ํฌ๊ฒŒ ๊ฐœ์„ ํ•˜๋Š” ๊ฒƒ์œผ๋กœ ๋ถ„์„๋˜์—ˆ๋‹ค.

์ˆœ์ฐฐ ๋กœ๋ด‡์ด ์ดฌ์˜ํ•œ ์˜์ƒ์€ ์›๊ฒฉ ์กฐ์ข…์ž์˜ ๋กœ๋ด‡ ์กฐ์ž‘, ์ˆœ์ฐฐ ๊ตฌ์—ญ ํŒŒ์•…, ์˜์‚ฌ ๊ฒฐ์ •์„ ์ง€์›ํ•˜๋ฉฐ, ๋”ฅ๋Ÿฌ๋‹ ๊ธฐ๋ฐ˜ ๊ฐ์ฒด ํƒ์ง€ ๊ธฐ๋ฒ•๊ณผ ๊ฒฐํ•ฉํ•˜์—ฌ ์‹ค๋‚ด ๋ณด์•ˆ ๋ฐ ๋ฐฉ๋ฒ”์— ํ™œ์šฉ๋  ์ˆ˜ ์žˆ๋‹ค. ์‹ค๋‚ด ์•ผ๊ฐ„ ์กฐ๋„ ํ™˜๊ฒฝ์— ๊ฐ•์ธํ•œ ๊ฒƒ์œผ๋กœ ๋ถ„์„๋œ KinD๋Š” ์ˆœ์ฐฐ ์˜์ƒ์˜ ์‹œ์ธ์„ฑ์„ ์‹ค์‹œ๊ฐ„์œผ๋กœ ๊ฐœ์„ ํ•˜๊ณ  ๋กœ๋ด‡์˜ ํƒ์ง€ ์„ฑ๋Šฅ์„ ํฌ๊ฒŒ ํ–ฅ์ƒ์‹œ์ผœ ์ˆœ์ฐฐ ๋กœ๋ด‡์˜ ์ž„๋ฌด ์ˆ˜ํ–‰ ๋Šฅ๋ ฅ์„ ์ฆ๋Œ€์‹œํ‚ฌ ์ˆ˜ ์žˆ์„ ๊ฒƒ์œผ๋กœ ์˜ˆ์ƒ๋œ๋‹ค. ์ˆœ์ฐฐ ๋กœ๋ด‡์€ ๊ฑด๋ฌผ ๋‚ด๋ถ€๋ฅผ ์ž์œ ๋กญ๊ฒŒ ์ด๋™ํ•˜๋ฉด์„œ ์นจ์ž…์ž๋‚˜ ์ด์ƒ ์ƒํ™ฉ์„ ์‹ ์†ํ•˜๊ฒŒ ํƒ์ง€ํ•˜๊ณ  ๋Œ€์‘ํ•ด์•ผ ํ•˜๋ฏ€๋กœ, ์˜์ƒ์˜ ์‹œ์ธ์„ฑ ๊ฐœ์„  ๋ฐ ํƒ์ง€ ์„ฑ๋Šฅ ํ–ฅ์ƒ๊ณผ ๋”๋ถˆ์–ด ์ž์œจ ์ฃผํ–‰์„ ์œ„ํ•œ ์ •ํ™•ํ•œ ์œ„์น˜ ์ถ”์ •๊ณผ ์ง€๋„ ๊ตฌ์ถ•์ด ํ•„์ˆ˜์ ์ด๋‹ค. ํ–ฅํ›„ ์—ฐ๊ตฌ์—์„œ๋Š” ๋ฌด์ธ ์ˆœ์ฐฐ ๋กœ๋ด‡ ์šด์šฉ์„ ์œ„ํ•ด ์ €์กฐ๋„ ์˜์ƒ๊ฐ•ํ™” ๊ธฐ๋ฒ• ๋ฐ ๊ฐ์ฒด ํƒ์ง€ ๊ธฐ๋ฒ•๊ณผ Visual SLAM(Simultaneous Localization And Mapping) ๊ธฐ๋ฒ• ๊ฐœ๋ฐœ์ด ํ•„์š”ํ•˜๋‹ค.

Acknowledgements

This work was supported by the National Research Foundation of Korea (NRF) grant funded by the Korea government (MSIT) (No. 2022R1F1A1064577).

References

1. Banerjee, S., Kumar, A. and Shekhar, A. (2024). “Indoor surveillance robot with person following and re-identification.” Proc. of the 2024 International Joint Conference on Neural Networks (IJCNN), IEEE, Yokohama, Japan, pp. 1-8.
2. Bosse, S., Maniry, D., Muller, K. R., Wiegand, T. and Samek, W. (2017). “Deep neural networks for no-reference and full-reference image quality assessment.” IEEE Transactions on Image Processing, IEEE, Vol. 27, No. 1, pp. 206-219, https://doi.org/10.1109/TIP.2017.2760518.
3. Chang, J. H., Na, K. I. and Shin, H. C. (2022). “Trend of technology for outdoor security robots based on multimodal sensors.” Electronics and Telecommunications Trends, Electronics and Telecommunications Research Institute, Vol. 37, No. 1, pp. 1-9, http://dx.doi.org/10.22648/ETRI.2022.J.370101 (in Korean).
4. Chen, G. (2022). “Robotics applications at airports: Situation and tendencies.” Proc. of the 2022 14th International Conference on Measuring Technology and Mechatronics Automation (ICMTMA), IEEE, Changsha, China, pp. 536-539.
5. Chen, L., Fu, Y., Wei, K., Zheng, D. and Heide, F. (2023). “Instance segmentation in the dark.” International Journal of Computer Vision, Springer, Vol. 131, No. 8, pp. 2198-2218, https://doi.org/10.1007/s11263-023-01808-8.
6. Choi, J. W., Park, J. T. and Kim, M. S. (2022). “Indoor intruder detection using YOLO and VMD in indoor security robot.” Journal of The Korean Institute of Plant Engineering and Safety, The Korean Institute of Plant Engineering and Safety, Vol. 27, No. 4, pp. 13-19 (in Korean).
7. Guo, C., Li, C., Guo, J., Loy, C. C., Hou, J., Kwong, S. and Cong, R. (2020). “Zero-reference deep curve estimation for low-light image enhancement.” Proc. of the 2020 IEEE/CVF Conference on Computer Vision and Pattern Recognition (CVPR), IEEE, Seattle, WA, USA, pp. 1780-1789.
8. Hamid, W., Faudzi, A. A. M. and Ismail, K. B. (2022). “Design and analysis of an articulated tracked robot for search and rescue operations.” Proc. of the 2022 IEEE 5th International Symposium in Robotics and Manufacturing Automation (ROMA), IEEE, Malacca, Malaysia, pp. 1-6.
9. IPVM (2021). Frame rate guide for video surveillance, Available at: https://ipvm.com/reports/frame-rate-surveillance-guide (Accessed: February 4, 2025).
10. Jiang, Y., Gong, X., Liu, D., Cheng, Y., Fang, C., Shen, X., Yang, J., Zhou, P. and Wang, Z. (2021). “EnlightenGAN: Deep light enhancement without paired supervision.” IEEE Transactions on Image Processing, IEEE, Vol. 30, pp. 2340-2349, https://doi.org/10.1109/TIP.2021.3051462.
11. Jingchun, Z., Su, G. E. and Sunar, M. S. (2024). “Low-light image enhancement: A comprehensive review on methods, datasets and evaluation metrics.” Journal of King Saud University-Computer and Information Sciences, Elsevier, Vol. 36, No. 10, p. 102234, https://doi.org/10.1016/j.jksuci.2024.102234.
12. Kim, K. H., Lim, S. H., Kim, M. S., Chung, J. S. and Choi, Y. J. (2022). “Unmanned facility inspection at steelworks using a quadrupedal robot.” Proc. of the 2022 the 37th ICROS Annual Conference, Institute of Control, Robotics and Systems, pp. 379-380 (in Korean).
13. Kwak, S. S. (2014). Working conditions and improvement plan for apartment security workers in Seoul (in Korean).
14. Lafuente-Arroyo, S., Martín-Martín, P., Iglesias-Iglesias, C., Maldonado-Bascón, S. and Acevedo-Rodríguez, F. J. (2022). “RGB camera-based fallen person detection system embedded on a mobile platform.” Expert Systems with Applications, Elsevier, Vol. 197, p. 116715, https://doi.org/10.1016/j.eswa.2022.116715.
15. Lee, M. F. R. and Chien, T. W. (2020). “Intelligent robot for worker safety surveillance: Deep learning perception and visual navigation.” Proc. of the 2020 International Conference on Advanced Robotics and Intelligent Systems (ARIS), IEEE, Taipei, Taiwan, pp. 1-6.
16. Lee, K. S., Ovinis, M., Nagarajan, T., Seulin, R. and Morel, O. (2015). “Autonomous patrol and surveillance system using unmanned aerial vehicles.” Proc. of the 2015 IEEE 15th International Conference on Environment and Electrical Engineering (EEEIC), IEEE, Rome, Italy, pp. 1291-1297.
17. Lopez, A., Paredes, R., Quiroz, D., Trovato, G. and Cuellar, F. (2017). “Robotman: A security robot for human-robot interaction.” Proc. of the 2017 18th International Conference on Advanced Robotics (ICAR), IEEE, Hong Kong, China, pp. 7-12.
18. López, J., Pérez, D., Paz, E. and Santana, A. (2013). “WatchBot: A building maintenance and surveillance system based on autonomous robots.” Robotics and Autonomous Systems, Elsevier, Vol. 61, No. 12, pp. 1559-1571, https://doi.org/10.1016/j.robot.2013.06.012.
19. Lu, K. and Zhang, L. (2020). “TBEFN: A two-branch exposure-fusion network for low-light image enhancement.” IEEE Transactions on Multimedia, IEEE, Vol. 23, pp. 4093-4105, https://doi.org/10.1109/TMM.2020.3037526.
20. Mittal, A., Moorthy, A. K. and Bovik, A. C. (2012a). “No-reference image quality assessment in the spatial domain.” IEEE Transactions on Image Processing, IEEE, Vol. 21, No. 12, pp. 4695-4708, https://doi.org/10.1109/TIP.2012.2214050.
21. Mittal, A., Soundararajan, R. and Bovik, A. C. (2012b). “Making a “completely blind” image quality analyzer.” IEEE Signal Processing Letters, IEEE, Vol. 20, No. 3, pp. 209-212, https://doi.org/10.1109/LSP.2012.2227726.
22. Park, S. H. and Bae, D. Y. (2015). “A change of private security labor market and countermeasures by population aging.” Journal of Convergence Security, Korea Convergence Security Association, Vol. 15, No. 1, pp. 11-18 (in Korean).
23. Roboflow (2025). Roboflow: Computer vision tools for developers and enterprises, Available at: https://roboflow.com/ (Accessed: January 31, 2025).
24. Terven, J., Cordova-Esparza, D. M. and Romero-Gonzalez, J. A. (2023). “A comprehensive review of YOLO architectures in computer vision: From YOLOv1 to YOLOv8 and YOLO-NAS.” Machine Learning and Knowledge Extraction, MDPI, Vol. 5, No. 4, pp. 1680-1716, https://doi.org/10.3390/make5040083.
25. Ultralytics (2023). YOLOv8 - Ultralytics YOLO docs, Available at: https://docs.ultralytics.com/models/yolov8/ (Accessed: January 31, 2025).
26. Wang, W., Wei, C., Yang, W. and Liu, J. (2018). “GLADNet: Low-light enhancement network with global awareness.” Proc. of the 2018 13th IEEE International Conference on Automatic Face & Gesture Recognition (FG 2018), IEEE, Xi'an, China, pp. 751-755.
27. Wang, T., Zhang, K., Shen, T., Luo, W., Stenger, B. and Lu, T. (2023). “Ultra-high-definition low-light image enhancement: A benchmark and transformer-based method.” Proc. of the 37th AAAI Conference on Artificial Intelligence, AAAI, Washington, DC, USA, pp. 2654-2662.
28. Yoo, B. C. and Shin, S. J. (2024). “Proposal for research model of high-function patrol robot using integrated sensor system.” The Journal of the Institute of Internet, Broadcasting and Communication, The Institute of Internet, Broadcasting and Communication, Vol. 24, No. 3, pp. 77-85, https://doi.org/10.7236/JIIBC.2024.24.3.77 (in Korean).
29. Yousif, T. and El-Medany, W. (2022). “Development and hardware implementation of IoT-based patrol robot for remote gas leak inspection.” International Journal of Electrical and Computer Engineering Systems, J.J. Strossmayer University of Osijek, Faculty of Electrical Engineering, Computer Science and Information Technology, Vol. 13, No. 4, pp. 279-292, https://doi.org/10.32985/ijeces.13.4.4.
30. Zhang, Y., Zhang, J. and Guo, X. (2019). “Kindling the darkness: A practical low-light image enhancer.” Proc. of the 27th ACM International Conference on Multimedia, Association for Computing Machinery, Nice, France, pp. 1632-1640.