DC Field | Value | Language |
---|---|---|
dc.contributor.advisor | Yoon, Kuk-Jin | - |
dc.contributor.advisor | 윤국진 | - |
dc.contributor.author | Lee, Jungwon | - |
dc.date.accessioned | 2022-04-21T19:30:36Z | - |
dc.date.available | 2022-04-21T19:30:36Z | - |
dc.date.issued | 2021 | - |
dc.identifier.uri | http://library.kaist.ac.kr/search/detail/view.do?bibCtrlNo=948598&flag=dissertation | en_US |
dc.identifier.uri | http://hdl.handle.net/10203/295223 | - |
dc.description | Thesis (Master's) - Korea Advanced Institute of Science and Technology (KAIST) : Interdisciplinary Program in Future Vehicle, 2021.2, [iv, 30 p.] | - |
dc.description.abstract | Object detection is critical in autonomous driving, but it becomes difficult in adverse conditions such as darkness, fog, or rain. Addressing one of these conditions, I propose a model that enables 3D object detection both day and night. Because most existing 3D driving datasets consist largely of daytime scenes, detecting objects at night is difficult. Therefore, an image translation model is used to convert daytime images into nighttime images, trained on night images that are not paired with the day images. A depth map is used as a guide to further aid the image translation. In the 3D object detection part, depth-aware convolution is used to account for the characteristics of driving images. In addition, a corner loss is applied to correct the vehicle heading angle, and the image translation model and the object detector are fine-tuned jointly. | - |
dc.language | eng | - |
dc.publisher | Korea Advanced Institute of Science and Technology (KAIST) | - |
dc.subject | 3D Object detection; Image translation; Deep learning; Depth estimation | - |
dc.subject | 3D 객체 탐지; 이미지 변환; 딥 러닝; 깊이 추정 | - |
dc.title | All-day 3D object detection using image to image translation | - |
dc.title.alternative | 이미지 변환을 이용한 주야간에서의 3D 객체 탐지 | - |
dc.type | Thesis (Master) | - |
dc.identifier.CNRN | 325007 | - |
dc.description.department | Korea Advanced Institute of Science and Technology (KAIST) : Interdisciplinary Program in Future Vehicle | - |
dc.contributor.alternativeauthor | 이정원 | - |
Items in DSpace are protected by copyright, with all rights reserved, unless otherwise indicated.