DC Field | Value | Language |
---|---|---|
dc.contributor.advisor | 이도헌 | - |
dc.contributor.author | Kim, Yeongrok | - |
dc.contributor.author | 김영록 | - |
dc.date.accessioned | 2024-07-30T19:30:56Z | - |
dc.date.available | 2024-07-30T19:30:56Z | - |
dc.date.issued | 2024 | - |
dc.identifier.uri | http://library.kaist.ac.kr/search/detail/view.do?bibCtrlNo=1096657&flag=dissertation | en_US |
dc.identifier.uri | http://hdl.handle.net/10203/321440 | - |
dc.description | Master's thesis - Korea Advanced Institute of Science and Technology (KAIST): Department of Bio and Brain Engineering, 2024.2, [iv, 34 p.] | - |
dc.description.abstract | Breast cancer is the most prevalent cancer and a major contributor to cancer-related deaths among women. Accurate prognostic analysis of breast cancer is essential for effective treatment. To achieve this, there is an ongoing effort to construct multimodal deep neural network models using a comprehensive range of data, including clinical data and genomic information. However, while clinical data are relatively abundant, genomic data acquisition is time-consuming and costly, presenting a significant challenge. This thesis addresses the limitation by employing active learning, a method that prioritizes for training the unlabeled data most likely to enhance model performance. Our findings demonstrate that this active learning-based data selection approach significantly improves model performance compared to random data extraction, offering a promising strategy for efficient and effective breast cancer prognostic analysis. | - |
dc.language | eng | - |
dc.publisher | 한국과학기술원 | - |
dc.subject | 능동 학습▼a유방암▼a예후▼a다중 모달▼a심층 신경망 | - |
dc.subject | Active learning▼abreast cancer▼aprognostic▼amultimodal▼adeep neural network | - |
dc.title | Active learning for improving multimodal breast cancer prognostic model performance | - |
dc.title.alternative | 다중 모달 유방암 예후 모델의 성능 향상을 위한 능동 학습 | - |
dc.type | Thesis(Master) | - |
dc.identifier.CNRN | 325007 | - |
dc.description.department | Korea Advanced Institute of Science and Technology (KAIST): Department of Bio and Brain Engineering | - |
dc.contributor.alternativeauthor | Lee, Doheon | - |
Items in DSpace are protected by copyright, with all rights reserved, unless otherwise indicated.
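The abstract describes selecting, from a pool of unlabeled patients, those samples most likely to improve the model if their (costly) genomic data were acquired. The record does not specify the acquisition function used, but a common active-learning strategy for this is uncertainty sampling; the sketch below, with hypothetical names and toy probabilities, illustrates the idea of ranking unlabeled samples by predictive entropy and spending the labeling budget on the most uncertain ones:

```python
import math

def entropy(p):
    """Shannon entropy (bits) of a Bernoulli prediction p: highest near p = 0.5."""
    if p <= 0.0 or p >= 1.0:
        return 0.0
    return -(p * math.log2(p) + (1 - p) * math.log2(1 - p))

def select_for_labeling(pred_probs, budget):
    """Uncertainty sampling: return indices of the `budget` unlabeled samples
    whose predicted prognosis probabilities are most uncertain."""
    ranked = sorted(range(len(pred_probs)),
                    key=lambda i: entropy(pred_probs[i]),
                    reverse=True)
    return ranked[:budget]

# Hypothetical model outputs for five patients lacking genomic profiles
probs = [0.95, 0.51, 0.10, 0.48, 0.99]
print(select_for_labeling(probs, 2))  # → [1, 3]: predictions nearest 0.5
```

Entropy is only one possible acquisition function; the thesis may use a different criterion, but the contrast with random selection described in the abstract applies to any such ranking scheme.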