DC Field | Value | Language |
---|---|---|
dc.contributor.advisor | Yoo, Chang-Dong | - |
dc.contributor.advisor | 유창동 | - |
dc.contributor.author | Na, Yongcheon | - |
dc.contributor.author | 나용천 | - |
dc.date.accessioned | 2017-03-29T02:32:16Z | - |
dc.date.available | 2017-03-29T02:32:16Z | - |
dc.date.issued | 2015 | - |
dc.identifier.uri | http://library.kaist.ac.kr/search/detail/view.do?bibCtrlNo=657545&flag=dissertation | en_US |
dc.identifier.uri | http://hdl.handle.net/10203/221385 | - |
dc.description | Master's thesis - KAIST : Interdisciplinary Program in Future Vehicle, 2015.2, [iv, 29 p.] | - |
dc.description.abstract | This paper describes an algorithm using a deep probabilistic model, referred to as sum-product networks (SPNs), for cell classification, a task on which even a trained pathologist distinguishes human epithelial type 2 (HEp-2) cells with only 73.3% accuracy. The SPNs reduce generalization error by maximizing the margin between the conditional probability of the true label and the maximum conditional probability over all other labels. In the SPN architecture, the most easily confused classes are grouped so that they share a common parent sum node; these groups are referred to as sub-networks of SPNs (sub-SPNs). The sub-SPNs mitigate the gradient diffusion problem and are combined with a maximum margin learning algorithm. The proposed SPN performed better than all other state-of-the-art algorithms on the HEp-2 cells dataset, and better than convolutional neural networks on the Feulgen-stained cells dataset. | - |
dc.language | eng | - |
dc.publisher | 한국과학기술원 | - |
dc.subject | sum-product networks | - |
dc.subject | sub-networks of SPNs | - |
dc.subject | maximum margin learning | - |
dc.subject | 합-곱 네트워크 | - |
dc.subject | 보조 네트워크 | - |
dc.subject | 최대 마진 훈련 | - |
dc.title | Maximum margin learning with sub-SPNs for cell classification | - |
dc.title.alternative | 보조 합-곱 네트워크와 최대 마진 훈련을 이용한 세포 이미지 분류 | - |
dc.type | Thesis(Master) | - |
dc.identifier.CNRN | 325007 | - |
dc.description.department | KAIST : Interdisciplinary Program in Future Vehicle | - |
Items in DSpace are protected by copyright, with all rights reserved, unless otherwise indicated.
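The maximum margin learning objective described in the abstract — penalizing a small gap between the conditional probability of the true label and the largest conditional probability among the other labels — can be sketched as follows. This is an illustrative NumPy toy under assumed names (`max_margin_loss`, its `margin` parameter), not the thesis's actual SPN training objective.

```python
import numpy as np

def max_margin_loss(cond_probs, true_label, margin=1.0):
    """Hinge-style multiclass loss on class-conditional probabilities.

    Returns zero once P(true label | x) exceeds the best competing
    label's conditional probability by at least `margin`; otherwise
    returns the remaining shortfall. Illustrative sketch only; the
    thesis's exact SPN objective may differ in form.
    """
    probs = np.asarray(cond_probs, dtype=float)
    competitors = np.delete(probs, true_label)  # all labels except the true one
    best_competitor = competitors.max()
    # Hinge on the probability gap between true label and best competitor.
    return max(0.0, margin - (probs[true_label] - best_competitor))
```

Minimizing this loss over training examples pushes the model to separate the true label's conditional probability from its strongest competitor by at least the chosen margin, which is the generalization-error argument the abstract makes.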