DC Field | Value | Language |
---|---|---|
dc.contributor.author | Do, Won-Joon | ko |
dc.contributor.author | Han, Yo Seob | ko |
dc.contributor.author | Choi, Seung Hong | ko |
dc.contributor.author | Ye, Jong Chul | ko |
dc.contributor.author | Park, Sung-Hong | ko |
dc.date.accessioned | 2019-01-23T05:32:26Z | - |
dc.date.available | 2019-01-23T05:32:26Z | - |
dc.date.created | 2018-12-13 | - |
dc.date.issued | 2018-06-21 | - |
dc.identifier.citation | International Society for Magnetic Resonance in Medicine 2018, pp.2738 | - |
dc.identifier.uri | http://hdl.handle.net/10203/249560 | - |
dc.description.abstract | We propose a new deep neural network (Y-net) that can utilize images acquired with a different MR contrast for reconstruction of down-sampled images. The k-space center of down-sampled T2-weighted images and the k-space edge of fully sampled T1-weighted images were combined through one Y-net, and the desired high-resolution T2-weighted images were generated by another Y-net. The proposed network not only improved spatial resolution but also suppressed ringing artifacts caused by the down-sampling at the k-space center. The developed technique potentially enables acceleration of multi-contrast MR imaging in routine clinical studies. | - |
dc.language | English | - |
dc.publisher | International Society for Magnetic Resonance in Medicine | - |
dc.title | Reconstruction of MR images by combining k-spaces of multi-contrast MR data through deep learning | - |
dc.type | Conference | - |
dc.type.rims | CONF | - |
dc.citation.beginningpage | 2738 | - |
dc.citation.publicationname | International Society for Magnetic Resonance in Medicine 2018 | - |
dc.identifier.conferencecountry | FR | - |
dc.identifier.conferencelocation | Paris, France | - |
dc.contributor.localauthor | Park, Sung-Hong | - |
dc.contributor.nonIdAuthor | Do, Won-Joon | - |
dc.contributor.nonIdAuthor | Han, Yo Seob | - |
dc.contributor.nonIdAuthor | Choi, Seung Hong | - |
dc.contributor.nonIdAuthor | Ye, Jong Chul | - |
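
The abstract above describes merging the k-space center of down-sampled T2-weighted data with the k-space edge of fully sampled T1-weighted data before reconstruction. A minimal NumPy sketch of that k-space merging idea follows; it is not the authors' Y-net, and the function name `combine_kspace` and the `center_frac` parameter are illustrative assumptions:

```python
import numpy as np

def combine_kspace(img_center, img_edge, center_frac=0.25):
    """Toy k-space merge: low frequencies (center) from `img_center`,
    high frequencies (edge) from `img_edge`.

    `center_frac` (fraction of k-space kept from `img_center`) is an
    assumed parameter for illustration, not a value from the paper.
    """
    # Transform both images to k-space, with DC component centered.
    k_center = np.fft.fftshift(np.fft.fft2(img_center))
    k_edge = np.fft.fftshift(np.fft.fft2(img_edge))

    # Boolean mask selecting the central (low-frequency) block.
    ny, nx = img_center.shape
    cy, cx = ny // 2, nx // 2
    hy, hx = int(ny * center_frac / 2), int(nx * center_frac / 2)
    mask = np.zeros((ny, nx), dtype=bool)
    mask[cy - hy:cy + hy, cx - hx:cx + hx] = True

    # Center from one image, edge from the other, then invert the FFT.
    k_combined = np.where(mask, k_center, k_edge)
    return np.abs(np.fft.ifft2(np.fft.ifftshift(k_combined)))
```

Combining an image's k-space with itself recovers the original image, which is a quick sanity check that the masking and inverse transform are consistent.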