Reference Based Sketch Extraction via Attention Mechanism

DC Field | Value | Language
dc.contributor.author | Seo, Chang Wook | ko
dc.contributor.author | Ashtari, Amirsaman | ko
dc.contributor.author | Noh, Junyong | ko
dc.contributor.author | Cha, Sihun | ko
dc.contributor.author | Cholmin Kang | ko
dc.date.accessioned | 2022-12-14T01:00:31Z | -
dc.date.available | 2022-12-14T01:00:31Z | -
dc.date.created | 2022-12-01 | -
dc.date.issued | 2022-12-07 | -
dc.identifier.citation | SIGGRAPH Asia 2022 | -
dc.identifier.uri | http://hdl.handle.net/10203/302951 | -
dc.description.abstract | We propose a model that extracts a sketch from a colorized image such that the extracted sketch has a line style similar to a given reference sketch while preserving the visual content of the colorized image. Authentic sketches drawn by artists exhibit various line styles that add visual interest and convey feeling. However, existing sketch-extraction methods generate sketches in only one style. Moreover, existing style transfer models fail to transfer sketch styles because they are mostly designed to transfer the textures of a source style image rather than the sparse line styles of a reference sketch. Because the volume of data needed for standard training of translation systems is lacking, the core of our GAN-based solution is a self-reference sketch style generator that produces various reference sketches with a similar style but different spatial layouts. We use independent attention modules to detect the edges of a colorized image and a reference sketch as well as the visual correspondences between them. We apply several loss terms to imitate the style and enforce sparsity in the extracted sketches. Our sketch-extraction method closely imitates a reference sketch style drawn by an artist and outperforms all baseline methods. Using our method, we produce a synthetic dataset representing various sketch styles and improve the performance of auto-colorization models, which are in high demand in comics. The validity of our approach is confirmed via qualitative and quantitative evaluations. (An illustrative attention and sparsity-loss sketch follows this record.) | -
dc.language | English | -
dc.publisher | Association for Computing Machinery | -
dc.title | Reference Based Sketch Extraction via Attention Mechanism | -
dc.title.alternative | Reference Based Sketch Extraction via Attention Mechanism | -
dc.type | Conference | -
dc.type.rims | CONF | -
dc.citation.publicationname | SIGGRAPH Asia 2022 | -
dc.identifier.conferencecountry | KO | -
dc.identifier.conferencelocation | EXCO, Daegu | -
dc.identifier.doi | 10.1145/3550454.3555504 | -
dc.contributor.localauthor | Noh, Junyong | -
dc.contributor.nonIdAuthor | Cholmin Kang | -
Appears in Collection
GCT - Conference Papers (학술회의논문)
Files in This Item
There are no files associated with this item.
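As a rough illustration of two components described in the abstract above (attention over features of the colorized image and the reference sketch, and a loss that imitates line style while enforcing sparsity), the PyTorch sketch below shows one plausible formulation. The module and function names (CrossAttention, gram_matrix, sketch_losses), the Gram-matrix style term, and all hyperparameters are assumptions made for illustration only, not the paper's actual architecture or loss terms.

```python
# Hypothetical sketch: cross-attention between image and reference-sketch
# features, plus a style-imitation term combined with an L1 sparsity term.
import torch
import torch.nn as nn
import torch.nn.functional as F


class CrossAttention(nn.Module):
    """Scaled dot-product attention from colorized-image features (queries)
    to reference-sketch features (keys/values)."""

    def __init__(self, channels: int):
        super().__init__()
        self.to_q = nn.Conv2d(channels, channels, kernel_size=1)
        self.to_k = nn.Conv2d(channels, channels, kernel_size=1)
        self.to_v = nn.Conv2d(channels, channels, kernel_size=1)

    def forward(self, image_feat: torch.Tensor, sketch_feat: torch.Tensor) -> torch.Tensor:
        b, c, h, w = image_feat.shape
        q = self.to_q(image_feat).flatten(2).transpose(1, 2)   # (B, HW, C)
        k = self.to_k(sketch_feat).flatten(2)                  # (B, C, HW)
        v = self.to_v(sketch_feat).flatten(2).transpose(1, 2)  # (B, HW, C)
        attn = torch.softmax(q @ k / (c ** 0.5), dim=-1)        # (B, HW, HW)
        out = (attn @ v).transpose(1, 2).reshape(b, c, h, w)
        return out + image_feat  # residual connection


def gram_matrix(feat: torch.Tensor) -> torch.Tensor:
    """Channel-wise feature correlations, a common proxy for style."""
    b, c, h, w = feat.shape
    f = feat.flatten(2)
    return (f @ f.transpose(1, 2)) / (c * h * w)


def sketch_losses(extracted: torch.Tensor,
                  reference: torch.Tensor,
                  style_weight: float = 1.0,
                  sparsity_weight: float = 0.1) -> torch.Tensor:
    """Style-imitation term (Gram matrices of the two sketches) plus an
    L1 sparsity term that favors mostly blank, thin-lined output."""
    style = F.mse_loss(gram_matrix(extracted), gram_matrix(reference))
    sparsity = extracted.abs().mean()
    return style_weight * style + sparsity_weight * sparsity


if __name__ == "__main__":
    attn = CrossAttention(channels=64)
    image_feat = torch.randn(1, 64, 32, 32)   # features of the colorized image
    sketch_feat = torch.randn(1, 64, 32, 32)  # features of the reference sketch
    fused = attn(image_feat, sketch_feat)
    loss = sketch_losses(torch.rand(1, 1, 256, 256), torch.rand(1, 1, 256, 256))
    print(fused.shape, loss.item())
```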
