DC Field | Value | Language |
---|---|---|
dc.contributor.author | Doumanoglou, Andreas | ko |
dc.contributor.author | Stria, Jan | ko |
dc.contributor.author | Peleka, Georgia | ko |
dc.contributor.author | Mariolis, Ioannis | ko |
dc.contributor.author | Petrik, Vladimir | ko |
dc.contributor.author | Kargakos, Andreas | ko |
dc.contributor.author | Wagner, Libor | ko |
dc.contributor.author | Hlavac, Vaclav | ko |
dc.contributor.author | Kim, Tae-Kyun | ko |
dc.contributor.author | Malassiotis, Sotiris | ko |
dc.date.accessioned | 2021-06-17T06:30:11Z | - |
dc.date.available | 2021-06-17T06:30:11Z | - |
dc.date.created | 2021-06-17 | - |
dc.date.issued | 2016-12 | - |
dc.identifier.citation | IEEE TRANSACTIONS ON ROBOTICS, v.32, no.6, pp.1461 - 1478 | - |
dc.identifier.issn | 1552-3098 | - |
dc.identifier.uri | http://hdl.handle.net/10203/285964 | - |
dc.description.abstract | This work presents a complete pipeline for folding a pile of clothes using a dual-armed robot. This is a challenging task both from the viewpoint of machine vision and robotic manipulation. The presented pipeline comprises the following parts: isolating and picking up a single garment from a pile of crumpled garments, recognizing its category, unfolding the garment using a series of manipulations performed in the air, placing the garment roughly flat on a work table, spreading it, and, finally, folding it in several steps. The pile is segmented into separate garments using color and texture information, and the ideal grasping point is selected based on features computed from a depth map. The recognition and unfolding of the hanging garment are performed in an active manner, utilizing the framework of active random forests to detect grasp points while optimizing the robot actions. The spreading procedure is based on the detection of deformations of the garment's contour. The perception for folding employs fitting of polygonal models to the contour of the observed garment, both spread and already partially folded. We have conducted several experiments on the full pipeline, producing very promising results. To our knowledge, this is the first work addressing the complete unfolding and folding pipeline on a variety of garments, including T-shirts, towels, and shorts. | - |
dc.language | English | - |
dc.publisher | IEEE-INST ELECTRICAL ELECTRONICS ENGINEERS INC | - |
dc.title | Folding Clothes Autonomously: A Complete Pipeline | - |
dc.type | Article | - |
dc.identifier.wosid | 000389849700011 | - |
dc.identifier.scopusid | 2-s2.0-84991059625 | - |
dc.type.rims | ART | - |
dc.citation.volume | 32 | - |
dc.citation.issue | 6 | - |
dc.citation.beginningpage | 1461 | - |
dc.citation.endingpage | 1478 | - |
dc.citation.publicationname | IEEE TRANSACTIONS ON ROBOTICS | - |
dc.identifier.doi | 10.1109/TRO.2016.2602376 | - |
dc.contributor.localauthor | Kim, Tae-Kyun | - |
dc.contributor.nonIdAuthor | Doumanoglou, Andreas | - |
dc.contributor.nonIdAuthor | Stria, Jan | - |
dc.contributor.nonIdAuthor | Peleka, Georgia | - |
dc.contributor.nonIdAuthor | Mariolis, Ioannis | - |
dc.contributor.nonIdAuthor | Petrik, Vladimir | - |
dc.contributor.nonIdAuthor | Kargakos, Andreas | - |
dc.contributor.nonIdAuthor | Wagner, Libor | - |
dc.contributor.nonIdAuthor | Hlavac, Vaclav | - |
dc.contributor.nonIdAuthor | Malassiotis, Sotiris | - |
dc.description.isOpenAccess | N | - |
dc.type.journalArticle | Article | - |
dc.subject.keywordAuthor | Active vision | - |
dc.subject.keywordAuthor | clothes | - |
dc.subject.keywordAuthor | deformable objects | - |
dc.subject.keywordAuthor | manipulation | - |
dc.subject.keywordAuthor | perception | - |
dc.subject.keywordAuthor | random forests | - |
dc.subject.keywordAuthor | robotics | - |
dc.subject.keywordPlus | VISUAL RECOGNITION | - |
dc.subject.keywordPlus | DEFORMABLE OBJECTS | - |
dc.subject.keywordPlus | POSE ESTIMATION | - |
dc.subject.keywordPlus | ROBOT | - |
dc.subject.keywordPlus | CLASSIFICATION | - |
dc.subject.keywordPlus | REGRESSION | - |
dc.subject.keywordPlus | DEPTH | - |
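The abstract describes selecting a grasp point from features computed on a depth map and scoring it with a random-forest framework. The sketch below is illustrative only, not the authors' implementation: it scores candidate grasp pixels with a scikit-learn random forest over simple local depth statistics. The window size, feature set, and training data are all assumptions for the sake of a runnable example.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier

WIN = 8  # half-width of the local depth window (assumed, not from the paper)

def depth_features(depth, y, x):
    """Statistics of a local depth patch around pixel (y, x).
    The paper computes features from the depth map; these particular
    statistics are illustrative assumptions, not the authors' features."""
    patch = depth[y - WIN:y + WIN + 1, x - WIN:x + WIN + 1]
    return np.array([patch.mean(), patch.std(), patch.min(), patch.max(),
                     depth[y, x] - patch.mean()])

def best_grasp_candidate(depth, candidates, forest):
    """Return the candidate pixel the forest rates most 'graspable'."""
    feats = np.array([depth_features(depth, y, x) for y, x in candidates])
    probs = forest.predict_proba(feats)[:, 1]  # P(good grasp point)
    return candidates[int(np.argmax(probs))]

# A real system would train on labelled grasp examples; placeholder
# random data is used here only so the sketch runs end to end.
rng = np.random.default_rng(0)
X_train = rng.normal(size=(200, 5))
y_train = rng.integers(0, 2, size=200)
forest = RandomForestClassifier(n_estimators=100, random_state=0)
forest.fit(X_train, y_train)

depth = rng.normal(loc=1.0, scale=0.05, size=(120, 160))  # fake depth map
candidates = [(40, 60), (70, 90), (100, 120)]
print(best_grasp_candidate(depth, candidates, forest))
```

Note that the paper's active random forests additionally choose robot actions (e.g., rotating the hanging garment) to gather more informative views; the passive classifier above omits that aspect.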