Robot Learning to Paint from Demonstrations

Robotic painting tasks in the real world are often complicated by the highly complex and stochastic dynamics that underlie them, e.g., physical contact between the painting tool and the canvas, color blending between painting media, and more. Simulation-based inverse graphics algorithms, for example, cannot be transferred directly to the real world, due in large part to the considerable gap in the range of painting strokes the robot can accurately produce on a physical canvas. In this paper, we aim to minimize this gap through a data-driven skill-learning approach. The core idea is to let the robot learn continuous stroke-level skills that jointly encode action trajectories and painted outcomes from an extensive collection of human demonstrations. We demonstrate the efficacy of our method through extensive real-world experiments using a 4-DoF torque-controllable manipulator with a digital canvas (iPad).
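
The abstract gives no implementation details, but as a rough illustration of what "jointly encoding action trajectories and painted outcomes" in a continuous skill space can look like, the following is a minimal sketch assuming a PyTorch-style autoencoder. The architecture, dimensions, and all names (StrokeSkillModel, traj_decoder, outcome_decoder, etc.) are hypothetical assumptions for illustration, not the authors' implementation.

```python
# Hypothetical sketch: a shared latent "skill" vector z jointly encodes a
# stroke's action trajectory and its painted outcome. All details below are
# illustrative assumptions, not taken from the paper.
import torch
import torch.nn as nn

class StrokeSkillModel(nn.Module):
    def __init__(self, traj_len=50, action_dim=4, latent_dim=16, img_size=32):
        super().__init__()
        self.traj_len, self.action_dim, self.img_size = traj_len, action_dim, img_size
        # Encoder: map a demonstrated action trajectory to a latent skill z.
        self.encoder = nn.Sequential(
            nn.Linear(traj_len * action_dim, 128), nn.ReLU(),
            nn.Linear(128, latent_dim),
        )
        # Decoder 1: reconstruct the action trajectory from z.
        self.traj_decoder = nn.Sequential(
            nn.Linear(latent_dim, 128), nn.ReLU(),
            nn.Linear(128, traj_len * action_dim),
        )
        # Decoder 2: predict the painted stroke image from the same z,
        # tying executable actions and visual outcomes to one latent space.
        self.outcome_decoder = nn.Sequential(
            nn.Linear(latent_dim, 128), nn.ReLU(),
            nn.Linear(128, img_size * img_size), nn.Sigmoid(),
        )

    def forward(self, traj):
        z = self.encoder(traj.flatten(1))
        traj_hat = self.traj_decoder(z).view(-1, self.traj_len, self.action_dim)
        outcome_hat = self.outcome_decoder(z).view(-1, self.img_size, self.img_size)
        return z, traj_hat, outcome_hat

# Training step on a toy batch: joint reconstruction losses ground the latent
# in both modalities, so searching in z-space yields strokes the robot can
# actually execute on the physical canvas.
model = StrokeSkillModel()
traj = torch.randn(8, 50, 4)      # demonstrated trajectories (batch, T, dof)
outcome = torch.rand(8, 32, 32)   # corresponding painted stroke images
z, traj_hat, outcome_hat = model(traj)
loss = nn.functional.mse_loss(traj_hat, traj) + \
       nn.functional.binary_cross_entropy(outcome_hat, outcome)
loss.backward()
```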
Publisher
Institute of Electrical and Electronics Engineers Inc.
Issue Date
2022-10
Language
English
Citation

IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS 2022), pp. 3053-3060

ISSN
2153-0858
DOI
10.1109/IROS47612.2022.9981633
URI
http://hdl.handle.net/10203/312102
Appears in Collection
RIMS Conference Papers
Files in This Item
There are no files associated with this item.