CAMPs: Learning Context-Specific Abstractions for Efficient Planning in Factored MDPs

Abstract
Meta-planning, or learning to guide planning from experience, is a promising approach to reducing the computational cost of planning. A general meta-planning strategy is to learn to impose constraints on the states considered and actions taken by the agent. We observe that (1) imposing a constraint can induce context-specific independences that render some aspects of the domain irrelevant, and (2) an agent can take advantage of this fact by imposing constraints on its own behavior. These observations lead us to propose the context-specific abstract Markov decision process (CAMP), an abstraction of a factored MDP that affords efficient planning. We then describe how to learn constraints to impose so that the CAMP optimizes a trade-off between rewards and computational cost. Our experiments consider five planners across four domains, including robotic navigation among movable obstacles (NAMO), robotic task and motion planning for sequential manipulation, and classical planning. We find that planning with learned CAMPs consistently outperforms baselines, including Stilman’s NAMO-specific algorithm.
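The abstraction step the abstract describes can be sketched roughly as follows. This is a hypothetical illustration, not code from the paper: it assumes a constraint simply severs the influence of the constrained variables on the factored transition model, so the variables worth keeping are those reachable backwards from the reward through the remaining dependencies. All names below (relevant_variables, parents, reward_vars, constrained_vars, the NAMO-style variable names) are invented for this example.

    # Hypothetical sketch of context-specific irrelevance pruning in a factored MDP.
    # parents[v] is the set of variables that variable v's transition depends on.
    # Under the (assumed) constraint, constrained variables no longer influence
    # the rest of the system, so they are dropped from every parent set.

    def relevant_variables(parents, reward_vars, constrained_vars):
        """Return the variables reachable backwards from the reward variables
        through the dependency graph, ignoring constrained variables."""
        relevant = set(reward_vars)
        frontier = list(reward_vars)
        while frontier:
            v = frontier.pop()
            for p in parents.get(v, set()) - set(constrained_vars):
                if p not in relevant:
                    relevant.add(p)
                    frontier.append(p)
        return relevant

    # Toy NAMO-like example: if the constraint forbids interacting with obstacle2,
    # its pose becomes irrelevant to the reward-relevant dynamics.
    parents = {
        "robot": {"robot", "obstacle1"},
        "obstacle1": {"robot", "obstacle1"},
        "obstacle2": {"robot", "obstacle2"},
        "goal_reached": {"robot"},
    }
    print(relevant_variables(parents, {"goal_reached"}, constrained_vars={"obstacle2"}))
    # -> {'goal_reached', 'robot', 'obstacle1'}

Planning would then proceed in the MDP projected onto the returned variable set; learning which constraints to impose, so that this projection trades a small loss in reward for a large saving in computation, is the problem the paper addresses.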
Publisher
Massachusetts Institute of Technology
Issue Date
2020-11
Language
English
Citation
4th Conference on Robot Learning (CoRL 2020)
URI
http://hdl.handle.net/10203/280573
Appears in Collection
RIMS Conference Papers
Files in This Item
There are no files associated with this item.
