Full metadata
Title
Autonomously Learning World-Model Representations For Efficient Robot Planning
Description
In today's world, robotic technology has become increasingly prevalent across fields such as manufacturing, warehousing, delivery, and household applications. Planning is crucial for robots to solve tasks in such difficult domains. However, most robots rely heavily on humans for the world models that enable planning. Consequently, creating such world models is not only expensive, requiring human experts who understand both the domain and the robot's limitations, but the resulting models may also be biased by human embodiment, which can be limiting for robots whose kinematics are not human-like.
This thesis answers the fundamental question: Can we learn such world models automatically? This research shows that complex world models can be learned directly from unannotated and unlabeled demonstrations containing only the configurations of the robot and the objects in the environment. The core contributions of this thesis are the first known approaches for i) task and motion planning that explicitly handles stochasticity, ii) automatically inventing neuro-symbolic state and action abstractions for deterministic and stochastic motion planning, and iii) automatically inventing relational and interpretable world models in the form of symbolic predicates and actions. This thesis also presents thorough and rigorous empirical experimentation. With experiments in both simulated and real-world settings, it demonstrates the efficacy and robustness of automatically learned world models in overcoming these challenges and generalizing beyond situations encountered during training.
Date Created
2024
Contributors
- Shah, Naman (Author)
- Srivastava, Siddharth (Thesis advisor)
- Kambhampati, Subbarao (Committee member)
- Konidaris, George (Committee member)
- Speranzon, Alberto (Committee member)
- Zhang, Yu (Committee member)
- Arizona State University (Publisher)
Extent
185 pages
Language
eng
Copyright Statement
In Copyright
Peer-reviewed
No
Open Access
No
Handle
https://hdl.handle.net/2286/R.2.N.193613
Level of coding
minimal
Note
Partial requirement for: Ph.D., Arizona State University, 2024
Field of study: Computer Science
System Created
- 2024-05-02 02:21:06
System Modified
- 2024-05-02 02:21:15