Paper Title

Deep compositional robotic planners that follow natural language commands

Paper Authors

Yen-Ling Kuo, Boris Katz, Andrei Barbu

Paper Abstract

We demonstrate how a sampling-based robotic planner can be augmented to learn to understand a sequence of natural language commands in a continuous configuration space to move and manipulate objects. Our approach combines a deep network structured according to the parse of a complex command that includes objects, verbs, spatial relations, and attributes, with a sampling-based planner, RRT. A recurrent hierarchical deep network controls how the planner explores the environment, determines when a planned path is likely to achieve a goal, and estimates the confidence of each move to trade off exploitation and exploration between the network and the planner. Planners are designed to have near-optimal behavior when information about the task is missing, while networks learn to exploit observations which are available from the environment, making the two naturally complementary. Combining the two enables generalization to new maps, new kinds of obstacles, and more complex sentences that do not occur in the training set. Little data is required to train the model despite it jointly acquiring a CNN that extracts features from the environment as it learns the meanings of words. The model provides a level of interpretability through the use of attention maps allowing users to see its reasoning steps despite being an end-to-end model. This end-to-end model allows robots to learn to follow natural language commands in challenging continuous environments.
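To make the exploitation/exploration trade-off concrete, below is a minimal Python sketch, not code from the paper: `policy_net`, `goal_net`, and `network_guided_rrt` are hypothetical names, the paper's recurrent hierarchical networks are replaced with toy heuristics, and collision checking against obstacles is omitted. The network's confidence decides whether the next RRT sample comes from the network (exploitation) or from uniform sampling (exploration), and a learned goal classifier decides when a planned path likely satisfies the command.

```python
import math
import random

def policy_net(q, goal):
    """Toy stand-in for the learned policy: propose a next configuration
    biased toward the goal, plus a confidence used to arbitrate between
    the network and the planner."""
    dx, dy = goal[0] - q[0], goal[1] - q[1]
    dist = math.hypot(dx, dy) or 1e-9
    step = min(0.5, dist)
    proposal = (q[0] + step * dx / dist + random.gauss(0, 0.05),
                q[1] + step * dy / dist + random.gauss(0, 0.05))
    confidence = 1.0 / (1.0 + dist)  # toy rule: more confident near the goal
    return proposal, confidence

def goal_net(q, goal):
    """Toy stand-in for the learned goal classifier: decide whether a
    configuration satisfies the commanded goal."""
    return math.hypot(q[0] - goal[0], q[1] - goal[1]) < 0.3

def nearest(tree, q):
    """Return the tree vertex closest to configuration q."""
    return min(tree, key=lambda v: math.hypot(v[0] - q[0], v[1] - q[1]))

def steer(q_near, q_rand, step=0.5):
    """Move from q_near toward q_rand by at most one step length."""
    dx, dy = q_rand[0] - q_near[0], q_rand[1] - q_near[1]
    dist = math.hypot(dx, dy)
    if dist <= step:
        return q_rand
    return (q_near[0] + step * dx / dist, q_near[1] + step * dy / dist)

def network_guided_rrt(start, goal, bounds=(0.0, 10.0), iters=2000):
    tree = {start: None}  # maps each vertex to its parent for path recovery
    for _ in range(iters):
        proposal, conf = policy_net(nearest(tree, goal), goal)
        if random.random() < conf:
            q_rand = proposal  # exploit: follow the network's suggestion
        else:
            q_rand = (random.uniform(*bounds),
                      random.uniform(*bounds))  # explore: uniform RRT sample
        q_near = nearest(tree, q_rand)
        q_new = steer(q_near, q_rand)
        tree.setdefault(q_new, q_near)
        if goal_net(q_new, goal):  # network judges the goal reached
            path = [q_new]
            while tree[path[-1]] is not None:
                path.append(tree[path[-1]])
            return list(reversed(path))
    return None

if __name__ == "__main__":
    path = network_guided_rrt((1.0, 1.0), (8.0, 8.0))
    print(f"waypoints: {len(path)}" if path else "no path found")
```

The key design point this sketch illustrates is the abstract's complementarity claim: when the network's confidence is low (e.g., early on, or in unfamiliar maps), sampling falls back to the planner's near-optimal uniform exploration; as the network's observations become informative, its proposals increasingly steer the search.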
