Task-Aware Semantic Map: Autonomous Robot Task Assignment Beyond Commands

All authors are with Hanyang University.
🎉 Congratulations! 🎉
Our paper has been accepted to the International Conference on Robotics and Automation (ICRA) 🤭

This video is attached as supplementary material for ICRA.

Abstract

Task-Aware Semantic Map

With recent advancements in Large Language Models, task planning methods that interpret human commands have garnered significant attention. However, as home robots become more common, specifying every daily task could become impractical. This paper introduces a novel semantic map called the Task-Aware Semantic Map (TASMap), which enables robots to autonomously assign and propose necessary tasks in a scene without explicit human commands. The core innovation of this approach is the ability of TASMap to comprehend the context of objects within a scene and autonomously generate task proposals. This capability significantly advances autonomous robotic assistance, reducing dependency on explicit commands and enhancing interaction with the environment.
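As a rough, hypothetical illustration of the idea (not the paper's released implementation), the sketch below models one TASMap entry as an object entity paired with a task-significance vector over the task vocabulary listed further down this page; the class, field names, and example values are assumptions for illustration only.

```python
from dataclasses import dataclass, field
import numpy as np

# Task vocabulary shown on this page (ordering is an assumption for illustration).
TASKS = ["Leave", "Fold", "Mop", "Dispose", "Relocate", "Turn off",
         "Wash", "Vacuum", "Reorient", "Wipe", "Close", "Empty"]

@dataclass
class ObjectEntity:
    """Hypothetical TASMap entry: a labeled 3D object with a task-significance vector."""
    label: str                      # e.g. "towel"
    centroid: np.ndarray            # 3D position in the map frame
    task_vector: np.ndarray = field(
        default_factory=lambda: np.zeros(len(TASKS)))  # one score per task

    def proposed_tasks(self, threshold: float = 0.5):
        """Return tasks whose significance exceeds the threshold."""
        return [t for t, s in zip(TASKS, self.task_vector) if s > threshold]

# Usage: a towel left on the floor after a shower might score high on Relocate.
towel = ObjectEntity("towel", centroid=np.array([1.2, 0.4, 0.0]))
towel.task_vector[TASKS.index("Relocate")] = 0.9
print(towel.proposed_tasks())   # ['Relocate']
```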

Task-Aware Semantic Map Construction


Overview of the proposed framework. In the task generation module, egocentric RGB images serve as inputs, and tasks are associated with each object present in the image. Depth and camera pose inputs are used in the semantic fusion module, where the task generation results are combined with object entities to create the TASMap. On the right, object entities are represented as blue spheres with bar graphs indicating their task-significant vectors.
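To make the fusion step concrete, here is a minimal sketch of how per-frame detections could be lifted into a 3D map using depth and camera pose; it is not the paper's code, and `task_generation`, the map class, and the merge rule (max over observations) are placeholders chosen for illustration.

```python
import numpy as np

def backproject(depth, mask, K, T_world_cam):
    """Back-project masked depth pixels into world coordinates with a pinhole model.
    K is the 3x3 intrinsic matrix, T_world_cam the 4x4 camera-to-world pose."""
    v, u = np.nonzero(mask)
    z = depth[v, u]
    x = (u - K[0, 2]) * z / K[0, 0]
    y = (v - K[1, 2]) * z / K[1, 1]
    pts_cam = np.stack([x, y, z, np.ones_like(z)], axis=0)   # 4 x N homogeneous points
    return (T_world_cam @ pts_cam)[:3].T                     # N x 3 world points

class ToyTASMap:
    """Toy map: each entity keeps a running-mean centroid and an accumulated task vector."""
    def __init__(self, num_tasks):
        self.entities = {}  # label -> {"centroid", "task_vector", "count"}
        self.num_tasks = num_tasks

    def merge_entity(self, label, points_3d, task_vector):
        centroid = points_3d.mean(axis=0)
        e = self.entities.setdefault(
            label, {"centroid": np.zeros(3),
                    "task_vector": np.zeros(self.num_tasks), "count": 0})
        e["count"] += 1
        e["centroid"] += (centroid - e["centroid"]) / e["count"]      # running mean
        e["task_vector"] = np.maximum(e["task_vector"], task_vector)  # keep strongest evidence

# Per-frame loop (task_generation is a placeholder for the task generation module):
# for rgb, depth, pose in sequence:
#     for mask, label, task_vec in task_generation(rgb):
#         tasmap.merge_entity(label, backproject(depth, mask, K, pose), task_vec)
```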

Task-Aware Semantic Map

House #1: Post-Shower

House #2: Odorous

House #3: Post-Exercise

This video walks through the interior of a TASMap, showing the labels of the object entities and their assigned tasks. The bar above each object entity represents its task-significant vector, while the sphere at the center of each object entity occupies 30% of its volume.
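The 30%-of-volume sphere implies a simple radius computation; a small worked sketch under the assumption that an entity's volume is approximated by its axis-aligned bounding box (the function name and inputs are illustrative):

```python
import numpy as np

def marker_radius(bbox_min, bbox_max, fraction=0.3):
    """Radius of a sphere whose volume is `fraction` of the entity's volume,
    here approximated by its axis-aligned bounding-box volume (an assumption)."""
    volume = np.prod(np.asarray(bbox_max) - np.asarray(bbox_min))
    return (3.0 * fraction * volume / (4.0 * np.pi)) ** (1.0 / 3.0)

# Example: a 0.4 m x 0.4 m x 0.5 m object -> sphere of roughly 0.18 m radius.
print(round(marker_radius([0, 0, 0], [0.4, 0.4, 0.5]), 3))
```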

TASMap Explorer

Leave
Fold
Mop
Dispose
Relocate
Turn off
Wash
Vacuum
Reorient
Wipe
Close
Empty

Pressing the Task-Significant Vector button toggles the task-significant vectors in the map on and off. Pressing the Top-Down View button switches the map to a top-down view.
Controls: Click + Drag = Rotate · Ctrl + Drag = Translate · Scroll Up/Down = Zoom In/Out

Dataset Generation for Simulation Evaluation

To evaluate the performance of TASMap, we produce a new dataset that simulates realistically cluttered residential environments using the OmniGibson simulator. Five distinct house samples from the 3D-FRONT dataset are selected so that the dataset covers various room types and sizes. We artificially place objects, dust, and liquids from BEHAVIOR-1K to create cluttered environments. We construct a total of 35 houses, each consisting of 5 to 7 rooms. These houses are populated with a diverse array of objects spanning over 200 categories. Each house is designed to reflect a specific concept. For instance, houses with the 'Post-Cooking' concept predominantly contain tasks such as Wash, Wipe, and Vacuum, while the 'Post-Shower' concept houses primarily include tasks like Mop and Relocate. Other concepts, such as 'Post-Exercise', 'Odorous', and 'Laundry-Filled', are also included.
Controls: Click an image = Show the task distribution for that concept
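The concept-driven task distributions described above could be captured in a simple generation config; the sketch below is hypothetical, and the numeric weights and the task mixes for concepts not spelled out on this page are guesses for illustration, not the released dataset specification.

```python
import random

# Illustrative task-frequency weights per house concept. 'Post-Cooking' and
# 'Post-Shower' follow the description above; the remaining mixes and all
# numeric weights are assumptions.
CONCEPT_TASK_WEIGHTS = {
    "Post-Cooking":   {"Wash": 0.4, "Wipe": 0.3, "Vacuum": 0.3},
    "Post-Shower":    {"Mop": 0.6, "Relocate": 0.4},
    "Post-Exercise":  {"Relocate": 0.5, "Wash": 0.5},
    "Odorous":        {"Dispose": 0.6, "Empty": 0.4},
    "Laundry-Filled": {"Fold": 0.5, "Wash": 0.5},
}

def sample_clutter_tasks(concept, num_items, seed=0):
    """Sample which task each placed clutter item should induce, according to the
    concept's task distribution. Actual object/dust/liquid placement would be done
    in the simulator (e.g. OmniGibson with BEHAVIOR-1K assets)."""
    rng = random.Random(seed)
    tasks, weights = zip(*CONCEPT_TASK_WEIGHTS[concept].items())
    return rng.choices(tasks, weights=weights, k=num_items)

print(sample_clutter_tasks("Post-Shower", num_items=5))  # e.g. ['Mop', 'Mop', 'Relocate', ...]
```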

BibTeX

Coming soon!