Robust, Scalable, Distributed Semantic Mapping for Search-and-Rescue and Manufacturing Co-Robots

Overview

The goal of this project is to enable multiple co-robots to map and understand the environment they are in so they can efficiently collaborate among themselves and with human operators in education, medical assistance, agriculture, and manufacturing applications. The first distinctive characteristic of this project is that the environment will be modeled semantically, that is, it will contain human-interpretable labels (e.g., object category names) in addition to geometric data. This will be achieved through a novel, robust integration of methods from both computer vision and robotics, allowing easier communication between robots and humans in the field. The second distinctive characteristic of this project is that the increased computational load due to the addition of human-interpretable information will be handled by judiciously approximating and spreading the computations across the entire network. The newly developed methods will be evaluated by emulating real-world scenarios in manufacturing and in search-and-rescue operations, leading to potential benefits for large segments of society. The project will include opportunities for training students at the high-school, undergraduate, and graduate levels by promoting the development of marketable skills.

This is a joint project with Dario Pompili at Rutgers University.

Rotational Outlier Identification in Pose Graphs using Dual Decomposition

The first result of this project is an outlier detection algorithm that finds
incorrect relative poses in pose graphs by checking the geometric consistency of
their cycles. This is an essential step in mapping
unknown environments with single or multiple robots, as incorrectly estimated
relative poses lead to failure in mapping techniques such as
Structure from Motion (SfM) or Simultaneous Localization and Mapping (SLAM).
Increasing the robustness of this process is an active research area, as
incorrect measurements are unavoidable due to feature mismatch or repetitive
structures in the environment.

Our algorithm provides:

  1. A novel probabilistic framework for detecting outliers from the rotational component of relative poses, which uses the geometric consistency of cycles in the pose graph as evidence.
  2. A novel inference algorithm, of independent interest, which uses dual decomposition to break the inference problem into smaller sub-problems that can be solved in parallel and in a distributed fashion.
  3. An Expectation-Maximization scheme to iteratively fine-tune the distribution parameters and prior information.
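The cycle-consistency evidence can be illustrated with a minimal sketch (assuming relative rotations are given as 3x3 matrices): composing the relative rotations around a cycle of the pose graph should yield the identity, so the angle of the residual rotation serves as an outlier score for that cycle.

```python
import numpy as np
from scipy.spatial.transform import Rotation

def cycle_rotation_error(relative_rotations):
    """Compose the relative rotations around a cycle and return the
    geodesic distance (in radians) between the result and the identity.
    A geometrically consistent cycle gives an error near zero; a large
    error suggests at least one outlier measurement on the cycle."""
    composed = np.eye(3)
    for R in relative_rotations:
        composed = composed @ R
    # Angle of the residual rotation: acos((trace(R) - 1) / 2).
    cos_angle = np.clip((np.trace(composed) - 1.0) / 2.0, -1.0, 1.0)
    return np.arccos(cos_angle)

# Consistent cycle: three rotations about z that compose to the identity.
R1 = Rotation.from_euler("z", 40, degrees=True).as_matrix()
R2 = Rotation.from_euler("z", 30, degrees=True).as_matrix()
R3 = Rotation.from_euler("z", -70, degrees=True).as_matrix()
print(cycle_rotation_error([R1, R2, R3]))       # ~0: consistent

# Corrupt one edge to simulate an outlier relative pose.
R2_bad = Rotation.from_euler("z", 120, degrees=True).as_matrix()
print(cycle_rotation_error([R1, R2_bad, R3]))   # ~pi/2: outlier present
```

This sketch scores a single cycle; the framework above turns such scores, aggregated over many cycles, into probabilistic evidence for which individual edges are outliers.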

Publications

Software

We developed pySLAM-D, an open-source SLAM pipeline that addresses two gaps in existing SLAM packages (such as ORB-SLAM):

  1. The main implementation is in Python and has a very modular architecture. This allows faster development and implementation of research ideas, including integration with modern learning frameworks such as TensorFlow and PyTorch.
  2. It incorporates recent advances in individual modules (such as the TEASER algorithm for point cloud alignment).

For further details see: pySLAM-D: an open-source library for localization and mapping with depth cameras.
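As an illustration of what the point cloud alignment module must compute, here is a minimal sketch using the classic SVD-based Kabsch method (not the TEASER algorithm itself, which additionally handles outlier correspondences):

```python
import numpy as np

def kabsch_align(source, target):
    """Rigidly align `source` (N x 3) onto `target` (N x 3), assuming known
    point correspondences, via the SVD-based Kabsch method.
    Returns a rotation R and translation t with target ~ source @ R.T + t."""
    src_centroid = source.mean(axis=0)
    tgt_centroid = target.mean(axis=0)
    # Cross-covariance of the centered clouds.
    H = (source - src_centroid).T @ (target - tgt_centroid)
    U, _, Vt = np.linalg.svd(H)
    # Reflection correction keeps R a proper rotation (det = +1).
    d = np.sign(np.linalg.det(Vt.T @ U.T))
    R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
    t = tgt_centroid - R @ src_centroid
    return R, t

# Synthetic check: rotate and translate a random cloud, then recover the motion.
rng = np.random.default_rng(0)
cloud = rng.standard_normal((50, 3))
theta = np.deg2rad(25)
R_true = np.array([[np.cos(theta), -np.sin(theta), 0.0],
                   [np.sin(theta),  np.cos(theta), 0.0],
                   [0.0,            0.0,           1.0]])
t_true = np.array([0.5, -1.0, 2.0])
moved = cloud @ R_true.T + t_true
R_est, t_est = kabsch_align(cloud, moved)
```

With exact correspondences and no noise, the estimated rotation and translation match the true motion; TEASER is designed for the harder setting where many correspondences are wrong.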

SLAM Dataset

We have collected a new RGB-D SLAM dataset in the BU Robotics Lab. The full dataset can be downloaded at the following link:

The BU Robotics Lab SLAM dataset

Personnel information (BU)

Tron, Roberto (PI)
Serlin, Zachary (Graduate Student)
Yang, Guang (Graduate Student)
Sookraj, Brandon (Research Experience for Undergraduates (REU) Participant)
Jin, Hanchong (Research Experience for Undergraduates (REU) Participant)
Wallace, Gordon (Undergraduate Student)
Zhu, Jialin (Undergraduate Student)
Gerontis, Constantinos (Undergraduate Student)
Pietraski, Miranda (High School Student)

Funding and support

This project is supported by the National Science Foundation grant “Robust, Scalable, Distributed Semantic Mapping for Search-and-Rescue and Manufacturing Co-Robots” (Award number 1734454).

Disclaimer: Any opinions, findings, and conclusions or recommendations expressed in this material are those of the author(s) and do not necessarily reflect the views of the National Science Foundation.