
HYPERBOT – A BENCHMARKING TESTBED FOR ACQUISITION OF ROBOT-CENTRIC HYPERSPECTRAL SCENE AND IN-HAND OBJECT DATA
Nathaniel Hanson, Tarik Kelestemur, Joseph Berman, Dominik Ritzenhoff, Taskin Padir | Institute for Experiential Robotics, Northeastern University, Boston, Massachusetts, USA

Abstract

Robots will benefit from identifying novel objects in their environments through multi-modal sensing capabilities. The overarching goal of this research is to accelerate multi-modal sensor data collection for general-purpose robots to infer material properties of objects they interact with. To this end, we designed a benchmarking testbed to enable a robot manipulator to perceive spectral and spatial characteristics of scene items. Our design includes the use of a push broom Visible to Near Infrared (VNIR) hyperspectral camera, co-aligned with a depth camera. This system enables the robot to process and segment spectral characteristics of items in a larger spatial scene. For more targeted item manipulation, we integrate a VNIR spectrometer into the fingertips of a gripper. By acquiring spectral signatures both at a distance and at grasp time, the robot can quickly correlate data from the two sensors, each of which has distinct quantum efficiencies and noise. Our approach to this challenge is a step towards using spectral data for enhanced grasp selection in cluttered environments and automated ground-truthing of hyperspectral sensor data. This paper describes our approach to the design of this benchmarking testbed. The project code and material list are located here: https://github.com/RIVeR-Lab/HyperBot.

Index Terms: Hyperspectral Imaging, Robot Spectroscopy, Grasp Planning, Sensor Fusion, Ground Truth Acquisition

Full paper HERE

Customer Application Q&A:

We had the opportunity to hear directly from Nathaniel Hanson, PhD, who worked on this application!

What was/is the goal of the project?

The goal of the project was to build out a comprehensive robot system to simultaneously acquire hyperspectral images of objects and recognize objects that the robot grasps in-hand. This project is the first step in a larger research thrust to understand the everyday world through multiple spectral sensors working together. Ultimately, we want to enable robots to use spectroscopy to build more intelligent models of their world.

How does our instrument fit into that goal?

We used a StellarNet BlueWave spectrometer in a specially designed robot gripper, together with a combination fiber bundle in which multiple illumination fibers surround a single read fiber. This allows us to understand the spectral signatures of objects as we approach them for grasping and to confirm we have selected the correct item.
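One common way to confirm that an in-hand spectrum matches an expected object signature is the Spectral Angle Mapper (SAM), which compares spectral shape independent of overall brightness, a useful property when sensors differ in gain and quantum efficiency. The sketch below is illustrative only; the function names and threshold are not from the HyperBot codebase.

```python
import math

def spectral_angle(measured, reference):
    """Spectral Angle Mapper (SAM): angle in radians between two spectra.

    Smaller angles mean more similar spectral shapes; SAM is insensitive
    to uniform intensity scaling, which helps when comparing sensors
    with different gains or illumination levels.
    """
    dot = sum(m * r for m, r in zip(measured, reference))
    norm_m = math.sqrt(sum(m * m for m in measured))
    norm_r = math.sqrt(sum(r * r for r in reference))
    # Clamp to [-1, 1] to guard against floating-point round-off.
    return math.acos(max(-1.0, min(1.0, dot / (norm_m * norm_r))))

def confirm_grasp(measured, references, threshold=0.1):
    """Return the label of the best-matching reference spectrum,
    or None if no reference is within the angular threshold (radians)."""
    best_label, best_angle = None, float("inf")
    for label, ref in references.items():
        angle = spectral_angle(measured, ref)
        if angle < best_angle:
            best_label, best_angle = label, angle
    return best_label if best_angle <= threshold else None
```

Because SAM ignores uniform scaling, a dimmer or brighter copy of a reference spectrum still matches its label, while a spectrally different object falls outside the threshold and is rejected.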

How was the process of implementing the instrument within the robotic system?

Super easy! I’ve worked with several spectrometer manufacturers, but I love the StellarNet SDK. It was straightforward to write a package that enables StellarNet spectrometers to be used within the Robot Operating System (ROS), the most widely used robot programming ecosystem in both academia and industry.

Are there any next steps for the robot?

For this robot, we don’t have any future plans, as we’ve updated the hyperspectral camera system to use snapshot hyperspectral imaging, which removes the need for the linear rail to translate the arm and camera over the scene. We are currently working on some novel robot gripper designs which will allow for precision alignment of spectral probes for automated spectral sampling over complex 3D geometries: https://arxiv.org/pdf/2403.17232.

Fig. 1. Spectral benchmarking testbed with architecture to acquire scene hyperspectral datacubes and in-hand spectral data. Note the axes orientations and cell dimensions.

Fig. 2. Multi-modal perception setup for sensing spatial-spectral characteristics of both the total workspace and in-hand objects. NB: Z-axis is perpendicular, coming out of the page.

Does the robot have a name/nickname other than HyperBot?

Nope! HyperBot is a portmanteau of “Hyperspectral” and “Robot”.

Do you have anything else you want to share about the robot or project?

We also used the StellarNet spectrometer in a different robot to identify ground terrain! Check out this piece of work, entitled VAST: https://drive.google.com/file/d/1CnuGO8yEm05sLcy206waLOzmC_X4UlIY/view
