Learning Gentle Grasping Using Vision, Sound, and Touch

Date

2025-03-11

Publisher

Technische Universität Dresden

Abstract

This dataset contains 1,500 robotic grasps collected for the paper "Learning Gentle Grasping Using Vision, Sound, and Touch". In addition, we provide a description of the dataset and Python scripts to visualize the data and to process the raw data into a training dataset for a PyTorch model. The robotic system consists of a 16-DoF multi-fingered robotic hand (Allegro Hand v4.0), a 7-DoF robotic arm (xArm7), DIGIT tactile sensors, an RGB-D camera (Intel RealSense D435i), and a commodity microphone. The target object is a toy that emits sound when grasped strongly.
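
The processed training data could be consumed in PyTorch along the following lines. This is a minimal, illustrative sketch only: the directory layout, file naming, and array keys (rgb, depth, tactile, audio, label) are assumptions for illustration and do not necessarily match the output of the provided scripts.

import glob
import os

import numpy as np
import torch
from torch.utils.data import DataLoader, Dataset


class GentleGraspDataset(Dataset):
    """Sketch of a multimodal grasp dataset (vision, sound, touch).

    Assumes each grasp has been preprocessed into a single .npz file
    holding 'rgb', 'depth', 'tactile', 'audio', and 'label' arrays;
    the actual layout of this dataset may differ.
    """

    def __init__(self, root):
        # One .npz file per grasp, e.g. root/grasp_0000.npz (hypothetical naming).
        self.files = sorted(glob.glob(os.path.join(root, "*.npz")))

    def __len__(self):
        return len(self.files)

    def __getitem__(self, idx):
        sample = np.load(self.files[idx])
        return {
            "rgb": torch.from_numpy(sample["rgb"]).float(),          # RealSense RGB image
            "depth": torch.from_numpy(sample["depth"]).float(),      # RealSense depth map
            "tactile": torch.from_numpy(sample["tactile"]).float(),  # DIGIT sensor images
            "audio": torch.from_numpy(sample["audio"]).float(),      # microphone recording
            "label": torch.as_tensor(sample["label"]).long(),        # grasp outcome label
        }


# Typical usage: batch the grasps for training.
# loader = DataLoader(GentleGraspDataset("processed/"), batch_size=32, shuffle=True)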

License

Attribution-NonCommercial-NoDerivatives 4.0 International