PressureVision++: Estimating Fingertip Pressure from Diverse RGB Images

Georgia Institute of Technology · Meta Reality Labs

PressureVision++ estimates whether and how hard fingertips are making contact from a single RGB image.

Abstract

Touch plays a fundamental role in human manipulation; however, machine perception of contact and pressure typically requires invasive sensors. Recent research has shown that deep models can estimate hand pressure from a single RGB image. However, evaluations have been limited to controlled settings, since collecting diverse data with ground-truth pressure measurements is difficult. We present a novel approach that enables diverse data to be captured with only an RGB camera and a cooperative participant. Our key insight is that people can be prompted to apply pressure in a certain way, and this prompt can serve as a weak label to supervise models to perform well under varied conditions. We collect a novel dataset with 51 participants making fingertip contact with diverse objects. Our network, PressureVision++, outperforms human annotators and prior work. We also demonstrate an application of PressureVision++ to mixed reality, in which pressure estimation allows everyday surfaces to be used as arbitrary touch-sensitive interfaces.

Video

Demo

Want to try out our model yourself? Our GitHub repo includes instructions for setting up our system with your own hardware!

BibTeX

@inproceedings{grady2024pressurevision2,
  author    = {Grady, Patrick and Collins, Jeremy A and Tang, Chengcheng and Twigg, Christopher D and Aneja, Kunal and Hays, James and Kemp, Charles C},
  title     = {{PressureVision++}: Estimating Fingertip Pressure from Diverse RGB Images},
  booktitle = {IEEE/CVF Winter Conference on Applications of Computer Vision (WACV)},
  year      = {2024},
}