Tactile-sensing robotics for food automation and object manipulation
FingerVision builds tactile sensors for robots by estimating touch sensation from camera images—enabling machines to handle delicate tasks like food picking that require human-level dexterity. The company is university-rooted (founded 2021) and engineering-dominant, with a focused stack (ROS, Python, C/C++) and active projects spanning food factory automation, motion control, and inference optimization. Core technical challenges center on scaling image processing for real-time tactile sensing and improving model accuracy in production environments.
FingerVision develops proprietary tactile sensors that allow robots to perceive force and slip distribution through vision-based estimation. The core innovation—inferring touch sensation from camera input—removes the need for complex mechanical sensors, reducing cost and fragility in collaborative robot systems. The company targets food manufacturing and general manipulation tasks, building software platforms and hardware integrations (robot racks, feeders, jackets) to support factory-scale deployment. Based in Tokyo with engineering talent also in India, the team is actively shipping food-picking automation systems and iterating on next-generation production workflows.
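The core idea above can be sketched in code. Vision-based tactile sensors of this kind typically track markers embedded in a transparent elastomer with a camera: marker displacement under contact approximates local shear force. The sketch below is a minimal, hypothetical illustration of that principle using a linear spring model; the function names and stiffness constant are illustrative assumptions, not FingerVision's actual API or calibration.

```python
# Hypothetical sketch of vision-based tactile estimation. A camera observes
# markers embedded in a transparent elastomer; when an object presses the
# surface, markers shift. Under a linear elastic assumption, per-marker
# shear force is proportional to 2D marker displacement. The stiffness
# value here is an arbitrary illustrative constant.

def estimate_forces(rest, current, stiffness=0.5):
    """Estimate per-marker shear force vectors (fx, fy) from marker
    positions at rest vs. under contact, via a linear spring model."""
    forces = []
    for (x0, y0), (x1, y1) in zip(rest, current):
        dx, dy = x1 - x0, y1 - y0
        forces.append((stiffness * dx, stiffness * dy))
    return forces

def net_force(forces):
    """Sum per-marker forces into a net shear estimate for the contact."""
    fx = sum(f[0] for f in forces)
    fy = sum(f[1] for f in forces)
    return fx, fy

# Example: three markers displaced by an object sliding right and down.
rest = [(10.0, 10.0), (20.0, 10.0), (10.0, 20.0)]
pressed = [(11.0, 10.5), (21.0, 10.5), (11.0, 20.5)]
per_marker = estimate_forces(rest, pressed)
print(net_force(per_marker))  # → (1.5, 0.75)
```

In a real system, marker positions would come from image processing (e.g., blob detection and optical flow on the camera feed), and slip would be detected from the time derivative of the displacement field; this sketch only shows the displacement-to-force step.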
The stack centers on ROS and ROS 2 for robot middleware, Python and C/C++ for core logic, PLCs for hardware control, Qt for GUIs, and Fusion 360 for mechanical CAD. RViz and JavaScript also appear in the stack.
Active projects include food-picking robot automation and factory deployment, real-time control systems, motion-generation algorithms, cloud-based monitoring of operation data, and a software-platform redesign to support scaling.
The company is headquartered in Koto, Tokyo, Japan, also hires engineering talent in India, and was founded in 2021 as a university-launched start-up.