Robots Learn Dexterity from Smart Glasses (AINA)

Robots that learn by watching you

What if multi-fingered robots could pick up everyday skills just by watching people in the real world? A new framework, AINA, takes a significant step toward that goal.

Using lightweight Aria Gen 2 smart glasses, anyone can record natural demonstrations at home, in the office, or on the go. The glasses provide high-resolution video, accurate 3D head and hand poses, and a wide stereo view for depth: rich cues that AINA turns into robot skills.

  • No robot data required: policies are learned directly from human videos—no simulation, reinforcement learning, or online corrections.
  • 3D point-based policies that transfer across changing backgrounds and scenes.
  • Demonstrated across nine everyday manipulation tasks with multi-fingered robot hands.
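To make the "3D point-based policy" idea concrete, here is a deliberately minimal sketch in plain NumPy. All names, dimensions, and the linear architecture are hypothetical simplifications, not AINA's actual model: the point is that a policy consuming 3D keypoints expressed relative to a scene anchor ignores pixel backgrounds entirely, which is one intuition for why point inputs transfer across scenes.

```python
import numpy as np

def normalize_points(points, anchor):
    """Express 3D points relative to a scene anchor so the policy is
    invariant to where the scene sits in the camera frame
    (hypothetical simplification of background/scene transfer)."""
    return points - anchor

class PointPolicy:
    """Hypothetical sketch: a linear map from flattened 3D keypoints
    to a hand action. Not AINA's actual architecture."""
    def __init__(self, n_points, action_dim, seed=0):
        rng = np.random.default_rng(seed)
        self.W = rng.normal(scale=0.01, size=(action_dim, n_points * 3))
        self.b = np.zeros(action_dim)

    def act(self, points):
        x = points.reshape(-1)          # (n_points, 3) -> (3 * n_points,)
        return self.W @ x + self.b

# Usage: 21 hand keypoints (e.g. a hand pose from the glasses)
# mapped to a 16-DoF multi-fingered hand command.
policy = PointPolicy(n_points=21, action_dim=16)
keypoints = np.zeros((21, 3))
anchor = keypoints.mean(axis=0)
action = policy.act(normalize_points(keypoints, anchor))
assert action.shape == (16,)
```

Because the input is a set of 3D points rather than raw pixels, swapping the tabletop or the room behind the object leaves the policy's input unchanged, which is the transfer property the bullet above describes.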

Why it matters: this helps close the human-to-robot embodiment gap and cuts costly lab data collection, pushing general-purpose robot manipulation closer to real homes and workplaces.

Watch the rollouts: https://aina-robot.github.io • Read the paper: https://arxiv.org/abs/2511.16661v1

#Robotics #AI #RobotLearning #Manipulation #ComputerVision #Wearables #AR #ImitationLearning #EmbodimentGap
