Robots Learn from Physics-Aware AI Videos (PhysWorld)

What's new

PhysWorld lets robots learn skills from AI-generated videos—without collecting any real robot data.

How it works

  • Given a task description and a single photo of the scene, a video generation model "imagines" a demonstration of the task.
  • PhysWorld reconstructs a physics-grounded world model from that generated video.
  • An object-centric learner converts the recovered object motions into precise, physically valid robot actions (see the sketch below).
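
The sketch below illustrates this three-stage flow in schematic Python. All names and data structures (imagine_demonstration, reconstruct_physics_world, ObjectMotion, RobotAction) are hypothetical placeholders standing in for the generation, reconstruction, and policy components; it is a toy illustration of the pipeline's shape, not the authors' implementation.

```python
# Toy sketch of the PhysWorld-style pipeline: generate a video, rebuild a
# physics-grounded scene from it, then derive object-centric robot actions.
# All functions and classes here are illustrative placeholders.

from dataclasses import dataclass
from typing import List, Tuple


@dataclass
class ObjectMotion:
    """One object's motion recovered from the generated video (stand-in)."""
    name: str
    start_pose: Tuple[float, float, float]
    end_pose: Tuple[float, float, float]


@dataclass
class RobotAction:
    """A single end-effector target (stand-in for a real action space)."""
    target: Tuple[float, float, float]
    grasp: bool


def imagine_demonstration(task: str, scene_image_path: str) -> List[str]:
    """Step 1: a video model 'imagines' a demo from the task + one photo.
    Here we just return fake frame identifiers instead of real frames."""
    return [f"frame_{i:03d}" for i in range(8)]


def reconstruct_physics_world(frames: List[str]) -> List[ObjectMotion]:
    """Step 2: rebuild a physics-grounded scene (geometry, object poses)
    from the generated video. Stubbed with one hard-coded object motion."""
    return [ObjectMotion("mug", start_pose=(0.4, 0.0, 0.02),
                         end_pose=(0.6, 0.2, 0.02))]


def object_centric_actions(motions: List[ObjectMotion]) -> List[RobotAction]:
    """Step 3: turn recovered object motions into robot commands.
    The real system refines these with learning; here we simply issue
    grasp-at-start, release-at-end for each object."""
    actions: List[RobotAction] = []
    for m in motions:
        actions.append(RobotAction(target=m.start_pose, grasp=True))   # reach and grasp
        actions.append(RobotAction(target=m.end_pose, grasp=False))    # move and release
    return actions


if __name__ == "__main__":
    frames = imagine_demonstration("put the mug on the shelf", "scene.jpg")
    motions = reconstruct_physics_world(frames)
    for action in object_centric_actions(motions):
        print(action)
```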

Why it matters

This converts vague pixels into grounded plans, enabling zero-shot, generalizable manipulation that respects real-world physics.

Results

Across diverse real-world tasks, PhysWorld significantly improves manipulation accuracy compared with prior approaches.

Learn more: pointscoder.github.io/PhysWorld_Web/ | arxiv.org/abs/2511.07416

Paper: http://arxiv.org/abs/2511.07416v1

#Robotics #AI #GenerativeAI #VideoGeneration #WorldModels #ReinforcementLearning #ComputerVision #EmbodiedAI #Manipulation #ZeroShot
