
Many researchers use human videos to train their robots, and it is often easier to generate that video with an AI model than to record a person performing the motion. That is exactly what this clip from Carlos DP shows: a Unitree robot trained to perform a tricky, dynamic motion from a video generated with Veo 3.1 Fast.
So, AI video -> GVHMR -> sim2real motion tracking works, btw
This is a very tricky and dynamic motion that was trained on a video I generated on @fal using Google Veo 3.1 Fast
You can literally do prompt -> robot motion now, this is crazy lol pic.twitter.com/bvrrem4d3i
— Carlos DP (@the_carlosdp) November 3, 2025
That means you can now teach a robot new, dynamic moves from a simple text prompt, with no human demonstration required.
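For readers who want a sense of how the prompt -> robot motion pipeline fits together, here is a minimal sketch. It only mirrors the steps named in the tweet (AI video, GVHMR, sim2real motion tracking); the fal endpoint id, the response shape, the GVHMR script path, and the retargeting/tracking helpers are all assumptions, not details from Carlos DP's setup.

```python
# Sketch of: text prompt -> AI video -> GVHMR motion recovery -> robot motion.
# Endpoint id, response keys, script paths, and helper names are assumptions.
import subprocess
import urllib.request

import fal_client  # pip install fal-client; needs the FAL_KEY environment variable

# 1) Generate a short human-motion video from a text prompt with Veo 3.1 Fast on fal.
#    The endpoint id and argument names below are assumptions; check fal's model docs.
result = fal_client.subscribe(
    "fal-ai/veo3.1/fast",
    arguments={"prompt": "a person doing a spinning hook kick, full body, side view"},
)
video_url = result["video"]["url"]  # response shape is an assumption
urllib.request.urlretrieve(video_url, "generated_motion.mp4")

# 2) Recover 3D human motion from the video with GVHMR.
#    The demo script path and flag are assumptions based on the GVHMR repo layout.
subprocess.run(
    ["python", "tools/demo/demo.py", "--video", "generated_motion.mp4"],
    check=True,
)

# 3) Retarget the recovered human motion to the robot and train a sim2real
#    motion-tracking policy. These helpers are hypothetical placeholders for
#    whatever retargeting and RL tracking stack you actually use.
# robot_motion = retarget_human_to_unitree("outputs/demo/motion.npz")
# policy = train_motion_tracking_policy(robot_motion)
```

The point of the sketch is the shape of the pipeline, not the specific calls: a generated video stands in for a human demonstration, GVHMR turns it into 3D motion, and a motion-tracking policy trained in simulation carries it onto the real robot.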