The gym is full of people doing squats wrong. Knees caving in, backs rounding, hips drifting off-center. Some get away with it. Others end up sidelined for months, wondering what went wrong. Now researchers at UC San Diego have built an AI system that could help keep athletes out of the trainer’s office by generating personalized videos showing them exactly how to move.
The model, called BIGE (for Biomechanics-informed GenAI for Exercise Science), doesn't just spit out generic exercise clips. It combines generative AI with actual biomechanical constraints: things like how much force a muscle can produce or the angles joints can safely handle. Feed it motion-capture data of someone squatting, and it produces videos of movements tailored to avoid injury or speed recovery after one.
What Makes This Different From Fitness Apps
Most AI models tasked with generating human movements have a problem. They can make a motion look right, but the underlying physics might be completely off. Someone might appear to be performing a textbook squat while the forces on their knee joints are all wrong. BIGE is, to the researchers' knowledge, the only model that brings together generative AI with realistic biomechanics. The alternatives, methods that account for physics but don't use generative AI, require so much computational power that they're essentially unusable outside research labs.
To train the system, the team used motion-capture videos of people performing squats, then translated those movements onto three-dimensional skeletal models. By calculating the forces involved, they generated motions that are not just visually convincing but physically plausible. The yellow curves in their demonstration videos show hip joint movement through an entire squat cycle, and BIGE’s output looks smoother and more natural than existing baseline models.
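To get a feel for what "physically plausible" means in practice, here is a minimal sketch of one kind of biomechanical constraint a motion generator could penalize: joint angles straying outside a physiological range. The function name, the knee-flexion limits, and the squared-violation penalty are all illustrative assumptions, not details from the BIGE paper.

```python
import math

# Assumed safe knee-flexion range in degrees; illustrative, not from BIGE.
KNEE_LOW, KNEE_HIGH = 0.0, 140.0

def joint_limit_penalty(angles_deg, low, high):
    """Sum of squared violations of a joint's safe angular range.

    angles_deg: joint angles (degrees) sampled over a motion trajectory.
    Returns 0.0 when the whole trajectory stays inside [low, high].
    """
    penalty = 0.0
    for a in angles_deg:
        if a < low:
            penalty += (low - a) ** 2   # hyperextension beyond the range
        elif a > high:
            penalty += (a - high) ** 2  # flexion beyond the range
    return penalty

# A squat-like knee trajectory: flex to ~120 degrees and return to standing.
squat = [120.0 * math.sin(math.pi * i / 49) for i in range(50)]
print(joint_limit_penalty(squat, KNEE_LOW, KNEE_HIGH))  # 0.0: stays in range

# An exaggerated motion overshooting 140 degrees incurs a nonzero penalty.
overshoot = [160.0 * math.sin(math.pi * i / 49) for i in range(50)]
print(joint_limit_penalty(overshoot, KNEE_LOW, KNEE_HIGH) > 0.0)
```

A generative model trained with a term like this is pushed away from motions that merely look right toward ones a real skeleton could perform; the actual system also accounts for muscle forces, which this toy penalty omits.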
Andrew McCulloch, distinguished professor in the Shu Chien-Gene Lay Department of Bioengineering at UC San Diego and senior author on the work, predicts: “This approach is going to be the future.”
Beyond Squats and Into Rehab
Right now, BIGE works with squats. Next on the agenda: expanding to other movements and personalizing the models for specific individuals. The applications could stretch well beyond athletes. Rose Yu, a professor in UC San Diego’s Department of Computer Science and Engineering and another senior author, notes the methodology has broad potential. In one example, it could help assess fall risk in elderly populations.
Yu adds: “This methodology could be used by anyone.”
The team presented their work at the Learning for Dynamics & Control Conference at the University of Michigan in Ann Arbor. For athletes dealing with injuries, the system could generate movements that let them continue training while protecting damaged tissue. For those trying to avoid injuries in the first place, it offers a way to see what good form actually looks like for their specific body.
It’s one thing to be told your squat form needs work. It’s another to see a video generated specifically for you, showing the exact adjustments your joints and muscles need to make. Whether that will actually keep people injury-free remains to be seen, but at least the physics checks out.
Journal: Proceedings of Machine Learning Research, Vol. 283, pp. 1243-1256
Conference: 7th Annual Learning for Dynamics & Control Conference
Full paper: https://proceedings.mlr.press/v283/maheshwari25a.html