Researchers at the University of California San Diego have developed a model that uses generative AI to reduce the risk of injury among athletes. The model, known as BIGE (Biomechanics-informed GenAI for Exercise Science), is designed to improve training techniques and support rehabilitation after injuries.
BIGE combines athlete movement data with biomechanical principles, including the limits of muscle force generation. By analyzing motion-capture videos of individuals performing exercises such as squats, the model generates realistic movement videos that athletes can emulate. These generated examples are meant to help athletes train safely and effectively, minimizing the potential for injury.
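To make the idea concrete, here is a minimal, hypothetical sketch of how a generative model's training objective might combine motion reconstruction with a muscle-force constraint. The function and variable names (estimate_joint_torques, TORQUE_LIMITS) and all numbers are illustrative assumptions, not details of the BIGE implementation.

```python
# Hypothetical sketch: a generative training loss that rewards matching
# captured motion while penalizing motions whose implied joint torques
# exceed muscle-force limits. Names and numbers are placeholders.

import numpy as np

# Illustrative per-joint torque limits (e.g. hip, knee, ankle), in N*m
TORQUE_LIMITS = np.array([150.0, 250.0, 120.0])


def estimate_joint_torques(motion):
    """Placeholder: a real system would run inverse dynamics on a skeletal model."""
    velocities = np.gradient(motion, axis=0)
    return 50.0 * velocities  # crude proportionality, for illustration only


def biomechanics_informed_loss(generated_motion, captured_motion, lam=0.1):
    """Reconstruction error plus a penalty for physiologically implausible torques."""
    # How closely the generated joint trajectories match the motion-capture data
    reconstruction = np.mean((generated_motion - captured_motion) ** 2)

    # Penalize only the portion of torque that exceeds the muscle-force limits
    torques = estimate_joint_torques(generated_motion)
    violation = np.maximum(np.abs(torques) - TORQUE_LIMITS, 0.0)
    constraint_penalty = np.mean(violation ** 2)

    return reconstruction + lam * constraint_penalty


if __name__ == "__main__":
    rng = np.random.default_rng(0)
    captured = rng.normal(size=(100, 3))  # 100 frames x 3 joint angles
    generated = captured + rng.normal(scale=0.05, size=(100, 3))
    print(biomechanics_informed_loss(generated, captured))
```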
Advancements in AI-Driven Training
The model’s ability to produce tailored motion patterns is a significant leap forward in sports science. Athletes can use BIGE to identify optimal movements that not only enhance performance but also reduce injury risks. Furthermore, the model can suggest alternative movements for athletes recovering from injuries, allowing them to maintain their training regimens safely.
According to Andrew McCulloch, a distinguished professor in the Shu Chien-Gene Lay Department of Bioengineering at UC San Diego and a senior author of the research, “This approach is going to be the future.” The researchers believe that BIGE is unique because it effectively merges generative AI with accurate biomechanical data, a combination that previous models have struggled to achieve.
Towards Personalized Solutions for All
The methodology behind BIGE is not limited to athletic training; it holds potential for broader applications, including assessing fall risks in elderly individuals. Rose Yu, a professor in the UC San Diego Department of Computer Science and Engineering and another senior author, stated, “This methodology could be used by anyone.” This versatility underscores the model’s potential to impact various aspects of health and fitness.
Researchers trained BIGE using extensive data from motion-capture systems, translating these movements onto 3D skeletal models to generate realistic motion patterns. This process ensures that the generated motions adhere to the anatomical and mechanical constraints of human movement, setting the approach apart from many existing generative AI models.
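As a rough illustration of what adhering to anatomical constraints can mean in practice, the sketch below clamps generated joint angles to plausible ranges of motion for a simplified sagittal-plane skeletal model. The joint names and limits are assumed for illustration and are not taken from the published BIGE model.

```python
# Hypothetical sketch: keeping generated joint angles within anatomical
# ranges for a simplified skeletal model. Joint names and ranges are
# illustrative assumptions, not values from BIGE.

import numpy as np

# Approximate sagittal-plane ranges of motion in degrees (illustrative)
JOINT_LIMITS = {
    "hip_flexion": (-20.0, 120.0),
    "knee_flexion": (0.0, 140.0),
    "ankle_dorsiflexion": (-50.0, 20.0),
}


def clamp_to_anatomy(frame):
    """Project one frame of generated joint angles onto the allowed ranges."""
    return {
        joint: float(np.clip(angle, *JOINT_LIMITS[joint]))
        for joint, angle in frame.items()
    }


# Example: a generated squat frame with a hyperextended knee gets corrected
frame = {"hip_flexion": 95.0, "knee_flexion": -5.0, "ankle_dorsiflexion": 15.0}
print(clamp_to_anatomy(frame))
```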
The team recently presented their findings at the Learning for Dynamics & Control Conference held at the University of Michigan in Ann Arbor. Looking ahead, the researchers plan to expand the model's capabilities beyond squats and to personalize it for individual users.
BIGE represents a significant advancement in the intersection of technology and sports, promising not only to enhance athletic performance but also to improve overall safety and rehabilitation practices in sports and beyond.