Need: Biomechanics is a common sub-discipline within movement science, biomedical engineering, and human factors that analyzes human body movement by applying the principles of Newtonian mechanics to physical activities. Nearly all undergraduate movement science students are required to complete one biomechanics course prior to graduation. However, because biomechanics courses are highly demanding in mathematics and physics, many undergraduates fear and delay taking their required biomechanics course, even though the biomechanical concepts it covers are necessary in their future professions. Therefore, there is a need to develop a student-centered educational tool that helps students overcome anxiety toward this subject matter and effectively delivers course content.
Guiding Question: Studies have suggested that the inclusion of laboratory experience improves learning outcomes in biomechanics due to the self-reference effect and self-efficacy theory. Students’ learning performance can be enhanced through laboratory experiments that connect abstract concepts from lectures with real-life examples via hands-on experience. However, it has been reported that only 62% of biomechanics courses offer laboratory sessions, since implementing laboratories in biomechanics curricula is difficult due to cost and time constraints.
With researchers’ continuous efforts and innovations in deep neural networks, computer-vision-based pose estimation methods have emerged as a promising approach to real-time human kinematics analysis. The most cutting-edge computer-vision algorithms can now detect anatomical landmark positions from single-view images collected by a webcam. Because most mobile devices come with an embedded camera, there is good potential to adopt a mobile application as an instructional tool in biomechanics courses to help students gain laboratory experience. In this study, we aim to design an Android-based mobile app with embedded computer-vision algorithms that could be applied in undergraduate biomechanics courses. With smartphones or tablets, students will be able to practice human kinematics analysis on several body segments in real time.
Outcomes: Currently, we have developed a prototype mobile app that captures human motion video from a tablet camera, infers the positions of key body joints using the tablet’s computational power, and calculates joint angles in real time for elbow flexion/extension, shoulder abduction/adduction, and trunk lateral bending (Figures 1 and 2).
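The core joint-angle computation can be sketched as follows: given three 2D landmark positions returned by a pose-estimation model (e.g., shoulder, elbow, and wrist for elbow flexion/extension), the included angle at the middle joint follows from the dot product of the two segment vectors. This is a minimal illustrative sketch, not the app's actual implementation; the function name and keypoint layout are assumptions.

```python
import math

def joint_angle(a, b, c):
    """Angle (degrees) at joint b, formed by segments b->a and b->c.

    Points are (x, y) pixel coordinates from a pose estimator, e.g.
    shoulder, elbow, wrist for elbow flexion/extension. Illustrative only.
    """
    # Segment vectors pointing away from the middle joint
    v1 = (a[0] - b[0], a[1] - b[1])
    v2 = (c[0] - b[0], c[1] - b[1])
    dot = v1[0] * v2[0] + v1[1] * v2[1]
    n1 = math.hypot(v1[0], v1[1])
    n2 = math.hypot(v2[0], v2[1])
    # Clamp to [-1, 1] to guard against floating-point drift
    cos_theta = max(-1.0, min(1.0, dot / (n1 * n2)))
    return math.degrees(math.acos(cos_theta))

# Example: shoulder above elbow, wrist extended horizontally
print(joint_angle((100, 50), (100, 150), (180, 150)))  # 90.0
```

Running this per video frame on the stream of detected landmarks yields the real-time joint-angle trace that the app displays.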
Broader Impacts: With this proposed mobile app, we will be able to develop a complete curriculum module that facilitates the teaching of undergraduate biomechanics concepts. For example, the curriculum module could first show a short animation in which a virtual instructor illustrates the placement of the mobile device camera and guides students to perform certain activities. Next, the computer-vision algorithm recognizes the human body, reconstructs the pose, and overlays the reconstructed pose on the model’s image. The curriculum module will then show the change of body joint angles over time. The module can also calculate basic descriptive statistics of all measured kinematic variables; these statistics provide exploratory insight into the data and could further strengthen students’ sense of the scale of human kinematics variables. The final deliverables will be released as a mobile application for anyone who is interested in biomechanics.
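The descriptive statistics mentioned above could be computed directly from the recorded joint-angle time series; a minimal sketch, assuming a list of angle samples in degrees (the function name and the set of reported statistics are illustrative assumptions):

```python
import statistics

def angle_summary(angles):
    """Basic descriptive statistics for a joint-angle time series (degrees).

    Illustrative sketch; which statistics the module reports is an assumption.
    """
    return {
        "mean": statistics.mean(angles),
        "sd": statistics.stdev(angles),   # sample standard deviation
        "min": min(angles),
        "max": max(angles),
        "range": max(angles) - min(angles),
    }

# Example: elbow flexion angles sampled over a short recording
summary = angle_summary([30.0, 45.0, 60.0, 90.0, 75.0])
print(summary["mean"], summary["range"])  # 60.0 60.0
```

Presenting, for instance, the mean and range of elbow flexion during a reaching task gives students a concrete sense of typical magnitudes of human kinematic variables.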
Hanwen Wang, North Carolina State University, Raleigh, NC; Xu Xu, North Carolina State University, Raleigh, NC