Kinect-based automated movement quality assessment system

ResearchSpace/Manakin Repository


dc.contributor.advisor Zhang, Y en
dc.contributor.advisor Smith, H en
dc.contributor.author Dajime, Peter Fermin en
dc.date.accessioned 2019-10-15T22:54:24Z en
dc.date.issued 2019 en
dc.identifier.uri en
dc.description Full Text is available to authenticated members of The University of Auckland only. en
dc.description.abstract Practitioners commonly assess movement quality through visual qualitative assessment protocols, which can be time-intensive and prone to inter-rater measurement bias. Portable, inexpensive marker-less motion capture systems can hasten the assessment process through objective joint kinematic analysis. The current study developed a system that classified the Movement Competency Screen (MCS) score using machine learning models whose kinematic features were obtained from Kinect position data. A Kinect sensor collected position data from thirty-one physically active males as they performed the following movement tasks: bilateral squat, forward lunge, and single leg squat. A Kinect-based biomechanical model allowed joint kinematic analysis from the position data. The movement quality of each task performance was qualitatively rated from 1 (i.e. poor) to 3 (i.e. good) based on the MCS criteria. Four machine learning models, namely multiclass logistic regression (MLR), linear discriminant analysis (LDA), support vector machine (SVM), and k-nearest neighbors (KNN), were used to classify movement quality from Kinect-based kinematic variables. The kinematic variables included in each model were identified through one-way ANOVA, the Kruskal-Wallis test, and MLR analysis. Sensitivity, specificity, and accuracy were calculated after five-fold cross-validation to determine the performance of each model. Joint kinematic analysis revealed that poor movement quality was characterized by greater deviations from the neutral position in frontal plane kinematic variables (e.g. pelvic tilt angle) and greater angular displacement in sagittal plane kinematic variables (e.g. trunk flexion angle), especially at peak knee flexion. The McFadden R² was greater than 0.40 for all models except the right single leg squat, owing to overlap in the kinematic data between the poor and average categories. The MLR, SVM, and LDA models consistently performed well across tasks: their sensitivity, specificity, and accuracy ranged from 0.66 to 0.92, from 0.54 to 0.89, and from 0.75 to 0.85, respectively. In conclusion, the MLR forward lunge model using objective kinematic data from the Kinect is ideal for movement quality assessment because it had the highest sensitivity and specificity relative to the bilateral squat and single leg squat models. en
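The abstract reports sensitivity, specificity, and accuracy computed after five-fold cross-validation. As a minimal sketch of how those per-class metrics are derived from predicted versus true MCS scores (the labels below are hypothetical illustrations, not the study's data):

```python
def class_metrics(y_true, y_pred, positive):
    """Sensitivity, specificity, and accuracy, treating one MCS score
    (e.g. 1 = poor) as the positive class in a one-vs-rest comparison."""
    tp = sum(1 for t, p in zip(y_true, y_pred) if t == positive and p == positive)
    fn = sum(1 for t, p in zip(y_true, y_pred) if t == positive and p != positive)
    tn = sum(1 for t, p in zip(y_true, y_pred) if t != positive and p != positive)
    fp = sum(1 for t, p in zip(y_true, y_pred) if t != positive and p == positive)
    sensitivity = tp / (tp + fn) if (tp + fn) else 0.0
    specificity = tn / (tn + fp) if (tn + fp) else 0.0
    accuracy = (tp + tn) / len(y_true)
    return sensitivity, specificity, accuracy

# Hypothetical MCS ratings (1 = poor, 2 = average, 3 = good)
y_true = [1, 1, 2, 2, 3, 3, 3, 1]
y_pred = [1, 2, 2, 2, 3, 3, 2, 1]
sens, spec, acc = class_metrics(y_true, y_pred, positive=1)
```

In a multi-class setting such as the three-point MCS scale, these metrics would be computed once per class (or per fold) and then averaged, which is consistent with reporting a single range per model.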
dc.publisher ResearchSpace@Auckland en
dc.relation.ispartof Masters Thesis - University of Auckland en
dc.relation.isreferencedby UoA99265201413402091 en
dc.rights Items in ResearchSpace are protected by copyright, with all rights reserved, unless otherwise indicated. Previously published items are made available in accordance with the copyright policy of the publisher. en
dc.rights Restricted Item. Full Text is available to authenticated members of The University of Auckland only. en
dc.rights.uri en
dc.title Kinect-based automated movement quality assessment system en
dc.type Thesis en
thesis.degree.discipline Exercise Science en
thesis.degree.grantor The University of Auckland en
thesis.degree.name Masters en
dc.rights.holder Copyright: The author en
pubs.elements-id 784154 en
pubs.org-id Science en
pubs.org-id Exercise Sciences en
pubs.record-created-at-source-date 2019-10-16 en
