Gait variability is a sensitive metric for assessing functional deficits in individuals with mobility impairments. To correctly represent the temporal evolution of gait kinematics, nonlinear measures require extended and uninterrupted time series. In this study, we present and validate a novel algorithm for concatenating multiple time series to enable the nonlinear analysis of gait data from standard and unrestricted overground walking protocols. The full-body gait patterns of twenty healthy subjects were captured during five walking trials (at least 5 minutes each) on a treadmill under different weight perturbation conditions. The collected time series were cut into multiple shorter time series of varying lengths and subsequently concatenated using an algorithm that identifies similar poses in successive time series in order to determine an optimal concatenation time point. After alignment of the datasets, the approach concatenated the data to provide a smooth transition. Nonlinear measures of stability (Largest Lyapunov Exponent, LyE) and regularity (Sample Entropy, SE) were calculated to quantify the efficacy of the concatenation approach using intra-class correlation coefficients, standard error of measurement, and paired effect sizes. Our results indicate overall good agreement between the full uninterrupted and the concatenated time series for LyE. However, SE was more sensitive to the proposed concatenation algorithm, which might lead to false interpretation of physiological gait signals. This approach opens perspectives not only for the analysis of dynamic stability of gait data from physiological overground walking protocols, but also for the re-processing and estimation of nonlinear metrics from previously collected datasets.
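The core idea described above (locating a similar pose in the next time series, aligning the two signals, and joining them at that point) could be sketched as follows. This is a minimal illustrative implementation, not the authors' published algorithm: the function name `concatenate_series`, the Euclidean pose-distance criterion, the `search_window` parameter, and the offset-based alignment are all assumptions made for the example.

```python
import numpy as np

def concatenate_series(a, b, search_window=100):
    """Concatenate two pose time series (frames x coordinates).

    Illustrative sketch: finds the frame within the first
    `search_window` frames of `b` whose pose is closest (Euclidean
    distance) to the final pose of `a`, shifts `b` so the two poses
    coincide, and joins the series at that point for a smooth
    transition. The real algorithm in the paper may differ.
    """
    target = a[-1]
    # Pose distance between the end of `a` and candidate frames of `b`.
    dists = np.linalg.norm(b[:search_window] - target, axis=1)
    j = int(np.argmin(dists))          # optimal concatenation frame
    offset = target - b[j]             # align `b` to the end of `a`
    # Drop the matched frame itself to avoid duplicating the pose.
    return np.vstack([a, b[j + 1:] + offset])
```

For a periodic signal such as steady-state gait kinematics, the matched frame falls at a similar phase of the cycle, so the junction introduces only a small discontinuity relative to the frame-to-frame variation of the original signal.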
Orter S, Ravi DK, Singh NB, Vogl F, Taylor WR, König Ignasiak N (2019) A method to concatenate multiple short time series for evaluating dynamic behaviour during walking. PLoS ONE 14(6): e0218594. https://doi.org/10.1371/journal.pone.0218594
Creative Commons License
This work is licensed under a Creative Commons Attribution 4.0 License.