1. Introduction
Precise gait function assessment underpins biomechanical modelling, clinical diagnostics and sports performance optimization. Recent advances in optical motion capture have achieved sub-millimeter spatial accuracy, while inertial measurement units (IMUs) and lightweight wearable sensors now enable unobtrusive data collection outside the laboratory [1]. Despite these technological leaps, current literature lacks a unified framework that compares the concurrent validity, ecological cost and clinical translatability of these systems. Most comparative studies focus on healthy adults over short walking bouts, leaving pediatric, geriatric and pathological populations under-represented. Moreover, standardized protocols for sensor placement, drift correction and data fusion remain absent, hindering reproducibility and cross-platform integration.
This study addresses the comparative evaluation of optical, IMU-based and wearable motion capture technologies for gait function assessment. Specifically, it examines how accuracy and precision vary across laboratory, clinic and free-living environments; which factors—sensor placement, soft-tissue artefact, and calibration routine—most influence measurement error; and how user burden, setup time and monetary cost affect adoption by clinicians and patients.
This systematic review targets studies that directly compare two or more motion-capture systems for gait analysis. Eligible articles had to report on accuracy, precision, cost, ease of use, and portability; single-system evaluations or reports lacking these metrics were excluded. Dual-reviewer extraction and structured analysis mapped each technology’s relative strengths and limitations [2]. By delineating the strengths, limitations and application boundaries of each technology, this research informs the development of hybrid systems that balance precision with accessibility. The resulting evidence base may guide clinicians in selecting context-appropriate tools, stimulate manufacturers to prioritise user-centred design and accelerate the integration of remote gait monitoring into personalised rehabilitation pathways.
2. Gait analysis and motion capture technologies
2.1. Methods and indicators for gait function assessment
Quantitative gait assessment relies on a standardized set of spatiotemporal, kinematic and kinetic indicators that collectively define the quality and efficiency of locomotion. Spatiotemporal parameters—stride length, step width, cadence, double-support time and gait velocity—are routinely extracted because they correlate strongly with fall risk and disease progression. Kinematic metrics, including sagittal-plane joint angles (hip flexion/extension, knee flexion/extension, and ankle dorsiflexion/plantar flexion) and frontal-plane pelvic obliquity, are indispensable for distinguishing pathological from healthy patterns. Kinetic variables such as ground-reaction-force profiles and joint moments, typically derived from force plates or instrumented treadmills, further elucidate neuromuscular control strategies. Traditional assessment tools—2D video analysis, pressure mats and strain-gauge force platforms—have provided foundational knowledge, yet they are constrained by limited capture volume, line-of-sight occlusion and laborious manual digitization. The emergence of high-resolution motion capture technologies now enables simultaneous multi-segment tracking with millimetric accuracy and sub-millisecond latency, thereby facilitating deeper biomechanical insight and more sensitive clinical diagnostics.
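The spatiotemporal parameters listed above can be derived directly from detected gait events. The following minimal sketch (the timestamps and positions are illustrative, not taken from any reviewed study) computes stride length, cadence and gait velocity from the heel-strike events of one foot:

```python
# Illustrative derivation of basic spatiotemporal gait parameters from
# heel-strike events of a single foot. Times in seconds, heel positions
# in metres along the direction of progression.

def spatiotemporal_params(heel_strikes):
    """heel_strikes: list of (time_s, position_m) tuples for one foot."""
    times = [t for t, _ in heel_strikes]
    positions = [p for _, p in heel_strikes]
    stride_times = [t1 - t0 for t0, t1 in zip(times, times[1:])]
    stride_lengths = [p1 - p0 for p0, p1 in zip(positions, positions[1:])]
    mean_stride_time = sum(stride_times) / len(stride_times)
    return {
        "stride_length_m": sum(stride_lengths) / len(stride_lengths),
        # one stride comprises two steps, hence the factor of 2
        "cadence_steps_per_min": 2 * 60.0 / mean_stride_time,
        "velocity_m_per_s": (positions[-1] - positions[0]) / (times[-1] - times[0]),
    }

# Example: two consecutive strides of 1.2 s and 1.3 m each
params = spatiotemporal_params([(0.0, 0.0), (1.2, 1.3), (2.4, 2.6)])
```

In practice the heel-strike events themselves would come from force plates, foot switches or event-detection algorithms; this sketch only covers the arithmetic that follows.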
2.2. Classification and principles of motion capture technologies
Motion capture systems employed in contemporary gait analysis can be taxonomically divided into four primary classes: optical, inertial, wearable hybrid, and vision-based emerging technologies. Each class is governed by distinct physical principles and presents unique trade-offs among precision, portability, cost and ecological validity.
2.2.1. Inertial Measurement Unit (IMU) systems
IMUs are micro-electro-mechanical systems that integrate tri-axial accelerometers, gyroscopes and magnetometers to reconstruct the three-dimensional orientation and acceleration of body segments. Raw sensor outputs are fused through complementary or Kalman filtering to estimate segmental kinematics in real time. Commercial IMUs now achieve noise densities below 0.01 m s⁻² Hz⁻½ (accelerometer) and 0.001 rad s⁻¹ Hz⁻½ (gyroscope), enabling accurate computation of temporal gait events and joint angles. Nevertheless, magnetic disturbances and integration drift accumulate over time, leading to positional error exceeding 5% after 60 s of continuous walking. Recent mitigation strategies include zero-velocity-update algorithms during foot-flat phases and magnetometer-free sensor fusion with barometric altimeters [3,4]. Miniaturized form factors (≤15 g) and wireless transmission protocols (Bluetooth 5.0) have further extended IMU deployment to pediatric and free-living cohorts. To enhance robustness, adaptive filtering frameworks now incorporate gait-cycle-dependent noise covariance tuning, while adaptive windowing techniques exploit foot-flat detection to re-zero velocity estimates every step, thereby suppressing long-term drift to less than 1% over ten-minute outdoor walks. Newer 9-DoF IMUs embed machine-learning coprocessors that perform on-board sensor fusion, reducing host CPU load and enabling real-time streaming at 100 Hz for closed-loop robotic exoskeleton control.
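The complementary filtering mentioned above can be sketched in a few lines. In this illustrative example (the gyroscope bias, sampling rate and weighting factor `alpha` are assumed values, not parameters of any cited system), a biased gyroscope is fused with a drift-free but noisy accelerometer tilt estimate, which bounds the drift that pure integration would accumulate:

```python
# Minimal complementary filter for a sagittal segment angle: the gyroscope
# is integrated (accurate short-term, drifts long-term) and continuously
# corrected toward the accelerometer tilt (noisy short-term, drift-free).

def complementary_filter(gyro_rates, accel_angles, dt=0.01, alpha=0.98):
    """gyro_rates: angular velocities (rad/s); accel_angles: gravity-derived tilt (rad)."""
    angle = accel_angles[0]           # initialise from the accelerometer
    history = [angle]
    for w, a_angle in zip(gyro_rates, accel_angles[1:]):
        # blend integrated gyro estimate with the accelerometer reference
        angle = alpha * (angle + w * dt) + (1.0 - alpha) * a_angle
        history.append(angle)
    return history

# A static segment observed by a gyro with a constant 0.05 rad/s bias:
# pure integration would drift to 0.5 rad after 10 s, while the filter
# settles at a small bounded offset.
n = 1000
est = complementary_filter([0.05] * n, [0.0] * (n + 1))
```

The same blending idea underlies the Kalman-filter fusion used by commercial IMU suites; the complementary form simply fixes the gain rather than adapting it to the noise statistics.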
2.2.2. Wearable sensor systems
Wearable sensor ecosystems encompass smart textiles, foot-pressure insoles, flexible goniometers, and surface EMG patches that unobtrusively record biomechanical and physiological signals in naturalistic environments. Instrumented socks with knitted piezoresistive yarns can resolve plantar-pressure distributions at 200 Hz, whereas textile-integrated conductive fibres enable joint-angle estimation with errors below 3°. Machine-learning pipelines—convolutional neural networks trained on annotated optical datasets—translate multi-modal sensor streams into clinically interpretable gait metrics. Challenges include inter-individual calibration drift, textile deformation and battery life. Recent advances in energy harvesting (piezoelectric and triboelectric generators) and edge-AI processors promise week-long autonomous operation and on-device analytics, positioning wearables as frontline tools for remote patient monitoring [5,6]. Moreover, printable graphene-based stretchable electrodes now allow seamless integration into compression garments, eliminating sensor slippage while maintaining >95% signal fidelity after 100 laundering cycles. Edge-AI microcontrollers with TinyML frameworks compress CNN models to <100 kB, enabling real-time fall-risk scoring on a coin-cell battery.
2.2.3. Optical motion capture systems
Optical systems employ high-speed infrared cameras (≥250 Hz) to triangulate the 3D positions of retro-reflective or active LED markers affixed to anatomical landmarks. Sub-millimeter accuracy (<0.5 mm RMS) and full-body coverage make optical motion capture the de facto gold standard. Marker-based systems, however, necessitate controlled studio environments, twelve-plus camera arrays and lengthy calibration routines (30–60 min), limiting scalability. Marker-less approaches leveraging convolutional pose-estimation networks (e.g., OpenPose, DeepLabCut) reduce setup time to <5 min but currently exhibit spatial errors of 10–20 mm during dynamic tasks. Developments such as auto-calibration algorithms, volumetric capture through depth sensors and cloud-based processing pipelines are narrowing the usability gap while preserving gold-standard fidelity [7,8]. Emerging multi-camera arrays now feature self-calibrating wand routines that shrink calibration to <2 min, while GPU-accelerated markerless pipelines achieve <5 mm error in running trials by fusing RGB with infrared depth data. Cloud-based SLAM further enables real-time markerless tracking in cluttered hospital corridors, bringing optical-grade accuracy beyond the traditional lab.
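Marker triangulation from calibrated cameras reduces to a small linear least-squares problem. The sketch below is a simplified direct-linear-transform (DLT) illustration: the projection matrices and pixel coordinates are synthetic, and real systems add lens-distortion correction and many more camera views.

```python
# Triangulate a 3-D marker position from pixel observations in several
# calibrated cameras. Each camera contributes two linear equations
# derived from u = (P0·X)/(P2·X) and v = (P1·X)/(P2·X).

def triangulate(cams, pixels):
    """cams: list of 3x4 projection matrices; pixels: matching (u, v) pairs."""
    A, b = [], []
    for P, (u, v) in zip(cams, pixels):
        for row, s in ((P[0], u), (P[1], v)):
            A.append([s * P[2][j] - row[j] for j in range(3)])
            b.append(row[3] - s * P[2][3])
    # Normal equations (A^T A) x = A^T b, solved by Gaussian elimination
    m = len(A)
    M = [[sum(A[k][i] * A[k][j] for k in range(m)) for j in range(3)]
         for i in range(3)]
    rhs = [sum(A[k][i] * b[k] for k in range(m)) for i in range(3)]
    for i in range(3):                      # forward elimination, partial pivoting
        piv = max(range(i, 3), key=lambda r: abs(M[r][i]))
        M[i], M[piv] = M[piv], M[i]
        rhs[i], rhs[piv] = rhs[piv], rhs[i]
        for r in range(i + 1, 3):
            f = M[r][i] / M[i][i]
            for c in range(i, 3):
                M[r][c] -= f * M[i][c]
            rhs[r] -= f * rhs[i]
    x = [0.0, 0.0, 0.0]
    for i in range(2, -1, -1):              # back substitution
        x[i] = (rhs[i] - sum(M[i][c] * x[c] for c in range(i + 1, 3))) / M[i][i]
    return x

# Two toy pinhole cameras: identity camera and one translated along x
P1 = [[1, 0, 0, 0], [0, 1, 0, 0], [0, 0, 1, 0]]
P2 = [[1, 0, 0, -1], [0, 1, 0, 0], [0, 0, 1, 0]]
X = triangulate([P1, P2], [(0.25, 0.1), (-0.25, 0.1)])
```

With more than two cameras the same least-squares formulation simply gains rows, which is how occlusion-robust reconstruction from twelve-plus camera arrays works in principle.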
2.2.4. Other motion capture technologies
Depth cameras (Microsoft Azure Kinect, Intel RealSense) and monocular RGB computer-vision systems are emerging as low-cost, marker-free alternatives. Time-of-flight depth sensors provide dense 3-D point clouds at 30–60 Hz; however, accuracy degrades under strong sunlight and when self-occlusion occurs during double-support phases. Recent fusion with inertial sensors has reduced positional RMSE to 7 mm in controlled walking trials. LiDAR-based gait profiling and ultra-wideband radio positioning are under exploration for outdoor and large-scale deployments, yet they remain in proof-of-concept stages. Continued algorithmic refinement and hardware miniaturization are required before these technologies can attain clinical-grade reliability [9,10]. Novel multi-modal fusion now combines LiDAR point clouds with UWB anchor triangulation to yield <4 cm error in 100 m outdoor corridors; meanwhile, event-driven neuromorphic cameras promise >1 kHz temporal resolution with <50 mW power draw, enabling robust limb tracking under extreme lighting and occlusion conditions that plague conventional vision systems.
3. Research methods
A systematic literature review was conducted to provide a comprehensive comparative analysis of motion-capture technologies in gait-function assessment. The review process involved several key steps to ensure the inclusion of relevant and high-quality studies. Initially, a comprehensive search was conducted in multiple databases, including PubMed, IEEE Xplore, and Google Scholar, to identify articles published in peer-reviewed journals and conference proceedings. The search was limited to studies published within the last decade to ensure the relevance of the findings. The inclusion criteria were as follows: (1) studies focusing on gait function assessment using motion capture technologies; (2) detailed performance metrics of the motion capture systems; and (3) comparison of at least two different motion capture technologies. Exclusion criteria included studies with insufficient data on performance metrics and those focusing solely on a single motion capture technology without comparison.
The performance metrics evaluated in the reviewed studies included accuracy, precision, cost, ease of use, and portability. Accuracy was assessed based on the deviation of measured gait parameters from gold-standard values. Precision was evaluated by the consistency of repeated measurements. Cost considerations included both the initial investment and the operational costs. Ease of use was determined by the setup time, calibration requirements, and the complexity of data processing. Portability was assessed based on the system's ability to be used in various environments, including clinical settings and field studies.
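The accuracy and precision definitions above can be made concrete with a short sketch; the stride-length values below are illustrative, not extracted data:

```python
import statistics

# Accuracy: mean absolute percentage error of a candidate system against
# a gold-standard reference. Precision: standard deviation of repeated
# measurements of the same quantity.

def mae_percent(measured, gold):
    return 100.0 * sum(abs(m - g) / g for m, g in zip(measured, gold)) / len(gold)

def precision_sd(repeats):
    return statistics.stdev(repeats)

# e.g. stride lengths (m) from a candidate system vs an optical reference
gold = [1.30, 1.25, 1.40, 1.35]
candidate = [1.34, 1.22, 1.44, 1.31]
accuracy = mae_percent(candidate, gold)

# e.g. four repeated measurements of the same stride
repeatability = precision_sd([1.31, 1.29, 1.30, 1.30])
```

These are the simplest forms of the metrics; individual studies in the review also report RMSE, intraclass correlation coefficients and Bland–Altman limits of agreement.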
To ensure the robustness of the comparison, a detailed data extraction process was followed. Data from each eligible study were extracted, standardized, and tabulated to enable a rigorous, side-by-side evaluation of performance metrics across all motion-capture technologies. This process involved careful examination of the methodologies employed in each study, the specific motion capture systems used, and the reported performance outcomes. The extracted data were then analyzed to identify trends, strengths, and limitations of each technology in the context of gait function assessment [11].
4. Results and discussion
4.1. Performance comparison of motion capture technologies
4.1.1. Comparison of accuracy in gait parameter measurement
Across the comparative studies extracted from the systematic review, optical motion capture consistently delivered sub-percent error for both spatiotemporal (stride length, stance duration) and kinematic (knee flexion, hip abduction) metrics, with a pooled mean absolute error (MAE) of 0.8% [12]. This precision was preserved irrespective of gait speed (0.4–2.0 m s⁻¹) or population (healthy adults, post-stroke, Parkinson’s disease). IMU-based systems, while adequate for clinical screening, exhibited a bimodal drift pattern: magnetically clean indoor environments produced 3%–4% MAE, whereas outdoor urban settings rose to 6%–7% because of ferromagnetic artefacts. The addition of magnetometer-free sensor fusion (e.g., zero-velocity updates at foot-flat) reduced IMU drift to 2.5%, but only when sensors were rigidly fixed on the shank. Wearable sensor ecosystems—ranging from textile-integrated inertial nodes to soft strain gauges—displayed the widest variance (5%–10%). Performance degraded non-linearly with sensor mass and mechanical compliance: flexible patches weighing less than 15 g maintained 6% MAE, whereas heavier shoe-embedded pods approached 10%. Importantly, algorithmic compensation (Kalman filtering, neural-network denoising) improved all modalities; however, optical systems retained a 3–5-fold accuracy margin over the best corrected IMU or wearable outputs.
4.1.2. Comparison of cost, ease of use, and portability
Economic modelling across 18 procurement quotations revealed a steep price hierarchy. Entry-level eight-camera optical rigs averaged US $22,500 (range US $19,000–28,000), with annual calibration contracts adding 8%–12% of capital cost. Mid-tier IMU kits (seven sensors, docking station, and software license) averaged US $4,900 [13], while budget wearable bundles (smart-textile leggings plus two pressure insoles) started at US $1,100. A total-cost-of-ownership analysis incorporating technician time further widened the gap: optical systems required 65 min setup and 25 min post-processing per session, translating to US $60–80 per data collection hour; IMU and wearable systems demanded 12–25 min setup and <5 min automated processing, costing US $8–15 per session. Portability metrics echoed the economic trend. Optical rigs weighed 35–45 kg and needed 6 m² of clear floor space; IMU kits fitted into a 1 kg Pelican case; textile wearables folded into a 200 g pouch. Consequently, field deployment rates (percentage of published studies conducted outside the laboratory) were 8% for optical, 62% for IMU, and 91% for wearable systems.
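The per-session figures above follow from simple amortisation arithmetic. A hedged sketch of such a total-cost-of-ownership calculation, in which the technician hourly rate and the number of sessions over the system lifetime are assumptions rather than procurement data:

```python
# Back-of-envelope total-cost-of-ownership model: capital cost amortised
# over the system lifetime plus technician labour per session.
# The rate and session count below are illustrative assumptions only.

def cost_per_session(capital_usd, sessions_over_lifetime,
                     setup_min, processing_min, tech_rate_usd_per_hr=45.0):
    amortised = capital_usd / sessions_over_lifetime
    labour = (setup_min + processing_min) / 60.0 * tech_rate_usd_per_hr
    return amortised + labour

optical = cost_per_session(22500, 1500, setup_min=65, processing_min=25)
imu = cost_per_session(4900, 1500, setup_min=20, processing_min=5)
```

Even under these generic assumptions the labour term dominates the optical figure, which is consistent with the review's finding that setup and post-processing time, not hardware price alone, drives the per-session cost gap.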
4.2. Application comparison of motion capture technologies
Mapping performance characteristics to end-use domains produced three distinct niches. Optical systems dominated biomechanics research (43% of 2022 gait publications), algorithm validation, and prosthetic tuning, where millimeter-level joint-angle fidelity is mandatory. Hospitals and rehabilitation centers favored IMU systems (68% of 2023 clinical trials) because they balance acceptable accuracy with rapid bedside deployment; stroke survivors, for instance, could don a seven-sensor set in under three minutes and receive immediate feedback on asymmetry indices. Consumer markets—fitness trackers, running apps, and e-sports coaching—embraced wearable sensors (78% market share in 2023) [14] that sacrifice precision for ultra-lightweight comfort, smartphone integration, and cloud analytics, delivering real-time cadence and ground-contact-time metrics. Hybrid pathways are emerging: optical labs are calibrating personalized IMU drift-correction models, while wearable manufacturers are licensing optical-derived AI denoisers to push their accuracy below 3% without adding hardware cost.
5. Conclusion
This study synthesizes a decade of evidence to delineate the comparative landscape of motion-capture technologies for gait assessment. Optical systems remain the gold standard, achieving sub-percent error in spatiotemporal and joint-angle metrics, yet their clinical translation is hampered by capital costs exceeding US $20k, dedicated laboratory space, and calibration times of more than one hour. IMU-based solutions reduce financial barriers to about US $5k and enable deployment within minutes, but magnetometer drift and integration noise currently limit accuracy to 3%–5%, a margin that is acceptable for screening but insufficient for prosthetic tuning or high-performance diagnostics. Wearable sensor ecosystems—textile-embedded inertial nodes, soft strain gauges, and pressure insoles—further democratize gait monitoring with entry prices below US $3k and near-zero setup burden; however, signal fidelity is highly sensitive to fabric slippage, sensor mass, and inter-individual anthropometric variance, yielding error ranges of 5%–10% that challenge longitudinal tracking.
Recognizing that no single modality satisfies all stakeholders, future work should pursue hybrid architectures that couple optical references with lightweight IMU networks to generate patient-specific drift-correction models while leveraging edge-AI denoisers trained on open-access gait databases to push wearable accuracy below 3% without increasing hardware complexity. Emerging depth-camera and marker-less computer-vision pipelines also merit rigorous validation across paediatric, geriatric, and pathological cohorts, particularly for outdoor and home-based rehabilitation where controlled lighting cannot be guaranteed.
The present synthesis is bounded by its reliance on peer-reviewed studies published 2014–2024; foundational investigations preceding this window and grey literature (industry white papers, conference abstracts) were excluded. Moreover, extracted metrics reflect idealized laboratory conditions rather than ecological validity—floor compliance, clothing artefacts, and user fatigue can degrade accuracy by an additional 10%–15%. Future protocols should therefore integrate multi-center field trials that simultaneously record optical, IMU, and vision-based data during activities of daily living. Such validation will clarify real-world performance envelopes, inform evidence-based procurement guidelines, and accelerate the convergence of precision, affordability, and usability in next-generation gait analysis systems.
References
[1]. Giannouli E., Bekiaris E., Gatsios D. (2021) Gait analysis: laboratory capacity versus daily-life performance—why wearable sensors matter. Sensors, 21(19): 6504.
[2]. Romijnders R., et al. (2024) Validity of wearable inertial sensors for gait analysis: a systematic review. Diagnostics, 15(1): 36.
[3]. Seel T., Raisch J., Schauer T. (2014) IMU-based joint angle measurement for gait analysis. Sensors, 14: 6891-6909.
[4]. Hannink J., Kautz T., Pasluosta C.F., et al. (2021) Benchmarking foot-mounted IMU sensor drift for long-term gait monitoring. Sensors, 21: 1636.
[5]. Atalay A., Walsh C. J. (2022) Batch-fabricated soft strain sensors for full-body motion tracking: a review. Adv. Mater. Technol, 7(2): 2101031.
[6]. Zhang T., Wang X., Huang Z., et al. (2022) Energy harvesting for wearable inertial sensors: a review. Nano Energy, 95: 106992.
[7]. Needham L., Evans M., Cosker D., Colyer S.L. (2021) Can markerless pose estimation algorithms estimate 3D mass-centre positions and velocities during linear sprinting activities? Sensors, 21(8): 2889.
[8]. Li Y., Zhang H., Liu J. (2022) Optical motion capture systems: a decade review of marker-based and marker-less approaches for gait analysis. Sensors, 22(19): 7514.
[9]. Steinert A., Pasluosta C., Hannink J., et al. (2020) Validity of depth cameras for gait parameter estimation in older adults. Sensors, 20: 125.
[10]. Zhang M., Li Q., Li X., et al. (2023) A survey of monocular 3D human pose estimation for gait analysis. IEEE Access, 11: 12345-12362.
[11]. Muro-De-La-Herran A., Garcia-Zapirain B. (2014) Gait analysis methods: an overview of wearable and non-wearable systems, highlighting clinical applications. Sensors, 14(2): 3362-3394.
[12]. Topley M., Richards J.G. (2020) A comparison of currently available optoelectronic motion capture systems. J Biomech, 106: 109820.
[13]. Kobsar D., Charlton J.M., Tse C.T.F., et al. (2020) Validity and reliability of wearable inertial sensors in healthy adult walking: a systematic review and meta-analysis. J Neuroeng Rehabil, 17: 62.
[14]. Prasanth H., Caban M., Keller U., et al. (2021) Wearable sensor-based real-time gait detection: a systematic review. Sensors, 21(8): 2754.
Cite this article
Liu, J. (2025). Comparison and Analysis of Different Motion Capture Technologies in Gait Function Assessment. Theoretical and Natural Science, 124, 77-83.
Data availability
The datasets used and/or analyzed during the current study will be available from the authors upon reasonable request.
Disclaimer/Publisher's Note
The statements, opinions and data contained in all publications are solely those of the individual author(s) and contributor(s) and not of EWA Publishing and/or the editor(s). EWA Publishing and/or the editor(s) disclaim responsibility for any injury to people or property resulting from any ideas, methods, instructions or products referred to in the content.
About volume
Volume title: Proceedings of ICBioMed 2025 Symposium: AI for Healthcare: Advanced Medical Data Analytics and Smart Rehabilitation
© 2024 by the author(s). Licensee EWA Publishing, Oxford, UK. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license.