BACKGROUND: Inter-segment joint angles can be obtained from inertial measurement units (IMUs); however, accurate 3D joint motion measurement requires sensor fusion and signal processing, sensor-to-segment alignment and joint axis calibration, and can therefore be challenging to achieve.

RESEARCH QUESTION: Can an artificial neural network modeling framework be used for direct, real-time conversion of IMU data to joint angles during walking and running, and how do sensor number, sensor location on the body and gait speed affect prediction accuracy?

METHODS: Thirty healthy adult participants performed gait experiments in which kinematic data were obtained from self-placed IMUs and from video motion analysis, the reference standard for joint kinematics. Data were collected during walking at 0.5 m/s, 1.0 m/s and 1.5 m/s, as well as during running at 2.0 m/s and 3.0 m/s. A generative adversarial network was trained to predict lower limb joint angles at all gait speeds from IMU data alone, and prediction accuracy was assessed for all combinations of sensors.

RESULTS: Joint angle prediction accuracy depended strongly on the number and location of sensors, as well as on walking and running speed. A single IMU was sufficient to predict sagittal plane joint angles at the hip, knee or ankle during walking with RMS errors below 4.0°, although the highest 3D joint motion accuracy was obtained with two or three IMUs for a given joint.

SIGNIFICANCE: This study reports a modeling framework for direct conversion of IMU data to joint angles without signal processing or joint calibration. The findings suggest that combinations of up to four IMUs reproduce hip, knee and ankle joint kinematics simultaneously during walking and running with the highest accuracy, and may help maximize the accuracy of IMU-based motion measurement of the lower limb joints in applications such as remote monitoring of movement, sports training and rehabilitation.
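As a rough illustration of the direct IMU-to-angle mapping described in METHODS, the sketch below trains a small adversarial model in PyTorch: a generator converts a raw IMU window to a joint-angle trajectory, and a discriminator scores whether a trajectory resembles the optical reference. Every detail here (window length, channel counts, layer sizes, loss weighting) is an illustrative assumption, not the authors' implementation.

```python
# Minimal adversarial IMU-to-angle sketch (PyTorch). All sizes are assumptions.
import torch
import torch.nn as nn

WINDOW = 100        # assumed samples per gait window
IMU_CH = 6 * 2      # assumed: 2 IMUs x (3-axis accel + 3-axis gyro)
ANGLE_CH = 3        # 3D joint angles for one joint

class Generator(nn.Module):
    """Maps a raw IMU window directly to a joint-angle trajectory."""
    def __init__(self):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv1d(IMU_CH, 64, kernel_size=5, padding=2), nn.ReLU(),
            nn.Conv1d(64, 64, kernel_size=5, padding=2), nn.ReLU(),
            nn.Conv1d(64, ANGLE_CH, kernel_size=5, padding=2),
        )
    def forward(self, imu):            # imu: (batch, IMU_CH, WINDOW)
        return self.net(imu)           # (batch, ANGLE_CH, WINDOW)

class Discriminator(nn.Module):
    """Scores whether an angle trajectory looks like the optical reference."""
    def __init__(self):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv1d(ANGLE_CH, 64, kernel_size=5, padding=2), nn.ReLU(),
            nn.AdaptiveAvgPool1d(1), nn.Flatten(),
            nn.Linear(64, 1),
        )
    def forward(self, angles):
        return self.net(angles)        # raw logit

gen, disc = Generator(), Discriminator()
g_opt = torch.optim.Adam(gen.parameters(), lr=1e-4)
d_opt = torch.optim.Adam(disc.parameters(), lr=1e-4)
bce = nn.BCEWithLogitsLoss()

def train_step(imu, ref_angles):
    """One adversarial update: discriminator on real vs. fake, then generator."""
    fake = gen(imu)
    # Discriminator step: reference angles labeled 1, predictions labeled 0.
    d_loss = bce(disc(ref_angles), torch.ones(imu.size(0), 1)) + \
             bce(disc(fake.detach()), torch.zeros(imu.size(0), 1))
    d_opt.zero_grad(); d_loss.backward(); d_opt.step()
    # Generator step: fool the discriminator plus an L1 reconstruction term.
    g_loss = bce(disc(fake), torch.ones(imu.size(0), 1)) + \
             nn.functional.l1_loss(fake, ref_angles)
    g_opt.zero_grad(); g_loss.backward(); g_opt.step()
    return d_loss.item(), g_loss.item()
```

A call such as `train_step(torch.randn(8, IMU_CH, WINDOW), torch.randn(8, ANGLE_CH, WINDOW))` runs one update on random tensors; in practice the inputs would be synchronized IMU windows and optically measured angle curves.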
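The accuracy figure quoted in RESULTS is the usual root-mean-square deviation between predicted and reference angle curves; a minimal sketch, assuming per-trajectory aggregation in degrees (the authors' exact per-stride or per-trial aggregation is not stated in the abstract):

```python
import numpy as np

def rmse_deg(pred: np.ndarray, ref: np.ndarray) -> float:
    """RMS error over one angle trajectory; pred and ref shaped (time,), in degrees."""
    return float(np.sqrt(np.mean((pred - ref) ** 2)))
```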