Database Open Access
A multimodal gait dataset of brain activity, muscle activity, kinematics and ground forces in young adults
Rateb Katmah, Aamna AlShehhi, Doua Kosaji, Nour Al-Rahmani, Muhammad Abdullah, Abdul Aziz Vaqar Hulleck, Kinda Khalaf
Published: April 30, 2026. Version: 1.0.0
When using this resource, please cite:
Katmah, R., AlShehhi, A., Kosaji, D., Al-Rahmani, N., Abdullah, M., Hulleck, A. A. V., & Khalaf, K. (2026). A multimodal gait dataset of brain activity, muscle activity, kinematics and ground forces in young adults (version 1.0.0). PhysioNet. RRID:SCR_007345. https://doi.org/10.13026/r0ea-7161
Please include the standard citation for PhysioNet:
Goldberger, A., Amaral, L., Glass, L., Hausdorff, J., Ivanov, P. C., Mark, R., ... & Stanley, H. E. (2000). PhysioBank, PhysioToolkit, and PhysioNet: Components of a new research resource for complex physiologic signals. Circulation [Online]. 101 (23), pp. e215–e220. RRID:SCR_007345.
Abstract
Gait is a fundamental motor function, and its analysis is essential for understanding locomotor control, rehabilitation, and the early detection of neurological and musculoskeletal disorders. While many datasets capture either biomechanical or neural aspects of gait, publicly available multimodal datasets that integrate brain, muscle, kinematic, and ground reaction force recordings remain scarce. This limitation restricts advances in modeling neuromechanical interactions and the development of machine learning approaches for gait classification and rehabilitation technologies. To address this gap, we provide a comprehensive dataset of treadmill walking from 59 healthy adults with a mean age of 24 ± 5 years, representing both sexes and different body mass index categories. Participants walked at three controlled speeds (0.5, 0.75, and 1.0 m/s), with synchronized recordings from scalp electroencephalography, surface electromyography of 12 lower-limb muscles, inertial sensors capturing kinematics, and bilateral force plates measuring three-dimensional forces, moments, and center of pressure. The dataset enables investigations into brain-body interactions, speed-dependent adaptations, and neuromechanical variability, while supporting the benchmarking of computational models for gait analysis.
Background
Gait analysis plays a critical role in understanding human motor control, musculoskeletal dynamics, and neurological health. Abnormalities in walking patterns are often among the earliest indicators of neurological and orthopedic conditions, including stroke, Parkinson’s disease, and anterior cruciate ligament injuries [1]. For this reason, comprehensive gait data are widely used in both clinical research and the development of rehabilitation technologies, such as robotic devices, prosthetics, and movement-monitoring systems [2].
A wide range of approaches have been used to capture gait-related information. Motion capture systems and force plates are commonly employed to measure spatiotemporal and kinetic features, while inertial sensors provide joint kinematics during locomotion [3]. Surface electromyography (EMG) offers insight into muscle activation patterns underlying movement [4]. More recently, electroencephalography (EEG) has been applied to study cortical activity during walking, with applications in motor control, neurorehabilitation, and brain–computer interfaces [5]. However, many publicly available gait datasets focus on a single modality, limiting their capacity to capture the full complexity of neuromechanical interactions during walking. In particular, EEG is rarely collected in combination with biomechanical and muscular data, restricting opportunities to study how neural signals align with gait dynamics [6, 7].
According to our previously published review paper [8], several datasets have contributed to advancing the field, including those based on motion capture and ground reaction force recordings, as well as datasets focused on EMG or EEG during walking. While each of these resources is valuable for specific research questions, there remains a need for multimodal datasets that integrate neural, muscular, kinematic, and kinetic information in a synchronized manner. Such datasets provide a richer representation of human locomotion, and are particularly useful for validating computational models of gait and for developing machine learning approaches to classification and prediction tasks.
Part of the dataset described here has already been used in a published study examining the influence of body mass index (BMI) on gait dynamics across walking speeds, based on force plate recordings [9]. In that work, spatiotemporal gait parameters such as ground reaction forces, joint moments, and center of pressure trajectories were compared across BMI categories, and machine learning models were employed to classify walking speeds. The study reported BMI-specific biomechanical adaptations, including higher joint loading and variability in obese individuals, while also demonstrating the potential of AI models to distinguish gait speeds. The present Data Descriptor extends beyond this initial application by releasing the complete multimodal dataset, including EEG, EMG, and IMU recordings in addition to force plate data, thereby enabling broader investigations into neuromechanical control and multimodal analysis of gait.
To create this dataset, we collected synchronized recordings from 59 healthy young adults with a mean age of 24 ± 5 years, representing both sexes and spanning normal weight, overweight, and obese body mass index categories. Each participant completed walking trials at three fixed treadmill speeds (0.5, 0.75, and 1.0 m/s), with one minute of data collected per condition. The experimental setup included 19 scalp EEG channels using a dry electrode system, 12 surface EMG electrodes placed on major lower-limb muscles, IMU-derived kinematic signals from 17 sensors, and bilateral split-belt force plates recording three-dimensional ground reaction forces, moments, and center of pressure. The metadata file includes participant demographic information (sex, age, and BMI category).
The dataset provides an open resource for researchers in biomechanics, neuroscience, rehabilitation, and computational modeling. Potential reuse includes investigations into speed-dependent adaptations in gait, analyses of brain–muscle connectivity, studies of neuromechanical variability across body composition, and benchmarking of machine learning algorithms for gait classification and rehabilitation monitoring. The inclusion of synchronized EEG, EMG, IMU, and force plate signals enables cross-disciplinary approaches, allowing questions to be addressed that cannot be explored using single-modality datasets alone.
By making these multimodal recordings openly available, the dataset aims to support reproducible and transparent research on human gait. It may be applied in a wide range of domains, from fundamental studies of locomotor control to the development of clinical assessment tools and intelligent rehabilitation systems.
Methods
Participants
Fifty-nine healthy adults (22 males, 37 females; mean age 24 ± 5 years) were recruited. Participants were free of any self-reported neurological, musculoskeletal, or cardiovascular disorders, and none had a history of lower-limb surgery or gait abnormalities. Body mass index (BMI) was calculated from measured height and weight, and participants were classified into normal weight, overweight, or obese categories according to World Health Organization standards [10]. All participants provided informed written consent before participation. The experimental protocol was approved by the Institutional Review Board of Khalifa University (H19-038).
Data Acquisition Systems
Electroencephalography (EEG). In this work, EEG signals were acquired using a 19-channel dry electrode headset (DSI-24, Wearable Sensing, San Diego, CA, USA). It employs dry, spring-loaded sensors capable of penetrating hair to maintain scalp contact without the need for conductive gel or abrasive preparation. The headset offers active and passive shielding to minimize contamination from electrical noise and motion artifacts, which is particularly critical during dynamic tasks like gait trials.
EEG signals were sampled at 300 Hz, and all 19 electrodes were individually adjustable to optimize contact impedance, which was maintained below 1 MΩ. Electrode placement followed the standard 10–20 international system; a schematic of the electrode layout is provided in Figure 1 in the accompanying files. Pz was assigned as the reference electrode, and the ground was connected to the earlobes. Real-time data acquisition was carried out using DSI-Streamer (version 2.3, Wearable Sensing).
The selection of the DSI-24 system is supported by Mahdid et al. [11], who demonstrated that among four commonly used wearable EEG systems (Epoc+, OpenBCI, DSI-24, and Quick-30), the DSI-24 exhibited the highest cosine similarity to gold-standard EEG systems in terms of both functional and directed connectivity patterns. These results further support the use of the DSI-24 for studies involving mobile brain/body imaging (MoBI), such as gait-related EEG.
Electromyography (EMG). Surface EMG was used in this study to assess muscle activity during gait. Data were recorded using the Delsys Trigno Wireless EMG system (Delsys Inc., Boston, MA, USA), a widely used and validated platform in biomechanics and neuromuscular research. This system employs wireless, high-fidelity surface electrodes to capture the electrical signals generated by muscle fibers during contraction, enabling accurate and real-time monitoring of muscle activation patterns. Twelve muscles were selected based on their known involvement in locomotion. These included bilateral recordings from the vastus lateralis, rectus femoris, semitendinosus, gastrocnemius lateral head, gastrocnemius medial head, and tibialis anterior.
The sensors were placed on the muscle belly of each targeted muscle according to SENIAM (Surface Electromyography for the Non-Invasive Assessment of Muscles) guidelines to ensure signal consistency and minimize crosstalk. Figure 2 in the accompanying files shows the Delsys Trigno system, the EMG sensors, and their anatomical placement on the participants.
Inertial Measurement Units (IMU). To capture lower-limb joint kinematics during walking trials, this study employed the MVN Awinda system (Xsens, Enschede, Netherlands), a wireless, IMU-based full-body motion capture system. The MVN Awinda utilizes a network of 17 non-invasive sensors, each containing a 3D accelerometer, gyroscope, and magnetometer, to record real-time body segment orientations and movements. This configuration enables accurate tracking of joint angles without the need for cameras or optical markers, making it well suited to this gait study. The system operates at a sampling frequency of 60 Hz. Sensors were placed on the participant's body following a standardized anatomical protocol to ensure consistency and measurement validity; the sensor placement is shown in Figure 3 in the accompanying files. The accompanying software, Xsens MVN Analyze (version 2022.0.2), was used for data acquisition.
The Xsens MVN system has been validated in numerous studies assessing its accuracy and reliability in monitoring human movement. For example, research has demonstrated its effectiveness in capturing kinematic parameters in both laboratory and field settings, including sports performance and rehabilitation contexts [12-14].
Force Plates (FP). To capture gait dynamics and ground reaction forces, a split-belt instrumented treadmill (Bertec Corporation, Columbus, OH, USA) was employed. Each belt is equipped with an embedded six-component force plate, capable of recording three-dimensional ground reaction forces (Fx, Fy, Fz) and moments (Mx, My, Mz). Additionally, the system captures center of pressure (COP) trajectories in both the medial-lateral (COPx) and anterior-posterior (COPy) directions. These measurements are essential for analyzing gait events such as heel strikes, toe-offs, step durations, stride lengths, and weight distribution patterns. All force and COP data were sampled at 1000 Hz.
Accurate synchronization across modalities was essential to ensure that neural (EEG), muscular (EMG), kinematic (IMU), and kinetic (force plate) signals were temporally aligned within each gait cycle. To achieve this, a trigger-based protocol was implemented in MATLAB, generating a digital pulse transmitted via a Trigger Interface Box to the EEG, EMG, and IMU systems through a wireless hub. The system compensated for fixed wireless transmission delays, ensuring precise alignment of EEG data with the other modalities. Since the Bertec treadmill did not support hardware synchronization, a custom MATLAB script was used to prompt manual initiation of treadmill recording simultaneously with the trigger, allowing near-synchronous acquisition across all systems.
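As a sketch of how the shared trigger pulse can be used downstream, the snippet below locates the first rising sample of a trigger channel (for example, the Trigger column in the EEG files) so that each stream can be cropped to a common start. The threshold and channel layout here are assumptions to verify against the actual recordings, not part of the released synchronization protocol.

```python
import numpy as np

def trigger_onset_index(trigger, threshold=0):
    """Return the index of the first sample where the trigger channel
    rises above `threshold` (a common way to locate a shared start pulse)."""
    above = np.flatnonzero(np.asarray(trigger) > threshold)
    if above.size == 0:
        raise ValueError("no trigger pulse found")
    return int(above[0])

# Example: crop an EEG stream (sampled at 300 Hz) so t = 0 is the pulse.
trigger = np.array([0, 0, 0, 1, 1, 0])
onset = trigger_onset_index(trigger)
```

With the onset index in hand, each modality can be trimmed (after accounting for its own sampling rate) so that all streams share the same time origin.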
Experimental Protocol
At the beginning of the experiment, participants provided basic demographic information (gender, date of birth, height, and weight) before completing the informed consent process. All participants signed the Institutional Review Board–approved consent form.
Twelve surface EMG sensors were attached to the mid-belly region of selected lower-limb muscles. Skin preparation included gentle abrasion using medical-grade tape and cleansing with alcohol swabs to optimize electrode–skin contact. Then, IMU sensors were positioned according to standardized placement protocols on bony landmarks to minimize soft tissue artifacts, followed by calibration in MVN Analyze software to initialize orientation and reduce magnetic interference. A 19-channel dry EEG headset was then fitted, with electrode–scalp contact adjusted to achieve stable signals. Impedance levels and real-time EEG traces were monitored to confirm signal quality.
Following sensor setup, participants stepped onto the split-belt treadmill. The belt speed was gradually increased from 0 to 0.5 m/s to allow safe adaptation to the walking task. Each speed condition lasted three minutes: the first and last minutes were allocated to acceleration and deceleration, while continuous data were recorded during the middle minute. The same procedure was repeated for 0.75 m/s and 1.0 m/s walking speeds, with transitions performed gradually to avoid abrupt gait changes. After the final trial, the treadmill was slowly brought to a complete stop. Figure 4 in the accompanying files shows a participant during the experimental setup, including EEG, IMU, and EMG instrumentation on the split-belt treadmill.
To determine each participant’s habitual walking speed, a 9-meter walk test was also performed. Timing was measured over the middle 4 meters using a stopwatch, with the first and last 2.5 meters allocated for acceleration and deceleration, respectively. The full experimental session lasted approximately 45 minutes.
Data Description
The dataset consists of four primary subfolders, corresponding to the recorded modalities (EEG, EMG, IMU, and FP), as well as an accompanying metadata file. The full dataset contains recordings from 59 participants across three treadmill speeds (0.5, 0.75, and 1.0 m/s). A small number of files are absent due to technical issues encountered during certain recording sessions. The missing files are documented below. An overview of the dataset structure is provided in Figure 5 in the accompanying files.
Metadata
The file subject-info.csv provides demographic and anthropometric information for each participant. Variables include subject ID, gender, BMI category, and body segment dimensions (lengths of upper arm, lower arm, upper leg, lower leg, neck, and shoulder; circumferences of chest, waist, hip, thigh, ankle, upper arm, and wrist).
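A minimal example of loading and summarizing the metadata with pandas follows; the column names used here are illustrative stand-ins and should be checked against the released subject-info.csv.

```python
import io
import pandas as pd

# Illustrative stand-in for subject-info.csv; the real file's column
# names (subject ID, gender, BMI category, segment dimensions) may differ.
demo = io.StringIO(
    "subject_id,gender,bmi_category\n"
    "S1,F,normal\n"
    "S2,M,overweight\n"
)
subjects = pd.read_csv(demo)
counts = subjects.groupby("bmi_category").size()  # participants per BMI group
```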
EEG
EEG recordings are provided in both .csv and .edf formats, with 177 files in each format (59 participants × 3 speeds). Files are named as S[subjectID]_[speed]_raw, e.g., S1_0.5_raw. Each file includes header information and continuous EEG signals. Data columns contain 19 scalp channels, 2 reference electrodes (A1, A2), and 3 auxiliary channels (unused). A trigger channel and acquisition diagnostics are also included. All EEG files contain raw signals without software filtering.
Each file possesses the following features:
Header information
- Line 1: Mains_Frequency – mains frequency (Hz), used to specify notch filtering in post-processing.
- Line 2: Sampling_Frequency – sampling frequency in Hz.
- Line 3: Filter_Delay – delay (in ms) introduced by digital filters in the firmware of Wearable Sensing's EEG systems.
- Line 4: Sensor_Data_Units – uV.
- Line 5: Headset_name – Wearable Sensing EEG hardware used to acquire the data.
- Line 6: Data_Logger – Wearable Sensing EEG software and version used to acquire the data.
- Line 7: Date – date of the start of data acquisition (dd/mm/yyyy).
- Line 8: Time – time of the start of data acquisition (hh:mm:ss).
- Line 9: Patient_ID – subject ID.
- Line 10: Record_ID – record ID that can be entered in the record tab.
- Line 11: Filter – filtering status (the data are not filtered).
- Line 12: Comments – free-text field for comments about the file.
- Line 13: Reference Location – the reference EEG electrode (Pz).
- Line 14: Trigger Source – the method used to send the trigger to the headset (wireless).
- Line 15: Channel_Number – number of data channels.
- Line 16: Column headers – Time, text descriptors for each data channel, Triggers, and error-checking fields (Time_offset, ADC_Status, and ADC_sequence).
Data information
- Column 1: Time – time in seconds since the start of acquisition.
- Columns 2 to 25: channel data – 19 EEG electrodes, 2 reference electrodes (A1, A2), and 3 unused auxiliary electrodes (X1, X2, X3).
- Column 26: Trigger – value of the multi-input trigger, expressed as a decimal number.
- Column 27: TimeOffset – number of system clock ticks elapsed since the system was turned on.
- Column 28: ADC_Status – status of the analog-to-digital converter, used by Wearable Sensing's engineers for troubleshooting.
- Column 29: ADC_Sequence – sequence number of the analog-to-digital converter, used for identifying dropped data packets.
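Based on the layout described above (15 metadata lines followed by a column-header line), a raw EEG .csv can be read by skipping the header block. This is a sketch: the exact line count and column names should be verified against the released files.

```python
import io
import pandas as pd

N_HEADER_LINES = 15  # lines 1-15 are metadata; line 16 holds the column headers

def load_eeg_csv(path_or_buffer):
    """Load one raw EEG recording, assuming the 15-line header described
    above. EEG channels then occupy columns 2-25 (19 scalp electrodes,
    A1/A2 references, and the unused X1-X3 auxiliaries)."""
    df = pd.read_csv(path_or_buffer, skiprows=N_HEADER_LINES)
    df.columns = [c.strip() for c in df.columns]
    return df

# Tiny synthetic file mimicking the documented layout (not real data):
demo = io.StringIO(
    "\n".join(f"Header_line_{i}" for i in range(1, 16))
    + "\nTime,Fp1,Trigger\n0.000,1.2,0\n0.00333,1.5,1\n"
)
eeg = load_eeg_csv(demo)
```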
EMG
EMG data are stored as (.csv) files, with three files per participant (one per speed), totalling 169 recordings. Missing files include: S3_0.75, S3_1, S8_0.5, S11_1, S43_1, S50_0.5, S50_0.75, S50_1. Files are named EMG_S[subjectID]_[speed]. Each file contains a timestamp column and 12 muscle activation columns (mean absolute value, volts):
- Right tibialis anterior.
- Left tibialis anterior.
- Right vastus lateralis.
- Left vastus lateralis.
- Right rectus femoris.
- Left rectus femoris.
- Right gastrocnemius medial head.
- Left gastrocnemius medial head.
- Right semitendinosus.
- Left semitendinosus.
- Right gastrocnemius lateral head.
- Left gastrocnemius lateral head.
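Given the structure above (one timestamp column followed by 12 muscle columns), a single EMG file can be split into time and activation blocks as sketched below; the demo uses two hypothetical muscle columns in place of the real twelve.

```python
import io
import pandas as pd

def load_emg_csv(path_or_buffer):
    """Split one EMG recording into its timestamp column and the muscle
    activation columns (mean absolute value, volts), per the layout above."""
    df = pd.read_csv(path_or_buffer)
    time = df.iloc[:, 0]      # first column: timestamps
    muscles = df.iloc[:, 1:]  # remaining columns: one per muscle
    return time, muscles

# Synthetic two-muscle example; real files have 12 muscle columns and the
# column names below are placeholders, not the release's actual headers.
demo = io.StringIO("t,R_TA,L_TA\n0.0,0.01,0.02\n0.1,0.03,0.04\n")
t, emg = load_emg_csv(demo)
mean_activation = emg.mean()  # per-muscle average activation level
```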
IMU
IMU data are provided as a compressed archive IMU.tar.gz, which contains 170 subfolders named IMU_S[subjectID]_[speed]. Each subfolder corresponds to one participant at one walking speed. Missing folders include: S2_0.75, S2_1, S3_0.75, S3_1, S24_0.5, S24_0.75, and S24_1.
After extracting the archive, each subfolder contains 18 CSV files providing kinematic measurements derived from the IMU system, including segment orientations (quaternions), positions, velocities, accelerations, angular velocities, and joint angles:
| File | Contents |
|---|---|
| Segment Orientation – Quat | 1×4 quaternion vector (q0, q1, q2, q3) describing the orientation of the segment with respect to the global frame. |
| Segment Position | 1×3 position vector (x, y, z) of the origin of the segment in the local frame, in [m]. |
| Segment Velocity | 1×3 velocity vector (x, y, z) of the origin of the segment in the local frame, in [m/s]. |
| Segment Acceleration | 1×3 acceleration vector (x, y, z) of the origin of the segment in the local frame, in [m/s²]. |
| Segment Angular Velocity | 1×3 angular velocity vector (x, y, z) of the segment in the local frame, in [rad/s]. |
| Segment Angular Acceleration | 1×3 angular acceleration vector (x, y, z) of the segment in the local frame, in [rad/s²]. |
| Joint Angles ZXY | 1×3 Euler representation of the joint angle vector (x, y, z) in [deg], calculated using the Euler sequence ZXY in the ISB-based coordinate system. |
| Joint Angles XZY | 1×3 Euler representation of the joint angle vector (x, y, z) in [deg], calculated using the Euler sequence XZY in the ISB-based coordinate system; commonly used only for the shoulder joints. |
| Center of Mass | 1×3 position vector (x, y, z) of the body center of mass in the local frame, in [m]. |
| Sensor Free Acceleration | 1×3 free acceleration vector (x, y, z) of the sensor, in [m/s²]. |
| Sensor Magnetic Field | 1×3 magnetic field vector (x, y, z) of the sensor, in [a.u.]. |
| Sensor Orientation – Quat | 1×4 quaternion vector (q0, q1, q2, q3) representing the orientation of the sensor in the local coordinate frame. |
All angles follow the International Society of Biomechanics (ISB) Euler angle conventions: Z (flexion/extension), X (abduction/adduction), and Y (internal/external rotation). Below is an overview of all available joints with a short description:
| # | Joint | Description |
|---|---|---|
| 1 | L5S1 | Joint between lumbar spine segment 5 and sacral spine segment 1 (ZXY) |
| 2 | L4L3 | Joint between lumbar spine segment 4 and lumbar spine segment 3 (ZXY) |
| 3 | L1T12 | Joint between lumbar spine segment 1 and thoracic spine segment 12 (ZXY) |
| 4 | C1Head | Joint between cervical spine segment 1 and the head segment (ZXY) |
| | Left and right: | |
| 5 | T4Shoulder | Joint between thoracic spine segment 4 and the MVN shoulder segment |
| 6 | Shoulder ZXY | Shoulder joint angle between the shoulder segment and the upper arm, calculated using the Euler sequence ZXY |
| 7 | Shoulder XZY | Shoulder joint angle between the shoulder segment and the upper arm, calculated using the Euler sequence XZY |
| 8 | Elbow | Joint between the upper arm and the forearm (ZXY) |
| 9 | Wrist | Joint between the forearm and the hand (ZXY) |
| 10 | Hip | Joint between the pelvis and the upper leg (ZXY) |
| 11 | Knee | Joint between the upper leg and the lower leg (ZXY) |
| 12 | Ankle | Joint between the lower leg and the foot (ZXY) |
| 13 | BallFoot | Joint between the foot and the calculated toe (ZXY) |
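A sketch of working with the IMU archive follows: extracting the tarball and selecting joint-related columns from one of the per-subfolder CSV files. The `knee_columns` helper and its column names are hypothetical and should be checked against the actual file headers in the release.

```python
import io
import tarfile
import pandas as pd

def extract_imu_archive(archive_path, dest="."):
    """Unpack IMU.tar.gz into its IMU_S<id>_<speed> subfolders of CSVs."""
    with tarfile.open(archive_path, "r:gz") as tar:
        tar.extractall(dest)

def knee_columns(joint_angles_csv):
    """Hypothetical helper: select knee-related columns from a
    Joint Angles ZXY file (exact headers may differ in the release)."""
    df = pd.read_csv(joint_angles_csv)
    return df.filter(like="Knee")  # columns whose names mention the knee

# Synthetic example with assumed column names (check against real files):
demo = io.StringIO("Frame,Right Knee z,Left Hip z\n0,10.0,5.0\n1,12.0,5.5\n")
knee = knee_columns(demo)  # keeps only the knee column(s)
```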
FP
Force plate data are provided in 173 .csv files named FP_S[subjectID]_[speed]. Missing files include: S3_0.75, S3_1, S8_0.75, S45_1. Each file contains bilateral ground reaction forces (Fx, Fy, Fz), moments (Mx, My, Mz), and center of pressure trajectories (COPx, COPy). Columns 1–11, whose headers begin with ‘1:’, correspond to the left leg, whereas columns 12–22, whose headers begin with ‘2:’, correspond to the right leg.
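Using the documented '1:'/'2:' header prefixes, a force plate file can be split into left- and right-leg blocks as sketched below; the demo uses two columns per leg in place of the real eleven, and its column names are placeholders.

```python
import io
import pandas as pd

def split_force_plate(path_or_buffer):
    """Split one FP_S<id>_<speed> file into left/right legs using the
    documented header prefixes: '1:' = left leg, '2:' = right leg."""
    df = pd.read_csv(path_or_buffer)
    left = df[[c for c in df.columns if c.startswith("1:")]]
    right = df[[c for c in df.columns if c.startswith("2:")]]
    return left, right

# Synthetic example with two columns per leg (real files have 11 each):
demo = io.StringIO("1:Fz,1:COPx,2:Fz,2:COPx\n700,0.01,0,0\n710,0.02,0,0\n")
left, right = split_force_plate(demo)
```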
Usage Notes
This multimodal gait dataset provides a resource for researchers in neuroscience, biomechanics, rehabilitation, and computational modeling. It combines synchronized EEG, EMG, IMU, and force plate recordings, enabling investigations that cannot be performed with single-modality datasets alone.
Previous use of the data
Part of the dataset (force plate recordings) has already been used to study the influence of body mass index (BMI) on gait dynamics across different walking speeds [9]. That study demonstrated BMI-specific biomechanical adaptations and successfully applied machine learning models to classify walking speeds. The current release makes the full multimodal dataset available, extending beyond kinetics to include cortical, muscular, and kinematic signals.
Reuse potential
The dataset can be applied to a broad range of research questions, including:
- Brain–muscle connectivity and neuromechanical control of gait.
- Speed-dependent adaptations in gait dynamics.
- Machine learning benchmarks for gait classification, gait recognition, or rehabilitation monitoring.
- Validation of wearable sensing technologies against force plate and IMU standards.
- Development of multimodal fusion algorithms in biomedical engineering and AI.
Known limitations
- The dataset includes only healthy young adults (mean age 24 ± 5 years), limiting its direct clinical generalizability to older or patient populations.
- A small number of files are missing due to technical interruptions; these are documented in the Data Description section.
- EEG signals recorded during gait are susceptible to motion artifacts and require careful preprocessing. A recommended re-referencing procedure is included with the dataset.
Complementary resources
The dataset is provided with example MATLAB code to re-reference EEG signals and reconstruct the Pz channel. Additional preprocessing can be performed using software such as EEGLAB (for EEG), EMGworks or Python toolboxes (for EMG), Xsens MVN (for IMU), and Bertec toolkits (for force plates).
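The bundled example code is in MATLAB; as a language-neutral sketch of the same idea, the snippet below applies a common-average re-reference to Pz-referenced data and reconstructs Pz as the (zero-valued) reference channel. This is an illustration of the standard technique, not the released script, and the channel ordering is assumed.

```python
import numpy as np

def car_with_pz(data):
    """Common-average re-reference for Pz-referenced EEG (channels x samples).

    Appends a reconstructed Pz row (zeros under the original reference),
    then subtracts the cross-channel mean from every sample.
    """
    pz = np.zeros((1, data.shape[1]))  # Pz equals the reference: 0 uV
    full = np.vstack([data, pz])
    return full - full.mean(axis=0, keepdims=True)

demo = np.array([[1.0, 2.0], [3.0, 4.0]])  # 2 channels, 2 samples
rereferenced = car_with_pz(demo)           # 3 rows; the last row is Pz
```

After common-average re-referencing, the channels sum to zero at every sample, which is a quick sanity check on the output.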
Release Notes
Version 1.0.0: Initial public release of the dataset.
Ethics
This study was conducted in accordance with the principles of the Declaration of Helsinki and was approved by the Institutional Review Board of Khalifa University (Protocol No. H19-038). All participants provided written informed consent prior to participation and were informed about the study objectives, experimental procedures, and their right to withdraw from the study at any time without consequence.
Data were collected exclusively from healthy adult volunteers (mean age 24 ± 5 years). No clinical interventions or invasive procedures were performed during the experiments. The dataset was anonymized prior to release, and no personally identifiable information is included. All data collection, processing, and sharing procedures comply with applicable research ethics and data protection regulations in the United Arab Emirates. Public sharing of the anonymized dataset on an open research platform is permitted under the approved Institutional Review Board protocol and Khalifa University research governance policies.
The primary benefit of releasing this dataset is to support research in gait analysis, rehabilitation engineering, neuroscience, and artificial intelligence by providing a comprehensive multimodal benchmark resource. Risks to participants are minimal due to the non-invasive nature of the measurements and the anonymization of all shared data.
No clinical trial registration was required, as this study did not involve clinical interventions. No animal data were collected.
Acknowledgements
We thank all participants who volunteered their time for this study. This work was conducted at Khalifa University’s Rehabilitation Engineering and Biomechanics Laboratory with support from the Biomedical Engineering Department.
Conflicts of Interest
The authors declare no competing interests.
References
1. di Biase L, Di Santo A, Caminiti ML, De Liso A, Shah SA, Ricci L, et al. Gait analysis in Parkinson’s disease: An overview of the most accurate markers for diagnosis and symptoms monitoring. Sensors. 2020;20(12):3529.
2. Rokalaboina J, Nibi TM, Tao W, Zhang W. Soft Inflatable Knee Exosuit for Flexion Assistance in Swing Phase. IFAC-PapersOnLine. 2024;58(28):462-7.
3. Na A, Buchanan TS. Self-reported walking difficulty and knee osteoarthritis influences limb dynamics and muscle co-contraction during gait. Human Movement Science. 2019;64:409-19.
4. Lim H, Yan S, Dee W, Pech V, Hameeduddin I, Roth EJ, et al. Motor interference on lateral pelvis shifting towards the paretic leg during walking and its cortical mechanisms in persons with stroke. European Journal of Neuroscience. 2024;60(6):5249-65.
5. Katmah R, Alshehhi A, Kosaji DJ, Al-Rahmani N, Albizreh A, Hulleck AAVA, Khalaf K. A methodology for quantifying neurological adaptation to physiotherapy in ACL-injured patients through muscle activity and brain connectivity analysis. In: ICBBE 2024 - Proceedings of 2024 11th International Conference on Biomedical and Bioinformatics Engineering; 2025. p. 157-161. doi:10.1145/3707127.3707153.
6. Alharthi A, Ozanyan K. Fusion from multimodal gait spatiotemporal data for human gait speed classifications. In: IEEE SENSORS 2021; 2021 Oct 31-Nov 4; Sydney, Australia. doi:10.1109/SENSORS47087.2021.9639816.
7. Duan F, Lv Y, Sun Z, Li J. Multi-scale Learning for Multimodal Neurophysiological Signals: Gait Pattern Classification as an Example. Neural Processing Letters. 2022;54(3):2455-70.
8. Katmah R, Al Shehhi A, Jelinek HF, Hulleck AA, Khalaf K. A systematic review of gait analysis in the context of multimodal sensing fusion and AI. IEEE Trans Neural Syst Rehabil Eng. 2023;31:4189-4202. doi:10.1109/TNSRE.2023.3325215.
9. Katmah R, AlShehhi A, Hulleck AA, Abdullah M, Khalaf K. AI-based analysis of BMI impact on gait dynamics across speeds. In: 2025 International Conference on Activity and Behavior Computing, ABC 2025; 2025 Apr 21-25; Al Ain, United Arab Emirates. doi:10.1109/ABC64332.2025.11118609.
10. World Health Organization. Obesity and overweight [Internet]. Geneva: World Health Organization; 2025 [cited 2026 Apr 15]. Available from: https://www.who.int/news-room/fact-sheets/detail/obesity-and-overweight
11. Mahdid Y, Lee U, Blain-Moraes S. Assessing the quality of wearable EEG systems using functional connectivity. IEEE Access. 2020;8:193214-25.
12. Robert-Lachaine X, Mecheri H, Larue C, Plamondon A. Validation of inertial measurement units with an optoelectronic system for whole-body motion analysis. Medical & Biological Engineering & Computing. 2017;55:609-19.
13. Belcic I, Rodić S, Dukarić V, Rupčić T, Knjaz D. Do blood lactate levels affect the kinematic patterns of jump shots in handball? International Journal of Environmental Research and Public Health. 2021;18(20):10809.
14. Belcic I, Ocic M, Dukaric V, Knjaz D, Zoretic D. Effects of One-Step and Three-Step Run-Up on Kinematic Parameters and the Efficiency of Jump Shot in Handball. Applied Sciences. 2023;13(6):3811.
Access
Access Policy:
Anyone can access the files, as long as they conform to the terms of the specified license.
License (for files):
Creative Commons Attribution 4.0 International Public License
Discovery
DOI (version 1.0.0):
https://doi.org/10.13026/r0ea-7161
DOI (latest version):
https://doi.org/10.13026/f7ez-gr08
Topics:
electroencephalography
biomechanics
neuroscience
gait analysis
force plate
kinematics
inertial measurement unit
electromyography
Files
Total uncompressed size: 6.5 GB.
Access the files
- Download the ZIP file (4.5 GB)
- Download the files using your terminal:
  wget -r -N -c -np https://physionet.org/files/multimodal-gait-dataset/1.0.0/
| Name | Size | Modified |
|---|---|---|
| Dataset | | |
| LICENSE.txt | 14.5 KB | 2026-04-27 |
| Project_Content.pdf | 703.5 KB | 2026-03-27 |
| RECORDS | 7.0 KB | 2026-03-30 |
| SHA256SUMS.txt | 70.2 KB | 2026-04-27 |
| figure1_eeg.png | 40.8 KB | 2026-03-27 |
| figure2_emg.png | 853.4 KB | 2026-03-27 |
| figure3_imu.png | 320.3 KB | 2026-03-27 |
| figure4_setup.png | 71.9 KB | 2026-03-27 |
| figure5_dataset_structure.png | 112.3 KB | 2026-03-27 |