Database Open Access
MICRO Motion capture data from groups of participants standing still to auditory stimuli (2012)
Victor Gonzalez, Agata Zelechowska, Alexander Refsum Jensenius
Published: Oct. 5, 2020. Version: 1.0
When using this resource, please cite:
Gonzalez, V., Zelechowska, A., & Jensenius, A. R. (2020). MICRO Motion capture data from groups of participants standing still to auditory stimuli (2012) (version 1.0). PhysioNet. https://doi.org/10.13026/4h9d-gf10.
Please include the standard citation for PhysioNet:
Goldberger, A., Amaral, L., Glass, L., Hausdorff, J., Ivanov, P. C., Mark, R., ... & Stanley, H. E. (2000). PhysioBank, PhysioToolkit, and PhysioNet: Components of a new research resource for complex physiologic signals. Circulation [Online]. 101 (23), pp. e215–e220.
This dataset comprises head motion observations collected as part of an experiment in which groups of people were asked to stand still for 6 minutes, with the first 3 minutes in silence, followed by 3 minutes with music. Head motion was captured in units of mm at 100 Hz using a Qualisys infrared optical system. The experiment was carried out at the University of Oslo on March 8, 2012, with a total of 113 participants. Code to read and process these files is available. The paper corresponding to the work is Jensenius et al., "The Musical Influence on People's Micromotion when Standing Still in Groups", Proceedings of the 14th Sound and Music Computing Conference (2017).
It is commonly assumed that listening to musical sound, and particularly dance music with a clear pulse, makes us move. However, most empirical studies of music-induced motion have mainly focused on voluntary and fairly large-scale movement. This dataset was collected as part of a study which aimed to investigate the effects of music stimuli on movement when participants try to remain at rest. We collected data through optical motion capture from groups of people instructed to stand as still as possible with and without music stimuli. We then looked at the differences in movement between conditions.
For a full description of the methods, see Jensenius et al. (2017). To summarize, we recruited 113 participants during the Open Day at the University of Oslo after approval by the Norwegian Center for Research Data (NSD), with project identification number NSD2457. The instantaneous position of a reflective marker placed on the head of each participant was recorded using a Qualisys infrared motion capture system (13 Oqus 300/500 cameras) running at 100 Hz. The data were recorded in 8 groups of 12-17 participants at a time.
Participants were asked to stand as still as possible for 6 minutes, starting with 3 minutes in silence and followed by 3 minutes with music. Participants were aware that music would start after 3 minutes, and were free to choose their standing posture. The distribution of participants in the recording space was standardized across trials with marks on the floor indicating the approximate feet position. The motion capture system was triggered and stopped automatically with the stimuli playback system, thus all recordings are exactly 6 minutes long.
The stimuli used in the experiment were:
- A. 00:00-03:00: Silence.
- B. 03:00-06:00: Musical excerpts:
- Lento (#3) from György Ligeti Ten Pieces for Wind Quintet (20s)
- Allegro con delicatezza (#8) from György Ligeti Ten Pieces for Wind Quintet (15s)
- Adagio from Joaquin Rodrigo's Concierto de Aranjuez (40s)
- Winter movement from Vivaldi's The Four Seasons (20s)
- Left & Right by D'Angelo, featuring Method Man & Redman (35s)
- Marcando la distancia by Manolito y su trabuco (20s)
- Cubic by 808 State (30s)
- Ligeti, G. (1998). Ten Pieces for Wind Quintet. On György Ligeti Edition: Chamber Music [CD]. Sony Music Entertainment Inc. Performed by London Winds.
- Vivaldi, A. Winter (Allegro non molto) from Concerto in F Minor, Op. 8/4, RV 297. On Vivaldi: The Four Seasons [CD]. Il Giardino Armonico. Teldec, 1994.
- D'Angelo (2000). Voodoo [CD]. Cheeba Sound, Virgin.
In a post-experiment questionnaire, participants were asked to self-report demographics and details such as whether they were standing with their eyes open or closed, and whether they had their knees locked.
The following data types are provided:
- Motion (marker position): Recorded with Qualisys Track Manager and saved as tab-separated .tsv files. Data from each group of participants is saved in a separate file.
- Stimuli: audio .wav file containing 3 minutes of silence and 3 minutes of consecutive samples of the tracks described above.
- Demographics: descriptive data collected from participants in a post-experiment questionnaire, including age, sex, music listening habits, knee posture, and whether eyes were closed or open during the experiment. A value of 0 indicates that the participant answered "no" to the question; a value of 1 indicates "yes"; and a value of 0.5 indicates that the participant reported changing between states. For example, a value of 0.5 in the Locked knees? column means that the participant reported switching between unlocked and locked knees. QoM is quantity of movement; QoM w/oM is quantity of movement without music; QoM w M is quantity of movement with music; NoMus-Mus Diff is the difference in quantity of movement without music and with music.
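As a sketch of how these files might be loaded, the snippet below reads a Qualisys .tsv export and decodes the questionnaire values described above. The filename, and the assumption that the numeric data is preceded by roughly ten metadata rows, are hypothetical; check the actual files and the project's own scripts before relying on them.

```python
import csv

def load_qualisys_tsv(path, n_header_rows=10):
    """Read marker positions from a Qualisys .tsv export.

    Qualisys exports typically begin with a block of metadata rows before
    the numeric data; n_header_rows=10 is an assumption and may need to
    be adjusted for the files in this dataset.
    """
    with open(path, newline="") as f:
        rows = list(csv.reader(f, delimiter="\t"))
    header = rows[n_header_rows - 1]          # column names
    data = [[float(x) for x in row] for row in rows[n_header_rows:]]
    return header, data

def decode_answer(value):
    """Map the questionnaire coding to a label (0 = no, 1 = yes, 0.5 = changed)."""
    return {0: "no", 1: "yes", 0.5: "changed between states"}.get(value, "unknown")
```

For example, `decode_answer(0.5)` applied to the Locked knees? column recovers the "switched between states" interpretation given above.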
Python and MATLAB code to process the data, as well as the Max/MSP patch used to play and synchronize stimuli with the motion capture system, are available on GitHub. A static version of this codebase is also available in the "code" directory of this project. Our paper describing the analysis of this dataset is Jensenius et al., "The Musical Influence on People's Micromotion when Standing Still in Groups", Proceedings of the 14th Sound and Music Computing Conference (2017).
We encourage others to validate our work and build on it by applying novel analytical methods. In particular, our methods do not examine between-group differences or interpersonal synchronization within groups. Alternative approaches that cluster conditions or incorporate the position of individual participants in the capture volume could also yield relevant insight. Additionally, our work looked only at average movement with music vs. in silence. We encourage others to examine the time-domain and frequency-domain properties of the time series.
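As a starting point for such analyses, a minimal sketch of a quantity-of-movement computation is shown below: cumulative per-frame displacement of a head marker, split at the 3-minute mark into silence and music conditions. The exact QoM definition used in the paper may differ; only the 100 Hz rate and the 3 min + 3 min structure come from the dataset description, and the trajectory layout is an assumption.

```python
import math

FPS = 100       # sampling rate stated in the dataset description
SPLIT_S = 180   # first 3 minutes silence, last 3 minutes music

def quantity_of_motion(xyz):
    """Sum of Euclidean distances between consecutive (x, y, z) samples, in mm."""
    return sum(math.dist(a, b) for a, b in zip(xyz, xyz[1:]))

def qom_by_condition(xyz, fps=FPS, split_s=SPLIT_S):
    """Return (QoM without music, QoM with music) for one marker trajectory."""
    split = split_s * fps
    return quantity_of_motion(xyz[:split + 1]), quantity_of_motion(xyz[split:])
```

Normalizing each half by its duration would give the average movement per second, comparable to the per-condition QoM columns in the demographics table.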
This work was partially supported by the Research Council of Norway through its Centres of Excellence scheme, project numbers 262762 and 250698.
Conflicts of Interest
The authors declare that the research was conducted in the absence of any commercial or financial relationships that could be construed as a potential conflict of interest.
- Jensenius, A. R., Zelechowska, A., & Gonzalez Sanchez, V. E. (2017). The musical influence on people's micromotion when standing still in groups. In Proceedings of the SMC Conferences (pp. 195-200). Aalto University. https://www.duo.uio.no/handle/10852/56047
- Code for processing and analysing data for the MICRO project. https://github.com/fourMs/MICRO
Anyone can access the files, as long as they conform to the terms of the specified license.
License (for files):
Creative Commons Attribution 4.0 International Public License
Total uncompressed size: 114.3 MB.
Access the files
- Download the ZIP file (55.8 MB)
- Access the files using the Google Cloud Storage Browser. Login with a Google account is required.
Access the data using the Google Cloud command line tools (please refer to the gsutil
documentation for guidance):
gsutil -m -u YOUR_PROJECT_ID cp -r gs://music-motion-2012-1.0.physionet.org DESTINATION
Download the files using your terminal:
wget -r -N -c -np https://physionet.org/files/music-motion-2012/1.0/