Database Open Access

MICRO Motion capture data from groups of participants standing still to auditory stimuli (2012)

Victor Gonzalez, Agata Zelechowska, Alexander Refsum Jensenius

Published: Oct. 5, 2020. Version: 1.0


When using this resource, please cite:
Gonzalez, V., Zelechowska, A., & Jensenius, A. R. (2020). MICRO Motion capture data from groups of participants standing still to auditory stimuli (2012) (version 1.0). PhysioNet. https://doi.org/10.13026/4h9d-gf10.

Please include the standard citation for PhysioNet:
Goldberger, A., Amaral, L., Glass, L., Hausdorff, J., Ivanov, P. C., Mark, R., ... & Stanley, H. E. (2000). PhysioBank, PhysioToolkit, and PhysioNet: Components of a new research resource for complex physiologic signals. Circulation [Online]. 101 (23), pp. e215–e220.

Abstract

This dataset comprises head motion data collected during an experiment in which groups of people were asked to stand still for 6 minutes, with the first 3 minutes in silence, followed by 3 minutes with music. Head motion was captured in millimetres at 100 Hz using a Qualisys infrared optical motion capture system. The experiment was carried out at the University of Oslo on 8 March 2012 with a total of 113 participants. Code to read and process the files is available. The paper describing this work is Jensenius et al., "The Musical Influence on People's Micromotion when Standing Still in Groups", Proceedings of the 14th Sound and Music Computing Conference (2017).


Background

It is commonly assumed that listening to musical sound, particularly dance music with a clear pulse, makes us move. However, most empirical studies of music-induced motion have focused on voluntary and fairly large-scale movement. This dataset was collected as part of a study investigating the effects of musical stimuli on motion when participants try to remain at rest. We collected optical motion capture data from groups of people instructed to stand as still as possible, with and without musical stimuli, and then examined the differences in movement between the two conditions.


Methods

For a full description of the methods, see [1]. To summarize, we recruited 113 participants during the Open Day at the University of Oslo, after approval by the Norwegian Centre for Research Data (NSD), project identification number NSD2457. The instantaneous position of a reflective marker placed on the head of each participant was recorded using a Qualisys infrared motion capture system (13 Oqus 300/500 cameras) running at 100 Hz. The data were recorded in eight groups of 12-17 participants at a time.

Participants were asked to stand as still as possible for 6 minutes, starting with 3 minutes in silence, followed by 3 minutes with music. Participants knew that music would start after 3 minutes and were free to choose their standing posture. The distribution of participants in the recording space was standardized across trials with marks on the floor indicating approximate foot positions. The motion capture system was started and stopped automatically with the stimulus playback system, so all recordings are exactly 6 minutes long.

Experimental stimuli

The stimuli used in the experiment were:

  • A. 00:00-03:00: Silence.
  • B. 03:00-06:00: Musical excerpts:
  1. Lento (#3) from György Ligeti's Ten Pieces for Wind Quintet (20s)
  2. Allegro con delicatezza (#8) from György Ligeti's Ten Pieces for Wind Quintet (15s)
  3. Adagio from Joaquín Rodrigo's Concierto de Aranjuez (40s)
  4. Winter movement from Vivaldi's The Four Seasons (20s)
  5. Left & Right by D'Angelo, featuring Method Man & Redman (35s)
  6. Marcando la distancia by Manolito y su Trabuco (20s)
  7. Cubic by 808 State (30s)
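
For orientation, the durations listed above imply the following nominal segment onsets within each 6-minute recording. This is a small Python sketch; the authoritative boundaries are those in the stimulus .wav file itself.

```python
# Nominal onset times (seconds) of each musical excerpt, derived from
# the durations listed above. Silence occupies 0-180 s; the seven
# excerpts fill the remaining 180 s.
durations = [20, 15, 40, 20, 35, 20, 30]  # excerpts 1-7, in seconds

onsets, t = [], 180
for d in durations:
    onsets.append(t)
    t += d

print(onsets)  # [180, 200, 215, 255, 275, 310, 330]
print(t)       # 360 -> the stimuli span exactly 6 minutes
```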

Discography

  • Ligeti, G. (1998). Ten Pieces for Wind Quintet. On György Ligeti Edition: Chamber Music [CD]. Sony Music Entertainment Inc. Performed by London Winds.
  • Vivaldi, A. (1994). Winter movement (Allegro non molto) from Concerto in F Minor, Op. 8/4, RV 297. On Vivaldi: The Four Seasons [CD]. Teldec. Performed by Il Giardino Armonico.
  • D'Angelo. (2000). Voodoo [CD]. Cheeba Sound, Virgin.

Questionnaire

In a post-experiment questionnaire, participants were asked to self-report demographics and details such as whether they were standing with their eyes open or closed, and whether they had their knees locked.


Data Description

The following data types are provided:

  • Motion (marker position): recorded with Qualisys Track Manager and saved as tab-separated .tsv files. Data from each group of participants is saved in a separate file (a minimal loading sketch follows this list).
  • Stimuli: audio .wav file containing 3 minutes of silence and 3 minutes of consecutive samples of the tracks described above.
  • Demographics: descriptive data collected from participants in the post-experiment questionnaire, including age, sex, music listening habits, knee posture, and whether eyes were open or closed during the experiment. A value of 0 indicates that the participant answered "no" to the question, 1 indicates "yes", and 0.5 indicates that the participant reported changing between states; for example, 0.5 in the "Locked knees?" column means that the participant reported switching between unlocked and locked knees. The file also contains derived movement measures: QoM (quantity of movement), QoM w/o M (quantity of movement without music), QoM w M (quantity of movement with music), and NoMus-Mus Diff (the difference in quantity of movement between the no-music and music conditions).
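
As a minimal starting point, the Python sketch below loads one of the motion files. It assumes the standard Qualisys Track Manager .tsv export layout, in which a metadata header (NO_OF_FRAMES, FREQUENCY, MARKER_NAMES, ...) precedes the tab-separated position data; since the header length can vary between exports, the sketch scans for the first numeric row rather than hard-coding a row count. The canonical loading code is in the project repository [2].

```python
import pandas as pd

def load_qtm_tsv(path):
    """Load marker positions from a Qualisys Track Manager .tsv export."""
    with open(path) as f:
        for skip, line in enumerate(f):
            first_field = line.split("\t")[0]
            try:
                float(first_field)  # first row of numeric data reached
                break
            except ValueError:
                continue  # still inside the metadata header
    return pd.read_csv(path, sep="\t", skiprows=skip, header=None)

# Example: load the recording for group A (one file per group).
positions = load_qtm_tsv("mocap_data/nmA.tsv")
print(positions.shape)  # roughly 36000 rows: 6 minutes at 100 Hz
```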

Usage Notes

Python and MATLAB code to process the data, as well as the Max/MSP patch used to play the stimuli and synchronize them with the motion capture system, are available on GitHub [2]. A static version of this codebase is also available in the "code" directory of this project. Our paper describing the analysis of this dataset is Jensenius et al., "The Musical Influence on People's Micromotion when Standing Still in Groups", Proceedings of the 14th Sound and Music Computing Conference [1].

We encourage others to validate our work and build on it by applying novel analytical methods. In particular, our methods do not examine between-group differences or interpersonal synchronization within groups. Alternative approaches that cluster conditions or incorporate the position of individual participants in the capture volume could also yield relevant insights. Additionally, our work looked only at average movement with music versus in silence; we encourage others to examine the time-domain and frequency-domain properties of the time series. A minimal starting point for computing such movement measures is sketched below.
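
The sketch below computes a per-participant quantity of movement from a marker trajectory and compares the silence and music halves of a recording, mirroring the QoM columns in the demographics file. The definition used here (mean frame-to-frame displacement, scaled to mm/s) is an assumption for illustration; see the paper [1] and the repository [2] for the measure used in the published analysis.

```python
import numpy as np

FPS = 100          # motion capture frame rate (Hz)
SPLIT = 180 * FPS  # frame index of the silence/music boundary (3 minutes)

def quantity_of_movement(xyz, fps=FPS):
    """Mean frame-to-frame displacement of a marker, scaled to mm/s.

    xyz: array of shape (n_frames, 3) with marker positions in mm.
    This is one common micromotion measure; the published analysis
    may define QoM differently (see [1], [2]).
    """
    steps = np.linalg.norm(np.diff(xyz, axis=0), axis=1)  # mm per frame
    return steps.mean() * fps

def qom_summary(xyz):
    """QoM without music, with music, and their difference (NoMus-Mus Diff)."""
    qom_silence = quantity_of_movement(xyz[:SPLIT])
    qom_music = quantity_of_movement(xyz[SPLIT:])
    return qom_silence, qom_music, qom_silence - qom_music
```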


Acknowledgements

This work was partially supported by the Research Council of Norway through its Centres of Excellence scheme, project numbers 262762 and 250698.


Conflicts of Interest

The authors declare that the research was conducted in the absence of any commercial or financial relationships that could be construed as a potential conflict of interest.


References

  1. Jensenius, A. R., Zelechowska, A., & Gonzalez Sanchez, V. E. (2017). The musical influence on people's micromotion when standing still in groups. In Proceedings of the SMC Conferences (pp. 195-200). Aalto University. https://www.duo.uio.no/handle/10852/56047
  2. Code for processing and analysing data for the MICRO project. https://github.com/fourMs/MICRO

Access

Access Policy:
Anyone can access the files, as long as they conform to the terms of the specified license.

License (for files):
Creative Commons Attribution 4.0 International Public License


Files

Total uncompressed size: 114.3 MB.

The mocap_data folder contains one motion capture file per group:

  • nmA.tsv (8.0 MB)
  • nmB.tsv (9.7 MB)
  • nmC.tsv (11.6 MB)
  • nmD.tsv (10.6 MB)
  • nmE.tsv (8.1 MB)
  • nmF.tsv (6.2 MB)
  • nmG.tsv (10.7 MB)
  • nmH.tsv (9.9 MB)