Database Open Access

Body Sway When Standing and Listening to Music Modified to Reinforce Virtual Reality Environment Motion

Jefferson Streepey, Shaquitta Dent

Published: Jan. 26, 2021. Version: 1.0.0


When using this resource, please cite:
Streepey, J., & Dent, S. (2021). Body Sway When Standing and Listening to Music Modified to Reinforce Virtual Reality Environment Motion (version 1.0.0). PhysioNet. https://doi.org/10.13026/x32c-cz47.

Please include the standard citation for PhysioNet:
Goldberger, A., Amaral, L., Glass, L., Hausdorff, J., Ivanov, P. C., Mark, R., ... & Stanley, H. E. (2000). PhysioBank, PhysioToolkit, and PhysioNet: Components of a new research resource for complex physiologic signals. Circulation [Online]. 101 (23), pp. e215–e220.

Abstract

The increased likelihood of falls as a consequence of aging or disease is generally associated with rising levels of body sway during stance. The use of virtual reality (VR) for studying the role of vision in regulating body sway is well established; however, the extent to which music incorporated into a virtual environment can influence sway is not understood. This dataset was collected as part of a study exploring the hypothesis that music manipulated to match VR motion presented through an Oculus Rift head-mounted display can lead to increased levels of body sway. Twenty-eight subjects stood for 60 s on a balance platform that measured anterior-posterior (AP) and medial-lateral (ML) center of pressure movement (indicators of body sway). While standing, the subjects experienced combinations of 3 visual conditions (VR translation in the AP direction at 0.1 Hz, no translation, and eyes closed) and 4 music conditions (Mozart’s Jupiter Symphony modified to scale volume at 0.1 Hz and 0.25 Hz, unmodified music, and no music) for a total of 12 trials. Analysis of the data using frequency-domain measures may reveal the extent to which music influenced body sway.


Background

Increased levels of body sway are generally associated with falls in the elderly and in patient populations. It has been shown that translating a virtual reality (VR) environment around a standing observer can promote body sway [1,2,3]. VR has been used in clinical and rehabilitation contexts to simulate “real-world” situations in which unreliable visual information could promote sway and disrupt balance [4,5]. Such treatments are provided to help vulnerable populations develop strategies that improve balance and protect against falls.

In a “real-world” environment, however, there are also audio cues that could influence body sway. While the contributions of visual, vestibular, and somatosensory inputs to body sway have been well studied, the effects of different auditory inputs have not been conclusively demonstrated. Of interest in this study was the impact of music on sway. One study suggested that when the supplied sound was music, as opposed to white noise or tones, there was no effect on body sway [6]. Others have suggested that the type of music might matter, with one study demonstrating that listening to Mozart’s Jupiter reduced body sway compared with other pieces of music [7] and another suggesting that increasing the groove supplied by the music would increase body sway [8].

The novelty of this study is that we used sound to try to enhance the perception of visual motion as a VR environment translated forward and backward around the subject. To do this, we manipulated the sound of Mozart’s Jupiter so that it increased and decreased in volume at the same frequency as the VR translation. We hypothesized that adding such an audio cue would increase the changes in body sway observed with VR visual motion. We believed that the addition of music could give researchers and clinicians new avenues to explore when creating effective interventions for aging and patient populations.


Methods

Twenty-eight subjects (13 male, 15 female) aged 18-35, recruited from the Indiana University-Purdue University Indianapolis (IUPUI) student population, participated in the study. Subjects were excluded from the experiment if they had a history of vertigo, motion sickness, or vestibular deficits. The experimental protocol was approved by the Indiana University Institutional Review Board and was performed in accordance with the Declaration of Helsinki. All subjects gave their informed consent before inclusion in the study.

Subjects stood barefoot while wearing an Oculus Rift headset (Oculus VR, LLC, Irvine, CA), which presented a stereoscopic, three-dimensional, immersive virtual reality environment. The simulation was controlled by interactive application authoring software, Unity 5.0 (Unity Technologies, San Francisco, CA), which responded in real time to rotational and positional motion of the Oculus Rift to naturally update the visual environment with head motion. The virtual reality environment simulated by the headset was a two-way street with four-story buildings and storefronts lining the left and right sides of the street. A clear blue sky with scattered clouds could also be viewed above the horizon. On-ear headphones integrated into the Oculus Rift headset were used to play a selection of Mozart’s Jupiter symphony streamed from Apple Music (Apple Inc, Cupertino, CA). Jupiter was chosen because it has previously been shown to alter body sway during stance [7]. Music playback was controlled by the Max 7.3.5 application (Cycling ’74, Walnut, CA), an audio editing application that can be integrated with Unity to manipulate sounds in virtual simulations.

Time in seconds (Time), vertical force (Fz), moment about the medial-lateral axis (Mx), moment about the anterior-posterior axis (My), medial-lateral center of pressure location (COPx), and anterior-posterior center of pressure location (COPy) were collected at 1000 Hz with a balance plate (Bertec Corp., Columbus, OH) while subjects stood for 60 s under each of the following 12 conditions (a minimal example of loading one trial follows the list):

  1. With their eyes closed while listening to Mozart's Jupiter as its loudness shifted at 0.1 Hz.
  2. With their eyes closed while listening to Mozart's Jupiter as its loudness shifted at 0.25 Hz.
  3. With their eyes closed while listening to an unmodified version of Mozart's Jupiter.
  4. With their eyes closed while listening to no music.
  5. With their eyes open, looking at the VR environment translating in the anterior-posterior direction at 0.1 Hz, while listening to Mozart's Jupiter as its loudness shifted at 0.1 Hz.
  6. With their eyes open, looking at the VR environment translating in the anterior-posterior direction at 0.1 Hz, while listening to Mozart's Jupiter as its loudness shifted at 0.25 Hz.
  7. With their eyes open, looking at the VR environment translating in the anterior-posterior direction at 0.1 Hz, while listening to no music.
  8. With their eyes open, looking at the VR environment translating in the anterior-posterior direction at 0.1 Hz, while listening to an unmodified version of Mozart's Jupiter.
  9. With their eyes open, looking at the stationary VR environment, while listening to Mozart's Jupiter as its loudness shifted at 0.1 Hz.
  10. With their eyes open, looking at the stationary VR environment, while listening to Mozart's Jupiter as its loudness shifted at 0.25 Hz.
  11. With their eyes open, looking at the stationary VR environment, while listening to no music.
  12. With their eyes open, looking at the stationary VR environment, while listening to an unmodified version of Mozart's Jupiter.
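As a starting point, here is a minimal Python sketch for loading one trial and computing simple time-domain sway summaries from the COP columns. It assumes pandas and NumPy are installed; the subject folder and file name in the path are hypothetical examples.

    import numpy as np
    import pandas as pd

    FS = 1000.0  # sampling rate in Hz, as stated above

    # Hypothetical path: one subject folder (e.g., S14) and one trial file.
    trial = pd.read_csv("S14/WL1.csv")

    cop_ml = trial["COPx"].to_numpy()  # medial-lateral COP
    cop_ap = trial["COPy"].to_numpy()  # anterior-posterior COP

    # Peak-to-peak excursion and mean sway speed for each axis.
    for name, cop in (("ML", cop_ml), ("AP", cop_ap)):
        excursion = cop.max() - cop.min()
        mean_speed = np.mean(np.abs(np.diff(cop))) * FS
        print(f"{name}: excursion={excursion:.4f}, mean speed={mean_speed:.4f}")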

Data Description

The dataset consists of 336 comma-delimited (*.csv) data files, corresponding to 28 subjects with 12 trials each. The header row of each file contains the following columns (a cross-check relating the COP columns to the force and moment columns follows the list):

  • Time: elapsed time in seconds.
  • Fz: vertical force.
  • Mx: moment about the medial-lateral axis.
  • My: moment about the anterior-posterior axis.
  • COPx: medial-lateral center of pressure location.
  • COPy: anterior-posterior center of pressure location.
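For a force plate whose origin lies in the plane of the support surface, COP is conventionally related to the force and moment channels by COPx = -My/Fz and COPy = Mx/Fz. The sketch below applies this relation as a sanity check; sign conventions and any origin offset vary by manufacturer and are not documented here, so a residual difference does not necessarily indicate a problem.

    import pandas as pd

    trial = pd.read_csv("S14/ECL1.csv")  # hypothetical path

    # Standard plate-origin-at-surface relation; signs depend on the
    # plate's coordinate convention, so treat this as a cross-check only.
    cop_x_est = -trial["My"] / trial["Fz"]
    cop_y_est = trial["Mx"] / trial["Fz"]

    print("max |COPx - (-My/Fz)|:", (trial["COPx"] - cop_x_est).abs().max())
    print("max |COPy - (Mx/Fz)|:", (trial["COPy"] - cop_y_est).abs().max())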

The file naming convention is as follows (a sketch for loading the full set of files appears after the list):

  • ECL1.csv: eyes closed while listening to Mozart's Jupiter as its loudness shifted at 0.1 Hz.
  • ECL2.csv: eyes closed while listening to Mozart's Jupiter as its loudness shifted at 0.25 Hz.
  • ECR.csv: eyes closed while listening to an unmodified version of Mozart's Jupiter.
  • ECN.csv: eyes closed while listening to no music.
  • WL1.csv: viewing the VR environment translating in the anterior-posterior direction at 0.1 Hz while listening to Mozart's Jupiter as its loudness shifted at 0.1 Hz.
  • WL2.csv: viewing the VR environment translating in the anterior-posterior direction at 0.1 Hz while listening to Mozart's Jupiter as its loudness shifted at 0.25 Hz.
  • WN.csv: viewing the VR environment translating in the anterior-posterior direction at 0.1 Hz while listening to no music.
  • WR.csv: viewing the VR environment translating in the anterior-posterior direction at 0.1 Hz while listening to an unmodified version of Mozart's Jupiter.
  • WOL1.csv: viewing the stationary VR environment while listening to Mozart's Jupiter as its loudness shifted at 0.1 Hz.
  • WOL2.csv: viewing the stationary VR environment while listening to Mozart's Jupiter as its loudness shifted at 0.25 Hz.
  • WON.csv: viewing the stationary VR environment while listening to no music.
  • WOR.csv: viewing the stationary VR environment while listening to an unmodified version of Mozart's Jupiter.
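A short sketch for enumerating all 336 files follows. The root directory name is a hypothetical download location; subject folders are assumed to follow the S<number> pattern shown in the Files section below.

    from pathlib import Path
    import pandas as pd

    TRIALS = ["ECL1", "ECL2", "ECN", "ECR", "WL1", "WL2",
              "WN", "WR", "WOL1", "WOL2", "WON", "WOR"]

    root = Path("body-sway-data")  # hypothetical download location
    trials = {}
    for subject_dir in sorted(root.glob("S*")):  # subject folders, e.g., S14
        for name in TRIALS:
            path = subject_dir / f"{name}.csv"
            if path.exists():
                trials[(subject_dir.name, name)] = pd.read_csv(path)

    print(f"Loaded {len(trials)} trial files.")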

Usage Notes

Data from this study have been presented at the American College of Sports Medicine 2019 annual conference, and a manuscript for the study is to be submitted to PLOS ONE. We found that VR translations at 0.1 Hz matched with 0.1 Hz shifts in music volume led to increased anterior-posterior COP excursion, but other measures of body sway, such as sway velocities and variability, were not enhanced when music volume shifted with VR motion.

We suspect that our analyses of body sway excursions, velocities, and variability may not be as sensitive as other measures, especially those from the frequency domain. However, we lack the expertise to swiftly program and perform these analyses, and we would therefore like to open our data to the community. The uploaded files contain the raw data from the balance platform.
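As one possible starting point for such frequency-domain analyses, here is a sketch using SciPy's Welch estimator to measure the power of anterior-posterior COP motion near the 0.1 Hz stimulus frequency. All parameter choices (window length, band edges, file path) are illustrative assumptions, not the authors' method.

    import numpy as np
    import pandas as pd
    from scipy.signal import welch

    FS = 1000.0  # sampling rate in Hz
    trial = pd.read_csv("S14/WL1.csv")  # hypothetical path
    cop_ap = trial["COPy"].to_numpy()

    # A single full-length window is used because a 60 s record resolves
    # frequencies no finer than ~0.017 Hz; shorter windows would blur 0.1 Hz.
    freqs, psd = welch(cop_ap - cop_ap.mean(), fs=FS, nperseg=len(cop_ap))

    # Integrate the PSD in a narrow band around the 0.1 Hz drive frequency.
    band = (freqs >= 0.08) & (freqs <= 0.12)
    drive_power = np.trapz(psd[band], freqs[band])
    print(f"AP COP power near 0.1 Hz: {drive_power:.3e}")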


Release Notes

The original VR and music files are not available.


Acknowledgements

The authors would like to thank Kelly Berger and Skyler Stevens for help in the collection of data as well as Ben Smith from the IUPUI Department of Music and Arts Technology for overseeing the development of the VR environment and the music app.  The authors would like to thank Chauncey Frend and Jeff Rogers from the Indiana University UITS Advanced Visualization Lab for their efforts to support the hardware and software used in this study.


Conflicts of Interest

The authors have no conflicts of interest to declare.


References

  1. Fushiki H, Kobayashi K, Asai M, Watanabe Y. Influence of visually induced self-motion on postural stability. Acta Otolaryngol. 2005;125: 60-64.
  2. Tossavainen T, Juhola M, Pyykko I, Aalto H, Toppila E. Development of virtual reality stimuli for force platform posturography. Int J Med Inform. 2003;70: 277-283.
  3. Wang Y, Kenyon RV, Keshner EA. Identifying the control of physically and perceptually evoked sway responses with coincident visual scene velocities and tilt of the base of support. Exp Brain Res. 2010;201: 633-672.
  4. Phu S, Vogrin S, Al Saedi A, Duque G. Balance training using virtual reality improves balance and physical performance in older adults at high risk of falls. Clin Interv Aging. 2019;14:1567-1577.
  5. Duque G, Boersma D, Loza-Diaz G, Hassan S, Suarez H, Geisinger D, Suriyaarachchi P, Sharma A, Demontiero O. Effects of balance training using a virtual-reality system in older fallers. Clin Interv Aging. 2013;8:257-263.
  6. Palm HG, Strobel J, Achatz G, Luebken FV, Friemert B. The role and interaction of visual and auditory afferents in postural stability. Gait Posture. 2009;30: 328-333.
  7. Forti S, Filipponi E, Berardino FD, Barozzi S, Cesarani A. The influence of music on static posturography. J Vestib Res. 2010;20: 351-356.
  8. Ross JM, Warlaumont AS, Drew HA, Rigoli LM, Balasubramaniam R. Influence of musical groove on postural sway. J Exp Psychol Hum Percept Perform. 2016;42: 308-319.

Access

Access Policy:
Anyone can access the files, as long as they conform to the terms of the specified license.

License (for files):
Open Data Commons Attribution License v1.0

Discovery

DOI (version 1.0.0):
https://doi.org/10.13026/x32c-cz47

DOI (latest version):
https://doi.org/10.13026/76ht-2717


Files

Total uncompressed size: 1.1 GB.

Files are organized into one folder per subject (e.g., S14). Each subject folder contains the 12 trial files (ECL1.csv, ECL2.csv, ECN.csv, ECR.csv, WL1.csv, WL2.csv, WN.csv, WOL1.csv, WOL2.csv, WON.csv, WOR.csv, WR.csv), each approximately 3.4 MB.