
Multitaper spectra recorded during GABAergic anesthetic unconsciousness

John Abel, Marcus Badgeley, Benyamin Meschede-Krasa, Gabe Schamberg, Indie Garwood, Kimaya Lecamwasam, Sourish Chakravarty, David Zhou, Matt Keating, Patrick Purdon, Emery Brown

Published: April 19, 2021. Version: 1.0.0


When using this resource, please cite:
Abel, J., Badgeley, M., Meschede-Krasa, B., Schamberg, G., Garwood, I., Lecamwasam, K., Chakravarty, S., Zhou, D., Keating, M., Purdon, P., & Brown, E. (2021). Multitaper spectra recorded during GABAergic anesthetic unconsciousness (version 1.0.0). PhysioNet. https://doi.org/10.13026/m792-h077.

Additionally, please cite the original publication:

Abel et al., "Machine learning of EEG spectra classifies unconsciousness during GABAergic anesthesia," PLOS ONE, 2021.

Please include the standard citation for PhysioNet:
Goldberger, A., Amaral, L., Glass, L., Hausdorff, J., Ivanov, P. C., Mark, R., ... & Stanley, H. E. (2000). PhysioBank, PhysioToolkit, and PhysioNet: Components of a new research resource for complex physiologic signals. Circulation [Online]. 101 (23), pp. e215–e220.

Abstract

This database contains electroencephalography (EEG) multitaper spectra and associated conscious/unconscious labels for a cohort of 10 healthy volunteers undergoing stereotyped anesthetic administration and direct monitoring of subject responsiveness, and for 44 patients receiving anesthesia care in an operating room (OR). Cases recorded in the OR are further divided into those administered propofol alone under a total intravenous anesthesia (TIVA) approach and those who received either sevoflurane alone or sevoflurane following a propofol induction. These data were used to train and validate classification models for tracking unconsciousness. Corresponding code for building and testing classification models is publicly available.


Background

In the past several decades, significant advances have been made in understanding the unconscious brain during anesthesia [1]. These advances have been furthered in part by the observation that anesthetics produce characteristic patterns of EEG activity during unconsciousness, which relate to anesthetic dose, mechanism, and patient characteristics [1, 2, 3, 4]. It has also been proposed that EEG may be used intraoperatively to ensure adequate sedation/unconsciousness during surgical procedures. Specifically, EEG power spectra reliably differ during anesthesia-induced unconsciousness, and the power spectrum provides a relatively low-dimensional observation of the EEG that retains information about neural state. Thus, features extracted from EEG power spectra could be used to track neural state during anesthesia.

This data repository contains multitapered power spectral density estimates from a cohort of 10 healthy volunteers undergoing stereotyped propofol infusion regimens collected in [3], and from a cohort of surgical cases in the OR undergoing propofol and/or sevoflurane anesthesia. Propofol and sevoflurane are thought to produce unconsciousness by potentiation of γ-aminobutyric acid (GABA) neurotransmission, and concomitantly produce highly similar EEG spectral signatures. These data are used in [5] to construct classification models for tracking unconsciousness during anesthesia, with accompanying code in [6].


Methods

Data collection: volunteer cohort

Consent was obtained under the Massachusetts General Hospital IRB. EEG was recorded using a 64-channel BrainVision MRI Plus system (Brain Products) with a sampling rate of 5,000 Hz, bandwidth 0.016-1000 Hz, and resolution 0.5 μV least significant bit. The Fp1 channel was used for all further analysis. Subjects were instructed to close their eyes for the duration of the procedure to avoid eye-blink artifacts in the EEG.

Loss of consciousness (LOC) was recorded as the time at which the probability of response to both click and verbal cues dropped below 5%, as in [3]. Return of consciousness (ROC) was recorded as the time at which the probability of response to both click and verbal cues again exceeded 5%.

Data collection: OR cohort

A waiver of consent was obtained from the Massachusetts General Hospital IRB for retrospective collection and analysis of EEG data. Frontal EEG data were recorded using the Sedline brain function monitor (Masimo Corporation, Irvine, CA, USA). The EEG data were recorded with a pre-amplifier bandwidth of 0.5-92 Hz, sampling rate of 250 Hz, and with 16-bit, 29 nV resolution. The standard Sedline Sedtrace electrode array recorded from electrodes located approximately at positions Fp1, Fp2, F7, and F8, with ground at Fpz and reference electrode 1 cm above Fpz. Electrode impedance was less than 5 kΩ for each electrode.

LOC was recorded as the time surgery began. The time between induction and surgery start is unlabeled because it is unclear from retrospective data precisely when the patient lost consciousness. The data were recorded only until the end of the surgery, because it is likewise unclear retrospectively when a patient returns to consciousness after a surgery ends. Due to the unpredictable nature of general surgical cases, the EEG signal dropped out for portions of some OR cohort cases.

Signal processing

The data were segmented into epochs of 2s each. Multitapered spectral power between 0 and 50 Hz was calculated for each 2s epoch using the NiTime Python package. Multitaper spectral estimation was performed on each 2s window with no overlap, a time-half-bandwidth product (commonly abbreviated TW) of 3, and 2TW-1 = 3 tapers. Each 2s window was detrended with a linear detrend prior to calculating the power spectrum, as in [3].
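As an illustration of this step, the snippet below computes per-epoch multitaper spectra with nitime. The function name, the exact nitime keyword settings, and the dB conversion are assumptions for the sketch, not the repository's published code [6].

```python
# Minimal sketch of the per-epoch multitaper estimate described above.
# Assumptions: a 1-D EEG trace `eeg` sampled at `fs` Hz, 2 s non-overlapping
# epochs, a linear detrend per epoch, and NW = 3 as in the text.
import numpy as np
from scipy.signal import detrend
from nitime.algorithms import multi_taper_psd

def epoch_spectrogram(eeg, fs, epoch_s=2.0, nw=3, fmax=50.0):
    """Return (t, f, Sdb): epoch start times (s), frequency bins (Hz), power in dB."""
    n_per_epoch = int(round(epoch_s * fs))
    n_epochs = len(eeg) // n_per_epoch
    times, spectra = [], []
    for i in range(n_epochs):
        seg = eeg[i * n_per_epoch:(i + 1) * n_per_epoch]
        seg = detrend(seg, type="linear")                  # linear detrend per epoch
        f, psd, _ = multi_taper_psd(seg, Fs=fs, NW=nw, adaptive=False, jackknife=False)
        keep = f <= fmax                                   # retain 0-50 Hz bins
        spectra.append(10.0 * np.log10(psd[keep]))         # convert power to dB
        times.append(i * epoch_s)
    return np.array(times), f[keep], np.vstack(spectra)
```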

A label was generated for each 2s epoch with conscious (label = 1) corresponding to times before LOC or after ROC, and unconscious (label = 0) corresponding to times between LOC and ROC. For the OR data, epochs after induction but before LOC are labeled with NaN values.
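A minimal sketch of this labelling rule follows, assuming epoch start times in seconds and event times as floats; the helper name and arguments are hypothetical.

```python
# Conscious = 1 before LOC and after ROC, unconscious = 0 between LOC and ROC,
# NaN for OR epochs between induction and LOC (unassessed period).
import numpy as np

def label_epochs(t, loc_s, roc_s=None, induction_s=None):
    """t: epoch start times (s); loc_s/roc_s: LOC/ROC times (s); induction_s: OR induction time (s)."""
    t = np.asarray(t, dtype=float)
    labels = np.ones(len(t), dtype=float)              # conscious (1) by default
    end = roc_s if roc_s is not None else np.inf       # OR cases carry no ROC label
    labels[(t >= loc_s) & (t < end)] = 0.0             # unconscious between LOC and ROC
    if induction_s is not None:                        # OR cohort: induction-to-LOC is unassessed
        labels[(t >= induction_s) & (t < loc_s)] = np.nan
    return labels
```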


Data Description

Data contents

The data are split into “OR” and “Volunteer” folders for the two cohorts. Data are saved as .csv files according to the following naming convention: {CASEID}_{DATATYPE}.csv. Every case has a spectrogram calculated from N consecutive epochs, with power estimates for 100 frequency bins spanning 0-50 Hz.

CASEIDs are unique identifiers for each volunteer or OR case. The DATATYPEs for each case are listed below; a sketch for loading them follows the list.

  • Sdb: Spectrogram (dB): spectrogram calculated via multitaper for 2s epochs with no overlap between epochs.
  • f: Frequencies (Hz): frequency bins from spectral estimation. For all cases, f ranges from 0 to 50 Hz in steps of 0.5 Hz.
  • l: Consciousness label for each window: 0 for unconscious, 1 for conscious, NaN for OR induction epochs during which consciousness could not be assessed.
  • EEGquality: Boolean list indicating whether the EEG was recorded correctly for each spectrogram window. These files exist only for the OR cohort because the volunteer cohort had no EEG dropout. Dropout was inferred when the total power summed across frequency bins for a single 2s epoch was less than -30,000 dB.
  • t: Time in seconds since the beginning of the EEG recording. This array is the time axis for the Sdb, l, and EEGquality arrays.
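A possible way to read a single case under this convention is sketched below; the CSV orientation (epochs as rows) and the absence of header rows are assumptions to check against the actual files.

```python
# Illustrative loader for the {CASEID}_{DATATYPE}.csv naming convention above.
import numpy as np

def load_case(folder, case_id, has_quality=False):
    """Read the per-case CSV files and optionally drop epochs flagged by EEGquality."""
    def read(dtype):
        return np.loadtxt(f"{folder}/{case_id}_{dtype}.csv", delimiter=",")
    Sdb = read("Sdb")        # spectrogram in dB (assumed epochs x frequency bins)
    f = read("f")            # 100 bins, 0-50 Hz in 0.5 Hz steps
    labels = read("l")       # 1 conscious, 0 unconscious, NaN unassessed
    t = read("t")            # epoch times in seconds
    if has_quality:          # EEGquality exists for the OR cohort only
        good = read("EEGquality").astype(bool)
        Sdb, labels, t = Sdb[good], labels[good], t[good]
    return t, f, Sdb, labels
```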

The OR folder additionally contains rx_sorted_case_ids.yml, which divides cases by the anesthetic agent used into "pure_propofol" (only propofol used during the case), "pure_sevo" (only sevoflurane used during the case), and "mixed" (both propofol and sevoflurane used).
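Assuming the YAML file simply maps those three keys to lists of CASEIDs, it could be read as follows:

```python
# Group OR case IDs by the agent administered (file structure is an assumption).
import yaml

with open("OR/rx_sorted_case_ids.yml") as fh:
    rx_groups = yaml.safe_load(fh)

propofol_ids = rx_groups["pure_propofol"]
sevo_ids = rx_groups["pure_sevo"]
mixed_ids = rx_groups["mixed"]
```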

A third directory, "Volunteer_CNN", includes features generated via transfer learning using a pretrained convolutional neural network (CNN) with the procedure described in detail in [5]. CNNs use convolution operations to compute spatial features in an image, here corresponding to time/frequency relationships within the EEG recording. A previously described neural network architecture called MobileNet was used [7]. Briefly, the CNN input was the multitapered power spectral estimate for the preceding 30 s with 2 s steps between windows (28 s overlap). Pixel intensities for an entire case were normalized to values between 0 and 1. The CNN mapped each 30 s spectrogram window into 1280 features, resulting in the full CNN feature matrix. A PCA was then performed on the CNN feature matrix as described in [5], reducing it to its first ten principal component (PC) scores. These CNN features are included here for comparison with the other features and for use with the code if desired [6].
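The published featurization pipeline is described in [5] and implemented in [6]; purely as a hedged illustration, the sketch below uses a Keras MobileNetV2 backbone (whose globally pooled output happens to be 1280-dimensional), 30 s windows formed from fifteen 2 s epochs, per-case 0-1 normalization, and PCA to ten components. All function and parameter choices here are assumptions, not the authors' exact code.

```python
# Hedged sketch: spectrogram windows -> pretrained CNN features -> 10 PC scores.
import numpy as np
import tensorflow as tf
from sklearn.decomposition import PCA

def cnn_pca_features(Sdb, epochs_per_window=15, n_components=10):
    """Map 30 s spectrogram windows (15 x 2 s epochs) to PCA scores of CNN features."""
    S = (Sdb - Sdb.min()) / (Sdb.max() - Sdb.min())       # normalize the whole case to [0, 1]
    windows = [S[i - epochs_per_window:i]                  # one window per 2 s step
               for i in range(epochs_per_window, S.shape[0] + 1)]
    imgs = tf.image.resize(np.stack(windows)[..., None], (224, 224))
    imgs = tf.repeat(imgs, 3, axis=-1)                     # grayscale -> 3 channels
    backbone = tf.keras.applications.MobileNetV2(
        include_top=False, pooling="avg", input_shape=(224, 224, 3), weights="imagenet")
    feats = backbone.predict(imgs, verbose=0)              # (n_windows, 1280)
    return PCA(n_components=n_components).fit_transform(feats)
```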

Limitations

The EEG signature of unconsciousness is both anesthetic-dependent and age-dependent [1, 4]. Although we did not explicitly account for age in this study, there is a known decline in frontal alpha (8-12 Hz) EEG power during GABAergic anesthesia with aging, with significant inter-individual variation. Furthermore, healthy individuals may exhibit differences in their EEG spectra compared with sicker individuals. Subjects with known neurological conditions were excluded from the healthy volunteer cohort, and we did not note any neurological conditions among the subjects in the OR cohort. OR data were collected during routine surgical procedures, and because propofol is generally used for procedures where the patient is not intubated, more invasive surgical procedures were performed in the sevoflurane group. We direct the reader to refs. [1, 2] for introductory material on interpreting EEG power spectra during anesthesia.


Usage Notes

Prior use

The data were used to build and test classification models for classifying conscious/unconscious status during general anesthesia [5]. Briefly, spectrograms collected from 7 of the 10 volunteer subjects were featurized via canonical frequency bands (slow-delta, theta, alpha, etc.), principal component analysis (PCA), linear discriminant analysis (LDA), and a pretrained convolutional neural network (CNN); the CNN features are included in this repository, and [5] details how they were generated. The resulting feature sets were temporally linked via hidden Markov modeling (HMM), and logistic regression classifiers were trained on each feature set or on the HMM states to classify conscious/unconscious status.
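As a rough illustration of the simplest of these feature sets, the sketch below averages dB power in canonical bands and feeds the result to a logistic regression classifier; the band edges and variable names are illustrative assumptions rather than the published configuration [5].

```python
# Canonical band-power featurization + logistic regression (illustrative only).
import numpy as np
from sklearn.linear_model import LogisticRegression

# Band edges in Hz are assumptions, not the published choices.
BANDS = {"slow_delta": (0.1, 4), "theta": (4, 8), "alpha": (8, 12),
         "beta": (12, 25), "gamma": (25, 50)}

def band_features(Sdb, f):
    """Average dB power in each canonical band for every 2 s epoch (rows of Sdb)."""
    return np.column_stack([Sdb[:, (f >= lo) & (f < hi)].mean(axis=1)
                            for lo, hi in BANDS.values()])

# Usage sketch: X_train stacks band features across training cases and y_train
# holds the 0/1 labels (NaN-labelled OR induction epochs removed beforehand).
# clf = LogisticRegression(max_iter=1000).fit(X_train, y_train)
```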

Corresponding code for building and testing classification models is publicly available at [6].


Release Notes

This is the first release of this database.


Conflicts of Interest

Masimo Corporation has licensed and paid royalties on intellectual property to Massachusetts General Hospital created by EN Brown and PL Purdon regarding the use of EEG to track neural state during anesthesia. EN Brown and PL Purdon are also cofounders of PASCALL, a company developing closed loop physiological control systems for anesthesiology.


References

  1. Brown, Emery N., Ralph Lydic, and Nicholas D. Schiff. "General anesthesia, sleep, and coma." New England Journal of Medicine 363.27 (2010): 2638-2650.
  2. Purdon, Patrick L., et al. "Clinical electroencephalography for anesthesiologists: part I: background and basic signatures." Anesthesiology: The Journal of the American Society of Anesthesiologists 123.4 (2015): 937-960.
  3. Purdon, Patrick L., et al. "Electroencephalogram signatures of loss and recovery of consciousness from propofol." Proceedings of the National Academy of Sciences 110.12 (2013): E1142-E1151.
  4. Purdon, Patrick L., et al. "The ageing brain: age-dependent changes in the electroencephalogram during propofol and sevoflurane general anaesthesia." British journal of anaesthesia 115 (2015): i46-i57.
  5. Abel, John H., et al. “Machine learning of EEG spectra classifies unconsciousness during GABAergic anesthesia”, PLOS ONE, (2021) manuscript in press
  6. Code for model building and training associated with these data: https://github.com/johnabel/GABAergic_unconsciousness
  7. Howard, Andrew G., et al. "Mobilenets: Efficient convolutional neural networks for mobile vision applications." arXiv preprint arXiv:1704.04861 (2017).

Access

Access Policy:
Only registered users who sign the specified data use agreement can access the files.

License (for files):
PhysioNet Restricted Health Data License 1.5.0

Data Use Agreement:
PhysioNet Restricted Health Data Use Agreement 1.5.0
