Database Open Access

Regulation of Brain Cognitive States through Auditory, Gustatory, and Olfactory Stimulation with Wearable Monitoring

Hamid Fekri Azgomi, Luciano Branco, Md Rafiul Amin, Saman Khazaei, Rose T. Faghih

Published: Dec. 18, 2023. Version: 1.0.0


When using this resource, please cite:
Fekri Azgomi, H., Branco, L., Amin, M. R., Khazaei, S., & Faghih, R. T. (2023). Regulation of Brain Cognitive States through Auditory, Gustatory, and Olfactory Stimulation with Wearable Monitoring (version 1.0.0). PhysioNet. https://doi.org/10.13026/jr6f-wc85.

Additionally, please cite the original publication:

Fekri Azgomi, H., Branco, L. R. F., Amin, M. R., Khazaei, S., & Faghih, R. T. (2023). Regulation of brain cognitive states through auditory, gustatory, and olfactory stimulation with wearable monitoring. Scientific Reports, 13(1), 12399.

Please include the standard citation for PhysioNet:
Goldberger, A., Amaral, L., Glass, L., Hausdorff, J., Ivanov, P. C., Mark, R., ... & Stanley, H. E. (2000). PhysioBank, PhysioToolkit, and PhysioNet: Components of a new research resource for complex physiologic signals. Circulation [Online]. 101 (23), pp. e215–e220.

Abstract

Inspired by advances in wearable technologies, we designed and performed human-subject experiments to investigate the impact of applying safe actuation (i.e., auditory, gustatory, and olfactory stimulation) for the purpose of regulating the cognitive arousal state and enhancing the performance state. In two separate experiments, participants were instructed to engage in a working memory task known as the n-back task. To modulate their brain state, we introduced various interventions: listening to different genres of music in the first experiment, and consuming coffee and smelling perfume in the second experiment. Employing only wearable devices for human monitoring and using safe actuation interventions are the key components of these experiments. The dataset contains subjects' correct/incorrect responses along with their reaction times. We employed two Empatica E4 wristbands and one Muse headband to collect physiological data. The Empatica data include participants' electrodermal activity (EDA), heart rate (HR), blood volume pulses (BVP), skin surface temperature, photoplethysmography (PPG), and 3-axis accelerometer data. Using the Muse headband, we also collected electroencephalography (EEG) signals from four sensors. By creating a dataset that addresses the current lack of publicly available data on using wearable devices and safe everyday stimuli to manage internal brain states, our research enables further investigations into machine learning and system identification and paves the way for smarter work environments. The ultimate goal is to develop practical, automated, personalized closed-loop systems that can effectively regulate internal brain states and enhance overall quality of life.


Background

Engaging in any activity has the potential to induce cognitive stress. This holds true for workplaces, where stress can arise, as well as for educational settings, where the cognitive load of learning can lead to stress. To achieve heightened productivity and maintain it over time, it becomes crucial to elevate cognitive arousal levels and prevent disengagement. The Yerkes-Dodson law from psychology provides valuable insight into this phenomenon: it suggests that an individual's cognitive performance is influenced by their level of cognitive arousal, following an inverted U-shaped relationship. In other words, performance peaks when arousal is at an optimal level. Therefore, it is vital to regulate and maintain arousal within this optimal range.

To explore the correlation between cognitive performance and internal arousal levels, our focus is on examining the variations in cognitive arousal while individuals are under cognitive strain [3]. Since the internal arousal state remains hidden, we are taking an indirect approach to address this issue [4]. When subjected to cognitive stress stimuli, the brain responds in various ways, akin to its response to other internal or external stimuli. Observing brain signals through methods such as Electroencephalography (EEG) [5] or functional Near-Infrared Spectroscopy (fNIRS) [6] can provide insights into how the brain reacts to environmental stimuli. Alongside direct changes in brain activity, there are also fluctuations in other physiological signals like heart rate (HR), blood volume pulses (BVP), and electrodermal activity (EDA), all of which carry vital information about the internal arousal state. Recent advancements in wearable technologies have opened exciting opportunities to explore human brain responses in a more practical manner. Wearable devices, in contrast to expensive, high-precision research-grade technologies, are designed for seamless integration into daily life and intelligent work environments [7]. Notably, their cost-effectiveness and portability render wearable technologies particularly appealing in the realms of emotion recognition and human performance [8].

To effectively monitor and understand changes in cognitive arousal, we propose the utilization of wearable technologies. These devices offer a means to indirectly infer internal brain states by tracking and analyzing physiological data. By leveraging wearable technology, it becomes possible to gain insights into an individual's internal arousal state and make informed adjustments accordingly. By employing wearable technologies to track and investigate cognitive arousal, we can strive for enhanced performance and better understand the interplay between cognitive states and external activities.

This research primarily concentrates on utilizing wearable technologies to monitor physiological responses. Our approach involves the use of Empatica E4 wristbands and a Muse headband to gather data from participants while they are subjected to tasks inducing cognitive stress. The Empatica E4 wristband integrates noninvasive sensors for the collection of various physiological signals, including EDA, BVP, photoplethysmography (PPG), 3-axis accelerometer data, and skin temperature. Furthermore, the Muse headband is employed to directly record brain electrical activity using a noninvasive EEG technique [9]. In contrast to other research-grade devices that capture EEG signals from the entire scalp, which are impractical for everyday use, the Muse headband gathers EEG signals from four channels [10].

Here, we propose to induce cognitive stress by utilizing a widely researched working memory experiment known as n-back tasks. By designing and conducting n-back experiments, we seek to explore the brain's responses during periods of cognitive load. To effectively regulate cognitive arousal and improve performance, we propose examining the effects of employing safe actuation methods, specifically auditory, gustatory, and olfactory stimuli. These sensory interventions can potentially impact cognitive states and enhance overall performance levels. By investigating the influence of these stimuli, we aim to gain insights into their potential for optimizing cognitive arousal and improving task performance.

Numerous studies have explored the impact of various types of music, considered a safe actuation method, on individuals' internal states [11]. Beyond music, there are other forms of safe actuation that can also influence cognitive behavior. Examples include caffeine intake [12] and olfactory stimulants [13], both of which have the potential to regulate brain states effectively. In Experiment 1, we propose employing music as a safe intervention to regulate internal arousal states and enhance performance. By using music as a stimulus, we aim to optimize cognitive performance levels and modulate cognitive arousal. In Experiment 2, we specifically focus on the effects of two types of safe actuation: smelling perfume and drinking coffee. Through analytical investigations, we aim to examine how these stimuli impact participants' cognitive performance states. We hypothesize that using safe actuators would improve the cognitive performance state and influence the cognitive arousal state.

By conducting these experiments, we aim to further our understanding of the relationship between safe actuation, cognitive performance, and cognitive arousal, with the goal of identifying strategies to enhance performance and regulate cognitive states effectively. To ensure the practicality of our research findings in everyday life, we exclusively rely on wearable technologies for our investigations. While research-grade technologies offer higher precision and are more costly, wearable devices are specifically designed to seamlessly integrate into everyday activities and smart work environments.

While previous studies have explored the effects of safe actuation in influencing physiological signals, the implementation of these findings in real-world settings requires a systematic approach. Our proposed experiments represent the first step towards investigating the regulation of cognitive brain states using safe actuation, exclusively through wearable technologies. The insights obtained from this study hold immense potential for practical applications in the closed-loop regulation of internal brain states.

Furthermore, the resulting dataset from our human-in-the-loop experiments has significant potential for further analysis, particularly in modeling the dynamics of the proposed safe actuation in modulating internal brain states. This deeper understanding of human neurophysiology would contribute to the advancement of knowledge in the field and pave the way for future practical applications.


Methods

All experiments were performed in the Computational Medicine Lab (CML) at the University of Houston. During the experiment, the subject was seated comfortably in an armchair and wore a Muse headband and an Empatica E4 wristband on each hand, while looking at a screen to perform memory-related n-back tasks. All methods were performed in accordance with the Declaration of Helsinki and current ethical guidelines. This pilot study includes two sets of experiments.

Subjects were recruited from members of the University of Houston community (i.e., students and postdocs) for these experiments. In Experiment 1, 15 participants (9 males, 6 females) were recruited in total. In Experiment 2, 13 participants (10 males, 3 females) were recruited in total. Participants were required to be at least 18 years old. All participants read and signed an informed consent document. Each participant in the study was assigned a unique ID, ensuring the protection of their personal information. As part of the deidentification process, all time stamps from the various recordings were uniformly shifted. This consistent approach maintains the anonymity and confidentiality of the participants across all data sources, enabling comprehensive analysis without compromising individual privacy.

In the designed n-back experiments, subjects were shown each stimulus for 500 ms, followed by a plus sign (fixation cross) displayed for 1500 ms, during which they could respond. Each session consisted of an instruction screen lasting 5 s and 16 trials, each of which included 22 stimuli. There were 10 s breaks in between trials and 20 s of relaxation in between the 16 trials. To indicate their response, participants pressed the target (green) or non-target (red) button on a Chronos response device. Before the start of the experiment, they were provided with instructions regarding the tasks and performed a couple of practice trials (i.e., one 1-back and one 3-back trial).

In session one of Experiment 1, subjects performed the n-back tasks with no music. In the next three sessions, they were asked to repeat the tasks while listening to their choice of relaxing music, exciting music, and newly generated relaxing music (i.e., AI-generated music matched to their taste in relaxing music). The break times after the first, second, and third sessions were three, six, and three minutes, respectively.

In Experiment 2, participants performed the n-back tasks with no actuation in the first session. Before the second session, they were asked to smell their choice of fragrance and were given six minutes to apply this actuation; they were then asked to repeat the n-back tasks. In the third session, they were provided with their regular coffee and asked to sit down and drink it during a thirty-minute break while resting; they were then asked to repeat the n-back tasks.


Data Description

This dataset encompasses the outcomes of two distinct experiments, each containing participants' correct/incorrect responses accompanied by their response times, documented as 'n_back_responses'. Alongside these behavioral responses, the dataset incorporates physiological measurements collected from two Empatica devices, one worn on each hand, as well as data obtained from a Muse headband.

To present the data in a structured manner, each participant's data is contained within a designated folder, numbered sequentially from A1 to A10 in Experiment 1 and B1 to B10 in Experiment 2. The dataset also includes the data from participants who were excluded from further analysis in [1]; their folders are named 'Excluded_ID_number', where the ID numbers correspond to those presented in the supplementary materials of [1]. The column “Start_time_unix” shows the initial time of each recording expressed as a UNIX timestamp in UTC.
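As an illustration, the participant folders can be enumerated with a short script such as the minimal sketch below; it assumes the dataset has been downloaded and extracted locally (the root path is a hypothetical placeholder) and searches recursively, since the exact sub-layout of the CSV files within each participant folder may vary.

    from pathlib import Path

    # Hypothetical local path to the extracted dataset; adjust as needed.
    data_root = Path("path/to/dataset")

    for experiment in ("Experiment_1", "Experiment_2"):
        exp_dir = data_root / experiment
        if not exp_dir.is_dir():
            continue
        # Participant folders: A1-A10 or B1-B10, plus any 'Excluded_<ID>' folders.
        for subject_dir in sorted(p for p in exp_dir.iterdir() if p.is_dir()):
            csv_files = sorted(f.name for f in subject_dir.rglob("*.csv"))
            print(experiment, subject_dir.name, csv_files)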

The experiment settings and behavioral signals can be found in the ‘n_back_responses.csv’ file. The .csv file is in the following format:

  • The “ExperimentName” column specifies the experiment within the study.
  • The “Instruction” column specifies the n-back task presented in each trial.
  • The “Running [Block]” column indicates the actuation applied within each session.

In the designed n-back experiments, subjects were shown each stimulus for 500 ms, followed by a plus sign (fixation cross) displayed for 1500 ms, during which they could respond. The behavioral signals, including the reaction time (RT) and correct/incorrect response, are stored under the “Stimulus” and “Fixation” categories.

In experiment 1, the reaction times (milliseconds) associated with session 1 to session 4 are stored within “Fixation101.RT” to “Fixation104.RT” and “Stimulus101.RT” to “Stimulus104.RT”. The binary correct/incorrect responses associated with session 1 to session 4 are stored within “Fixation101.ACC” to “Fixation104.ACC” and “Stimulus101.ACC” to “Stimulus104.ACC”.

In experiment 2, the reaction times (milliseconds) associated with session 1 to session 3 are stored within “Fixation101.RT” to “Fixation103.RT” and “Stimulus101.RT” to “Stimulus103.RT”. The binary correct/incorrect responses associated with session 1 to session 3 are stored within “Fixation101.ACC” to “Fixation103.ACC” and “Stimulus101.ACC” to “Stimulus103.ACC”.
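As a minimal illustration of how these behavioral columns might be read, the sketch below loads one participant's 'n_back_responses.csv' with pandas and summarizes per-session reaction times and accuracies from the “Stimulus” columns (responses logged during the fixation window can be handled analogously via the corresponding “Fixation” columns). The file path is a hypothetical placeholder, and the delimiter and the convention used for unanswered trials should be verified against the actual files.

    import pandas as pd

    # Hypothetical path to one participant's behavioral file; adjust as needed.
    df = pd.read_csv("path/to/A1/n_back_responses.csv")

    # Sessions are numbered 101-104 in Experiment 1 and 101-103 in Experiment 2.
    for s in (101, 102, 103, 104):
        col_rt, col_acc = f"Stimulus{s}.RT", f"Stimulus{s}.ACC"
        if col_rt not in df.columns:
            continue  # session not present in this experiment/file
        rt = pd.to_numeric(df[col_rt], errors="coerce")
        acc = pd.to_numeric(df[col_acc], errors="coerce")
        answered = rt > 0  # assume 0/blank marks windows with no recorded response
        print(f"Session {s}: mean RT = {rt[answered].mean():.0f} ms, "
              f"mean accuracy = {acc.mean():.2f}")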

The EEG signals recorded from the Muse headband are also included in the dataset as 'EEG_recording.csv'. They include EEG channels from four locations: TP9, AF7, AF8, and TP10. The 2016 Muse model (MU-02) outputs raw EEG data at 256 Hz and can also output raw EEG from the right-ear USB auxiliary connector (named Right AUX in the CSV files). The column (timestamps) shows the sample times as UNIX timestamps. EEG measurements are labeled TP9, AF7, AF8, TP10, and Right AUX.
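As an example, the EEG channels can be loaded and given a relative time axis with a short pandas sketch such as the one below; the file path is a hypothetical placeholder, and the column names follow the description above.

    import pandas as pd

    # Hypothetical path to one participant's Muse recording; adjust as needed.
    eeg = pd.read_csv("path/to/A1/EEG_recording.csv")

    channels = ["TP9", "AF7", "AF8", "TP10", "Right AUX"]
    # Convert absolute UNIX timestamps to seconds from the start of the recording.
    t = eeg["timestamps"] - eeg["timestamps"].iloc[0]

    # The nominal raw EEG rate of the MU-02 is 256 Hz; the empirical rate can be
    # checked directly from the timestamps.
    fs_empirical = (len(t) - 1) / (t.iloc[-1] - t.iloc[0])
    print(f"Approximate sampling rate: {fs_empirical:.1f} Hz")
    print(eeg[channels].describe())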

The Empatica data encompass a range of physiological signals, including participants' electrodermal activity (EDA), heart rate (HR), blood volume pulses (BVP), skin surface temperature, photoplethysmography (PPG), and 3-axis accelerometer data. These data types are stored as separate CSV files ('EDA.csv', 'TEMP.csv', 'ACC.csv', 'BVP.csv', 'HR.csv', 'IBI.csv', and 'tags.csv'), and data recorded by the Empatica wristbands on the left and right hands are each stored in their own distinct CSV files, encompassing the following attributes:

  • 'Left_EDA.csv' and 'Right_EDA.csv'
  • 'Left_TEMP.csv' and 'Right_TEMP.csv'
  • 'Left_ACC.csv' and 'Right_ACC.csv'
  • 'Left_BVP.csv' and 'Right_BVP.csv'
  • 'Left_HR.csv' and 'Right_HR.csv'
  • 'Left_IBI.csv' and 'Right_IBI.csv'
  • 'Left_tags.csv' and 'Right_tags.csv'

For clarity, a concise summary of the contents of the different data files is provided below; a short loading sketch follows the list.

  • EDA: Measurements from the electrodermal activity (EDA) sensor expressed as microsiemens (μS). Values in the first column (EDA) show the EDA values. The column (start_time_unix) shows the initial time of the recording expressed as UNIX timestamp in UTC. The column (sampling_rate) shows the sample rate expressed in Hz.
  • TEMP: Skin temperature (TEMP) measured from the temperature sensor, expressed in degrees Celsius (°C). Values in the first column (TEMP) show the TEMP values. The column (start_time_unix) shows the initial time of the recording expressed as UNIX timestamp in UTC. The column (sampling_rate) shows the sample rate expressed in Hz.
  • ACC: Data from 3-axis accelerometer sensor. The accelerometer is configured to measure acceleration in the range [-2g, 2g]. Therefore, the unit in this file is 1/64g. Data from x, y, and z axis are labeled respectively (i.e., ACC_X, ACC_Y, and ACC_Z).  The column (start_time_unix) shows the initial time of the recording expressed as UNIX timestamp in UTC. The column (sampling_rate) shows the sample rate expressed in Hz.
  • BVP: Blood volume pulses (BVP) measured from a photoplethysmograph. Values in the first column (BVP) show the BVP values. The column (start_time_unix) shows the initial time of the recording expressed as UNIX timestamp in UTC. The column (sampling_rate) shows the sample rate expressed in Hz.
  • HR: Average heart rate (HR) extracted from the BVP signal. Values in the first column (HR) show the HR values. The column (start_time_unix) shows the initial time of the recording expressed as UNIX timestamp in UTC. The column (sampling_rate) shows the sample rate expressed in Hz.
  • IBI: Time between individual heartbeats extracted from the BVP signal. The first column (IBI_time) shows the time (with respect to the initial time) of the detected inter-beat interval expressed in seconds (s). The second column (IBI_intervals) shows the duration in seconds (s) of the detected inter-beat interval (i.e., the distance in seconds from the previous beat). No sample rate is needed for this file.
  • tags: Event marker times from each Empatica device. Each row corresponds to a physical button press on the device, made at the same time the status LED is first illuminated. The time is expressed as a UNIX timestamp in UTC and is synchronized with the initial time of the recordings indicated in the related data files from the corresponding session.
    Because some tags were received inadvertently during the experiments, the corrected and unified tags are also included as 'tags.csv'.
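Since each Empatica file stores its samples alongside a single start time and sampling rate, a time axis has to be reconstructed for analysis. The sketch below illustrates this for one EDA file and also converts the accelerometer values from raw 1/64 g units to g; the file paths are hypothetical placeholders, and the column names follow the descriptions above.

    import numpy as np
    import pandas as pd

    # Hypothetical paths to one participant's Empatica files; adjust as needed.
    eda = pd.read_csv("path/to/A1/Left_EDA.csv")
    acc = pd.read_csv("path/to/A1/Left_ACC.csv")

    def add_time_axis(df, value_cols):
        """Rebuild absolute UNIX timestamps from start_time_unix and sampling_rate."""
        t0 = df["start_time_unix"].iloc[0]
        fs = df["sampling_rate"].iloc[0]
        df = df.copy()
        df["time_unix"] = t0 + np.arange(len(df)) / fs
        return df[["time_unix"] + value_cols].copy()

    eda = add_time_axis(eda, ["EDA"])                       # EDA in microsiemens
    acc = add_time_axis(acc, ["ACC_X", "ACC_Y", "ACC_Z"])   # raw units of 1/64 g

    # Convert accelerometer readings to g.
    acc[["ACC_X", "ACC_Y", "ACC_Z"]] = acc[["ACC_X", "ACC_Y", "ACC_Z"]] / 64.0

    print(eda.head())
    print(acc.head())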

Usage Notes

The dataset has been utilized in a publication to demonstrate how the proposed safe actuation interventions can enhance the cognitive performance state and regulate the arousal state [1]. In [1], to estimate cognitive performance, we analyzed subjects' correct/incorrect responses as well as their reaction times and modeled their internal cognitive performance state. We also analyzed changes in EDA and EEG signals to further examine the effects of safe actuation on participants' physiological data. The experimental results verify our hypothesis regarding the efficacy of the proposed safe actuation in regulating internal brain states.

A promising future avenue for this study lies in the detailed examination of various physiological data and the associated biomarkers. Further analysis of the published dataset has the potential to enrich our comprehension of human cognitive science. While the study reports the overall positive effects of the proposed safe actuation in boosting cognitive performance across all participants, diverse physiological responses were observed during the experiments. Thus, a person-specific analysis would help illuminate the physiological foundations behind the observed enhancements in participants' cognitive performance levels.

Conducting additional experiments with a larger subject pool would contribute to the accumulation of a more diverse dataset. Given the variations in individual reactions and potential delays in physiological responses to different forms of actuation, integrating actuation dynamics into the development of wearable machine interface (WMI) architectures could be advantageous. These envisioned research directions have the potential to deepen our understanding of the impact of sensory stimulation on cognitive states and aid in the creation of new interventions for improving cognitive performance within a closed-loop system.

In such practical closed-loop WMI architectures, a wearable device collects physiological data from individuals, a decoder estimates the internal cognitive brain state(s), and a controller incorporates personalized actuation dynamics to suggest appropriate and safe actuation, thereby enabling regulation of the hidden states to desired levels within a closed-loop framework. With continuous advancements in wearable technologies, the proposed research opens avenues for addressing mental health-related disorders through remote monitoring capabilities. The proposed real-time monitoring and regulation toolsets offer personalized, effective suggestions and medications with minimized side effects, ultimately enhancing the overall quality of life for individuals.
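To make this pipeline concrete, the sketch below outlines such a closed-loop structure in a few lines of Python. It is purely illustrative: the function names, the threshold-based decoder, and the rule-based controller are hypothetical stand-ins for the model-based state estimators and personalized actuation dynamics described in [1] and [3].

    # Illustrative closed-loop sketch: sensor -> decoder -> controller -> actuation.
    # All names and rules here are hypothetical placeholders, not the authors' method.

    def decode_arousal(eda_window):
        """Toy decoder: summarize an EDA window into a scalar arousal estimate."""
        return sum(eda_window) / len(eda_window)

    def choose_actuation(arousal, low=0.3, high=0.7):
        """Toy controller: pick a safe actuation that pushes arousal toward mid-range."""
        if arousal < low:
            return "exciting music"   # raise arousal
        if arousal > high:
            return "relaxing music"   # lower arousal
        return None                   # arousal already in the desired range

    def closed_loop(stream_of_eda_windows):
        for window in stream_of_eda_windows:
            arousal = decode_arousal(window)
            action = choose_actuation(arousal)
            if action is not None:
                print(f"estimated arousal {arousal:.2f} -> suggest {action}")

    # Example with synthetic data standing in for wearable measurements.
    closed_loop([[0.1, 0.2, 0.15], [0.5, 0.55, 0.6], [0.9, 0.85, 0.8]])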


Ethics

All the experimental procedures and corresponding documents were approved by the institutional review board at the University of Houston, TX, USA (STUDY 00002490). 


Acknowledgements

This work was supported in part by the NSF CAREER Award through Multimodal Intelligent Noninvasive brain state Decoder for Wearable AdapTive Closed-loop arcHitectures (MINDWATCH) under Grant 1942585/2226123, in part by an NSF Grant through Wearable-Machine Interface Architectures (CRII: CPS) under Grant 1755780, and in part by New York University start-up funds.


Conflicts of Interest

Rose T. Faghih and Md. Rafiul Amin are co-inventors of a patent application filed by the University of Houston related to this research [2]. The rest of the authors have no conflicts of interest to declare that are relevant to the content of this article.


References

  1. Fekri Azgomi, Hamid, et al. "Regulation of brain cognitive states through auditory, gustatory, and olfactory stimulation with wearable monitoring." Scientific Reports 13.1 (2023): 12399. https://www.nature.com/articles/s41598-023-37829-z
  2. Faghih, R. T., Wickramasuriya, D. S. & Amin, M. R. Systems and methods for estimating a nervous system state based on measurement of a physiological condition (2022). US Patent App. 17/514,129.
  3. Azgomi, Hamid Fekri, Iahn Cajigas, and Rose T. Faghih. "Closed-loop cognitive stress regulation using fuzzy control in wearable-machine interface architectures." IEEE Access 9 (2021): 106202-106219. https://ieeexplore.ieee.org/abstract/document/9492148/
  4. Wickramasuriya, Dilranjan S., Md Rafiul Amin, and Rose T. Faghih. "Skin conductance as a viable alternative for closing the deep brain stimulation loop in neuropsychiatric disorders." Frontiers in Neuroscience 13 (2019): 780. https://www.frontiersin.org/articles/10.3389/fnins.2019.00780/full
  5. Lin, Yuan-Pin, Chi-Hong Wang, Tzyy-Ping Jung, Tien-Lin Wu, Shyh-Kang Jeng, Jeng-Ren Duann, and Jyh-Horng Chen. "EEG-based emotion recognition in music listening." IEEE Transactions on Biomedical Engineering 57, no. 7 (2010): 1798-1806. https://ieeexplore.ieee.org/abstract/document/5458075
  6. Yaghmour, Anan, Md Rafiul Amin, and Rose T. Faghih. "Decoding a music-modulated cognitive arousal state using electrodermal activity and functional near-infrared spectroscopy measurements." In 2021 43rd Annual International Conference of the IEEE Engineering in Medicine & Biology Society (EMBC), pp. 1055-1060. IEEE, 2021. https://ieeexplore.ieee.org/abstract/document/9630879
  7. Giorgi, Andrea, Vincenzo Ronca, Alessia Vozzi, Nicolina Sciaraffa, Antonello Di Florio, Luca Tamborra, Ilaria Simonetti et al. "Wearable technologies for mental workload, stress, and emotional state assessment during working-like tasks: A comparison with laboratory technologies." Sensors 21, no. 7 (2021): 2332. https://www.mdpi.com/1424-8220/21/7/2332
  8. Paletta, Lucas, N. M. Pittino, Michael Schwarz, Verena Wagner, and K. Wolfgang Kallus. "Human factors analysis using wearable sensors in the context of cognitive and emotional arousal." Procedia Manufacturing 3 (2015): 3782-3787. https://www.sciencedirect.com/science/article/pii/S2351978915008811
  9. Clarke, Adam R., Robert J. Barry, Diana Karamacoska, and Stuart J. Johnstone. "The EEG theta/beta ratio: a marker of arousal or cognitive processing capacity?." Applied psychophysiology and biofeedback 44 (2019): 123-129. https://link.springer.com/article/10.1007/s10484-018-09428-6
  10. Tsiakas, Konstantinos, Maher Abujelala, and Fillia Makedon. "Task engagement as personalization feedback for socially-assistive robots and cognitive training." Technologies 6, no. 2 (2018): 49. https://www.mdpi.com/2227-7080/6/2/49
  11. Ramírez, Andrea Valenzuela, Gemma Hornero, Daniel Royo, Angel Aguilar, and Oscar Casas. "Assessment of emotional states through physiological signals and its application in music therapy for disabled people." IEEE access 8 (2020): 127659-127671. https://ieeexplore.ieee.org/abstract/document/9137201
  12. McLellan, Tom M., John A. Caldwell, and Harris R. Lieberman. "A review of caffeine’s effects on cognitive, physical and occupational performance." Neuroscience & Biobehavioral Reviews 71 (2016): 294-312. https://www.sciencedirect.com/science/article/pii/S0149763416300690
  13. Porcherot, Christelle, Sophie Raviot-Derrien, Marie-Pierre Beague, Sven Henneberg, Michelle Niedziela, Kathryn Ambroze, and Jean A. McEwan. "Effect of context on fine fragrance-elicited emotions: Comparison of three experimental methodologies." Food Quality and Preference 95 (2022): 104342. https://www.sciencedirect.com/science/article/pii/S0950329321002251

Access

Access Policy:
Anyone can access the files, as long as they conform to the terms of the specified license.

License (for files):
Open Data Commons Attribution License v1.0


Files

Total uncompressed size: 2.7 GB.

Name            Size      Modified
Experiment_1
Experiment_2
LICENSE.txt     19.9 KB   2023-12-11
Readme.txt      3.0 KB    2023-10-20
SHA256SUMS.txt  45.9 KB   2023-12-18