Research on the extension of respiratory interaction modalities in virtual reality technology and innovative methods for healing anxiety disorders



The therapeutic process designed in this study is divided into three stages, as shown in Fig. 4. In the first stage, users put on the VR headset and breath-sensor hardware and launch the program. In the second stage, on the selection menu page, users autonomously choose among three therapeutic methods, guided by prompts and introductions, according to their needs. In the third stage, after entering the therapeutic space corresponding to their choice, users are introduced to the process and rules. Once they confirm, the session begins: users adjust their breathing rate and technique according to the rules of the level and, aided by vibration and sound cues, interact with objects in the scene to complete tasks. Different feedback is presented on success or failure, and users can then choose to continue or end the therapy.
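A minimal C++ sketch of this three-stage flow is given below; the state names and transition helper are hypothetical and only mirror the workflow just described, not identifiers from the study's implementation.

```cpp
#include <cstdint>

// Hypothetical session states mirroring the three-stage workflow described above.
enum class ESessionStage : uint8_t {
    Setup,          // Stage 1: headset and breath sensor worn, program launched
    MethodSelect,   // Stage 2: user picks one of the three therapeutic methods
    Therapy,        // Stage 3: guided breathing and object interaction in the scene
    Result          // Success/failure feedback; user continues or ends the session
};

// Illustrative transition helper: advances the session after each stage completes.
inline ESessionStage NextStage(ESessionStage Current, bool bContinueAfterResult) {
    switch (Current) {
        case ESessionStage::Setup:        return ESessionStage::MethodSelect;
        case ESessionStage::MethodSelect: return ESessionStage::Therapy;
        case ESessionStage::Therapy:      return ESessionStage::Result;
        case ESessionStage::Result:
            return bContinueAfterResult ? ESessionStage::MethodSelect
                                        : ESessionStage::Setup;
    }
    return Current; // unreachable; silences compiler warnings
}
```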

Breath interaction modality

To clarify how the therapeutic process should be represented visually in virtual reality, this study surveys natural breath interactions observed in everyday life, such as blowing out flames, diving, and dispersing smoke. The aim is to identify the most effective way to apply breath healing methods in interactive virtual reality scenarios, extracting the most familiar and direct subjective sensations from these natural interactions and relating them to the interactive process. Ultimately, blowing out flames, dispersing clouds, and diving were selected as the three interactive scenarios, and their related elements were integrated into the design to create intuitive visual feedback.

Fig. 4

Interactive system overall architecture and user workflow.

Integrating virtual reality with Arduino-based smart sensing, this study designs an immersive guidance method for breath interaction. Users are guided through the breath interaction by the visual interface of the scene as well as by auditory and tactile cues. The sensor detects the user’s breathing activity and feeds the data into Unreal Engine to drive interaction within the virtual scene, and visual feedback is presented to the user in the VR headset. Gamified level transitions and reward mechanisms guide users through the complete breath healing process.
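As an illustration of the sensing side, a minimal Arduino-style sketch is shown below. It assumes an analog breath sensor on pin A0, a 115200-baud serial link, and a 20 Hz sampling interval, none of which are values reported in the study; it simply streams readings that the Unreal Engine side (via the SerialCom plugin) can parse.

```cpp
// Minimal Arduino-side sketch (illustrative): samples an analog breath sensor
// and streams the readings over serial so the Unreal Engine level Blueprint
// (via the SerialCom plugin) can read them. The pin, baud rate, and sample
// interval below are assumptions, not values reported in the study.

const int BREATH_PIN = A0;          // assumed analog input for the breath sensor
const unsigned long SAMPLE_MS = 50; // assumed 20 Hz sampling interval

void setup() {
  Serial.begin(115200);             // baud rate must match the Unreal-side setting
}

void loop() {
  int raw = analogRead(BREATH_PIN); // 0-1023 on a typical AVR board
  Serial.println(raw);              // one reading per line for easy parsing
  delay(SAMPLE_MS);
}
```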

Fig. 5

Breath modality interaction semantics and scene designs for the three breath healing methods.

As illustrated in Fig. 5, this study proposes a set of breath modality interaction semantics for the three breath healing methods applied within the system. The cyclic breathing method, based on diaphragmatic breathing, stimulates the nervous system and alleviates negative emotions; its interaction semantics are summarized as peace and calmness, and its interaction method is slow inhalation and exhalation. The sustained exhalation method improves lung function and strengthens the respiratory muscles; its interaction semantics are continuous output and combating negative emotions, and its interaction method is short inhalation followed by slow exhalation. Breath-holding counteracts hyperventilation, increases the carbon dioxide content in cells, and protects brain cells; its semantics are maintaining a safe state and a sustainable sense of security, and its interaction method is sustained breath-holding without exhalation.
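For reference, the sketch below collects the three methods, their semantics, and their expected breathing patterns into a small lookup structure; the type and field names are hypothetical and the strings simply restate the semantics above.

```cpp
#include <array>
#include <string>

// Illustrative summary of the three breath interaction semantics described above.
// The type, field names, and exact wording are hypothetical.
struct FBreathMethod {
    std::string Name;
    std::string Semantics;
    std::string Pattern;   // expected inhale/exhale behaviour
};

static const std::array<FBreathMethod, 3> kBreathMethods = {{
    {"Cyclic (diaphragmatic) breathing", "peace and calmness",
     "slow inhalation and slow exhalation, repeated in cycles"},
    {"Sustained exhalation", "continuous output, combating negative emotions",
     "short inhalation followed by one long, slow exhalation"},
    {"Breath-holding", "maintaining a safe state, sustainable sense of security",
     "single inhalation followed by sustained breath-holding"}
}};
```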

Virtual reality immersive interaction scenarios

During the process of scenario design, the concepts of flames, oceans, and clouds were used as keywords for extended design. These concepts correspond to the three types of breath healing methods, allowing users to choose based on their own needs. Each level is designed with a main scene as well as success and failure transition scenes, as illustrated in Fig. 5.

“Flame altar” diaphragmatic breathing interaction scene

The first level, themed “Flame Altar,” corresponds to the diaphragmatic breathing method in breath therapy. The scene is set at an ancient altar surrounded by flames, with intense sunlight streaming from above to create a hot, scorching atmosphere. Users interact with the flames through exhalation, following guided steps to progressively extinguish them via cyclic breathing. Upon successful completion of the level, the scene transitions to a success scenario in which the original flames are replaced with flowers, greenery, and waterfalls, echoing the comfort felt after successful healing. Conversely, the failure scene adds more flames as a gamified penalty for not extinguishing them in time.

The specific interaction process is as follows. At the start of the level, the first flame lights up. The user points the controller at the corresponding torch and clicks it, which triggers a breathing-guidance audio clip and controller vibration. The user then follows the guide to perform cyclic breathing. Breath volume data is transmitted from the sensor to Unreal Engine; in the level Blueprint, the communication port and baud rate are set using nodes provided by the SerialCom plugin, and new variables are created to store the incoming data. Each reading is compared against a fixed threshold to determine whether breathing activity is currently occurring. While this condition holds, a delay-based timer accumulates the exhalation time; when a preset duration is reached, the flame is extinguished and one breathing cycle is complete. If unsuccessful, the user can click the torch again for hints and retry. When the first flame is detected to be extinguished, the second flame lights up and the process repeats. A counter variable is incremented after each flame is extinguished; once all flames are out, the success screen is shown. A timer also starts each time a flame lights up: if that flame is not extinguished within 30 s, the attempt is considered a failure and the failure screen is shown.
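A framework-agnostic C++ sketch of this threshold-and-accumulate logic is given below (the study itself implements it with Blueprint nodes and the SerialCom plugin in Unreal Engine). The exhalation threshold, required cumulative exhalation time, and number of torches are hypothetical tuning values; only the 30 s per-flame timeout and the all-flames-extinguished success rule come from the text.

```cpp
// Sketch of the "Flame Altar" level logic described above.
struct FlameAltarLevel {
    static constexpr float kExhaleThreshold = 400.f;  // assumed raw sensor units
    static constexpr float kRequiredExhale  = 4.f;    // assumed seconds of exhalation per flame
    static constexpr float kFlameTimeout    = 30.f;   // per-flame time limit (from the text)
    static constexpr int   kTotalFlames     = 4;      // assumed number of torches

    int   flamesOut  = 0;
    float exhaleTime = 0.f;   // accumulated exhalation for the current flame
    float flameTimer = 0.f;   // time since the current flame lit up
    bool  success    = false;
    bool  failure    = false;

    // Called once per frame with the latest breath-sensor reading.
    void Tick(float dt, float breathVolume) {
        if (success || failure) return;
        flameTimer += dt;
        if (breathVolume > kExhaleThreshold) exhaleTime += dt;  // user is exhaling

        if (exhaleTime >= kRequiredExhale) {          // current flame extinguished
            ++flamesOut;
            exhaleTime = flameTimer = 0.f;            // next flame lights up
            if (flamesOut >= kTotalFlames) success = true;   // show success screen
        } else if (flameTimer >= kFlameTimeout) {
            failure = true;                           // show failure screen
        }
    }
};
```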

“Cloud City” continuous breathing interaction scene

The second level, themed “Cloud City,” corresponds to the sustained exhalation method in breath therapy. The scene depicts a grey-toned cityscape of skyscrapers, where tangled cables and distant stations evoke the anxiety of urban life. The clouds in the scene, as interactive objects, echo the semantics of “continuous output, combating negative emotions”: when users follow the guidance and perform sustained exhalation, they disperse the dark clouds hanging over the city, and only continuous exhalation provides enough force to scatter them. The success scene shows the city under a clear blue sky with scattered clouds and the messy wires removed, giving users a sense of clarity and emotional relief after healing. Conversely, if users fail the level, the entire city sinks into a dark atmosphere shrouded in dense clouds.

The specific interaction process is as follows. At the start of the level, sound effects and vibration cues play automatically, prompting the user to inhale and then perform a sustained exhalation. The sensor detects this and transmits breath volume data to Unreal Engine, where new variables store the incoming values. The cloud components are set to be movable and to simulate physics. The volume value, multiplied by a fixed gain, is applied as a force to the current cloud along the z-axis, pushing it upward and creating the effect of being blown away. When exhalation stops and the value decreases, the applied force weakens and the cloud falls back. Once a cloud reaches a certain height, it collides with collision boxes placed in the virtual scene and disappears; when the current cloud is detected to be invisible, force begins to be applied to the next cloud. After all eight clouds disappear in sequence, the level is completed and the success screen is shown. A timer starts at the beginning of the level; if the success conditions are not met within 100 s, the level transitions to the failure screen.
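The sketch below approximates this blow-the-cloud-away behaviour with a simplified one-dimensional physics update (in the study, a physics force is applied to movable cloud components in Unreal Engine). The force gain, gravity value, and vanish height are hypothetical; the eight-cloud success rule and the 100 s limit come from the text.

```cpp
// Sketch of the "Cloud City" level logic described above.
struct CloudCityLevel {
    static constexpr float kForceGain    = 0.05f;  // assumed upward force per unit of breath volume
    static constexpr float kGravity      = -9.8f;  // pulls the cloud back down
    static constexpr float kVanishHeight = 30.f;   // assumed z of the collision box
    static constexpr int   kTotalClouds  = 8;      // from the text
    static constexpr float kTimeLimit    = 100.f;  // from the text

    int   cloudsGone = 0;
    float z = 0.f, vz = 0.f;     // current cloud height and vertical velocity
    float elapsed = 0.f;
    bool  success = false, failure = false;

    void Tick(float dt, float breathVolume) {
        if (success || failure) return;
        elapsed += dt;

        // Upward force scales with exhaled volume; gravity acts continuously,
        // so the cloud falls back as soon as exhalation weakens.
        float az = kForceGain * breathVolume + kGravity;
        vz += az * dt;
        z  += vz * dt;

        if (z >= kVanishHeight) {            // cloud hits the collision box and vanishes
            ++cloudsGone;
            z = vz = 0.f;                    // start pushing the next cloud
            if (cloudsGone >= kTotalClouds) success = true;
        } else if (elapsed >= kTimeLimit) {
            failure = true;
        }
        if (z < 0.f) { z = 0.f; vz = 0.f; }  // clamp at the cloud's resting height
    }
};
```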

“Underwater diving” breath-holding interaction scene

The third level, themed “Underwater Diving,” corresponds to the sustained breath-holding method in breath therapy. The scene is set under the sea: bubbles appear whenever the user exhales, simulating the experience of diving, in which holding one’s breath maintains a safe state underwater. While users hold their breath, they float toward the surface, so the breath-holding state resonates with the semantics of “a sustainable sense of safety.” Upon successful completion of the level, the scene transitions to a calm sea surface, representing the relief of surfacing and being rescued. Failure instead transitions to a scene of sinking into the dark abyss of the ocean floor.

The specific interaction process is as follows. At the start of the level, sound effects and vibration cues play automatically. The user inhales and then holds their breath for the duration of the level, which lasts 50 s, with two opportunities to breathe again. Whenever the user exhales and the volume value exceeds a preset threshold, bubble particles appear, creating an exhalation effect on screen. If the number of exhalations exceeds two, the attempt is considered a failure and the failure screen is shown. Otherwise, while the breath is held, the environmental brightness is gradually increased through a delay setting, brightening the level until it is successfully completed, at which point the success screen is displayed.
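A corresponding sketch of the breath-holding logic is given below; the exhalation threshold and brightness ramp are hypothetical, while the 50 s duration and the limit of two additional breaths come from the text.

```cpp
// Sketch of the "Underwater Diving" level logic described above.
struct DivingLevel {
    static constexpr float kExhaleThreshold = 400.f;  // assumed raw sensor units
    static constexpr float kLevelDuration   = 50.f;   // from the text
    static constexpr int   kMaxExtraBreaths = 2;      // from the text
    static constexpr float kBrightnessRate  = 1.f / kLevelDuration; // full brightness by the end

    float elapsed = 0.f, brightness = 0.f;
    int   breaths = 0;
    bool  wasExhaling = false;
    bool  success = false, failure = false;

    void Tick(float dt, float breathVolume) {
        if (success || failure) return;
        elapsed += dt;

        bool exhaling = breathVolume > kExhaleThreshold;   // bubble particles appear while true
        if (exhaling && !wasExhaling) {                    // count each new exhalation once
            if (++breaths > kMaxExtraBreaths) { failure = true; return; }
        }
        wasExhaling = exhaling;

        brightness += kBrightnessRate * dt;                // scene gradually brightens
        if (brightness > 1.f) brightness = 1.f;

        if (elapsed >= kLevelDuration) success = true;     // survived the dive
    }
};
```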

Hardware and sensors

As shown in Fig. 6, the breath sensor casing is divided into left and right halves. The left half has a protruding insert board that, through the elasticity of the material, forms an interference fit with the right half. This plug-in assembly scheme allows rapid, convenient assembly, and its simple structure facilitates disassembly and maintenance of the sensor. The screw-free, mortise-and-tenon assembly also keeps the outer surface smooth and regular, which is beneficial for integration with the connecting bracket.

Fig. 6

Sensors and related hardware accessories.

The connecting bracket is designed to be fixed to the VR headset and freely detached at any time, and it secures the breath sensor casing. Its stable structure ensures positional accuracy, prevents the discomfort of the sensor contacting the user’s face directly, and minimizes the impact on breathing comfort. The connections to both the VR headset and the sensor rely on the material’s own elasticity, realized through slot and snap structures, which keeps the design as simple and easy to use as possible.

To use the device, first push the sensor upward into the slot; once the sensor is securely seated, attach the bracket to the VR headset, which can then be worn normally. To improve measurement accuracy and wearing comfort, the position and angle can be adjusted up and down according to individual needs.

EEG results from user experiment

Topographic map analysis results

In the comparison of topographic maps (see Fig. 7), a clear elevation in potential values over the right prefrontal and parietal regions, particularly in the F4 and P8 channels, was observed in the control group. This indicates heightened neural activity in these areas, which is closely associated with increased anxiety. These findings align with the theory of increased right-sided brain activity during anxiety states and suggest that anxiety was not effectively alleviated in the control group.

Fig. 7

Topographic map analysis results.

In contrast, the experimental group exhibited a significant reduction in potential values in the right prefrontal and parietal regions, particularly in the F4 and P8 channels, with a marked decrease in Topographic Map values. This indicates that, following the intervention, anxiety-related activity in the right prefrontal and parietal regions was significantly alleviated, leading to a reduction in anxiety levels. The decrease in prefrontal asymmetry further supports the improvement in anxiety states.

Band-power reports analysis results

In the band-power report (see Fig. 8), the power spectral density of alpha waves (8–12 Hz) and beta waves (12–25 Hz) was analyzed in detail. Both the control and experimental groups showed an increasing trend in alpha-wave power spectral density, consistent with the theory that individuals with anxiety show increased alpha activity as they attempt to regulate emotions and restore a relaxed state.

However, the increase in alpha wave power spectral density in the experimental group was significantly greater than that of the control group, especially in the F3 and F4 channels. This indicates a stronger relaxation response and greater anxiety relief in the experimental group. In contrast, although both groups exhibited a slight increase in beta wave power spectral density, the experimental group’s increase was lower than that of the control group, particularly in the right prefrontal and parietal regions. The reduction in beta waves suggests that the experimental group experienced more effective relief from alertness and tension, further demonstrating the experimental group’s advantage in anxiety regulation.

The power of different frequency bands (α waves and β waves) at distinct time points (“Origin” and “After”) was compared using boxplots, with power values expressed in decibels (dB). Statistical analyses were conducted to verify the differences in the distribution of band power across channels between the two time points.
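For reference, the band-power values in decibels follow the standard log-ratio form shown below, with the band power obtained by integrating an estimated power spectral density over each frequency band; the reference power \(P_0\) and the exact PSD estimator are assumptions about the analysis toolchain, not details reported in the study.

$$
P_{\text{band}} = \int_{f_1}^{f_2} S_{xx}(f)\,df, \qquad
P_{\text{dB}} = 10 \log_{10}\!\left(\frac{P_{\text{band}}}{P_0}\right),
$$

where \(S_{xx}(f)\) is the estimated power spectral density of a channel signal and \((f_1, f_2)\) is (8, 12) Hz for the \(\alpha\) band and (12, 25) Hz for the \(\beta\) band.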

Fig. 8

Band-power reports analysis results.

This study conducted an analysis of variance (ANOVA) on the changes in α-wave and β-wave power at the F3, F4, P7, and P8 electrode sites between the “Origin” and “After” time points. All analyses were performed using SPSS, with Bonferroni correction applied for p-value adjustments. Table 1 presents the ANOVA results for the Band-Power Reports experimental data, with key statistical findings summarized as follows:

1. F3 Electrode.

α-wave: The experimental group exhibited a significant increase in α-wave power at the “After” time point (F(1, 36) = 5.24, p = 0.028), indicating an enhanced relaxation response following anxiety regulation.

β-wave: No significant difference in β-wave power was observed between the experimental and control groups (F(1, 36) = 2.45, p = 0.120).

2. F4 Electrode.

α-wave: The experimental group showed a significant increase in α-wave power at the “After” time point (F(1, 36) = 6.38, p = 0.017), further supporting the role of anxiety regulation in enhancing relaxation.

β-wave: A significant decrease in β-wave power was observed in the experimental group (F(1, 36) = 4.85, p = 0.032), suggesting a reduction in anxiety-related vigilance.

3. P7 Electrode.

α-wave: The experimental group exhibited a significant increase in α-wave power at the “After” time point (F(1, 36) = 4.36, p = 0.023), further indicating the effectiveness of anxiety alleviation.

β-wave: Although β-wave power showed a slight increase, the change did not reach statistical significance (F(1, 36) = 1.82, p = 0.138).

4. P8 Electrode.

α-wave: The experimental group demonstrated a significant increase in α-wave power at the “After” time point (F(1, 36) = 6.72, p = 0.013).

β-wave: The β-wave power in the experimental group decreased compared to the control group, though the significance level was relatively low (F(1, 36) = 3.41, p = 0.029).

5. Summary.

The significant increase in α-wave power suggests that the anxiety regulation intervention effectively enhanced relaxation responses. Meanwhile, the decrease in β-wave power, particularly at the F4 electrode site, indicates reduced vigilance. These findings support the critical role of the prefrontal and parietal regions in anxiety regulation.
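As a brief note on the degrees of freedom reported above: assuming a one-way between-groups ANOVA on the change scores with two groups (k = 2) and 38 participants in total (N = 38), the F ratio and its degrees of freedom are

$$
F = \frac{MS_{\text{between}}}{MS_{\text{within}}}
  = \frac{SS_{\text{between}}/(k-1)}{SS_{\text{within}}/(N-k)},
\qquad k - 1 = 1, \quad N - k = 36,
$$

which is consistent with the F(1, 36) values reported for each electrode and band.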

Table 1 Analysis of variance on band-power reports experimental data.

ERP analysis results

Through ERP data analysis (see Fig. 9), differences in the N200 and P300 components between the two groups can be observed. The N200 component is primarily associated with cognitive conflict and emotional response. The results indicate that the experimental group exhibited a significant reduction in N200 amplitude, with a decrease notably greater than that of the control group, particularly in the F4 channel. This suggests that the experimental group experienced reduced sensitivity to negative emotions, decreased cognitive conflict, and a greater degree of anxiety relief.

Table 2 Analysis of variance on ERP experimental data.

In contrast, the P300 amplitude did not show significant differences between the experimental and control groups, and no clear pattern of change was found. This suggests that during anxiety regulation, the P300 component may not serve as a primary indicator, whereas the N200 component more effectively reflects changes in anxiety.

Fig. 9

ERP analysis results.

This study also conducted an ANOVA on the ERP experimental data, with the results presented in Table 2. The key statistical findings are summarized as follows:

1. F4 Channel.

N200 Amplitude: The experimental group exhibited a significant reduction in N200 amplitude at the “After” time point (F(1, 36) = 5.24, p = 0.028), indicating a weakened anxiety response and reduced sensitivity to cognitive conflict and negative emotions following anxiety regulation. In contrast, the control group showed no significant change in N200 amplitude.

P300 Amplitude: No significant difference in P300 amplitude was observed between the experimental and control groups at the F4 channel (F(1, 36) = 1.87, p = 0.134), suggesting that anxiety alleviation was primarily reflected in changes in the N200 component, while P300 amplitude may not be a primary indicator of anxiety regulation.

2. Other Channels.

At other electrode sites, N200 amplitude changes in both the experimental and control groups did not reach statistical significance (p > 0.05).

P300 amplitude showed no significant changes across all channels (p > 0.05), further indicating that N200 amplitude is a more sensitive marker of anxiety regulation, whereas P300 amplitude may be less reflective of anxiety-related state changes.

3. Summary.

The critical role of N200 amplitude in anxiety regulation was further confirmed, while P300 amplitude did not exhibit significant changes. These findings suggest that anxiety regulation may primarily affect cognitive conflict processing (N200) rather than the later-stage allocation of cognitive resources (P300).

Experiment summary

Through the analysis of Topographic Maps, Band-Power Reports, and ERP data, the experimental group demonstrated significant anxiety-relieving effects across all indicators. Specifically, the experimental group exhibited a marked reduction in activity in the right prefrontal and parietal regions, indicating effective suppression of anxiety-related neural activity. In the power spectral report, the experimental group showed a larger increase in alpha wave power, reflecting a stronger relaxation response, while the smaller increase in beta waves pointed to lower levels of alertness and tension. In terms of ERP components, the experimental group’s N200 amplitude was significantly reduced, indicating a weakened response to negative emotional stimuli and improved emotional regulation capacity.

These results not only demonstrate that the experimental group outperformed the control group in anxiety relief but also further confirm the neurophysiological observability of emotion regulation and anxiety management. Through the comprehensive analysis of various EEG indicators, we gained a deeper understanding of the activity patterns in different brain regions during anxiety states, as well as the mechanisms of frequency band power and event-related potential changes during emotional regulation. This finding provides neurobiological evidence for future anxiety treatment interventions, suggesting that neuroregulation techniques can more precisely monitor and modulate anxiety states.

On a broader application level, this study not only offers new insights into the neural mechanisms of anxiety but also provides objective and quantifiable biomarkers for anxiety management interventions. This implies that future anxiety treatments can be optimized based on real-time EEG feedback, further enhancing the effectiveness of personalized treatment plans. The potential applications of this approach are wide-ranging, with significant implications for mental health management, clinical interventions, and brain-computer interface technologies.

Thus, this experiment not only confirmed the critical role of EEG data in emotion regulation but also offered new insights into the interdisciplinary integration of neuroscience and psychology. Future research can further explore how precise neuroregulation techniques can more effectively intervene in and manage anxiety, advancing the precision and personalization of anxiety treatment.

Qualitative report on user experience

As a supplement to the EEG data analysis above, a user experience survey was conducted with 24 participants drawn from the experimental group (of the 38 participants in total) to gather additional qualitative feedback on the healing experience. Participants were invited to complete the USE Questionnaire (Usefulness, Satisfaction, and Ease of Use), as shown in Table 3. The questionnaire covers four dimensions, Usefulness, Ease of Use, Ease of Learning, and Satisfaction, comprising 30 items, each rated on a 7-point scale from “strongly disagree” to “strongly agree.” The questionnaire responses were used for a qualitative assessment of the user experience of the psychological healing interaction system.

Table 3 USE questionnaire.

The rightmost column of Table 3 presents the average scores of user feedback collected via the USE questionnaire. Qualitative evaluations allow for a rapid understanding of the strengths and weaknesses of the healing experience. For example, the items “I am satisfied with it” and “It is good” achieved average scores exceeding 6, indicating a high level of user satisfaction with the healing system. However, the items “I can use it without a manual” and “I feel that I need to own it” received relatively lower average scores. This suggests that improvements in usability will be necessary during the subsequent product development phase, providing critical insights for the future commercialization and user experience optimization of this research.

Virtual reality interaction modalities for anxiety psychological healing

This study leverages virtual reality (VR) technology to design and develop an interactive system aimed at anxiety psychological healing, incorporating breathing therapy to extend the system’s breathing interaction modality and create a prototype of a VR interaction product with therapeutic functions. VR technology provides users with an immersive embodied interaction experience, replacing the physical therapy rooms used in psychodrama with virtual game scenarios. The convenience of this interaction modality allows users to engage in the anxiety healing process from the comfort of their homes, experiencing virtual scenarios anytime and anywhere. The innovative expansion of the breathing interaction not only offers users a novel way to experience VR but also incorporates breathing therapy semantics that promote anxiety relief and enhance cardiopulmonary function. Data from the EEG user experiments and user experience reports confirm that this prototype effectively supports anxiety relief. Therefore, integrating VR interaction modalities with psychological healing can significantly simplify the process of anxiety treatment, facilitating timely and effective emotional regulation for users, which is of great importance in preventing anxiety disorders.



