This study presents a VR-assisted fire evacuation training system that integrates
Digital Twin (DT) technology and Brain–Computer Interface (BCI) sensing to improve
user navigation, decision-making, and emergency response performance in hazardous
building-fire scenarios. The virtual environment reproduces architectural layouts
and simulates dynamic fire and smoke propagation, enabling trainees to practice identifying
safe routes, interpreting environmental cues, and executing time-critical evacuation
strategies. BCI devices, including Emotiv and Galea headsets, enable
real-time monitoring of cognitive and affective states, such as stress, mental workload,
and attentional shifts, during evacuation tasks. These physiological signals provide
an additional analytical layer for assessing user performance and behavioral adaptation
under stress. The system also employs AI-driven crowd modeling, adaptive pathfinding,
interactive hazards, and multimodal feedback to create a comprehensive and data-driven
training experience. The objectives include:
- Integrate EEG-based BCI devices to monitor trainee brain signals.
- Record neural markers of stress, focus, workload, and emotional response during
evacuation tasks.
- Use BCI feedback to automatically adjust training difficulty (e.g., increasing smoke,
time pressure, or environmental distractions).
- Provide personalized difficulty curves that adapt to each trainee’s cognitive and
emotional state.
- Simulate realistic fire propagation, smoke behavior, and airflow using physics-based
models compatible with immersive platforms such as 3D display walls and the Meta Quest 3.
- Combine behavioral data with BCI data to generate comprehensive performance reports.
- Provide data-driven recommendations for improving evacuation routes, signage, and
emergency procedures.
- Assist safety managers in evaluating the effectiveness of current fire protocols.
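To make the BCI-driven difficulty adaptation concrete, the following is a minimal sketch of the control loop implied by the objectives above. The scenario parameters (smoke density, time pressure, distraction count), the thresholds, and the normalized workload/stress indices are illustrative assumptions, not the system's actual API; in practice such indices would be derived from EEG band power in a device-specific way.

```python
# Hypothetical sketch of BCI-driven difficulty adaptation.
# Parameter names and thresholds are illustrative assumptions.

from dataclasses import dataclass


@dataclass
class DifficultyState:
    smoke_density: float = 0.3   # 0..1, fraction of the route obscured by smoke
    time_pressure: float = 1.0   # multiplier applied to the evacuation countdown
    distractions: int = 1        # number of active environmental distractions


def adapt_difficulty(state: DifficultyState,
                     workload: float, stress: float) -> DifficultyState:
    """Nudge scenario parameters toward a target cognitive band.

    workload and stress are assumed to be normalized 0..1 indices
    computed from the trainee's EEG stream.
    """
    if workload < 0.4 and stress < 0.4:
        # Trainee under-challenged: raise smoke, time pressure, distractions.
        state.smoke_density = min(1.0, state.smoke_density + 0.1)
        state.time_pressure = min(2.0, state.time_pressure + 0.1)
        state.distractions += 1
    elif workload > 0.8 or stress > 0.8:
        # Trainee overloaded: ease off so training remains productive.
        state.smoke_density = max(0.0, state.smoke_density - 0.1)
        state.time_pressure = max(0.5, state.time_pressure - 0.1)
        state.distractions = max(0, state.distractions - 1)
    # Within the target band: leave the scenario unchanged.
    return state
```

A real controller would smooth the indices over a sliding window before adapting, to avoid reacting to momentary EEG artifacts.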
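The objective of combining behavioral data with BCI data can be sketched as a simple aggregation step. The field names below (time to exit, wrong turns, stress, workload) are hypothetical examples of the metrics such a report might contain, not the system's defined schema.

```python
# Illustrative sketch of merging behavioral logs with EEG-derived indices
# into a per-trainee performance summary. Field names are assumptions.

from statistics import mean


def performance_report(events, eeg_samples):
    """Summarize one training session.

    events: behavioral records, each with 'time_to_exit' (seconds)
            and 'wrong_turns' (count) per evacuation trial.
    eeg_samples: EEG-derived records, each with normalized 0..1
                 'stress' and 'workload' indices.
    """
    return {
        "mean_time_to_exit_s": mean(e["time_to_exit"] for e in events),
        "total_wrong_turns": sum(e["wrong_turns"] for e in events),
        "mean_stress": round(mean(s["stress"] for s in eeg_samples), 2),
        "peak_workload": max(s["workload"] for s in eeg_samples),
    }
```

Reports of this shape could feed the data-driven recommendations listed above, e.g. flagging routes where wrong turns coincide with workload peaks.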
Publications