DTSC 5777/DTSC 4777: Virtual Reality and its Applications
Project Proposal Report
2-3 students in each group
(Students are free to choose their own project and team members, but projects must follow the guidelines below.)
Guidelines for Project Proposal Report:
- Proposal: font size 10, Arial or Times New Roman. Submit a hard copy.
- A header should provide: the project title, the names of all group members, and the course and term.
- GOAL AND OBJECTIVES: List the goals of the project. [For example: evacuation behavior simulation, plane take-off simulation, hazardous events, reconstruction or remodeling of a building, etc.]
- MODELING: Describe the envisioned virtual environment (building, trees, people, furniture,
landscaping elements, etc.) [3ds Max or Google Sketchup]
- Planned geometry, use of textures, animations, behavior and functionality, etc.
- Describe how the application will be used. (envisioned users, navigation, interactions,
etc.)
- List the software and hardware equipment that is required or desired for the implementation. Choose between Vizard, Unity, or Unreal.
- The proposal should explain how the following will be implemented (give examples of how each functionality will be incorporated):
- Vision: The project will incorporate high-quality textures and 3D models to present detailed and informative visual content within the environment.
- Sound: Speech, background music, and/or ambient sound effects will be used to enhance immersion
and to convey information about the environment or location.
- Animation: The project will include a minimum of three animated objects, demonstrating meaningful
motion relevant to the scene or gameplay.
- Interactivity: At least three user-triggered interactive events will be implemented, allowing users
to interact with objects or elements within the environment.
- Character Behaviors: The environment will feature animated agents that use built-in behaviors, such as path-following or navigation, to simulate realistic movement and actions (see the path-following sketch after this list).
- Sensors: Use at least three different types of sensors (proximity, time, touch, or visibility) in the project, driving at least three trigger events (see the sensor sketch after this list).
- Player: Add a player controller to the scene: a First Person Controller or a Third Person Controller.
- AI Implementation: AI functionality appropriate to the project (navigation, behaviors, shortest path, etc.). Implement different agent behaviors (selfish, altruistic, learning, adaptive, etc.) in the environment, configurable through a user menu (e.g., assigning the number of agents). Optional extensions include integration with ChatGPT, an NPC AI engine, voice interaction, etc.
- User Interface elements: A user interface will be designed and implemented, including interactive elements such as menus, buttons, and other UI components necessary for user control and feedback (see the menu sketch after this list).
- Multi-User Environment OR Hardware Integration OR Mobile Version: The project will support one or more of the following: 1) a multi-user environment allowing more than two users to be present simultaneously, 2) hardware integration, such as Meta Quest, HTC Vive, HoloLens, a smartphone, or a similar device, 3) a mobile version of the project, including joystick-based navigation for user movement.
- Incorporate the above functionalities and explain how/where they will be implemented.
- Who will be using this VR Application (Target audience)?
- Software and hardware used: Unity 3D or Vizard or Unreal.
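The sketches below are illustrative only. They assume a Unity (C#) implementation, and every object name, tag, and numeric value in them is a hypothetical placeholder rather than a required part of the proposal. First, a minimal sensor sketch showing a proximity sensor (trigger collider) and a time sensor driving trigger events:

using UnityEngine;

// Minimal sketch of two sensor types for a Unity-based project:
// a proximity sensor (trigger collider) and a time sensor (elapsed-time check).
// The "Player" tag and the door object are hypothetical examples.
public class ProximityAndTimeSensor : MonoBehaviour
{
    public GameObject door;            // object to activate when the player is near
    public float eventTime = 30f;      // seconds until the timed event fires
    private bool timedEventFired = false;

    // Proximity sensor: requires a collider with "Is Trigger" enabled on this object.
    private void OnTriggerEnter(Collider other)
    {
        if (other.CompareTag("Player"))
        {
            Debug.Log("Proximity sensor triggered");
            if (door != null) door.SetActive(true);   // example response: reveal a door
        }
    }

    // Time sensor: fires one event after a fixed amount of simulation time.
    private void Update()
    {
        if (!timedEventFired && Time.timeSinceLevelLoad > eventTime)
        {
            timedEventFired = true;
            Debug.Log("Time sensor triggered");
        }
    }
}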
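Next, a minimal character-behavior sketch, assuming Unity's built-in NavMesh navigation (a baked NavMesh in the scene and a NavMeshAgent component on the NPC); the waypoints and speed value are illustrative:

using UnityEngine;
using UnityEngine.AI;

// Sketch of a path-following NPC using Unity's built-in NavMesh navigation.
// The waypoints array is assigned in the Inspector; values are illustrative.
[RequireComponent(typeof(NavMeshAgent))]
public class PatrollingAgent : MonoBehaviour
{
    public Transform[] waypoints;     // patrol points placed in the scene
    public float agentSpeed = 2.0f;   // could vary per behavior mode (e.g., cautious vs. aggressive)

    private NavMeshAgent agent;
    private int currentIndex = 0;

    private void Start()
    {
        agent = GetComponent<NavMeshAgent>();
        agent.speed = agentSpeed;
        if (waypoints.Length > 0)
            agent.SetDestination(waypoints[currentIndex].position);
    }

    private void Update()
    {
        // When the agent reaches its current waypoint, move on to the next one.
        if (!agent.pathPending && waypoints.Length > 0 && agent.remainingDistance < 0.5f)
        {
            currentIndex = (currentIndex + 1) % waypoints.Length;
            agent.SetDestination(waypoints[currentIndex].position);
        }
    }
}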
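Finally, a minimal UI sketch, assuming Unity's built-in UI system (a Canvas with a Button and an InputField wired up in the Inspector); the agent prefab and spawn placement are placeholders:

using UnityEngine;
using UnityEngine.UI;

// Sketch of a simple menu that lets the user set how many agents to spawn.
public class AgentMenu : MonoBehaviour
{
    public Button spawnButton;
    public InputField agentCountField;
    public GameObject agentPrefab;

    private void Start()
    {
        // Run SpawnAgents whenever the user clicks the button.
        spawnButton.onClick.AddListener(SpawnAgents);
    }

    private void SpawnAgents()
    {
        int count;
        if (!int.TryParse(agentCountField.text, out count)) count = 1;

        for (int i = 0; i < count; i++)
        {
            // Spread agents out along the X axis; real placement would depend on the scene.
            Vector3 position = new Vector3(i * 2.0f, 0f, 0f);
            Instantiate(agentPrefab, position, Quaternion.identity);
        }
    }
}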
Management Plan: Describe each individual group member's role and responsibility
Individual Contributions and Cross-Evaluation (WILL BE DONE AT THE END OF PROJECT)
RECOMMENDED PROJECTS by Instructors/Mentors/Faculty
2026
1. VR-Driven Digital Twin Platform for Real-Time Motion Analysis: (Mentor: Ian Abeyta, PhD Student, IanAbeyta@my.unt.edu)
Engineer the flagship visualization system for next-generation, real-time motion analysis
based on wearable device technology. Your work in creating a biomechanically accurate
digital twin will form the foundation of a motion feedback platform with applications
spanning orthopedic rehabilitation, professional/collegiate athletics, and human performance
optimization. Project objectives can include:
(i) Develop a Real-Time VR Motion Analysis System
Design and implement a virtual reality platform capable of visualizing human motion
in real time using camera-based detection and wearable sensor data.
(ii) Implement Camera-Based Human Detection and Pose Tracking
Integrate live camera input to detect a human subject and extract skeletal joint data
using computer vision and pose estimation algorithms.
(iii) Create a Biomechanically Accurate Digital Twin
Develop a real-time animated avatar that accurately replicates human posture, joint
movement, and body dynamics based on detected motion data.
(iv) Support Multiple AI-Driven Behavior and Analysis Modes
Implement AI-based analysis to identify movement patterns, inefficiencies, and potential
injury risks, with configurable behavior modes accessible through a user interface.
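One possible way to connect objectives (ii) and (iii) is to stream joint positions from an external pose-estimation process into Unity and drive scene objects with them. The sketch below is only a rough illustration of that bridge, assuming a Unity (C#) implementation; the UDP port, JSON layout, and joint ordering are assumptions, not part of the project specification.

using System.Net;
using System.Net.Sockets;
using System.Text;
using UnityEngine;

// Sketch: receive joint positions from an external pose-estimation process over UDP
// and drive simple marker transforms in the scene. The port, JSON layout, and joint
// ordering are assumptions for illustration; a real pipeline would define its own format.
public class PoseStreamReceiver : MonoBehaviour
{
    [System.Serializable]
    public class PoseFrame { public Vector3[] joints; }

    public Transform[] jointMarkers;   // one scene object per tracked joint
    public int port = 5056;            // hypothetical port used by the pose-estimation script

    private UdpClient client;

    private void Start()
    {
        client = new UdpClient(port);
    }

    private void Update()
    {
        // Drain any packets that arrived since the last frame (non-blocking).
        while (client.Available > 0)
        {
            IPEndPoint sender = new IPEndPoint(IPAddress.Any, 0);
            byte[] data = client.Receive(ref sender);
            PoseFrame frame = JsonUtility.FromJson<PoseFrame>(Encoding.UTF8.GetString(data));

            int n = Mathf.Min(frame.joints.Length, jointMarkers.Length);
            for (int i = 0; i < n; i++)
                jointMarkers[i].position = frame.joints[i];
        }
    }

    private void OnDestroy()
    {
        if (client != null) client.Close();
    }
}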
2. VR fire training system:
Create realistic, interactive scenarios that allow learners to practice real-life emergencies
in safe environments. Sample projects for ideas:
Fast XR Simulator Training courses
FAST XR - Fire Immersive Trainer by Meta
3. Smart Building / Environment Twin:
Concept: Simulate a building’s lighting, HVAC, and occupancy systems in Unity for
monitoring or training purposes.
Features: 1) Sensor-driven control (temperature, motion, light), 2) Real-time energy
efficiency visualization, 3) Emergency simulations and training
4. Healthcare / Rehabilitation Twin
Concept: Patient-specific digital twin for physical therapy, injury recovery, or surgical
planning.
Features: 1) Interactive avatar replicating patient’s motion, 2) Feedback on range
of motion, posture, and progress, 3) VR/AR immersive therapy sessions
5. Multi-User VR Platform for Autonomous Vehicle Simulation
Concept: Develop a real-time virtual reality environment where multiple users can simultaneously
interact with and monitor autonomous vehicles in a shared 3D world. The system will
act as a digital twin for autonomous driving systems, combining VR visualization,
AI-driven vehicle control, and multi-user collaboration. Users can enter the VR environment
as drivers, engineers, or traffic observers, controlling vehicles or monitoring AI
behavior. Autonomous vehicles operate in the scene using AI pathfinding and decision-making
algorithms, responding to dynamic obstacles and user interventions.
Features: 1) Multi-User VR Interaction: Multiple users join the same VR environment via networked
multiplayer (Photon, Unity Netcode, or Mirror). Each user has an avatar and can interact
with vehicles or the environment. 2) Autonomous Vehicle Simulation: Vehicles navigate
the environment using AI (NavMesh, custom pathfinding, or reinforcement learning).
Simulate traffic rules, obstacle avoidance, and route planning, 3) Interactive VR
Tools: Users can spawn vehicles, modify routes, trigger events, or analyze AI decisions.
Dashboard displays metrics: speed, distance, traffic density, and vehicle decision
logs, 4) AI Behavior: Implement multiple autonomous driving behaviors (aggressive,
cautious, cooperative). Optional integration with AI learning or adaptive control.
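As a rough illustration of feature 4 (configurable driving behaviors), the following Unity (C#) sketch maps high-level behavior modes onto NavMesh navigation parameters; the speeds and stopping distances are placeholder values, not tuned figures from the project.

using UnityEngine;
using UnityEngine.AI;

// Sketch of configurable driving behaviors for an AI vehicle built on Unity's NavMesh.
[RequireComponent(typeof(NavMeshAgent))]
public class AutonomousVehicle : MonoBehaviour
{
    public enum DrivingBehavior { Cautious, Aggressive, Cooperative }

    public DrivingBehavior behavior = DrivingBehavior.Cautious;
    public Transform destination;     // route goal chosen by a user or a traffic manager

    private NavMeshAgent agent;

    private void Start()
    {
        agent = GetComponent<NavMeshAgent>();
        ApplyBehavior();
        if (destination != null) agent.SetDestination(destination.position);
    }

    // Map a high-level behavior mode onto low-level navigation parameters.
    public void ApplyBehavior()
    {
        switch (behavior)
        {
            case DrivingBehavior.Aggressive:
                agent.speed = 12f; agent.stoppingDistance = 1f; break;
            case DrivingBehavior.Cooperative:
                agent.speed = 8f;  agent.stoppingDistance = 3f; break;
            default: // Cautious
                agent.speed = 5f;  agent.stoppingDistance = 5f; break;
        }
    }
}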
6. Sports Performance Digital Twin
Concept: Mirror an athlete’s movements in VR using wearable sensors for real-time
motion analysis and training feedback.
Features: 1) Joint angle and posture tracking, 2) Performance metrics visualization,
3) AI-driven recommendations for optimization
7. VR Glucose Monitoring System or Instruction System: Help people understand Type 1 and Type 2 diabetes. Create a VR version of the DDS17 screening scale. Example: Here, Sample
2025
- A VR environment to understand anxiety and stress management in an academic environment. Example: Here, Sample
- VR Glucose Monitoring System or Instruction System: Help people understand Type 1 and Type 2 diabetes. Create a VR version of the DDS17 screening scale. Example: Here, Sample
- Create a VR learning tool to explore diabetes distress, OR the tool might help a recently diagnosed person with Type 2 diabetes living anywhere in Texas cope with one or more of the challenges associated with diabetes distress. (Check with Dr. Sharma for more details.)
- Create a Virtual Instructor to interact with a real-time user for training. Check the asset NPC AI Engine - Dialog, actions, voice and lipsync - and Convai for voice and lip sync. The Virtual Instructor can be for a museum, tour guide, campus tour guide, virtual classroom, etc.
- Create AI behaviors for NPCs to interact with user-controlled agents in a multi-user environment for different tasks. For example, check the megacity project, where the policeman can escort blue people to a safe zone during emergencies.
- Training in a multi-user environment: Create a VR instructor to interact with AI-powered training for public speaking, leadership, sales, interviews, etc. Practice in a wide range of self-paced VR scenarios, from a press conference to a meeting room. Check VirtualSpeech for training, roleplay, skill development, lip sync, etc.
- Digital Twin: Fire and Smoke (UNITY)
https://github.com/urbaninfolab/FireIncidentFrontend
https://github.com/urbaninfolab/AustinDigitalTwin
2024
- Digital Twin VR Project: A digital twin is a dynamic virtual copy of a physical asset, process, system, or environment that looks like and behaves identically to its real-world counterpart. Resources: https://unity.com/solutions/digital-twins, https://www.youtube.com/watch?v=yX06te3zMOU, Creating Digital Twins with Unity, realvirtual.io and ChatGPT, Connect IoT data to HoloLens 2 with Azure Digital Twins and Unity, Connecting your Digital Twin to the Unity Game Engine.
- Create a Virtual Instructor to interact with a real-time user for training. A "hybrid" meeting scenario where some participants sit face-to-face in a room and others join on a large screen via Zoom for a group discussion. Many buttons and features can be created for navigating the face-to-face room or the virtual participants' environment. Tired faces can be created for the virtual participants and happy faces for the face-to-face ones. Explore developing an AI instructor with ChatGPT integration/interaction. Example: DE&I Training in VR: Micro-aggressions, Allyship, Exclusion, and More (virtualspeech.com), Managing Workplace Stress: Online Course with Practice (virtualspeech.com)
- A bullying scenario: Children or high school kids are playing together in the playground, at a party, or in the school hallway. There will be a perpetrator, a victim, and bystanders. All the functionalities listed in the project description can be used here. This can be helpful in a bullying prevention program, especially for perspective taking.
- A decision-making game: The creation of a Virtual Moon Survival game or a Lost at Sea game. A ship is lost in the ocean (or a spaceship is lost in space) and the crew is left with 15 important items. They have to choose the most important items from the ship and rank the items from most important to least important for survival. The items include a mirror, a mosquito net, a floating seat, a radio, etc. Those who order the items correctly receive full points.
- A domestic violence environment with a perpetrator, a victim, and a witness, modeling the response and point of view of each person. This can be used in a prevention program, especially for perspective taking. All the functionalities listed in the project description can be used here.
- A virtual campus tour for a university for navigation, evacuation, and virtual tours. Without going into much detail, one can create buildings, a cafeteria, a gym, a stadium, trees, statues, etc.
PROJECT IDEAS
- Course curriculum modules: Create modules with more inquiry-based problem-solving activities and hands-on experiences built on virtual and augmented reality educational modules.
- Create Gaming Instructional Modules to Enhance Student Learning in Lower Level Core
Computer Science Courses
- Traffic Simulation
- Virtual Museum
- Crowd Simulation and Evacuation Simulation
- Online classroom in a client-server network
- Military simulations, combat situation, battlefield simulations
- Airport Simulation