iHARP - NSF HDR Institute for Harnessing Data and Model Revolution in the Polar Regions

Current Students: Andrew Summit, Archith Sharma (TAMS)

Past Students: Rishitha Reddy Pesaladinne, Suruthi Selvam, Maruthi Prasanna, Kavya Gundla

Status: Current

Annotation and Visualization of Heterogeneous Data using VR/AR

The goal of this project is to develop intelligent data-driven techniques; accurate labeled data is needed for training and evaluating deep learning models. To better engage scientists in labeling and to facilitate the labeling process, we use AR/VR tools. This work builds upon previous efforts by creating an integrated tool suite for enhancing the value, situational awareness, accessibility, and understanding of science data connected with Arctic science using virtual reality (VR) and augmented reality (AR) technologies. We will design, develop, and evaluate AR/VR tools for exploring and annotating data using tablets, mobile devices, and the HoloLens. The objective is to use AR displays on mobile devices and headsets to guide users with virtual overlays, paths, and waypoints. The work also involves developing algorithms for layering, situational awareness, and location sensing.

Project 1: Development of mobile application for identifying anomalous behavior and conducting time series analysis using heterogeneous data (Poster)

We have incorporated a parking lot use case to develop and test our tool. Understanding anomalous behavior and spatial changes in an urban parking area can improve decision-making and situational awareness for sustainable urban parking management. Such decisions rely on data arriving at a velocity and volume that cannot be comprehended without a layer of analysis and visualization. This work presents a mobile application that performs time series analysis and anomaly detection on parking lot data to support decision-making. The application includes two modules: 1) an information gathering module and 2) a time series analysis module. In the information gathering module, users add pins in the parking lot; in the time series analysis module, they analyze the pins added over a period of time. Our approach uses parking pins to identify each vehicle and collect data such as latitude, longitude, time, date, and text (information from the license plate), as well as images and videos shot at the location. Users can place a pin where their car is parked, and the collected information feeds the time series analysis. By examining the data patterns, we can quickly identify vehicles parked in restricted spaces without authorization, or regular users' vehicles parked in spaces reserved for disabled drivers. The analysis extracts meaningful insights, such as recurring patterns in parking lot occupancy over time. This information aids in predicting future demand, enabling parking administrators to allocate resources efficiently during peak hours and optimize space usage. It can also detect irregularities in parking patterns, aiding the prompt identification of unauthorized or abnormal parking and of violations such as parking the wrong type of vehicle or parking in restricted or reserved areas.
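The rule-based violation check described above can be sketched in a few lines of Python. This is a minimal illustration: the record fields (zone type, permit flag) are assumptions for the sketch, not the app's actual schema.

```python
from dataclasses import dataclass
from datetime import datetime

@dataclass
class ParkingPin:
    """One user-placed pin; field names are illustrative, not the app's schema."""
    plate: str         # text read from the license plate
    lat: float
    lon: float
    timestamp: datetime
    zone: str          # "regular", "restricted", or "disabled"
    has_permit: bool   # whether the vehicle is authorized for this zone

def find_violations(pins):
    """Flag pins whose vehicle occupies a restricted or disabled zone without authorization."""
    return [p for p in pins if p.zone in ("restricted", "disabled") and not p.has_permit]

pins = [
    ParkingPin("ABC-123", 33.2148, -97.1331, datetime(2023, 11, 1, 9, 0), "regular", False),
    ParkingPin("XYZ-789", 33.2149, -97.1332, datetime(2023, 11, 1, 9, 5), "disabled", False),
]
violations = find_violations(pins)
print([p.plate for p in violations])  # → ['XYZ-789']
```

Running the same check over pins collected across days yields the occupancy and violation patterns the analysis module surfaces.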

Information Gathering Module                      Time Series Analysis Module

Videos (Time Series Analysis Module)

 

Project 2: Time Series Analysis for Detecting Anomalous Behavior using Unity 3D and MetaQuest 3 (Poster)

The proposed mobile application is developed using the Unity framework and integrates image capture, note-taking, GPS tracking, and an SQLite database, offering users a comprehensive data collection and management system. This work focuses on applying time series analysis techniques to detect anomalous behavior in urban parking lots.
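A minimal sketch of the pin store such an SQLite-backed app might use, shown here with Python's built-in `sqlite3` module; the table and column names are assumptions for illustration, not the application's actual schema.

```python
import sqlite3

# In-memory database for illustration; the app would persist a file on the device.
conn = sqlite3.connect(":memory:")
conn.execute("""
    CREATE TABLE pins (
        id     INTEGER PRIMARY KEY,
        lat    REAL NOT NULL,
        lon    REAL NOT NULL,
        ts     TEXT NOT NULL,   -- ISO-8601 timestamp; sorts chronologically as text
        note   TEXT,            -- user note attached to the pin
        image  TEXT             -- path to the captured photo
    )
""")
conn.execute(
    "INSERT INTO pins (lat, lon, ts, note, image) VALUES (?, ?, ?, ?, ?)",
    (33.2148, -97.1331, "2023-11-10T09:30:00", "north lot, row C", "pins/0001.jpg"),
)

# Time-window query of the kind that would feed the time series analysis.
rows = conn.execute(
    "SELECT lat, lon, ts FROM pins WHERE ts BETWEEN ? AND ? ORDER BY ts",
    ("2023-11-10T00:00:00", "2023-11-10T23:59:59"),
).fetchall()
print(rows)
```

Storing timestamps in ISO-8601 text keeps range queries simple, since lexicographic and chronological order coincide.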

Detecting Anomalous Behavior

Mobile application using Unity 3D for time series analysis of geospatial data                      Visualization of geospatial data in MetaQuest 3

Project 3: Real Time Object Detection and Emotion Detection via Camera using React Native, Python Flask, Coco Dataset and OpenCV (Poster)

This project presents a real-time object detection and emotion recognition system implemented with React Native for the mobile application, Python Flask for backend support, and the COCO dataset and OpenCV for robust, accurate detection. Integrating these technologies enables seamless camera-based object recognition and emotion analysis, offering a versatile and responsive user experience. Our object detection model, trained on the COCO dataset and powered by OpenCV, showcases accurate and responsive detection capabilities, while our emotion recognition module provides a seamless user experience, highlighting the project's potential in a range of practical applications. Looking ahead, there is considerable scope for further enhancements and broader use of this technology across domains. The work includes:
1) COCO Dataset: The COCO dataset serves as a foundational element of our object detection system. Its extensive and varied content is crucial for training a robust object detection model capable of recognizing a wide range of objects in real-time scenarios.
2) OpenCV Integration: OpenCV provides a comprehensive set of functions for image processing and computer vision tasks.
3) Emotion Detection: The core of our emotion detection system is a deep learning model trained to recognize human emotions from images and video frames. We employed a convolutional neural network (CNN) architecture for this purpose. The model was trained on a diverse dataset of labeled facial expressions, allowing it to identify a spectrum of emotions, including happiness, sadness, anger, and more.
4) Camera Integration: To capture and analyze user emotions in real-time, we seamlessly integrated the device's camera into our application. This feature enables users to interact naturally with our system without the need for additional sensors or hardware. Image frames from the camera are processed on the device, ensuring privacy and real-time responsiveness. Emotion predictions are then displayed to the user or transmitted to the backend for further analysis if needed.
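The final step of the emotion pipeline, turning the CNN's raw scores for a face crop into the label displayed to the user, might look like the following sketch. The label set and logit values here are illustrative assumptions, not the trained model's actual outputs.

```python
import math

# Illustrative label set; the trained model's classes may differ.
EMOTIONS = ["happiness", "sadness", "anger", "surprise", "neutral"]

def softmax(logits):
    """Convert raw CNN scores into probabilities (numerically stable)."""
    m = max(logits)
    exps = [math.exp(x - m) for x in logits]
    total = sum(exps)
    return [e / total for e in exps]

def predict_emotion(logits):
    """Return the top label and its confidence for one detected face."""
    probs = softmax(logits)
    i = max(range(len(probs)), key=probs.__getitem__)
    return EMOTIONS[i], probs[i]

# Hypothetical scores for a single face crop from the camera frame.
label, conf = predict_emotion([2.1, 0.3, -0.5, 1.0, 0.8])
print(label, round(conf, 2))  # highest logit wins: "happiness"
```

In the on-device flow described above, this prediction is what gets overlaid on the camera view or sent to the Flask backend for further analysis.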

Mobile application using React Native, Python Flask, the COCO dataset, and OpenCV                      Visualization of geospatial data

Project 4: Point Cloud Data Visualization using Unity 3D and MetaQuest 3 (Poster)

The goal of this project is to develop a point cloud data visualization in VR or AR using Cesium, Unity 3D, and MetaQuest by incorporating labeling and annotation.

  • Preserving more metadata for point clouds using 3D Tiles.
  • Exploring other hardware devices such as the HTC Vive, HoloLens 2, Magic Leap 2, Apple Vision Pro, and Meta Quest 3.
  • Adding relevant labeling and annotation.
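One way to preserve labeling and annotation metadata alongside the points is to keep annotation records that index into the point cloud and serialize them next to the tileset. The sketch below is loosely inspired by 3D Tiles-style per-feature metadata; all field names are assumptions for illustration, not the project's actual format.

```python
import json
from dataclasses import dataclass, asdict

@dataclass
class Annotation:
    """A label attached to a subset of points; fields are illustrative."""
    label: str
    point_indices: list   # indices into the point cloud's vertex array
    author: str
    created: str          # ISO-8601 timestamp

annotations = [
    Annotation("ice-layer boundary", [1024, 1025, 1031], "annotator1", "2024-03-01T14:00:00"),
]

# Serialize next to the tileset so labels survive round-trips between devices.
payload = json.dumps({"annotations": [asdict(a) for a in annotations]}, indent=2)
restored = json.loads(payload)["annotations"]
print(restored[0]["label"])  # → ice-layer boundary
```

Keeping annotations as a separate indexed layer means the same labels can be loaded on any of the headsets listed above without modifying the point data itself.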
Point Cloud Annotation v2 with menu (local point cloud file)                      Point Cloud Viz in VR on Meta Quest Pro
Point Cloud Annotation v2 with menu using Cesium                      Point Cloud Viz in VR on Meta Quest Pro

Publications

Journals

  1. Sharma, S., and Pesaladinne, R., "Spatial Analysis and Visual Communication of Emergency Information through Augmented Reality", Journal of Imaging Science & Technology (JIST), Vol. 67, Issue 6, https://doi.org/10.2352/J.ImagingSci.Technol.2023.67.6.060401, 2023.

Conferences

  1. Chellatore, M.P., Pesaladinne, R., and Sharma, S., "Time Series Analysis for Detecting Anomalous Behavior using a Mobile Device", Proceedings of the 22nd International Conference on Embedded Systems, Cyber-physical Systems, & Applications (ESCS'24), IEEE-CSCI conference, Las Vegas, USA, July 22-25, 2024.
  2. Bhatt, B., and Sharma, S., "Mobile Application for Conducting Time Series Analysis on Location-Based Spatial Data", Proceedings of the 20th International Conference on Data Science (ICDATA'24), Las Vegas, USA, July 22-25, 2024.
  3. Omary, D., and Sharma, S., "Virtual Reality fire drill for campus evacuation", Proceedings of the IEEE International Symposium on Mixed and Augmented Reality (ISMAR), Seattle, USA, Oct 21-25, 2024. (Submitted)
  4. Chellatore, M.P., and Sharma, S., "Mobile Application for Identifying Anomalous Behavior and Conducting Time Series Analysis Using Heterogeneous Data", Proceedings of the 26th International Conference on Human-Computer Interaction (HCI International 2024), Thematic Area: Virtual, Augmented and Mixed Reality, Washington Hilton Hotel, Washington DC, USA, 29 June - 4 July 2024.
  5. Sharma, S., and Pesaladinne, R., "Spatial Analysis and Visual Communication of Emergency Information through Augmented Reality", Proceedings of the IS&T International Symposium on Electronic Imaging (EI 2024) in the Engineering Reality of Virtual Reality Conference, DOI: 10.2352/J.ImagingSci.Technol.2023.67.6.060401, January 21-25, 2024.
  6. Pesaladinne, R., Chellatore, M.P., Dronavalli, S., and Sharma, S., "Situational awareness and feature extraction for indoor building navigation using mixed reality", Proceedings of the IEEE International Conference on Computational Science and Computational Intelligence (IEEE-CSCI), Research Track on Big Data and Data Science (CSCI-RTBD), Las Vegas, USA, December 13-15, 2023.
  7. Dronavalli, S., Pesaladinne, R., and Sharma, S., "Crime Data Visualization Using Virtual Reality and Augmented Reality", Proceedings of the IEEE International Conference on Computational Science and Computational Intelligence (IEEE-CSCI-RTSC), Las Vegas, USA, December 13-15, 2023.
  8. Tack, N., Williams, R., Holschuh, N., Sharma, S., and Engel, D., "Visualizing the Greenland ice sheet in VR using immersive fence diagrams", Proceedings of the Conference on Practice and Experience in Advanced Research Computing (ACM-PEARC 23), Portland, OR, USA, ACM ISBN 978-1-4503-9985-2/23/07, https://doi.org/10.1145/3569951.3603635, July 23-27, 2023.
  9. Tack, N., Holschuh, N., Sharma, S., Williams, R., and Engel, D., "Development and initial testing of XR-based fence diagrams for polar science", Proceedings of the IEEE International Geoscience and Remote Sensing Symposium (IGARSS 2023), Pasadena, California, 16-21 July 2023.
  10. Sharma, S., "Mobile Augmented Reality System for Emergency Response", Proceedings of the 21st IEEE/ACIS International Conference on Software Engineering, Management and Applications (SERA 2023), Orlando, USA, May 23-25, 2023.
  11. Sharma, S., and Engel, D., "Mobile augmented reality system for object detection, alert, and safety", Proceedings of the IS&T International Symposium on Electronic Imaging (EI 2023) in the Engineering Reality of Virtual Reality Conference, January 15-19, 2023.

Posters

  1. Bhoj Raj Bhatt and Sharad Sharma, “Mobile App for Object Tracking and Location-Based Data for Time Series Analysis”, special celebration for the 15th anniversary event, College of Information (COI) at the University of North Texas, November 10, 2023. (1st Place Award)

  2. Sri Chandra Dronavalli and Sharad Sharma, “Crime Data Analysis and Visualization through HoloLens 2 and Oculus Quest Pro”, special celebration for the 15th anniversary event, College of Information (COI) at the University of North Texas, November 10, 2023. (2nd Place Award)

  3. Maruthi Prasanna and Sharad Sharma, “Mobile Application for Identifying Anomalous Behavior and Conducting Time Series Analysis using Parking Lot Data”, special celebration for the 15th anniversary event, College of Information (COI) at the University of North Texas, November 10, 2023.

  4. Suruthi Selvam and Sharad Sharma, “Real Time Object Detection and Emotion Detection via Camera using React Native, Python Flask, Coco Dataset and OpenCV”, special celebration for the 15th anniversary event, College of Information (COI) at the University of North Texas, November 10, 2023.