The goal of this project is to develop intelligent, data-driven techniques; accurate labeled data is needed for training and evaluating deep learning models. To better engage scientists in labeling and to facilitate the labeling process, we will use AR/VR tools. This work builds upon previous efforts by creating an integrated tool suite for enhancing the value, situational awareness, accessibility, and understanding of science data connected with Arctic science using virtual reality (VR) and augmented reality (AR) technologies. We will design, develop, and evaluate AR/VR tools for exploring and annotating data using tablets, mobile devices, and HoloLens. The objective of the effort is to use AR displays on mobile devices and headsets to guide users with virtual overlays, paths, and waypoints. It will also involve developing algorithms for layering, situational awareness, and location sensing.
We have incorporated a parking lot use case to develop and test our tool. Understanding anomalous behavior and spatial changes in an urban parking area can enhance decision-making and situational awareness for sustainable urban parking management. Decision-making relies on data that arrives at a velocity and volume too overwhelming to comprehend without a layer of analysis and visualization. This work presents a mobile application that performs time series analysis and anomaly detection on parking lot data to support decision-making. The application includes two modules: 1) an information gathering module and 2) a time series analysis module. In the information gathering module, users add pins in the parking lot; in the time series analysis module, users analyze the pins they added over a period of time. Our approach uses parking pins to identify each vehicle and then collect specific data, such as temporal variables (latitude, longitude, time, and date), text (information from the license plate), and images and videos shot at the location. Users can place pins at the location where their car is parked, and the collected information can be used for time series analysis. By examining data patterns, we can quickly identify vehicles parked in restricted spaces without authorization and vehicles parked in disabled spaces but owned by regular users. This time series analysis enables the extraction of meaningful insights, such as recurring patterns in parking lot occupancy over time. This information aids in predicting future demand, enabling parking administrators to allocate resources efficiently during peak hours and optimize space usage.
It can also be used to detect irregularities in parking patterns, aiding in the prompt identification of unauthorized or abnormal parking and of parking violations, including parking the wrong type of vehicle and parking in restricted or reserved areas.
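A minimal sketch of the kind of anomaly check the time series analysis module could perform on occupancy data. It assumes occupancy counts are sampled at regular intervals and flags time steps that deviate from the series mean by more than a chosen number of standard deviations; the function name, threshold, and data are illustrative, not the shipped implementation.

```python
from statistics import mean, stdev

def flag_anomalies(occupancy, k=2.0):
    """Flag time steps whose occupancy deviates more than k standard
    deviations from the series mean (a simple z-score test)."""
    mu = mean(occupancy)
    sigma = stdev(occupancy)
    return [i for i, x in enumerate(occupancy)
            if sigma > 0 and abs(x - mu) > k * sigma]

# Hourly occupancy counts for one day (hypothetical data);
# the spike at index 7 models an abnormal surge in parked vehicles.
counts = [12, 14, 13, 15, 14, 13, 12, 60, 14, 13, 15, 14]
print(flag_anomalies(counts))  # → [7]
```

A z-score test is deliberately simple; with longer histories, the same interface could wrap seasonal decomposition to account for daily peak-hour cycles.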
Videos (Time Series Analysis Module)
The proposed mobile application is developed using the Unity framework and seamlessly integrates image capture, note-taking, GPS tracking, and a robust SQLite database, offering users a comprehensive record-management system. This work focuses on applying time series analysis techniques to detect anomalous behavior in urban parking lots.
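The pin records described above map naturally onto a single database table. The sketch below uses Python's standard `sqlite3` module for illustration (the app itself is built in Unity, and the table and column names here are assumptions, not the shipped schema):

```python
import sqlite3

# In-memory database for illustration; the app would use a file on device.
conn = sqlite3.connect(":memory:")
conn.execute("""
    CREATE TABLE pins (
        id INTEGER PRIMARY KEY,
        latitude REAL,
        longitude REAL,
        timestamp TEXT,   -- ISO-8601 date and time the pin was placed
        plate_text TEXT,  -- text read from the license plate
        media_path TEXT   -- photo/video captured at the pin
    )""")
conn.execute(
    "INSERT INTO pins (latitude, longitude, timestamp, plate_text, media_path) "
    "VALUES (?, ?, ?, ?, ?)",
    (33.2075, -97.1526, "2023-11-10T09:30:00", "ABC-1234", "pins/0001.jpg"))
conn.commit()

# Retrieve all pins recorded on a given date for time series analysis.
rows = conn.execute(
    "SELECT plate_text, timestamp FROM pins WHERE timestamp LIKE ?",
    ("2023-11-10%",)).fetchall()
print(rows)  # → [('ABC-1234', '2023-11-10T09:30:00')]
```

Storing timestamps as ISO-8601 text keeps them sortable with plain string comparison, which is convenient for windowed queries in the analysis module.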
This project presents a real-time object detection and emotion recognition system implemented with React Native for mobile application development and Python Flask for backend support, leveraging the COCO dataset and OpenCV for robust, accurate detection. The integration of these technologies enables seamless camera-based object recognition and emotion analysis, offering a versatile and responsive user experience. Our object detection model, trained on the COCO dataset and powered by OpenCV, showcases accurate and responsive detection capabilities. Additionally, our emotion recognition module provides a seamless user experience, highlighting the project's potential in various practical applications. Looking ahead, there is broad scope for further enhancements and wider use of this technology across diverse domains. The work includes:
1) COCO Dataset: The COCO dataset serves as a foundational element of our object detection system. Its extensive and varied content is crucial for training a robust object detection model capable of recognizing a wide range of objects in real-time scenarios.
2) OpenCV Integration: OpenCV provides a comprehensive set of functions for image processing and computer vision tasks.
3) Emotion Detection: The core of our emotion detection system is a deep learning model trained to recognize
human emotions from images and video frames. We employed a convolutional neural network
(CNN) architecture for this purpose. The model was trained on a diverse dataset of
labeled facial expressions, allowing it to identify a spectrum of emotions, including
happiness, sadness, anger, and more.
4) Camera Integration: To capture and analyze user emotions in real-time, we seamlessly integrated the device's
camera into our application. This feature enables users to interact naturally with
our system without the need for additional sensors or hardware. Image frames from
the camera are processed on the device, ensuring privacy and real-time responsiveness.
Emotion predictions are then displayed to the user or transmitted to the backend for
further analysis if needed.
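The last step above, transmitting a captured frame to the backend, can be sketched as a small encode/decode pair. The payload shape, field names, and endpoint conventions are assumptions for illustration; the sketch shows only the lossless round trip of JPEG bytes through a base64 JSON payload, which is a common pattern for React Native clients talking to a Flask backend.

```python
import base64
import json

def encode_frame(jpeg_bytes: bytes, device_id: str) -> str:
    """Package a camera frame as a JSON payload the mobile client
    could POST to the Flask backend (payload shape is an assumption)."""
    return json.dumps({
        "device": device_id,
        "frame_b64": base64.b64encode(jpeg_bytes).decode("ascii"),
    })

def decode_frame(payload: str) -> bytes:
    """Server side: recover the raw JPEG bytes from the payload."""
    return base64.b64decode(json.loads(payload)["frame_b64"])

frame = b"\xff\xd8\xff\xe0fake-jpeg-data"  # stand-in for a real JPEG frame
payload = encode_frame(frame, "phone-01")
assert decode_frame(payload) == frame       # round trip is lossless
```

On the server, the decoded bytes would be handed to the OpenCV/CNN pipeline described in items 2 and 3; base64 adds roughly 33% overhead, which is acceptable for per-frame predictions but argues for on-device processing where privacy or bandwidth matters, as noted above.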
The goal of this project is to develop a point cloud data visualization in VR/AR using Cesium, Unity 3D, and Meta Quest, incorporating labeling and annotation.
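A minimal sketch of the annotation step, assuming points are (x, y, z) tuples and an annotation attaches a text label to every point within a chosen radius of the user's selection. The data layout and function name are illustrative assumptions, not the Unity implementation.

```python
import math

def annotate_points(points, center, radius, label):
    """Attach `label` to every point within `radius` of `center`.
    Returns a dict mapping point index -> label."""
    return {
        i: label
        for i, p in enumerate(points)
        if math.dist(p, center) <= radius
    }

# A tiny point cloud: two points near the origin, one far away.
cloud = [(0.0, 0.0, 0.0), (0.5, 0.2, 0.1), (5.0, 5.0, 5.0)]
labels = annotate_points(cloud, center=(0.0, 0.0, 0.0), radius=1.0,
                         label="ice ridge")
print(labels)  # → {0: 'ice ridge', 1: 'ice ridge'}
```

Radius-based selection mirrors how a VR controller ray or pinch gesture typically selects a neighborhood of points; for large clouds, a spatial index (octree or k-d tree) would replace the linear scan.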
Point Cloud Annotation v2 with menu (local point cloud file) | Point Cloud Viz in VR Meta Quest Pro |
Point Cloud Annotation v2 with menu using Cesium | Point Cloud Viz in VR Meta Quest Pro |
Bhoj Raj Bhatt and Sharad Sharma, “Mobile App for Object Tracking and Location-Based Data for Time Series Analysis”, special celebration for the 15th anniversary event, College of Information (COI), University of North Texas, November 10, 2023. (1st Place Award)
Sri Chandra Dronavalli and Sharad Sharma, “Crime Data Analysis and Visualization through HoloLens 2 and Oculus Quest Pro”, special celebration for the 15th anniversary event, College of Information (COI), University of North Texas, November 10, 2023. (2nd Place Award)
Maruthi Prasanna and Sharad Sharma, “Mobile Application for Identifying Anomalous Behavior and Conducting Time Series Analysis using Parking Lot Data”, special celebration for the 15th anniversary event, College of Information (COI), University of North Texas, November 10, 2023.
Suruthi Selvam and Sharad Sharma, “Real Time Object Detection and Emotion Detection via Camera using React Native, Python Flask, Coco Dataset and OpenCV”, special celebration for the 15th anniversary event, College of Information (COI), University of North Texas, November 10, 2023.