Student Scholar Symposium Abstracts and Posters

Document Type

Chapman access only poster or presentation

Publication Date

Spring 5-3-2023

Faculty Advisor(s)

Dr. Trudi Qi


Virtual reality (VR) technologies have the potential to transform human lives, but to truly unlock their benefits, they must provide personalized assistance to users in their virtual activities. Machine learning (ML) models offer a promising solution to this challenge by leveraging data to understand user needs. While modern VR devices offer high-end tracking of the user's head and hands, AI technologies for interpreting human activities in VR are still in their infancy. Our project addresses this challenge by creating ML models that interpret human intentions and predict user actions from VR tracking data. We began by analyzing a massive VR dataset containing human activity data in a virtual environment, with the goal of labeling tracking data collected from users' heads and hands with specific human activities. To do this accurately and efficiently, we developed a novel rule-based labeling system: by analyzing both hand and object motion data, we built a hierarchy of rules that distinguishes hand-object interactions, such as reaching, moving, and tossing, using hand motion metrics like position, velocity, and proximity to objects. When tested on manually labeled data, the system achieved an accuracy of 91.2% for the "moving" activity. We plan to fine-tune the metrics for labeling other hand activities. Our next step is to train ML models and integrate them into a virtual rehabilitation program: by recognizing the activity a user is performing, such as reaching for an object, the program can provide feedback and assistance toward their rehabilitation goals. In summary, our project takes a crucial step toward personalized assistance in virtual reality. By developing novel AI technologies to interpret human activities in VR, we can help users achieve more in their everyday lives.
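A rule hierarchy of the kind described above could be sketched as follows. This is a minimal illustration only: the thresholds, frame format, activity set, and function names are hypothetical assumptions, not the project's actual implementation or values.

```python
# Illustrative sketch of a rule hierarchy for labeling hand-object
# interactions from VR tracking data. All thresholds and the frame
# format are hypothetical, not the project's actual values.
import math

NEAR_OBJECT_DIST = 0.15   # meters: hand counts as "near" an object
MOVE_SPEED_MIN = 0.05     # m/s: below this the hand is considered at rest
TOSS_SPEED_MIN = 1.0      # m/s: fast motion while gripping suggests a toss

def distance(a, b):
    """Euclidean distance between two 3D points."""
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

def label_frame(hand_pos, hand_vel, obj_pos, holding):
    """Label one tracking frame with a hand-object interaction.

    hand_pos, obj_pos: (x, y, z) positions in meters
    hand_vel: (vx, vy, vz) hand velocity in m/s
    holding: whether the hand currently grips the object
    """
    speed = math.sqrt(sum(v ** 2 for v in hand_vel))
    near = distance(hand_pos, obj_pos) < NEAR_OBJECT_DIST

    # Rules are checked from most specific to least specific.
    if holding and speed >= TOSS_SPEED_MIN:
        return "tossing"   # fast motion while gripping the object
    if holding and speed >= MOVE_SPEED_MIN:
        return "moving"    # carrying the object at moderate speed
    if not holding and not near and speed >= MOVE_SPEED_MIN:
        return "reaching"  # empty hand traveling toward a distant object
    return "idle"          # hand at rest or no clear interaction
```

In practice, per-frame labels like these would be smoothed over time windows before being used as training targets for an ML model.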


Presented at the Spring 2023 Student Scholar Symposium at Chapman University.
