Jacob Presents at the Signature Event of Undergraduate Research Week, the Undergraduate Research Symposium (URS)

Since 2008, the Undergraduate Research Symposium (URS) has been the hallmark event of Undergraduate Research Week at the University of Illinois (U of I). Initially drawing just a few hundred undergraduates, the event has grown to well over 800 student participants. Throughout the day-long event, students showcase their research through concurrent oral and poster presentations as well as creative performances. This diverse array of presentations not only reflects the wide spectrum of academic disciplines at U of I but also underscores the exceptional quality of our undergraduate scholars. Our students are characterized by their innovation, engagement, and eagerness to learn from their advisors and mentors. Moreover, their projects exemplify our institution's ongoing commitment to fostering and broadening research opportunities, both within U of I and beyond.

HXRI Lab member Jacob presented a poster entitled “Towards Seamless Integration: Realtime Object Detection in Mixed and Virtual Reality”. His research examines how recent advances in the computational capabilities of mobile devices have enabled a range of applications aimed at improving quality of life. Among these innovations, augmented reality (AR) applications stand out, leveraging smartphone cameras to superimpose digital content onto real-world images. Many AR apps rely on machine learning models for object detection to enhance their functionality. However, applying similar techniques to virtual reality (VR) and mixed reality (MR) headsets is challenging due to constraints such as limited computational power and low-resolution cameras. Recent headsets, such as the Meta Quest 3 and Apple Vision Pro, address these limitations with greater computational resources and higher-resolution external cameras.

His study assesses the feasibility of real-time object detection using machine learning on VR/MR headsets. Using a custom Unity application, he evaluated several object detection models (e.g., YOLO, Tiny-YOLO) on the passthrough camera feed of a Quest 3. The analysis focused on latency and accuracy, revealing that only smaller models such as Tiny-YOLO achieve sufficiently low latency for practical use. Future improvements in VR/MR headset design, including greater computational resources and higher camera resolutions, hold promise for running larger machine learning models with minimal latency.
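The kind of latency-focused evaluation described above can be sketched in outline. The snippet below is a minimal, hypothetical illustration in stand-alone Python, not the study's actual Unity/Quest 3 pipeline: `run_inference` is a stub standing in for a real model call (e.g., Tiny-YOLO through an inference runtime), and `measure_latency` times each per-frame detection call and summarizes mean and 95th-percentile latency along with the implied frame rate.

```python
import time
import statistics

def run_inference(frame):
    # Hypothetical stand-in for a real detector call; a real pipeline
    # would run a network such as Tiny-YOLO on a passthrough camera image.
    time.sleep(0.005)  # simulate ~5 ms of model latency per frame
    return []  # a real detector would return (label, confidence, bbox) tuples

def measure_latency(frames, detector):
    """Time each detection call and summarize, as in a latency-focused evaluation."""
    latencies_ms = []
    for frame in frames:
        start = time.perf_counter()
        detector(frame)
        latencies_ms.append((time.perf_counter() - start) * 1000.0)
    latencies_ms.sort()
    mean_ms = statistics.mean(latencies_ms)
    return {
        "mean_ms": mean_ms,
        "p95_ms": latencies_ms[int(0.95 * (len(latencies_ms) - 1))],
        "fps": 1000.0 / mean_ms,  # frame rate implied by mean latency
    }

stats = measure_latency([None] * 50, run_inference)
print(f"mean {stats['mean_ms']:.1f} ms, ~{stats['fps']:.0f} FPS")
```

For a practical-use threshold like the one the poster discusses, the implied frame rate is the key number: a model whose mean latency stays well under the headset's frame budget (e.g., ~14 ms for 72 Hz) can run per-frame, while larger models must be throttled or offloaded.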