PROJECTS




YCGH: A Therapeutic VR Journey for Anxiety Management
SIGGRAPH 2025 Poster, presented in the CAVE2™ environment.

Summary
Immersive VR experience that visualizes anxiety-management techniques through interactive storytelling and multisensory cues. It invites users to reflect on emotions, regulate anxiety, and build resilience through calm, self-guided exploration.

Interaction Flow
1 · Awareness
A dark typographic space visualizing an anxious state of mind.
2 · Release
Watery letterforms expressing negative emotions melt on touch, signifying acceptance.
3 · Breathing
Floating spheres guide a 5-5-5 breathing exercise.
4 · Grounding
A nature-inspired space for the 5-4-3-2-1 grounding exercise, where users reconnect with the present.





Process
Background Research    Reviewed prior VR anxiety work; identified the need for autonomous, narrative-driven design.
Concept Development    Sketched spatial metaphors and storyboarded a four-stage emotional journey.
Iterative Prototyping    Built weekly Unity prototypes and refined pacing and tone through participant feedback.
System Implementation    Integrated custom assets, responsive breathing spheres (sketched below), and ambient sound.
Evaluation & Reflection    Observed engagement in CAVE2 and refined scenes and transitions based on feedback.
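
A minimal sketch of how the responsive breathing spheres could be driven in Unity, scaling a sphere through the 5-5-5 inhale-hold-exhale cycle. The class name, field names, and timing values here are illustrative assumptions, not the project's actual implementation.

using UnityEngine;

// Illustrative sketch: scales a sphere through a 5-5-5 breathing cycle.
// Attach to a sphere GameObject; all names and values are assumptions.
public class BreathingSphere : MonoBehaviour
{
    public float phaseDuration = 5f;   // seconds per inhale / hold / exhale phase
    public float minScale = 0.5f;      // sphere size at full exhale
    public float maxScale = 1.5f;      // sphere size at full inhale

    private float timer;

    void Update()
    {
        timer += Time.deltaTime;
        float cycle = phaseDuration * 3f;
        float t = timer % cycle;

        float scale;
        if (t < phaseDuration)                      // inhale: grow
            scale = Mathf.Lerp(minScale, maxScale, t / phaseDuration);
        else if (t < phaseDuration * 2f)            // hold: stay expanded
            scale = maxScale;
        else                                        // exhale: shrink
            scale = Mathf.Lerp(maxScale, minScale, (t - phaseDuration * 2f) / phaseDuration);

        transform.localScale = Vector3.one * scale;
    }
}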






Squishy Log
Fourth-year thesis, project in development

Squishy Log explores how tangible interfaces can make anxiety tracking more accessible and comforting.

Phase 1: Log
Designed Log, a mobile app supporting users in identifying anxiety triggers and high-stress moments.
Features a comforting support character, daily reflection “Logs,” and progress visualizations to encourage routine and self-awareness.
🔗 link to app video

Phase 2: Squishy Log (in progress)
Developing a sensor-embedded stress ball that connects with the app, enabling users to express and record emotions through touch and gesture. This phase explores how tangible interaction can bridge physical expression and digital reflection, making emotion regulation more personal, playful, and humane.
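
A minimal sketch of the kind of squeeze-to-log mapping this phase explores, assuming the ball streams normalized pressure samples to the app. The detector class, threshold, smoothing factor, and log format are hypothetical placeholders rather than the actual design.

using System;

// Illustrative sketch: turns a stream of (hypothetical) pressure samples from
// the stress ball into discrete "squeeze" events that the Log app could record.
public class SqueezeDetector
{
    private readonly double threshold;   // smoothed pressure that counts as a squeeze
    private readonly double alpha;       // smoothing factor for noisy sensor readings
    private double smoothed;
    private double peak;
    private bool squeezing;

    public SqueezeDetector(double threshold = 0.6, double alpha = 0.5)
    {
        this.threshold = threshold;
        this.alpha = alpha;
    }

    // Feed one normalized pressure sample (0..1); returns a log line when a squeeze ends.
    public string Feed(double sample, DateTime timestamp)
    {
        smoothed = alpha * sample + (1 - alpha) * smoothed;   // exponential smoothing

        if (!squeezing && smoothed > threshold)
        {
            squeezing = true;
            peak = smoothed;
        }
        else if (squeezing)
        {
            peak = Math.Max(peak, smoothed);
            if (smoothed < threshold)
            {
                squeezing = false;
                return $"{timestamp:HH:mm:ss}  squeeze recorded (peak pressure {peak:F2})";
            }
        }
        return null;
    }
}

public static class Demo
{
    public static void Main()
    {
        var detector = new SqueezeDetector();
        double[] samples = { 0.1, 0.2, 0.8, 0.9, 0.9, 0.7, 0.3, 0.1 };   // simulated readings
        foreach (var s in samples)
        {
            var entry = detector.Feed(s, DateTime.Now);
            if (entry != null) Console.WriteLine(entry);
        }
    }
}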






Exploring Gaze-Based Interaction
KAIST HCI Lab

During a summer research internship at KAIST, I designed and conducted a gaze-tracking study exploring how wearable sensing systems can enhance interactions with IoT devices.
Working independently, I modified the Tobii Pro Glasses 2 with an additional sensing component, calibrated the setup to align with the study's goals, and led quantitative data collection and analysis. Through this work, I gained hands-on experience in experimental design and sensor-based prototyping.

(This project is part of ongoing research and cannot yet be publicly shared.)






Fulfillment: CAVE2 Virtual Reality Exhibition
ACM Hypertext 2025, NSF Award #2121737
UIC Electronic Visualization Laboratory + UChicago Neubauer Collegium



Fulfillment is part of an interdisciplinary research project examining logistics as both a material system and a global condition of movement, labor, and technology. The broader project investigates how digital infrastructures such as sensors, supply chains, and data networks organize global flows of people, goods, and information, and reveal patterns of inequality and precarity.

For the CAVE2 exhibition, I created an interactive VR scene depicting the intensity of warehouse logistics work. The piece simulates the challenge of locating and packing parcels under time pressure, allowing viewers to experience the disorienting and embodied dimensions of logistical systems.
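
A minimal sketch of how the time-pressured picking task could be structured in Unity, assuming a countdown per randomly assigned target parcel; the class, method names, and timings are illustrative assumptions, not the exhibition's actual code.

using UnityEngine;

// Illustrative sketch: assigns a random parcel to locate and counts down,
// mirroring the time pressure of warehouse picking. Names are assumptions.
public class PickingTask : MonoBehaviour
{
    public Transform[] parcels;        // parcel objects placed in the scene
    public float secondsPerTask = 20f; // time allowed to locate and pack one parcel

    private Transform target;
    private float timeLeft;

    void Start() { AssignNextParcel(); }

    void Update()
    {
        timeLeft -= Time.deltaTime;
        if (timeLeft <= 0f)
        {
            Debug.Log("Time expired, parcel missed");
            AssignNextParcel();
        }
    }

    // Called by the interaction system when the viewer packs the target parcel.
    public void OnParcelPacked(Transform parcel)
    {
        if (parcel == target)
        {
            Debug.Log($"Packed with {timeLeft:F1}s to spare");
            AssignNextParcel();
        }
    }

    void AssignNextParcel()
    {
        target = parcels[Random.Range(0, parcels.Length)];
        timeLeft = secondsPerTask;
    }
}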

Working with researchers from computer science, anthropology, and media arts, I helped translate complex theories of global circulation into a tangible, spatial experience that exposes the human scale of automation and mobility.






Image courtesy of ClicBot.

MyTurn: Social Robotics Curriculum
UIC Learning + Interest + Technology Lab

MyTurn is an ongoing research project exploring how social robots can help students see computing as creative, collaborative, and personally meaningful. Our team designed and studied classroom activities that bring values into computer science learning.

In past studies, I focused on developing a value-centered intervention and analyzing classroom footage through qualitative coding and thematic analysis. This work led to our publication, “Pre-Coding Scaffolds for Computational Thinking in an Open-Ended Middle School Social Robotics Curriculum” (ICLS 2025).

Our current study focuses on co-designing the format we developed together with students, mentors, and industry professionals.