
VR CPR Simulator Prototype

Project summary

After my fourth term at university, two of my classmates and I were headhunted by a health-tech startup, VitalSigns AB. We were contracted to develop a prototype of an educational VR/XR simulator for CPR (cardiopulmonary resuscitation). The goal was an application that makes learning CPR more fun and educational while placing the user in immersive, life-like accident situations that reinforce the gravity of what's being taught. The prototype was intended as a proof of concept for investors.


Demo video shown at the delivery of the completed prototype

Development

The prototype was developed part-time over 20 weeks in 2022. Following a briefing and a requirements specification provided by VitalSigns, tasks were allocated among the team members. We held periodic but spaced-out meetings with the company to ensure our progress aligned with their expectations and to make refinements where necessary. Once the prototype was completed to VitalSigns' satisfaction, we were compensated and presented with job offers.

My contribution
When the team discussed how to divide up the work, the more VR-specific tasks naturally fell to me as the only member with prior familiarity and experience with VR and Oculus. My main focus during most of the development cycle was the design and implementation of:


  • A "digital double" persistent object system for the real-life CPR mannequin, built on Oculus spatial anchors (see the first sketch after this list)

  • An intuitive setup wizard UI for the CPR digital double, controllable through both the Oculus wands and hand tracking

  • Localisation for both Swedish and English

  • A CPR rhythm detection system (see the second sketch after this list)

  • A high score system and main menu 
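
To illustrate the digital double idea: the virtual mannequin is pinned to a fixed real-world pose with a spatial anchor so that it reappears in the same place across sessions. Below is a minimal sketch of the placement-and-save step. The spatial anchor API changed considerably between SDK releases, so the calls shown (OVRSpatialAnchor, Created, Save) reflect a later Oculus Integration version, and the class and key names are illustrative rather than the shipped code.

using UnityEngine;

// Pins the virtual "digital double" of the CPR mannequin to a fixed
// real-world pose so it survives app restarts. Sketch only: the spatial
// anchor API varied across SDK versions, and names here are illustrative.
public class MannequinAnchor : MonoBehaviour
{
    private const string AnchorKey = "mannequin_anchor_uuid"; // hypothetical PlayerPrefs key

    // Called from the setup wizard once the user has placed the double
    // on top of the physical mannequin.
    public void PinDouble()
    {
        var anchor = gameObject.AddComponent<OVRSpatialAnchor>();
        StartCoroutine(SaveWhenCreated(anchor));
    }

    private System.Collections.IEnumerator SaveWhenCreated(OVRSpatialAnchor anchor)
    {
        // The anchor is created asynchronously by the runtime.
        yield return new WaitUntil(() => anchor.Created);

        anchor.Save((savedAnchor, success) =>
        {
            if (success)
            {
                // Persist the UUID so the double can be re-localised next session.
                PlayerPrefs.SetString(AnchorKey, savedAnchor.Uuid.ToString());
                PlayerPrefs.Save();
            }
        });
    }
}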
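
The rhythm detection boils down to timing chest compressions against the recommended pace of 100-120 compressions per minute. The prototype's actual implementation is not shown here; a simplified, hypothetical version of the timing logic could look like this:

using System.Collections.Generic;
using UnityEngine;

// Simplified compression-rate tracker. RegisterCompression() is called
// whenever a chest compression is detected; the class reports the current
// rate and whether it falls within the recommended 100-120 per-minute band.
public class CprRhythmTracker : MonoBehaviour
{
    private const float WindowSeconds = 5f;          // sliding window for the rate estimate
    private readonly Queue<float> compressionTimes = new Queue<float>();

    public void RegisterCompression()
    {
        compressionTimes.Enqueue(Time.time);
    }

    public float CompressionsPerMinute()
    {
        // Drop compressions that have fallen out of the sliding window.
        while (compressionTimes.Count > 0 && Time.time - compressionTimes.Peek() > WindowSeconds)
            compressionTimes.Dequeue();

        return compressionTimes.Count * (60f / WindowSeconds);
    }

    // Guidelines recommend 100-120 compressions per minute.
    public bool IsOnRhythm()
    {
        float cpm = CompressionsPerMinute();
        return cpm >= 100f && cpm <= 120f;
    }
}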


The most significant challenge I faced during this project was the limited and outdated documentation for the Oculus Unity SDK. At the time, only two tutorials were available for Oculus spatial anchors, both of which used deprecated versions of the SDK. As I was relatively new to VR, the scarcity of information forced me to rely on analytical problem-solving, often through trial and error. Notably, I devised my own UI components, leveraging both Unity and Oculus SDK elements. One such component was a click handler and counter for Oculus hand tracking, built to facilitate intricate hand-tracked interactions. The associated code is outlined below.

The OVRClickTracker class and a usage example. The click tracker makes up for the lack of physical "buttons" when using hand tracking instead of the Oculus wands.
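
The sketch below condenses the idea. It relies on the OVRHand component from the Oculus Integration package (IsTracked and GetFingerIsPinching); the field and event names are illustrative rather than the exact production code.

using System;
using UnityEngine;

// Turns index-finger pinch gestures into discrete "click" events,
// emulating a controller button for hand tracking. Fires on the
// rising edge of a pinch and keeps a running click count.
public class OVRClickTracker : MonoBehaviour
{
    [SerializeField] private OVRHand hand;   // assigned in the Inspector

    public event Action OnClick;             // raised once per completed pinch
    public int ClickCount { get; private set; }

    private bool wasPinching;

    private void Update()
    {
        if (hand == null || !hand.IsTracked)
            return;

        bool isPinching = hand.GetFingerIsPinching(OVRHand.HandFinger.Index);

        // Rising edge: the pinch started this frame, so count it as a click.
        if (isPinching && !wasPinching)
        {
            ClickCount++;
            OnClick?.Invoke();
        }

        wasPinching = isPinching;
    }
}

// Usage example: advance a setup-wizard step on every hand-tracked "click".
public class SetupWizardStep : MonoBehaviour
{
    [SerializeField] private OVRClickTracker clickTracker;

    private void OnEnable()  => clickTracker.OnClick += Advance;
    private void OnDisable() => clickTracker.OnClick -= Advance;

    private void Advance()
    {
        Debug.Log($"Wizard advanced (click #{clickTracker.ClickCount})");
    }
}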
