Medical VR UI

liVeR | VR and desktop tools enhancing how surgeons learn patient anatomy.

Liver surgeons are incredible professionals who juggle 20+ hour surgeries, education and research responsibilities, patient interactions, and their personal lives. This project helps liver surgeons at the Toronto General Hospital reclaim time by streamlining a common process that has long been ripe for innovation.

Role

Product Designer, Unity Developer, UX Researcher, Project Lead

Duration

16 months (part-time)

Project

UHN-Funded M.Sc. Capstone Project

Clients

Collaborators

Tools

Output

DISCOVERY & DEFINE

Exploratory Research / Design Document

I led research revealing that liver anatomy interpretation is an essential yet time-consuming step in every liver surgery, and that new tools would only gain adoption if they fit seamlessly into surgeons' busy lifestyles.

Research methods used: interviews, contextual inquiries, media audit, and literature review.

As an analogy, liver surgeons often rely on sparse cross-sections like those on the left to mentally construct the intricate anatomy on the right, which can vary dramatically from patient to patient.

With so much on my plate, I would simply be disinclined to learn something complicated with marginal benefits.

- Primary Persona, Luci the Expert Surgeon

Removing ambiguity from CT/MRI scans via 3D models would benefit all liver surgeons, no matter their levels of experience.

- Secondary Persona, Charlie the Hopeful Surgical Fellow

Could we leverage everything I learned through research — about surgeons' habits, needs, and goals as well as team context and workflows — to design effortless new tools that slash the time needed for this routine process by half?

The new tool I envisioned would make it seamless to find and view pre-built 3D models, whether independently or collaboratively. In terms of modality, we could channel the clients' enthusiasm for VR into an internal tool and satisfy their need for scientific rigor by building desktop and VR twin tools for a comparative study.

My literature review had shown that desktop and VR each contend to be the superior 3D medium: desktop is considered more familiar and accessible, while VR is not only novel and exciting but also affords intuitive spatial interactions. The literature remained divided, however, on which of the two makes 3D liver anatomy easier to understand.

liVeR design document created in Notion

Research Phase Output: Design Document

(A living project document containing everything from personas and user journeys to information architecture, user flows, and project requirements.)

DEVELOP & TEST

Prototyping / User Testing

A challenge in this project was engaging directly with non-design stakeholders to find alignment and improve the VR design solutions. Static 2D mockups would have dramatically hampered that communication, so I learned Unity, created fully interactive prototypes, and used them to seek feedback.

An unexpected benefit of this strategy was the confidence it immediately cultivated in our stakeholders regarding the project.

Initial Prototypes: select scenes.

I conducted a total of 10 usability testing sessions, and organized regular design critiques with my collaborators. This process drove several key iterations.

1. Though the initial design was functional, I saw it as underusing VR's immersive potential. So, despite the trend of “laser” interactions in the selection lobbies of popular VR apps, we maximized “physical” interactions like grabbing instead, which proved to enhance engagement and task efficiency.

2. Testing showed that the top-right button required extra effort to reach. I redesigned the layout so that all buttons are within easy reach while preserving the essence of the Windows-style, top-right session controls that surgeons were most familiar with. This change accommodates both users with disabilities and smaller physical spaces.

3. Testing also informed how I balanced technical feasibility with usability. For example, I dropped momentum physics, initially added for immersion and task efficiency, after finding that it interfered with more critical scripts and went unnoticed by users in testing. (A hedged sketch of the resulting grab setup follows this list.)

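To make iterations 1 and 3 concrete, here is a minimal sketch of how the grabbable liver model could be configured. It assumes Unity's XR Interaction Toolkit, which this write-up does not explicitly name, and `LiverModelSetup` is a hypothetical helper rather than the project's actual script.

```csharp
using UnityEngine;
using UnityEngine.XR.Interaction.Toolkit;

// Hypothetical helper: configures a 3D liver model so it can be grabbed
// directly with the hands ("physical" interaction) rather than pointed at
// with a laser, and so it carries no momentum when released.
[RequireComponent(typeof(XRGrabInteractable))]
public class LiverModelSetup : MonoBehaviour
{
    void Awake()
    {
        var grab = GetComponent<XRGrabInteractable>();

        // Follow the hand's position and rotation while held.
        grab.movementType = XRBaseInteractable.MovementType.Kinematic;

        // Iteration 3: momentum physics was dropped after testing,
        // so the model stays exactly where the user lets go of it.
        grab.throwOnDetach = false;
    }
}
```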

Various other design considerations were also validated by users during testing.

Familiarity / I co-created with clinicians to ensure that our information hierarchy and copy were immediately familiar to users. I also mimicked interaction patterns that my contextual inquiries had shown were already familiar to liver surgeons.

Inclusivity / Since few of our users had prior VR experience, I iteratively designed 18 GIFs, 1 video, and the supporting UI to assemble visual-first onboarding modules that our diverse user base could grasp quickly.

Simplicity / To make it faster and easier for surgeons to interpret the gold-standard 2D cross-sections (i.e. CTs and MRIs), we not only featured a pre-built 3D model but also included innovative features that link the 2D and 3D views, namely color-coding and indicator planes (see the sketch after this list).

Comfort / To reduce potential motion sickness in VR, I complied with Meta's recommended frame rate and minimized the head movement needed to navigate the scene. This was informed in part by my VR media audit, where I observed that head motion, particularly rapid vertical motion, was especially nauseating.

Immersion / Adding custom audio feedback to every interaction not only assured users that an action had been performed; users also reported that it added tremendous value to immersion and engagement.
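
As an illustration of the Simplicity point above, the sketch below shows one way the 2D-to-3D link could work: as the user scrolls through CT/MRI slices, an indicator plane moves to the matching height inside the 3D liver model. The `SliceIndicator` component, its fields, and the linear slice-to-height mapping are hypothetical reconstructions, not the project's actual implementation.

```csharp
using UnityEngine;

// Hypothetical component: keeps a thin indicator plane aligned with the
// CT/MRI slice currently shown on the 2D viewer, so surgeons can relate
// the familiar cross-section to its position in the 3D model.
public class SliceIndicator : MonoBehaviour
{
    [SerializeField] Transform indicatorPlane; // thin quad inside the model
    [SerializeField] Bounds modelBounds;       // local-space bounds of the liver model
    [SerializeField] int sliceCount = 120;     // number of slices in the scan

    // Called by the 2D viewer whenever the displayed slice changes.
    public void SetSlice(int sliceIndex)
    {
        float t = Mathf.Clamp01(sliceIndex / (float)Mathf.Max(1, sliceCount - 1));

        // Map the slice index linearly onto the model's vertical extent.
        float y = Mathf.Lerp(modelBounds.min.y, modelBounds.max.y, t);

        Vector3 p = indicatorPlane.localPosition;
        indicatorPlane.localPosition = new Vector3(p.x, y, p.z);
    }
}
```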

User-Validated Prototypes: select scenes.

Users acknowledged and appreciated the early-stage branding implemented within the apps, which I designed for the sake of memorability — supporting continued funding applications.

liVeR logo's evolution.

liVeR's VR environment.

DELIVER

Project Reflection

The final outputs of this project include 3 applications and 1 video.

Multi-User VR App / An internal VR tool for liver surgeons to easily find, review, and discuss patients' 3D liver anatomy.

Twin Apps for Comparative Study / VR and desktop twin tools that onboard users to the respective technologies and then lead users to review 3D liver anatomy.

Onboarding Video for VR First-Timers / As a prelude to the VR component of the twin apps, this video helps surgeons, who are likely first-time users of VR, quickly learn the technology.

Final Visuals: select scenes from the multi-user VR app (Left) and twin apps for comparative study (Right).

"You did such a great job, and you really brought a long-awaited vision to life."

"I can really see myself incorporating this into my current workflow."

"This tool is beautiful."

- Product Testers, Experienced Surgeons and Surgical Trainees

This product was tested by the liver surgical team at Toronto General Hospital, including experienced surgeons, fellows, residents, and medical students. It was well received and, by demonstrating a promising product experience, ultimately helped secure $75,000 in continued funding. A team is currently being assembled to prepare liVeR for formal launch.

Upon launch, I'd be interested in tracking metrics like scene exit rates, session lengths, and error rates, and I'd also gather qualitative feedback. This feedback would allow us to gauge success and pinpoint opportunities for improvement and iteration.
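
As a rough illustration of that instrumentation, the sketch below logs session length and scene exits from a Unity client. The `UsageMetrics` component and its event names are hypothetical; a real build would forward these events to an analytics backend rather than the Unity log.

```csharp
using UnityEngine;
using UnityEngine.SceneManagement;

// Hypothetical instrumentation: records session length and scene exits so
// post-launch usage can be reviewed. Events are only written to the Unity
// log here; a production build would send them to an analytics service.
public class UsageMetrics : MonoBehaviour
{
    float sessionStart;

    void Awake()
    {
        sessionStart = Time.realtimeSinceStartup;
        DontDestroyOnLoad(gameObject);

        // Record each scene exit with a timestamp relative to session start.
        SceneManager.sceneUnloaded += scene =>
            Debug.Log($"metric=scene_exit scene={scene.name} " +
                      $"t={Time.realtimeSinceStartup - sessionStart:F1}s");
    }

    void OnApplicationQuit()
    {
        // Record the total session length when the app closes.
        Debug.Log($"metric=session_length " +
                  $"seconds={Time.realtimeSinceStartup - sessionStart:F1}");
    }
}
```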

Thanks to this project, I sharpened my design fundamentals, learned the value of storytelling in all communication contexts, and improved my programming abilities.

Less is More / Through liVeR, I was able to improve my product design sensibilities — ultimately learning that reducing visual clutter, aiming for 'perfection' only where it counts, and leveraging existing design guidelines can support the pragmatic approach needed in product design.

Iteration is King / Testing with actual users revealed things that empathy alone wouldn't have.

Coding with AI / My experience showed me that knowing scripting fundamentals is still crucial in the age of AI. It allowed me to craft more accurate prompts and quickly fix AI-generated scripts when they didn't perform to expectations.

Storytelling is Everywhere / Communication is one of a designer's most critical skills, and applying storytelling techniques across presentations, documents, and user interfaces played an important role in moving the project forward and reducing friction.

Thank you to everyone who made this project possible, and thank you to the University Health Network for funding this project. A special thanks to my collaborators, who provided generous guidance.

Next Case Study

Next: critter safARi

SERVICES

AR/VR Design & Unity Dev.

Web Design & No-Code Dev.

Mobile App Design

3D Modelling & Animation

Branding & Graphic Design

Illustrations & Videos

Design & Product Strategy

Tools

Figma, Framer

Adobe CC (Ai, Ps, Ae, Pr, Id)

Blender, Maya, ZBrush

Unity, Xcode, GitHub, VS

Notion

Domain Knowledge

General Surgery (HPB & Bowel)

Human Anatomy & Diseases

Drug Discovery/Med. Chem.

Oncology

Biophysics

Biochemistry

Responsive website from scratch by Remi Gao. | 2025 |

Thanks for visiting!
