Overview

critter safARi
Conceptualizing a mobile app that leverages zoo visits to drive impact for people and our planet.
This project began as a self-led, cross-functional student team project that I directed, then evolved into a 2-week, AI-assisted solo design sprint. Ultimately, this project explores how we could conceptualize a viable, user-friendly product that improves ecological empathy.
Role
Duration
Project
Phase 1 Team
Jayne Leggatt (UX/UI, Branding)
Brendan Lazar (3D Design and Animation)
Tools
PHASE 1
TEAM PROJECT (6 WEEKS)
Catalyzing a Team Project
I pitched a product opportunity to a small class, attracting 2 cross-functional peers.
1
problem
1
solution
3
roles
6
weeks
Pitching the Opportunity
Visitors are often frustrated when zoo critters are not visible or are far away — so how can we balance the realities of zoo-keeping with visitor engagement?
This idea stemmed from a personal zoo visit where almost half of the animals were missing. I had also noticed that the sizes of animals in large enclosures were hard to gauge because they were so far away. User research affirmed these observations.
User research was distilled into our MVP persona, Timmy.
Primary User Persona: 12 Years Old · Sloth Enthusiast · Older Brother
"Both of our [his and his little sister's] favourite animals were missing! Jamie and I had waited for our day at the zoo for so long."
Pitching the Solution
Mobile AR can bring animals up close — providing unparalleled information about appearance and size. In addition, interactive educational snippets can nurture interest in ideas like conservation.
Business Strategy
The north star of our prototype was a one-off experience at our local zoo, but my strategy would later shift in phase 2.

Option A
Tapping circular button opens and scrolls to the related paragraph in the bottom sheet.

Option B
Tapping circular button opens the pop-up.
Leveraging Unity to Inform Design
Our team members disagreed on a specific interaction feedback pattern. To resolve the debate, I built 2 low-fidelity, interactive prototypes in Unity and compared them through lightweight A/B testing.
A/B Test Results
100% of testers preferred option B because it (1) placed the UI feedback in closer proximity to the user action and (2) presented less irrelevant information.
In retrospect, to increase testing accuracy, I would match the copy in the pop-up with paragraph 3 of the bottom sheet, since they are intended to represent the same information.
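As a side note on interpreting unanimous results from lightweight tests: the case study doesn't state the tester count, but with a small sample (say, 5 people), an exact sign test shows why a 100% preference is suggestive rather than statistically conclusive. The sketch below is illustrative; the sample size is an assumption.

```python
from math import comb

def sign_test_p(prefer_b: int, total: int) -> float:
    """Two-sided exact sign test: the probability of a split at least this
    lopsided if testers actually had no preference (50/50 either way)."""
    k = max(prefer_b, total - prefer_b)
    tail = sum(comb(total, i) for i in range(k, total + 1)) * 0.5 ** total
    return min(1.0, 2 * tail)

# With a hypothetical 5 of 5 testers preferring option B, p = 0.0625 —
# encouraging, but not below the conventional 0.05 threshold.
```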


Collaborative Achievement
Ultimately, we created a functional prototype where users can scan markers to reveal critters. The project earned an A+, excited faculty, and was showcased to junior students.
In phase 1, I developed in Unity, established multimedia workflows, managed the project in Notion, and conducted lightweight UX research. Meanwhile, my teammates took ownership of UX/UI, branding, and 3D design.
PHASE 2
SOLO DESIGN SPRINT (2 WEEKS)
Deepening User Research
To evolve this app, I conducted user interviews and competitive analysis that pointed to opportunities in wayfinding, education, and marketing.
New "How Might We"s
Simplify route planning
Support deeper learning
Set up a viable business plan
Phase 2 research was distilled into additional personas, scenarios, and opportunities.
Madeline
Secondary User Persona: 38 Years Old · Mother of 3 · Busy Professional
"Visiting as a family of 5 is always a lot to handle for my husband and I. Route planning, navigation, and handling every child's unique demands is a huge hurdle for us."
Mateo
Tertiary User Persona: 66 Years Old · Retired Small Business Owner
"The pain in my knees wishes we were guided through the main attractions via the shortest route possible. I also would've liked to see more conservation stories and such — my wife and I are always eager to learn more about how to make the world a more beautiful place for the future generations."
Celine
Quaternary User Persona: 21 Years Old · Undergraduate Student · Blogger
"Social media led me to visit the zoo today — my friend posted a super cute picture with the new baby fox!" "I wish there were cool events for people my age like night-time safaris."
Key UX Updates
To address the new HMWs, I added an LLM feature, changed the AR strategy, and enhanced business considerations.
Option 1: Preset Routes
(High Constraints Context)
OR
Option 2: LLM-Based Route Customization
(Low Constraints Context)
To meet users' unique route-planning demands while minimizing interaction effort, I integrated LLM-based route personalization, with preset routes as a fallback option depending on technical and business constraints.
Users have unique needs during their visit, like physical exertion limits and favourite animals, yet they don't want to spend time on complex route-planning processes. LLMs offer a stress-free, low-barrier, and natural way to plan a visit.
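The dual-path flow above could be sketched roughly as follows. Everything here is hypothetical — the route names, the constraint fields, and the `llm_complete` hook are illustrative stand-ins, not a real zoo dataset or LLM API:

```python
# Hypothetical sketch of the route-personalization flow: gather a visitor's
# constraints, build an LLM prompt, and fall back to preset routes when the
# LLM path is unavailable (the "high constraints" context).

PRESET_ROUTES = {
    "shortest": ["Entrance", "Big Cats", "Primates", "Exit"],
    "family": ["Entrance", "Petting Zoo", "Aviary", "Playground", "Exit"],
}

def build_route_prompt(constraints: dict) -> str:
    """Turn free-form visitor constraints into a single LLM prompt."""
    lines = [
        "Plan a zoo walking route as a comma-separated list of exhibit names.",
        f"Mobility limits: {constraints.get('mobility', 'none')}.",
        f"Favourite animals: {', '.join(constraints.get('favourites', [])) or 'any'}.",
        f"Time budget: {constraints.get('time_minutes', 120)} minutes.",
    ]
    return "\n".join(lines)

def plan_route(constraints: dict, llm_complete=None) -> list[str]:
    """Use an injected LLM completion function when provided; otherwise
    fall back to the preset route that best matches the constraints."""
    if llm_complete is not None:
        prompt = build_route_prompt(constraints)
        # A real integration would validate the reply against the zoo map.
        return [name.strip() for name in llm_complete(prompt).split(",")]
    key = "shortest" if constraints.get("mobility") else "family"
    return PRESET_ROUTES[key]
```

Injecting the completion function keeps the sketch testable offline and makes the preset fallback the default behaviour, mirroring the option-1/option-2 split above.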

Phase 1: Marker-Based AR
Phase 2: SLAM (Simultaneous Localization and Mapping)-Based AR
Our initial strategy anchored AR critters to physical markers to minimize the number of interactions required to reveal AR content. However, I switched to SLAM-based AR to avoid overcrowding near markers, enabling more seamless product scaling.
Instead of a one-off app experience that users often download over data and will likely delete, this product would ideally partner with major zoos to act as a consolidated tool that users can access freely, especially while travelling to new zoos or revisiting old ones.
This would enable user retention that makes way for continuous engagement with zoos — getting notified about new events, receiving updates about select critters, and collecting critter mementos. This ultimately aims to (1) nurture stronger wildlife compassion and eco-consciousness in the general public, (2) continuously generate revenue for the critter safARi team through B2B SaaS fees, and (3) incentivize stronger connections with and more frequent visits to the partnering zoos.
Phase 2: QR-Code Based UX Specificity within One App
Wireframe Guerrilla Testing
I tested this wireflow with 5 potential users via think-aloud sessions, integrating AI-generated visual assets so that interface intentions were clear to testers.
User Testing-Driven Iterations
Insights from users about increasing engagement and reducing interaction cost drove several improvements.
Switching to a seasonal, rotating cast of AR critters
Hypothesized Effect
Gamification that encourages zoo revisits
+
Newly Inserted Screen
+

Additional selfie option with AR critters
Hypothesized Effect
Improve social media presence and influence
Reduce interactions in route selection screen
Hypothesized Effect
Minimize interaction cost
Visual Identity
I found Midjourney especially helpful for exploring color palettes — and synthesized the remainder manually.
Final Prototype
The final output is a proof of concept, built with gen-AI visual assets, for a user-friendly, viable tool that also aims to nurture ecological empathy.
Success Metrics
I would use metrics like retention, ARPU, page exit rate, session length, and more to gauge success over time and identify areas for improvement.
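Two of the listed metrics can be made concrete with small formulas. This is a minimal sketch with hypothetical data shapes (cohorts as sets of user IDs), not a real analytics pipeline:

```python
def d7_retention(signup_cohort: set, day7_active: set) -> float:
    """Share of a signup cohort still active 7 days later —
    a common operationalization of 'retention'."""
    if not signup_cohort:
        return 0.0
    return len(signup_cohort & day7_active) / len(signup_cohort)

def arpu(total_revenue: float, active_users: int) -> float:
    """Average revenue per user over a period (e.g. monthly B2B SaaS
    fees attributed back to the active user base)."""
    return total_revenue / active_users if active_users else 0.0
```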
Reflections
The collaborative experience taught me the value of empathetic yet honest communication, while the rapid design sprint let me sharpen my core design process and explore AI as a creative partner to boost my design efficiency.
In a real-world context, I would seek stakeholder buy-in (e.g. zoo decision-makers and engineers) with the proof of concept and collaboratively determine the MVP roadmap — likely cutting back on features. Given more resources, I'd also consider a B2B version of the app to let zoos easily make updates and track engagement.