Overview

critter safARi
Conceptualizing a mobile app that leverages zoo visits to drive impact on people and our planet.
This project began as a self-initiated, cross-functional student team project that I directed, and it later evolved into a 2-week, AI-assisted solo design sprint.
Ultimately, this project is an exploration of how to conceptualize a viable, user-friendly product that nurtures ecological empathy.
Role
Duration
Project
Cycle 1 Team
Jayne Leggatt (UX/UI, Branding)
Brendan Lazar (3D Design and Animation)
Tools
CYCLE 1
TEAM PROJECT (6 WEEKS)
Problem Definition
Project Pitching
Teamwork
Early Prototyping
User Research & Testing
Catalyzing a Team Project
I pitched a product opportunity to a small class, attracting two cross-functional peers.
1 problem · 1 solution · 3 roles · 6 weeks
Pitching the Opportunity
Visitors are often frustrated when zoo critters are not visible or are far away — so how can we balance the realities of zoo-keeping with visitor engagement?
This idea stemmed from a personal experience, and user research affirmed my observations: at a major zoo I visited, almost half of the animals were not visible, and it was difficult to gauge the sizes of animals in large enclosures because they were so far away.
Opportunity Embodied as a User
User research was distilled into our MVP persona, Timmy.
Primary User Persona: 12 Years Old · Sloth Enthusiast · Older Brother
"Both of our [his and his little sister's] favourite animals were missing! Jamie and I had waited for our day at the zoo for so long."
Pitching the Solution
Mobile AR can bring animals close to the user, providing rich information about their appearance and size. In addition, interactive educational snippets can nurture interest in ideas like conservation.
North Star
This app was intended to be a one-off experience for our local zoo.

Option A
Tapping a circular button opens the bottom sheet and scrolls to the related paragraph.

Option B
Tapping a circular button opens a pop-up.
Leveraging Unity to Inform Design
I built low-fidelity, Unity-based, interactive prototypes to decide between two interaction patterns.
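For a sense of how these lightweight prototypes behaved, here is a minimal sketch of how the two patterns could be wired in Unity (C#). The HotspotButton component and the scroll math are illustrative assumptions, not the actual prototype code.

```csharp
using UnityEngine;
using UnityEngine.UI;

// Hypothetical component attached to each circular hotspot button in the prototype.
public class HotspotButton : MonoBehaviour
{
    public enum Variant { A_ScrollToParagraph, B_OpenPopup }

    [SerializeField] Variant variant;
    [SerializeField] ScrollRect bottomSheet;        // Option A: sheet containing the paragraphs
    [SerializeField] RectTransform targetParagraph; // Option A: paragraph linked to this hotspot
    [SerializeField] GameObject popup;              // Option B: pop-up anchored near the hotspot

    void Awake() => GetComponent<Button>().onClick.AddListener(OnTap);

    void OnTap()
    {
        if (variant == Variant.A_ScrollToParagraph)
        {
            // Option A: open the bottom sheet, then scroll its content so the
            // related paragraph sits at the top of the viewport.
            bottomSheet.gameObject.SetActive(true);
            Vector2 pos = bottomSheet.content.anchoredPosition;
            bottomSheet.content.anchoredPosition =
                new Vector2(pos.x, -targetParagraph.anchoredPosition.y);
        }
        else
        {
            // Option B: feedback appears immediately beside the tapped button.
            popup.SetActive(true);
        }
    }
}
```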
A/B Test Results
100% of testers preferred option B because it offered (1) closer proximity between the user action and the UI feedback and (2) less irrelevant information.
In retrospect, to increase testing accuracy, I would have matched the copy in the pop-up with paragraph 3 of the bottom sheet, since both are intended to represent the same information.


Collaborative Achievement
Ultimately, we created a functional prototype in which users scan markers to reveal critters.
In this cycle, I developed in Unity, established multimedia workflows, managed the project in Notion, and conducted lightweight UX research, while my teammates took ownership of UX/UI, branding, and 3D design. Our collective effort earned an A+, excited faculty, and was showcased to junior students.
CYCLE 2
SOLO DESIGN SPRINT (2 WEEKS)
Research (User Interviews, Competitive Analysis)
AI-Integrated Workflow
LLM Feature
Design and Business Strategy
Deepening User Research
To evolve this app, I conducted user interviews and a competitive analysis, which pointed to opportunities in wayfinding, education, and marketing.
New "How Might We"s
Simplify route planning
Support deeper learning
Set up a viable business plan
New Personas and Opportunities
In addition to Timmy, our primary persona, this research identified more scenarios and personas.
Madeline
Secondary User Persona: 38 Years Old · Mother of 3 · Busy Professional
"Visiting as a family 5 is always a lot to handle for my husband and I. Route planning, navigation, and handling every child's unique demands is a huge hurdle for us."
Mateo
Tertiary User Persona: 66 Years Old · Retired Small Business Owner
"The pain in my knees wishes we were guided through the main attractions via the shortest route possible. I also would've liked to see more conservation stories and such — my wife and I are always eager to learn more about how to make the world a more beautiful place for the future generations."
Celine
Quaternary User Persona: 21 Years Old · Undergraduate Student · Blogger
"Social media led me to visit the zoo today — my friend posted a super cute picture with the new baby fox!" "I wish there were cool events for people my age like night-time safaris."
Key UX Updates
To achieve the new HMWs, I added an LLM feature, changed the AR strategy, and enhanced business considerations.
Option 1: Preset Routes
(High Constraints Context)
OR
Option 2: LLM-Based Route Customization
(Low Constraints Context)
I integrated LLM-based route planning to enhance personalization and efficiency for users, with preset routes available as a fallback depending on technical and business constraints.
Users have unique needs during their visit, such as limits on physical exertion or favourite animals, yet they don't want to spend time on complex route-planning processes. An LLM offers a stress-free, low-barrier, and natural way to plan a visit.
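As a rough illustration of this flow, here is a minimal C# sketch that passes free-form visitor needs to a chat-style LLM endpoint and falls back to a preset route when the model is unavailable. The endpoint URL, request schema, and fallback identifier are placeholders, not a production integration.

```csharp
using System.Net.Http;
using System.Text;
using System.Threading.Tasks;

public static class RoutePlanner
{
    static readonly HttpClient Http = new HttpClient();

    // Turns free-form needs ("bad knees, loves sloths, two hours max") into an
    // ordered list of exhibit stops; on failure, serves a curated preset route.
    public static async Task<string> PlanRouteAsync(string visitorNeeds, string zooMapJson)
    {
        string prompt =
            "You are a zoo route planner. Using this map of exhibits and walking " +
            $"distances: {zooMapJson}\n" +
            $"Visitor constraints: {visitorNeeds}\n" +
            "Return an ordered list of exhibit stops that respects the constraints.";

        try
        {
            var body = new StringContent(
                "{\"prompt\": " + System.Text.Json.JsonSerializer.Serialize(prompt) + "}",
                Encoding.UTF8, "application/json");
            // Placeholder endpoint; a real build would call the chosen LLM provider.
            HttpResponseMessage res = await Http.PostAsync("https://example.com/v1/plan-route", body);
            res.EnsureSuccessStatusCode();
            return await res.Content.ReadAsStringAsync();
        }
        catch (HttpRequestException)
        {
            // Option 1 fallback: preset routes keep the feature usable offline
            // or under tight technical and business constraints.
            return "preset:shortest-main-attractions";
        }
    }
}
```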
Cycle 1: Marker-Based AR

Cycle 2: SLAM (Simultaneous Localization and Mapping)-Based AR
Switching to SLAM-based AR, with critters anchored to GPS locations rather than printed markers, avoids overcrowding near physical markers and enables the experience to scale more seamlessly.
Marker-based AR was selected initially because it minimized the number of interactions needed to access AR content, which is useful for less crowded zoos.
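To make the contrast concrete, here is a minimal Unity AR Foundation sketch of both strategies. The CritterSpawner component and its spawn logic are illustrative assumptions rather than the shipped prototype.

```csharp
using System.Collections.Generic;
using UnityEngine;
using UnityEngine.XR.ARFoundation;
using UnityEngine.XR.ARSubsystems;

public class CritterSpawner : MonoBehaviour
{
    [SerializeField] GameObject critterPrefab;
    [SerializeField] ARTrackedImageManager imageManager; // Cycle 1: marker-based
    [SerializeField] ARRaycastManager raycastManager;    // Cycle 2: SLAM-based

    static readonly List<ARRaycastHit> Hits = new List<ARRaycastHit>();

    void OnEnable()  => imageManager.trackedImagesChanged += OnMarkerFound;
    void OnDisable() => imageManager.trackedImagesChanged -= OnMarkerFound;

    // Cycle 1: a critter appears only where a printed marker is recognized,
    // so visitors must gather around the same physical spot.
    void OnMarkerFound(ARTrackedImagesChangedEventArgs args)
    {
        foreach (ARTrackedImage marker in args.added)
            Instantiate(critterPrefab, marker.transform.position, marker.transform.rotation);
    }

    // Cycle 2: SLAM tracks the environment itself, so a critter can be placed
    // on any detected surface the visitor taps; no shared marker, no crowding.
    public void OnScreenTap(Vector2 screenPoint)
    {
        if (raycastManager.Raycast(screenPoint, Hits, TrackableType.PlaneWithinPolygon))
            Instantiate(critterPrefab, Hits[0].pose.position, Hits[0].pose.rotation);
    }
}
```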
I envision this as a consolidated tool that partners with multiple zoos, rather than a one-off experience, to increase user retention and engagement for all partnering zoos.
User retention can:
Motivate more frequent visits to and stronger connection with partnering zoos (e.g. event notifications, critter updates, seasonal collectibles, etc.)
Nurture stronger wildlife compassion and eco-consciousness in the general public
Sustain a viable, scalable B2B business model for the product team
Cycle 2: QR-Code Based UX Specificity within One App
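A minimal sketch of how a single app could branch on a scanned QR code to load zoo-specific content; the deep-link payload format and the ZooConfigs resource layout are assumptions for illustration.

```csharp
using System;
using UnityEngine;

public class ZooSelector : MonoBehaviour
{
    // Called with the decoded QR payload, e.g. "crittersafari://zoo/toronto".
    public void OnQrScanned(string payload)
    {
        var uri = new Uri(payload);
        string zooId = uri.Segments[uri.Segments.Length - 1]; // e.g. "toronto"

        // Load that zoo's map, critter roster, and branding so one consolidated
        // app can serve every partnering zoo with zoo-specific UX.
        TextAsset config = Resources.Load<TextAsset>($"ZooConfigs/{zooId}");
        Debug.Log($"Loaded config for zoo '{zooId}': {config != null}");
    }
}
```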
Wireframe Guerrilla Testing
I tested this wireflow with 5 potential users via think-aloud sessions, integrating AI-generated visual assets so that interface intentions were clear to testers.
User Testing-Driven Iterations
Insights from users about increasing engagement and reducing interaction cost drove several improvements.
Switched to a seasonal, rotating cast of AR critters
Why This Works
Gamification that encourages zoo revisits
Newly inserted screen
Added selfie option with AR critters
Why This Works
Improves social media presence and influence
Reduced interactions in route selection screen
Why This Works
Makes the task faster for users

Visual Identity
I found Midjourney especially helpful for exploring color palettes, and I synthesized the rest of the visual identity manually.
Final Prototype
The final output, which uses genAI-based visual assets, is a concept for a user-friendly, viable app that also intends to nurture ecological empathy.
Reduces friction for users like Madeline and Mateo who have very specific route planning needs.
Namely, the mockup uses the user-validated option B interaction and ensures critter visibility for Timmy.
Enables the scalable, GPS-anchored AR strategy while resolving Madeline's and Mateo's concerns about ease of navigation.
Next Steps
I'd take this concept further by connecting with various zoos to identify their unique needs and constraints, and by assembling a team accordingly. To gauge success, I'd track metrics like retention and ARPU (average revenue per user).
Reflections
AI-generated visual assets were a major time saver in creating this concept product, but it remains debatable whether such assets should be used in a commercially launched product.
In support of GenAI art, I believe there is artistry in selecting and refining GenAI outputs. As an analogy, photography was initially scrutinized and seen as a means of documentation rather than art; as the medium evolved, composing, framing, and editing came to be recognized as an art form. I think something similar will likely happen with GenAI, so I'm excited and strongly motivated to explore new AI tools and ways of thinking.
Moreover, I believe referencing the works of others is a deeply human process. Throughout history, art has built on what came before, often without clear attribution to all of the sources that inspired it. Now, AI can carry out this very human intent alongside a human user at superhuman speed.
Nevertheless, until we have fully explored how to protect traditional artists from unprecedented displacement by AI and set up systems for greater transparency, I believe final artwork in commercially launched products should be made by a human hand, with GenAI used for inspiration only.
The cross-functional team experience in this project taught me the value of honest, clear, and considerate feedback in enabling growth and collaboration.
The rapid design sprint allowed me to reflect on my leanest design process and practice agility.