Super.com Referrals
Design Management, Vision and Strategy, End-to-End Product Design
Team
Myself, a product manager, a product designer, a product design intern, an engineering team, a content designer, and a brand designer
Platforms
Native iOS and Android apps, mobile and desktop web
Timeline
4 sprints over 8 weeks
The Challenge
The company had previously launched a failed referrals program that asked users to refer Super+, a premium subscription with little perceived value. Without a strong incentive, adoption was low.
This time, leadership wanted to leverage a proven in-app feature—Earnings—where users could make real money through offerwalls. The goal was to create a referral system that incentivized sharing and sustained engagement.
As a hybrid IC and design manager, I supported the design effort, managed an intern and a senior designer, and advocated for research, process improvements, and higher-quality execution within a fast-moving, delivery-focused team.
Objectives
For our customers
A referrer would earn unlimited 15% commissions from earnings of referred users, forever.
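To illustrate the model: if a referred user earned $100 through offerwalls, their referrer would receive $15, with no cap and no expiry.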
For our business
Add a new, retentive feature for customers that would:
Lower the cost to acquire customers
Increase net revenue from earnings
Accelerate user growth
Discover & Define
Competitive analysis
Validated commission-based incentive model
Narrowed design choices for the two-sided experience
Identified industry best practices
User journey mapping
Mapped both the Referrer and the Referred experiences as journey maps
Built customer empathy with the cross-functional team and stakeholders
Surfaced unexpected pain points
Clarified achievable scope
Uncovered overlooked experience needs
Setting a direction
Future exploration wireframes
Aligned cross-functional team and stakeholders
Prioritized order of feature development
Influenced roadmap
Synthesized research, best practices, and industry standards into a single artifact
Rapid usability testing
Unmoderated usability test of both sides of the experience
Though formally out of scope within the PDLC, I independently planned, scripted, designed, prototyped, analyzed, and reported findings from unmoderated testing on usertesting.com
Validated core user flows for Referrer and Referred
Identified unforeseen pain points that required iteration before release
Uncovered a key insight that drove future planning and cross-functional collaboration: Participants claimed they wouldn’t refer unless they trusted the platform to deliver on promises. Since we didn’t own experiences outside these flows, solving for this trust issue required support from a partner vertical.
Referrer MVP
The Referrer MVP was scoped to the primary entry points and the primary landing page. We would gather data on user behavior, then focus additional resources on the Referred MVP.
Small wins, outsized impact
Custom banner component
I designed and drove implementation for the most successful referrals entry point (the pink banner animated from the bottom of the screen).
I identified the opportunity based on existing data and understanding of user behavior, but it was initially scoped out of MVP.
To get it rescoped, I leveraged my relationships with engineering to find a feasible approach. We built and shipped the component, resulting in:
The highest percentage of successful referrals
The highest conversion rate (CVR) from click to link sent
The second-highest overall click-through rate (CTR)
Referred MVP
The Referred MVP was scoped to a linear onboarding flow with gamified activities.
MVP Launch Prototypes
Referrer user journey
You are invited to make your first referral
Referred user journey
Your friend sends you an invitation to sign up for Super.com
MVP Impact
A K-factor of 1.0+ means a product is viral, and achieving that was the referral program's ultimate goal. It took 3 weeks for the experiment to reach statistical significance.
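For context, K-factor is conventionally calculated as the average number of invites each user sends multiplied by the conversion rate of those invites: K = i × c. At K = 0.20, every 100 existing users would bring in roughly 20 new ones; at or above 1.0, each user brings in at least one more, and growth compounds on its own.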
Our primary KR was a K-factor of 0.20 (20%); we achieved 0.022 (2.2%)
Our secondary KR was acquiring 3k monthly earners (not just sign-ups); we acquired 774
Our team knew these launch KRs were likely out of reach, but we learned a lot. One important learning: we had assumed the financial incentive alone would be enough to motivate our customers to refer another user. That assumption was wrong, and post-launch iterations focused on remedying it.
Learnings
Super.com is a data-driven, highly iterative company. I have never worked anywhere quite like it. The shipping cadence was incredibly fast, with a release cycle measured in days, not weeks or months. The final MVP lacked many of the features I would have wanted, but that wasn't the goal of the initial release; the goal was to gather data to make more informed decisions for the next iteration. To that end, we immediately got to work identifying which user cohorts were most engaged, which entry points were most effective, and whether the program had the potential to scale.
For additional details on this case study, reach out and let's set up a time to chat!