Project Overview
The Problem
Recruiting participants for B2B research is notoriously difficult, particularly in an environment where no formal incentivization model exists. For our research team, gathering insights from migrated users of Product A was essential to understanding post-migration behavior and informing strategic decisions.
However, despite launching a survey, participation rates remained extremely low. The lack of proactive engagement and motivation highlighted the need for a structured incentivization strategy to improve response rates without exhausting research funds.
The Solution
To address this challenge, I designed a data-driven incentivization pilot that introduced a randomized, budget-conscious A/B testing model. This approach allowed us to measure the impact of incentives on engagement while optimizing the budget for broader research initiatives.
Separately, another part of the organization recognized the value of this incentivization approach for their own research needs and adopted a scaled-up version of the model to support their participant recruitment efforts. This resulted in a successful funding request of $20,000 to expand research incentives. While this initiative is still ongoing, my original proposal laid the foundation for a scalable, cost-effective UX research recruitment strategy across multiple research teams.
My Role
As the lead on this project, I:
- Identified the research challenge and developed a structured incentivization strategy to boost participation.
- Designed the A/B testing framework, balancing cost-efficiency with engagement goals.
- Authored and pitched the business case, securing a $750 budget for the initial pilot.
- Led implementation, overseeing survey distribution, incentive allocation, and engagement tracking.
- Created a scalable model, which was later adopted by another department and expanded into a $20,000 funding request.
Research Process & Strategy
Background & Business Case
Understanding user behavior post-migration was critical for evaluating the forced migration project. However, due to low participation in our initial survey, we needed a structured approach to:
✔️ Increase participation without exceeding budget constraints
✔️ Validate the impact of incentives through A/B testing
✔️ Create a repeatable model that could be scaled across research teams
Proposed A/B Testing Approach
To determine whether incentives meaningfully impact engagement, I structured the study as follows:
- User Group Segmentation (n=91)
- Group A (Control) – Standard survey invitation (no incentive).
- Group B (Test) – Survey invitation offering a randomly selected $50 incentive for a subset of participants.
- Incentive Allocation & Budget Optimization
- 5 randomly selected participants from Group B received $50 each.
- Total incentive pool: $250 out of an approved $750 budget, ensuring $500 was preserved for future research efforts.
- Implementation Strategy
- Tailored email outreach to both groups.
- Random selection methodology to ensure fairness.
- Virtual incentives distribution to simplify reward fulfillment.
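The random selection step above can be sketched in a few lines. This is a hypothetical illustration, not the tooling actually used in the pilot: the participant IDs, seed, and function name are invented, while the numbers (n=91, 5 winners at $50, $250 total) mirror the study design described above. Seeding the random number generator makes the draw reproducible and auditable, which supports the fairness goal.

```python
import random

def run_incentive_draw(test_group, n_winners=5, reward=50, seed=None):
    """Uniformly select incentive recipients from the test group.

    Hypothetical sketch of the pilot's draw: sampling without
    replacement gives every test-group member an equal chance.
    """
    rng = random.Random(seed)  # seeded RNG keeps the draw reproducible
    winners = rng.sample(test_group, n_winners)
    total_cost = n_winners * reward
    return winners, total_cost

# Split 91 invitees into a control group and a test group,
# then draw winners from the test group only.
invitees = [f"user_{i:02d}" for i in range(91)]
rng = random.Random(42)
rng.shuffle(invitees)
group_a = invitees[:45]   # control: no incentive mentioned
group_b = invitees[45:]   # test: $50 draw advertised in the invitation
winners, cost = run_incentive_draw(group_b, seed=42)
print(len(winners), cost)  # 5 winners, $250 total
```

The $250 total leaves $500 of the approved $750 budget untouched, matching the budget-preservation goal stated above.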
Business Value Proposition
The scaled-down incentive model was designed to:
✅ Increase Engagement – Incentives encourage higher response rates, leading to richer data and deeper insights.
✅ Optimize Research Budget – Randomized incentives maintain participation while controlling costs.
✅ Create a Scalable Model – A/B test results serve as a data-backed framework for future research incentives.
Scaling & Adoption Across the Organization
How My Work Was Adapted & Expanded
Following the success of my initial pilot, a colleague repurposed my framework for a separate research initiative in the F&I business area.
🔹 The new proposal scaled up incentives and expanded participant outreach beyond surveys to include interviews and other research methods.
🔹 Using my original framework, the new funding request secured $20,000, a significant increase from my initial $750 budget.
🔹 The larger initiative is still ongoing, with no final results yet available, but the impact of my original proposal is clear: it established a scalable approach for incentivizing research participation across the organization.
Key Differences Between My Proposal & the Expanded Version
| Aspect | Original Proposal (My Work) | Expanded Proposal (Adapted Work) |
| --- | --- | --- |
| Research Area | Payments Efficiency (Product A) | Identity Verification (F&I) |
| Initial Budget | $750 | $7,000, scaled to $20,000 |
| Methodology | A/B Testing (91 participants) | Larger-scale participant outreach |
| Incentive Pool | $250 (5 participants at $50) | $7,000+ for survey and interview incentives |
| Expansion Outcome | Preserved $500 for future research | Justified significant increase in funding |
Challenges & Ethical Considerations
1️⃣ Research Attribution & Recognition
As my proposal was replicated and expanded, proper attribution was not initially given. While knowledge-sharing is crucial in UX research, ensuring that contributions are acknowledged is equally important in professional and collaborative settings.
2️⃣ Scaling Without Validation
The budget increase from $750 to $20,000 was based on my unvalidated pilot, meaning the model was scaled before its long-term effectiveness was confirmed. This raises questions about risk management in research funding decisions.
3️⃣ Ethical Collaboration in UX Research
Collaboration is essential in research, but organizations should establish clear guidelines on:
✅ How research contributions are documented
✅ How attribution is given when work is expanded upon
✅ How funding requests should be structured to ensure research methodologies are validated before large-scale adoption
Business Outcomes & Next Steps
Leadership Adoption & Ongoing Impact
- Leadership approved my initial incentivization pilot, confirming that targeted incentives can drive engagement.
- The expanded version of my proposal secured $20,000 in funding, demonstrating its broader organizational impact.
- While the expanded initiative is still ongoing, my original work set a precedent for using structured incentives in research recruitment.
Future Strategic Value
- The incentive-based recruitment model has now been recognized as a viable research strategy.
- Future teams can leverage insights from my pilot and the scaled initiative to refine their participant engagement approaches.
- The importance of research attribution and validation before large-scale funding approval is now a topic of discussion among stakeholders.
Conclusion
This research initiative successfully validated the potential impact of structured participant incentives in B2B UX research. By carefully designing a controlled, budget-conscious A/B testing approach, I was able to demonstrate the viability of incentives in increasing engagement, leading to the adoption of my framework across the organization.
While the scaled-up version of my work is still ongoing, its approval for $20,000 in funding highlights the strategic value and long-term impact of my initial research. This case also underscores the importance of research attribution, ethical collaboration, and validating methodologies before large-scale expansion.