PROJECT 1: Peer Group Institute (PGI)
Data-Driven Design through Statistical Rigor
Role: UX Researcher
Duration: 14 weeks
Team: Developers, executive leadership
Overview
Led an extensive user research initiative for Peer Group Institute. Collaborated closely with executives to advise on a project scope that balanced company benefit with profitability, and implemented a rigorous 7-step interview recruitment process and statistical analysis protocols to ensure research cleanliness and validity.
Background
Peer Group Institute required research to inform design decisions that would impact both user experience and business outcomes. The project emphasized statistical rigor and data cleanliness to ensure reliable insights for executive decision-making.
Research Questions
What user needs and behaviors should inform the design scope to maximize both user satisfaction and business profitability?
How can research findings be presented with statistical validity to support executive decision-making?
What are the key user pain points and opportunities for improvement?
Methodology
Executive Collaboration
Collaborated extensively with executive leadership to advise on an optimal scope that balanced company benefit and profitability.
Balanced user needs with business objectives throughout research and design process.
7-Step Interview Recruitment Process
1. Pre-screener development and distribution via Qualtrics, using branching logic to filter participants on demographics, usage patterns, and eligibility criteria.
2. Screening questionnaire implementation with validated scales and open-ended questions to assess participant fit and gather preliminary insights.
3. Participant qualification and verification through systematic review of responses, ensuring sample representativeness and diversity (a simplified sketch of this filtering step follows the list).
4. Scheduling coordination with calendar management tools, confirmation emails, and participation-rate tracking in Excel.
5. Pre-interview preparation package distribution, including consent forms, the interview agenda, and technical setup instructions.
6. Eligibility confirmation call, verifying participant availability and technical readiness and answering questions to reduce no-show rates.
7. Final confirmation and researcher preparation, including reminder communications, technical setup verification, and updated protocols.
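To illustrate step 3, here is a minimal sketch of how screener responses might be filtered for eligibility once exported from Qualtrics. The file name, column names, and criteria below are hypothetical stand-ins, not the study's actual screener logic.

```python
import pandas as pd

# Load the Qualtrics screener export (hypothetical file and column names).
responses = pd.read_csv("screener_export.csv")

# Eligibility criteria -- illustrative values, not the actual study criteria.
eligible = responses[
    (responses["age"] >= 18)
    & (responses["uses_product_weekly"] == "Yes")
    & (responses["attention_check"] == "Agree")  # failed checks are screened out
]

# Inspect segment balance so the final sample stays representative and diverse.
print(eligible["user_segment"].value_counts())

# Track invited vs. completed sessions to monitor the no-show rate.
invited, completed = 24, 21  # example counts
print(f"No-show rate: {1 - completed / invited:.0%}")
```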
User Interview Process
Conducted group interviews in addition to one-on-one sessions.
Implemented structured interview protocols.
Ensured participant diversity and representativeness.
Data Collection & Analysis
Designed and deployed comprehensive surveys in Qualtrics with advanced question types, skip logic, and validation rules to ensure data quality.
Emphasized statistical cleanliness, including careful survey design to minimize response bias, embedded attention checks, and systematic data validation procedures.
Implemented rigorous data validation and cleaning procedures in Excel, including outlier detection, missing data analysis, and consistency checks.
Applied appropriate statistical tests (t-tests, chi-square tests, correlation analysis) using statistical software to ensure findings were statistically sound and generalizable; a simplified sketch of this pipeline follows the list.
Documented all statistical analysis procedures, assumptions, and limitations in comprehensive research reports.
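The sketch below shows what a cleaning-and-testing pipeline like this can look like in Python with pandas and SciPy: a missing-data audit, attention-check filtering, z-score outlier removal, and the three test families named above. The file and column names are hypothetical; the study's actual cleaning happened in Excel and dedicated statistical software.

```python
import numpy as np
import pandas as pd
from scipy import stats

# Hypothetical survey export; column names are illustrative.
df = pd.read_csv("survey_export.csv")

# --- Cleaning: missing-data audit, attention checks, outlier detection ---
print(df.isna().mean())                               # share of missing values per column
df = df[df["attention_check"] == "expected_answer"]   # drop inattentive respondents

# Flag task-time outliers beyond 3 standard deviations (z-score rule).
z = np.abs(stats.zscore(df["task_time_sec"], nan_policy="omit"))
df = df[z < 3]

# --- Analysis: the three test families used in the study ---
# Correlation between task completion and satisfaction.
r, p_r = stats.pearsonr(df["completion_rate"], df["satisfaction"])

# Chi-square test of feature adoption across user segments.
table = pd.crosstab(df["user_segment"], df["adopted_feature"])
chi2, p_chi, dof, _ = stats.chi2_contingency(table)

# Paired t-test of engagement before vs. after the design change.
t, p_t = stats.ttest_rel(df["engagement_before"], df["engagement_after"])
```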
Remote User Experience Analysis
Utilized Hotjar heatmaps, session recordings, and user flow analysis to understand behavioral patterns and identify usability issues.
Collected behavioral data (click patterns, scroll depth, time-on-page, exit points) to complement qualitative interview findings, triangulating insights across multiple data sources.
Correlated quantitative Hotjar data with qualitative interview insights to build comprehensive understanding and validate findings across research methods (a brief sketch of this triangulation follows).
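A minimal sketch of that triangulation, assuming page-level Hotjar metrics and per-page interview pain-point counts have each been exported to CSV; the files and columns are hypothetical:

```python
import pandas as pd
from scipy import stats

# Hypothetical exports: page-level behavioral metrics and interview
# pain-point codes aggregated per page. Column names are illustrative.
behavior = pd.read_csv("hotjar_page_metrics.csv")  # page, exit_rate, avg_scroll_depth
interviews = pd.read_csv("interview_codes.csv")    # page, confusion_mentions

merged = behavior.merge(interviews, on="page")

# Does the exit rate seen in recordings track the confusion reported in
# interviews? A significant positive r supports the triangulated finding.
r, p = stats.pearsonr(merged["exit_rate"], merged["confusion_mentions"])
print(f"r = {r:.2f}, p = {p:.3f}")
```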
Design Implementation & QA
Code-based implementation using HTML, CSS, and JavaScript (no website builder), enabling full control over design implementation.
Direct collaboration with development team, providing detailed design specifications, code snippets, and ongoing support during implementation.
Conducted code reviews and design QA to validate that implementation matched design intent and maintained usability standards.
Analysis & Findings
Statistical Analysis
Task completion rates and user satisfaction correlation:
Significant positive correlation (r = 0.72, p < 0.01) between task completion rates and user satisfaction, indicating that improved usability directly increases satisfaction and validating the focus on usability improvements.
User segment differences in feature adoption:
Chi-square analysis revealed statistically significant differences (χ² = 15.3, p < 0.05) in feature adoption patterns across segments, informing feature prioritization and personalization strategies.
Effect of design changes on engagement metrics:
Paired t-tests showed statistically significant improvements (p < 0.05) in engagement metrics with moderate to large practical significance (Cohen's d = 0.65); a minimal effect-size sketch follows.
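For the effect-size arithmetic, here is a minimal sketch of a paired t-test with Cohen's d for paired samples, computed as the mean difference over the standard deviation of the differences. The scores are invented for illustration and do not reproduce the study's values.

```python
import numpy as np
from scipy import stats

# Hypothetical before/after engagement scores for the same participants.
before = np.array([3.1, 2.8, 3.4, 2.9, 3.0, 3.3, 2.7, 3.2])
after  = np.array([3.4, 2.9, 3.3, 3.5, 3.1, 3.6, 2.8, 3.4])

t, p = stats.ttest_rel(after, before)  # paired t-test

diff = after - before
d = diff.mean() / diff.std(ddof=1)     # Cohen's d for paired samples (d_z)
print(f"t = {t:.2f}, p = {p:.3f}, d = {d:.2f}")
```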
User Insights
Group interviews uncovered shared frustrations and pain points that individual interviews missed, revealing systemic usability issues affecting multiple user segments.
Individual interviews provided detailed insights into personal journeys and context-specific challenges, informing persona development and user journey mapping.
Hotjar behavioral data validated interview findings, showing actual user behavior (click patterns, scroll depth, exit points) that aligned with qualitative feedback, strengthening confidence in research findings.
Business Impact Analysis
Conducted cost-benefit analysis prioritizing high-impact, low-effort improvements that delivered maximum value for both users and business.
Developed ROI projections: 20-25% increase in user engagement and 15-18% improvement in conversion rates based on implemented design changes.
Recommendations & Impact
Research-Informed Design Decisions
Information architecture restructuring
Statistical support: Hotjar analysis revealed 65% of users exited at specific navigation points, correlating with interview-reported confusion (r = 0.68, p < 0.01), strengthening the case for navigation structure redesign.
Feature prioritization based on user segment analysis
Business impact: Chi-square analysis identified statistically significant differences in feature importance across segments, enabling targeted feature development maximizing ROI.
Error prevention and recovery design
User impact: Error rates decreased by 45% (p < 0.001) following implementation of error prevention measures, directly addressing pain points identified through qualitative research.
Scope Optimization
Prioritized high-impact, low-effort improvements: Categorized design recommendations by user impact and implementation effort, enabling executives to focus resources on changes delivering maximum value (a minimal scoring sketch follows this list).
Phased implementation approach: Quick wins delivering immediate user value, followed by medium-term improvements, then long-term enhancements, balancing user needs with business constraints.
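A minimal sketch of the impact/effort scoring behind this prioritization; the recommendation names and scores are illustrative, not the actual backlog:

```python
# Hypothetical recommendation backlog scored on user impact (1-5)
# and implementation effort (1-5).
recommendations = [
    {"name": "Simplify top navigation", "impact": 5, "effort": 2},
    {"name": "Inline form validation",  "impact": 4, "effort": 1},
    {"name": "Rebuild onboarding flow", "impact": 5, "effort": 5},
    {"name": "Reword error messages",   "impact": 3, "effort": 1},
]

# Rank by impact-to-effort ratio: quick wins float to the top,
# long-term enhancements sink to later phases.
for rec in sorted(recommendations, key=lambda r: r["impact"] / r["effort"], reverse=True):
    print(f'{rec["impact"] / rec["effort"]:.1f}  {rec["name"]}')
```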
Reflection
Statistical Rigor
Rigorous statistical analysis and data validation were essential for building executive confidence. The emphasis on statistical cleanliness enabled confident, credible presentations, leading to greater executive buy-in and faster decision-making.
Statistical validation strengthened recommendations: The correlation between Hotjar behavioral data and interview-reported pain points (r = 0.68, p < 0.01) provided compelling evidence that moved recommendations from "nice to have" to "must have," demonstrating the power of combining qualitative and quantitative methods.
Executive Collaboration
Learned to frame user research findings in terms of business impact, translating user needs into business metrics (engagement, conversion, satisfaction) that resonated with executive audiences.
Developed ability to distill complex research findings into clear, actionable insights with quantifiable business impact, emphasizing data-driven recommendations, ROI projections, and strategic implications.
Process Innovation
The systematic 7-step recruitment process significantly improved participant quality and reduced no-show rates, ensuring participants were well-prepared and committed to the research process.
Discovered that triangulating qualitative interview insights with quantitative Hotjar behavioral data created comprehensive understanding that neither method alone could provide. Interviews revealed "why" users behaved certain ways, while Hotjar data showed "what" users actually did.
Group interviews uncovered collective pain points that individual interviews missed, while individual interviews provided nuanced personal journeys, offering comprehensive understanding of both shared and individual experiences.
I also went on to direct redesign and implementation for this project! If you'd like to view that part of the project, please click here.