# User Research ROI Is Demonstrably Positive — Institutions That Skip It Are Making a Financially Irrational Decision
## The Claim
User research is consistently framed as a budget-constrained luxury rather than a return-generating investment. The evidence from EvolveDigital suggests this framing is financially wrong: user research, even when conducted by a single internal practitioner on a minimal recruitment budget, delivers measurable improvements in conversion, search visibility, and session quality that return multiples of the research cost.
## The UTSC Case Study
Adie Margineanu's presentation at EvolveDigital provided the most rigorous cost-benefit analysis of user research investment in the corpus. The UTSC admissions website redesign included five distinct research studies across a 10-month project timeline:
1. Tree testing (~20 participants) to validate information architecture
2. Low-fidelity prototype usability testing (~10 participants)
3. Medium-fidelity prototype usability testing (~10 participants)
4. Sentiment testing (60 participants) for visual design selection
5. Post-launch usability testing (12 participants, ~$1,500 recruitment cost)
The total cost, internal researcher time plus participant recruitment, came to less than 10% of the overall project budget. The research consumed roughly 20% of the timeline and ran concurrently with design and development, causing no delays.
Outcomes measured eight weeks post-launch:

- Google average position improved from 10.7 to 6.5
- Sessions increased 10% year-over-year, despite sector-wide declines attributed to AI
- Session duration rose 3.8% site-wide and 19% on program pages
- **Conversion (Apply Now clicks) increased 62% in the first eight weeks, during peak admissions season**
For a Canadian university dependent on enrollment revenue, a 62% conversion lift in peak admissions season represents a financial impact that dwarfs the research cost by orders of magnitude.
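The "orders of magnitude" claim can be made concrete with a back-of-envelope calculation. A minimal sketch follows; note that apart from the measured 62% lift and the ~$1,500 recruitment figure, every input (baseline click volume, click-to-enrolment rate, revenue per enrolment, total research cost) is a hypothetical assumption for illustration, not data from the case study.

```python
def research_roi(baseline_clicks: int, lift: float, click_to_enrol: float,
                 revenue_per_enrolment: float, research_cost: float):
    """Return (incremental revenue, ROI multiple) under the stated assumptions."""
    extra_clicks = baseline_clicks * lift              # additional Apply Now clicks
    extra_revenue = extra_clicks * click_to_enrol * revenue_per_enrolment
    return extra_revenue, extra_revenue / research_cost

revenue, multiple = research_roi(
    baseline_clicks=5_000,         # assumed Apply Now clicks in the period
    lift=0.62,                     # measured: 62% conversion increase
    click_to_enrol=0.02,           # assumed click-to-enrolment rate
    revenue_per_enrolment=30_000,  # assumed multi-year tuition value (CAD)
    research_cost=20_000,          # assumed researcher time + ~$1,500 recruitment
)
print(f"Incremental revenue ~ ${revenue:,.0f}; ROI multiple ~ {multiple:.0f}x")
# prints: Incremental revenue ~ $1,860,000; ROI multiple ~ 93x
```

Even if the assumed conversion and revenue figures are off by a factor of ten, the return still exceeds the research cost several times over, which is the substance of the ROI argument.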
## Dismantling the Objections
Margineanu structured her entire presentation around the three standard leadership objections — lack of time, money, and resources — and addressed each with concrete evidence:
- **No time**: Research ran concurrently with design and development without causing delays
- **No money**: Recruitment was the only external cost; the researcher was internal. Post-launch testing recruitment came to ~$1,500.
- **No resources**: The entire research program was executed by a single internal practitioner
## The Stakeholder Override Value
Sheridan College's homepage slider case study adds a different dimension to the ROI argument. Nicole Woodall's team used user research to override strong political pressure from multiple stakeholders to retain the slider. Without the research, the team would have designed to internal politics rather than user behavior — a common and costly outcome in large organizations. The research cost of the slider decision was minimal; the cost of designing to stakeholder preference rather than user behavior would have been embedded in every future iteration.
## The Methodological Caveat
The UTSC case is compelling but represents a single data point. Margineanu herself acknowledged it as a minimum viable research program with methodological shortcuts. And the corpus does not include a comparator — a site redesign that skipped research and underperformed. The causal story is inferential. The evidence supports the ROI claim but cannot prove it definitively from a single case.