Adie Margineanu, User Experience Lead at the University of Toronto Scarborough (UTSC), presents a detailed case study of how a disciplined, multi-phase user research program drove the redesign of the university's admissions website over a 10-month period. The project involved five distinct research studies engaging 194 prospective student participants in total, with Margineanu as the sole dedicated researcher — representing less than 9% of the delivery team.
The talk is structured around three recurring objections from leadership to user research — lack of time, money, and resources — and systematically dismantles each. Research consumed approximately 20% of the project timeline and less than 10% of total project costs (recruitment only, since Margineanu was an internal resource), and it ran concurrently with design and development without causing delays.
The research program followed the product lifecycle with four key decision points. First, a tree test with approximately 20 prospective students validated the proposed information architecture, revealing that users preferred a journey-based global navigation (programs → applying → finances → campus) over one organized by institutional departments. Second, two rounds of prototype usability testing (one low-fidelity, one medium-fidelity) with roughly 10 participants each surfaced critical usability failures in the program finder: users were confused by advanced filters placed above the fold and assumed they had to fill them out before seeing any programs. These findings led to simplified filters, progressive disclosure of prerequisites, and surfacing the co-op checkbox; because international students in particular did not understand the term co-op, a tooltip was added. Third, sentiment testing with 60 participants (split across two visual design options) used a validated word bank developed with the brand team to determine which concept was more on-brand. Although stakeholders preferred Option A, the data clearly showed Option B was far more brand-adherent, and leadership accepted the result without pushback. Fourth, post-launch usability testing with 12 participants (~$1,500 in recruitment costs) validated that the program finder worked at scale with real programs.
Outcomes were measurable: average Google Search position improved from 10.7 to 6.5; sessions increased 10% year-over-year despite sector-wide declines attributed to AI; session duration rose 3.8% site-wide and 19% on program pages; program page sessions increased 18% compared to the same period the prior year; and conversion (clicks on 'Apply Now') increased 62% in the first eight weeks post-launch, during peak admissions season.
Margineanu also surfaces lessons learned: wireframes should use real or directional content (not Lorem Ipsum); vendor discovery phases need tighter integration; institutional content governance gaps limited the specificity of eligibility and fee information; and development starting before research was complete created feature constraints. She identifies the initiative as a minimum viable research program — card sorting was skipped, visual design testing was less robust than ideal, and proxy users were recruited under time pressure. The session closes with confirmation that the pilot established a repeatable research model with executive buy-in and dedicated budget going forward.
I'm very excited for this talk. I know a little bit about it, so I'm excited for everyone else to learn more about this project. It's about the strategic value of user research. Adie has been working in user experience research, design, and strategy for the past 14 years and has been leading user experience at U of T Scarborough for the last three, and this talk is about making user research a feasible, sound investment for teams to adapt to...