Transforming Strategic Research Capacity through Democratization

Marjorie Stainback
Research Operations Manager, Fidelity Investments
Kelsey Kingman
Senior User Researcher, Fidelity Investments
From the DesignOps Summit website:
What do you do when you have 8x more designers than user researchers in an organization? Learn how we empowered designers to conduct local-level evaluative research, gave researchers more time to work on global, strategic research, and transformed our user research delivery practice. We will share what key ingredients made our transformation possible and what pitfalls to avoid when bringing research democratization to your own organization.
Do you have more designers than researchers? Have you ever had to wait for research? Yes … pretty much everyone in the audience has had those challenges.
Fidelity has been around since 1946 – a bit of a dinosaur. They operated in Waterfall for 22 years after their website launched in 1996, so Agile is still very new at Fidelity. They re-organized and ended up with 450 scrum teams! After the dust settled … they had one researcher for every eight designers.

They figured out that it was taking an average of 14 days from intake to delivery for a research request. As a result, teams were going rogue or skipping research altogether. They knew that 85% of their research was evaluative – not as much strategic research as they would have liked. The goal was to make research available to more scrum teams, while freeing up researchers for more strategic work.

Through their program, they provided both education and enablement.

They have had great success so far. They trained 125 designers who have launched 215 studies since the program’s inception. Skipping research is no longer an issue. That has also allowed the research team to change priorities: they went from 15% to 56% strategic research. By enabling designers to do their own research, they also cut intake-to-delivery time by 11 days (from 14 days to 3).
They are going to walk us through the following:
- Program and Measurement
- Key Ingredients
- Pitfalls to Avoid
Program and Measurement
They had a hypothesis that they could teach ONE method and help drive timelines down. They had to figure out how to be quick. The method had to support geographically dispersed teams. And it had to be evaluative, because that was where they were facing issues of scale.
They taught designers how to do remote, unmoderated usability studies, enabling teams to get feedback in a day. It also allowed them to measure usage of research services, which established the magnitude of the problem and helped them track progress over time.
They have a research repository and an intake form, which is how they measured the 14 days. They were able to track designers doing research, study completion, and collect feedback. By defining and capturing the problem, they were able to track progress against their goals.
Key Ingredients
A key factor for them was establishing a shared vision between researchers, designers, product managers, and stakeholders. The move to Agile and a company culture that valued learning were a good combination. It also helped improve empathy from designers towards researchers. Stickers further helped to spread the word.
They developed a course for designers to learn how to conduct unmoderated research:

On Day 2, students learned how to build out the test, and on Day 3 they learned how to take notes and draw conclusions. But the bulk of the training was in analysis.
They created some guardrails to ensure good test questions. Students were also assigned a research buddy who would review their test plan. They also received presentation feedback ahead of time. The upload into the repository provided a final opportunity for review.
Pitfalls
Here are some things that went wrong:
- Their first pilot was in December, and it was six weeks long – they had only one graduate because of holidays and vacations.
- Do NOT allow unlimited sign-ups. She once ended up with nine students, and she was unable to give them the feedback they needed in order to be successful. Six students is ideal, seven is the max.
- They asked for feedback, but failed to save time to implement it – build breaks into the course schedule so you can learn and adapt accordingly.
- If you want to stress out your students / designers, don’t set expectations properly. Kick-off was scheduled a week before the class, which helped make clear what kind of prototype to use for the testing. There was also a website, but the thirty-minute phone call made a big difference in helping people feel prepared. Designers should also take on less work so they can focus on the course.
- Articulate your purpose in just a few words. Make sure to communicate and overcommunicate; there was a lot of miscommunication and misunderstanding that needed to be addressed.
- And, plan to scale from the beginning. They bought a lot of licenses, but they hadn’t really thought through the burnout for the researchers involved in teaching.
Key takeaways
