This document discusses measuring and evaluating a DesignOps practice. It begins by defining DesignOps and its goals of amplifying design value and scaling design teams. It then discusses defining design value through skills like storytelling and prototyping. Various pieces of DesignOps like tools, infrastructure, and governance are outlined. Different types of metrics for measuring DesignOps success are proposed, including quantitative and qualitative data. Key questions for evaluating people, workflow, communications, tools, and governance are provided. The document stresses the importance of understanding business goals and creating a vision of success to measure the right things and ensure DesignOps success.
10. What is scaling?
• Impact to org & customers
• Team sizes
• Number of teams
• Number of systems
• Types and number of integrations
11. The largest obstacle to design success is misalignment around the value proposition that design provides to an organization.
12. Value …
… answers, “why should I come to you?”
… justifies investment through perceived return.
… suggests what should be measured to understand return on that investment.
14. A proposed value of design
• Driving Understanding & Empathy
• Creating Clarity & Behavioral Fit
• Exploration
• Envisioning
15. Some skills to create that value
• Storytelling
• Visual Thinking
• Information Presentation
• Workshop Facilitation
• Prototyping/Simulations
16. Disciplines and their value
Research: move past surface symptoms toward framing problems and needs.
Facilitation: align understanding using tools, frameworks, and visualizations.
Interaction Design: convert understood problems and needs into flows and interactivity models that not only meet needs but also fit behaviors.
Information Architecture: brings clarity by converting data sets into information spaces that help people gain insights, navigate smoothly, and make better decisions.
Visual Design: creates handles, buttons, and information visualizations that allow people to understand possibilities, make better decisions, act within a system with confidence, and know the system's reaction with equal confidence.
19. Different types of metrics
Types
• Quantitative data
• Qualitative data, quantified.
Collection Methods
• Self-reported
• Gathered through automated instrumentation
22. Cascading use of data
Metric: What is available to measure?
Correlation: If we compare the original metric to another metric, can that help us understand the original hypothesis?
Interpretation: What does the correlation tell us? Which direction produces the desired effect?
Desired Threshold: What measure of the metric will tell us we have reached an otherwise qualitative goal?
Trend: How do the prime metric and the correlated metric compare over time? How strong a correlation in the trend would be significant? (differential)
Milestone: What can we map against a timeline to help us understand and interpret possible moments of cause and effect? (such as releases, ship dates)
Baseline: The value of a metric, or a combination of metrics, at the beginning of any initiative.
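The cascading steps above (baseline, correlation, trend) can be sketched in a few lines of code. This is a minimal illustration, not part of the original talk: the metric names and weekly numbers are hypothetical, and the Pearson formula stands in for whatever correlation method your analytics stack provides.

```python
# Sketch of the cascade: baseline, correlation, and trend for two metric
# series. All series names and values below are hypothetical examples.
from statistics import mean

def pearson(xs, ys):
    """Pearson correlation coefficient between two equal-length series."""
    mx, my = mean(xs), mean(ys)
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    var_x = sum((x - mx) ** 2 for x in xs)
    var_y = sum((y - my) ** 2 for y in ys)
    return cov / (var_x * var_y) ** 0.5

# Hypothetical weekly metrics: hours spent designing vs. a usability score.
time_designing = [10, 12, 15, 14, 18, 20]
usability_score = [62, 64, 70, 69, 74, 78]

baseline = time_designing[0]                       # value at the start of the initiative
r = pearson(time_designing, usability_score)       # correlation between the two metrics
trend = usability_score[-1] - usability_score[0]   # simple change over the period

print(f"baseline={baseline}, correlation={r:.2f}, trend={trend:+d}")
```

A strong positive correlation here would support (not prove) the hypothesis that more time designing tracks with better usability; the interpretation step still has to rule out other explanations.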
23. Setting up your measurement*
*partially taken from a case study by Intuit
Metric: Time designing
Data type: Quantitative
Collected by: Self-reporting, using a tool such as Harvest
Desired outcome: Increase design quality &/or designer engagement
Hypothesis: Increasing time designing will increase design quality &/or designer engagement
Measuring desired outcomes: NPS, heuristics, usability testing, customer satisfaction, etc.
24. How can you measure quality?
Is the product organization aligned in its understanding of the value of your design(ing) to the business & their customers?
1. There is no alignment across the product organization.
2. There have been gains in alignment, seen in open trials of design and research activities and processes.
3. Alignment is growing, as seen by more non-designers participating in design activities.
4. Design value is well understood and consistently articulated across the product organization.
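A rubric like the one above is an example of "qualitative data, quantified" from slide 19: each respondent picks a 1-4 level, and the scores can then be aggregated. A minimal sketch, with hypothetical survey responses:

```python
# Sketch of quantifying the 1-4 alignment rubric: aggregate one score per
# respondent. The response values below are hypothetical.
from statistics import mean, median

responses = [2, 3, 3, 2, 4, 3, 1]  # one rubric level per survey respondent

print(f"mean alignment: {mean(responses):.2f}, median: {median(responses)}")
```

Tracking the mean or median over successive surveys turns the qualitative rubric into a trend you can baseline and compare, as described on slide 22.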
26. Vital Signs for DesignOps
The top 3-5 metrics that tell you something might be wrong, or that everything is OK.
Possible examples:
• Number of UX stories that started in a sprint's backlog but didn't get deployed to production.
• Attrition rate within a design team compared to the whole organization.
• Time spent designing/researching.
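A vital-signs check can be as simple as a handful of metrics with trouble thresholds. The sketch below is illustrative only: the metric names echo the examples above, but the current values and thresholds are hypothetical and would come from your own instrumentation.

```python
# Sketch of a "vital signs" check: each metric has a current value, a
# threshold, and a direction ("above"/"below") that signals trouble.
# All values and thresholds below are hypothetical.
VITAL_SIGNS = {
    "ux_stories_not_deployed": (4, 3, "above"),     # stories stuck per sprint
    "design_attrition_vs_org": (1.5, 1.2, "above"), # ratio to whole-org attrition
    "hours_designing_per_week": (14, 16, "below"),  # time actually spent designing
}

def check_vitals(vitals):
    """Return the names of metrics that crossed their trouble threshold."""
    warnings = []
    for name, (value, threshold, trouble) in vitals.items():
        if (trouble == "above" and value > threshold) or \
           (trouble == "below" and value < threshold):
            warnings.append(name)
    return warnings

print(check_vitals(VITAL_SIGNS))  # all three cross their thresholds in this hypothetical
```

Keeping the list to 3-5 metrics, as the slide suggests, makes the check cheap enough to run every sprint.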
28. DesignOps Set of Questions
People
- Does recruitment lead to best-in-class talent being hired?
- Is the team engaged and growing professionally?
- Are the team's values being upheld?
- Are diversity & inclusion upheld as important values?
Workflow
- Are teams meeting the needs of stakeholders?
- How much of the total design process are team members being encouraged &/or allowed to do?
- Are designs regularly being included in shipped goods?
Communications
- Does the team have line of sight into the team & business?
- Is the signal:noise ratio being managed?
Tools
- Is the team able to get the tools they need to be successful & productive?
- Are tools easily integrated with each other, and with the broader set of stakeholders (where appropriate)?
Governance
- Are the mission & vision in place and well understood?
- Are the team's principles being used to evaluate the quality of design work?
- Are decision-making processes understood and acted upon?
BusinessOps
- Is the DesignOps team creating and maintaining relationships with key BusinessOps teams to ensure the smooth functioning of DxD?
29. ResearchOps Set of Questions
Inclusion
- Who is included in all the stages of research?
- Is there proper representation of appropriate subjects?
- Is data coming from many sources in the organization?
Diversity
- How is diversity ensured during research?
- What is the current state of diversity during research?
- Is there a diverse set of data types?
Empathy
- How widely is empathy for customers & users of products and services spread through the organization?
- Who in the organization can share stories of customers & users that express their emotional and cognitive mental models?
Holism
- Is research done in a holistic manner?
- Is the total journey of the user understood?
Synthesis
- Is collected data aggregated and synthesized into models, prototypes, and visions?
Rigor
- Is data gathered in ways that keep it clean and avoid wrongful conclusions?
30. How do you know if you are measuring the right things?
31. Measuring the right things …
Your value to you
• How do you want to be valued by your peers and stakeholders?
• What proof do you have that you are valuable in these ways, or that you could provide value if given the chance?
32. Measuring the right things …
Your value to others
• When others come to you, what job(s) do they ask you to do?
• What ways do they have to understand your value to them? To the business?
33. Measuring the right things …
Align design team & stakeholders
• If you don't create this alignment, everything will be much harder, if not impossible, for you.
• This is hard work, and it requires that the team and stakeholders take the time to do it.
34. Measuring the right things …
Understand business/org goals
• If you don't understand your org's goals, you'll never be successful.
• So interview executives across domains in the organization, even/especially those outside of the Product & Engineering teams.
35. Measuring the right things …
A vision that describes success
• Can you tell a story that describes what your world will look like if you are successful?
• Do stakeholders like this story?
36. Measuring the right things …
Activity path to success
• Your story, hopefully, included a series of activities that lead from now to success. Outline these and put them into a plan.
• Along the way, create milestones that tell you what to measure and when.
37. Measuring the right things …
Gather, monitor, compare, share
• How will you gather data?
• How will you monitor the right data?
• How will you find strong correlations?
• How will you share data across the org?
39. For DesignOps to succeed …
1. Resist being reactive to stakeholders.
2. Use your skills as a design team.
3. Understand/Coach your value to the business.
4. Have a vision & a plan for achieving it.
5. Measure or otherwise evaluate your performance.
41. What do I do?
1. I coach designers & design leaders to help them reach personal & professional goals.
2. I consult for & with design teams to help them evaluate, create, and maintain their DesignOps practice.
3. I teach workshops on DesignOps, storytelling, design strategy, and design studio culture.