June 20, 2023 Smashing Newsletter: Issue #410
This newsletter issue was sent out to 212,473 subscribers on Tuesday, June 20, 2023.
Editorial
How do we measure the quality of our design work? What about design system health? What metrics and what design KPIs could we use to connect business requirements and user needs through the lens of design? We often think of design as an ephemeral, artistic endeavor, but to improve design, we need to be able to measure how well it performs.
In this newsletter, you’ll find plenty of insights that I’ve discovered in my research. Things that work, things that don’t work, and how to make decisions around them. We hope you’ll find them useful.
You can also explore design KPIs and UX metrics in more depth in the upcoming Interface Design Patterns UX Training and our UX video course.
Later today, we’re running Smashing Meets AI from 8 until 11 AM (PT) 🌍, where we’ll be talking to five wonderful speakers about AI, its impact, and how it affects our work, now and in the future. Tickets are free for everyone, so get yours now.
Do join us at one of our upcoming events — we’d love to see you!
- SmashingConf Freiburg (Sept. 4–6), our legendary SmashingConf (offline and online), with only 12 tickets left!
- SmashingConf Antwerp (Oct. 9–11), our shiny new conference all around design and UX — it also includes a workshop on design KPIs.
For now, though, let’s dive into some UX and design metrics! Happy reading, everyone!
— Vitaly (@vitalyf)
1. Measuring UX Research Impact
How can you make sure your UX research makes a real impact and doesn’t collect dust in someone’s drawer? Karin den Bouwmeester proposes a multi-level framework for defining and measuring UX research impact, taking into account all the different angles that need to be considered.
Karin defines three levels for measuring UX research impact: the impact on the customer and business outcome, the impact on the organization, and the impact on the user research practice. Her cheatsheet (PDF) makes it easy to ask the right questions and track the right metrics for each level. (cm)
2. Measuring Design System Success
How much value does your design system provide? Does it pay off? Ravi Lingineni shares insights into how he and the design systems team at Pinterest measure design system adoption in Figma to maximize usage and increase component adoption.
To measure adoption at scale, Ravi leveraged Figma’s REST API to build a tool that calculates design adoption in the background without slowing down designers. The tool runs every night and looks at all Figma files to calculate the adoption percentage across the entire organization and give the team a better understanding of the system’s health. If you plan to measure design system success in your organization, the Pinterest team open-sourced their solution. (cm)
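The Pinterest team’s actual tool is open-sourced, but the core idea can be sketched in a few lines. The sketch below is an assumption-laden illustration, not their implementation: it assumes you have a Figma access token and a known set of design system component keys (both hypothetical inputs here), fetches a file via the Figma REST API, walks its node tree, and computes what share of component instances come from the library:

```python
import json
import urllib.request

def fetch_file(file_key: str, token: str) -> dict:
    """Fetch a Figma file's node tree via the REST API."""
    req = urllib.request.Request(
        f"https://api.figma.com/v1/files/{file_key}",
        headers={"X-Figma-Token": token},
    )
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)

def adoption_percentage(file_json: dict, library_keys: set) -> float:
    """Share of component instances backed by design system components.

    Counts INSTANCE nodes in the document tree and checks whether each
    one's backing component key belongs to the library.
    """
    components = file_json.get("components", {})
    total = adopted = 0

    def walk(node):
        nonlocal total, adopted
        if node.get("type") == "INSTANCE":
            total += 1
            backing = components.get(node.get("componentId"), {})
            if backing.get("key") in library_keys:
                adopted += 1
        for child in node.get("children", []):
            walk(child)

    walk(file_json["document"])
    return 100 * adopted / total if total else 0.0
```

Run something like this nightly across all files, as the article describes, and the per-file percentages roll up into an organization-wide adoption number.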
3. NPS Considered Harmful
As easy as they might be to measure and track, Net Promoter Scores (NPS) are harmful, argues Jared Spool. He wrote a comprehensive post in which he unpacks how NPS works, why we can’t reduce user experience to a single number, and how we can do better.
If you’re looking for a quick overview on the topic, Vitaly also summarized the main arguments that speak against tracking Net Promoter Scores, among them weak statistical properties, strange score calculation, and the fact that NPS doesn’t accurately reflect UX success. Even if you are a proponent of NPS, these two posts are a great reminder to take a step back every once in a while to reconsider if the tools and techniques you use really give you the best value. (cm)
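For context, the score itself is simple to compute: respondents rating 9 or 10 count as promoters, 0 to 6 as detractors, and NPS is the percentage of promoters minus the percentage of detractors. A short sketch (with made-up ratings) shows one of the statistical weaknesses the posts mention, namely that very different response distributions collapse to the same number:

```python
def nps(ratings):
    """Net Promoter Score: % promoters (9-10) minus % detractors (0-6).

    Passives (7-8) are discarded entirely, which is one reason very
    different response distributions can yield the same score.
    """
    promoters = sum(1 for r in ratings if r >= 9)
    detractors = sum(1 for r in ratings if r <= 6)
    return round(100 * (promoters - detractors) / len(ratings))

# Two very different sets of responses, same NPS of 0:
print(nps([10] * 5 + [0] * 5))   # 0 -- polarized responses
print(nps([7, 8, 7, 8, 7, 8]))   # 0 -- uniformly lukewarm responses
```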
4. Complete Guide To The Kano Model
Every UX designer wants their product to satisfy their customers’ needs while being meaningful and delightful. But how do we measure satisfaction? How do we choose what to build in order to provide it? And how do we turn satisfaction into delight? While there are no definitive answers to these questions, the Kano Model can help you see things more clearly.
Daniel Zacarias sifted through every online resource he could find on the Kano model, as well as the scientific research behind it, to distill everything you need to know into an in-depth, step-by-step guide. The guide not only gets you familiar with the important aspects of the Kano model but also introduces you to a practical approach and a set of tools to conduct your own Kano analysis. (cm)
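To make the mechanics concrete: in a Kano survey, each feature is rated with a functional question (“How would you feel if the product had this?”) and a dysfunctional one (“…if it did not?”), and the answer pair maps to a category via an evaluation table. The sketch below follows the common form of that table found in the literature; the answer labels are paraphrased, not taken from the guide:

```python
# Standard five-point Kano answer scale, from most to least positive.
ANSWERS = ["like", "expect", "neutral", "tolerate", "dislike"]

def kano_category(functional: str, dysfunctional: str) -> str:
    """Classify a feature from its functional/dysfunctional answer pair
    using the common Kano evaluation table."""
    f = ANSWERS.index(functional)
    d = ANSWERS.index(dysfunctional)
    if f == 0:  # "like" when present
        return ("Questionable" if d == 0 else
                "Performance" if d == 4 else "Attractive")
    if f == 4:  # "dislike" when present
        return "Questionable" if d == 4 else "Reverse"
    if d == 0:  # neutral-ish when present, "like" when absent
        return "Reverse"
    return "Must-be" if d == 4 else "Indifferent"

print(kano_category("like", "dislike"))    # Performance
print(kano_category("expect", "dislike"))  # Must-be
print(kano_category("like", "neutral"))    # Attractive
```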
5. Upcoming Workshops and Conferences
That’s right! We run online workshops on frontend and design, be it accessibility, performance, or design patterns. In fact, we have a couple of workshops coming up soon, and we thought that, you know, you might want to join in as well.
As always, here’s a quick overview:
- Deep Dive On Accessibility Testing (Dev) with Manuel Matuzović. June 19 – July 3
- The React Performance Masterclass (Dev) with Ivan Akulov. June 29 – July 13
- Data Visualization Masterclass (Dev) with Amelia Wattenberger. July 4–18
- Figma Workflow Masterclass (Design) with Christine Vallaure. July 20–28
- Advanced JavaScript Masterclass (Dev) with Christophe Porteneuve. Aug 16–30
- Interface Design Patterns UX Training (UX) with Vitaly Friedman. Sep 8 – Oct 6
- Accessible Components from Design to Development (Dev) with Carie Fisher. Sep 14–22
- Typography Masterclass (Design) with Elliot Jay Stocks. Oct 16–30
- Strategizing Products and Customer Experiences (SPACE) (UX) with Debbie Levitt. Oct 18–26
- Smart Interface Design Patterns Video Course (UX): 9h video + Live UX Training with Vitaly Friedman
- Jump to all workshops →
6. Feedback Scoring And Gap Analysis
“How likely would you be to recommend our product to friends, family, or colleagues?” Many companies use a question like this to analyze what customers think of their product. However, the approach comes with quite a few drawbacks. So instead of relying on a single survey question, Anna Debenham recommends a combination of feedback scoring and gap analysis to gather user feedback.
The idea is to compile a list of statements and ask your users to rank their agreement or disagreement on a scale. Then the response scores for each statement are averaged and plotted on a radial graph to visualize the areas in which you are performing or underperforming. Anna summarized useful tips and templates for creating your own survey, including feedback statement prompts and tips for splitting your results into cohorts to better understand how the experiences in your user base differ. (cm)
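The scoring step can be sketched in a few lines. The statements and the target score below are made up for illustration, not taken from Anna’s templates: each statement’s agreement scores are averaged, and the gap to the target shows where you are underperforming:

```python
from statistics import mean

def score_feedback(responses: dict, target: float = 4.0) -> dict:
    """Average each statement's 1-5 agreement scores and compute the
    gap to a target score (a positive gap means underperforming)."""
    return {statement: (mean(scores), round(target - mean(scores), 2))
            for statement, scores in responses.items()}

# Hypothetical survey data: agreement scores from four respondents.
responses = {
    "I can find what I need quickly": [4, 5, 3, 4],
    "Error messages help me recover": [2, 1, 3, 2],
}
for statement, (avg, gap) in score_feedback(responses).items():
    print(f"{statement}: average {avg:.2f}, gap {gap:+.2f}")
```

Plot those per-statement averages on a radial graph, as the article suggests, and the shape of the chart makes the weak areas visible at a glance.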
7. DesignOps KPIs
Design metrics such as heuristics and conversion rate usually focus on the outcome. Arturo Leal and the Dell Digital Design team wanted something more specific for their organization and decided to measure the day-to-day processes and interactions instead. To help them assess if they are taking the right actions towards success, they defined a set of KPIs that provide a snapshot of what’s happening at a given time.
As Arturo explains in his post “DesignOps: What can we measure?,” the approach defines four large buckets for KPIs. The Productivity bucket measures efficiency and helps remove blockers in workflows and processes. The Team Health bucket is related to employee loyalty and designer growth. Team Output measures the quality of design work. And last but not least, Team Growth helps plan resources. Interesting takeaways for your DesignOps strategy are guaranteed. (cm)
8. The Double Diamond Is Not Enough
Designers are often pulled off a project as soon as the prototype is done, and test results come back positive. Any design work that is needed after that is usually squeezed in. In order to make sure quality doesn’t suffer, Adam Gray suggests a different approach to the design process. He recommends replacing the well-known double diamond model with a triple diamond.
Compared to the double diamond, the triple diamond keeps the designer involved beyond the prototyping phase, which makes planning, updating stakeholders on progress, and collaborating with the development team clearer. Each of the three diamonds describes one phase in the design process — from research to creating a proof of concept and, finally, a live release. A small change that has the potential to improve the quality of your products significantly. (cm)
9. News From The Smashing Library 📚
Promoting best practices and providing you with practical tips to master your daily coding and design challenges has always been at the core of everything we do at Smashing.
In the past few years, we have been very lucky to work with some talented, caring people from the web community to publish their wealth of experience as printed books. Have you checked them out already?
- Understanding Privacy by Heather Burns
- Touch Design for Mobile Interfaces by Steven Hoober
- Image Optimization by Addy Osmani
- Check out all books →
That’s All, Folks!
Thank you so much for reading and for your support in helping us keep the web dev and design community strong with our newsletter. See you next time!
This newsletter issue was written and edited by Cosima Mielke (cm), Vitaly Friedman (vf), and Iris Lješnjanin (il).
Smashing Newsletter
Useful front-end & UX bits, delivered once a week. Subscribe and get the Smart Interface Design Checklists PDF — in your inbox. 🎁
You can always unsubscribe with just one click.
Previous Issues
- UX Writing
- New Front-End Techniques
- Useful Front-End Techniques
- Design & UX Gems
- New Front-End Adventures In 2025
- Inclusive Design and Neurodiversity
- UX Kits, Tools & Methods
- How To Measure UX
- New In Front-End
- Web Accessibility
Looking for older issues? Drop us an email and we’ll happily share them with you. It would be quite a hassle searching and clicking through them here anyway.