“Designed to be loveable by managers”
I read Erika Hall’s Just Enough Research. I’m not going to review the entire book, as that feels a bit off-topic for this blog, but the chapter about surveys had me nodding my head so much that I’d love to excerpt a few things:
The questions can be asked in person or over the phone, or distributed on paper or collected online. The proliferation of online survey platforms has made it possible for anyone to create a survey in minutes.
This is not a good thing.
Surveys are the most dangerous research tool — misunderstood and misused. They frequently blend qualitative and quantitative questions; at their worst, surveys combine the potential pitfalls of both. […]
If you ever think to yourself, “Well, a survey isn’t really the right way to make this critical decision, but the CEO really wants to run one. What’s the worst that can happen?”
Brexit.
Hall highlights that surveys are much harder to debug than other methods:
It’s much harder to write a good survey than to conduct good qualitative user research—something like the difference between building an instrument for remote sensing and sticking your head out the window to see what the weather is like. Given a decently representative (and properly screened) research participant, you could sit down, shut up, turn on the recorder, and get useful data just by letting them talk. But if you write bad survey questions, you get bad data at scale with no chance of recovery. It doesn’t matter how many answers you get if they don’t provide a useful representation of reality. […] Surveys are the most difficult research method of all.
[…] Bad code will have bugs. A bad interface design will fail a usability test. A bad user interview is as obvious as it is uncomfortable. […] A bad survey won’t tell you it’s bad.
And that they might be seductive because they feel like hard data:
Designers often find themselves up against the idea that survey data is better and more reliable than qualitative research just because the number of people it is possible to survey is so much larger than the number of people you can realistically observe or interview. [… But] unless you are very careful with how you sample, you can end up with a lot of bad, biased data that is totally meaningless and opaque.
There’s also this hilarious bit:
Managers love NPS because it was designed to be loveable by managers. It’s simple and concrete and involves fancy consultant math, which makes it seem special. But is this metric as broadly applicable and powerful as it claims to be?
Nah.
NPS is not a research tool. I shouldn’t even be talking about NPS in a research book.
The entire book is worth a read, with a lot more to offer than the pithy quotes I excerpted above. I really liked its pragmatic approach to research that understands the realities of the industry.