AI Summary. In this conversation, Peter Spear interviews Farrah Bostic, founder of The Difference Engine, about her background and the value of qualitative research. Bostic shares her journey from copywriting to strategy and research, emphasizing the importance of understanding customers' real stories and experiences. She argues that qualitative research is essential for businesses to make informed decisions, adapt to change, and maintain integrity in an increasingly complex world, despite the challenges of convincing clients of its worth.
Farrah Bostic is the founder of The Difference Engine and the creator of CrossTabs, "the newsletter for the podcast that will try to explain how polls are designed and conducted, how to understand what they say, how to discern what they don't say."
Prior to founding The Difference Engine, Farrah held various research and strategy roles, including VP, Group Planning Director at Digitas; SVP, Consumer Immersion at Ipsos/OTX; and Partner/Head of Innovation at Hall & Partners.
I have followed her for a long time, always appreciating her advocacy for research.
I invited you because I've followed you for a long time. I've loved your newsletter. You've been really vocal about research and arguing for research in beautiful ways. I was excited to have a chance to sit down with you and just talk as part of this weird little conversation series I'm doing. Thank you for accepting my invitation.
Thank you for inviting me.
I don't know if you know this, but I start all my interviews with the same question, which is a question I borrowed from a friend of mine. She's an oral historian, so she helps people tell their stories. I've hijacked this question for my own method. I over-explain it because it's a beautiful question, but it's very powerful. I want you to know you're in absolute control. You can answer or not answer any way that you want to. The question is, where do you come from?
I think I understand the ways that people approach answering this question. I'll start with the literal. I'm from Portland, Oregon, originally. I was born at Good Samaritan Hospital in downtown Portland. My mom's from Oregon also, and her dad's from Oregon, which is surprising. There are not many multi-generation Oregonians. Before that, mostly Kansas, Missouri. We are white Americans from way back.
My dad was from West Virginia originally, but his parents, for reasons we can only speculate upon, decided to move from West Virginia to Southern California. He went to high school with Steve Martin at Garden Grove High School. Then - and I think this starts to get at the other part of where I'm from - he was almost a PhD in the philosophy of science at Duke.
My mother is a pragmatist who studied political science at the University of Oregon. She was an educated woman at a bad time, when you couldn't get a credit card without your father's or husband's help, and you couldn't get certain kinds of jobs. It was not an easy thing to transition from university into the workforce. She wanted to go to law school, but her mother put a stop to that by refusing to help her pay for it. So she temped, and she had a story that changed as time went by. When I was a teenager, she used to tell me she would temp during most of the year and then ski in the winters. That was how she met my dad. He was managing a company that expedited shipping invoices, which is what he did with his almost-PhD.
I think one of the things people do is talk about generational narratives. One of the things that is another answer to "where do I come from" is I'm really interested in people's actual stories and experiences, not broad stroke generalizations about people. My dad was an aspiring draft dodger, but he had a really high draft number, flat feet, and bad eyesight, so he was never going to go into infantry. That was never going to happen. My mom voted for Nixon, and yet was at University of Oregon from '68 to '72. You'd think there would be some revolutionary fervor there for her, but no, it was more present for my dad at Duke. There are just some of those things.
That's who they more or less are. My dad became a teacher temporarily. He taught at Bemidji and then at Lewis & Clark College, which is where he did his undergrad. Then he went into product marketing after doing other things. They had an insurance business when I was about four and a half, five years old. They had to declare business and personal bankruptcy, sell their house, and become renters in the suburbs, which is probably where I discovered that the suburbs are not for me.
How old were you when you discovered the suburbs weren't for you?
I think I said to my parents when I was five or so, as we were moving out of the house that they had built in the West Hills in Portland, that my life was over. Very dramatic. I also was five when I told them I wanted to be Debbie Harry. There were lots of things going on with five-year-old me - a real flair for drama.
We did that - on paper, we should be doing pretty well, middle-class white people, but in fact we weren't doing great financially. We had to live in that in-between phase where we were living in reasonably affluent neighborhoods, but as renters in those neighborhoods. I think that's part of the "there's always a story" orientation that I have.
From being a little kid, the other thing that I was always oriented to was writing, even a mimicry of writing. My mother has memories of me pretending to be local NBC affiliate news anchor Kathy Smith, but writing my news copy before I knew how to read or write. I'd get a paper and pen and pretend to write, then I'd sit on these open stairs like they were a desk and read the news.
There are those things, and then living in okay suburbia that's mostly white, mostly pretty affluent, and tries to ignore the parts where it's not that affluent. The in-group/out-group things feel extremely tenuous because there's no obvious reason why you would be in an out-group. You have to know some things about people in order to reject them. You can't just do it on the surface.
I think that, along with my parents' constant admonition to "look it up" every time I had a question, probably points to the other place I come from: a kind of default setting of skepticism - a strong belief that people tend to be lazy truth tellers. You can ask them all sorts of questions; you don't even have to ask them questions. I seem to have a face for confession. People tell me all sorts of things unsolicited. And then a kind of skepticism about constructed narratives. I don't know if that answers your question.
Yes, absolutely. You also answered my second question. Often it's "What did you want to be when you grew up as a kid?" But you've shared a few already - Debbie Harry, your local broadcaster. Do you have other recollections of what you wanted to be when you grew up as a kid?
At the point in time that I remember adults asking me that question, I didn't know the answer. I tended to have a real freeze response to these types of questions as a kid. "What's your favorite color?" No idea.
There was a period of time where I felt like my grandparents, my mom's parents, every time they got me something for my birthday or Christmas, which is very close together, it was purple. It was like a purple bike once, a terrible dusty mauve puffer coat, but it was purplish. "If it's purple, then Farrah probably wants it" was the assumption. I like purple fine, but I like all the other colors too. Those kinds of questions tended to leave me frozen.
Tell me a little bit about where you are right now. We talked about it a little bit before, but you grew up in Oregon. Where are you now? And what are you doing with yourself?
Right now I am in the attic all-purpose room. It is my office. There is a bed on the other side of this desk, a Peloton over there, a sofa and a PlayStation over there, a yoga mat over here. And thankfully there's also a bathroom, so I never have to leave this room. It is in Springs, in the town of East Hampton, about as far east as you can get in the contiguous United States. It is partly by choice and partly by COVID that we live here.
I have worked for myself now for 12 or 13 years. I have gone through multiple iterations of what that business looks like, but fundamentally it's always been an insights-driven strategy consultancy or insights-driven research. That's funny.
I think over time, what I returned to in some ways is - because I didn't start out as a researcher. I started out actually as a copywriter. I worked in advertising in the late '90s and early aughts - never a good time to be a junior copywriter. I don't know if it's better now; I was about to say it was worse. I don't think there's ever a good time to be a junior creative in advertising, but I also think it's tricky to be a junior copywriter who is a woman, because there is still this belief that women aren't funny and that good copywriters are funny.
I think I'm pretty funny, but I didn't have Tina Fey's "Bossypants" around yet. I got a lot of really weird feedback early on as I was shopping my portfolio around and looking for work. The weird bait-and-switch would be I'd get Court Crandall at Ground Zero who liked my book and would have me come in, but inevitably he'd be out on a shoot, so I'd meet with someone on his team.
I remember one time I interviewed there twice. Both times Court wanted me to come in, both times Court wasn't there. The first guy I talked to is flicking through my book and he says, "You can really tell that this book was written by a woman. There are no women on our creative team." I just looked at him like, "What's that mean?" It probably wasn't until I was in my early 30s that I put it all together. Oh, he was saying they don't hire women. He just didn't actually say they don't hire women.
I attempted that for a while. Then I had a friend of a friend walk my resume into Chiat Day. I was living in LA. I interviewed with a guy named Sean, who was at the time the head of the Apple account. He said, "I have a job. It's not the job you should have, but you should work here. I want to give you this job, and then I want you to keep working on your portfolio and pester the hell out of the creative directors until they give you a job. I'll throw you little creative assignments nobody wants, if you're up for it." Sean is a great guy.
I took that job and shortly after, they restructured the account slightly. They had someone running the international side of the business and Sean was running U.S. The woman I worked for was Nina Lalic. I don't know if you've ever heard her name or come across her. She was a planner at FCB before she came to Chiat Day. Then I think she went and started her own business - Serbian, glamorous, brilliant. I learned planning working for her because she'd not been an account person. She had always been a strategist.
I started out as a copywriter. I became a strategist more or less working for Nina. I learned a little bit about how to do research because we would do quick and dirty, guerrilla-style research from time to time. It has always amused me, the narrative that Apple does not do research, because Apple very much does do research and Apple's agencies very much do research. They don't do a lot of ad testing, but they do a lot of lifestyle research, use case research, ethnographic stuff. They do a lot of research - the really useful stuff.
Then it was the 2000, 2001 dot-com bust. Accounts were fleeing Chiat left and right. I kept surviving every cut. I got to a point where I didn't like the guy who was now the head of the account. I didn't want to stay in LA, I didn't want to stay with my boyfriend. LA was always temporary for me. I love LA, it's not for me.
I just, all in one fell swoop, broke up with the boyfriend, quit the job, moved home for a summer, and then went to New York to go to law school. The law school move was for two or three reasons. One is, I'd had my wisdom teeth pulled out and was stuck at home for a few days watching West Wing reruns. I was like, "All these people are lawyers. Maybe it's finally time for me to acquiesce to what all these adults have been telling me for years and just go be a lawyer."
The other thought I had was, "If I go get a graduate degree, maybe I can buy myself some credibility. Maybe people will listen to me if I have this degree." Turns out this is a very common reason that people go back to school to get graduate degrees - to be taken seriously by their peers. I was not alone in that. I found that out more recently because we do a lot of work in higher ed now.
Is there a name for that phenomenon, or how do you talk about it? It's like social proof in a way. Do you have a name for that phenomenon?
In our research, we've described them as status seekers, but in a very specific way. The status they're looking for is credibility. What you see, particularly in returning adult higher ed - not people who went all the way through from undergrad to graduate school - is, in particular, a lot of women of color seeking master's degrees and PhDs because they have to dot every i and cross every t in order to be taken seriously.
I'd started to see a glimmer of this. I briefly tried to do a project where I wanted to talk to people who were not stereotypical ad agency people, but had built careers in advertising, and just interview them about how they built their careers. It was not meant to be, "Hi, I'm a white woman. I'll stand on one foot and you explain what it's like to be black in advertising." It was more like, "How did you actually build your career?"
The thing that emerged after four or five of these interviews was, "Oh, I thought everybody did either one of two routes. You got like an English degree from a really good small liberal arts college and your dad knew somebody, your cousin worked at JWT, and you got a job there. Or you did this 'backed into advertising' thing where you're an aspiring playwright, you had a summer job, you were working in the mail room, blah blah blah." It's one of those "I accidentally fell into a job as a creative director" stories that I used to hear a lot on informational interviews.
It was like, this doesn't make sense to me, especially because I'd gone to U of O for my undergraduate, did an advertising specialty, interned at Wieden+Kennedy, did all of the things, and still had to back my way in by going and doing other things.
You knew early that you wanted to be in this world.
My dad had been laid off from a job at a company called InFocus Systems. They made LCD displays and overhead projectors. He started his own company installing local area networks, wide area networks, LAN and WAN installations for companies around Portland. One of his clients was Wieden+Kennedy.
He installed their LAN system and he was there all the time fixing stuff. He was in Dan Wieden's office one day and some female creative director came into Dan's office, flopped down on the sofa and just went, "Fuck." My dad came home with some flyer he'd found near the copy machine or something that was made by somebody in the agency advertising how they like to go to a place called the Rialto to shoot pool. It was a funny ad someone had written just for their colleagues to get them to come to the bar across the street.
He brought that home and he said, "I think you should work here." That was my dad spotting it, because I had no idea. I think there's probably also a moment in high school where I would have become a computer graphics artist if someone had told me that was a job and you could go to school for it. I was super into animation and special effects. Any project I could do that involved video, I would do that. But that was the first time I thought about, "Oh, there are people who make ads. They don't just happen on television." That's how that happened.
But every step has been this, "How do I move upstream? How do I get closer to where the actual problem is?" After a while - both struggling to get a real job as a copywriter and then even working as a planner - I think this is also where some of the skepticism revealed that just being a planner wasn't going to work for me.
Clients would deliver you data or information or their own strategy. And you'd go, "Does that make sense though? Is that really - how does this business model work?" I would get looked at like, "What's it to you? We just need to make the ads."
I also remember youth trend report people coming to the agency to present to us, and just being like, "Nuh-uh, that's not true. Maybe it's true for four blocks around FIT in the city, but no one in LA is doing this. I don't know what you're talking about." That made me want to get closer to where the insights came from. I think that's ultimately how I found my way into research.
Tell me a little bit about The Difference Engine. What's the work you do now and how do you talk about what you do?
When I first started it, I think I was trying to solve a problem that I later discovered wasn't really the problem. By this point I'd spent several years as a qualitative researcher - I'd been hired into research companies as a strategist, learned how to do qualitative research, how to interview people, and got better at it.
The feedback you always get is, "It takes too long, it costs too much money." I'm sure you had some of this conversation with Hugo about synthetic users. My feeling about that was, sure, there are lots of ways we could make this less expensive. There are lots of ways we could make it faster. I spent the first three or four years of the business just showing how quickly and inexpensively qualitative research could be done, using any kind of recruiting tool that wasn't Schlesinger and Associates, which is now Sago, because it's all the little middlemen along the way creating the delays and the added expense. Everybody's got to take their cut along the way.
But over time I discovered two things. One is that the only way to actually pull that off is to be as senior and experienced as I was. It's very hard to train up other people to do it. I tried, and I just wound up stressing people out because they were too junior to be able to flex and bend with that kind of iterative approach to qualitative research. They needed to learn the rigor first before they could learn how to figure out which parts to cut.
The other thing I discovered is that's not the problem. I should have known this because I've told clients forever that when customers or consumers say it's too expensive or "I don't have the time," they usually mean that thing is not valuable enough to them. It's not really about the money. If it was valuable enough, they'd spend the money. If it was valuable enough, they'd invest the time. But if they look at it and go "it's not worth it," that's what they mean - "It's not important enough to me to part ways with my money or my time or my effort."
What I discovered in testing the limits of how quickly or inexpensively you could do qual is the fundamental issue is, do you think it's worth it to talk to your customers? That made me have to pivot my business, because I was like, "All right, the answer is not faster and lighter. The answer is do it properly and do it well."
From there, it was coming back to first principles. The metaphor is frequently "No one wants a drill, they want a hole." No one wants to buy research, really. They want to get advice. They want to know what to do. They want help making a decision.
In my early research career at Hall & Partners, we would frequently say to clients, "We're not here to pick a winner. It's not a beauty contest, not a bake-off, whatever." Most clients were fine with that. Some clients, though, would say, "I need to know which campaign to run" or "I need to know which positioning to pick. They can't all be beautiful babies. Some of these babies got to be ugly and some of these babies got to be the ones that I'm taking to the pageant. So which one is it?"
At a certain point that was just like, "Of course, why would you spend this money and not want a decision at the end of it?" Obviously I'm not going to make your decision for you, but I can advise you on what I think is the best decision to make, understanding what your goals are.
Over time, I think that's also evolved for me. I don't think it can be solely about your goals, because I think people can do a lot of sort of selective fitting of the facts. They think if they assemble them in this way, they will get the result that they want. It means that they make bad decisions. But their decisions are oriented towards the intended outcome, as opposed to "What do we believe in? What are we really good at? What would be a sound set of decisions that we can defend to our shareholders and stakeholders, even if the bet doesn't pay out?" There are good ways to bet and bad ways to bet.
That has also become an evolution in all of this. The way I talk about it now is we help business leaders make big decisions. We do that with the assistance of research and we do that through a kind of strategic lens.
I think over the last several years, with all the turmoil in the world, broadly speaking, it's become incredibly clear that businesses don't just need help making decisions, understanding their customers, getting out of the building, or "touching grass," whatever phrase you want to use. They're also struggling to figure out how to make decisions in a world where accountability comes whether you want it or not. Either the EU or the state of California is going to regulate you. Congress is going to investigate you. They're going to pass a bill to ban you. Regulation is coming for a whole lot of the industries we work in, and even if regulation isn't, other forms of public accountability are. Reputational risk is a big problem across almost every category we work in.
One of the questions becomes, can you, with a straight face, say to the regulator, say to the lawyer from the SEC, say to the person suing you and their lawyer, that you did the best you could with the information you had at the time, and you acted in good faith?
That has become, in this era of talking to clients about using AI or developing more algorithmic or automated products and services, all the more important. Then you layer on the kind of questions of algorithmic bias and bias in general, and DEI, and other forms of equity, and ESG. All of these things are influencing the clients we work with.
It's become more important, I think, to help clients understand that the world is constantly changing, so they have to be flexible. But one of the ways to be flexible is to understand what your real mission and purpose is as a business. I don't mean that in the inspirational poster sort of way, but literally what are you in business to do? Who is it that you serve? How will you know if you got a good outcome or not? And then how do you order your decision-making to serve that, instead of just serving "We think we'll get a point bump in our sales this quarter, so we'll do whatever it takes to get it."
What's the first conversation you have with a client? Where do you like to start your engagement with a client?
I have a set of questions I received from a boss a long time ago. They usually just start with a question, your second question, "Where are you now?" Where are you in the process of whatever it is you're trying to do? What have you done to get you to this point? Why do you feel that you need this kind of help? Typically, they are coming to us for research. Why do you think you need research?
Then there are questions about, "What decisions will this help you make? Who's going to be involved in that decision-making process? What will be a good outcome for you if we do this?" Then we have typical questions about timing and budget and all that stuff.
But first we start with getting ourselves oriented. Where are you now? And what are you trying to accomplish? I think a lot of times, for good or ill, that conversation reveals that they're not ready to do research.
We had a new business call about a month and a half ago with an organization that wanted to understand a particular form of discrimination, let's say, that was present in the culture and that, according to the anecdotal reports they were getting, was on the rise. They wanted to do some research around that.
The thing about it was, they hadn't really done the legwork yet to figure out - I think there were two things. One is they wanted to launch in the fall with a campaign. First of all, is a campaign the answer? You've already decided what the outcome ought to be, but is that the outcome? That's an open question.
The other thing is, you're talking to me in the middle of March about launching a campaign in October or November. I've worked in agencies long enough to go, "That's not how that works. If you wanted to have this conversation, we needed to be talking last fall."
It did lead me to say, "Given what you want to do, there is research that could help you, but it's not the research you're asking for. Also, you're asking for something that academics have done a lot of work on. We could start with reading the relevant literature. Either I can do that for you, or you can do that. But you need to get grounded in what the problem really is."
I candidly agreed with them that this problem was present and growing, but there needed to be a little bit more meat on the bones before we could even begin to think about writing a discussion guide or a survey instrument or that sort of thing.
It was just like, "It's not that research isn't what you need. It's just, you're not ready to do it. There's a whole bunch of legwork and groundwork that needs to be laid before we can start asking people questions."
That is a step that often gets skipped by clients. "Let's just rush out and go talk to some customers," when we don't really know which customers or what we want to talk to them about. We're not sure about how we want to talk to them.
Of late, I've had to argue with clients about the best way to ask about gender, for example, in surveys, where they're like, "We've always asked it this way. So just ask it this way." I'm like, "Except we see that when we ask it the better way, one to three percent of our sample identifies as trans. Ask it your way and you're not going to know that, and it might matter for some of the services you want to offer. Why don't you just let me ask it this way? Also, I'm not making up this process. I've learned this from social scientists who've developed a better way of asking this question. We've done some research to make sure that we're asking it the right way."
But these kinds of things are just like, if you don't know enough walking in, you're going to default to old habits, or you're going to focus on stuff that's not really all that important, or on stuff you could get someplace else. We don't need to repeat research that is well covered by the academy. We could just read the white papers.
What do you love about the work? Where's the joy in it for you?
I think there are two places - no, there are three steps that I love.
One is the part where we have that initial conversation, and the partnership starts to form. It's now a collaborative effort to identify what the real question is, and what the real decision to be made or problem to be solved actually is. Helping our clients clarify that question so that it is something really worth investigating and really worth doing something about.
I love a puzzle. I love a mystery novel. I love a spy novel. Let's figure out what the real problem here is and get underneath. The question they come to me with is never really the question. There's always some deeper question. Let's figure out what the real question is. That's fun.
The second part of it that I still derive a lot of joy out of is the generosity that people show in responding to my questions - the willingness that people have to share with me their experiences, good, bad, ugly, indifferent, and their opinions. I find it humanity-affirming. As much as I am a skeptic and don't think much of popular narratives about groups of people, I really love people telling me their stories.
I love trying to engage in the perspective-taking that's required in order to actually hear their stories. Because one of the things that's very easy to do is listen to someone tell you a story and go, "That doesn't make any sense. You must be stupid." The actual truth is, people's stories make sense to them. So you have to try to put yourself in their shoes and figure out, "Why does this make sense to them? Does it make sense to me? Why does it make sense to them?" If you can start to understand that, then all sorts of other things open up.
The other thing I like about those encounters is that on almost every project, there's something I'm wrong about. There's something I didn't expect. One of the things I like to do as we begin fieldwork is state my priors. What do I think is going to happen? What do I think is normal? What do I expect from this group of people? Then we see what actually happens.
I like to do that both to keep myself honest and to make my clients keep themselves honest. But the thing that has happened many times is, I go into a conversation thinking, "This is how it happened for me or my family or someone close to me, but that's such a specific case. I shouldn't expect to hear anybody else tell this story." And then you encounter all these people who tell structurally the same story.
It's like, "Oh, I'm like everybody else." Or, "My friends are like everybody else." Or, "This is a universal human experience." This is about how people interact with systems and structures, and the systems and structures are what they are. So there are only so many ways people can interact with them and have only so many outcomes. Once you encounter a system or a structure, infinite possibility is not before you. There are only so many possible outcomes.
That's always very humbling. I get a kick out of it every time. "Aha, this is - everybody's doing the best they can and sometimes it's not good enough. But most of the time they're running up against impediments that they didn't see coming."
If more people knew what everybody else was going through, they probably wouldn't feel so isolated. They also probably wouldn't feel so ashamed about failing at a process, because everybody's failing at the process. The process is set up for them to fail.
I was going to say a third thing, but I don't actually believe it. I do love telling these stories back to clients, but it's actually one of the least fulfilling parts of a project a lot of the time. It's very hard to incept your perspective-taking and empathy-building into somebody else who did not go along for the ride with you.
I had a client in a startup years ago who's a good friend now. At the time, she told me that listening to the first dozen or so interviews about her brand-new baby of a product was like nails on a chalkboard for her. She could not engage in the perspective shift required to wholeheartedly listen to these people's feedback and then go pitch her business to investors. "I'm getting all this 'it's not great, it's not ready, it's not perfect' feedback over here. And I've got to go tell these investors that it's great and it's only going to get better."
So I have some empathy for why it's hard for clients to hear these stories and switch their point of view.
Do you feel like there's been a shift in terms of how clients understand research today? In particular, because I'm selfish, I'm curious about the role of qualitative research. How do you advocate for that? Because I did talk to Hugo, and I do think synthetic qualitative research is a thing that's arrived, which makes me wonder: what is the understanding you get from qualitative research that you don't get from anything else, that companies either understand or don't understand?
I have a client right now who I'm a bit ride-or-die for. He's the CEO of the company. He is a believer in qualitative research. In particular, he likes ethnographic approaches. He is a rarity. He genuinely is.
But as a CEO, he has so many different areas of responsibility. He's got to be accountable to his holding company. He's got to be accountable to shareholders and so on. He's got to deal with staff and employees. He's got to deal with customers. And at the end of the day, he's got to be responsible for the product as well.
What's interesting is for him, that thick data is pound for pound more useful. It allows him to see people as they are in the world that they occupy and be creatively inspired. I think he has a kind of a service mindset. He thinks about how he can better serve these people, not how he can deconstruct the things they say.
It's more, "Okay, if they're having a problem with this, how do I make that easier? If this is too expensive, how do I make it less expensive? If they need help here, how do I give them help here?"
He has to balance it against the business interests, obviously. How much help is enough is sometimes a question. But I think that's actually a really important question. I think sometimes marketers can get really interventionist about things and people just need a little help a lot of the time, not everything done for them. There is an idea of useful friction after all. You don't want to - seamless is not necessarily the thing you're going for.
But he has that orientation. For clients that don't, I think they exist on a little bit of a spectrum. There are the clients who just have very little experience with qualitative research.
I had a new business call a couple months ago where they were like, "Oh, but in-person research is better than remote research, right?" I was like, "It depends, right? It depends on what you want to learn and how you want to learn it. It is not true, and probably never was, that one is better than the other. They're just different methods. Sometimes you want to do a mix of both." That's a kind of naivety problem that I think is relatively easy to solve.
I think there's another side of it, particularly prevalent in the U.S., which is an orientation towards a phrase I despise: "The data will tell us what to do." What they love about this moment is the apparent ease of fielding large-scale surveys and the availability of analytics. Those two things, to them, obviate the need to actually talk to people.
That's - and it's faster and they feel like it's cheaper. I'm not convinced it is, but it's a trade-off. If your orientation towards pricing research is "what is the cost per interview," then sure, quant is cheaper. You might spend the same amount of money and not get substantively different answers than I'm going to get in a handful of interviews. The difference is you feel like you can cover your ass with a thousand-person survey in a way that you can't with a 15-person qualitative study.
I think there's some of that - just a vestigial American business data orientation that does not countenance qualitative data as data, even though it is. This is the thing - I used to speak at Strata Conference, which is a data scientist conference. It was always like, "People are data too."
The thing about surveys - and I've said this in many a public forum - is asking a thousand people their opinion does not transform those opinions into facts. Yet clients treat it like it does. But that's just a cognitive error. That is not what happens here. It's just, you asked a thousand people instead of asking 15. That's useful because it tells you how many people have a set of opinions, but it doesn't suddenly make those opinions the same as truth, whatever that might mean for you.
So I think when clients embrace qualitative these days, it's for a few reasons. One is they don't start from an orientation that talking to individual people is a waste of time or money. They see the value of it. They want the data to be humanized. They want thicker data sets. They want to be able to improvise and explore as they go along, as opposed to having to preset a list of questions and hope that they're the right questions.
We do a lot of mixed-methodology stuff where we do quant and qual, but we always do qual first. We never do quant first.
I'm curious, even at the most elementary level, how would you articulate that? Why qual first, and what are clients risking? What gets lost, or what are you - what's the risk of being quant-only and denying the value of qualitative data?
One of the things that I always say about why we do qual first is the survey instrument will be a better quality instrument if we use the language of the people we want to survey. In order to do that, we need to have a few conversations with the people we want to survey about the thing we want to survey them about, so that we understand how they talk about it.
Sometimes it's literally how do they describe certain brand attributes or certain product characteristics. But also it might be, what is their mental model for buying in this category or behaving in this category? How do they understand their choices and know how to evaluate their options within a category?
Unless - I don't know - I do believe in brand managers who have been in a category for a long time, who are constantly interacting with customers, constantly looking for feedback, and who get to know those mental models really well. But there's also a lot of turnover in marketing organizations and insights teams at companies. They come from a consumer packaged goods category and now they're in a services category. They don't know what the mental models are there. They need to do that work first to orient themselves so that they can do the occasional fast-twitch survey of customers and ask questions in a way that's relevant.
If you don't do that - so, against my own advice, I recently put a little survey of my own in the field. It's not entirely against my advice; I've had lots of conversations with people about this. The survey was about the experience of IVF patients and fertility treatment patients, in part because we did this project last year about maternal health and it just felt like a thing lurking under the surface of those conversations. I was curious.
I wrote the survey - it's too long, but I'm a qualitative researcher, so of course it's too long - and before putting it in the field I asked a few people to take it. What was helpful was that those people felt they could tell me if questions were worded in a strange way, or if it felt like it was oriented too much towards couples and not towards single people trying to engage in family building, or, "You didn't include this option, which a lot of people do."
Even though I'd done a fair amount of pretty exhaustive lurking in Reddit forums and talking to people who've gone through it and talking to pregnant women and so on for other things, there was stuff I left out. This survey is too long. It could be twice as long. It could be three times as long. It's never going to be fully exhaustive.
But there were some specific questions I had about it. If we'd done depth interviews first or some ethnography around it first, it would have been a shorter survey, probably. It also would have been a tighter one.
Frequently I also talk to clients about a handful of interviews, like eight interviews, will enable us to write such a better survey that the survey will be higher value to you and it will help you make the decisions that you want to make. It's worth this incremental additional investment.
We also have projects where my quant partners will literally tell clients, "I think you're better off just doing qual with this particular question, because it's so textured and layered that the complexity of the survey instrument would start getting to the point where I don't know that we could say we have valid results. It's easier to just have a conversation with people."
Do you have any rules of thumb around when qual - what's a qual problem and what's a quant problem?
The basics are like, if the answer to the question is yes/no, a number, or like a one-word answer, then quant's probably fine - just go do that.
I say that and then it's - but are those really the right questions? I have done exercises with clients to help them sift through, like, "All right, let's just grab all the questions that you have and let's sort them into buckets. These ones are best suited towards qual. These ones are best suited towards quant. Or these ones probably exist in your analytics stack somewhere, and you could probably just go ask somebody in IT to pull it for you. Let's not ask questions we don't need to ask of people in an interview."
That's usually the sort of starting point. The other question is, how much do you already know? How deep into this category and these consumers are you? Do you have a lot of experience here? Have you done a lot of research before? Then maybe we don't have to do as many interviews. Sometimes we don't have to do any interviews at all, because you already know so much about these people. You are so close to this audience that we don't need to repeat that work if you're in a hurry.
Last question. You quoted Peter Drucker in one of your pieces. You made a very awesome case for research:
"The purpose of a business is to create a customer. The toolkit that marketing, innovation, and research shares - if these are the essential functions of the enterprise, and why the hell are you depriving your organization of the tools it needs to perform those functions?"
You've got this client who's totally ethnographically oriented, but what do you say to a CMO, or to somebody else, to inspire them to think about qual in a different way?
I think the main thing is that qual is a really great way to start to sense change. It's much harder to sense change in surveys unless they're longitudinal, unless you're running them in constant tracking flights.
Even then, you'd have to include things in there that would enable you to sense change. Because people's opinions are pretty stable over time. Adam Mastroianni has written about this - we have pretty stable opinions about things.
But when people start engaging in a new behavior, or feeling a new kind of way about a thing, or losing faith with a brand or a category or an institution or whatever, that comes up in conversation in a way that it just doesn't come up in a survey because you don't know to ask.
As long as you keep having conversations with people, you can do more change-sensing than you can in quantitative. I think in the last several years we have gone through so much change and there's no end in sight for that.
If you want to be able to adapt - my line right now is "adapting to change with integrity" - if you want to do that, then qualitative really is the answer, the best possible tool for you.
It would be a whole other episode to talk about why I don't think synthetic users gets you there. But I think that's the reason to keep talking to people. It doesn't have to be ethnographic. It doesn't have to be focus groups. It doesn't have to be any particular methodology. But humans can tell you stories that will allow your sense-making abilities to say, "That's new. I wasn't expecting that. We haven't heard that before." That can then help you do everything else a lot more sensitively to change.
Thank you so much. I really appreciate you accepting the invitation. It was a lot of fun. Thank you. I appreciate it.
This was a pleasure. I hope you have nice weather where you are.