Sam Ladner is a sociologist, researcher, and student of productivity studying the future of work. Sam is best known for her book Practical Ethnography: A Guide to Doing Ethnography in the Private Sector. It is used regularly in design, research, user experience, business, and social science graduate-level classes. Her most recent book, Mixed Methods, is a companion to her popular course Ethnographic Research Design and Innovation, available through EPIC.
So, I start all these conversations with the same question, which I borrowed from a friend of mine. She's an oral historian. She helps people tell their story. Oh, cool. It's an awesome question, which is why I use it, but it's so big sometimes I over-explain it. So, before I ask it, I kind of want you to know that you're in total control. You can answer or not answer any way that you want to. The question is, where do you come from?
Oh, what a great question. Well, okay, you want to know, I've got a very big answer, probably bigger than most people. So, first of all, I'm adopted. So, I don't know where I come from. The short answer would be, I don't know.
Yeah.
I was born in Vancouver and I was raised in British Columbia. I went to University of British Columbia. I moved to Toronto in my early-ish adulthood, lived in Toronto for a long time, moved back to the West coast to Seattle. And that felt very much like home. And then moved down here to San Francisco, San Francisco Bay Area. So, every place I've lived is part of me, every single place. I lived in Halifax for a year also, right? So, every place is a part of me. And if you ask me where I'm from, I don't know. I don't have a place where I'm from. It's weird.
Yeah. What is that like? I mean, can you tell me?
Terrible. I don't like it.
Is it terrible?
Well, I mean, I'm learning to appreciate its gifts, but it is very discombobulating. I don't feel at home anywhere, but I feel at home everywhere, if that makes any sense. When we talk, my husband and I talk about, "Oh, we could go to such and such, and we can move to so-and-so." I'm like, "Anywhere you want, man. I can live anywhere. It doesn't matter." He's less like that than I am, right?
What do you observe in him that he knows where he came from?
Oh, it's interesting. And the irony in his case, he's from Toronto, born and raised, lived there his whole life until we moved to Seattle. He'd never set foot in Seattle before. And to his credit, he was like, "Oh, sure, let's go for it." Ironically, he loves the West Coast more than I do, even though I'm from here, right? And he loves California. I love California too. California is great. But he's in love with it. He thinks it's amazing. And so he has a deep attachment to Toronto that I don't have, but he also has this kind of like, "I need to run away. This is running away. I get to run away if I come here." And for me, I'm like, I don't have anywhere to run away from. I'm from everywhere, so it doesn't make any difference, right? It doesn't feel as escapist for me.
Yeah. Do you have a memory of what you wanted to be when you grew up, like as a child?
Oh, well, my cousin once told me that I told her that I wanted to do my PhD, and I didn't remember saying that. I was like, "Really? Did I say that?" And she's like, "Oh, yeah." I said, "How old was I?" She goes, "I don't know. Six." And I was like, "Wow, really?" So I don't remember that. I don't remember really knowing what I wanted to do, but I knew what I didn't want to do.
Which was what?
I did not want to be in sales. I didn't want to do that. Found that gross. Wasn't interested.
How did you know?
How did I know? That's a good question. How indeed? I must have picked it up from my family. I know that in my family's case, professional jobs were considered to be good, right? So like, lawyer, doctor, journalist, priest, you know, things like that were okay. Chiropractor? No. Like, nice try, but no. Why don't you just go be a real doctor, you know? Professor, those are all okay. Business person? I mean, somebody has to do it, but why you, you know?
I think that's kind of where I picked it up. So professionalism, professionals were okay. Engineering wasn't really included, but it would have been acceptable, you know, because technically it's a profession. So I don't know. I don't think I had a picture. I mean, I went through phases of things that I really enjoyed doing. Like, I loved chemistry when I was in school. I found that really interesting. I was very interested in photography and journalism, and I was a journalist, but I found journalism to be rather superficial, you know? So in order to do the kind of journalism that, I think, would have been satisfying for me, I would have had to move to the United States, you know? And I didn't think that was possible. I didn't think that was something I could do. It wasn't until much later in my life that it was possible to move to the United States.
I had a PhD. I could qualify for a visa, you know? I got a job offer from Microsoft. They had the resources to make it happen. But when I was younger, I was like, I don't think I can do this. I don't think I can move to the U.S., and so I'm not going to be like this big shot journalist. I'm going to be like, you know, I'm going to be doing the daily deadline. And I'm like, this is boring, you know? It's stressful and boring. So why would I do it? So I ended up going to grad school because I wanted to do more in-depth research. And I said to myself, if you don't like it, you can quit, you know? But I loved it. So I didn't. I kept going.
Yeah. And so there was a shift from journalism into research? What was the attraction? How did you make that leap?
Well, I was a tech journalist, and so I love technology and I love how people use technology, but I wasn't going deep enough on that daily-deadline kind of treadmill that I was on. And I didn't see a way to go deeper in journalism. I just didn't see how that was possible. So I thought, if I go to grad school and I study technology, maybe I can understand it a little bit more, and maybe that can lead to a professorship. And so I did my master's and I enjoyed it, and I started my PhD and I did the analysis. And I remember, I was like, oh, demographics are on my side. There are so many academic jobs opening up in the next 20 years. Wrong. Wrong.
I mean, the math was right. I knew that there was going to be a lot of retirements. First of all, there weren't as many retirements as I was expecting. And secondly, they weren't replacing people. So academic job market, I'm sure I don't have to tell you, the academic job market's terrible. It's just awful. So I had to come up with a plan B once I was finishing my PhD and plan B was to do applied research. So still got to do more in-depth stuff, more than I would say the daily deadline, more theory driven too, even in applied settings, even in industry. I don't think journalism really has a lot of theoretical canon.
It's so interesting. You do find it from time to time. Individual journalists are kind of erudite and they might have followed a thread or a theme or whatever, but it's not the same as becoming an expert in a particular area. And I think that's really what I wanted to do because I wanted to understand. I want to look at technology and I want to be able to understand how it's being used pretty quickly. I want to do it empirically, but I also want to do it theoretically. And how do I do that? Well, you can't really do that as a journalist, I think. So I don't think I had these articulated back when I was making that choice. I was just gravitating.
Yeah. And so catch me up now. Tell me a little bit about where you are now. I know you're semi-retired. You've had a long career. Tell us sort of where you are now and what you're working on.
Well, right now, I just started, I think, as I mentioned, I just put together a proposal for my next book. So that's probably going to keep me occupied. If it gets accepted, it's going to keep me occupied. I'm doing a little bit of consulting work. Like yesterday, for example, I was at a tech company in San Jose and I did a workshop for their research team, helping them understand strategy and strategic foresight. And that was actually, that was great. It went really well. And so I've been doing a little bit of that. I'd like to do more of that because I find that the researchers that I know are stuck. They don't know how to grow. Training opportunities are limited. Mentorship opportunities are limited.
So I've done a lot of teaching through universities, but also through EPIC (Ethnographic Praxis in Industry Conference). And that's good, but I'd like to do more of that. So I'm setting myself up to kind of do that. I started a newsletter. I've been doing a lot of like one-on-one conversations with people.
Oh, beautiful.
Yeah.
Oh, nice. So I'm curious, when did you first discover or realize that you could make a living doing ethnography and studying people's behavior with technology? Did you know that that was out there and you went after it, or did you sort of...?
Well, I didn't actually. I wasn't sure I could make a living at it. I knew I enjoyed doing it. So I thought I would try to do more of it. I was working during my PhD at a design agency and doing research for clients. So we designed websites and apps and we had clients. And so in order to make those things, you needed to do some research. And so I was a director of research and we did like surveys and interviews. And I always was pushing for field work, always pushing for field work, which worked sometimes. Sometimes it didn't. The budget wasn't there. Timelines weren't there, blah, blah, blah. So I was like, okay, maybe I can do this. When I finished my PhD, I went onto the academic job market, but I wasn't getting much in that way. So I also started my own company and I started doing basically ethnography.
Basically, that was it. I did other things too, a few other things like interviewing and stuff like that. But I made a living doing it. It was tough. It was really tough because I had to pitch it every time. But I started getting repeat customers and it was good. The problem was I wasn't doing enough tech work. And it was partly because I was in Toronto, I think. And the tech scene in Toronto is not that big and certainly not that adventurous. So I was like, okay, I want to do more tech work. How am I going to do this? I thought, should I open an office in Chicago or something?
I had clients in San Francisco, but I wasn't getting enough technology work. It was mostly communications and marketing and things like that. And I was like, oh, that's fine. I did some CPG work, which was okay, but I wanted tech. So then I got an offer to work at Microsoft and I was like, I'm gone, 100%, let's do this. So we moved to Seattle and my job there was ethnographic work, more or less. And so Microsoft has deep pockets, they can afford to do stuff like that. Even there, it was tenuous though, I have to admit. I did a lot of different kinds of work. If I didn't know how to write a survey, that would have been a problem. Interviewing, of course, always doing interviews and things like that. But ethnography was kind of the expensive option.
How do you describe ethnography to somebody that doesn't know what it is?
Well, I often start by just, I talk about the ethno and the graphy, writing about folk. That's kind of what it is. It's writing about folks. And they go, "Oh, okay. So you're writing about people, you're talking about people." Yeah. And the key difference, of course, is between that and just plain interviewing, not to mention surveys or focus groups or what have you, is the observational aspect, right? Being embedded, going to, being part of, deep hanging out, being there. The being there part is so important.
I mean, in journalism, they used to have this saying, "Go, don't phone," which I completely agreed with at the time. And now I understand more deeply why that works, especially if you've got a trained eye for observation. You can see. And a lot of people like to simplify this, quite simplistically in my opinion, but they'll say, "Oh, what people say and what they do are different things," which is not false. That's not false. But it doesn't tell you the whole picture.
What is missed with that?
I think that statement opens the door a little bit to this problematic assertion that people lie, and that your job is to find out the lies. Your job is not that; people are not lying. They are genuinely unaware of the differences between their perceptions and their activities, and sometimes to their own personal detriment. Story of all our lives, right? People can see us doing things all the time that we don't ourselves even know we're doing.
So when it comes to ethnographic work, what you're hoping to understand is that contradiction. What are the contextual factors that lead to such contradictions? What are the perception gaps that people have about their own lives? You need to know what those things are. If you're going to be doing applied ethnography for the purposes of business, you need to know what these differences are. And it's not that people are lying. I hate that belief. People simplify it that way, and I understand why they do it, because it is true that people oftentimes say they do things that they don't at all do. And you need to understand why that is, what's going on there. With the best of intentions, they want to do certain things.
Yeah. I totally connect with you on that. And I feel like over the past several decades, at least in my career, this sort of behaviorism has taken over, based on that idea that if you can't trust what people are going to tell you, then you should not talk to them at all.
Yeah, completely. And I can see why people have that belief. It's frustrating for me to hear it. Because I think it isn't just innocuously incorrect. It's pernicious. It creates a sense of distance between you and that person, your customer, or your stakeholder, or whatever you want to call this person. There's a psychological distance that you're creating. It lacks empathy. It's a power move. We don't care what you're saying, but we have good reason to not care what you're saying. Ooh, isn't that convenient that you don't have to listen anymore? Convenient, right?
Yeah. You get to be correct and completely self-absorbed at the same time.
Exactly. Anytime that those two things come together, you have to ask yourself, is this a little too convenient?
I was so excited to talk to you because, not only are you clearly sharing so much of your thinking out there, but I feel like you also advocate for qualitative in ways that feel a little rare, oddly. I don't know why that came out of me, but it's just exciting to me, because as an independent, it often feels existential.
Oh, I completely understand that.
So I'm always wondering, is there value in this thing? And you make a beautiful case for it over and over and over again. How do you feel about the state of ethnographic and qualitative research today and the bias against it?
It's a very interesting question, because I see this coming in waves, right? There's the hot new feature, or the hot new this or that, and people gravitate toward it. It's the same with research methods: there's a hot new thing and people gravitate toward it. I mean, I'm embedded in Silicon Valley, right? So I'm surrounded by tech bros all the time, and the kinds of things that I hear are hilarious. I'm like, wow, I've seen this cycle many times before, and you think this is the first time it has ever happened. How did that happen? So the quantitative bias, as you say, the bias against qual, isn't, I think, a bias against qual per se. I don't know if I would have said this in years past, but now I believe this. I don't think it's a bias against qual per se. I think it is a Cartesian curse: the separation of mind and body.
And there's a belief that there's a perfection we can achieve if we banish all the things that make us most human. And so we're going to get behaviorist. I mean, Skinnerian psychology was one phase of this. And you're seeing it coming again with, oh, test-and-learn and A/B testing and analytics and objective truth. And I just laugh, because it seems so childish. You clearly are not well-rounded if you think that that's what all this is, right? So the bias against qual in general, I think, is a bias against all that is messy, all that is unstructured, all that reminds us of our own ambiguity as humans.
So I do take it seriously that I need to defend qual and I want people to understand that there's value there. So I work hard at articulating it. And like you, I've had to articulate it over and over and over again so that people understand. But what I see happening right now with this quant bias is it's almost manic and frenetic right now. And I think it's probably because people are really working hard to say AI is going to solve everything again. And we all know it's not. It's all so obvious, right? It's so obvious.
What is so obvious?
Well, it's a joke. It's a punchline now. Like, "Oh, well, just AI that," you know. And it's like, no, we all know. We all know that the ability to AI a solution is limited. We all know that. And we're pretending that we don't have to do the hard work, pretending that we don't have to understand nuance. And I'm just waiting for everybody to get so horribly disappointed. You're already seeing it, right? Actually, it's coming faster than I thought. I thought it was going to take a little bit longer. ChatGPT just blew everyone out of the water, but then they're like, "Oh, it doesn't do all these things." So now there's, I can't remember what it's called, they've got a new model that's got a reasoning aspect to it, and they're like, "Look at how it reasons." And it's like, okay, you used a GMAT equivalent, a standardized test. Of course it's going to reason in situations like that. You're not getting rid of the humans. You're just not.
But it is on the same continuum as the bias against qual as you're articulating.
Absolutely. Absolutely. Yeah. So I think that it's kind of manic, like the, "Oh, people lie and we have to watch what they do." And also, "That's messy and it's not objective" and blah, blah, blah. And I was like, methinks thou doth protest too much, tech bro. You know, I'll just wait. I'll wait. You'll figure it out. You know?
So, what's my next question? I'm curious about... well, yeah. I mean, I agree with you. I remember I had a conversation with somebody, and I think they said that the Western world kind of wants to be a machine when it grows up. Something like that.
Yeah.
I think it resonates with what you said, that there's something aspirational about the abandonment of the fearful.
Yeah. You know, I mean, that's Descartes. It's Frederick Winslow Taylor. Like I have to eliminate all of this weakness, this messiness. I have to eliminate it, because if I do, then I'm going to be perfect.
Right.
It's so transparent to me. Psychologically, it's so transparent. It's here. You know?
Yeah. So how do you make space for the work that you do? And you're an island of messiness in an ocean of tech bros.
Imperfectly, I would say, I make room for it. I mean, this is part of the reason why I decided it was time to leave full-time big tech, because it was a struggle every day. And the benefit that I was getting out of it was declining. I don't mean it had declined to zero, but it was declining. What was the benefit I was getting out of fighting the good fight? Because I was fighting the good fight, and fighting the good fight is heroic, and it's good, and it gives you purpose and makes you more well-rounded. And I felt like I had achieved that. It's a forcing function to take on individual fights, to take on positions, so that you can formulate a coherent worldview and a good argument.
Yeah.
And being embedded, you know, at Microsoft, at Amazon, at Workday, I was able to do that every single day. Every single day I was putting myself on the spot to develop a good, coherent argument, to work around the edge cases, to worm my way into the mainstream of the organization. Good practice, good benefit, good skills, well-roundedness, psychological benefits. But those were waning, because I had started having the same arguments every single time, the same arguments, and they weren't differing significantly. I'd seen so many.
But the classic, of course, is the "How many people did you talk to?" question, you know, okay, well, first of all, you're challenging my research findings of which you have none. I have them all, right? Okay. You also have never studied research methodology, clearly, because you don't even understand the difference between probability and non-probability sampling. But let me school you for a few minutes, my son, I will help you.
I had that question. The first few times I had that question, quite early on in my career, it was kind of terrifying. I remember one time the CEO of my agency happened to be there. She sat in on this call with the client, which was this big bank, and they were asking this question. And I answered the question. I didn't know the CEO was going to be there, and I was like, "Oh, my God, this is intense," right? That was like the hardest part. The client was like, "Thank you. I felt like I went back to school. This is great," blah, blah, blah. And the CEO was like, "Good job." And I was like, "Oh, God, did I answer the question?" So then I got used to answering that question. That question is really old now.
And there were a bunch of other questions like, "Well, that's all nice and good, but how much time do we have?" and "What benefit does it give?" You name the objection, I got it over and over and over again. You could say I hit saturation.
Yeah, that's right.
I knew exactly what the question was going to be. I knew exactly three, four different answers to the question. I would eliminate two of them immediately and choose one. Sometimes I would intentionally choose something crazy and say, "I don't know how this is going to go. This is a flyer. Let's see how this goes."
So I felt like I had gotten the growth for myself of being the underdog on a regular basis. So the benefit was tapering off. But even more so, I had to assess the opportunity cost. The opportunity cost of continuing to be embedded in an organization that is not a native land, shall we say? I mean, there's benefit to it, but there's also cost to it. And I felt like I was starting to kind of lose my sense of, I want to say self, but that's overstating it. I was starting to lose a sense of, like, terra firma. Everything was becoming relative. Relativism: oh, I can justify anything, I can do anything. And I was like, I'm feeling too embedded. I've lost my sense of professional strangeness.
Yeah.
And I don't really like that. So, let's see how this goes. And I kind of sat on it for a while to see how it was going. Benefits were not growing. The sense of floating away kind of increased. And I was like, you know what? I don't need to do this anymore. So thanks, guys. Best of luck. I left on very good terms. And I explained it quite clearly to my boss. I was like, "It's just time for me to go." Yeah. And she's like, "Oh, okay. Okay, I get it." And notably, she did not say, "Is there anything we can do?" She didn't say that because she knew, right? She knew it was time for me to go. So, I mean, I was happy to be there, and I'm happy to leave.
Yeah. How would you describe the state of sort of qual now as compared to when you started?
Well, I mean, the familiarity is much bigger than it was, I would say, which is good. We haven't talked about COVID and ethnography and how that's affected ethnography. And it really has. And I'm sure that has something to do with my sense of, like, exhaustion, I think. Because doing fieldwork is transformative. And nobody wants to do it anymore, because they think they can't, or don't need to. And again, you have to ask yourself: oh, this is a very convenient choice. Because it's hard, right? So I would say that it's good, but it's on the cusp. It has to get more forcefully out there, not just qualitative in-depth interviewing, but really emic-positioned qualitative research, which may involve fieldwork. You can't fool yourself into thinking that just doing Zoom-based interviews all the time is going to give you everything you want. So I'd say good things: the familiarity is much larger, people accept it, they know it's important, etc.
But they're not taking full advantage of the qual. They're trying to quantify the qual a lot of the time, which is a total waste, right? I had an interaction with a VP at my old company. I had done some research with him and his team, qualitative Zoom interviews, right? And I mean, I'm a veteran practitioner, so I was able to get good stuff out of these interviews. And I gave him a memo. And he's like, "Great." So he went to the C-suite with a proposal he was going to make based on the research that I had done and some other research that he had done, like competitive analysis, things like that. And he didn't tell me when he was going to the C-suite; he didn't offer me a look at his presentation. I didn't take it personally, but he shared it with me after. And I looked at it and I was like, "Oh." And I ran into him in the coffee area and was like, "Oh, thanks for sharing the slides." He goes, "Yeah, what did you think?" And I was like, "Well, a few things were wrong." And he's like, "Oh, what parts?" And I was like, "Pretty much all of it."
Oh, wow.
And he's like, "Oh, really?" And I was like, "Yeah. You know, we had a conversation about qual and the power of qual, right? We all know we need quant and qual." He's like, "Yeah, absolutely. You need to know the how and you need to know the how many." And I was like, "Exactly. You need both of those things. You took the qual stuff and you turned it into quant, which sucked it dry. If you had asked me before you went in, I would have coached you a little bit on how to present the qual in a way that is powerful and differentiating. You kind of lost some of the value." So I see that trend.
Yeah.
Which is a problem.
Yes. How do you, you write a lot about this. How do you elevate qual in the organization?
It's funny, because I think a lot of times as researchers, what we do is we privilege the methodological aspects of what we're doing. And if you take, in a meta point here, the emic position of your stakeholders, they don't care about that. This isn't religion for them, you know, the way it is for us, right? It's not something that they're super steadfast about. So, the etic position would be: the researcher defines the unit of analysis and the categories and the areas of interest, and you fit yourself into it. A survey that asks you something, gives you a set of answers, and you don't fit into those answers? That is etic right there.
Right.
You know, like, "How often do you drive your car?" And it doesn't ask whether you even own a car, right? That's the etic position. The emic position is: I'm going to start with you. You're going to tell me what's important.
Oh, I see.
You're going to be the person who is guiding the direction of what's interesting here. You're going to give me the area of importance, right? So if we as researchers take that meta point and say, our stakeholders are going to tell us what's important.
Yes.
As much as we love methodological discussions, they do not. So, one way that I have encouraged people very strongly to elevate the power of qual is to not really treat it as the power of qual, but instead to think about it as the inputs to strategy. Good strategy is not a quantitative thing. It is a qualitative thing, because it's basically unique value. How do you give a customer unique value? You can't find that out from a frigging survey or analytics. You have no idea what they find valuable. Even if you see their behaviors, you don't know what's valuable here. You don't understand anything. You're just a dummy, looking at patterns, and it's a lagging indicator. So much better to position yourself as: listen, I'm going to find out exactly what's valuable and why it's valuable, so that we can take advantage of that unmet need. And I can't do that quantitatively.
Yeah.
It's impossible.
Yes. That's beautiful. I identify with that a lot. I feel like I'm a research person who likes talking about research. I identify as a research person. But that totally makes it difficult. If I were to position myself as a strategist and talk strategy, it would be a whole other ballgame. And it's a simple positioning exercise, really.
Yeah.
It's a shift to just be like: stop talking about the research.
Yeah, as much as you love it. I mean, we do love it, and we can have those conversations with people who are interested. But I think a lot of people who do research are typically trained in social science, and they don't really want to call themselves, quote unquote, strategists, because that feels like something we're not, you know? Like, "Oh, that's a highfalutin name, and I have to be anointed with that, and I don't have an MBA, so how could I be that?" you know? And I think you can dispense with that, because strategy is not something you have to be anointed with. You can easily study it on your own.
There was a post you shared, too, where somebody claimed that Apple never did any research.
Oh, right. Yes.
Do you remember this assumption? And you had a beautiful quote about the difference, that it's not no research versus research. You said it's the result of extensive experience-driven intuition. And I've become sort of fixated on this idea of intuition as being what qualitative feeds. Right?
And yes, I would agree with that.
So I just wanted to hear you talk about the role of intuition and qualitative, especially given what you just said, like, I feel like I'm going to ramble a little bit here because it's interesting to me, but that the business world talks about data as the only sort of valid input to any sort of significant kind of decision-making.
Right.
And so, and when we say data, we basically mean quant.
Yeah. Often.
And then all qualitative input happens informally and nobody talks about it. There's no language for it. It's like a social silence, right? Nobody talks about this informal input of intuition into all decision-making, because they just weren't trained on it, or they don't know that it's there, or they just don't have the words for it.
I think you're right. They weren't trained on it. I think that's it. You know, I actually have a friend that I used to work with who's a software engineer. And he told me that before he worked with me, every single day, sitting in front of his keyboard coding, he would be faced with instances where he had to make decisions that had direct impact on what the user was going to experience, right? And he knows this; he knows that there's going to be an impact. Is this red or is this blue? Is this fast or is this slow? Whatever, you know. And he had to make decisions all the time. And he said he used to make them with what he called developer whim. What do I think? I don't know. Blah, blah, blah. Right?
And he would just hard code his whim into the feature and into the software. After he worked with me, he said he developed what he called developer intuition, which is different than whim. Now was it perfect, predictive, correct? No. And he acknowledged that. But what I gave him was enough contextual understanding of the user, who they are, what they value, what's important, what's terrible. I gave him these things in these little baby bird ways. Right. Often saying things like, "Oh, this is just a lunch and learn" or "This is just some fun content" or, "Oh, let's have a conversation." You know, never telling people directly that actually this is the research readout. You need to know this. I wouldn't do that because they would start talking about how many people did you talk to and all this stuff.
Right.
So I would give them contextual insight into who the user is and what they really value. So when he was sitting back at his keyboard, he could now actually make an intuitive judgment: "Oh, you know what? I don't have the exact answer for this particular question, but I have enough intuition that I feel like this could be the right answer. And it feels much better than the whim answer I would have inputted before." Right. And I think the really great product leaders in technology, CPG, financial services, they have this intuition, and they also have the humility to know that it's not always perfect. What they maybe don't realize is that intuition was probably in fact structured by a researcher who had a rigorous methodological approach to gathering the insight they needed. So they weren't just randomly picking up bits and pieces and throwing them together. Like they actually had somebody structuring the qual intuition they were developing. And, you know, Jeff Bezos even has a quote about this. He talks about this, and I hate the word that he uses, anecdotes, it drives me crazy. But he talked about how the best product owners have a finely honed intuition based on many anecdotes, which is the closest we're going to get to him saying qual data.
Yeah.
Anecdote is not the same as qual data, but even Jeff Bezos understood that there was a structuring to good qualitative research. And he could quickly discern product managers who had just randomly done things, with no decision-making whatsoever, and picked up tidbits, actual anecdotes, as opposed to qual data. But he doesn't use the words qual data, which annoys me. But then he says you can't get that from the averages of surveys. It's not there. So fine.
Why isn't it there?
Well, because methodologically it's impossible to get it from there. You can get a lay of the land from descriptive statistics. And it will tell you things that describe the people: they have these demographic characteristics, these levels of education, these geographic locations. But that doesn't tell you anything about who they actually are; you're going to make inferences based on that. And sometimes those inferences are well founded. So for example, if I tell you that the average education level of your customer base is a professional degree or graduate degree, we can make some really great assumptions about their health outcomes, their income, their divorce rates. These are proven, we know these things, right? But you're not going to be able to learn anything about their preferences, their stylistic choices, their aesthetic profile, their everyday behaviors. No, you won't know anything about those things.
There are some things you will know a lot about. I mean, I could probably tell you whether or not you're going to be divorced with three questions, right? Because the research has been done, but your product in that product area, nobody did that research because that's not general research. That's for you. You have to do that research.
Yeah. You have to create the structure where that develops intuition.
That's exactly right. That's exactly right. So I think it is a language issue. Like they just don't have the education about qual as a method. Like qualitative methods isn't something that most people realize is a thing. All they know is the scientific method. You know, they don't realize there's a whole other, you know, area of research. They don't get it.
Yeah. I'm curious about the distinction you're making between anecdote and qual data. I get excited to bring this up: there's a Freakonomics piece about the quote "the plural of anecdote is not data." It's a popular quote, and they demonstrate that there are something like a hundred times more citations for "the plural of anecdote is not data." And they fact-checked it and took it back to its source, a Stanford professor, who had been misquoted. He actually said the plural of anecdote is data, because what else would it be?
Isn't that funny? Oh, that's funny. Well, I mean, anecdote to me is, and I try to tell people this whenever they use the word anecdote interchangeably with qual data, anecdote is qualitative insight that's gathered without regard for comprehensiveness, theoretical understanding, representation, ethical positioning, or subject matter expertise. There's no filtering on how you get an anecdote. You get an anecdote at the gym, you get it at the grocery store, you get it from the coffee station, you get it from lots of places. But you can't tell me the difference between the anecdote you heard at the coffee station and the likelihood of you having heard something different.
Like you can't give me anything like that. You can't say, "Oh, well, the reason we went to the coffee station is because we actually study coffee. And I could have gone to the gym to do this, but nobody at the gym has opinions about coffee, and we wanted people who had informed opinions about coffee." That is not what people mean when they talk about anecdotes. They just randomly throw things in. They have no systematic method for selecting how they find that data. They have no systematic method for excluding certain kinds of data. They can't tell you why they've made any particular decision to include this or exclude that. Qualitative data, on the other hand, can very clearly tell you what is included and what is excluded and why, right?
We included young people between the ages of 19 and 24, and they had to be not enrolled in any kind of schooling. They had to have a part-time or full-time job. And we did that on purpose because we wanted people to have, you know, blah, blah, blah. So the reason I think I can say this about young people's attitude towards paid work is because we had this systematic investigation. Like we went from here down to here, and this is what the outcome is. And then I'm going to do a comparative analysis maybe with the opposite, right?
Yeah.
I don't think people realize that you can structure qualitative inquiry rigorously. They think rigor is sheer numbers, you know?
Yes.
It's not. It's procedure. It's procedure. Do you have a procedure? And if you went to the gym and collected this, and then maybe you went to the grocery store and you heard it later there, what was your procedure? There was no procedure.
Yeah. What is the value, in your definition, what is the value of an anecdote?
Like an anecdote that I got at the gym or I got at the grocery store?
Yeah.
I don't think there's a lot of value.
Yeah.
I don't think there's a lot of value. I can't compare it. I can't compare it to anything. I can't say the gym and the grocery store, you know, opinions about such and such are different because of X. Like I can't tell you that. So like what's the value of it?
Yeah.
The only common denominator is me wandering around in the world and randomly bumping my head into conversations. So this is just my bias. It's not real, you know?
I'm curious about this question that nobody has any language for qual, and how do you communicate the value of qual? I mean, what would you do if you had to rebrand qualitative? It's sort of a tacky question.
I know. I hear you.
Do you know what I mean?
Yeah. I do know what you mean. It's difficult because I'm a bit of a wonky person and I know not everybody is, right? Like, you know, people don't want to hear, well, you need to read X, Y, and Z and, you know, all of these books are important, right? The rebranding is different than developing familiarity. For researchers specifically, I often say if you can't articulate these things in a way that you feel comfortable doing on a regular basis, you need to kind of do some reading and get some language. You need to understand, you know, because people, you know, the leaders in the qualitative research field, methodologists have articulated these differences and they have clearly spelled out, you know, why qual is different, how it's different, how you should talk about it to other academics, granted. But if you don't have that, you should do that as researchers. But for non-researchers who are consumers of this and, like, this is the rebranding question, like I said, don't get hung up on the religion of the methodological differences.
Instead, talk about the very real awareness people have of where things have fallen short for them in the past. Like, "We tried this survey and it told us we should do this, and everybody said that they liked it, but we completely missed the mark." And it's like, okay, well, do you know why you missed the mark? I think it's because you didn't have enough contextual understanding. You didn't have a why. You didn't start with a why. And it's messy and annoying, but we need to start with the why. What is value? You cannot discover value from a quantitative survey because it's etic, right? Value is emic. It's emically defined. What's valuable to you is valuable to you. So I need to find out what you think is valuable. How am I going to do that from a survey? I can't. I'm going to make all sorts of assumptions. I'm going to put you in little boxes and you're going to go, "Well, it's not really that, but you don't give me a choice, so I'll stick it in there," you know?
Yeah. That's beautiful. Do you want to talk a little bit about either your book or strategic foresight, like just sort of what your own practice?
Sure. I'll tell you a little bit about the book. The book is about strategic foresight and how to practice it in the organizational context.
Nice.
So there are a lot of books on strategic foresight that talk about how to do it. And they range on a spectrum from very academic to very practical. So there are sources out there that help people. What I think is missing is that doing that kind of work inside an organization, inside a particular organizational culture, always proves to be so much more difficult. And nobody is telling practitioners, or would-be practitioners, why. They're not giving them the tools to understand why it's so difficult.
So that's kind of the extra wrapping I'm going to put around the practice. How to do it, definitely. How to do it effectively with current technology and tools that you can use that, you know, increase your productivity significantly. But when push comes to shove, you're going to be facing allergic reactions from your organization. And you need to understand why and what they really indicate. They don't mean your work is not valuable. They do mean that individuals have anxieties and individuals aggregated up to organizational culture equals obstacles, organizational traps. So what are those traps? How do they function?
And a lot of it has to do with temporal bias. A lot of it has to do with our ability and inability to think about the future concretely. The further out something is psychologically from us, the harder it is for us to understand it and the more abstractly we talk about it. So that's both psychological distance and temporal distance. The further out we go in time, the less tangible and concrete the thing is to us and to the way we can describe it. This is a known problem. This is a psychological issue that our puny little human brains struggle with. Multiply that by 5,000, 10,000, however many people work in your organization, and you're going to see that it's almost impossible to move that.
Most of these foresight books don't talk about that. They don't talk about the organizational challenges. So if you don't understand the organizational challenges and you follow the generic foresight process, it probably won't work. And you're going to say, "Well, it's because I wasn't rigorous enough, or I wasn't fast enough," or whatever it was. And no, that's not it at all. There are other reasons. So the wrapping of the organizational context is where my book is going to give something unique.
Wow. Beautiful. Sounds amazing.
Well, I have to write it. I haven't written it yet.
And then do you want to give a shout out about your newsletter and what you're doing there?
Sure. The newsletter comes out every Tuesday, except not next Tuesday, because I will be on vacation in the Hudson Valley. I write about foresight and strategic foresight and how to do it. It's very short. I'm trying to be ruthless about how short it is.
So good.
Are you reading it? Great.
Yeah. And it's full of links. I mean, not to stomp all over your description of your own work, but yeah, I think they're great.
Awesome. That's great. I try to make it ruthlessly short. I'm also editing, editing, editing, self-editing all the time. So what I'm trying to do is also put that organizational-context wrapper around the practice. I'm trying to give that subtly each time, and give people really practical things that they can try. They may not be able to solve all the problems by trying one of those things, but little, approachable bites toward bigger problems. That's what it's designed to do.
I guess I have one last big question, which is about AI and synthetic users. I feel like the arrival of this stuff is existential in a way. And it seems to really call for making the case for qualitative, for what the value actually is. Do you feel that as well?
I do. Yeah, I do.
What's your forecast? When you look ahead and you think about, I mean, your point about synthetic users is fantastic. That's exactly what the machine wants, right? So I don't have to talk to people.
Yeah. Problem solved. Problem solved. I'm actually quite optimistic. Like I said earlier, I've been expecting the fifth AI winter to arrive. And I think it's arriving right now, which surprised me. I thought it would take a little bit longer. You're already seeing it in certain stock prices, which are a way of looking at prediction markets. So people are recognizing that the hype is overwrought. So I am actually quite optimistic.
What I don't want to see is individual qualitative researchers throwing up their hands and saying, "There's nothing I can do. I'm being replaced." Have a little more faith in yourself, you know. Work with the context, right. Be curious about what's going on. And you'd be surprised: if you're not part of the hype cycle but you're curious about the actual potential embedded in what's going on, you'll be early. People won't see what you see right away, but they'll eventually see it, because the hype cycle is going to crash and it's already started. So don't count yourself out.
I really appreciate you accepting the invitation. And this was, I just had a lot of fun with this conversation. So thank you so much.
Oh, it's nice to spend time with you.
Nice. Enjoy the Hudson Valley.
Thank you. You too.
Bye.