The Exchange

May 7, 2026

Synthetic Hype vs. Reality: Where AI in Insights Is Actually Heading

In episode 127 of The Exchange, Karen Lynch and Lenny Murphy unpack synthetic data, AI limits, and the shifts shaping how insights actually get used.


Check out the full episode below! Enjoy The Exchange? Don't forget to tune in live Friday at 12 pm EST on the Greenbook LinkedIn and YouTube channels!

 

In episode 127 of The Exchange, Karen Lynch and Lenny Murphy unpack what’s really happening beneath the surface, from synthetic data hype to the quiet shifts reshaping how insights actually get used. The team breaks down why synthetic respondents can’t replace fresh research, how major players are repositioning as data hubs, and where AI starts to fall apart outside its training.

If you’re trying to make sense of where the industry is heading and what actually matters versus what’s noise, this one gives you a clear, unfiltered view.

Many thanks to our producer, Karley Dartouzos.  

Use code EXCHANGE to get a 15% discount on your general admission IIEX tickets!

IIEX North America

IIEX Europe

IIEX.AI 

Stay Ahead of the Curve! Subscribe to The Exchange Newsletter on LinkedIn Today! 

Transcript

Lenny Murphy: Had Emily. And there we go. We're live.

Karen Lynch: Even then, there were two extra events. So let's be real.

Lenny Murphy: What an upset. Guys, we were talking. Karen was just venting, because it's event season, which is probably the segue, Karley.

Karen Lynch: It is the segue, Karley. North America. So the calendar hit April 1st. And I reached out to Bridget, who handles all of the speaker admin, onboarding, all of that administrative stuff. And I'm like, oh no, it's April 1st. And she's like, I know, because that just means that it is the month of IIEX North America, which means we're both incredibly excited and at the same time, incredibly overwhelmed. So, uh, you know, we have, we have such a good program this year. And I know I say that every year, but this year I'm really excited. Uh, 50% of our spots feature brands. So that's really just such great news. We have such good brands coming. I just, yeah, it's going to be a great show. I'm sorry you will not be joining us, Lenny, because it's going to be a good one.

Lenny Murphy: Well, and, you know, a guest appearance in the city by King Charles.

Karen Lynch: Well, shout out to Matt Gibbs overseas. He chairs for us each year, and he does such a good job as a chair. And he's like, so did you hear? And I'm like, of course he would be the one to tell us the news that his king is going to be here in the city. And I'm like, of course King Charles will be there. So there's that feeling of, oh, cool, it's exciting. And then that feeling of, oh, no, the gridlock. So for those of you coming to North America, a public service announcement: don't delay your arrival on day one, because your Uber may have trouble getting around the city. If King Charles is addressing Congress that day, there will be roads closed, there will be security. That's a big deal. So do not delay. Set your alarm early, get there. We also have security in our building. So plan ahead. It's going to be an exciting buzz.

Lenny Murphy: So, um, it will be. All right. So, uh, welcome back. Uh, I, I missed you. Uh, I'm sure the audience, especially after last week of me flying solo, they're probably like, Oh my God, Karen.

Karen Lynch: I've already apologized to Lenny last week. It wasn't even a sick day. Cause I started off, I was like, I'm going to be able to do this. And like, I had had an early morning call that I literally like went like this during the call and I put my head down and I'm like, cannot be on the phone. I developed a fever during the course of the night and I was like, I just can't do this. I can't do this. My brain just, so I was hit with another virus. And the week before that, of course, I was out in Seattle at X4. I missed it too though, Lenny. I missed it too. I missed talking to you. I missed doing the show, but also just kind of catching up with you. Because then when I saw you Wednesday, I was like, oh, hey. Oh, hey. Good to see you. Too long.

Lenny Murphy: Well, so one thing that wasn't on the agenda, but if you're OK with it, let's take a minute and talk about your takeaway from X4. Because obviously, that was a big topic afterwards when the announcements came through around Press Ganey, the funding, et cetera, et cetera.

Karen Lynch: So what's interesting about that, the funding, is that that wasn't the buzz while we were there. Now, maybe some of the investors tapped into it, because it happened on the tail end of the week, right? So maybe some of them, I'm kind of in the category of media and analysts. I'm on the media side, right? But we are all kind of together. The analysts, when we first got there, the day of the briefing, the first day, that wasn't even really the buzz. So maybe by the end, things had shifted. So the media, we were all still in everything that we learned. And then, you know, kind of came home and I was like, oh, that's interesting, because that wasn't a part of the conversation we were having there. You know, not surprisingly. So, you know, my takeaway, and I go to that event with the lens of what can I bring back to insights professionals, specifically, you know, I'm not in the I'm not serving the investor audience when I do what we do here, which kind of big picture, but, you know, It was really about, let's really close this gap of the insights we collect into action, informing decision makers like never before. I mean, moving to action, informing decision making happening at the highest level. But also, I'll write specifically about synthetic research. I had a lot of great conversations about the use of synthetic data, their use cases for it, and their use cases against it. Which, you know, like I thought was really refreshing, actually, to kind of talk to an organization that, you know, does such quantity of research for their CX purposes, and also in their research kind of division, to really get this refreshing point of view about what it's not useful for, and kind of looking at that way, which I thought, anyway, we can talk more about that specifically. But also the reality is, and I kind of left there feeling really hopeful for the industry, which is synthetic respondents. 
If you will, synthetic data sets are only good up to the moment they're trained. So we have to continue to do research all the time to make sure that the synthetic data sets are up to date. And therefore it's actually a really good signal for the industry to just kind of anchor that fact, because that research is going to have very specific use cases. It's really good for some of that, you know, pre-market testing or, you know, quick A/B testing, maybe really good for things that are not the big-budget research, you know. And that's great. And it will have this new use case. But it is not necessarily current, not now, not what's going to happen a month from now. People change because the context around them changes. The context that you and I are having this conversation in changes. And the LLMs don't understand the context of the now, for lack of a better word. So anyway, the role of the researcher. And again, Karley, I don't know if you want to share the article, the first kind of article coming out of it that I wrote. I didn't share it in the brief, but I did publish it on our site, the new questions facing insights leaders. The one thing in there, and the last thing I'll say on this, is the role of the researcher, which is really, for all of the researchers listening, whether you're on the brand side or whether you work for an organization, your role is really discernment like never before. Understanding these methodologies more than ever. You know, I don't always do this, but you need to get to IIEX to see the vendors in the space so that you can make thoughtful recommendations for which vendors for which methodologies like never before. Because it is more essential than ever that you are equipped with that decision-making ability, because the researcher is going to be relied upon to discern when to trust AI and when not to, and when to lean on synthetic, and when not to.
The ability to say that, like you used to be able to say qual versus quant, you need to get proficient in that. That needs to become system one thinking, right? Like you need to be able to say, you know, synthetic versus, you know, fresh, or AI-moderated qual versus human-moderated qual. Like you need to get fast in that decision making. And the only way you do that is by immersing yourself in all of what's current. So anyway, that's my real, big picture takeaway. So I think it's really helpful for the industry. What I saw, very exciting, redefined role for insights professionals. Hopeful, as long as you understand that we've just got new stuff on the shelf, friends.

Lenny Murphy: I wasn't even there, but agreed. But those themes keep coming up, right? We are the transition point, the mingling between the orchestration layer and judgment layer, the human in the loop, right? But nothing works without that judgment, discernment, you know, human in the loop. So, yeah.

Karen Lynch: There was a gentleman I talked to, Jordan Harper. He is probably one of the smartest men I've ever talked to. Principal AI Thought Leader, I think is his exact title. Don't hold me to that. You know, I kind of said to him, when he took the job last year, I was like, you know, Principal AI Thought Leader, that's your title. You signed up for that a year ago. Like, is your title going to be obsolete, you know, anytime soon? Like, hanging your hat on the AI title feels significant. And he said, look, this is going to be a 10-to-15-year disruption. This is not a short-term disruption. This disruption will span probably the rest of his career. And I just thought, also interesting for those of us wondering, is this a blip? No, we've been talking about it for three years already, right? So the reality is, we will be navigating this disruption. It will start to become more comfortable. It will start to become more familiar, but this is the way of business for the foreseeable future.

Lenny Murphy: 100%, like anything else, the industrial revolution, right? That was an ongoing thing. The introduction of electricity, the automotive, whatever, right? When you flip into that new paradigm that runs, right? I mean, we may feel like, because we're in the middle of this, of the flip, that it is some type of just event, but it is not. It is an epoch. So yeah. Well, that's probably, our audience should not be surprised on the common themes running throughout the news this week. But it continues, right, from that standpoint, signals of the transition of the flip.

Karen Lynch: Do you want to kick us off? No, you go ahead. I don't know anything about these two firms. You go ahead and get us started.

Lenny Murphy: Norstat, they've been on a tear, acquiring and aggregating panels. They continued that. They acquired the Swiss research firm Demoscope, adding qual and quant capabilities, plus a 60,000-respondent consumer panel in Switzerland. And Norstat, they started as a kind of Northern European panel company, and they continue to expand now, acquiring more and more assets, building a very diversified, um, uh, access to humans. And I think that's the takeaway for a lot of what we'll talk about today. It's, to your point, the access to humans to share information, regardless of the use case, right? Building synthetic data sets, answering surveys, participating in qualitative studies, you know, AI training. It doesn't matter; if you have access to high-quality humans, then you own an oil well, right? It's incredibly valuable, could do lots of different things, and Norstat seems to be doing that.

Karen Lynch: So, Mirror Voice. I want to talk about this access to humans, because Mirror Voice came out of stealth with $6.3 million in seed funding to scale AI-native phone surveys. And when you read it, sure enough, it's like, this is an investment in phone surveys. And I just sat there and I'm like, wow. I don't know what that is, but that to me seems significant, Lenny, that we're investing in the phone. And I just read this one this morning, so I really didn't have time to kind of think about why this feels significant. So I'm glad we're going to unpack it a little bit. But investment in phone surveys, like gone are the days of spam? Is this because of data quality? Is this because that is linked to a human being? What are we?

Lenny Murphy: It's interesting. So yeah, let's talk about it for a minute. There's angles here that are not clear in this article, but the form factor of voice, we've talked about that before. So at first blush, OK, they're just going to an AI voice as the form factor for engaging. OK, fine. No big deal. I get that. But there are significant complexities, particularly from a legislative standpoint, around telephone sampling, particularly with automation. I grew up in this; my second job in research was with an IVR company, right? Yeah, so even back then, you know, in the Stone Age, there were rules for robo-dialing, right? You know, a human must be involved for it to be legal. So it'll be interesting to see how this scales.

Karen Lynch: Yeah, so if it's AI-moderated, which means it's a chatbot, or maybe it's AI-initiated. But I have to think, and again, this wasn't in the article, friends, so Mirror Voice, if you folks are listening, because we'll tag you in the post follow-up: what's the play here that warrants the phone survey aspect? Because I would think that maybe it has a connection that is also a data quality play. That's just the only thing I can think, that it might be connected to a real person, not a bot. If you have a phone number, you are a human being. And therefore maybe, I don't know, maybe that's, maybe I don't know.

Lenny Murphy: I, you know, they're going to the phone, et cetera, et cetera. You know, the gold standard from a validity standpoint still remains random digit dialing telephone. However, it's expensive. Um, engagement rates stink, blah, blah, there's lots of systemic issues against it, you know, and yet they got some money. So, right. So, to Mirror Voice, we'd like to know more, we want to know more. Because it's very cool to go back to a kind of foundational methodology of the industry that we shifted away from, right? And okay, what's the efficiency of an AI-moderated interview? I get that. But what is the use case, particularly, setting aside the form factor, in how to get through to people? Because that is a barrier. Telephone research, unless you have a relationship with the respondent, is, I can't say illegal, but pretty damn close. I mean, it's a big pain in the butt.

Karen Lynch: I feel like, again, there's something about this that's making us say, hmm, right? When you get a telephone call from, you know, a person telling you something or other, and you don't want to deal with it, you're like, no, thank you. And you hang up on this poor person who called you to ask for money or ask if you want to participate in the survey. You just hang up on them really quickly. If it's a bot, you know what I mean? Like, is it less offensive? Like, what is the behavior of the user? Is there something to that? Did early results show that an AI-initiated call actually does better? I don't know.

Lenny Murphy: Back in the IVR days, there were extensive amounts of study that showed that it was a great methodology for sensitive topics. People would talk about things that they wouldn't normally. I remember doing a study on, uh, sexual behavior, and I'll leave it at that, for the government. And people would talk to the IVR. They would not talk to a human. But the barrier still remains: my general understanding is that an automated system cannot initiate a call to a human without permission. That's kind of the fundamental component. So even though it happens all the time, we'll see. Mirror Voice, reach out to us, but pay attention, right?

Karen Lynch: There's something about this, yeah. There's something about this. If Lenny and I have spider senses going up, you know there's something. All right, I don't know who Omniscient is, but they closed a $4.1 million pre-seed round to expand their decision intelligence platform for boards and companies. Here's what I like about this. If you read this brief, I don't know if you have thoughts too, so I'm kind of cutting you off and jumping right to it. But I liked the idea that what they've done is AI agents each covering a specific domain. I don't know if you saw that in there, but it's like, you know, there's one that's going out and covering supply chain, and there's one that's going out there and covering regulation, and one that's going out there and covering competition. And I thought, I want this. I literally, at our level, I'm like, I want the dashboard. This is like product development, friends. Like I want the dashboard that has the AI agents in the different categories that we work in. You know, I want the CPG brief. I want the, you know, financial brief. I want all of that in a dashboard bringing the relevant information to us. I'm like, I can see this use case. It's excellent.

Lenny Murphy: And it's safe.

Karen Lynch: It's not complicated.

Lenny Murphy: It's not.

Karen Lynch: So I, I, we each have like things, but I'm telling you like, do you have a dashboard yet?

Lenny Murphy: Oh, I did. I built a dashboard. It's just me, but I built it in Perplexity, using Perplexity's Comet, to have a real-time update. I created a whole dashboard for GRIT. So I'm waiting for Lukasz and Nelson to sign off on it. You can create anything in Perplexity now.

Karen Lynch: Yeah, yeah.

Lenny Murphy: Comet runs multiple models. So it is better.

Karen Lynch: I see that. Now, I haven't dabbled with Comet.

Lenny Murphy: Comet is running every model out there. Claude is doing most of the calls, doing most of the heavy lifting. I have built like eight things in the past two weeks. But to your point, that's where I was going.

Karen Lynch: No, I just wanted what I want, because I have agents. I have, I've called them people, friends, don't get nervous. And they're coming into a feed, but I don't have a dashboard. That's what I want. I want to log in and see this stuff.

Lenny Murphy: You absolutely can. And it's so easy. So no, no shade on Omniscient. I'm sure there's a lot more to it. But the idea, the use case that they tapped into, is: look, these people want to see information pulled into one centralized location with the functionality to, okay, now can I query it? What are the applications, et cetera, et cetera? Yes, they've built that. We can do that too, for that very purpose, pull everything together. I'll throw this out real quick before we move on: I actually would like, if I find the time, to build that for us for The Exchange.

Karen Lynch: Yeah, yeah, yeah, let's do it. Why can't we?

Lenny Murphy: There's no reason that we cannot. So guys, we're gonna try and get more efficient, because right now we just like to pull stuff and pop it into a Slack channel.

Karen Lynch: Imagine if in real time we're sitting here and then our dashboard gives us an update and we can give a real time update.

Lenny Murphy: Well, so I enabled Comet in my Slack. I have not done anything with it yet. But if I just find the time to do it, it could pull everything out of our Slack channel and build a dashboard from it if we wanted to, and it could be accessible within our Slack channel. Anyway, all right, we digress.

Karen Lynch: The dashboard is just something we can both log into and we don't even need to use the Slack channel.

Lenny Murphy: I'll send you the link to the dashboards I did just so you can see it.

Karen Lynch: We digress, friends. We digress. When I say Lenny and I haven't talked in two weeks, this is what happens, right? Let's go.

Lenny Murphy: And NIQ, all types of stuff with NIQ. They've had a long-time relationship with MRI-Simmons, but now they've joined Comscore's data partner network. And I think that's what set me off. There's the Nielsen cloud. Now there's the Comscore cloud, right? So it's the same principle we just talked about. These large players are now becoming hubs. They're becoming synthesis hubs, data hubs, connected. If you're not a Nielsen customer, but you're a Comscore customer, you can access the same data, which is interesting, that Nielsen and Comscore, who are really competitors, are now sharing information.

Karen Lynch: Yeah, I think that is very interesting.

Lenny Murphy: It is, because it's the data that is the value producer. So they're expanding their market penetration with their data, because that's what they make money off of. So it's a volume play. So very interesting to see that happen. Of course, they also have within that a whole other predictive technology called Proximic, which is around contextual audiences. So it's around ad targeting. So that was interesting. SurveyMonkey added Salesforce-based automated SMS survey invites. So SurveyMonkey is trying to become more integrated into the CRM world.

Karen Lynch: I don't believe so. The thing that's interesting about this is the idea that CX or support teams or whomever could send out surveys quickly based on Salesforce updates.

Lenny Murphy: In SMS.

Karen Lynch: So yeah. So, but what's interesting is it really is so dependent upon the user updating, you know, um, but you know, maybe, uh, maybe the users are really good about that. Maybe there's, you know, all the integrations in play that as soon as, you know, the phone is hung up or whatever the call, you know, that the agentic workforce, you know, takes over and updates Salesforce. So, you know, that, like, that's interesting. I'm sure it's automation at work. So really cool.

Lenny Murphy: I get SMS text invites all the time after an interaction with a brand or a company, you know. So yeah, doing that, why wouldn't you?

Karen Lynch: Good stuff. Well, the first two are, you know, kind of Greenbook partners that both got a mention this past week in the news. So Panoply, which was previously Glimpse, they were founded in 2021 as Glimpse, they've become Panoply, and they ranked number five in Inc.'s 2026 Regionals Northeast list as an AI-powered research platform. They're getting a shout-out because they are in the top five of the Insights Innovation Competition winners for IIEX North America. So another reason to come to IIEX North America: if you want to kind of see why they're ranking number five regionally, let's check them out on stage, see what they're doing. Because they're one of the finalists, and you'll be able to see what they've cooked up and how they got this honor. So, you know, hats off to them. And Bellomy has landed at number 77 on Fortune's 2026 Most Innovative Companies list. So, you know, one of them is Inc. magazine, the other one is Fortune magazine. Like, no small accomplishments.

Lenny Murphy: So, yeah. Shout out to John Sessions and the team at Bellomy, right? Because they're a really good example of a company, Bellomy's been around forever. You know, they're an established player in the industry and they keep innovating.

Karen Lynch: And they mentioned in their press release about it, like they have a commitment to, you know, their ongoing commitment to innovation. And I think that that's one of the things is like, um, We say all the time, you have to commit to this. You have to commit to being an innovative company. You can't just say, we're an innovative company. Are you? Because tell me what you're doing to think innovatively, to explore different innovations, to, I don't know, tell me what you're doing that's making you innovative, other than just trying to slap a label on yourself, because that is not it. Yep, yep.

Lenny Murphy: And, just shout out to them, it's a great example: they built the resources internally to be innovative. They invested in technology. They have smart people that are not wedded to a specific method or business model. They've kept pushing it to say, how do we deploy this and build the resources internally to do that? And they're not private equity backed. They're not VC funded. They've done that through sheer organic growth, you know, this is important for the long-term sustainability of the business. So it's very impressive. All right. Aaru. Guys, I really want to talk to you, Aaru. I'm really doing this because this interview on CNBC was a great interview, and the interviewers were very... you're all 21, 20, and 17. A billion-dollar valuation, a hundred-million-dollar raise. But I'll tell you what, these, I'm sorry, I almost said kids. I shouldn't have. My apologies. I'm 55, years older than you, several children older than you. And it blows my mind, because this is their second company. They'd already built other companies while they were in their teens. And now they're launching this. So one, hats off. My God, overachievers. I feel like such a slacker. And they're so poised in that interview. They came across so well. I do not believe that you are actually your age. You got plastic surgery or something. You're like 40, just masquerading as 20-somethings.

Karen Lynch: Getting on CNBC is no small thing.

Lenny Murphy: And it was a long segment. This wasn't like a little, you know, fluff piece. It was like 15 minutes. So it was a long conversation about the synthetic sample application in their model, with a prediction market component in there as well, to an extent. You know, they're moving and shaking.

Karen Lynch: So I love moving and shaking.

Lenny Murphy: And again, hats off, but apologies if I come across as an old man, I cannot help it. I am so impressed. Yeah, they're impressive. I think they are impressive. And I am jealous as well. So, all right. A whole bunch of new product features.

Karen Lynch: You want to, uh, yeah. Well, you know, we're talking about NIQ. They've also, in addition to the other things they're doing, they have this Ask Arthur chat. It's interesting, because these two product features, this one and the next, aren't earth-shattering, right? But anyway, AI-powered access to consumer insights across its data assets, you know, where a client can have kind of a conversational AI with your data, basically, and pull up your insights there. So I think it's like, yes, that needs to be happening. So, you know, kudos. And I think the same thing is going on with AskNicely, which I've never heard of. They've launched AskNicely AI, to turn their customer feedback into instant insights and then also action plans, driving revenue growth, blah, blah, blah. But the whole idea, yes: if you collect data and have insights at your disposal, you sure as heck are developing conversational AI tools to access that data and make it accessible to your clients. That doesn't even seem like... Isn't that what everybody does now? Like, it shouldn't even be something you haven't started developing yet if you are a data company.

Lenny Murphy: Yeah, 100%. And turn that into revenue growth, right? I mean, so there's an objective in doing that. And on that, let's take a second on this one: YouScan launches Tiger Finder, a TikTok influencer discovery tool, finding creators by niche, audience, and engagement. It dawned on me this week, in a way that I hadn't framed it before: AI is the new influencer channel. Recommendations, search, et cetera. So now when we think about marketing spend, there already exist the traditional social influencer channels, podcasts, shows, et cetera, et cetera. Now there's this whole other channel that is AI recommendations that disrupts traditional search. So this idea of being able to identify influencers, whether it is across the board, right, or wherever that node of influence is, is becoming increasingly important, just as it is to understand traditional advertising spend. And now we're seeing, yeah, okay, they're AI-ing it to make it easier to identify. Oh, hey.

Karen Lynch: On our podcast we had Nikki Quast. She's with Microsoft, but she is also on TikTok, Data Driven Nikki. Shout out to Nikki, you can find her on TikTok. And somebody on our team was like, hey, have you all seen Data Driven Nikki? I guess she talked about the GRIT report on TikTok. And when I'm on TikTok, you don't even wanna know what I'm on TikTok for, but just know it's all personal. And my tribes are out there, but it is largely informed by things my children share with me, or my friends share with me, which is like nonsense, outside of professionalism, nonsense. So I was like, no, I did not come across Data Driven Nikki. And so anyway, we have this podcast episode, Karley can probably find that too, because she was involved with that. Sorry, Karley, that's the second thing I'm throwing at you. But anyway, the point was, she is out here talking about insights and sharing it on YouTube, but a tool like this would allow us, so when I pulled this up, I was like, you know what? Are there others that I don't know about that are actually talking about insights in such a way that they are influential? Now, I can tell you that at Qualtrics, there are influencers, CX influencers, and that is their role and title. I've met them, and their whole job is just really to talk about it, because they have, you know, quite the following. Some of them have a larger following than Greenbook does, just in the CX space. So it's really interesting to me from a B2B standpoint. It's obvious you want to know who's influencing, you know, energy drinks or, you know, makeup brands or whatever, like the CPG space. It's really obvious. But when you start to get into the weeds of B2B influencers, suddenly it changes and shifts. I'm like, I'm very curious about all that.

Lenny Murphy: So it is influence. Lukasz and I had this debate many, many years ago. Somebody did an independent analysis on, you know, the industry, and Tom De Ruyck was the most well-known, had the largest audience, and I was the most influential. And this was a decade ago. The point is, I was like, I'll take that all day long, because influence drives decision-making. Not notoriety. That's different. That's brand awareness. And that's important. But as we think about this entire marketing process, I just use that as an example that, you know, don't ignore influencers, guys. It's a data feed that we need to have as we're looking at trying to help brands be successful. Yeah, because by the very nature of influence, it drives your perception and your action. So cool to see companies coming out doing that. Well, and speaking of that: ChatGPT, $100 million in ad revenue, plans to open self-serve ads by April. If I recall correctly, they just did that, like, yesterday?

Karen Lynch: They've surpassed a hundred million in ad revenue in just six weeks. And that's from less than 20% of eligible users. So, I mean, this means a couple of things, right?

Lenny Murphy: So self-serve means, I bet, you create it right there.

Karen Lynch: And I can create it, create my ad, let's go. Like, that's what we're talking about. I mean, spoiler alert, we're not going to do this, but, you know, we create an ad for Greenbook, and we build it in ChatGPT. And the next thing you know, we put it out there and we see what kind of revenue it brings in.

Lenny Murphy: If you have an audience, a user base, they can be monetized. That goes to the next question, or the next one: Meta, which, by the way, inspired me to try to build this for us at Greenbook too. Meta's internal AI analytics agent takes this approach of cookbooks. Cookbooks, recipes, ingredients, which is basically how they're thinking about structuring the agents to leverage internal information to automate data analysis. And Meta's primary revenue stream is advertising. They need to understand all of this information so they can optimize their advertising effectiveness, which ultimately means people on Meta platforms watch an ad, click a link, go buy something. It's in the chat. So any of these platforms, that is what they are ultimately leveraging. Very interesting.

Karen Lynch: It's very interesting. And I mean, certainly the TikTok Shop is successful. We've talked about that, right? When you think about social media buying, you know, I've said I don't want to buy from TikTok Shop, but I have my husband as evidence: you see something, and it's made easy for you, and you're like, oh, the algorithm says I would like this. Guess what? It was right, and now I'm spending some money if it's the right price point. And that is a very unnerving system as a consumer. So we're headed in a very interesting direction there for commerce, because really, if that's the way I'm shopping, if I'm using Meta or ChatGPT, or even TikTok to that degree, for buying decisions, then guess what? I'm not shopping in traditional channels, because I'm like, I've done enough.

Lenny Murphy: And that's a whole, we can have a whole show talking about this, this issue.

Karen Lynch: We can have a whole show on the next set of these, yes, basically. So here's where we're crunched, folks, because we've already been talking for 40 minutes, but we have some amazing reads to share with you. And we segmented them, and I just don't want to cut them short, Lenny, because there are some important things in here. So let's run through them. Oh my gosh. So, you know, Karine Pepin, we talk about her a lot. She's been very active, very active. She's actually got two big LinkedIn posts in the last week. One is a post arguing that survey quality problems are still driven primarily by bad actors and bad sourcing rather than AI agents at scale. So debate is happening over there. Her bottom line is that AI agents aren't as widespread as people think, human quality is still very much the issue, and thankfully we still have some control over it. So if you want to have data quality conversations online, just tag her. She is fun to have them with. There is nobody more passionate about data quality issues online right now than Karine. So yeah.

Lenny Murphy: Oh, I'll also shout out Brandon Olish at Stanley Black & Decker.

Karen Lynch: Who I will be interviewing on stage at IIEX North America.

Lenny Murphy: And I'm going to have him on the CEO series, even though he's not a CEO, because he reached out.

Karen Lynch: I did not play that episode. I did not listen to it, but maybe it has happened. Let me dial that back and see.

Lenny Murphy: That's what we've been talking about, this topic, right? And I didn't get to that yet because he's in the, um, he's in the thing for that.

Karen Lynch: Just do it on the Greenbook Podcast. Don't mess with the CEO series formula. Let's interview him for the podcast. But yeah, he's on stage. He's doing a plenary session at IIEX North America. Rachel, we got it. We got to talk. We can't do these things without talking.

Lenny Murphy: You're right. So sorry, I shouldn't have even brought that up. I think we should talk about the topics that he and Karine tag team a lot, as, you know, a brand-side leader doing the same stuff. So yeah, great posts there. I didn't read the YouGov response to Karine's other post. So what was that about?

Karen Lynch: So Karine also shared that. OK, so there's a survey fraud case that involves church attendance.

Lenny Murphy: And YouGov had to withdraw the study, had to say oops, right?

Karen Lynch: So very interesting. And I mean, I had seen it. Susan Griffin, you know, she shared it with us also. Human error, missed some controls there. But what was interesting, if you scroll her thread, Doug Finley, who's at iGenie now, wrote, and I wrote this down because I liked it: YouGov knows what they're doing. They have their own panel. This puts them in a better position than a lot of others. So it's like, yes. But, you know, pay attention to that concept: they are experts, and this happened to them, right? Their response was basically to explain what happened methodologically, and then they're like, we'll carry out an updated study later in the year. So they held themselves accountable. They were transparent about it. They'll make it right, but, you know, it happened. It happened.

Lenny Murphy: Yeah, yeah. And I mean, it got a lot of buzz, and we should probably recognize that too. In this era, that's the defining dynamic: if the data's bad but it gets public attention, and then you go back and correct it, nobody sees the correction.

Karen Lynch: I know, I know.

Lenny Murphy: It is that first thing that comes out. It is that hot take. That is what people embed in their minds, and they will go forth for the rest of their life thinking that was true, because they will never see the correction. It drives me crazy with all types of stuff, right? It happens.

Karen Lynch: And I think that's part of it. You know, they do a lot of human interest studies, right? Things that are just interesting to people, so they're going to get eyeballs on them. And I think that's part of the nature of that kind of a game. There are a lot of studies that human beings aren't necessarily paying attention to, but something that has to do with just how humans behave in the world, and certainly church attendance might be one of those things that people are curious about. So yeah, eyeballs are on them. I'm sorry it happened to them. At least they held themselves accountable and were transparent. They were like, yeah, it happened. So anyway, check it all out. It's pretty interesting. So I did not get to Daniel Gilbert's piece. Why don't you talk about that one? Because literally, I still haven't pulled it up. I just haven't gotten there.

Lenny Murphy: It goes back to the thing that we talked about earlier, the creative dividend, basically our role as orchestrators, judgment, et cetera, et cetera, but it dives deeper into all of that: the evolving role of the researcher, and the skill sets that we deploy as researchers that are not necessarily attached to the method but attached to our judgment. So good stuff. Yeah, I've really been enjoying Kristi Zuhlke's posts. If you're not following Kristi, she is showing how she's experimenting with AI every week and basically making tutorials. And for those who don't know Kristi, she began at P&G and then she launched, I'm so sorry, Kristi, I'm so sorry, KnowledgeHound. KnowledgeHound. So she's been on both sides and just knocked it out, showing, hey, as a solo practitioner, because that's kind of what she is now, I'm doing all types of really cool stuff with these tools, and showing how to do it. So pay attention to what Kristi does. This one was her defense.

Karen Lynch: Well, what's interesting in this, again, another LinkedIn post, but there's a thought in here I'm going to read: surveys never predicted human behavior perfectly. We knew that. We built workarounds for it. We applied adjustments, basically. We accepted 80 to 85% re-interview consistency rates. We had an acceptance that it wasn't perfect. We built an entire industry around it anyway, because directional insight has value. And I was like, oh damn, that's the mic drop right there. It's never been perfect. What the heck? It never has been perfect. We knew it was never gonna be perfect. We never expected 100% accuracy from our survey results, that would be nonsensical. Why are we expecting it now? Anyway, I was like, damn, that is probably the most honest and accurate thing I have heard about the whole synthetic debate.

Lenny Murphy: So hats off. I loved it. Hats off. I'm going to have Kristi on the CEO series soon. That one, she qualifies. So we'll explore all those things more. Well, you know what I mean. Yeah. Also, what do you think about this Anthropic, did you get a chance to look at the Anthropic study?

Karen Lynch: Yeah. What I liked about it is, again, it's a LinkedIn post, you click on the article, and he said something that I could really relate to: too many decisions, too much context switching, too much mental clutter, and that's when we turn to AI. And I was like, you know what? That's actually accurate for me. I turn to ChatGPT or Claude or NotebookLM when I'm like, you know what? Overwhelmed up here. I need some help. That's when I get there. And he used the sentence, what they are actually buying is relief. And I thought, accurate. Because I am relieved when I get the help, when I get the lift. Even if I'm in my email and it's Gemini and I'm like, I don't even know which email to respond to first right now, I'm so behind at this moment because I've been in back-to-back meetings, what's the most urgent email? I asked Gemini, and Gemini said, take a look at these two. And I'm like, oh, thank you. So that's what I think he tapped into: the bigger AI story is why and how people are using these systems, not just how smart the systems are. And I thought that is such a research point of view to have, and we should all start to think about that. Why are we doing this, Lenny? Why do we want that shiny dashboard for ourselves? Why are you building? We're not talking about the whys enough. Anyway, hats off, Yogesh. I thought that was cool. It's a good point.

Lenny Murphy: Yes, it makes our life easier in some form or fashion, right? Decreased cognitive load, discipline, structure, you know, all those things. All of it. To be productive, I mean, ultimately, that's how I view all of this. It helps me be more productive and create more value with the things that really matter. And yes, you brought up those great points. Let's just fire through these last four, because I actually have to drop for another meeting here in a minute.

Karen Lynch: Yeah, I mean, I knew it was coming. It's been a long one. Two weeks. We just have so much to come back to. We haven't talked in two weeks.

Lenny Murphy: We have. Our friends at Data Sapien have a great, great white paper, Grounded Synthetic: Synthetic Data's Missing Foundation. They have a really interesting approach where it's passive measurement on the device, and the data stays on the device, so it's kind of privacy-centric. But back to our earlier thing, right, it's an ongoing feed of information to help update panels or models in a more holistic way, so not just attitudinal but also behavioral. So, cool stuff. I love that the Insights Association's Accessible Insights Consortium has the Insights for All toolkit, so how to make sure that our studies are accessible to everyone: hearing, vision, neurodivergence, whatever the case may be. Let's make sure that we're incorporating those. So cool. You want to talk about this? Really, there's a paper called the jagged technology frontier. It was basically an assessment of AI tools for knowledge workers: where are they good, where are they bad, where do they miss the mark? And yes, they can be very good in specific areas, but the more you go outside the model's training set, the less reliable they are. So there's a lot of error that goes into that, and that's just a good thing for us to keep in mind when we're using these widespread models. And at last, a fun video on YouTube: Opinion Polling Might Be Broken. We go all the way back to Miravoice. There are challenges with all that, but maybe we'll get to a place where that's being corrected. I'll tell you, I did have two pollsters reach out to me a couple of weeks ago. I won't name drop them, but they're very prominent pollsters that I follow, and they wanted to talk about commercial applications and stuff. And boy, that's a tough damn business. Polling is a tough business in so many ways, right? I mean, it's tough methodologically.

Karen Lynch: It's tough ebbs and flows.

Lenny Murphy: You gotta do a lot of the political crap. It is a tough business, but I have nothing but respect for people that engage in polling, from the standpoint that it is not an easy way to make a living. And it's not getting easier either.

Karen Lynch: That's going to be the funding round that we're going to be like, wait, what? When a polling company gets that seed funding, we know they're onto something.

Lenny Murphy: Well, there's a reason those guys reached out to me. But I didn't have any bright ideas for them, unfortunately. I wish I did. If I knew what that play looked like, I would have already done it.

Karen Lynch: Oh my gosh, way too long of a show. We're so sorry, friends. Lenny, we can't go two weeks. Apparently that's just the rules. That's the rules. So we will see you next week for a much tighter show. One benefit of going solo was that I was right at 30 minutes. Yeah, we'll get there. We'll get there. All right, friends. We'll see you next time.

Lenny Murphy: Bye, everybody. Happy Friday. Happy Good Friday. Happy Easter.

Karen Lynch: Oh, yes. Yes, for sure. Thank you.

Links from the episode:

Qualtrics X4 and the New Questions Facing Insights Leaders 

Norstat acquires Swiss research firm DemoSCOPE  

Miravoice comes out of stealth with $6.3M in seed funding 

Omniscient closes a $4.1M pre-seed round 

NIQ and MRI-Simmons join Comscore’s Data Partner Network 

SurveyMonkey adds Salesforce-based automated SMS survey invites 

Panoplai ranked No. 5 in Inc.’s 2026 Regionals Northeast list 

Bellomy Named to Fortune’s 2026 List of America’s Most Innovative Companies 

Aaru’s CNBC appearance spotlights the company’s origin story plus its views on prediction markets 

NIQ launches Ask Arthur Chat 

AskNicely launches Ask NiceAI 

YouScan launches Tiger Finder 

Greenbook Podcast - Nikki Quast of Microsoft on AI’s Future in Market Research 

ChatGPT reportedly surpasses $100M in annualized ad revenue 

Inside Meta’s Home Grown AI Analytics Agent 

Karine Pepin shared a post arguing that survey-quality problems are still driven more by bad actors and bad sourcing than by AI agents at scale 

YouGov's response 

Daniel Gilbert’s critique of “The Creative Dividend” argues the widely praised paper is a masterclass in how not to do marketing research 

Kristi Zuhlke argues LLM-based human simulation should not be dismissed simply because humans are irrational 

Yogesh Chavda shared insights from Anthropic’s analysis of 81,000 user interactions 

Data Sapien: Grounded Synthetic: Synthetic Data's Missing Foundation 

Accessible Insights Consortium Expands “Insights for All!” Toolkit to Advance Inclusive Research Practices 

The “jagged technology frontier” paper finds AI improves speed and quality on many knowledge-work tasks but reduces correctness on tasks outside the model’s frontier 

“Opinion Polling Might Be Broken” is a useful video resource on structural problems in modern polling

Tags: synthetic data, respondent engagement, artificial intelligence, The Exchange


Karen Lynch
Head of Content at Greenbook

Leonard Murphy
Chief Advisor for Insights and Development at Greenbook

Disclaimer

The views, opinions, data, and methodologies expressed above are those of the contributor(s) and do not necessarily reflect or represent the official policies, positions, or beliefs of Greenbook.

More from Karen Lynch

Laura Gonzalez Quijano on Human-Centered Innovation, Emotion, and the Future of Insights (Future List Honorees)

Future List Honoree Laura Gonzalez Quijano explores empathy, AI, and human-centered innovation in shaping the future of insights.

The Future Role of the Researcher Is Taking Shape (Artificial Intelligence and Machine Learning)

As AI accelerates research, the insights role isn't disappearing, it's evolving. Discover how researchers shift from creators to guardians of quality ...

What Synthetic Research Can Do Now, and What It Still Can't (Data Science)

Synthetic research is evolving fast. Beyond the hype, what can it truly do well today, and where does it still fall short for insights teams?
