The Insight250 spotlights and celebrates 250 of the world’s premier leaders and innovators in market research, consumer insights, and data-driven marketing. The inaugural list was revealed in April 2021, and the latest 2022 winners were announced in September at the ESOMAR Congress in Toronto, creating renewed excitement across the industry whilst strengthening the connectivity of the market research community.
With so many exceptional professionals named to the Insight250, it seems fitting to tap into their expertise and unique perspectives across various topics. This weekly series does just that, gathering the expert perspectives of many of these individuals in a series of short topical features.
Understanding markets, competitors, and customers has never been more vital to brands. However, with the onslaught of information, it is increasingly difficult to manage and control data integrity. To better understand this process, I sat down with Pam Harrison, Senior Manager, NIVEA U.S. and Global Center of Excellence for Body for Beiersdorf. In this session, we discuss the evolving importance of data integrity and the partners, tools, and approaches that are helping enterprises enhance the quality of the research and insights at their fingertips.
Pam, you’ve worked at research agencies and on the enterprise side; in your experience, does the attitude to data quality and integrity differ?
“It absolutely differs. I pride myself on being incredibly accurate and somewhat of a purist in all that I do. I have found since being back on the enterprise side that most research agencies are in perfect alignment with my approach to data quality and integrity, but not all. Those who are more ‘consultancy’ type groups vs. research agencies at their core tend not to be as much in the weeds with assuring quality data as those who are truly conducting the work hands-on.”
How can agencies and enterprises work together in strong partnerships to maintain the integrity of data and the relationship?
“Honestly, I believe the ownership of the integrity of the data sits completely with the agencies. As a client, I should not have to concern myself with data integrity. It is kind of like saying you expect the seat belt to hold you in but should you analyse the science behind whether it will?
“That said, I am now intimately involved with one of my research agencies in making sure that steps are put in place going forward to ensure data quality, given issues we had last year on a big initiative. Together we have designed steps to make sure sampling is done properly; open-end responses are screened for sensibility and replaced with new respondents as needed; and straight-liners, speeders, slow boaters, bots, and the like are removed.
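The screening steps Pam lists lend themselves to simple rule-based checks over survey responses. The sketch below is illustrative only; the field names and thresholds are assumptions for the example, not her agency’s actual criteria.

```python
# Illustrative survey quality checks: flag speeders, slow boaters,
# straight-liners, and gibberish open-ends for removal and replacement.
# All thresholds and field names are assumptions for this sketch.

MIN_SECONDS = 120    # assumed minimum plausible completion time
MAX_SECONDS = 3600   # assumed cutoff for "slow boaters"

def is_speeder(resp):
    return resp["duration_s"] < MIN_SECONDS

def is_slow_boater(resp):
    return resp["duration_s"] > MAX_SECONDS

def is_straight_liner(resp):
    # The same answer on every grid item suggests a disengaged respondent.
    grid = resp["grid_answers"]
    return len(grid) > 3 and len(set(grid)) == 1

def is_gibberish(text):
    # Crude sensibility screen: too short, or too few vowels to be real words.
    stripped = text.strip()
    if len(stripped) < 5:
        return True
    vowels = sum(ch in "aeiouAEIOU" for ch in stripped)
    return vowels / len(stripped) < 0.2

def flag_for_replacement(resp):
    """Return the reasons a respondent should be tossed and replaced."""
    reasons = []
    if is_speeder(resp):
        reasons.append("speeder")
    if is_slow_boater(resp):
        reasons.append("slow boater")
    if is_straight_liner(resp):
        reasons.append("straight-liner")
    if any(is_gibberish(oe) for oe in resp["open_ends"]):
        reasons.append("gibberish open-end")
    return reasons
```

A flagged respondent would be replaced with a fresh interview rather than patched; real pipelines layer on digital-fingerprinting and bot-detection services that this sketch omits.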
“Given this one very bad experience last year, my antennae are up, and I find myself having quality/integrity conversations with all of our partners to make sure what I had assumed was being taken care of is, in fact, taken care of. I am a reasonable person, and errors sometimes happen. But taking pride in ownership in making sure that what we present to our stakeholders is accurate is something the agency needs to value as much as I do.”
As life continues to accelerate at an exponential rate, is there more pressure to deliver research results faster, and does this increase the risk to data quality?
“Yes, it does in many ways. The quality of the incoming sample can be jeopardised by how it is sourced for quick-turn studies. Then you layer on top of that the rush to output results, and human error can magnify those issues.
“I am a bit of an old-school kind of gal and allow those ‘quick and dirty’ type of studies only on low-risk business issues. Having different tools in our tool basket allows us to take a breath and slow down on key substantial initiatives while being lightning fast on areas that are lower risk in nature. And everything in between the two.”
Why must data quality be maintained, and what advice can you give to those who seek to “cut corners”?
“Simply put – garbage in, garbage out. If your study is fraught with sampling errors or respondents who don’t care to give quality open-ended responses, the end story will not be reflective of reality. Cutting corners and not being attentive to data quality details leaves you in a position of trying to create a reason that the story makes sense when in fact, it very well might not make sense.
“There is a time and a place and topic matter(s) where cutting corners is going to be necessary. But if the incoming data are just fundamentally flawed, there is no overcoming that. So find a corner to cut elsewhere – on the back end. You can’t build learning off of a flawed foundation.”
Can you give examples of where you have seen this go wrong?
“I think the example I alluded to earlier is part of the most egregious data quality issue I’ve seen in my career. The issues began to unfold after we had fully presented and workshopped the data, so we were left to make sense of a story that really didn’t make sense at all. Not only was sampling completely flawed, but the upfront open-ended questions also were not screened for quality and sensibility. Hundreds of interviews should have been tossed and replaced at no cost to us due to gibberish responses, nonsensical answers, people trying to play the system, and so forth. In this example, not only was the data quality degraded, but the integrity of the vendor-partner relationship was very much called into question by them making excuses along the way and not taking ownership of the errors of their ways.”
As well as data integrity, the integrity of research partners is crucial - how do you ensure your partners operate to the highest standards?
“For the most part, I don’t worry about this. When issues arise (even with the best of our partners) and the research partner fully engages in a non-defensive way to make sure the quality of the data is at its highest, then I know that I have a true partner in my camp.
“Given my blend of experience on the vendor and enterprise side, I tend to see the one needle in a haystack that is flawed. Partners who evaluate my train of thinking and go through the QC follow-ups with an open mind to find where the issue lies are the types of vendors who will win my business.”
What are the “watch-outs” for partnerships about to fail?
“I think the biggest watch-out is when the client (instead of the agency) is continually finding the errors in the data. It says to me that the ‘shop is not being watched,’ so to speak. When issues that are not the fault of the end client are brought to the surface, making good on what was contracted for should be owned by the research agency. If that doesn’t happen smoothly, that is a major red flag for me.”
Should associations like ESOMAR, and national associations, such as the Insights Association (in the US), do more to help maintain strict integrity guidelines?
“I think it would be very helpful for there to be an industry standard for when panel providers are made to replace interviews at no cost to the research agency (and ultimately to the client). For instance, speeders and straight liners seemingly are automatically replaced. The same standard should apply when open ends get responses that are gibberish or nonsensical – especially when they are among the very first questions in a survey.”
HOT TOPIC: Artificial Intelligence
What role does AI play in research, and will it make researchers redundant?
“I am not a huge advocate nor an early adopter of AI solutions, to be honest. I don’t think, or at least I hope, they will never be used without the assistance of human interaction to lend colour to a story driven by AI. I’m intrigued at the prospect of how AI can be used to improve data quality – for instance … could an AI program read through open-ended responses in real time as a survey ends to ensure that it was an engaged, quality respondent? Could an AI tool decipher that Elmer Fudd really has nothing to do with brands of hand and body lotion? I would hope so someday, if not already.”
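The relevance screen Pam imagines could start even simpler than AI: a vocabulary-overlap check that flags answers sharing no words with the survey topic, routing anything flagged to a human reviewer. The word list below is an invented example, and a production system would more likely use embeddings or a trained classifier than a fixed keyword set.

```python
# Hypothetical real-time relevance screen for open-ended answers.
# An answer with no overlap with the topic vocabulary (e.g. "Elmer Fudd")
# is flagged for human review. The word list is an illustrative assumption.

TOPIC_WORDS = {
    "lotion", "cream", "skin", "moisturise", "moisturize",
    "scent", "dry", "hand", "body", "hydrate",
}

def is_off_topic(answer: str) -> bool:
    # Normalise tokens, then flag answers sharing no vocabulary with the topic.
    words = {w.strip(".,!?").lower() for w in answer.split()}
    return not (words & TOPIC_WORDS)
```

For example, `is_off_topic("Elmer Fudd is my favourite")` would flag the answer, while a genuine comment about lotion or skin would pass; the human-in-the-loop review Pam insists on would still make the final call.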
“If it doesn't feel right, it isn’t! Push through until you find the error.”
With the increasing need for data integrity, experts like Pam are focused on leveraging partners and technology to deliver the most complete understanding of customers, markets, and competitors possible. Pam echoes the common “garbage in, garbage out” warning that we hear from many experts in this series. As she points out, the need for rapid insights is no excuse for the disasters that come from cutting corners, so balancing speed and quality is essential to delivering data integrity. It is increasingly clear that AI can help, and needs to help more (e.g. banishing inappropriate “Elmer Fudd” responses), but only if it is in the hands of experienced and capable practitioners who respect its limitations and use their expertise to apply it appropriately.
Thank you for taking the time to provide insight on this topic, Pam. It really was a pleasure talking with you.