Everything you want to know about ResTech

3 December 2021

What is “research technology” and why is it booming? Answers to the questions from the webinar

15 min read
[Image: MRII ResTech graphic]

On 1 December 2021, Jamin Brazil and Vivek Bhaskaran hosted a webinar on ResTech: what is it and why is it booming? There were so many questions during the webinar that they could not answer all of them live. This blog post gives both of them the chance to answer the questions that were missed in the online session.

Is there a subset of the tech options out there that enable both CONSUMER research/analysis, but also allow for the integration with broader industry trends, competitive analysis, etc. to get at the bigger picture?

Currently, there is not a tech option that facilitates both ad hoc primary research and packaged, relevant secondary research. However, there are many syndicated reports and tracking studies that leverage both.

Jamin Brazil 

The closest one I can think of is DISQO, which integrates consumer research with behavioral data from its panel. SimilarWeb recently acquired Embee Mobile and is also planning to do something similar.

Vivek Bhaskaran

Can either of you speak to the new TSAPI API standard?

Patrick Johnston, Vista Research Services, Inc.

Triple-S was the first framework that helped us, as tech platforms, create a common language for primary data. It is an important initiative and, I’m sure, once one of the large survey platforms adopts it, everyone else will follow.

Jamin Brazil 

Jeffrey Henning has talked to me about this. For any interoperability standard to be adopted, it has to be mandated by customers, and the mandate has to be real. A good analogy is GDPR: it was adopted by all technology companies because it had real teeth. I think things like interoperability should be mandated by industry bodies like ESOMAR, and there should be a benefit (and a price) for platforms.

Vivek Bhaskaran

There was a shift in the creative industry ten years ago when people's job titles shifted to being about the tools they used, so 3D modelers became 'Maya Engineers' and so on. Do you see the same thing happening in research, where full-service people become defined by the tools they use?

Chris Robson, Deckchair Data

I’m unfamiliar with the creative industry and, as such, missed this shift. I do not think job titles are being framed in the context of the tools people use. Even survey programming specialists usually use multiple platforms based on the customer, the project, and the fit. Currently, tools are still the commodity and the value defines the job, e.g. Project Manager, Account Executive, Analyst, etc.

 Jamin Brazil 

I do think the term “Market Researcher” is overly broad. Everyone is a data scientist or a researcher. We are already seeing titles like “Director, Next Gen Insights” that clearly want to differentiate from “Previous Gen Insights” ;) So over the next few years, I think a new set of titles, like “Research Ops Manager” or “Community Research”, will emerge based on functions within the organization. Each of those functional roles will need operational tools and technology to make them productive.

Vivek Bhaskaran

Anyone can do research, but quality is as important as tech!       

Tony Podziemski, Georgian College 

Agreed. And it is incumbent on the tech platforms to bake quality into their systems. The issue is that this requires investment, and that investment has to come from either:

  1. Improved right to win in the market (increase customer win rate)

  2. Increased cost to users

One of the big problems in the industry is that the end user of the insight doesn’t know how the sausage is made (participant data quality, question bias, etc.). Increased costs in the tech platform therefore usually come at the expense of the agency’s gross margin, so everyone is slow to focus on quality.

Jamin Brazil 

I’ll stick to my position: speed trumps quality :)

Vivek Bhaskaran 

What do you see as the bottlenecks in the research process that limit speed at good levels of quality? (We all know that fast and poor quality is always possible)          

Richard Wood, DQ&A 

If I had to pick one, participant quality. Today, every agency expects to clean out about ⅓ of the completes they are buying on the exchanges. That takes time and human consideration.
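
As a rough illustration, here is a minimal sketch of the kind of automated first pass an agency might run before that human review; the completes, field names, and thresholds are all hypothetical:

```python
import pandas as pd

# Hypothetical completes bought on an exchange; the column names and thresholds
# below are illustrative, not any particular platform's quality rules.
completes = pd.DataFrame({
    "respondent_id": [1, 2, 3, 4, 5, 6],
    "duration_sec":  [610, 95, 480, 70, 520, 455],   # survey is roughly a 10-minute interview
    "grid_variance": [1.8, 0.0, 2.1, 0.1, 1.5, 1.2], # ~0 means straight-lining on rating grids
    "open_end":      ["Good value", "asdf", "Too pricey", "", "Liked it", "Solid product"],
})

median_duration = completes["duration_sec"].median()

flags = (
    (completes["duration_sec"] < 0.4 * median_duration)  # speeders
    | (completes["grid_variance"] < 0.2)                  # straight-liners
    | (completes["open_end"].str.len() < 3)               # junk or empty open ends
)

flagged = completes[flags]
print(f"Flagged {len(flagged)} of {len(completes)} completes for human review "
      f"({len(flagged) / len(completes):.0%})")
```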

Jamin Brazil 

The one bottleneck I also see is that folks are not thinking about ResearchOps as a discipline. Let’s take even a “Research Brief”: do we have guidelines for business teams BEFORE they ask a question of the research team? Can they look up a wiki or a repository to see insights that have already been collected? These systems increase efficiency, and by definition the lack of these systems creates bottlenecks.
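
As a toy version of that “look it up first” step, here is a minimal sketch with a hypothetical repository of past study summaries; in practice this would live in a wiki or a dedicated research repository tool:

```python
# Hypothetical insight repository entries; real ones would carry links to decks,
# data, and owners.
repository = [
    {"title": "2021 Gen Z snacking habits", "tags": {"gen z", "snacking", "pricing"}},
    {"title": "Checkout abandonment diary study", "tags": {"ecommerce", "ux", "checkout"}},
    {"title": "Brand tracker wave 7", "tags": {"brand", "awareness", "tracking"}},
]

def existing_insights(keywords):
    """Return past studies whose tags overlap the keywords in a new research ask."""
    wanted = {k.lower() for k in keywords}
    return [study["title"] for study in repository if wanted & study["tags"]]

# A business team about to brief a checkout study finds prior work first.
print(existing_insights(["checkout", "conversion"]))  # ['Checkout abandonment diary study']
```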

Vivek Bhaskaran 


Don't you think expertise in areas such as methodology and how a specific sample corresponds to the target population is required for professionals to scientifically make inferences? Market research is way beyond data analytics, after all.

Debashis Sengupta, Sengupta Consulting

100% yes. Recently I was speaking with the head of insights at GTMHub. He compared making a business decision to a “courtroom drama”. In his analogy, the executive is the jury. As researchers, we build a case from various sources, and then the executive makes a decision based on our POV that moves the business forward.

Jamin Brazil 

What are some of the data privacy issues that come with ResTech?   

David Balikuddembe, Cutting Edge Business Support 

There are a lot. And, data privacy concerns are growing. Here are a few big ones:

  1. Use of Video in Reports

  2. Storage of PII like email, phone numbers, etc.

  3. Sharing of PII between countries

  4. Additionally, all the fear around mishandling PII has put us into a default “can’t share” position, which creates a lack of participant validation between platforms.

Jamin Brazil 

I actually think data residency is a bigger issue than privacy. Every country (now China too) has a data residency requirement. This means that data collected in those countries cannot leave the country. This is not a trivial issue, given that opening and managing data centers in 50+ countries is neither convenient nor operationally viable.
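
To make the operational burden concrete, here is a minimal sketch of routing each response to an in-country or in-region store based on the respondent’s country; the region mapping and function are hypothetical, not any specific platform’s implementation:

```python
# Hypothetical mapping from respondent country to the storage region that
# satisfies local data-residency rules.
RESIDENCY_REGIONS = {
    "DE": "eu-central",
    "FR": "eu-central",
    "CN": "cn-north",
    "IN": "ap-south",
    "US": "us-east",
}

def storage_region(country_code: str) -> str:
    """Pick the storage region for a response; never silently default across borders."""
    if country_code not in RESIDENCY_REGIONS:
        # Failing loudly is safer than quietly writing the data to a foreign region.
        raise ValueError(f"No residency mapping for {country_code}; cannot store response")
    return RESIDENCY_REGIONS[country_code]

print(storage_region("CN"))  # cn-north
```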

Vivek Bhaskaran 

Is video survey research easier to put through sentiment analysis than text? Currently the models are not exactly meeting my needs for sentiment analysis           

Abigail Rosh, NetJets

No. Video feedback is still largely a brute-force effort. It is easier to take raw text than to deal with video. But video is a far more human way of receiving feedback from participants and communicating a participant’s POV. That is why, I believe, video will be a significant unlock for consumer insights in the future.

Jamin Brazil 

I find that what clients really want is deep insights and meaning from information (quant and qual). How can ResTech be used to provide deeper, more customized insights? It seems like this is where the human element is still critical.

Doug Keith, Future Research Consulting

I think of Research Operations as the assembly floor at a manufacturing company. We (as researchers) receive the raw material (participant feedback) and then put that raw material through various machines (ResTech). At the end, we have a presentation of findings and business recommendations.

How much of the raw material processing happens on YOUR assembly floor will determine your research technology stack. As Vivek Bhaskaran stated, you’ll want to leverage technology to reduce the repeatable processes in your data handling and analysis.

For me, I look at the actual flow of data and the processes in the same way my friend who owns a corrugated box facility looks at his box production.
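
As a toy version of that assembly line, here is a minimal sketch in which each function is one “machine” on the floor; the stages and fields are illustrative:

```python
# A toy assembly line for participant feedback: quality control, analysis, finishing.
def clean(responses):
    """Quality-control station: drop records with no rating."""
    return [r for r in responses if r.get("rating") is not None]

def analyze(responses):
    """Analysis station: summarize the cleaned raw material."""
    ratings = [r["rating"] for r in responses]
    return {"n": len(ratings), "mean_rating": sum(ratings) / len(ratings)}

def report(summary):
    """Finishing station: package the finding for the stakeholder."""
    return f"{summary['n']} valid responses, average rating {summary['mean_rating']:.1f}/5"

raw = [{"rating": 4}, {"rating": None}, {"rating": 5}, {"rating": 3}]
print(report(analyze(clean(raw))))  # 3 valid responses, average rating 4.0/5
```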

Jamin Brazil 

Syndicated research has been around for years and has been discounted and denigrated by many. Yet “agile” research is now finally defined: it is standardized research, with syndicated research being one model that presents standard information.

Richard Hamer, Deft Research, LLC

Exactly. Simply put, ResTech automates redundant data handling. That can happen most easily when there is a large amount of repeated work, like doing the same project over and over again. This could be syndicated, ad hoc, or anything else.

Jamin Brazil 

How will AI improve the quality of research insights?  

Robert  Walker, Surveys & Forecasts, LLC

Vivek Bhaskaran’s POV here is exactly right. What we are calling AI is, in most cases, simply algorithms. So increased automation built into reporting will continue to help us all, for example QuestionPro’s ability to run a Van Westendorp pricing model or a TURF analysis. The output provided in the tool saves a ton of time.
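
For a sense of what that automation replaces, here is a minimal sketch of one Van Westendorp calculation; this is not QuestionPro’s implementation, and the price answers and grid are hypothetical:

```python
import numpy as np

# Hypothetical Van Westendorp answers, one row per respondent: the price each
# person called "too cheap" and the price they called "too expensive".
too_cheap     = np.array([10, 15, 20, 25, 12, 18, 22, 16])
too_expensive = np.array([18, 25, 30, 22, 35, 20, 28, 24])

prices = np.linspace(0, 50, 501)  # candidate price grid in $0.10 steps

# Share of respondents who would call each candidate price "too cheap" (falls as
# price rises) and "too expensive" (rises with price).
pct_too_cheap     = np.array([(too_cheap >= p).mean() for p in prices])
pct_too_expensive = np.array([(too_expensive <= p).mean() for p in prices])

# The crossing of the two curves is commonly read as the Optimal Price Point.
opp = prices[np.argmin(np.abs(pct_too_cheap - pct_too_expensive))]
print(f"Optimal price point ≈ ${opp:.2f}")
```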

Jamin Brazil 

For Jamin, about fraudulent responses: how did you in your Facebook survey (and Procter & Gamble in their research) identify, define, and operationalize "fraudulent"? Lying? If lying, how would you know? Usually this is done by speed, but speed alone does not equal lying. Interested in your definition of "fraudulent" and how it might mean "not valuable".

Russ T, L&M

For the Facebook recruit, we added Research Defender after the first project showed us there were a lot of data quality issues. We then identified that about 18.5% of the responses coming from that source were fraudulent, and we continue to monitor it per project. What is interesting is that the more we source, the higher the number. Meaning, during times of high demand, fraud scales to meet that demand while real people do not. This has a huge impact in election years, when participant demand grows by ~20%.
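
For illustration, a minimal sketch of that per-project monitoring; the project tallies are hypothetical, and Research Defender’s actual scoring is not reproduced here:

```python
# Hypothetical per-project tallies of completes flagged as fraudulent for one source.
projects = [
    {"name": "Wave 1", "completes": 400, "flagged": 62},
    {"name": "Wave 2", "completes": 650, "flagged": 118},
    {"name": "Wave 3", "completes": 900, "flagged": 172},
]

BASELINE = 0.185  # the ~18.5% fraud rate observed for this source

for p in projects:
    rate = p["flagged"] / p["completes"]
    note = "above baseline" if rate > BASELINE else "within baseline"
    print(f"{p['name']}: {rate:.1%} of completes flagged ({note})")
```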

Jamin Brazil 

Isn't the problem with the "build-your-own-panel" strategy that it takes a great deal of effort and ongoing maintenance that cuts into MR firm margins? In other words, clients are not willing to pay more for the better sample.

Richard Hamer, Deft Research, LLC

For an agency, building a panel for your customer represents a huge revenue opportunity, a point of differentiation, and relationship protection for your business:

  1. What if you could add to your client’s panel both qualified and terminated participants? Let’s say it costs $5 to recruit 10 people to your survey, of which 1 qualifies. You need 100 qualified participants for your research. That will cost you $500 in Facebook ads and, likely, another $5 incentive per qualified complete, for a total of $1,000 for 100 participants (the arithmetic is worked through in the code below). In today’s world, a $10 CPI is expensive. However, you have actually reached 1,000 people whom you can then use for future research.

  2. Brands recognize the value and are willing to pay $2,000 to $5,000 per month for the panel engagement and maintenance.

  3. If you host the panel, your customer will always use you. They simply can’t move off the asset you’ve built for them.

  4. Speed to insight is much improved because you take out the sample provider piece when you have your own panel.

I could go on. :)
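
For reference, the recruiting arithmetic from point 1, worked through with the same hypothetical figures:

```python
# The panel-building arithmetic from point 1, using the same hypothetical figures.
cost_per_10_recruits = 5.00    # Facebook ad spend to reach 10 people
qualify_rate         = 1 / 10  # 1 in 10 recruits qualifies for the survey
incentive            = 5.00    # paid per qualified complete
needed_completes     = 100

recruits_reached = needed_completes / qualify_rate                 # 1,000 people
ad_spend         = (recruits_reached / 10) * cost_per_10_recruits  # $500
incentives       = needed_completes * incentive                    # $500
total            = ad_spend + incentives                           # $1,000
cpi              = total / needed_completes                        # $10 per complete

print(f"Reached {recruits_reached:.0f} people, spent ${total:.0f}, CPI ${cpi:.0f}")
# All 1,000 recruits, qualified and terminated, can be invited into the panel
# for future research, which is where the longer-term value comes from.
```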

Jamin Brazil  

Jamin has said enough ;)

Vivek Bhaskaran

Re: agile: what about curated panels of particular consumer types (e.g. golfers, people with a particular health need)? Are you seeing much of that?

Richard Wood, DQ&A

Yes. For example, I’m seeing large interest in, and adoption of, panels by gaming companies.

Jamin Brazil  

Folks who have internal panels take a long view of research, typically have executive buy-in, and are betting that research gives them an edge in decision making. This then translates into shifting the research budget from a project-based approach to a hybrid one, where you spend on initiatives like consumer communities that go beyond a particular project and still allow budget for individual projects.

Vivek Bhaskaran

Re: democratizing. One factor in the 2008 financial crash was that models of mortgage securities were so slimmed down and distributed as plug-and-play tools to "anybody" on a sales floor that the model logic was not understood by users. The PhDs who created the models were long retired. Disintermediation of knowledge with tools led to trouble when the underlying conditions and environment changed and critical thinking about the models was not there. What nuance gets lost as everybody moves onto these platforms and panels? Unconscious bias already shows up in algorithms affecting lives in unintended ways.

Brett Barndt, Parsons School of Design

Exactly right. No one cares how the sausage is made until people start getting sick. We are at a moment in market research where companies are acting on datasets that contain large amounts of bad data, but they simply don’t know it.

Jamin Brazil  

Very interesting session. Do you think the target audience(s) for MR/insights agencies is now changing, especially given the focus on software, tech, and automation?

Nick Bull, Maru Group

It is 100% the case that market research agencies have changed and will continue to do so. The shelter-in-place mandates forced us to accelerate our research technology adoption by five years. Now everyone knows that if you are not investing in your research technology stack, you are jeopardizing your future.

Jamin Brazil  

I don’t know of a single large agency that does not view technology as a competitive advantage. They may not say it directly, but they all know it. Just as every single auto manufacturer has an electrification/EV strategy, every single agency is thinking about its own tech stack.

Vivek Bhaskaran

One of the dangers of "anyone can do research" lies in the interpretation - for example, anyone can build an automated regression model, but they may not know how to properly evaluate the goodness of the model and interpret the results. How does ResTech fit into this?

Mark Ratekin, Forsta

The issue with creating black-box research, where the math is entirely obfuscated from the user, is that the user must completely trust that the technology is giving them the correct interpretation. That is why, to Vivek’s point during the webinar, technology automation is focused on repeatable methodologies, and why it is often purpose-built internally by brands, so the tools can take into account the context and outputs of the insights.

However, the point you are making is correct: you can’t understand a regression model if you don’t understand regression models. And my biggest fear is that we are automating aspects of our work that can do harm. “Just because I have a scalpel doesn’t mean I should perform surgery.”
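
As a small illustration of the gap between running a model and evaluating it, here is a sketch on hypothetical data comparing in-sample and holdout fit, the kind of check an automated tool may not surface by default:

```python
import numpy as np
from sklearn.linear_model import LinearRegression
from sklearn.metrics import r2_score
from sklearn.model_selection import train_test_split

# Hypothetical data: many noisy predictors and few observations, a setup where
# an automated tool can report a flattering in-sample fit.
rng = np.random.default_rng(0)
X = rng.normal(size=(60, 25))
y = 2.0 * X[:, 0] + rng.normal(scale=3.0, size=60)  # only one predictor matters

X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.3, random_state=0)
model = LinearRegression().fit(X_train, y_train)

print(f"In-sample R^2: {r2_score(y_train, model.predict(X_train)):.2f}")
print(f"Holdout R^2:   {r2_score(y_test, model.predict(X_test)):.2f}")
# A large gap between the two numbers is exactly the kind of signal a
# point-and-click user can miss if the tool only reports in-sample fit.
```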

Jamin Brazil 

Expanding on “research repositories”, I’m curious to hear your perspectives on what you see with regard to either the current state of, or the need for, innovation in insights delivery to both clients (via providers) and teams (via managers).

Marcus Jimenez, Breefly, the insights reporting platform

I will provide you with a real-world example. Eight years ago I did some consulting for the CMO of a Fortune 500 bank. The objective was to optimize their consumer data handling. During my initial audit, I found they were conducting 13 simultaneous NPS studies globally. None of the sponsors of the 13 studies had any visibility into the fact that the other NPS studies were being conducted. What was even more troubling was that the results from the studies included some conflicting findings.

Think about the amount of waste in the above example, both in terms of research dollars and in terms of market decisions.

Data visibility and accessibility are of critical importance for today’s brands. But the issue is hard, and it is getting harder as Customer Experience, User Experience, and Market Research grow at their current rates.
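
For context, a minimal sketch of the NPS calculation itself and of how two teams surveying different customer slices can land on conflicting headline numbers; the scores are hypothetical:

```python
def nps(scores):
    """Net Promoter Score: % promoters (9-10) minus % detractors (0-6)."""
    promoters  = sum(s >= 9 for s in scores)
    detractors = sum(s <= 6 for s in scores)
    return 100 * (promoters - detractors) / len(scores)

# Two hypothetical teams, each running "the" NPS study on a different customer slice.
team_a_scores = [9, 10, 8, 7, 9, 6, 10, 9, 8, 10]
team_b_scores = [6, 7, 5, 9, 4, 8, 6, 7, 10, 5]

print(f"Team A NPS: {nps(team_a_scores):+.0f}")  # +50
print(f"Team B NPS: {nps(team_b_scores):+.0f}")  # -30
# Same brand, same metric, conflicting results: the kind of waste that shared
# data visibility would surface.
```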

 Jamin Brazil 

I think I’ll agree (for a change) with Jamin on this one ;) But seriously, in 5 years my bet is that every single research organization will start with a repository; ResearchOps will be the starting point, not an afterthought. Just as we would NEVER build a sales team without a CRM or start software development without a source code repository, we cannot start a research team without a repository!

Vivek Bhaskaran

Jamin Brazil
Co-founder and CEO at HubUX
Vivek Bhaskaran
Founding Member and Executive Chairman at QuestionPro