Archive for the ‘online research’ tag
By Joe Jordan, Vice President of Panel Operations
Many times I have discovered that the sample vendor I chose for my high-priority, top-secret, critical study was merely a middleman for various other sample vendors I specifically did not choose because they had wronged me in the past. Much like elephants and the IRS, researchers never forget vendors who have failed them at the final hour. They have nightmares of that 4 a.m. email the day a study should close, saying “We are reaching out to other partners” on a “best efforts basis.” It makes finding a sample provider for your next project all the more daunting.
As you send out your RFPs to preferred sample vendors with your exact demographic profiles and 18 nested quotas of left-handed grape soda drinkers who bought a laptop and puppy in the past 30 days, think through the reality of how likely it is that the vendor can deliver exactly what you need. Is it really possible they have this group of consumers anxiously waiting in front of their computers for the riveting subject line “A New Survey Just for You” to come flashing across the screen so they can sit for 45 minutes and give their entire purchase decision criteria in glorious detail?
So how can researchers separate those who only talk from those who also walk? Do your own research! Here are three suggestions for how to find the best sample provider for your next study.
- Ask what percentage of their projects require partners. All sample companies use partners to assist in some percentage of their projects — the real question is what percentage. While the complexity of the audiences is always the driving factor in these studies, it is important to understand how often they outsource. Doing so usually limits a provider’s control of timing and feasibility and causes you anxiety about making the deadline. Part of why we brand uSamp as a technology company first is because it is technology that allows us to grow and manage our own diverse panel. When I joined uSamp, I was impressed to learn that 95 percent of the projects completed in 2013 were sourced exclusively from our own suite of proprietary panels. This includes a wide variety of consumer segments and business decision-makers that are part of a network of websites and publisher partners. This vast network ensures the uSamp panel has the breadth to provide unique individuals from all customer segments willing to give their feedback on products and brands.
- Take the registration survey. As you evaluate your sample vendor, take a look through their registration page and review the type and number of questions required to sign up. Go through the double opt-in process to find out what the second wave of criteria is asking. Ideally, you should time this process and think through how long it takes to sign up and become an active member. Will a new panelist be willing to take this much time before even knowing if they can participate in the studies? Many of these registration forms are longer than mortgage applications and are about as exciting. To improve panelist engagement and reduce tedium and burnout, uSamp has launched Adaptive Profiling™, a profiling system that asks respondents targeted questions in short bursts and then utilizes predictive analytics and complex statistical analysis to identify other tendencies about panelists and connect them to the appropriate studies. This also allows uSamp to quickly assemble an audience that is custom-suited to clients’ specific needs while offering panelists more opportunities to qualify for studies without the cumbersome registration form.
- Ask how many unique panelists register daily. You’ve heard it before: “bigger is better.” At least, that’s what every sample vendor says when they proudly promote their panel as the largest on the market. But as a researcher, you care most about how they can target your specific audience quickly and accurately with unique and meaningful data. A raw count of millions of panelists does not mean they are engaged, active or applicable to your needs. You need respondents who are making decisions now, using smartphones, and interested in offering their opinions in the moment, rather than the heritage panelists who have been taking surveys for income for six years and registered their profile details on an eight-pound laptop. uSamp consistently signs up 18,000 new panelists a day who are fresh, engaged in the moment and ready to offer insights on your products and brands.
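The short-burst profiling idea described above can be illustrated with a toy sketch. Everything here is a hypothetical illustration of the general technique, not uSamp’s actual Adaptive Profiling™ model: a few answered profile questions feed a simple conditional-probability lookup that estimates an unasked attribute, so a panelist can be routed to a study they are likely to qualify for without filling out a long registration form.

```python
# Toy sketch of adaptive profiling: infer an unasked attribute from
# answered ones via conditional probabilities. All attribute names and
# probability values are made up for illustration.

# Hypothetical estimate of P(owns smartphone | age band)
P_SMARTPHONE_GIVEN_AGE = {
    "18-34": 0.92,
    "35-54": 0.81,
    "55+": 0.55,
}

def predict_smartphone_ownership(profile: dict) -> float:
    """Estimated probability that the panelist owns a smartphone."""
    if "owns_smartphone" in profile:           # already asked directly
        return 1.0 if profile["owns_smartphone"] else 0.0
    # Fall back to a prior of 0.5 if the age band is unknown too
    return P_SMARTPHONE_GIVEN_AGE.get(profile.get("age_band"), 0.5)

def matches_study(profile: dict, threshold: float = 0.8) -> bool:
    """Invite the panelist to a smartphone study only if likely to qualify."""
    return predict_smartphone_ownership(profile) >= threshold

panelist = {"age_band": "18-34"}               # one short-burst answer so far
print(matches_study(panelist))                 # True: 0.92 >= 0.8
```

The point of the sketch is the routing decision: the panelist is never asked the smartphone question at all, yet is only invited when the inferred probability clears a threshold, which is what cuts screen-outs and registration fatigue.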
Before you dive into your next relationship with a dubious sample vendor, remember to ask about other partners, play the role of the panelist, and find out about their new daily signups.
I hope you find a deep and reliable partner, at least until the next complex project comes along.
By The Editors
At MRMW this year, Justin Wheeler shared fascinating research in a presentation that probed one seemingly simple question: Are mobile respondents more honest? Wheeler’s research is trying to get at the twin problems of social desirability bias and consumer satisficing. The former describes the phenomenon of respondents providing answers that they think researchers will want to hear or that they think will make them appear in a more positive light in researchers’ eyes. The latter describes the mental shortcuts or paths of least resistance consumers will unconsciously take when asked to recall specifics of advertisements or products in an online survey. Wheeler’s research indicates that mobile could be an antidote to both of these problems. How? In-context mobile surveys remove interviewers from the equation, mitigating the influence of social desirability, and also eliminate the need for consumer recall.
See below for a video of Wheeler’s entire presentation at MRMW:
by Ben Leet, Sales Director, uSamp
It is the summer of 2012. As I sit on a busy train during my morning commute, fellow passengers are glued to their mobile devices – almost all smartphones. Most people are not talking on these phones. They are pinching and scrolling, browsing and thumbing. They are streaming information as fast as it’s released – a visual sign of the future of human behaviour. In fact, scrap “future” – it’s here, right now, and is only expected to proliferate. So I wonder why we continue to question whether our industry should adopt mobile research as a core methodology?
Morgan Stanley predicts that the number of searches done on mobile handsets will overtake those done by PC in the next year. Others posit that the mobile internet will overtake desktop internet usage within three years. Alarming though this sounds, it’s a real environment for the consumer – they, we, have the internet in our pockets 7 days a week and 24 hours a day. Why would we open up a PC or notebook to do our browsing, surfing, buying, networking, organising when we can do it with just a few clicks without moving from our chairs?
So what does this all mean for the MR industry?
During the past few years, there’s been a great deal of talk within the market research industry about online panels and sample quality. I’ve been in online sampling since ’99, when my business partner and I started our first sampling firm, goZing.com, which we sold to Greenfield Online in 2005. I’m currently co-founder and CEO of uSamp (www.uSamp.com), a technology company providing panel and sampling solutions to market researchers worldwide.
As someone with a vested interest in the long-term viability of quantitative research online, I want to share my thoughts about areas that need attention. My critique of what can and should be done to preserve the field’s integrity is intended to be constructive throughout, informed by more than a decade of observing both vendor/client and consumer behavior.
Addressing sample burn
Panelists are people. Over the past several years, brands across the globe have become increasingly invested in collecting, interpreting, and monetizing data. To many, data is a means to an end, quickly forgotten as results become more important than processes. We often refer to panelists as “sample,” not “people,” but to market research professionals working in an industry founded on such data, panelists should be regarded as living and breathing entities. They are our neighbors, our friends, our family members. These panelists eat and sleep just like us, and understand the concepts of time management and reward motivations.
Participating in an online research panel can be a tedious experience, during which panelists attempt surveys with the best intentions, and spend a great deal of time trying to qualify inside of narrow quota segments — only to frequently be terminated or screened out with little or no compensation for their time. Many opt out and stop taking surveys altogether.
Sampling firms do their best to manage this panel burn, but due to complex business requirements and certain persistent gaps in technology between sample suppliers and research survey software, it’s impossible for sample companies to know exactly what quotas market research firms require. Sample firms are mostly blind to the real-time needs of survey quotas, largely because industry processes are heavily manual and lack full transparency.
Imagine that survey software was able to communicate with sampling databases, and, in real-time, deliver exactly the right people at the right time. Panelists wouldn’t waste time and sample companies wouldn’t disappoint panelists (in other words, burn sample).
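A minimal sketch of what that handshake might look like follows. The field names and quota structure are assumptions for illustration only: the survey platform exposes which quota cells are still open, and the sample side invites a panelist only when they fit a cell that has room, so nobody is invited just to be screened out.

```python
# Hypothetical real-time quota check between survey software and a
# sampling database. Field names and data shapes are illustrative only.
from typing import Optional

# Quota state as the survey platform might expose it
quotas = [
    {"cell": "female_18_34", "need": 50, "filled": 50},   # closed
    {"cell": "male_18_34",   "need": 50, "filled": 12},   # still open
]

def open_cell(panelist: dict) -> Optional[str]:
    """Return the open quota cell this panelist fits, if any."""
    cell = f"{panelist['gender']}_{panelist['age_band']}"
    for q in quotas:
        if q["cell"] == cell and q["filled"] < q["need"]:
            return cell
    return None   # no open cell: inviting them would only burn sample

print(open_cell({"gender": "male", "age_band": "18_34"}))    # male_18_34
print(open_cell({"gender": "female", "age_band": "18_34"}))  # None
```

The female cell is full, so that panelist is simply not invited rather than being terminated mid-survey — which is exactly the wasted time the paragraph above describes.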
When panelists stop taking surveys, sample firms need to refresh the panel with new people – and there are real costs associated with managing this attrition. These costs are passed on indirectly through the CPI (cost-per-interview) pricing model. The fewer panelists used in a survey, the lower the price. Higher incidence (and better targeting) likewise means lower pricing.
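The arithmetic behind that pricing relationship can be made concrete. The dollar figures and rates below are made up for illustration: total sample cost is completes × CPI, and a lower incidence rate means many more panelists must be invited and screened per completed interview, which is what pushes the CPI up.

```python
# Illustrative CPI (cost-per-interview) arithmetic. All dollar figures
# and rates are hypothetical, not actual market prices.
import math

def project_cost(completes: int, cpi: float) -> float:
    """Total sample cost under CPI-based pricing."""
    return completes * cpi

def invites_needed(completes: int, incidence: float, response: float) -> int:
    """Panelists to invite so enough qualify (incidence) and finish (response)."""
    return math.ceil(completes / (incidence * response))

# 400 completes at a hypothetical $4.00 CPI
print(project_cost(400, 4.00))             # 1600.0

# At a 25% response rate: 60% incidence vs. 10% incidence
print(invites_needed(400, 0.60, 0.25))     # 2667 invites
print(invites_needed(400, 0.10, 0.25))     # 16000 invites
```

The six-fold jump in invites at low incidence is why better targeting translates directly into a lower CPI: the same 400 completes consume far less panel.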
As it gets harder and harder for sample companies to retain panelists, the industry has been placing constraints on sample companies. Many initiatives require address-validated panelists. Ask a family member if he or she is willing to give personally identifiable information to a sample company simply to earn $25 a year for taking surveys. Does this mean that panelists who are not willing to give personally identifiable information to a sample company should be left out of online sampling methodology? What does this do to the scalability of online quantitative research? Will we reach a ceiling where companies can no longer fill quotas?