28 Questions to Help Research Buyers of Online Samples

    The primary aim of these 28 Questions is to increase transparency and raise awareness of the key issues for researchers to consider when deciding whether an online sampling approach is fit for their purpose. Put another way, the aim is to help researchers to ensure that what they receive meets their expectations. The questions are also designed to introduce consistent terminology for providers to state how they maintain quality, to enable buyers to compare the services of different sample suppliers. Notes on the context of the questions explain why the questions should be asked and which issues researchers should expect to be covered in the answer.

    These new questions replace ESOMAR’s “26 Questions to Help Research Buyers of Online Samples.” ESOMAR has updated the text to recognize the ongoing development of techniques. While some of the questions remain constant, new questions have been added to incorporate new techniques and new technology in this area. In particular, this revision recognises the broad trend within the industry to build online samples from multiple sources rather than relying on a single panel.

    It should be noted that these 28 Questions focus on the questions that need to be asked by those buying online samples. If the sample provider is also hosting the data collection, you will need to ask additional questions to ensure that your project is carried out in a way that satisfies your quality requirements.

    The 28 Questions complement ESOMAR’s Guideline to Online Research, which was revised in 2011 to add updated legal and ethical guidance and new sections on privacy notices, cookies, downloadable technology and interactive mobile.

    Company Profile:

    What experience does your company have in providing online samples for market research?

    Context: This answer might help you to form an opinion about the relevant experience of the sample provider. How long has the sample provider been providing this service and do they have, for example, a market research, direct marketing or more technological background? Are the samples solely provided for third party research, or does the company also conduct proprietary work using their panels?

    We at InVeritas Research have been providing fieldwork and data collection services since 2009, and over that time we have been building a proprietary online panel and conducting online surveys.

    We have expertise in reaching difficult-to-reach respondents such as healthcare professionals, banking and C-suite executives, and decision makers. This helps our clients in banking, automobiles, pharmaceuticals, medical devices, leading brands, and CPG/FMCG firms execute almost every type of study, including but not limited to package and concept testing, brand tracking, ATU studies and public opinion polls.

    Sample Sources and Recruitment:

    Please describe and explain the type(s) of online sample sources from which you get respondents. Are these databases? Actively managed research panels? Direct marketing lists? Social networks? Web intercept (also known as river) samples?

    Context: The description of the types of sources a provider uses for delivering an online sample will provide insight into the quality of the sample.

    InVeritas Research’s panel is managed by a dedicated panel management team, and panelists are recruited from multiple sources to ensure the most diverse representation possible. Respondents are recruited through channels including:

    • Social networks
    • Email invitations
    • Recruitment by phone and face to face

    Respondents opt in to join the panel and are further profiled through a survey; upon successful completion and verification, they are invited to surveys.

    If you provide samples from more than one source: How are the different sample sources blended together to ensure validity? How can this be replicated over time to provide reliability? How do you deal with the possibility of duplication of respondents across sources?

    Context: The variation in data coming from different sources has been well documented. Overlap between different panel providers can be significant in some cases and de-duplication removes this source of error and frustration for participants.

    InVeritas Research’s samples are recruited through different methods and sources, so all respondents are profiled and verified before being invited to any market research survey.

    We employ Geo-IP checks, browser cookies, and other accepted digital fingerprinting technologies to de-duplicate the sample.

    Instant quality checks during recruitment and after survey completion, based on various parameters, help us identify and exclude anomalous respondents from the study as well as from the panel.
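As an illustrative sketch only (not our production system; the function names, data shapes, and fingerprint attributes are hypothetical), a de-duplication pass of the kind described above can combine a hashed device fingerprint with IP and Geo-IP checks:

```python
import hashlib

def fingerprint(user_agent: str, screen: str, timezone: str) -> str:
    """Hash a few browser attributes into a simple device fingerprint."""
    raw = "|".join([user_agent, screen, timezone])
    return hashlib.sha256(raw.encode("utf-8")).hexdigest()

def deduplicate(entrants, seen_fingerprints, seen_ips, allowed_countries):
    """Keep only entrants whose device and IP address have not been seen
    before, and whose Geo-IP country is in the allowed set."""
    accepted = []
    for e in entrants:
        fp = fingerprint(e["user_agent"], e["screen"], e["timezone"])
        if fp in seen_fingerprints or e["ip"] in seen_ips:
            continue  # duplicate device or IP address: exclude
        if e["geo_country"] not in allowed_countries:
            continue  # Geo-IP mismatch with the target market: exclude
        seen_fingerprints.add(fp)
        seen_ips.add(e["ip"])
        accepted.append(e)
    return accepted
```

A second entrant with the same browser attributes or IP address would simply be dropped rather than counted twice.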

    Are your sample source(s) used solely for market research? If not, what other purposes are they used for?

    Context: Combining participants from sources set up primarily for different purposes (like direct marketing for example) may cause undesirable survey effects.

    Our panel is used solely for market research. We never share our panel members’ personal details with our clients, associates or any third party for any other activities.

    How do you source groups that may be hard to reach on the Internet?

    Context: Ensuring the inclusion of hard-to-reach groups on the internet (like ethnic minority groups, young people, seniors, etc.) may increase population coverage and improve the quality of the sample provided.

    InVeritas Research’s core competency lies in targeting and recruiting difficult-to-reach audiences such as physician sub-specialties, certain patient groups, and C-suite executives in different sectors.

    We carry out recruitment through various channels including social media, email invitations, offline-to-online and many others. One methodology through which we systematically recruit hard-to-reach audiences is partnering with media and journal publishers that have access to their members.

    If, on a particular project, you need to supplement your sample(s) with sample(s) from other providers, how do you select those partners? Is it your policy to notify a client in advance when using a third-party provider?

    Context: Many providers work with third parties. This means that the quality of the sample is also dependent on the quality of sample providers that the buyer did not select. Transparency is essential in this situation. Overlap between different providers can be significant in some cases and de-duplication removes this source of error and frustration for participants. Providers who observe process standards like the ISO standards are required to give you this information.

    In situations where InVeritas Research’s panel needs to be supplemented by a third-party sample provider, we notify clients well in advance and work closely with the third party to ensure quality data collection.

    Third-party suppliers are selected based on their willingness to adhere to a set of quality and management processes, which are monitored through stringent guidelines for every project commissioned.

    Sampling and Project Management:

    What steps do you take to achieve a representative sample of the target population?

    Context: The sampling processes (i.e. how individuals are selected or allocated from the sample sources) used are the main factor in sample provision. A systematic approach based on market research fundamentals may increase sample quality.

    InVeritas Research has the capability to sample a single target group from our proprietary panel according to client needs, and uses census data to define a representative sample. Survey participants are profiled once they agree to join the panel, and based on their answers to the profiling questions, respondents are matched with potential surveys they will be able to take. We further diversify our recruitment sources to avoid biases.

    Do you employ a survey router?

    Context: A survey router is a software system that allocates willing participants to surveys for which they are likely to qualify. Participants will have been directed to the router for different reasons, perhaps after not qualifying for another survey in which they had been directly invited to participate, or maybe as a result of a general invitation from the router itself. There is no consensus at present about whether and how the use of a router affects the responses that individuals give to survey questions.


    InVeritas Research does not employ a survey router.

    If you use a router: Please describe the allocation process within your router. How do you decide which surveys might be considered for a respondent? On what priority basis are respondents allocated to surveys?

    Context: Biases of varying severity may arise from the prioritization in choices of surveys to present to participants and the method of allocation.

    InVeritas Research does not employ a survey router.

    If you use a router: What measures do you take to guard against or mitigate, any bias arising from employing a router? How do you measure and report any bias?

    Context: If Person A is allocated to Survey X on the basis of some characteristic then they may not be allowed to also do Survey Y. The sample for Survey Y is potentially biased by the absence of people like Person A.

    InVeritas Research does not employ a survey router.

    If you use a router: Who in your company sets the parameters of the router? Is it a dedicated team or individual project managers?

    Context: It may be necessary to try to replicate your project in the future with as many of the parameters as possible set to the same values. How difficult or easy will this be?

    InVeritas Research does not employ a survey router.

    What profiling data is held on respondents? How is it done? How does this differ across sample sources? How is it kept up-to-date? If no relevant profiling data is held, how are low incidence projects dealt with?

    Context: The usefulness to your project of pre-profiled information will depend on the precise question asked and may also depend on when it was asked. If real-time profiling is used, what control do you have over what question is actually asked?

    InVeritas Research holds this information on all participants. The methodology and process are the same across sources: respondents are required to complete a profiling survey that always includes standard profiling questions in addition to questions specific to various survey qualifications. Beyond the standard mandatory registration variables, profiling covers business titles, detailed demographic and geographic information, auto ownership, ailments, product ownership, travel, shopping habits, and employer profile.

    All profiling data are stored and used for more accurate feasibility estimation of low-incidence studies and to improve the respondent experience in subsequent surveys.

    Please describe your survey invitation process. What is the proposition that people are offered to take part in individual surveys? What information about the project itself is given in the process? Apart from direct invitations to specific surveys (or to a router), what other means of invitation to surveys are respondents exposed to? You should note that not all invitations to participate take the form of emails.

    Context: The type of proposition (and associated rewards) could influence the type of people who agree to take part in specific projects and can therefore influence sample quality. The level of detail given about the project may also influence response.

    InVeritas Research’s proprietary panel management platform enables fieldwork managers to accurately estimate requirements that match the client’s desired target profile. Invitations of all types, with a diversified approach, are used to motivate respondents to take part in research studies, and panelists are told that by doing so they have the chance to be involved in the development of future products and services.

    Invitations are sent by email with a unique URL that provides access to the survey questionnaire; telephone alerts are also used to reach respondents for low-incidence research studies. The invitation content includes the approximate length of the survey and the incentives or prizes offered for respondents’ time and opinions.

    Please describe the incentives that respondents are offered for taking part in your surveys. How does this differ by sample source, by interview length, by respondent characteristics?

    Context: The reward or incentive system may have an impact on the reasons why people participate in a specific project and these effects can cause sample bias.

    We adopt a clear and transparent incentive system based on cash rewards, gift vouchers, sweepstakes entries and bonus points. Incentives vary by interview length, country and survey complexity.

    Panelists can choose to collect or redeem their rewards as gift vouchers from multiple merchants, or as cash through PayPal, Alipay, or direct bank transfer.

    What information about a project do you need in order to give an accurate estimate of feasibility using your own resources?

    Context: The “size” of any panel or source may not necessarily be an accurate indicator that your specific project can be completed or completed within your desired time frame.

    To provide an accurate feasibility estimate, we ask clients for the following information:

    • Target Sample/Audience of the study
    • Demographics
    • Incidence Rate if known
    • Estimated Length of Interviews
    • Estimated time in field
    • Number of completes and quotas or sub-quotas, if any.

    Do you measure respondent satisfaction? Is this information made available to clients?

    Context: Participant satisfaction may be an indicator of willingness to take future surveys. Participant reactions to your survey from self- reported feedback or from an analysis of suspend points might be very valuable to help understand survey results.

    Yes, we measure respondent satisfaction and maintain it as an integral part of our quality and panel management process. Our panel management team ensures that all enquiries or complaints raised by our panelists are quickly addressed and resolved.

    What information do you provide to debrief your client after the project has finished?

    Context: One should expect a full sample provider debrief report, including gross sample, start rate, participation rate, drop-out rate, the invitation/contact text, a description of the field work process and so on. Sample providers should be able to list the standard reports and metrics that they make available.

    After the project is finished, we provide clients with detailed information such as:

    • Date and time of survey launch
    • Incidence rate and response rate
    • Average recorded length of the survey
    • Number of completes
    • Number of screen-outs
    • Number of quota-fulls
    • Respondents’ feedback about the survey, if necessary

    Other details can be shared on request.

    Data Quality and Validation:

    Who is responsible for data quality checks? If it is you, do you have in place procedures to reduce or eliminate undesired within survey behaviors, such as (a) random responding, (b) Illogical or inconsistent responding, (c) overuse of item non-response (e.g., “Don’t Know”) or (d) speeding (too rapid survey completion)? Please describe these procedures.

    Context: The use of such procedures may increase the reliability and validity of the survey data.

    InVeritas Research’s panel management team is responsible for data quality checks, which are performed after the completion of each project as part of our SOP to ensure high-quality survey data.

    Several quality control measures are employed to deliver high-quality survey output and to ensure our panelists are real and attentive, including:

    • Geo-IP validation and proxy detection
    • Digital fingerprinting and web cookies for duplicate device detection
    • Identification and removal of speeders and straight-liners from the survey and the panel

    We keep track of fraudulent users and blacklist them from entering any InVeritas Research online survey.
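As a minimal, illustrative sketch (function names and thresholds are assumptions, not our production parameters), speeder and straight-liner detection of the kind listed above can be expressed as two simple checks: completes far faster than the median interview, and identical answers across every item of a rating grid.

```python
from statistics import median

def flag_speeders(durations, floor_fraction=0.3):
    """Flag respondent IDs whose completion time falls below a fraction
    of the median interview length (the fraction is illustrative)."""
    cutoff = floor_fraction * median(durations.values())
    return {rid for rid, secs in durations.items() if secs < cutoff}

def flag_straightliners(grid_answers):
    """Flag respondent IDs who gave the same answer to every item
    in a multi-item rating grid."""
    return {rid for rid, answers in grid_answers.items()
            if len(answers) > 1 and len(set(answers)) == 1}
```

Respondents flagged by either check would then be reviewed and, if confirmed, removed from the survey data and the panel.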

    How often can the same individual be contacted to take part in a survey within a specified period, whether they respond to the contact or not? How does this vary across your sample sources?

    Context: Over solicitation may have an impact on participant engagement or on self-selection and non-response bias.

    We practice a one-invite, one-reminder policy; however, in low-incidence studies we send a maximum of three reminders with a gap of at least four days between each reminder.

    Once a respondent completes a survey, he or she is not invited to another study for four days.

    Invitations are also withheld for longer periods when a survey covers a topic or product category similar to one a panelist has recently completed.

    How often can the same individual take part in a survey within a specified period? How does this vary across your sample sources? How do you manage this within categories and/or time periods?

    Context: Frequency of survey participation may increase the risk of undesirable conditioning effects or other potential biases.

    We allow our panelists to complete a maximum of four surveys a month, regardless of sample source. We exclude panelists from a specific survey based on previous participation or completion only upon client request or project requirements.

    However, in the case of a tracker or wave study where repeat entry is required, we sample in accordance with the study requirements.
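The invitation rules described above (a four-day rest period after a complete, and at most four completes in a rolling month) can be sketched as a single eligibility check. This is illustrative only; the function name and data shape are hypothetical, and real scheduling would also account for reminders and category exclusions.

```python
from datetime import datetime, timedelta

COOLDOWN = timedelta(days=4)   # rest period after completing a survey
MONTHLY_CAP = 4                # max completes in a rolling 30-day window

def eligible_for_invite(complete_dates, now):
    """Return True if a panelist may receive a new survey invitation,
    given the datetimes of their past completes."""
    if not complete_dates:
        return True
    if now - max(complete_dates) < COOLDOWN:
        return False  # still within the four-day cooldown
    window_start = now - timedelta(days=30)
    recent = [d for d in complete_dates if d >= window_start]
    return len(recent) < MONTHLY_CAP  # under the monthly cap
```

A tracker or wave study would bypass this check where repeat entry is explicitly required.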

    Do you maintain individual-level data such as recent participation history, date of entry, source, etc. on your survey respondents? Are you able to supply your client with a project analysis of such individual level data?

    Context: This type of data per participant, including how the total population is defined and how the sample was selected and drawn, may increase the possibilities for analysis of data quality.

    Yes, we maintain a complete record for each panelist, and these records are available upon client request. All relevant information except PII can be shared.

    Do you have a confirmation of participant identity procedure? Do you have procedures to detect fraudulent participants? Please describe these procedures as they are implemented at sample source registration and/or at the point of entry to a survey or router. If you offer B2B sample, what are the procedures, if any?

    Context: Confirmation of identity can increase quality by decreasing multiple entries, fraudulent panelists, etc.

    InVeritas Research’s panel management team leverages a technology-driven approach to quality respondent management.

    Several quality control measures are employed to deliver high-quality survey output and to confirm participant identity, ensuring our panelists are real and genuine, including:

    • Geo-IP validation and proxy detection
    • Duplicate email and contact detail verification
    • Double opt-in email confirmation at the time of registration
    • Digital fingerprinting and web cookies for duplicate device detection
    • Identification and removal of speeders and straight-liners from the survey and the panel

    We keep track of fraudulent users and blacklist them from entering any InVeritas Research online survey.

    Policies and Compliance:

    Please describe the ‘opt-in for market research’ processes for all your online sample sources.

    Context: The opt-in process indicates the participants’ relationship with the sample source provider. The market generally makes a distinction between single and double opt-in. Double opt-in refers to the process by which a check is made to confirm that the person joining a panel or database wishes to be a member and understands what to expect (in advance of participating in an actual survey for a paying client).

    InVeritas Research uses a double opt-in process for all respondents. All prospective panelists are required to complete a panel registration form, which includes a set of screening and profiling questions covering business titles, detailed demographic and geographic information, auto ownership, ailments, product ownership, travel, shopping habits, and employer profile.

    After completing the recruitment questionnaire, panelists receive an email that they must confirm (the double opt-in step) to indicate their willingness to join the panel and to acknowledge our Terms and Privacy Policy.
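A minimal sketch of the double opt-in flow described above: registration issues a single-use confirmation token, and only confirming that token activates membership. The class and method names are illustrative assumptions, not our actual platform.

```python
import secrets

class OptInRegistry:
    """Minimal double opt-in bookkeeping: registration issues a
    confirmation token; only confirming it activates membership."""

    def __init__(self):
        self._pending = {}      # token -> email awaiting confirmation
        self._confirmed = set()

    def register(self, email: str) -> str:
        """Step 1: record the sign-up and return the token that would
        be sent to the prospective panelist by email."""
        token = secrets.token_urlsafe(16)
        self._pending[token] = email
        return token

    def confirm(self, token: str) -> bool:
        """Step 2: the panelist clicks the emailed link; only then do
        they become an active member. Tokens are single-use."""
        email = self._pending.pop(token, None)
        if email is None:
            return False
        self._confirmed.add(email)
        return True

    def is_member(self, email: str) -> bool:
        return email in self._confirmed
```

Until the token is confirmed, the sign-up remains pending and the address receives no survey invitations.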

    Please provide a link to your Privacy Policy. How is your Privacy Policy provided to your respondents?

    Context: Not complying with local and international privacy laws might mean the sample provider is operating illegally. An example privacy policy is given in the ESOMAR Guideline for Online Research.


    We comply with local data protection laws as well as research standards and guidelines, and strive to conform our privacy practices to applicable laws and to the standards of industry bodies such as the Insights Association and ESOMAR.


    InVeritas Research’s privacy policy is available at the following link:

    www.inveritasresearch.com/privacy

    Panelists are required to agree to our privacy policy and other terms and conditions at the time of registration in order to join the panel and receive survey invitations matching their profile and demographics.

    Please describe the measures you take to ensure data protection and data security.

    Context: The sample provider usually stores sensitive and confidential information on panelists and clients in databases. These data need to be properly secured and backed-up, as does any confidential information provided by the client. The sample provider should be able to provide you with the latest date at which their security has been evaluated by a credible third-party.

    All panelist and client information is stored and secured via industry-standard firewalls and stringent IT security policies. Access is restricted and requires authorization; access to participant data is password-protected and held on a separate server. We adhere to strict guidelines, and high security measures are taken for both the physical and logical security of panelist and client data.

    What practices do you follow to decide whether online research should be used to present commercially-sensitive client data or materials to survey respondents?

    Context: There are no foolproof methods for protecting audio, video, still images or concept descriptions in online surveys. In today’s social media world, clients should be aware that the combination of technology solutions and respondent confidentiality agreements are “speed bumps” that mitigate but cannot guarantee that a client’s stimuli will not be shared or described in social media.

    All panelists are informed in the terms and conditions, at the time of joining the panel, that they may be shown confidential and sensitive material, and they agree not to share any concepts shown to them in surveys in which they participate.

    In addition, all respondents are asked to give formal agreement to a confidentiality statement before the start of a survey. Technical barriers can also be implemented to protect sensitive material from being copied, such as disabling copy and save functions and immediately removing a respondent from the survey if he or she attempts to use the copy or print-screen functions.

    Are you certified to any specific quality system? If so, which one(s)?

    Context: Being certified may require the supplier to perform tasks in a pre-determined manner and document procedures that should be followed.

    We do not currently hold any particular quality certification; however, we abide by the standards, ethics and codes of conduct of major marketing and opinion research organizations such as ICC/ESOMAR.

    Do you conduct online surveys with children and young people? If so, do you adhere to the standards that ESOMAR provides? What other rules or standards, for example COPPA in the United States, do you comply with?

    Context: The ICC/ESOMAR International Code requires special permissions for interviewing children. These are described in the ESOMAR Online Research Guideline. In the USA researchers must adhere to the requirements of the Children’s Online Privacy Protection Act (COPPA). Further information on legislation and codes of practice can be found in Section 2 of ESOMAR’s Guideline for Online Research and in the ESOMAR Guideline on Interviewing Children and Young People.

    We follow country-specific and local regulations to ensure that we are compliant with regional legislation and privacy laws when interviewing young people and children; for example, we adhere to all COPPA regulations and therefore do not interview children under the age of 13 in the US.

    In cases where we need to interview minors, we recruit them only through their parents, and no personal information about the minor is stored.