Is putting a screener in the survey title recommended? NO IT IS NOT!

Against Prolific Policy or Not?

I am not sure about this one. When running a prescreening survey to find a niche of participants not identified by existing Prolific screeners, part of the expense is that one may have to pay many participants who will be screened out.

For example, if a researcher is looking for a particularly rare niche – such as adults who wear a superhero suit – then one might have to prescreen a couple of hundred participants to find one person in the niche. This means that even the 15p (about 19 cents) paid to each person who takes the prescreener results in a cost of about £30 (38 USD) per niche member found, and then perhaps only 50-70% of those found will come back for the survey proper, for which participants will again need to be paid.
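
The cost arithmetic above can be sketched in a few lines (all figures are the illustrative numbers from the example, not real Prolific rates):

```python
# Illustrative prescreening-cost calculation; the figures are the
# assumed numbers from the example above, not real Prolific rates.
prescreener_pay_gbp = 0.15   # 15p paid per prescreener completion
hit_rate = 1 / 200           # roughly 1 in 200 participants qualifies
return_rate = 0.6            # assume ~60% of qualifiers return later

# Cost of prescreening per niche member found: pay / hit rate
cost_per_niche_member = prescreener_pay_gbp / hit_rate
print(f"Prescreening cost per niche member found: £{cost_per_niche_member:.2f}")

# Cost per participant who actually returns for the survey proper
cost_per_returning = cost_per_niche_member / return_rate
print(f"Prescreening cost per returning participant: £{cost_per_returning:.2f}")
```

At £30 per member found, and with only 50-70% returning, the effective prescreening cost per usable participant rises to roughly £43-60 before the main study has even been paid for.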

Today my participant account was asked to take part in a survey titled:

Are you an XYZ user? **Please participate only if you are an XYZ user**

In other words, I think, the researcher has put the prescreening question in the title of their prescreening survey. This should greatly reduce the number of participants who do not match the prescreening condition, making it much less expensive to find niche participants. The strategy seems to be working, since the places are filling slowly.

I wonder if this falls foul of Prolific’s “do not include screeners in your survey” rule – YES IT DOES. Prolific’s current rules say: "It is also not appropriate to put the screening criteria in the study description alone." And since the title is, imho, an extension of the description, putting even voluntary requests to screen in the title would seem to be inappropriate.

Some sort of opt-in screened-study option seems to me a good feature, i.e. only participants who don’t mind being invited merely to be screened out would be offered such surveys.

Tim

5 Likes

I vote no; it should not be against Prolific policy. Also, for my study, for every 10 people who complete my prescreen, only 2 complete the main study (a long cognitive study on Zoom) – that’s 20%! It would be nice if Prolific users would screen themselves out when they read the details of the study (provided in the prescreener description).

3 Likes

Interesting question - thanks for starting this discussion @timtak and @David_Martinez. If you’re just joining this thread, the policy in question is outlined in the Can I screen participants within my study? help article.

I’ve checked around with a few members of the team. In principle, it isn’t against policy for a researcher to state the screening criteria in the title or study description of their survey. And, if it helps find the more niche participants you’re looking for, we’re very happy for researchers to continue to do so.

Prolific’s policy on screening out within a survey is designed to prevent participants from being offered a study only to then lose out on the study and payment. If you screen in this manner, you’d want to ensure that all participants who are offered your survey receive payment, regardless of their response.

The key things to remember:

  • The title of your survey does not count as an official screener, so participants cannot be rejected for failing to meet screening criteria that appear only in the title or study description. When setting up your survey, make sure you set your screening criteria using the prescreeners as well as including them in the title and/or study description.

  • You can validate your participants by asking the screening question again in the survey. As per the help article:

This validation question should be at the very beginning of your study and must be worded exactly as it appears on Prolific. …
If a participant’s response doesn’t match their prescreening answer, you can redirect them to a separate end of survey page with the following message:
‘You are ineligible for this study as you have provided information which is inconsistent with your Prolific prescreening responses. Please return your submission on Prolific by selecting the ‘stop without completing’ button.’
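
The mismatch rule quoted above amounts to simple branching logic. Here is a hypothetical Python sketch of the decision rule; the function name and case-insensitive matching are illustrative assumptions, and in practice this would be configured with your survey tool’s skip/branch logic rather than written as code:

```python
# Hypothetical sketch of the validation check described in the help
# article: compare the in-survey answer with the Prolific prescreening
# answer, and redirect mismatches to an end-of-survey message.

INELIGIBLE_MESSAGE = (
    "You are ineligible for this study as you have provided information "
    "which is inconsistent with your Prolific prescreening responses. "
    "Please return your submission on Prolific by selecting the "
    "'stop without completing' button."
)

def route_participant(prescreen_answer: str, in_survey_answer: str) -> str:
    """Return 'continue' if the answers match, else the ineligible message."""
    if in_survey_answer.strip().lower() == prescreen_answer.strip().lower():
        return "continue"
    return INELIGIBLE_MESSAGE

print(route_participant("Yes", "yes"))  # matching answers let the participant continue
```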

3 Likes

On whether it’s effective though, I’d be interested to hear from more people who are using this tactic with their surveys.

If you’ve used the title as a screener (in addition to the pre-screeners), how has it worked for you? Would you recommend it to others, or not so much?

Dear Jon

I am far from 100% sure but the study I mentioned seems to have filled up quite slowly despite it being the sort of quick prescreener that might otherwise have filled up in no time at all.

I will take the liberty, therefore, of recommending that researchers use survey titles as prescreeners to help reduce the rate of attrition, on the understanding that those who take the study will still be paid even if they say no.

Tim

1 Like

Agreed that under Prolific’s rules, you would need to pay anyone who completes the survey attentively, even if they don’t meet your criteria. But my bigger concern would be providing options on every single question for people who don’t meet the criteria. E.g., every question about the respondent’s superhero suit needs an “n/a” or “I don’t wear a superhero suit” option. Otherwise, you’ll have people for whom the questions aren’t relevant giving false or speculative answers.

You might also put a screening question at the end of the survey with repeated, emphatic text saying that users will be paid no matter how they answer, but tbh I don’t know how much respondents believe those.

The fact that the OP’s title-screened study is filling slowly is not evidence that this problem doesn’t happen. It’s a signal detection problem – imagine that you’re running a study that 2% of people qualify for, and title-screening works really well: 95% of unqualified people follow the request not to respond. Even then, more than half the responses will be from unqualified respondents giving potentially garbage answers.
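
Michael’s worked example can be checked with a few lines of arithmetic (the 2% base rate and 95% compliance figures are the assumed numbers from the post, not measured values):

```python
# Signal-detection arithmetic for title-based screening, using the
# assumed figures from the example: 2% of invitees qualify, and 95%
# of the unqualified heed the title and stay away.
base_rate = 0.02    # fraction of invited participants who qualify
compliance = 0.95   # fraction of unqualified people who opt out

qualified_responses = base_rate * 1.0               # assume all qualifiers respond
unqualified_responses = (1 - base_rate) * (1 - compliance)

frac_unqualified = unqualified_responses / (qualified_responses + unqualified_responses)
print(f"Share of responses from unqualified people: {frac_unqualified:.0%}")
```

Even with 95% compliance, roughly 71% of responses come from people outside the target group – which is why slow uptake alone doesn’t prove the data is clean.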

Michael

Very good point. Putting the screening question in the title makes it really obvious what the screener is screening for.
I’d certainly recommend adding an explicit question within the survey, including a mention that they will be paid irrespective of their answer, but as you say, they may not believe it, or may answer in the affirmative to get re-invited to the study proper.

In the past I have recommended adding extraneous options to screening questions:

Do you wear

  • a corset
  • a wig
  • a super hero suit
  • medieval armour

so as not to give away which option you are seeking, and so that participants do not answer in the affirmative just to be re-invited.

The survey in question was prescreening for a piece of equipment that I think the researcher could subsequently detect in the ‘study proper.’ Assuming that a mismatch with answers to a researcher’s prescreeners can be treated the same way as a mismatch with answers to Prolific’s prescreeners – as grounds for immediate ejection – then unqualified participants could be ejected from the study.

But with other types of prescreening, such as for those who wear super suits, as Michael says, this can lead to garbage answers. So the question-in-title technique – or any prescreener where the central screening question is unobfuscated and obvious – should perhaps be used only in testable cases.

Tim

1 Like

I read your comment in the other thread – the “put screener in title” one – where you noticed a slowdown in participants taking your survey. So I guess it worked?

Yes, I use this too.
Since, from what I gather, you cannot add it to the Prolific screeners, you add it only to the survey, correct?

Hi null

it worked?

Yes, and No.

The participants slowed down, so I am pretty sure that only, or mainly only, those who were prepared to say yes to the screener took it, meaning the researcher had to pay far less to find each hit.

But at the same time, since the title was a complete giveaway as to what the survey was screening for, some participants may have taken the screener and said yes to the question – to justify having taken it, and to be asked back – when in fact they did not match the criterion.

So I suggested using this technique only in cases where one can subsequently test whether the participants really match the screener. This can perhaps be achieved in cases where the screener is screening for hardware that can be sniffed by the browser, such as the information provided here.

If you can ask the user to send a photo of themselves in a super-suit then perhaps one can check for that too.

But if no check can be performed, it may not be such a good idea to make the screening question obvious. In that case, while the slower recruitment makes it seem that the tactic has worked, it may in fact be resulting in a high proportion of false data, or at least there is that danger.

Tim

2 Likes

@Jon sorry to revive this thread but Prolific should edit the page detailing the policy. See https://www.reddit.com/r/ProlificAc/comments/vdi882/why_do_we_even_have_rules_on_this_site/?utm_source=share&utm_medium=web2x&context=3

Thank you David

Yes. The poster slipperyMonkey07 has made a very good point and I stand corrected. The Prolific rule says

"It is not permitted to screen participants within your survey or study description."

And if it is not permitted in the study description, that implies it is not permitted in the title either, imho, since the title is, I think, part of the “study description” in toto.

So contra what I said earlier in the thread, unless that FAQ changes, then even the attempt to screen (even if everyone is paid) is against the rules of Prolific.

I am sorry I did not read the FAQ more closely.

Tim

No need to apologize @David_Martinez - that’s why we keep the old threads open! Thanks for flagging up the Reddit thread as well - Support is already on the case with that one.

I can see the case for rewording the FAQ a bit here - it’s probably not clear which parts are guidelines and which are rules that would get a study removed from the site, so I’ll chat further with our Support folks about how this is communicated.

Recapping here though, it seems we’re discussing both:

  • Whether it’s recommended (e.g. beneficial), and
  • Whether it’s allowed within Prolific’s rules.

The discussion further up the thread still holds true here:

  • You can include the screening criteria in the title and/or study description if you have used pre-screeners (or followed the custom sample method) and commit to paying participants regardless of whether they meet the criteria set out in the title. However…
  • You cannot use your title and study description as the basis on which you screen participants from your study.

The latter is the rule mentioned by slipperyMonkey07 in the Reddit thread: it is not permitted to screen participants within your survey or study description. @timtak the title does count as part of this; however, the rule applies to the action you take beyond it, rather than the mere mention.

The key principle behind this rule is to ensure a good participant experience:

  • Participants should only see studies for which they meet the eligibility criteria, and
  • if a participant takes a study, they should be paid in accordance with Prolific’s payment principles, regardless of whether they meet the criteria.

All in, mentioning the screening criteria in your title and description is ok as long as you’re not screening for it. As for a recommendation though, you’d want to consider the benefits and risks:

  • It may help you identify a more niche demographic beyond your existing screener. For example, if you screened for people wearing superhero costumes, you could use your title to potentially identify a specific type of superhero costume therein.
  • It may lower your data quality, as people who are not in the demographic you’re looking for might count themselves in, while people you are looking to attract may screen themselves out.
  • It may lead to a poorer participant experience, as it may force the participant to do more work, or give the impression of needing to do more work, for less reward.

Hope that adds some clarity - I’ll work with our support team on how we can make it a bit clearer in the FAQ.

3 Likes

Semi-related, new suggestions for screening questions are always welcome, so if you’re after something in particular do start a new topic on what you’re looking for!