Multiple IDs sending the same replies

I am running a few pilot studies with small sample sizes and different participant filters for each study. My approach is that I always duplicate the previous study, change a filter, and run again with 20-50 participants.

Now the problem is that I keep getting participant IDs that are clearly fraudulent, as they (a) give the exact same wrong answers to an open question (same spelling, etc.) as other participant IDs within the study, and (b) give the exact same wrong answers (again, same spelling) as participant IDs from earlier studies, even though I have a filter that should exclude participants from those studies.
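(To illustrate what I mean by "exact same answers": I simply compare the raw open-question strings across participant IDs and across study exports. A minimal sketch of that check; the file layout and column names are made up for illustration:)

```python
# Flag participant IDs that submitted identical open-text answers,
# within one export or across several. Column names ("participant_id",
# "open_answer") are assumptions, not Prolific's actual export schema.
import csv
from collections import defaultdict

def flag_duplicate_answers(csv_paths):
    """Return {answer: [participant_ids]} for answers seen more than once."""
    seen = defaultdict(list)
    for path in csv_paths:
        with open(path, newline="", encoding="utf-8") as f:
            for row in csv.DictReader(f):
                # Only trivial normalisation: identical spelling is the signal.
                answer = row["open_answer"].strip()
                if answer:
                    seen[answer].append(row["participant_id"])
    return {a: ids for a, ids in seen.items() if len(ids) > 1}
```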

This is clearly the case for around 50% of my participants, and possibly more, but for the others I cannot be sure. Is there a way to prevent this from happening? If not, is there a way to return these participants without writing emails to the support team or reporting the IDs and then waiting a few days until I can continue? (I need each study to be completed before I can run the next one, so that I can filter out IDs that have already participated in previous studies.)

Thanks a lot for your help!


Hello Markus, welcome back to the Forum!

Mmh, the issue doesn’t seem trivial.

From your title I read “multiple responses by the same ID”, but going through your post I read “give the exact same […] as other participant IDs”. So, is it the same ID sending identical responses, or different IDs sending them?

If it is the first case, maybe there is some issue with the filter you put on. When duplicating the study from the previous one, are you selecting the “Exclude participants from previous studies” filter? Also, are you making sure to remove any Custom Allowlist, if present? If both are true, then the anomalous behavior could be due to some bug.

If it’s the second case, so multiple IDs sending the same replies, then it is likely a case of fraudulent participants who have signed up with multiple accounts.

For both cases (either the bug or the fraudulent subjects) I would consider sending a request via this form; I think it should be a priority for Prolific’s Team to solve the issue. You can also list the suspect IDs there so that the Team can take action. Also, to mark your study ‘Complete’ and hence be able to proceed to the next one, I would consider rejecting those submissions (at least the “copies” of the first submissions). I think these cases could fall under the “The participant did not sufficiently engage in a task where the required level of engagement was clearly specified.” reason, but that’s my opinion! Also, I know it might take a while, but consider sending a Bulk Report.

Is there a way to prevent that from happening?

For your next experiment, you could consider using the Custom Blocklist filter and adding the full list of suspect IDs there, to make sure none of them can access it.
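As a side note, you could assemble that blocklist from your previous sessions automatically. A minimal sketch, assuming you keep the suspect IDs in plain-text files with one ID per line (the file format is my assumption):

```python
# Merge suspect-ID files from all previous sessions into one
# de-duplicated, sorted list that can be pasted into the Custom
# Blocklist field. Assumed input: text files, one participant ID per line.
def build_blocklist(id_files):
    ids = set()
    for path in id_files:
        with open(path, encoding="utf-8") as f:
            ids.update(line.strip() for line in f if line.strip())
    return "\n".join(sorted(ids))
```

Running it over all sessions so far gives you one list to paste in, with no ID repeated.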

Sorry if I cannot be of more help.

Hi Veronica,

thanks a lot for your help!

Sorry, the title was misleading (I just updated it). It is the second case: multiple IDs sending the same replies. So far, after each session I have emailed the support team a list of the IDs, and they then return those submissions. However, this always takes a few days, during which I cannot run the next session (since I can only filter out participants from completed previous studies). The Custom Blocklist could help here though - thanks a lot!

Rejecting does not always work, since there is a limit on the maximum number of participants you can reject, and sometimes as many as 50% of my participants are clearly fraudulent.

Glad my reply was helpful!

As for rejection limits, it is possible to increase them for valid reasons. However, this is not something you can do yourself; the Prolific Team has to handle these kinds of requests (so the usual form has to be sent). You can read more about the topic on this page.

@Josh is on holiday right now, otherwise I’m sure he would ask the Team to prioritise your requests.


Hi Markus! Sheila from Prolific here, so sorry we’ve missed this post!

Can I check, have you sent through a Support Request on our website that I can have a look for? If you can DM me with the email address the request came from, I’ll be happy to escalate this with my team!

Hi @Markus_Eyting , tagging you just in case you haven’t seen Sheila’s response