The two most effective incentives for crowdsourced work


From a recent study of the most effective incentives for increasing answer quality among crowdsourced Mechanical Turk workers: the top two incentives both relied on getting workers to think about what others’ answers would be.

Punishment Agreement (financial)
“After this HIT has been completed, we will review the answers for at least one of the following five questions. For each of the questions we review, we will penalize you if you disagree with the majority of other workers who complete this HIT. The penalty will be a deduction of 10% from the total bonus you could have earned if your answer had agreed with the majority.”
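The payout rule quoted above is simple to state mechanically. Here is a minimal sketch of it in Python; the function name, data shapes, and base-bonus parameter are my own illustration, not from the paper:

```python
from collections import Counter

def punishment_agreement_bonus(worker_answers, all_answers, reviewed,
                               base_bonus, penalty_rate=0.10):
    """Bonus under a majority-agreement penalty rule (illustrative sketch).

    worker_answers: dict mapping question -> this worker's answer
    all_answers:    dict mapping question -> list of other workers' answers
    reviewed:       the subset of questions selected for review
    For each reviewed question where the worker disagrees with the
    majority, deduct penalty_rate of the base bonus.
    """
    bonus = base_bonus
    for q in reviewed:
        # Majority answer among the other workers on this question
        majority, _ = Counter(all_answers[q]).most_common(1)[0]
        if worker_answers[q] != majority:
            bonus -= penalty_rate * base_bonus
    return bonus
```

For example, a worker who answers “a” on a reviewed question where the majority said “b” would lose 10% of the base bonus for that question.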

Bayesian Truth Serum or BTS (financial)
“For the following five questions, we will also ask you to predict the responses of other workers who complete this task. There is no incentive to misreport what you truly believe to be your answers as well as others’ answers. You will have a higher probability of winning a lottery (bonus payment) if you submit answers that are more surprisingly common than collectively predicted.”
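“Surprisingly common” has a precise meaning in BTS (Prelec, 2004): an answer scores well when its actual frequency exceeds the geometric mean of what respondents predicted that frequency to be, plus a term rewarding accurate predictions. Here is a minimal sketch of that scoring, assuming categorical answers and per-respondent predicted fractions; the function name and data shapes are my own illustration:

```python
import math

def bts_scores(answers, predictions, alpha=1.0):
    """Bayesian Truth Serum scores (higher = more surprisingly common).

    answers:     list of each respondent's own answer (category label)
    predictions: list of dicts, category -> predicted fraction of
                 respondents choosing that category
    """
    n = len(answers)
    categories = set(answers)
    for p in predictions:
        categories |= set(p)
    eps = 1e-6  # floor to avoid log(0) for unchosen categories

    # Actual endorsement frequency of each answer
    freq = {k: max(answers.count(k) / n, eps) for k in categories}

    # Geometric mean of the predicted frequencies
    geo = {}
    for k in categories:
        logs = [math.log(max(p.get(k, 0.0), eps)) for p in predictions]
        geo[k] = math.exp(sum(logs) / n)

    scores = []
    for a, p in zip(answers, predictions):
        # Information score: reward answers more common than predicted
        info = math.log(freq[a] / geo[a])
        # Prediction score: reward predictions close to actual frequencies
        pred = sum(freq[k] * math.log(max(p.get(k, 0.0), eps) / freq[k])
                   for k in categories)
        scores.append(info + alpha * pred)
    return scores
```

If everyone predicts a 50/50 split but two of three workers actually answer “yes”, the “yes” answer is surprisingly common and its answerers score higher, which is the lottery-weighting idea in the quote.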

Read the full paper here. [PDF]

