Reward or Penalty: Aligning Incentives of Stakeholders in Crowdsourcing

Abstract—Crowdsourcing is a promising platform whereby a requester broadcasts massive numbers of tasks to a crowd of semi-skilled workers to obtain reliable solutions. In this paper, we consider four key evaluation indices of a crowdsourcing community (i.e., quality, cost, latency, and platform improvement) and demonstrate that these indices involve the interests of three stakeholders, namely the requester, the workers, and the crowdsourcing platform. Since the incentives of these three stakeholders often conflict with each other, to promote the long-term development of the crowdsourcing community, we take the perspective of the community as a whole and design a crowdsourcing mechanism that aligns the stakeholders' incentives. Specifically, we give workers a reward or a penalty according to their reported solutions, instead of only a nonnegative payment. Furthermore, we identify a series of proper reward-penalty function pairs and compute workers' personal order values, which provide different amounts of reward and penalty according to both the workers' reported beliefs and their individual historical performance, while preserving workers' incentives. The proposed mechanism can help control latency, promote quality and platform evolution in the crowdsourcing community, and improve the aforementioned four key evaluation indices. Theoretical analysis and experimental results are provided to validate and evaluate the proposed mechanism, respectively.

CONCLUSION

In this paper, we have demonstrated that a crowdsourcing community involves the interests of three stakeholders, namely the requester, the workers, and the crowdsourcing platform, and that their incentives often conflict with each other. We have proposed and verified the hypothesis that all workers believe that, in most cases, they observe the real solution of each task perturbed only by unbiased noise, and we have designed a crowdsourcing mechanism, encompassing a series of proper reward-penalty function pairs and workers' personal order values, to align the interests of the different stakeholders; this has been validated by theoretical analysis and experimental results. This work can help relieve the platform and requesters of a crowdsourcing community from monitoring workers' efforts and capacities in performing crowdsourcing tasks, save requesters' costs, and attract more professional workers to crowdsourcing platforms. It can thereby accelerate the long-term development of the whole crowdsourcing community.
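The central idea of rewarding correct reports and penalizing confident but wrong ones, while keeping truthful belief reporting optimal, can be illustrated with a strictly proper scoring rule. The sketch below uses a shifted quadratic (Brier) rule; the function names, the shift constant, and the rule itself are illustrative assumptions for exposition, not the paper's actual reward-penalty function pairs or personal order values.

```python
def payment(reported_belief, correct, shift=0.75):
    """Shifted quadratic (Brier) scoring rule.

    Yields a positive reward when the worker's report turns out
    correct and a negative payment (penalty) when a confident report
    turns out wrong. Because the quadratic rule is strictly proper,
    a worker maximizes expected payment by reporting the true belief.
    NOTE: illustrative sketch only, not the paper's mechanism.
    """
    if correct:
        score = 1.0 - (1.0 - reported_belief) ** 2
    else:
        score = 1.0 - reported_belief ** 2
    return score - shift


def expected_payment(true_belief, reported_belief, shift=0.75):
    """Expected payment when the true probability of being correct is
    `true_belief` but the worker reports `reported_belief`."""
    return (true_belief * payment(reported_belief, True, shift)
            + (1.0 - true_belief) * payment(reported_belief, False, shift))
```

Under this rule a confident correct report earns a reward (payment(0.9, True) = 0.24), a confident wrong report incurs a penalty (payment(0.9, False) = -0.56), and over- or under-stating one's belief strictly lowers expected payment, which is the incentive property the reward-penalty pairs rely on.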

SYSTEM REQUIREMENTS:

HARDWARE REQUIREMENTS:

• System : Pentium IV, 2.4 GHz

• Hard Disk : 40 GB

• Floppy Drive : 1.44 MB

• Monitor : 15" VGA Colour

• Mouse : Logitech

• RAM : 512 MB

 

SOFTWARE REQUIREMENTS:

• Operating System : Windows XP/7

• Coding Language : JAVA/J2EE

• Database : MySQL

 
