ABSTRACT
This paper studies how social transparency and different peer-dependent reward schemes (individual, teamwork, and competition) affect crowdsourcing outcomes. The results show that when social transparency was increased by asking otherwise anonymous workers to share demographic information (e.g., name, nationality) with a paired worker, they performed significantly better. A more detailed analysis shows that in the teamwork reward scheme, in which the paired workers' reward depended only on their collective outcome, increasing social transparency could offset the effects of social loafing by making workers more accountable to their teammates. In the competition reward scheme, in which workers competed against each other and the reward depended on how much one outperformed the opponent, increasing social transparency could amplify the effects of social facilitation by giving workers a stronger incentive to outperform their opponent. These results suggest that carefully combining methods that increase social transparency with an appropriate reward scheme can significantly improve crowdsourcing outcomes.
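The three peer-dependent reward schemes can be sketched as payout rules. The concrete numbers below (base pay, per-unit rates, function names) are illustrative assumptions for exposition, not the paper's actual payment parameters; only the structure of each rule follows the descriptions above.

```python
def individual_reward(own_output: int, rate: float = 0.01) -> float:
    """Pay depends only on the worker's own output (e.g., labels produced)."""
    return own_output * rate

def teamwork_reward(own_output: int, partner_output: int, rate: float = 0.01) -> float:
    """Pay depends only on the pair's collective outcome, split evenly.
    Both partners earn the same amount regardless of individual effort,
    which is the condition under which social loafing can arise."""
    return (own_output + partner_output) * rate / 2

def competition_reward(own_output: int, partner_output: int,
                       base: float = 0.05, bonus_rate: float = 0.01) -> float:
    """Pay grows with how much a worker outperforms the opponent,
    the condition under which social facilitation can be amplified."""
    return base + bonus_rate * max(0, own_output - partner_output)
```

Under these rules a loafing teammate still collects half the team's earnings, while a competitor earns a bonus only by beating the opponent's output, which illustrates why transparency interacts differently with the two schemes.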
Don't Hide in the Crowd! Increasing Social Transparency between Peer Workers Improves Crowdsourcing Outcomes