DOI: 10.1145/2736277.2741685

The Dynamics of Micro-Task Crowdsourcing: The Case of Amazon MTurk

Published: 18 May 2015

ABSTRACT

Micro-task crowdsourcing is rapidly gaining popularity among research communities and businesses as a means to leverage Human Computation in their daily operations. Unlike other services, a crowdsourcing platform is in fact a marketplace subject to human factors that affect its performance, both in terms of speed and quality; such factors shape the dynamics of the crowdsourcing market. For example, a known behavior of such markets is that increasing the reward of a set of tasks leads to faster results. However, it is still unclear how the different dimensions at play interact with one another: reward, task type, market competition, requester reputation, etc. In this paper, we adopt a data-driven approach to (A) perform a long-term analysis of a popular micro-task crowdsourcing platform and understand the evolution of its main actors (workers, requesters, and the platform itself); (B) leverage the main findings of our five-year log analysis to propose features for a predictive model that estimates the expected performance of any batch at a specific point in time, showing that the number of tasks left in a batch and how recent the batch is are two key features of the prediction; and (C) analyze the demand (new tasks posted by requesters) and the supply (tasks completed by the workforce) and show how they affect task prices on the marketplace.
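Contribution (B) lends itself to a brief illustration. The following is a minimal sketch, not the authors' implementation, of how such a batch-performance predictor could be set up as a random-forest regression over per-batch snapshot features. Every feature name, the synthetic data, and the target definition below are assumptions made purely for illustration; the paper's actual feature set and target differ.

```python
# Sketch (illustrative only): predict a batch's short-term throughput
# from a snapshot of its state on the marketplace.
import numpy as np
from sklearn.ensemble import RandomForestRegressor
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
n = 1_000

# Hypothetical per-snapshot features for a batch of tasks.
tasks_left = rng.integers(1, 5_000, n)        # tasks remaining in the batch
batch_age_hours = rng.uniform(0, 240, n)      # how long the batch has been live
reward_cents = rng.uniform(1, 200, n)         # reward per task
requester_tenure = rng.uniform(0, 60, n)      # months the requester has been active
X = np.column_stack([tasks_left, batch_age_hours, reward_cents, requester_tenure])

# Synthetic target: tasks completed in the next hour. The toy data mimics
# the abstract's finding that tasks_left and batch recency dominate.
y = (0.02 * tasks_left * np.exp(-batch_age_hours / 48)
     + 0.1 * reward_cents
     + rng.normal(0, 5, n)).clip(min=0)

X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)
model = RandomForestRegressor(n_estimators=200, random_state=0)
model.fit(X_train, y_train)

print("R^2 on held-out snapshots:", round(model.score(X_test, y_test), 3))
print("Feature importances:", dict(zip(
    ["tasks_left", "batch_age_hours", "reward_cents", "requester_tenure"],
    model.feature_importances_.round(3))))
```

On real marketplace logs one would replace the synthetic snapshots with observed batch states and evaluate with a temporal train/test split rather than a random one, since the prediction task is inherently time-ordered.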


Published in

WWW '15: Proceedings of the 24th International Conference on World Wide Web, May 2015, 1460 pages. ISBN: 978-1-4503-3469-3.

Copyright © 2015 is held by the International World Wide Web Conference Committee (IW3C2).

Publisher: International World Wide Web Conferences Steering Committee, Republic and Canton of Geneva, Switzerland.

Acceptance Rates

WWW '15 paper acceptance rate: 131 of 929 submissions (14%). Overall acceptance rate: 1,899 of 8,196 submissions (23%).
