poster

DOI: 10.1145/1571941.1572115

Relevance criteria for e-commerce: a crowdsourcing-based experimental analysis

Published: 19 July 2009

ABSTRACT

We discuss the concept of relevance criteria in the context of e-Commerce search. A vast body of research literature describes the beyond-topical criteria that users employ to determine the relevance of a document to an information need. We argue that the e-Commerce scenario differs in several respects, and that novel and different criteria can be used to determine relevance. We validate this hypothesis experimentally with a crowdsourcing approach on Amazon Mechanical Turk.


Published in

SIGIR '09: Proceedings of the 32nd international ACM SIGIR conference on Research and development in information retrieval
July 2009, 896 pages
ISBN: 9781605584836
DOI: 10.1145/1571941
Copyright © 2009 held by the author/owner(s)
Publisher: Association for Computing Machinery, New York, NY, United States


Acceptance Rates

Overall acceptance rate: 792 of 3,983 submissions, 20%
