DOI: 10.1145/3295750.3298978

Investigating the Effects of Popularity Data on Predictive Relevance Judgments in Academic Search Systems

Published: 08 March 2019

ABSTRACT

The elements of a surrogate serve as clues to relevance. They may be seen as operationalized relevance criteria by which users judge the relevance of a search result against their information need. In addition to short textual summaries, today's academic search systems integrate additional data into their search result presentations, for example the number of citations or the number of downloads. This kind of data can be described as popularity data; it also serves as a factor in search engines' ranking algorithms. Past research shows that diverse criteria and factors are involved in relevance judgments from the user perspective. However, previous empirical studies on relevance criteria and clues examined surrogates that did not include popularity data. The goal of my doctoral research is to gain substantial knowledge of the criteria by which users in an academic search situation make relevance judgments based on surrogates that include popularity data. This paper describes the current state of the experimental research design and the method of data collection.


Published in

    CHIIR '19: Proceedings of the 2019 Conference on Human Information Interaction and Retrieval
    March 2019
    463 pages
ISBN: 9781450360258
DOI: 10.1145/3295750

    Copyright © 2019 Owner/Author

    Permission to make digital or hard copies of part or all of this work for personal or classroom use is granted without fee provided that copies are not made or distributed for profit or commercial advantage and that copies bear this notice and the full citation on the first page. Copyrights for third-party components of this work must be honored. For all other uses, contact the Owner/Author.

    Publisher

    Association for Computing Machinery

    New York, NY, United States



    Acceptance Rates

Overall Acceptance Rate: 55 of 163 submissions, 34%
