DOI: 10.1145/1148170.1148177

Improving web search ranking by incorporating user behavior information

Published: 06 August 2006

ABSTRACT

We show that incorporating user behavior data can significantly improve the ordering of top results in a real web search setting. We examine alternatives for incorporating feedback into the ranking process and explore the contributions of user feedback compared to other common web search features. We report results of a large-scale evaluation over 3,000 queries and 12 million user interactions with a popular web search engine. We show that incorporating implicit feedback can augment other features, improving the accuracy of a competitive web search ranking algorithm by as much as 31% relative to the original performance.
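As a rough, hypothetical illustration of the idea summarized above (not the authors' actual system), the short Python sketch below aggregates implicit-feedback features such as click counts and dwell time for each query-result pair and combines them with a baseline ranking score to re-order the top results. All names here (Result, click_count, mean_dwell_secs) and the hand-set weights are assumptions made for illustration; in the paper, the combination of ranking features and feedback features is learned by a supervised ranking algorithm from labeled relevance judgments.

# A minimal sketch, not the paper's implementation: re-rank results by
# combining a baseline ranker's score with implicit-feedback features.
from dataclasses import dataclass


@dataclass
class Result:
    url: str
    baseline_score: float          # score from the original ranking features
    click_count: int = 0           # aggregated clicks for this query-URL pair
    mean_dwell_secs: float = 0.0   # average time users spent on the page


def feedback_score(r: Result, weights=(1.0, 0.3, 0.01)) -> float:
    # Hypothetical hand-set linear combination; the paper instead learns how
    # to weight feedback features with a trained ranker.
    w_base, w_click, w_dwell = weights
    return (w_base * r.baseline_score
            + w_click * r.click_count
            + w_dwell * r.mean_dwell_secs)


def rerank(results: list[Result]) -> list[Result]:
    # Re-order the candidate results by the feedback-augmented score.
    return sorted(results, key=feedback_score, reverse=True)


if __name__ == "__main__":
    candidates = [
        Result("http://example.com/a", baseline_score=2.1),
        Result("http://example.com/b", baseline_score=1.8,
               click_count=40, mean_dwell_secs=55.0),
    ]
    for r in rerank(candidates):
        print(r.url, round(feedback_score(r), 2))

With the toy numbers above, the result with substantial click and dwell-time evidence overtakes the one with the slightly higher baseline score, which is the kind of re-ordering effect the abstract quantifies.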


Published in

SIGIR '06: Proceedings of the 29th Annual International ACM SIGIR Conference on Research and Development in Information Retrieval
August 2006, 768 pages
ISBN: 1595933697
DOI: 10.1145/1148170

            Copyright © 2006 ACM


Publisher: Association for Computing Machinery, New York, NY, United States

Acceptance Rates

Overall Acceptance Rate: 792 of 3,983 submissions, 20%
