
A day without a search engine: an experimental study of online and offline searches

  • Original Paper
  • Published in: Experimental Economics

Abstract

With the evolution of the Web and the development of web-based search engines, online searching has become a common method for obtaining information. Given this popularity, the question arises as to how much time people save by using search engines for their information needs compared to offline sources, as well as how online searching affects both search experiences and search outcomes. Using a random sample of queries from a major search engine and a sample of reference questions from the Internet Public Library (IPL), we conduct a real-effort experiment to compare online and offline search experiences and outcomes. We find that participants are significantly more likely to find an answer on the Web (100 %) than through offline searching (between 87 % and 90 %). Restricting our analysis to the set of questions for which participants find answers in both treatments, a Web search takes on average 7 (9) minutes, whereas the corresponding offline search takes 22 (19) minutes for a search-engine (IPL) question. Furthermore, while raters judge library sources to be significantly more trustworthy and authoritative than the corresponding Web sources, they judge Web sources to be significantly more relevant. Balancing all factors, we find that overall source quality is not significantly different between the two treatments for the set of search-engine questions. For IPL questions, however, non-Web sources are judged to have significantly higher overall quality than the corresponding Web sources. In comparison, for factual questions, Web search results are significantly more likely to be correct (66 % vs. 43 %). Lastly, post-search questionnaires reveal that participants find online searching more enjoyable than offline searching.

Notes

  1. Real-effort experiments have become increasingly popular in experimental economics (Carpenter et al. 2010). The main advantage of a real-effort experiment compared to an abstract framing is increased external validity.

  2. Dozens of articles have been written about the IPL (McCrea 2004). Additionally, the IPL has received many awards, including the 2010 Best Free Reference Web Sites award from the Reference and User Services Association.

  3. http://www.economist.com/news/finance-and-economics/21573091-how-quantify-gains-internet-has-brought-consumers-net-benefits.

  4. http://www.nytimes.com/2013/05/01/business/statistics-miss-the-benefits-of-technology.html?pagewanted=1&_r=2&ref=eduardoporter&.

  5. McKinsey estimates that the Internet accounts for 21 % of GDP growth over the past five years in mature countries (du Rausas et al. 2011).

  6. Simple search tasks are search tasks for which the answer can be found in a single document, while complex search tasks are open-ended and without a clear answer.

  7. Since 1992, the U.S. National Institute of Standards and Technology (NIST) has sponsored a workshop series that compares and evaluates different information retrieval systems, called the Text REtrieval Conference (TREC) (Voorhees and Harman 2005).

  8. Our two graduate students were recruited from a pool of students who took either SI 665 (Online Searching and Databases) or SI 666 (Organization of Information Resources), each of which offers students a variety of skills and knowledge regarding information searches.

  9. The use of three raters follows the evaluation methodology from the Text REtrieval Conference (TREC), which has been a source of evaluation methods and protocols for IIR researchers (Voorhees and Harman 2005).

  10. The 356 unique queries correspond to 1420 queries if duplicates are included.

  11. We thank Maurita Holland for giving us access to the IPL database. The IPL maintains a complete data archive of all questions and answers provided. From Fall 2003 on, each volunteer was asked to write down the actual amount of time it took to answer a question, which is the main reason we selected our questions from Fall 2003 onwards. We do not have access to the questions sent to the IPL after 2009.

  12. Based on a survey conducted on the IPL in 2004, the most frequently chosen response to the question, “Why did you choose our [Ask-A-Question] service?” was “Wasn’t able to answer the question on my own” (69.72 %). Other reasons included: “to get reliable and authoritative information,” “looking for free resources,” “lack of alternative library services,” “good experience in the past,” and “only service I knew about” (Chang and Holland 2005).

  13. Note that, while 200 out of 600 queries from the search engine are classified as Web, only 1 out of the 108 IPL questions is classified as Web.

  14. “Scoring rubrics are typically employed when a judgement of quality is required and may be used to evaluate a broad range of subjects and activities. One common use of scoring rubrics is to guide the evaluation of writing samples. Judgements concerning the quality of a given writing sample may vary depending upon the criteria established by the individual evaluator. One evaluator may heavily weigh the evaluation process upon the linguistic structure, while another evaluator may be more interested in the persuasiveness of the argument. A high quality essay is likely to have a combination of these and other factors. By developing a pre-defined scheme for the evaluation process, the subjectivity involved in evaluating an essay becomes more objective” (Moskal 2000).

  15. Based on our private communication with the Office of New Student Programs at the University of Michigan, 95 % of freshmen attend the summer orientation. The remaining 5 % of students who cannot attend summer orientation are required to attend a final fall orientation instead.

  16. Questions on the affective aspects of the search process are derived from White et al. (2003).

  17. Both courses prepare students for reference services in settings such as libraries or other information centers, requiring students to work with actual reference questions. In both courses, students gain expertise in searching, evaluating information resources, and answering questions submitted online.

  18. There are six main cases of intraclass correlation coefficients (ICC), distinguished by the numbers in parentheses following the letters ICC. The first number indicates the assumed statistical model; Case 3 assumes that judges are fixed and not drawn from a random population. The second number indicates the number of raters whose ratings are averaged. More details on the computation and use of ICCs can be found in Shrout and Fleiss (1979). A small numerical sketch of this computation follows these notes.

  19. Rater training took place between February 22 and 25, 2010. The rating of question sets 1–4 was completed between February 22 and March 19, whereas the rating of question set 5 took place between April 28 and 30, 2010.

  20. Recall that, in these studies, simple search tasks are those for which the answer could be found in a single document, while complex search tasks are those for which one needs to combine multiple sources.

  21. Recall that Morae records on-screen activities and keyboard/mouse inputs.

  22. Organic search results are listings on search engine results pages that appear because of their relevance to the search query. The ranking of organic search results is based on search engine algorithms, such as PageRank (Brin and Page 1998). In contrast, sponsored search results are listings on search engine results pages from content providers who pay search engines for traffic from the search engine to their websites. Sponsored search results are often grouped together and appear in a specific region of the results page with a different background color. They are ranked based on the price paid by content providers (Jansen and Mullen 2008). An illustrative power-iteration sketch of PageRank-style ranking follows these notes.

  23. The top ten most visited websites in the world include (in decreasing traffic) Facebook, Google, YouTube, Yahoo!, Baidu (Chinese), Wikipedia, Amazon, QQ (Chinese), Windows Live, and Taobao (Chinese), retrieved from http://www.alexa.com/topsites on May 10, 2013.
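
As mentioned in note 18, inter-rater agreement is assessed with Case 3 intraclass correlations (Shrout and Fleiss 1979). The following is a minimal sketch, not the authors' code: it assumes a hypothetical questions-by-raters score matrix (the function name `icc3` and the sample `scores` are illustrative only) and computes ICC(3,1) and ICC(3,k) from the standard two-way ANOVA mean squares.

```python
import numpy as np

def icc3(ratings):
    """Shrout & Fleiss (1979) Case 3 ICCs for an (n targets x k raters) matrix.

    Returns ICC(3,1) (reliability of a single fixed rater) and
    ICC(3,k) (reliability of the mean of the k raters).
    """
    ratings = np.asarray(ratings, dtype=float)
    n, k = ratings.shape
    grand_mean = ratings.mean()

    # Two-way ANOVA sums of squares: targets (rows) and raters (columns).
    ss_rows = k * ((ratings.mean(axis=1) - grand_mean) ** 2).sum()   # between targets
    ss_cols = n * ((ratings.mean(axis=0) - grand_mean) ** 2).sum()   # between raters
    ss_total = ((ratings - grand_mean) ** 2).sum()
    ss_error = ss_total - ss_rows - ss_cols                          # residual

    bms = ss_rows / (n - 1)                 # between-targets mean square
    ems = ss_error / ((n - 1) * (k - 1))    # error mean square

    icc_31 = (bms - ems) / (bms + (k - 1) * ems)
    icc_3k = (bms - ems) / bms
    return icc_31, icc_3k

# Hypothetical example: 5 questions scored by 3 raters on a 10-point rubric.
scores = [[7, 8, 7],
          [5, 6, 5],
          [9, 9, 8],
          [4, 5, 4],
          [8, 8, 9]]
print(icc3(scores))
```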
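
Note 22 states that organic results are ranked by search-engine algorithms such as PageRank (Brin and Page 1998). As a rough illustration of that ranking idea only, here is a textbook power-iteration sketch on a made-up four-page link graph; the function `pagerank`, the `links` matrix, and the damping value are illustrative assumptions, not the production algorithm of any search engine.

```python
import numpy as np

def pagerank(adjacency, damping=0.85, tol=1e-10):
    """Power-iteration PageRank for a small link graph.

    adjacency[i][j] = 1 if page i links to page j.
    """
    A = np.asarray(adjacency, dtype=float)
    n = A.shape[0]
    out_degree = A.sum(axis=1)
    # Row-stochastic transition matrix; dangling pages link to every page.
    T = np.where(out_degree[:, None] > 0,
                 A / np.maximum(out_degree, 1)[:, None],
                 1.0 / n)
    rank = np.full(n, 1.0 / n)
    while True:
        new_rank = (1 - damping) / n + damping * (T.T @ rank)
        if np.abs(new_rank - rank).sum() < tol:
            return new_rank
        rank = new_rank

# Hypothetical 4-page web: 0 -> 1,2 ; 1 -> 2 ; 2 -> 0 ; 3 -> 2
links = [[0, 1, 1, 0],
         [0, 0, 1, 0],
         [1, 0, 0, 0],
         [0, 0, 1, 0]]
print(pagerank(links))  # higher scores = more important pages in the link graph
```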

References

  • Baily, M. N., Hulten, C., & Campbell, D. (1992). Productivity dynamics in manufacturing plants. Brookings Papers on Economic Activity. Microeconomics, 1992, 187–267.

  • Beaulieu, M., Robertson, S., & Rasmussen, E. (1996). Evaluating interactive systems in TREC. Journal of the American Society for Information Science, 47(1), 85–94.

  • Belkin, N. J., Cool, C., Kelly, D., Kim, G., Kim, J.-Y., Lee, H.-J., Muresan, G., Tang, M.-C., & Yuan, X.-J. (2003). Query length in interactive information retrieval. In Proceedings of the 26th annual international ACM SIGIR conference on research and development in information retrieval (pp. 205–212).

  • Borlund, P. (2000). Experimental components for the evaluation of interactive information retrieval systems. Journal of Documentation, 56(1), 71–90.

  • Brandts, J., & Cooper, D. J. (2007). It’s what you say, not what you pay: an experimental study of manager-employee relationships in overcoming coordination failure. Journal of the European Economic Association, 5(6), 1223–1268.

  • Brin, S., & Page, L. (1998). The anatomy of a large-scale hypertextual Web search engine. Computer Networks and ISDN Systems, 30(1–7), 107–117. Proceedings of the seventh international World Wide Web conference.

  • Broder, A. (2002). A taxonomy of web search. SIGIR Forum, 36(2), 3–10.

  • Brookhart, S. M. (1999). The art and science of classroom assessment: the missing part of pedagogy. ASHE-ERIC Higher Education Report, 27(1).

  • Brynjolfsson, E., & Oh, J.H. (2013). The attention economy: measuring the value of free digital services on the Internet.

  • Brynjolfsson, E., & Hitt, L. (1996). Paradox lost? Firm-level evidence on the returns to information systems spending. Management Science, 42(4), 541–558.

  • Brynjolfsson, E., & Hitt, L. M. (2000). Beyond computation: information technology, organizational transformation and business performance. The Journal of Economic Perspectives, 14(4), 23–48.

  • Bughin, J., Corb, L., Manyika, J., Nottebohm, O., Chui, M., de Muller Barbat, B., & Said, R. (2011). The impact of Internet technologies: search. New York: McKinsey & Co.

  • Carpenter, J., Matthews, P. H., & Schirm, J. (2010). Tournaments and office politics: evidence from a real effort experiment. The American Economic Review, 100(1), 504–517.

  • Chang, H. R., & Holland, M. P. (2005). User satisfaction survey of ask-a-question service at the Internet public library. Internet Reference Services Quarterly, 10(2), 61–73.

  • Chen, Y., Ho, T.-H., & Kim, Y.-M. (2010). Knowledge market design: a field experiment at Google answers. Journal of Public Economic Theory, 12(4), 641–664.

  • Colón-Aguirre, M., & Fleming-May, R. A. (2012). “You just type in what you are looking for”: undergraduates’ use of library resources vs. Wikipedia. The Journal of Academic Librarianship, 38(6), 391–399.

  • comScore (2013). comScore releases August 2013 U.S. search engine rankings, September 2013.

  • Connaway, L. S., Dickey, T. J., & Radford, M. L. (2011). “If it is too inconvenient I’m not going after it:” convenience as a critical factor in information-seeking behaviors. Library & Information Science Research, 33(3), 179–190.

  • du Rausas, M. P., Manyika, J., Hazan, E., Bughin, J., Chui, M., & Said, R. (2011). Internet matters: the Net’s sweeping impact on growth, jobs, and prosperity. Technical Report, McKinsey Global Institute.

  • Fallows, D. (2008). Search engine use. Pew Internet and American Life Project.

  • Giles, J. (2005). Internet encyclopaedias go head to head. Nature, 438, 900–901.

  • Goolsbee, A., & Klenow, P. J. (2006). Valuing consumer products by the time spent using them: an application to the Internet. The American Economic Review, 96(2), 108–113.

  • Grohowski, R. B., McGoff, C., Vogel, D. R., Martz, W. B., & Nunamaker, J. F. (1990). Implementation of group support systems at IBM. MIS Quarterly, 14(4), 369–383.

  • Hardy, A. P. (1982). The selection of channels when seeking information: cost/benefit vs least-effort. Information Processing & Management, 18(6), 289–293.

  • Hargittai, E. (2002). Second-level digital divide: differences in people’s online skills. First Monday, 7(4).

  • Houser, D., & Xiao, E. (2011). Classification of natural language messages using a coordination game. Experimental Economics, 14, 1–14.

  • Jansen, B. J., & Mullen, T. (2008). Sponsored search: an overview of the concept, history, and technology. International Journal of Electronic Business, 6(2), 114–131.

  • Kelly, D., Cushing, A., Dostert, M., Niu, X., & Gyllstrom, K. (2010). Effects of popularity and quality on the usage of query suggestions during information search. In Proceedings of the 28th international conference on human factors in computing systems (pp. 45–54).

  • Li, Y., & Belkin, N. J. (2010). An exploration of the relationships between work task and interactive information search behavior. Journal of the American Society for Information Science and Technology, 61(9), 1771–1789.

  • Liu, Z. (2006). Print vs. electronic resources: a study of user perceptions, preferences, and use. Information Processing & Management, 42, 583–592.

  • McCrea, R. T. (2004). Evaluation of two library-based and one expert reference service on the Web. Library Review, 53(1), 11–16.

  • Moskal, B. M. (2000). Scoring rubrics: what, when and how? Practical Assessment, Research & Evaluation, 7(3).

  • Reavley, N. J., Mackinnon, A. J., Morgan, A. J., Alvarez-Jimenez, M., Hetrick, S. E., Killackey, E., Nelson, B., Purcell, R., Yap, M. B. H., & Jorm, A. F. (2012). Quality of information sources about mental disorders: a comparison of Wikipedia with centrally controlled web and printed sources. Psychological Medicine, 42(8), 1753–1762.

  • Rector, L. H. (2008). Comparison of Wikipedia and other encyclopedias for accuracy, breadth, and depth in historical articles. Reference Services Review, 36(1), 7–22.

  • Rieh, S. Y. (2002). Judgment of information quality and cognitive authority in the Web. Journal of the American Society for Information Science and Technology, 53(2), 145–161.

  • Rose, D. E., & Levinson, D. (2004). Understanding user goals in web search. In WWW ’04: proceeding of the 13th international conference on World Wide Web, New York City, NY.

  • Ruthven, I. (2008). Interactive information retrieval. Annual Review of Information Science and Technology, 42, 43–91.

  • Sathe, N. A., Grady, J. L., & Giuse, N. B. (2002). Print versus electronic journals: a preliminary investigation into the effect of journal format on research processes. Journal of the Medical Library Association, 90(2), 235–243.

  • Shrout, P. E., & Fleiss, J. L. (1979). Intraclass correlations: uses in assessing rater reliability. Psychological Bulletin, 86(2), 420–428.

  • Singer, G., Pruulmann-Vengerfeldt, P., Norbisrath, U., & Lewandowski, D. (2012). The relationship between Internet user type and user performance when carrying out simple vs. complex search tasks. First Monday, 6.

  • Singer, G., Norbisrath, U., & Lewandowski, D. (2013). Ordinary search engine users carrying out complex search tasks. Journal of Information Science, 39(3), 346–358.

  • Tombros, A., Ruthven, I., & Jose, J. M. (2005). How users assess Web pages for information seeking. Journal of the American Society for Information Science and Technology, 56(4), 327–344.

  • Toms, E. G., & Latter, C. (2007). How consumers search for health information. Health Informatics Journal, 13(3), 223–235.

  • Toms, E. G., Freund, L., Kopak, R., & Bartlett, J. C. (2003). The effect of task domain on search. In Proceedings of the 2003 conference of the centre for advanced studies on collaborative research, CASCON ’03 (pp. 303–312).

  • Trifts, V. J., & Toms, E. G. (2007). Consumers’ allocation of cognitive resources in Web-based search: an exploratory study. International Journal of Electronic Business, 5(6), 561–575.

  • van Deursen, A. J. A. M., & van Diepen, S. (2013). Information and strategic Internet skills of secondary students: a performance test. Computers and Education, 63, 218–226.

  • Voorhees, E. M., & Harman, D. K. (2005). The text REtrieval conference. In E. M. Voorhees & D. K. Harman (Eds.), TREC: experiment and evaluation in information retrieval. Cambridge: MIT Press.

  • White, R. W., Jose, J. M., & Ruthven, I. (2003). A task-oriented study on the influencing effects of query-biased summarisation in Web searching. Information Processing & Management, 39(5), 707–733.

Acknowledgements

We would like to thank David Campbell, Jacob Goeree, Nancy Kotzian, Jeffrey MacKie-Mason, Karen Markey, Betsy Masiello, Soo Young Rieh and Hal Varian for helpful comments and discussions, Donna Hayward and her colleagues at the University of Michigan Hatcher Graduate Library for facilitating the non-Web treatment, and Ashlee Stratakis and Dan Stuart for excellent research assistance. Two anonymous referees provided insightful comments which significantly improved the paper. Financial support from the National Science Foundation through grants No. SES-0079001 and SES-0962492, and a Google Research Award is gratefully acknowledged.

Author information

Correspondence to Yan Chen.

Electronic Supplementary Material

Supplementary material is available online (PDF 374 kB).

Cite this article

Chen, Y., Jeon, G.Y. & Kim, YM. A day without a search engine: an experimental study of online and offline searches. Exp Econ 17, 512–536 (2014). https://doi.org/10.1007/s10683-013-9381-9
