ABSTRACT
The presence of information from multiple sources on the internet requires users to evaluate its credibility before use. Research suggests that internet users have difficulty accessing the information they need and pay too little attention to its credibility. We present the design and implementation of an automated Web Credibility Assessment Support Tool (WebCAST) that considers multiple factors (type of website, popularity, sentiment, date of last update, reputation, and reviews based on users' ratings reflecting personal experience) to assess the credibility of information and return a summary indication of a website's credibility. We use the Potentially All Pairwise RanKings of all possible Alternatives (PAPRIKA) method of Multi-Criteria Decision Analysis (MCDA) to assign weights to the scale values on each factor, representing the relative importance of the attributes. We evaluated the tool empirically by computing the correlation between tool-generated credibility scores and those of human judges; the resulting correlation of 0.89 supports the validity of the tool. In the future, the proposed tool could help students learn the process of credibility assessment.
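The scoring scheme the abstract describes is an additive multi-attribute value model: each factor is rated on a scale, and the scale values are combined using weights derived via PAPRIKA. A minimal sketch of that combination step is shown below; the factor names mirror the abstract, but the weights and scale values here are illustrative placeholders, not the ones elicited in the paper.

```python
# Sketch of an additive multi-attribute credibility score.
# WEIGHTS are hypothetical; in WebCAST they would come from PAPRIKA's
# pairwise-ranking elicitation rather than being hand-assigned.
WEIGHTS = {
    "website_type": 0.25,
    "popularity": 0.20,
    "sentiment": 0.10,
    "last_update": 0.15,
    "reputation": 0.20,
    "user_rating": 0.10,
}

def credibility_score(scale_values):
    """Combine per-factor scale values (each in [0, 1]) into one score."""
    return sum(WEIGHTS[f] * scale_values[f] for f in WEIGHTS)

# Example website rated on each factor (illustrative values).
site = {
    "website_type": 1.0,   # e.g. an institutional domain rated highest
    "popularity": 0.8,
    "sentiment": 0.6,
    "last_update": 0.9,
    "reputation": 0.7,
    "user_rating": 0.5,
}
print(round(credibility_score(site), 3))  # → 0.795
```

The weighted sum keeps the summary score in the same [0, 1] range as the per-factor values because the weights sum to one.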