Social information seeking


Social information seeking (SIS) is a field of research that studies the situations, motivations, and methods of people seeking and sharing information in participatory online social sites, such as Yahoo! Answers, Answerbag, and WikiAnswers, as well as the building of systems to support such activities. Closely related topics include traditional and virtual reference services, information retrieval, information extraction, and knowledge representation.

Background

Social information seeking is often materialized in online question-answering (QA) websites, which are driven by a community. Such QA sites have emerged in the past few years as an enormous market, so to speak, for the fulfillment of information needs. Estimates of the volume of questions answered are difficult to come by, but the number of questions answered on social/community QA (cQA) sites likely far exceeds the number answered by library reference services,[1] which until recently were one of the few institutional sources for such question answering. cQA sites make their content – the questions and associated answers submitted on the site – available on the open web and indexable by search engines, so that web users issuing new queries can find answers provided for previously asked questions.

The popularity of such sites has increased dramatically over the past several years. Major sites that provide a general platform for questions of all types include Yahoo! Answers, Answerbag, and Quora, while other sites focus on particular fields; for example, StackOverflow (computing).

Social Q&A, or cQA, according to Shah et al.,[2] consists of three components: a mechanism for users to submit questions in natural language, a venue for users to submit answers to questions, and a community built around this exchange. Viewed in that light, online communities have performed a question-answering function perhaps since the advent of Usenet and Bulletin Board Systems, so in one sense cQA is nothing new. Websites dedicated to cQA, however, have emerged on the web only within the past few years: the first cQA site was the Korean Naver Knowledge iN, launched in 2002, while the first English-language cQA site was Answerbag, launched in April 2003. Despite this short history, cQA has already attracted a great deal of attention from researchers investigating information seeking behaviors,[3] selection of resources,[4] social annotations,[5] user motivations,[6] comparisons with other types of question answering services,[7] and a range of other information-related behaviors.

Research questions

Some of the interesting and important research questions in this area include:

  • What causes people to be involved in social Q&A?
  • What is the motivation of people who participate in social Q&A?
  • Why do questioners choose social Q&A as a source to find information?
  • Why do they ask questions online to people whose background or expertise may be unverified?
  • Why do they choose social Q&A over other sources to look for information?
  • What do they expect from answers given by anonymous people on the Web?
  • Why are the answerers willing to share information and knowledge with anonymous people, for free?
  • Why do they spend time and effort finding information and helping others online? Why are they willing to share their personal stories and experiences with others?

Shah et al.[8] provide a detailed research agenda for social Q&A.

References

  1. ^ Janes, J. (2003). The Global Census of Digital Reference. In 5th Annual VRD Conference. San Antonio, TX.
  2. ^ Shah, C., Oh, S., & Oh, J.-S. (2009). Research agenda for social Q&A. Library & Information Science Research, 31(4), 205-209.
  3. ^ Kim, S., Oh, J-S., & Oh, S. (2007). Best-Answer Selection Criteria in a Social Q&A site from the User Oriented Relevance Perspective. Proceedings of the 70th Annual Meeting of the American Society for Information Science and Technology (ASIST ‘07), 44.
  4. ^ Harper, F. M., Raban, D. R., Rafaeli, S., & Konstan, J. A. (2008). Predictors of answer quality in online Q&A sites. In Proceedings of the 26th Annual SIGCHI Conference on Human Factors in Computing Systems (pp. 865−874). New York: ACM.
  5. ^ Gazan, R. (2008). Social annotations in digital library collections. D-Lib Magazine, 14(11/12). Available from http://www.dlib.org/dlib/november08/gazan/11gazan.html.
  6. ^ Shah, C., Oh, J. S., & Oh, S. (2008). Exploring characteristics and effects of user participation in online social Q&A sites. First Monday, 13(9). Available from http://www.uic.edu/htbin/cgiwrap/bin/ojs/index.php/fm/article/view/2182/2028.
  7. ^ Su, Q., Pavlov, D., Chow, J., & Baker, W. (2007). Internet-scale collection of human-reviewed data. In C. L. Williamson, M. E. Zurko, P. E. Patel-Schneider, & P. J. Shenoy (Eds.), Proceedings of the 16th International Conference on World Wide Web (pp. 231−240). New York: ACM.
  8. ^ Shah, C., Oh, S., & Oh, J. S. (2009). Research agenda for social Q&A. Library & Information Science Research, 31(4), 205-209. Retrieved January 2, 2011.