A reputation system computes and publishes reputation scores for a set of objects (e.g. service providers, services, goods or entities) within a community or domain, based on a collection of opinions that other entities hold about the objects. The opinions are typically passed as ratings to a central place where all perceptions, opinions and ratings can be accumulated. A reputation center uses a specific reputation algorithm to dynamically compute the reputation scores based on the received ratings. Reputation is a sign of trustworthiness manifested as testimony by other people. New expectations and realities about the transparency, availability, and privacy of people and institutions are emerging. Reputation management – the selective exposure of personal information and activities – is an important element to how people function in networks as they establish credentials, build trust with others, and gather information to deal with problems or make decisions.
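One common family of reputation algorithms aggregates ratings with a Bayesian average, so that objects with few ratings are pulled toward a global prior rather than jumping to extremes after a single review. The sketch below is purely illustrative (the class name, 1–5 rating scale, and prior values are assumptions, not any deployed system's algorithm):

```python
from dataclasses import dataclass, field

# Illustrative sketch of a reputation center that collects ratings and
# computes scores with a Bayesian average (not any specific system's
# published algorithm).

@dataclass
class ReputationCenter:
    prior_mean: float = 3.0   # assumed global average rating on a 1-5 scale
    prior_weight: int = 5     # how many ratings' worth of trust in the prior
    ratings: dict = field(default_factory=dict)

    def rate(self, obj: str, score: float) -> None:
        """Record one rating for an object."""
        self.ratings.setdefault(obj, []).append(score)

    def reputation(self, obj: str) -> float:
        """Bayesian average: (C*m + sum(ratings)) / (C + n)."""
        rs = self.ratings.get(obj, [])
        return (self.prior_weight * self.prior_mean + sum(rs)) / (
            self.prior_weight + len(rs)
        )

center = ReputationCenter()
center.rate("seller_a", 5.0)          # a single perfect rating
for _ in range(20):
    center.rate("seller_b", 4.5)      # many consistently good ratings
# seller_b outranks seller_a despite the lower per-rating score,
# because its reputation is backed by more evidence.
```

The prior keeps a newcomer with one 5-star rating from instantly outranking an established object with a long track record, which is why variants of this scheme are popular for rating aggregation.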
Reputation systems are related to recommender systems and collaborative filtering, but with the difference that reputation systems produce scores based on explicit ratings from the community, whereas recommender systems use some external set of entities and events (such as the purchase of books, movies, or music) to generate marketing recommendations to users. The role of reputation systems is to facilitate trust, and often functions by making the reputation more visible.
Howard Rheingold states that online reputation systems are 'computer-based technologies that make it possible to manipulate in new and powerful ways an old and essential human trait'. Rheingold suggests that these systems arose from the need of Internet users to gain trust in the individuals they transact with online. The innate human trait he points to is that gossip 'keeps us up to date on who to trust, who other people trust, who is important, and who decides who is important'. Internet sites such as eBay and Amazon, he argues, seek to serve this consumer trait and are 'built around the contributions of millions of customers, enhanced by reputation systems that police the quality of the content and transactions exchanged through the site'.
The emerging sharing economy increases the importance of trust in peer-to-peer marketplaces and services. Users can build up reputation and trust in individual systems, but usually have no way of carrying it over to other systems. Rachel Botsman and Roo Rogers argue in their book What's Mine is Yours (2010) that 'it is only a matter of time before there is some form of network that aggregates your Reputation capital across multiple forms of Collaborative Consumption'. These systems, often referred to as reputation banks, try to give users a platform to manage their reputation capital across multiple systems.
Notable examples of practical applications
- Search: web (see PageRank)
- eCommerce: eBay, Epinions, Bizrate, Trustpilot
- Social news: Reddit, Digg, Imgur
- Programming communities: Advogato, freelance marketplaces, Stack Overflow
- Wikis: Increase contribution quantity and quality (Dencheva, Prause & Prinz 2011)
- Internet Security: TrustedSource
- Question-and-Answer sites: Quora, Yahoo! Answers, Gutefrage.net
- Email: anti-spam techniques, reputation lookup (RapLeaf)
- Personal Reputation: CouchSurfing (for travelers)
- Non Governmental organizations (NGOs): GreatNonProfits.org, GlobalGiving
- Professional reputation of translators and translation outsourcers: BlueBoard at ProZ.com
- All purpose reputation system: Yelp, Inc.
- Academia: general bibliometric measures, e.g. the h-index of a researcher.
Reputation as a resource
High reputation capital often confers benefits upon the holder. For example, a wide range of studies have found a positive correlation between seller rating and asking price on eBay, indicating that high reputation can help users obtain more money for their items. High product reviews on online marketplaces can also help drive higher sales volumes.
Abstract reputation can be used as a kind of resource, to be traded away for short-term gains or built up by investing effort. For example, a company with a good reputation may sell lower-quality products for higher profit until their reputation falls, or they may sell higher-quality products to increase their reputation. Some reputation systems go further, making it explicitly possible to spend reputation within the system to derive a benefit. For example, on the Stack Overflow community, reputation points can be spent on question "bounties" to incentivize other users to answer the question.
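The "reputation as a spendable resource" idea can be sketched as a small ledger. The mechanics below (escrowing points when a bounty is posted, transferring them on award) are loosely modeled on the Stack Overflow bounty described above, but the class and function names are invented for illustration:

```python
# Minimal sketch, under assumed mechanics, of spending reputation inside a
# system: bounty points are deducted (escrowed) when posted and credited to
# whoever earns the bounty.

class Account:
    def __init__(self, reputation: int = 0):
        self.reputation = reputation

def post_bounty(asker: Account, amount: int) -> int:
    """Escrow `amount` reputation from the asker; returns the bounty size."""
    if amount > asker.reputation:
        raise ValueError("not enough reputation to spend")
    asker.reputation -= amount   # deducted up front, whether or not awarded
    return amount

def award_bounty(amount: int, answerer: Account) -> None:
    """Transfer the escrowed bounty to the answering user."""
    answerer.reputation += amount

alice, bob = Account(500), Account(120)
bounty = post_bounty(alice, 100)
award_bounty(bounty, bob)
print(alice.reputation, bob.reputation)   # 400 220
```

Escrowing up front is the design choice that makes reputation behave like a genuine resource: the spender commits the points before knowing whether the incentive pays off.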
Even without an explicit spending mechanism in place, reputation systems often make it easier for users to spend their reputation without harming it excessively. For example, a driver with a high ride acceptance score (a metric often used for driver reputation) on a ride-sharing service may opt to be more selective about their clientele, lowering their acceptance score but improving their driving experience. With the explicit feedback provided by the service, drivers can carefully manage their selectivity to avoid being penalized too heavily.
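The trade-off the driver makes can be computed directly. In this hypothetical sketch, the acceptance score is taken to be accepted rides divided by offered rides, and the penalty threshold of 0.8 is an assumption for illustration:

```python
# Hypothetical illustration of managing a reputation metric: a driver checks
# whether declining one more ride request would drop an acceptance score
# (accepted / offered) below an assumed penalty threshold.

def acceptance_score(accepted: int, offered: int) -> float:
    """Fraction of offered rides the driver accepted."""
    return accepted / offered if offered else 1.0

def can_afford_to_decline(accepted: int, offered: int,
                          threshold: float = 0.8) -> bool:
    """Would the score stay at or above the threshold after one decline?"""
    return acceptance_score(accepted, offered + 1) >= threshold

print(can_afford_to_decline(90, 100))   # True: 90/101 is roughly 0.89
print(can_afford_to_decline(80, 100))   # False: 80/101 falls below 0.8
```

This is exactly the kind of calculation the explicit feedback from the service enables: the driver can stay selective right up to the point where the penalty would apply.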
Attacks
Reputation systems are in general vulnerable to attacks, and many types of attacks are possible. A typical example is the so-called Sybil attack, where an attacker subverts the reputation system by creating a large number of pseudonymous entities and using them to gain a disproportionately large influence. A reputation system's vulnerability to a Sybil attack depends on how cheaply Sybils can be generated, the degree to which the reputation system accepts input from entities that do not have a chain of trust linking them to a trusted entity, and whether the reputation system treats all entities identically.
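A toy simulation makes the Sybil attack concrete. The naive scheme below averages ratings and treats every rater identically (the worst case named above); all names and numbers are invented for illustration:

```python
from statistics import mean

# Toy demonstration of a Sybil attack against a naive reputation scheme
# that averages ratings and weights every rater equally.

def naive_reputation(ratings: list[tuple[str, float]]) -> float:
    """Average score, ignoring who the raters are."""
    return mean(score for _, score in ratings)

honest = [(f"user{i}", 1.0) for i in range(10)]    # ten genuine 1-star ratings
sybils = [(f"sybil{i}", 5.0) for i in range(40)]   # forty fake identities

print(naive_reputation(honest))            # 1.0 -- the target's true standing
print(naive_reputation(honest + sybils))   # 4.2 -- inflated by cheap Sybils
```

Because identities cost the attacker nothing and all raters are weighted equally, forty fake accounts fully drown out ten honest ones; defenses typically make Sybils expensive to create or weight raters by a chain of trust instead of equally.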
References
- Slee, Tom (September 29, 2013). "Some Obvious Things About Internet Reputation Systems".
- Lee Rainie and Barry Wellman, Networked: The New Social Operating System. MIT Press, 2012.
- Resnick, P.; Zeckhauser, R.; Friedman, E.; Kuwabara, K. (2000). "Reputation Systems" (PDF). Communications of the ACM. doi:10.1145/355112.355122.
- Jøsang, A.; Ismail, R.; Boyd, C. (2007). "A Survey of Trust and Reputation Systems for Online Service Provision" (PDF). Decision Support Systems 43 (2): 618–644. doi:10.1016/j.dss.2005.05.019.
- Tanz, Jason (May 23, 2014). "How Airbnb and Lyft Finally Got Americans to Trust Each Other".
- Botsman, Rachel (2010). What's Mine is Yours. New York: Harper Business. ISBN 0061963542.
- Ye, Qiang (2013). "In-Depth Analysis of the Seller Reputation and Price Premium Relationship: A Comparison Between eBay US And Taobao China" (PDF). Journal of Electronic Commerce Research 14 (1).
- Winfree, Jason A. (2003). "Collective Reputation and Quality" (PDF). American Agricultural Economics Association Meetings.
- Jøsang, A.; Golbeck, J. (September 2009). Challenges for Robust Trust and Reputation Systems (PDF). Proceedings of the 5th International Workshop on Security and Trust Management (STM 2009). Saint Malo, France.
- Lazzari, Marco (March 2010). An experiment on the weakness of reputation algorithms used in professional social networks: the case of Naymz. Proceedings of the IADIS International Conference e-Society 2010. Porto, Portugal.
- Dellarocas, C. (2003). "The Digitization of Word-of-Mouth: Promise and Challenges of Online Reputation Mechanisms" (PDF). Management Science 49 (10): 1407–1424. doi:10.1287/mnsc.49.10.1407.17308.
- Vavilis, S.; Petković, M.; Zannone, N. (2014). "A reference model for reputation systems" (PDF). Decision Support Systems 61: 147–154. doi:10.1016/j.dss.2014.02.002.
- Dencheva, S.; Prause, C. R.; Prinz, W. (September 2011). Dynamic self-moderation in a corporate wiki to improve participation and contribution quality (PDF). Proceedings of the 12th European Conference on Computer Supported Cooperative Work (ECSCW 2011). Aarhus, Denmark.
- D. Quercia, S. Hailes, L. Capra. Lightweight Distributed Trust Propagation. ICDM 2007.
- R. Guha, R. Kumar, P. Raghavan, A. Tomkins. Propagation of Trust and Distrust. WWW 2004.
- A. Cheng, E. Friedman. Sybilproof reputation mechanisms. SIGCOMM Workshop on Economics of Peer-to-Peer Systems, 2005.
- Hamed Alhoori, Omar Alvarez, Richard Furuta, Miguel Muñiz, Eduardo Urbina: Supporting the Creation of Scholarly Bibliographies by Communities through Online Reputation Based Social Collaboration. ECDL 2009: 180-191
- Sybil Attacks Against Mobile Users: Friends and Foes to the Rescue by Daniele Quercia and Stephen Hailes. IEEE INFOCOM 2010.
- J. R. Douceur. The Sybil Attack. IPTPS 2002.
- Hoffman, K.; Zage, D.; Nita-Rotaru, C. (2009). A survey of attack and defense techniques for reputation systems. ACM Computing Surveys, 42(1), 1.
- Rheingold, Howard (2002). Smart Mobs: The Next Social Revolution. Perseus, Cambridge, Massachusetts.
- Cattalibys, K. (2010). "I could be someone else - social networks, pseudonyms and sockpuppets". Schizoaffective disorders 49 (3).
- Zhang, Jie; Cohen, Robin (2006). Trusting Advice from Other Buyers in E-Marketplaces: The Problem of Unfair Ratings (PDF). Proceedings of the Eighth International Conference on Electronic Commerce (ICEC). New Brunswick, Canada.