A reputation system computes and publishes reputation scores for a set of objects (e.g. service providers, services, goods or entities) within a community or domain, based on a collection of opinions that other entities hold about the objects. The opinions are typically passed as ratings to a reputation center, which uses a specific reputation algorithm to dynamically compute the reputation scores from the received ratings.
Entities in a community use reputation scores for decision making, e.g. whether or not to buy a specific service or good. An object with a high reputation score will normally attract more business than an object with a low reputation score. It is therefore in the interest of objects to have a high reputation score.
Since the collective opinion in a community determines an object's reputation score, reputation systems represent a form of collaborative sanctioning and praising. A low score represents a collaborative sanctioning of an object that the community perceives as having or providing low quality. Similarly, a high score represents a collaborative praising of an object that the community perceives as having or providing high quality. Reputation scores change dynamically as a function of incoming ratings. A high score can quickly be lost if rating entities start providing negative ratings. Similarly, it is possible for an object with a low score to recover and regain a high score.
Reputation systems are related to recommender systems and collaborative filtering, but with the difference that reputation systems produce scores based on explicit ratings from the community, whereas recommender systems use some external set of entities and events (such as the purchase of books, movies, or music) to generate marketing recommendations to users. The role of reputation systems is to facilitate trust (Resnick et al. 2000) (Jøsang, Ismail & Boyd 2007), and they often function by making reputation more visible.
Reputation systems are often useful in large online communities where users frequently interact with others with whom they have no prior experience, or in communities where user-generated content is posted, such as YouTube or Flickr. In such situations, it is often helpful to base the decision of whether or not to interact with a user on the prior experiences of other users.
Reputation systems may also be coupled with an incentive system to reward good behavior and punish bad behavior. For instance, users with high reputation may be granted special privileges, whereas users with low or unestablished reputation may have limited privileges.
Types of reputation systems
A simple reputation system, employed by eBay, is to record a rating (either positive, negative, or neutral) after each pair of users conducts a transaction. A user's reputation comprises the count of positive and negative transactions in that user's history.
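This counting scheme can be sketched in a few lines of Python. The rating labels and function name below are illustrative, not eBay's actual data model:

```python
from collections import Counter

def feedback_score(ratings):
    """eBay-style feedback score: +1 per positive rating,
    -1 per negative rating; neutral ratings are not counted."""
    counts = Counter(ratings)
    return counts["positive"] - counts["negative"]

# A hypothetical transaction history for one user
history = ["positive", "positive", "neutral", "negative", "positive"]
print(feedback_score(history))  # 3 positives - 1 negative = 2
```

A real deployment would also track the totals separately (e.g. "98% positive out of 500 transactions"), since a raw difference hides how many ratings it is based on.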
More sophisticated algorithms scale an individual entity's contribution to other nodes' reputations by that entity's own reputation. PageRank is such a system, used for ranking web pages based on the link structure of the web. In PageRank, each web page's contribution to another page is proportional to its own pagerank, and inversely proportional to its number of outlinks.
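The core of this scheme can be sketched as a small power iteration over a toy link graph. This is a simplified illustration using the commonly cited damping factor of 0.85, not Google's production algorithm:

```python
def pagerank(links, damping=0.85, iterations=50):
    """Compute PageRank by power iteration.
    links: dict mapping each page to the list of pages it links to."""
    pages = list(links)
    n = len(pages)
    rank = {p: 1.0 / n for p in pages}
    for _ in range(iterations):
        new = {p: (1 - damping) / n for p in pages}
        for p, outlinks in links.items():
            if outlinks:
                # Each page's contribution is proportional to its own rank
                # and inversely proportional to its number of outlinks.
                share = rank[p] / len(outlinks)
                for q in outlinks:
                    new[q] += damping * share
            else:
                # Dangling page: spread its rank evenly over all pages
                for q in pages:
                    new[q] += damping * rank[p] / n
        rank = new
    return rank

# Toy web: A links to B and C, B links to C, C links back to A
web = {"A": ["B", "C"], "B": ["C"], "C": ["A"]}
ranks = pagerank(web)
```

In this toy graph, C ends up with the highest rank because it is linked from both A and B, while B receives only half of A's contribution.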
Reputation systems are also emerging that provide a unified, and in many cases objective, appraisal of the impact of a particular news item, story, blog or online posting on reputation. These systems use algorithms first to capture the data in question, and then to rank and score the item according to whether it improves or degrades the reputation of the individual, company or brand concerned.
Online reputation systems
Howard Rheingold states that online reputation systems are 'computer-based technologies that make it possible to manipulate in new and powerful ways an old and essential human trait'. Rheingold suggests that these systems arose from the need of Internet users to gain trust in the individuals they transact with online. The innate trait he points to is gossip, which 'keeps us up to date on who to trust, who other people trust, who is important, and who decides who is important'. He argues that Internet sites such as eBay and Amazon seek to serve this trait and are 'built around the contributions of millions of customers, enhanced by reputation systems that police the quality of the content and transactions exchanged through the site'.
Other examples of practical applications
- Search: web (see PageRank), blogs (see blog search engines)
- eCommerce: eBay, Epinions, Bizrate, eKomi
- Social news: Slashdot, Reddit, Digg
- Device reputation: iovation Inc.; used by major companies such as UPS
- Programming communities: Advogato, freelance marketplaces, Stack Overflow, Coderwall
- Wikis: Increase contribution quantity and quality (Dencheva, Prause & Prinz 2011)
- Internet Security: TrustedSource
- Email: anti-spam techniques, reputation lookup (RapLeaf)
- Peer-to-peer: identifying trusted nodes
- Personal reputation: Trustribe (for peer-to-peer websites), CouchSurfing (for travelers)
- Non-governmental organizations (NGOs): www.GreatNonProfits.org, GlobalGiving
- Professional reputation of translators and translation outsourcers: BlueBoard at ProZ.com, HFS at Translatorscafe.com
- All purpose reputation system: Yelp, Inc., Customer Lobby
Attacks on reputation systems
Reputation systems are in general vulnerable to attacks, and many types of attacks are possible (Jøsang & Golbeck 2009). A typical example is the so-called Sybil attack where an attacker subverts the reputation system by creating a large number of pseudonymous entities, and using them to gain a disproportionately large influence (Lazzari 2010). A reputation system's vulnerability to a Sybil attack depends on how cheaply Sybils can be generated, the degree to which the reputation system accepts input from entities that do not have a chain of trust linking them to a trusted entity, and whether the reputation system treats all entities identically. It is named after the subject of the book Sybil, a case study of a woman with multiple personality disorder.
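A minimal illustration of why naive aggregation is vulnerable: if reputation is computed as a plain average of ratings, an attacker who can cheaply mint pseudonymous identities can swamp the honest votes. The numbers below are invented purely for illustration:

```python
def average_rating(ratings):
    """Naive reputation: the plain mean of all submitted ratings (1-5 scale)."""
    return sum(ratings) / len(ratings)

honest = [2, 3, 2, 3]   # the honest community rates the target low
sybils = [5] * 20       # the attacker's pseudonymous identities all rate 5

print(average_rating(honest))           # 2.5 before the attack
print(average_rating(honest + sybils))  # inflated well above the honest consensus
```

Defenses typically make Sybils expensive (identity verification, proof of work) or weight each rating by the rater's own standing in a trust graph, so that unconnected pseudonyms carry little influence.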
See also
- Reputation management
- Collaborative filtering
- Web of trust
- Trust metric
- Online participation
- Subjective logic
- Social translucence
- Honor system
References
- Resnick, P.; Zeckhauser, R.; Friedman, E.; Kuwabara, K. (2000). "Reputation Systems". Communications of the ACM.
- Dellarocas, C. (2003). "The Digitization of Word-of-Mouth: Promise and Challenges of Online Reputation Mechanisms". Management Science 49 (10): 1407–1424. doi:10.1287/mnsc.49.10.1407.17308.
- Jøsang, A.; Ismail, R.; Boyd, C. (2007). "A Survey of Trust and Reputation Systems for Online Service Provision". Decision Support Systems 43 (2).
- Dencheva, S.; Prause, C. R.; Prinz, W. (September 2011). "Dynamic self-moderation in a corporate wiki to improve participation and contribution quality". Proceedings of the 12th European Conference on Computer Supported Cooperative Work (ECSCW 2011). Aarhus, Denmark.
- D. Quercia, S. Hailes, L. Capra. Lightweight Distributed Trust Propagation. ICDM 2007.
- R. Guha, R. Kumar, P. Raghavan, A. Tomkins. Propagation of Trust and Distrust. WWW 2004.
- A. Cheng, E. Friedman. Sybilproof Reputation Mechanisms. SIGCOMM Workshop on Economics of Peer-to-Peer Systems, 2005.
- Hamed Alhoori, Omar Alvarez, Richard Furuta, Miguel Muñiz, Eduardo Urbina: Supporting the Creation of Scholarly Bibliographies by Communities through Online Reputation Based Social Collaboration. ECDL 2009: 180-191
- Sybil Attacks Against Mobile Users: Friends and Foes to the Rescue by Daniele Quercia and Stephen Hailes. IEEE INFOCOM 2010.
- J. R. Douceur. The Sybil Attack. IPTPS 2002.
- Rheingold, Howard (2002). Smart Mobs: The Next Social Revolution. Perseus, Cambridge, Massachusetts.
- Adams, Ethan (October 28, 2010). The Reputation Management Online Guide (Third ed.). p. 251. ISBN 978-0-8058-6426-7.
- Cattalibys, K. (2010). "I could be someone else - social networks, pseudonyms and sockpuppets". Schizoaffective disorders 49 (3).
- Jøsang, A.; Golbeck, J. (September 2009). "Challenges for Robust Trust and Reputation Systems". Proceedings of the 5th International Workshop on Security and Trust Management (STM 2009). Saint Malo, France.
- Lazzari, Marco (March 2010). "An experiment on the weakness of reputation algorithms used in professional social networks: the case of Naymz". Proceedings of the IADIS International Conference e-Society 2010. Porto, Portugal.
- Zhang, Jie; Cohen, Robin (2006). "Trusting Advice from Other Buyers in E-Marketplaces: The Problem of Unfair Ratings". Proceedings of the Eighth International Conference on Electronic Commerce (ICEC). New Brunswick, Canada.
External links
- Reputation Research Network
- Building Web Reputation Systems - Book by Randy Farmer and Bryce Glass, 2010, O'Reilly Media.
- Reputation Systems - 2008 tutorial by Yury Lifshits
- Community Equity Specification - 2008 Sun specification for reputation system.
- Contracts in Cyberspace - 2008 essay (book chapter) by David D. Friedman.
- OASIS Open Reputation Management Systems (ORMS) TC - Appears to be defunct (last update 2010)