Reputation system

From Wikipedia, the free encyclopedia

A reputation system computes and publishes reputation scores for a set of objects (e.g. service providers, services, goods or entities) within a community or domain, based on a collection of opinions that other entities hold about the objects. The opinions are typically passed as ratings to a central place where all perceptions, opinions and ratings accumulate. A reputation center then uses a specific reputation algorithm to dynamically compute the reputation scores from the received ratings. Reputation is a sign of trustworthiness manifested as testimony by other people.[1] New expectations and realities about the transparency, availability, and privacy of people and institutions are emerging. Reputation management – the selective exposure of personal information and activities – is an important element of how people function in networks as they establish credentials, build trust with others, and gather information to deal with problems or make decisions.[2]

Entities in a community use reputation scores for decision making, e.g. whether or not to buy a specific service or good. An object with a high reputation score will normally attract more business than an object with a low reputation score. It is therefore in the interest of objects to have a high reputation score.

Since the collective opinion in a community determines an object's reputation score, reputation systems represent a form of collaborative sanctioning and praising. A low score represents a collaborative sanctioning of an object that the community perceives as having or providing low quality. Similarly, a high score represents a collaborative praising of an object that the community perceives as having or providing high quality. Reputation scores change dynamically as a function of incoming ratings. A high score can quickly be lost if rating entities start providing negative ratings. Similarly, it is possible for an object with a low score to recover and regain a high score.

Reputation systems are related to recommender systems and collaborative filtering, but with the difference that reputation systems produce scores based on explicit ratings from the community, whereas recommender systems use some external set of entities and events (such as the purchase of books, movies, or music) to generate marketing recommendations to users. The role of reputation systems is to facilitate trust, and they often function by making reputation more visible.[3][4]

Reputation systems are often useful in large online communities in which users frequently have the opportunity to interact with users with whom they have no prior experience, or in communities where user-generated content is posted, such as YouTube or Flickr. In such situations, it is often helpful to base the decision of whether or not to interact with a user on the prior experiences of other users.

Reputation systems may also be coupled with an incentive system to reward good behavior and punish bad behavior. For instance, users with high reputation may be granted special privileges, whereas users with low or unestablished reputation may have limited privileges.
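Such a coupling can be as simple as mapping score ranges to privilege sets. The following sketch is purely illustrative – the thresholds and privilege names are invented, not taken from any particular site:

```python
# Hypothetical privilege tiers keyed on a reputation score. The
# thresholds and privilege names are invented for illustration.

def privileges(score):
    """Return the set of privileges granted at a given reputation score."""
    if score >= 1000:
        return {"post", "vote", "edit_others", "moderate"}
    if score >= 100:
        return {"post", "vote", "edit_others"}
    if score >= 10:
        return {"post", "vote"}
    # Users with low or unestablished reputation keep limited privileges.
    return {"post"}
```

A real system would also need to decide how privileges are revoked when a score drops, since reputation changes dynamically.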

Types of reputation systems[edit]

A simple reputation system, employed by eBay, is to record a rating (either positive, negative, or neutral) after each pair of users conducts a transaction. A user's reputation comprises the counts of positive and negative ratings in that user's history.
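A minimal sketch of such a feedback tally might look as follows; the class and method names are illustrative, not eBay's actual implementation:

```python
from collections import Counter

class FeedbackProfile:
    """Tracks positive/neutral/negative ratings for one user."""

    def __init__(self):
        self.counts = Counter()

    def rate(self, feedback):
        if feedback not in ("positive", "neutral", "negative"):
            raise ValueError("feedback must be positive, neutral, or negative")
        self.counts[feedback] += 1

    def score(self):
        # Net score: positives minus negatives; neutrals are ignored.
        return self.counts["positive"] - self.counts["negative"]

    def positive_percentage(self):
        # Share of non-neutral ratings that are positive.
        rated = self.counts["positive"] + self.counts["negative"]
        if rated == 0:
            return None
        return 100.0 * self.counts["positive"] / rated
```

For example, a user with three positive and one negative rating would have a net score of 2 and a 75% positive rate.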

More sophisticated algorithms scale an individual entity's contribution to other nodes' reputations by that entity's own reputation. PageRank is such a system, used for ranking web pages based on the link structure of the web. In PageRank, each web page's contribution to another page is proportional to its own PageRank score, and inversely proportional to its number of outlinks.
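The proportionality just described can be sketched with a standard power iteration, here assuming the web graph is a dict mapping each page to the pages it links to (this is a simplified illustration, not Google's production algorithm):

```python
def pagerank(links, damping=0.85, iterations=50):
    """Iteratively compute PageRank scores for a dict graph
    mapping each page to a list of pages it links to."""
    pages = list(links)
    n = len(pages)
    rank = {p: 1.0 / n for p in pages}
    for _ in range(iterations):
        new_rank = {p: (1.0 - damping) / n for p in pages}
        for page, outlinks in links.items():
            if not outlinks:
                # Dangling page: spread its rank evenly over all pages.
                for p in pages:
                    new_rank[p] += damping * rank[page] / n
            else:
                # Each outlink receives a share proportional to this page's
                # rank and inversely proportional to its outlink count.
                share = damping * rank[page] / len(outlinks)
                for target in outlinks:
                    new_rank[target] += share
        rank = new_rank
    return rank
```

Running it on a small graph such as `{"a": ["b"], "b": ["a", "c"], "c": ["a"]}` yields scores that sum to 1, with pages that receive more (and better-ranked) inlinks scoring higher.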

Reputation systems are also emerging which provide a unified, and in many cases objective, appraisal of the impact that a particular news item, story, blog or online posting has on reputation. These systems use complex algorithms first to capture the data in question and then to rank and score the item as to whether it improves or degrades the reputation of the individual, company or brand in question.

Online reputation systems[edit]

Howard Rheingold states that online reputation systems are 'computer-based technologies that make it possible to manipulate in new and powerful ways an old and essential human trait'. Rheingold suggests that these systems arose as a result of the need for Internet users to gain trust in the individuals they transact with online. The innate trait he makes note of in humans is that a function of society such as gossip 'keeps us up to date on who to trust, who other people trust, who is important, and who decides who is important'. Internet sites such as eBay and Amazon, he argues, seek to serve this consumer trait and are 'built around the contributions of millions of customers, enhanced by reputation systems that police the quality of the content and transactions exchanged through the site'.

Reputation Banks[edit]

The emerging sharing economy increases the importance of trust in peer-to-peer marketplaces and services.[5] Users can build up reputation and trust in individual systems but don't have the ability to carry them over to other systems. Rachel Botsman and Roo Rogers argue in their book What's Mine is Yours (2010)[6] that 'it is only a matter of time before there is some form of network that aggregates your Reputation capital across multiple forms of Collaborative Consumption'. These systems, often referred to as reputation banks, try to give users a platform to manage their reputation capital across multiple systems. Currently, systems such as ProveTrust, TrustCloud, Trust Science, eRated and Credport try to give users a central hub for their reputation capital.

Other examples of practical applications[edit]

Attacks on reputation systems[edit]

Reputation systems are in general vulnerable to attacks, and many types of attacks are possible.[7] A typical example is the so-called Sybil attack where an attacker subverts the reputation system by creating a large number of pseudonymous entities, and using them to gain a disproportionately large influence.[8] A reputation system's vulnerability to a Sybil attack depends on how cheaply Sybils can be generated, the degree to which the reputation system accepts input from entities that do not have a chain of trust linking them to a trusted entity, and whether the reputation system treats all entities identically. It is named after the subject of the book Sybil, a case study of a woman with multiple personality disorder.
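The effect of a Sybil attack on a naive aggregation rule can be shown with a toy example (this illustration is not drawn from the cited papers): a reputation score computed as a plain mean of ratings is dominated by whoever controls the most identities.

```python
# Toy illustration of Sybil influence on a naive mean-based score.

def mean_reputation(ratings):
    """A naive reputation score: the plain mean of all ratings received."""
    return sum(ratings) / len(ratings)

# Ten honest users rate a poor seller 1 out of 5.
honest = [1] * 10
# The seller creates 40 pseudonymous (Sybil) accounts rating 5 out of 5.
sybils = [5] * 40

print(mean_reputation(honest))           # 1.0 -- the honest consensus
print(mean_reputation(honest + sybils))  # 4.2 -- dominated by the Sybils
```

Because the naive mean weights every identity equally, creating identities cheaply translates directly into influence; defenses typically weight ratings by the rater's own established trust or require a chain of trust to a known entity.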

See also[edit]

References[edit]

  1. ^ Slee, Tom (September 29, 2013). "Some Obvious Things About Internet Reputation Systems". 
  2. ^ Lee Rainie and Barry Wellman, Networked: The New Social Operating System. MIT Press, 2012.
  3. ^ Resnick, P.; Zeckhauser, R.; Friedman, E.; Kuwabara, K. (2000). "Reputation Systems". Communications of the ACM. 
  4. ^ Jøsang, A.; Ismail, R.; Boyd, C. (2007). "A Survey of Trust and Reputation Systems for Online Service Provision". Decision Support Systems 43 (2). 
  5. ^ Tanz, Jason (May 23, 2014). "How Airbnb and Lyft Finally Got Americans to Trust Each Other". 
  6. ^ Botsman, Rachel (2010). What's Mine is Yours. New York: Harper Business. ISBN 0061963542. 
  7. ^ Jøsang, A.; Golbeck, J. (September 2009). "Challenges for Robust Trust and Reputation Systems". Proceedings of the 5th International Workshop on Security and Trust Management (STM 2009). Saint Malo, France. 
  8. ^ Lazzari, Marco (March 2010). "An experiment on the weakness of reputation algorithms used in professional social networks: the case of Naymz". Proceedings of the IADIS International Conference e-Society 2010. Porto, Portugal. 

External links[edit]