Reputation system

From Wikipedia, the free encyclopedia

Reputation systems are programs that allow users to rate each other in online communities in order to build trust through reputation. Common uses of these systems can be found on e-commerce websites such as eBay, Amazon.com, and Etsy, as well as on online advice communities such as Stack Overflow. These reputation systems represent a significant trend in "decision support for Internet mediated service provisions."[1] With the popularity of online communities for shopping, advice, and the exchange of other important information, reputation systems are becoming vitally important to the online experience. The idea of reputation systems is that even if consumers cannot physically try a product or service, or see the person providing information, they can be confident in the outcome of the exchange through the trust built by the reputation system.[1]

Collaborative filtering, used most commonly in recommender systems, is related to reputation systems in that both collect ratings from members of a community.[1] The core difference between reputation systems and collaborative filtering is how they use user feedback. In collaborative filtering, the goal is to find similarities between users in order to recommend products to customers. The role of reputation systems, in contrast, is to gather a collective opinion in order to build trust between users of an online community.
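This distinction can be made concrete with a toy sketch (hypothetical data and function names, not any production system): collaborative filtering compares raters to each other, while a reputation system aggregates everyone's ratings about a single target.

```python
# Toy sketch: the same rating data used two ways.
# Collaborative filtering -> similarity between users.
# Reputation system -> one collective score per rated entity.
from math import sqrt

# ratings[rater][target] = score from 1 to 5 (illustrative data)
ratings = {
    "alice": {"seller1": 5, "seller2": 2},
    "bob":   {"seller1": 4, "seller2": 1},
    "carol": {"seller1": 1, "seller2": 5},
}

def cosine_similarity(a, b):
    """Collaborative filtering: how alike are two raters' tastes?"""
    common = set(a) & set(b)
    if not common:
        return 0.0
    dot = sum(a[t] * b[t] for t in common)
    norm_a = sqrt(sum(a[t] ** 2 for t in common))
    norm_b = sqrt(sum(b[t] ** 2 for t in common))
    return dot / (norm_a * norm_b)

def reputation(target):
    """Reputation system: collective opinion about one target."""
    scores = [r[target] for r in ratings.values() if target in r]
    return sum(scores) / len(scores)

# alice and bob rate similarly, so a recommender would pair them;
# the reputation of seller1 is the average of all three opinions.
print(cosine_similarity(ratings["alice"], ratings["bob"]))
print(reputation("seller1"))  # (5 + 4 + 1) / 3
```

The similarity function answers "who resembles whom", useful for recommendations; the reputation function answers "what does the community think of this entity", useful for trust.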

Types[edit]

Online[edit]

Howard Rheingold states that online reputation systems are 'computer-based technologies that make it possible to manipulate in new and powerful ways an old and essential human trait'. Rheingold suggests that these systems arose as a result of the need of Internet users to gain trust in the individuals they transact with online. The innate human trait he notes is that a social function such as gossip 'keeps us up to date on who to trust, who other people trust, who is important, and who decides who is important'. Internet sites such as eBay and Amazon, he argues, seek to serve this consumer trait, and are 'built around the contributions of millions of customers, enhanced by reputation systems that police the quality of the content and transactions exchanged through the site'.

Reputation banks[edit]

The emerging sharing economy increases the importance of trust in peer-to-peer marketplaces and services.[2] Users can build up reputation and trust in individual systems but usually lack the ability to carry them over to other systems. Rachel Botsman and Roo Rogers argue in their book What's Mine is Yours (2010)[3] that 'it is only a matter of time before there is some form of network that aggregates reputation capital across multiple forms of Collaborative Consumption'. These systems, often referred to as reputation banks, try to give users a platform to manage their reputation capital across multiple systems.

Maintaining effective reputation systems[edit]

The main function of reputation systems is to build a sense of trust among users of online communities. As in brick-and-mortar stores, trust and reputation can be built through customer feedback. Paul Resnick of the Association for Computing Machinery describes three properties that are necessary for reputation systems to operate effectively.[1]

  1. Entities must be long-lived, so that they create accurate expectations of future interactions
  2. Feedback about prior interactions must be captured and distributed
  3. Feedback must be used to guide trust decisions

These three properties are critically important in building reputation, and all revolve around one important element: user feedback. User feedback in reputation systems, whether in the form of comments, ratings, or recommendations, is a valuable piece of information. Without user feedback, reputation systems cannot sustain the environment of trust they are meant to create. Eliciting user feedback raises three related problems.[1] The first is the willingness of users to provide feedback when doing so is not required. If an online community has a large stream of interactions but no feedback is gathered, an environment of trust and reputation cannot form. The second is eliciting negative feedback from users. Many factors contribute to users' reluctance to give negative feedback, the most prominent being a fear of retaliation: when feedback is not anonymous, many users fear reprisal if they rate a transaction negatively. The final problem is eliciting honest feedback. Although there is no concrete method for ensuring the truthfulness of feedback, if a community of honest feedback is established, new users will be more likely to give honest feedback as well.
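The three properties above, and the central role of feedback, can be sketched in a few lines of code. This is an illustrative toy (the class, method names, and +1/-1 rating scale are assumptions, not Resnick's design):

```python
# Toy ledger illustrating Resnick's three properties:
# long-lived identities, captured feedback, feedback-guided trust.
from collections import defaultdict

class ReputationLedger:
    def __init__(self):
        # Property 1: entities persist under one long-lived identifier,
        # so their history predicts future interactions.
        self.feedback = defaultdict(list)

    def record(self, entity, rating):
        # Property 2: capture and distribute feedback (+1 or -1 here).
        self.feedback[entity].append(rating)

    def trust(self, entity):
        # Property 3: use accumulated feedback to guide trust decisions.
        votes = self.feedback[entity]
        if not votes:
            return 0.5  # no history: a neutral prior
        return sum(1 for v in votes if v > 0) / len(votes)

ledger = ReputationLedger()
for r in (+1, +1, -1):
    ledger.record("seller42", r)
print(ledger.trust("seller42"))  # 2 positives out of 3 votes
```

Note how the feedback-elicitation problems show up directly: if users never call `record`, every entity sits at the uninformative neutral prior.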

Other pitfalls of effective reputation systems described by A. Jøsang et al. include change of identities and discrimination. These ideas again tie back to regulating user actions in order to gain accurate and consistent user feedback. When analyzing different types of reputation systems, it is important to look at these specific features in order to determine the effectiveness of each system.

Notable examples of practical applications[edit]

Reputation as a resource[edit]

High reputation capital often confers benefits upon the holder. For example, a wide range of studies have found a positive correlation between seller rating and asking price on eBay,[4] indicating that high reputation can help users obtain more money for their items. High product reviews on online marketplaces can also help drive higher sales volumes.

Abstract reputation can be used as a kind of resource, to be traded away for short-term gains or built up by investing effort. For example, a company with a good reputation may sell lower-quality products for higher profit until their reputation falls, or they may sell higher-quality products to increase their reputation.[5] Some reputation systems go further, making it explicitly possible to spend reputation within the system to derive a benefit. For example, on the Stack Overflow community, reputation points can be spent on question "bounties" to incentivize other users to answer the question.[6]
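An explicit spending mechanism of this kind can be sketched as follows. This is a hedged illustration in the style of Stack Overflow's bounties; the minimum amount and escrow rule here are assumptions for the example, not the site's actual policy:

```python
# Illustrative sketch of spending reputation within a system
# (bounty minimum and escrow behavior are assumed, not official rules).
class Account:
    def __init__(self, reputation):
        self.reputation = reputation

    def offer_bounty(self, amount, minimum=50):
        """Escrow `amount` reputation points as a bounty on a question."""
        if amount < minimum:
            raise ValueError("bounty below the assumed minimum amount")
        if amount > self.reputation:
            raise ValueError("not enough reputation to spend")
        self.reputation -= amount  # points leave the account up front
        return amount              # value now attached to the question

asker = Account(reputation=300)
bounty = asker.offer_bounty(100)
print(asker.reputation)  # 200 left after escrowing the 100-point bounty
```

The key design point is that reputation behaves like a currency balance: it is deducted immediately, which prevents a user from promising more influence than they have earned.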

Even without an explicit spending mechanism in place, reputation systems often make it easier for users to spend their reputation without harming it excessively. For example, a driver with a high ride acceptance score (a metric often used for driver reputation) on a ride-sharing service may opt to be more selective about their clientele, decreasing their acceptance score but improving their driving experience. With the explicit feedback provided by the service, drivers can carefully manage their selectivity to avoid being penalized too heavily.
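The trade-off is simple arithmetic. In this small sketch the score formula, the threshold, and all numbers are hypothetical; real services publish their own metrics:

```python
# Hypothetical acceptance-score trade-off for a ride-sharing driver.
# Score formula and penalty threshold are illustrative assumptions.
def acceptance_score(accepted, offered):
    """Fraction of offered ride requests the driver accepted."""
    return accepted / offered if offered else 1.0

def max_declines(offered, threshold):
    """Most requests a driver can decline while staying at the threshold."""
    return max(0, int(offered - threshold * offered))

print(acceptance_score(90, 100))  # declining 10 of 100 rides -> 0.9
print(max_declines(100, 0.8))     # at an assumed 0.8 floor: 20 declines
```

Because the score is an explicit, published number, the driver can budget declines against the threshold exactly as one would budget any other spendable resource.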

Attacks and defense[edit]

Reputation systems are in general vulnerable to attacks, and many types of attack are possible.[7] Because a reputation system tries to generate an accurate assessment in the face of various factors, including but not limited to an unpredictable user base and potentially adversarial environments, attack and defense mechanisms play an important role in reputation systems.[8]

Attacks on reputation systems are classified by identifying which system components and design choices they target, while defense mechanisms are drawn from existing reputation systems.

Attacker model[edit]

The capability of the attacker is determined by several characteristics, for example the location of the attacker relative to the system (insider attacker versus outsider attacker). Insiders are entities that have legitimate access to the system and can participate according to the system specifications, while an outsider is any unauthorized entity in the system, which may or may not be identifiable.

Because outsider attacks resemble attacks in other computer-system environments, insider attacks receive more attention in reputation systems. Two assumptions are usually made: attackers are motivated by either selfish or malicious intent, and attackers can work either alone or in coalitions.

Attack classification[edit]

The attacks against reputation systems are classified based on the goals of the reputation systems targeted by the attacks.

  • Self-promoting Attack. The attacker falsely increases their own reputation. A typical example is the so-called Sybil attack, in which an attacker subverts the reputation system by creating a large number of pseudonymous entities and using them to gain a disproportionately large influence.[9] A reputation system's vulnerability to a Sybil attack depends on how cheaply Sybils can be generated, the degree to which the reputation system accepts input from entities that do not have a chain of trust linking them to a trusted entity, and whether the reputation system treats all entities identically.
  • Whitewashing Attack. The attacker exploits a system vulnerability to erase or refresh their negative reputation. This attack usually targets the formulation the reputation system uses to calculate the reputation result. A whitewashing attack can be combined with other types of attack to make each one more effective.
  • Slandering Attack. The attacker reports false data to lower the reputation of the victim nodes. It can be carried out by a single attacker or by a coalition of attackers.
  • Orchestrated Attack. The attacker coordinates their efforts and employs several of the above strategies. One famous example of an orchestrated attack is known as an oscillation attack.[10]
  • Denial of Service Attack. The attacker prevents the calculation and dissemination of reputation values in the reputation system by denial-of-service methods.
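The self-promoting case is easy to demonstrate numerically. The toy below (illustrative numbers, not a model of any real marketplace) shows why a plain average over all identities, with no chain of trust, is maximally exposed to Sybils:

```python
# Toy demonstration of a self-promoting Sybil attack against a
# reputation score computed as a plain average over all identities.
def naive_reputation(ratings):
    """Average over all submitted ratings, trusting every identity."""
    return sum(ratings) / len(ratings)

honest_ratings = [1, 2, 1, 2]   # real buyers rate the seller poorly
sybil_ratings = [5] * 20        # cheaply created pseudonymous accounts

before = naive_reputation(honest_ratings)
after = naive_reputation(honest_ratings + sybil_ratings)
print(before, after)  # 1.5 before the attack, about 4.42 after
```

Because each pseudonym counts as much as an established identity, the attack's cost scales only with the cost of creating accounts, which is exactly the vulnerability condition described above.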

Defense strategies[edit]

Here are some strategies to prevent the above attacks:

  • Preventing Multiple Identities
  • Mitigating Generation of False Rumors
  • Mitigating Spreading of False Rumors
  • Preventing Short-Term Abuse of the System
  • Mitigating Denial of Service Attacks
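One common flavor of the rumor-mitigation strategies is to weight each rating by the rater's own standing rather than counting every identity equally. The sketch below is an assumed, simplified scheme (the weights are illustrative; real systems derive them from trust chains or account history):

```python
# Sketch of a defense: weight ratings by rater standing so that
# fresh pseudonymous accounts have little influence on the result.
def weighted_reputation(ratings):
    """ratings: list of (score, rater_weight) pairs."""
    total_weight = sum(w for _, w in ratings)
    if total_weight == 0:
        return 0.0
    return sum(s * w for s, w in ratings) / total_weight

# Four established raters (weight 1.0) against twenty fresh Sybil
# accounts (assumed weight 0.01): the inflated 5s barely move the score.
ratings = [(1, 1.0), (2, 1.0), (1, 1.0), (2, 1.0)] + [(5, 0.01)] * 20
print(weighted_reputation(ratings))  # stays near the honest average of 1.5
```

Compare this with the naive average: the same twenty Sybil ratings that would dominate an unweighted mean contribute only a fraction of the total weight here, so self-promotion becomes expensive in proportion to the effort of building genuine standing.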

See also[edit]

References[edit]

  1. ^ a b c d Josang, Audun (2000). "A survey of trust and reputation systems for online service provision". Science Direct. 45: 45 – via 360 Link. 
  2. ^ Tanz, Jason (May 23, 2014). "How Airbnb and Lyft Finally Got Americans to Trust Each Other". 
  3. ^ Botsman, Rachel (2010). What's Mine is Yours. New York: Harper Business. ISBN 0061963542. 
  4. ^ Ye, Qiang (2013). "In-Depth Analysis of the Seller Reputation and Price Premium Relationship: A Comparison Between eBay US And Taobao China" (PDF). Journal of Electronic Commerce Research. 14 (1). 
  5. ^ Winfree, Jason, A. (2003). "Collective Reputation and Quality" (PDF). American Agricultural Economics Association Meetings. 
  6. ^ https://stackoverflow.com/help/bounty
  7. ^ Jøsang, A.; Golbeck, J. (September 2009). Challenges for Robust Trust and Reputation Systems (PDF). Proceedings of the 5th International Workshop on Security and Trust Management (STM 2009). Saint Malo, France. 
  8. ^ Hoffman, K.; Zage, D.; Nita-Rotaru, C. (2009). "A survey of attack and defense techniques for reputation systems" (PDF). ACM Computing Surveys (CSUR). 
  9. ^ Lazzari, Marco (March 2010). An experiment on the weakness of reputation algorithms used in professional social networks: the case of Naymz. Proceedings of the IADIS International Conference e-Society 2010. Porto, Portugal. 
  10. ^ Srivatsa, M.; Xiong, L.; Liu, L. (2005). TrustGuard: countering vulnerabilities in reputation management for decentralized overlay networks (PDF). Proceedings of the 14th International Conference on World Wide Web. 

External links[edit]