Distributed cache

From Wikipedia, the free encyclopedia

This is an old revision of this page, as edited by 4.68.92.68 (talk) at 22:34, 18 September 2015 (Examples).

In computing, a distributed cache is an extension of the traditional concept of a cache used in a single locale. A distributed cache may span multiple servers so that it can grow in size and in transactional capacity. It is mainly used to store application data residing in databases and web session data. The idea of distributed caching[1] has become feasible because main memory is now very cheap and network cards are very fast, with 1 Gbit now standard everywhere and 10 Gbit gaining traction. A distributed cache also works well on the lower-cost machines usually employed for web servers, as opposed to database servers, which require expensive hardware.[2]
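The idea of spanning multiple servers can be sketched in a few lines of Python. The class below is a hypothetical in-process model, not any particular product: each "node" is just a dictionary, and keys are spread across nodes by consistent hashing so that capacity grows as nodes are added. The names (`DistributedCache`, `node-a`, etc.) are illustrative assumptions.

```python
import hashlib
from bisect import bisect

class DistributedCache:
    """Toy sketch of a distributed cache: keys are mapped to one of
    several node stores via consistent hashing on a hash ring, so the
    cache can grow by adding nodes (each 'node' here is just a dict)."""

    def __init__(self, nodes, replicas=100):
        self.stores = {node: {} for node in nodes}
        self.ring = []  # sorted (hash, node) points on the hash ring
        for node in nodes:
            for i in range(replicas):  # virtual nodes smooth the key spread
                self.ring.append((self._hash(f"{node}:{i}"), node))
        self.ring.sort()

    @staticmethod
    def _hash(key):
        return int(hashlib.md5(key.encode()).hexdigest(), 16)

    def _node_for(self, key):
        # Walk clockwise on the ring to the first point at or past the key's hash.
        idx = bisect(self.ring, (self._hash(key),)) % len(self.ring)
        return self.ring[idx][1]

    def set(self, key, value):
        self.stores[self._node_for(key)][key] = value

    def get(self, key, default=None):
        return self.stores[self._node_for(key)].get(key, default)
```

A typical use, caching web session data as the article describes, would be `cache = DistributedCache(["node-a", "node-b", "node-c"])`, then `cache.set("session:42", {"user": "alice"})` and later `cache.get("session:42")`; the same key always routes to the same node, and adding a node remaps only a fraction of the keys.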

References

  1. ^ Paul, S.; Fei, Z. (1 February 2001). "Distributed caching with centralized control". Computer Communications. 24 (2): 256–268. doi:10.1016/S0140-3664(00)00322-4.
  2. ^ Khan, Iqbal (July 2009). "Distributed Caching On The Path To Scalability". MSDN Magazine. Retrieved 30 March 2012.