Talk:Cache (computing)

From Wikipedia, the free encyclopedia
WikiProject Computing / Software / Websites / CompSci / Hardware (Rated B-class, Top-importance)
This article is within the scope of WikiProject Computing, a collaborative effort to improve the coverage of computers, computing, and information technology on Wikipedia. If you would like to participate, please visit the project page, where you can join the discussion and see a list of open tasks.
B  This article has been rated as B-Class on the project's quality scale.
Top  This article has been rated as Top-importance on the project's importance scale.
This article is supported by WikiProject Software (marked as High-importance).
This article is supported by WikiProject Websites (marked as High-importance).
This article is supported by WikiProject Computer science (marked as High-importance).
This article is supported by Computer hardware task force (marked as Top-importance).

Requested move

The following discussion is an archived discussion of the proposal. Please do not modify it. Subsequent comments should be made in a new section on the talk page. No further edits should be made to this section.

Page moved to Cache (computing). Vegaswikian (talk) 19:58, 4 February 2012 (UTC)

– In this particular case, the gerund form would be clearer. In computing a cache itself is useless without a caching-oriented system which actually stops to examine the cache. At the same time, -ing is something of a natural disambiguator. The computing use is not the primary topic for Cache (disambiguation), but perhaps for Caching it is. Pnm (talk) 21:46, 28 January 2012 (UTC)

  • rename to "Cache (computing)" to clarify and disambiguate right here at the article name level and since this is what the article is about. Hmains (talk) 00:57, 29 January 2012 (UTC)
  • Yes, move to Cache (computing), clean up cache and then move Cache (disambiguation) → Cache Josh Parris 14:11, 29 January 2012 (UTC)
  • Oppose, though I could be convinced on the primary topic issue. On the title of this article, though, I don't think the gerund gives the right semantics here. "What is caching? The act of putting something in a cache." Any definition of caching requires first a definition of cache, and that is strong evidence that it's the storage itself, not the act of storing, that is key. Powers T 01:01, 30 January 2012 (UTC)
Caching refers to the strategy, to the feature of a system. To implement caching requires four things: having a cache, checking the cache first when a request comes along, putting entries into the cache, and deciding when to dispose entries. In general, literature uses both terms, but with topics like web caching and database caching the gerund is more common than the noun, as you can see from those articles' sources. – Pnm (talk) 04:05, 30 January 2012 (UTC)
Those articles are about specific applications of a cache. A web cache is not materially different from a database cache; it's the use of the cache that differs. But the article about the wider concept of a cache is properly named after the object of the action, not the action. Powers T 13:05, 30 January 2012 (UTC)
My point is that web caching is materially different from a CPU cache: the latter is an actual component in hardware. The object of caching is the cached content, not the cache. There are several main articles named after actions: collecting, shipping, running, and photography, not collection, shipment, or run; photograph is a short sub-article. – Pnm (talk) 04:04, 31 January 2012 (UTC)
I never claimed an article can't be named with the gerund; I just don't think it's the optimal choice here. Powers T 18:04, 31 January 2012 (UTC)
The above discussion is preserved as an archive of the proposal. Please do not modify it. Subsequent comments should be made in a new section on this talk page. No further edits should be made to this section.

Flow charts

What's the software used to make flow chart diagram of write-back and write-through procedure? I'd very much like to use it for personal design. ChazZeromus (talk) 17:55, 21 June 2012 (UTC) (talk) 00:24, 5 February 2013 (UTC)

Need clarification on backing store

The article keeps referring to a 'backing store', but it never explains what that is (probably a hard disk drive?). It needs to define the term, or link to another article that describes what a backing store is.

Venki Subramanian (talk) 07:19, 2 June 2014 (UTC)

The backing store is whatever the cache is caching. :-) It could be main memory, or a cache at a higher level of the cache hierarchy, for a CPU cache or a TLB. It could be a disk drive, SSD, or remote file server for a file data cache ("disk cache"/"page cache"); it could be a Web server for a Web cache; and so on. Guy Harris (talk) 07:51, 2 June 2014 (UTC)
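The relationship Guy Harris describes can be sketched in a few lines of code: a read first checks the cache, and only falls back to the backing store on a miss. This is a minimal illustrative sketch, not code from the article; the class, names, and the dict standing in for the backing store are all assumptions.

```python
# Minimal sketch: the "backing store" is simply whatever slower storage
# sits behind the cache (main memory, a disk, a web server, ...).
# All names here are illustrative.

class Cache:
    def __init__(self, backing_store, capacity=4):
        self.backing_store = backing_store  # a dict standing in for DRAM/disk/server
        self.capacity = capacity
        self.entries = {}

    def read(self, key):
        if key in self.entries:
            # Cache hit: the backing store is never touched.
            return self.entries[key]
        # Cache miss: fall back to the backing store, then keep a copy.
        value = self.backing_store[key]
        if len(self.entries) >= self.capacity:
            # Naive FIFO-style eviction, just to keep the sketch complete.
            self.entries.pop(next(iter(self.entries)))
        self.entries[key] = value
        return value
```

For a CPU cache the "backing store" lookup would itself be another (larger, slower) cache or main memory, so real systems nest this pattern several levels deep.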

All cache is volatile, right?

At the risk of sounding silly: all cache is volatile, right? I saw no mention of volatility anywhere in the article, and if that's the case, it should be added. — Preceding unsigned comment added by ‎BlueFenixReborn (talkcontribs) 08:59, December 20, 2014 (UTC)

Hello! Obviously, that depends on what kind of device is used as a cache. For example, using DRAM results in a volatile cache, while using an HDD or SSD means the cached data is stored persistently. It's pretty much implicit whenever a particular cache layout is mentioned in the article; thus, I don't think it needs to be clarified further. — Dsimic (talk | contribs) 08:39, 20 December 2014 (UTC)
Your browser for example also stores its caches on disk, which persists between restarts and reboots. Image viewer thumbnail caches are often stored persistently. -- intgr [talk] 08:45, 20 December 2014 (UTC)
Right, thank you both for the clarification. BlueFenixReborn (talk) 06:57, 31 December 2014 (UTC)

'Buffer vs cache': could this be improved?

I think this would be clearer:

Motivation - throughput, latency, granularity

Buffering accesses through a cache benefits both throughput and latency.


Often a larger, distant resource incurs significant access latency (e.g. it can take hundreds of clock cycles for a modern 4 GHz processor to reach DRAM). This is mitigated by reading from the distant resource in large chunks, in the hope that subsequent reads will come from nearby locations. Prediction hardware or prefetching can also guess where future reads will occur and issue requests ahead of time; done correctly, the latency is hidden altogether.
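The amortization argument above can be made concrete with a back-of-the-envelope model. This is an illustrative sketch only; the 64-byte line size and 300-cycle miss latency are assumptions in the spirit of the "hundreds of cycles" figure, not measurements.

```python
# Illustrative cost model: each request to the distant resource pays a
# fixed latency, so fetching a whole cache line amortizes that latency
# over many bytes. Numbers are assumptions, not measurements.

LINE_SIZE = 64       # bytes per cache line (typical, but assumed here)
MISS_LATENCY = 300   # cycles per DRAM request (illustrative)

def cycles_uncached(num_bytes):
    # One full-latency request per byte accessed.
    return num_bytes * MISS_LATENCY

def cycles_cached(num_bytes):
    # One full-latency request per line; the remaining bytes of each
    # line are then served from the cache.
    lines = -(-num_bytes // LINE_SIZE)  # ceiling division
    return lines * MISS_LATENCY

print(cycles_uncached(4096))  # 1228800 cycles
print(cycles_cached(4096))    # 19200 cycles: a 64x reduction
```

Prefetching goes one step further: if the next line's request is issued early enough, even the per-line miss latency overlaps with useful work.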

Throughput & granularity

Beyond this, granularity is important. A cache also allows much higher throughput from the underlying resource, by assembling multiple fine-grained transfers into larger, more efficient requests. In the case of DRAM, this might be served by a wider bus: imagine a program stepping through memory one byte at a time, served by a 128-bit off-chip bus. Individual uncached byte accesses would use only 1/16th of the available bandwidth.
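The 1/16th figure follows directly from the bus width, as this small sketch shows. It is an illustrative model (function and constant names are mine, not the article's): every request occupies one full bus transfer, however few bytes it actually needs.

```python
# Sketch of the granularity argument: a 128-bit bus moves 16 bytes per
# transfer, so uncached single-byte requests waste 15/16 of each
# transfer. Names and numbers are illustrative.

BUS_WIDTH_BYTES = 128 // 8  # 16 bytes move per bus transfer

def bus_transfers(num_bytes, request_size):
    # Each request occupies one full bus transfer, however small it is.
    return -(-num_bytes // request_size)  # ceiling division

uncached = bus_transfers(4096, 1)               # one transfer per byte
cached = bus_transfers(4096, BUS_WIDTH_BYTES)   # one transfer per 16 bytes

print(uncached // cached)  # 16: uncached traffic needs 16x the transfers,
                           # i.e. uses only 1/16 of the peak bandwidth
```

A cache in front of the bus fixes this by fetching a full bus-width (or full cache-line) chunk once and serving the byte-sized program accesses from that chunk.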