Wednesday, November 4, 2009

Reading Cache Memory

If your program reads or stores any data at all, that data will, at some point, be read and stored by the processor's cache memory. That is what the cache is there for: to automatically increase the execution speed of your program by temporarily holding copies of the contents of main memory.

You cannot use the cache as "extra memory" for your data; that is not how the cache works, and not what it was designed for. It is designed to bridge the gap between extremely fast processors and slow main memory (RAM). The cache controller copies the areas of main memory that it thinks are about to be needed into the cache, which the processor can then access at high speed. If something isn't in the cache, the processor has to wait while it is retrieved from main memory.
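To get a feel for this, here is a minimal C sketch (assuming a 64-byte cache line and a buffer much larger than the cache). It walks a large array twice: once touching every element, and once touching only one element per cache line. The second loop does one sixteenth of the arithmetic, yet it tends to take nearly as long, because both loops force the cache controller to pull the same lines in from main memory.

#include <stdio.h>
#include <stdlib.h>
#include <time.h>

#define N (16 * 1024 * 1024)   /* 16M ints (64 MiB), far larger than the cache */

/* Multiply every 'step'-th element of the array and return the elapsed time. */
static double time_loop(int *arr, size_t step)
{
    clock_t start = clock();
    for (size_t i = 0; i < N; i += step)
        arr[i] *= 3;
    return (double)(clock() - start) / CLOCKS_PER_SEC;
}

int main(void)
{
    int *arr = calloc(N, sizeof *arr);
    if (!arr)
        return 1;

    /* Touch every element of the array. */
    printf("step 1:  %.3f s\n", time_loop(arr, 1));

    /* Touch one int per (assumed) 64-byte cache line: 1/16 of the work,
     * but the cache controller still has to fetch every line from
     * main memory, so the run time is usually similar. */
    printf("step 16: %.3f s\n", time_loop(arr, 16));

    free(arr);
    return 0;
}

The exact timings will vary from machine to machine, but the point stands: the cost is dominated by waiting for memory, not by the work done on it.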

As for directly controlling the cache, the processor generally provides no instructions for doing so. Instead, you should research how your particular cache controller functions and how it expects memory to be accessed. If you ensure your program follows those same access patterns, the cache controller will serve your program optimally.
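For example (again only a sketch, relying on nothing more than C's row-major array layout), summing a matrix in the order it is laid out in memory uses every byte of each cache line the controller fetches, while summing it column by column skips ahead by a whole row on every access and is likely to miss the cache each time.

#include <stdio.h>

#define ROWS 2048
#define COLS 2048

static double m[ROWS][COLS];   /* C stores this row by row in memory */

int main(void)
{
    double sum = 0.0;

    /* Cache-friendly: consecutive iterations read consecutive addresses,
     * so each cache line fetched from main memory is used in full. */
    for (int r = 0; r < ROWS; r++)
        for (int c = 0; c < COLS; c++)
            sum += m[r][c];

    /* Cache-hostile: consecutive iterations are COLS * sizeof(double)
     * bytes apart, so nearly every access misses and has to wait. */
    for (int c = 0; c < COLS; c++)
        for (int r = 0; r < ROWS; r++)
            sum += m[r][c];

    printf("%f\n", sum);
    return 0;
}

On most machines the column-by-column loops run noticeably slower, even though they perform exactly the same additions.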

It is possible to improve the execution speed of a program by arranging your code to make the best use of the cache, but as I will attempt to explain below, it is generally a waste of time unless you simply want to learn how the cache behaves.
