Storage : Disk Cache

Accessing data stored permanently in backing store is much slower than accessing data in RAM. A common technique used to speed up access to data on backing storage devices is caching.

A cache is a part of a computer's RAM that is set aside to store copies of frequently used sectors from a backing storage device such as a hard disk drive. As files are accessed from the hard disk, the computer determines which sectors of the disk are being used most frequently. Copies of the data located in these sectors are transferred into the cache.

From then on, whenever the data in these sectors is read or changed, the computer uses the copies of the sectors in the cache rather than the original sectors on the disk. Since the cache is located in RAM, these accesses are many times quicker, so the operation of the computer system is sped up.
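The read path described above can be sketched in a few lines of Python. This is a minimal illustration, not how any real operating system implements it; the `read_sector_from_disk` function is a hypothetical stand-in for the slow access to the drive.

```python
def read_sector_from_disk(sector_number):
    # Hypothetical stand-in for a slow read from the hard disk drive.
    return f"data-{sector_number}"

class DiskCache:
    def __init__(self):
        self.cache = {}    # sector number -> copy of that sector's data in RAM
        self.hits = 0
        self.misses = 0

    def read_sector(self, sector_number):
        if sector_number in self.cache:
            # Fast path: a copy of this sector is already in RAM.
            self.hits += 1
            return self.cache[sector_number]
        # Slow path: read from the disk and keep a copy for next time.
        self.misses += 1
        data = read_sector_from_disk(sector_number)
        self.cache[sector_number] = data
        return data

cache = DiskCache()
first = cache.read_sector(7)    # miss: fetched from the disk, copied into RAM
second = cache.read_sector(7)   # hit: served from the cache in RAM
```

The second read of sector 7 never touches the disk, which is where the speed-up comes from.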

The contents of the cache are lost when the computer is turned off, because the cache is located in RAM, which is volatile. To ensure that any changes made to sectors in the cache are saved permanently, the computer system copies these sectors back to the hard disk drive at a time when the drive is not being used, or when the computer is properly shut down.
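The write-back behaviour can be sketched in the same style. Again this is a simplified illustration rather than a real driver: a plain dictionary stands in for the drive's sectors, and `flush` plays the role of the copy-back that happens when the drive is idle or the computer shuts down.

```python
class WriteBackCache:
    def __init__(self, disk):
        self.disk = disk      # dict standing in for the sectors on the drive
        self.cache = {}       # sector number -> copy of the data in RAM
        self.dirty = set()    # sectors changed in RAM but not yet saved to disk

    def write_sector(self, sector_number, data):
        # Changes go to the fast copy in RAM; the disk copy is now out of date.
        self.cache[sector_number] = data
        self.dirty.add(sector_number)

    def flush(self):
        # Copy changed sectors back to the drive so the changes are permanent.
        for sector_number in self.dirty:
            self.disk[sector_number] = self.cache[sector_number]
        self.dirty.clear()

disk = {1: "old"}
cache = WriteBackCache(disk)
cache.write_sector(1, "new")
# At this point disk[1] is still "old"; the change exists only in RAM.
cache.flush()
# After the flush, disk[1] holds "new" permanently.
```

If the power were cut before `flush` ran, the change in RAM would be lost, which is exactly why the system flushes during idle time and at shutdown.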

GCSE ICT Companion 04 - (C) P Meakin 2004