Using History: LRU

This lesson discusses the LRU replacement policy and how it came into being.


Unfortunately, any policy as simple as FIFO or Random is likely to have a common problem: it might kick out an important page, one that is about to be referenced again. FIFO kicks out the page that was first brought in; if this happens to be a page with important code or data structures upon it, it gets thrown out anyhow, even though it will soon be paged back in. Thus, FIFO, Random, and similar policies are not likely to approach optimal; something smarter is needed.

Figure: FIFO queue

As we did with scheduling policy, to improve our guess at the future, we once again lean on the past and use history as our guide. For example, if a program has accessed a page in the near past, it is likely to access it again in the near future.

Principle of locality

One type of historical information a page-replacement policy could use is frequency; if a page has been accessed many times, perhaps it should not be replaced as it clearly has some value. A more commonly-used property of a page is its recency of access; the more recently a page has been accessed, perhaps the more likely it will be accessed again.

This family of policies is based on what people refer to as the principle of locality (see “Virtual Memory” by Peter J. Denning, Computing Surveys, Vol. 2, No. 3, September 1970, Denning’s early and famous survey on virtual memory systems), which basically is just an observation about programs and their behavior. What this principle says, quite simply, is that programs tend to access certain code sequences (e.g., in a loop) and data structures (e.g., an array accessed by the loop) quite frequently; we should thus try to use history to figure out which pages are important, and keep those pages in memory when it comes to eviction time.

And thus, a family of simple historically-based algorithms is born. The Least-Frequently-Used (LFU) policy replaces the least-frequently-used page when an eviction must take place. Similarly, the Least-Recently-Used (LRU) policy replaces the least-recently-used page. These algorithms are easy to remember: once you know the name, you know exactly what it does, which is an excellent property for a name.
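As a rough sketch of how LRU does its bookkeeping (our own illustration, not code from the lesson), the cache can keep pages in order of last use and evict from the least-recently-used end; here we assume a Python `OrderedDict` and a made-up class name `LRUCache`:

```python
from collections import OrderedDict

class LRUCache:
    """Minimal LRU bookkeeping: least-recently-used page at the front,
    most-recently-used page at the back of the ordered dict."""

    def __init__(self, capacity):
        self.capacity = capacity
        self.pages = OrderedDict()   # page number -> None (only the order matters)

    def access(self, page):
        """Reference a page; return True on a hit, False on a miss."""
        if page in self.pages:
            self.pages.move_to_end(page)        # now the most recently used
            return True
        if len(self.pages) >= self.capacity:
            self.pages.popitem(last=False)      # evict the least-recently-used page
        self.pages[page] = None                 # bring the new page in
        return False
```

An LFU sketch would differ only in what it records: a per-page access counter instead of an ordering, with the victim being the page whose counter is smallest.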

ASIDE: TYPES OF LOCALITY

There are two types of locality that programs tend to exhibit. The first is known as spatial locality, which states that if a page P is accessed, it is likely that the pages around it (say P − 1 or P + 1) will also be accessed. The second is temporal locality, which states that pages that have been accessed in the near past are likely to be accessed again in the near future. The assumption of the presence of these types of locality plays a large role in the caching hierarchies of hardware systems, which deploy many levels of instruction, data, and address-translation caching to help programs run fast when such locality exists.
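To make the aside concrete, here is a toy loop of our own (not from the lesson) that exhibits both kinds of locality:

```python
# Toy example: summing an array in a loop.
data = list(range(10_000))   # a contiguous array that spans several pages

total = 0
for i in range(len(data)):
    total += data[i]   # the loop's code page and 'total' are touched every
                       # iteration (temporal locality), while data[i] and
                       # data[i + 1] usually sit on the same or an adjacent
                       # page (spatial locality)
```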

Of course, the principle of locality, as it is often called, is no hard-and-fast rule that all programs must obey. Indeed, some programs access memory (or disk) in a rather random fashion and don’t exhibit much or any locality in their access streams. Thus, while locality is a good thing to keep in mind while designing caches of any kind (hardware or software), it does not guarantee success. Rather, it is a heuristic that often proves useful in the design of computer systems.

Example

To better understand LRU, let’s examine how LRU does on our example reference stream. The figure below shows the results.

From the figure, you can see how LRU can use history to do better than stateless policies such as Random or FIFO. In the example, LRU evicts page 2 when it first has to replace a page, because 0 and 1 have been accessed more recently. It then replaces page 0 because 1 and 3 have been accessed more recently. In both cases, LRU’s decision, based on history, turns out to be correct, and the next references are thus hits. Thus, in our example, LRU does as well as possible, matching optimal in its performance. (OK, we cooked the results, but sometimes cooking is necessary to prove a point.)
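If you would like to trace the example yourself, the following sketch replays a reference stream through an LRU-managed cache. It assumes the 3-page cache and the reference stream 0, 1, 2, 0, 1, 3, 0, 3, 1, 2, 1 used in the earlier lessons; the function name `simulate_lru` is our own:

```python
from collections import OrderedDict

def simulate_lru(references, capacity):
    """Replay a reference stream through an LRU-managed cache of `capacity`
    pages, printing hit/miss for each access."""
    cache = OrderedDict()   # least-recently-used page at the front
    for page in references:
        if page in cache:
            cache.move_to_end(page)            # refresh recency on a hit
            print(f"access {page}: hit   cache={list(cache)}")
        else:
            if len(cache) >= capacity:
                cache.popitem(last=False)      # evict least-recently-used page
            cache[page] = None
            print(f"access {page}: miss  cache={list(cache)}")

simulate_lru([0, 1, 2, 0, 1, 3, 0, 3, 1, 2, 1], capacity=3)
```

Under these assumptions the trace matches the description above: the first eviction removes page 2, the second removes page 0, and the stream finishes with six hits out of eleven accesses.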

We should also note that the opposites of these algorithms exist: Most-Frequently-Used (MFU) and Most-Recently-Used (MRU). In most cases (not all!), these policies do not work well, as they ignore the locality most programs exhibit instead of embracing it.
