Question:
The LRU replacement policy is based on the assumption that if address A1 is accessed less recently than address A2 in the past, then A2 will be accessed again before A1 in the future. Hence, A2 is given priority over A1. Discuss how this assumption fails to hold when a loop larger than the instruction cache is executed continuously. For example, consider a fully associative 128-byte instruction cache with a 4-byte block (each block holds exactly one instruction). The cache uses an LRU replacement policy.
a. What is the asymptotic instruction miss rate for a 64-byte loop with a large number of iterations?
b. Repeat part (a) for loop sizes 192 bytes and 320 bytes.
c. If the cache replacement policy is changed to most recently used (MRU) (replace the most recently accessed cache line), which of the three above cases (64-, 192-, or 320-byte loops) would benefit from this policy?
d. Suggest additional replacement policies that might outperform LRU.
Step by Step Answer:
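A quick way to check the asymptotic miss rates asked for in parts (a) through (c) is to simulate the cache directly. The Python sketch below models the 128-byte fully associative cache with 4-byte blocks (32 one-instruction blocks) from the question and measures the steady-state miss rate of a sequential loop under both LRU and MRU replacement. The function name miss_rate, the iteration count, and the use of OrderedDict for recency tracking are illustrative choices, not part of the original exercise.

```python
# Simulation of the question's cache: 128 bytes, 4-byte blocks (32 blocks),
# fully associative, sequential instruction loops of 64/192/320 bytes.
# Illustrative sketch only; names and iteration count are assumptions.

from collections import OrderedDict

CACHE_BLOCKS = 128 // 4   # 32 one-instruction blocks

def miss_rate(loop_bytes, policy="LRU", iterations=1000):
    """Run `iterations` passes over a sequential loop and return the miss
    rate of the final pass, i.e. the asymptotic (steady-state) rate."""
    cache = OrderedDict()          # keys = block numbers, ordered oldest -> most recent
    n_blocks = loop_bytes // 4     # one 4-byte instruction per block
    last_pass_misses = 0

    for _ in range(iterations):
        misses = 0
        for pc in range(n_blocks):                 # sequential instruction fetches
            if pc in cache:
                cache.move_to_end(pc)              # hit: update recency
            else:
                misses += 1
                if len(cache) >= CACHE_BLOCKS:     # cache full: evict per policy
                    if policy == "LRU":
                        cache.popitem(last=False)  # evict least recently used
                    else:                          # MRU
                        cache.popitem(last=True)   # evict most recently used
                cache[pc] = True                   # fill; new block becomes MRU
        last_pass_misses = misses

    return last_pass_misses / n_blocks

if __name__ == "__main__":
    for size in (64, 192, 320):
        lru = miss_rate(size, "LRU")
        mru = miss_rate(size, "MRU")
        print(f"{size:3d}-byte loop: LRU miss rate = {lru:5.1%}, MRU miss rate = {mru:5.1%}")
```

Running the sketch should show the 64-byte loop settling at a zero miss rate under either policy (it fits entirely in the cache), while the 192- and 320-byte loops, which do not fit, behave very differently under LRU and MRU; that contrast is what parts (b) and (c) are asking about.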
Source: Computer Architecture: A Quantitative Approach, 5th Edition
Authors: John L. Hennessy, David A. Patterson
ISBN: 978-8178672663