Selfish-LRU: Preemption-Aware Caching for Predictability and Performance
| Authors | |
|---|---|
| Publication date | 2014 |
| Book title | 2014 IEEE 20th Real-Time and Embedded Technology and Applications Symposium (RTAS) |
| ISBN | |
| Event | Real-Time and Embedded Technology and Applications Symposium |
| Pages (from-to) | 135-144 |
| Publisher | Piscataway, NJ: IEEE |
| Organisations | |
| Abstract | We introduce Selfish-LRU, a variant of the LRU (least recently used) cache replacement policy that improves performance and predictability in preemptive scheduling scenarios. In multitasking systems with conventional caches, a single memory access by a preempting task can trigger a chain reaction leading to a large number of additional cache misses in the preempted task. Selfish-LRU prevents such chain reactions by first evicting cache blocks that do not belong to the currently active task. Simulations confirm that Selfish-LRU reduces the CRPD (cache-related preemption delay) as well as the overall number of cache misses. At the same time, it simplifies CRPD analysis and results in smaller CRPD bounds. |
| Document type | Conference contribution |
| Language | English |
| Published at | https://doi.org/10.1109/RTAS.2014.6925997 |
| Permalink to this page | |
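The replacement rule described in the abstract can be sketched in a few lines. The following is a hypothetical simulation of one cache set, not the paper's implementation: blocks are kept in LRU order and tagged with an owning task, and on a miss in a full set the victim is the least recently used block owned by a task other than the active one, falling back to plain LRU when every block already belongs to the active task. The class and method names are illustrative assumptions.

```python
from collections import OrderedDict

class SelfishLRUSet:
    """One set of a set-associative cache under Selfish-LRU (sketch).

    Assumption for illustration: each cached block records the task that
    last touched it; this ownership drives victim selection on a miss.
    """

    def __init__(self, associativity):
        self.associativity = associativity
        # tag -> owning task id, kept in LRU order (oldest first)
        self.blocks = OrderedDict()

    def access(self, tag, task):
        """Return True on a hit, False on a miss; update state either way."""
        if tag in self.blocks:
            self.blocks.move_to_end(tag)   # refresh LRU position
            self.blocks[tag] = task        # active task takes ownership
            return True
        if len(self.blocks) >= self.associativity:
            # Selfish-LRU victim: the oldest block NOT owned by the active
            # task; if all blocks belong to it, fall back to plain LRU.
            victim = next((t for t, owner in self.blocks.items()
                           if owner != task),
                          next(iter(self.blocks)))
            del self.blocks[victim]
        self.blocks[tag] = task
        return False
```

A small scenario shows how this avoids the chain reaction mentioned in the abstract: after a preempting task touches the set, the resumed task's first miss evicts the preemptor's block rather than one of its own, so its remaining working set survives.

```python
s = SelfishLRUSet(4)
for tag in "abcd":
    s.access(tag, task=1)   # task 1 fills the set
s.access("x", task=2)       # preemption by task 2 evicts one task-1 block
s.access("e", task=1)       # on resume, task 1 evicts "x", not its own "b"
```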