Extending Data Prefetching to Cope with Context Switch Misses
Date
2009-03-18
Abstract
Among the various costs of a context switch, its impact on L2 cache performance is the most significant because of the resulting high miss penalty. Several OS-level approaches have been proposed to reduce the number of context switches, but frequent context switches are inevitable in certain cases and cause severe L2 cache performance degradation. Moreover, traditional prefetching techniques are ineffective across context switches, as their prediction tables are also subject to loss of content when a process is swapped out.
To reduce the impact of frequent context switches, we propose restoring a program's locality by prefetching into the L2 cache the data the program was using before it was swapped out. A Global History List (GHL) records a process's L2 read accesses in LRU order. These accesses are saved along with the process's context when the process is swapped out and are loaded to guide prefetching when it is swapped in. We also propose a feedback mechanism that greatly reduces the memory traffic incurred by our prefetching scheme, and a phase-guided prefetching scheme that complements GHL prefetching. Experiments show significant speedups over baseline architectures, both with and without traditional prefetching, in the presence of frequent context switches.
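The GHL mechanism described above can be sketched in a few lines: an LRU-ordered list of L2 read accesses is maintained per process, snapshotted at swap-out, and replayed as prefetches at swap-in. This is an illustrative sketch only; the class name, capacity, and `prefetch_on_swap_in` helper are assumptions, not the thesis's exact hardware parameters.

```python
from collections import OrderedDict

class GlobalHistoryList:
    """Sketch of a Global History List: records a process's L2 read
    accesses in LRU order, bounded to a fixed capacity. The capacity
    value here is illustrative."""

    def __init__(self, capacity=8):
        self.capacity = capacity
        self.accesses = OrderedDict()  # block address -> None, LRU first

    def record_read(self, block_addr):
        # A re-accessed block moves to the MRU end; the LRU entry is
        # evicted once the list exceeds its capacity.
        if block_addr in self.accesses:
            self.accesses.move_to_end(block_addr)
        else:
            self.accesses[block_addr] = None
            if len(self.accesses) > self.capacity:
                self.accesses.popitem(last=False)

    def snapshot(self):
        # Saved with the process context at swap-out, MRU first so the
        # most recently used blocks are prefetched earliest at swap-in.
        return list(reversed(self.accesses))

def prefetch_on_swap_in(saved_list, l2_cache, fetch):
    """Restore locality at swap-in: prefetch the saved blocks into the
    L2 cache (modeled here as a dict) unless they are already present."""
    for addr in saved_list:
        if addr not in l2_cache:
            l2_cache[addr] = fetch(addr)
```

For example, after the access stream `1, 2, 3, 1, 4` with capacity 3, the snapshot is `[4, 1, 3]` (MRU first), and replaying it warms the cache with exactly those blocks. The feedback mechanism mentioned above would additionally throttle `prefetch_on_swap_in` when prefetches prove useless, which this sketch omits.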
Keywords
computer architecture, prefetching, context switching
Degree
PhD
Discipline
Computer Engineering