Extending Data Prefetching to Cope with Context Switch Misses


Title: Extending Data Prefetching to Cope with Context Switch Misses
Author: Cui, Hanyu
Advisors: Edward Gehringer, Committee Member
Eric Rotenberg, Committee Member
Yan Solihin, Committee Member
Suleyman Sair, Committee Chair
Abstract: Among the various costs of a context switch, its impact on the performance of L2 caches is the most significant because of the resulting high miss penalty. To mitigate this impact, several OS approaches have been proposed to reduce the number of context switches. Nevertheless, frequent context switches are inevitable in certain cases and cause severe L2 cache performance degradation. Moreover, traditional prefetching techniques are ineffective in the face of context switches, as their prediction tables are also subject to loss of content during a context switch. To reduce the impact of frequent context switches, we propose restoring a program's locality by prefetching into the L2 cache the data the program was using before it was swapped out. A Global History List (GHL) is used to record a process's L2 read accesses in LRU order. These accesses are saved along with the process's context when the process is swapped out and loaded to guide prefetching when it is swapped in. We also propose a feedback mechanism that greatly reduces the memory traffic incurred by our prefetching scheme, and a phase-guided prefetching scheme that complements GHL prefetching. Experiments show significant speedup over baseline architectures, with and without traditional prefetching, in the presence of frequent context switches.
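The abstract's Global History List can be illustrated with a minimal sketch. All names here (`GlobalHistoryList`, `record_read`, `snapshot`, the `capacity` parameter and addresses) are hypothetical, chosen for illustration; the dissertation itself defines the actual hardware mechanism. The idea shown: track a process's L2 read accesses in LRU order with bounded capacity, and on swap-out snapshot them MRU-first so the most recently used lines are prefetched first at swap-in.

```python
from collections import OrderedDict

class GlobalHistoryList:
    """Illustrative sketch of a GHL: a bounded, LRU-ordered record of a
    process's L2 read accesses (all names are hypothetical)."""

    def __init__(self, capacity=3):
        self.capacity = capacity
        self.lines = OrderedDict()  # keys = cache line addresses; last key = MRU

    def record_read(self, line_addr):
        # Move the accessed line to the MRU position; evict the LRU entry
        # when the list is full.
        if line_addr in self.lines:
            self.lines.move_to_end(line_addr)
        else:
            if len(self.lines) >= self.capacity:
                self.lines.popitem(last=False)  # drop the LRU line
            self.lines[line_addr] = True

    def snapshot(self):
        # Saved along with the context on swap-out; returned MRU-first so
        # prefetching at swap-in restores the hottest lines first.
        return list(reversed(self.lines))

ghl = GlobalHistoryList(capacity=3)
for addr in [0x100, 0x140, 0x180, 0x100, 0x1C0]:
    ghl.record_read(addr)
print(ghl.snapshot())  # MRU-first: [448, 256, 384] (0x1C0, 0x100, 0x180)
```

With capacity 3, the re-access of `0x100` moves it to the MRU position, so the later access to `0x1C0` evicts `0x140` (the LRU line) rather than `0x100`.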
Date: 2009-03-18
Degree: PhD
Discipline: Computer Engineering
URI: http://www.lib.ncsu.edu/resolver/1840.16/3328

Files in this item

File: etd.pdf (3.925 MB, PDF)
