NCSU Institutional Repository >
NC State Theses and Dissertations >
Dissertations >

Title: Extending Data Prefetching to Cope with Context Switch Misses
Authors: Cui, Hanyu
Advisors: Edward Gehringer, Committee Member
Eric Rotenberg, Committee Member
Yan Solihin, Committee Member
Suleyman Sair, Committee Chair
Keywords: computer architecture
context switching
Issue Date: 18-Mar-2009
Degree: PhD
Discipline: Computer Engineering
Abstract: Among the various costs of a context switch, its impact on the performance of L2 caches is the most significant because of the resulting high miss penalty. To mitigate the impact of context switches, several OS approaches have been proposed to reduce the number of context switches. Nevertheless, frequent context switches are inevitable in certain cases and result in severe L2 cache performance degradation. Moreover, traditional prefetching techniques are ineffective in the face of context switches, as their prediction tables are also subject to loss of content during a context switch. To reduce the impact of frequent context switches, we propose restoring a program's locality by prefetching into the L2 cache the data the program was using before it was swapped out. A Global History List (GHL) is used to record a process's L2 read accesses in LRU order. These accesses are saved along with the process's context when the process is swapped out, and are loaded to guide prefetching when it is swapped in. We also propose a feedback mechanism that greatly reduces the memory traffic incurred by our prefetching scheme, and a phase-guided prefetching scheme that complements GHL prefetching. Experiments show significant speedup over baseline architectures, with and without traditional prefetching, in the presence of frequent context switches.
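The GHL mechanism described in the abstract can be sketched as follows. This is a minimal illustrative model, not the dissertation's actual hardware design: the list capacity, the address granularity, and the `issue_prefetch` callback are all assumptions made for the sketch.

```python
from collections import OrderedDict

class GlobalHistoryList:
    """Sketch of the Global History List (GHL) idea: record a process's
    L2 read accesses in LRU order, save the list with the process context
    at swap-out, and replay it as prefetch hints at swap-in.
    The capacity bound and prefetch interface are illustrative assumptions."""

    def __init__(self, capacity=1024):
        self.capacity = capacity       # assumed bound on recorded blocks
        self.accesses = OrderedDict()  # block address -> None, LRU -> MRU order

    def record_l2_read(self, block_addr):
        # A re-accessed block moves to the MRU end; evict the LRU entry if full.
        if block_addr in self.accesses:
            self.accesses.move_to_end(block_addr)
        else:
            if len(self.accesses) >= self.capacity:
                self.accesses.popitem(last=False)
            self.accesses[block_addr] = None

    def snapshot(self):
        # Saved alongside the process context at swap-out (LRU-to-MRU order).
        return list(self.accesses)

    def prefetch_on_swap_in(self, snapshot, issue_prefetch):
        # Replay the saved list so the most recently used blocks are
        # prefetched last, making them least likely to be evicted again
        # before the process resumes.
        for block_addr in snapshot:
            issue_prefetch(block_addr)
```

For example, recording reads to blocks 0x100, 0x140, 0x100, 0x180 yields the snapshot [0x140, 0x100, 0x180]: the repeated access to 0x100 moved it toward the MRU end, so it is prefetched after 0x140 on swap-in.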
Appears in Collections: Dissertations

Files in This Item:

File: etd.pdf
Size: 4.02 MB
Format: Adobe PDF

Items in DSpace are protected by copyright, with all rights reserved, unless otherwise indicated.