On-chip hotspots are an adverse consequence of recent trends in transistor scaling, producing side effects that range from incorrect circuit operation and bit errors to reduced device lifespan. In response to these issues, researchers in the computing industry have increasingly focused on techniques to reduce or eliminate such hotspots. These techniques span a variety of technical areas, including physical cooling systems, circuit and EDA tools, software, and computer architecture. In spite of these combined efforts, however, hotspots remain an unsolved issue.
Existing hotspot mitigation techniques are predominantly static, whereas modern on-chip hotspots are increasingly application-dependent, suggesting a need for runtime hotspot mitigation techniques. We propose a distributed hotspot predictor that forecasts hotspots from runtime conditions and prevents them from ever occurring.
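As a minimal sketch of the idea of predicting hotspots from runtime conditions, the following hypothetical per-core predictor linearly extrapolates recent temperature samples and flags a core before it crosses a thermal threshold. All details here (the sampling model, the 85 °C threshold, the linear-extrapolation heuristic, and the class and constant names) are illustrative assumptions, not the proposed design:

```python
# Hypothetical sketch (not the paper's design): each core exposes one
# temperature sample per interval; the predictor extrapolates the recent
# trend and flags the core before the threshold is actually reached, so
# a scheduler or DVFS controller could throttle it preemptively.

from collections import deque

THRESHOLD_C = 85.0   # assumed hotspot threshold (degrees Celsius)
HISTORY = 4          # samples of history kept per core
LOOKAHEAD = 2        # intervals to extrapolate into the future

class CorePredictor:
    def __init__(self):
        self.samples = deque(maxlen=HISTORY)

    def update(self, temp_c):
        """Record the latest temperature sample for this core."""
        self.samples.append(temp_c)

    def predicted_temp(self):
        """Linearly extrapolate from the average per-interval slope."""
        if not self.samples:
            return 0.0
        if len(self.samples) < 2:
            return self.samples[-1]
        slope = (self.samples[-1] - self.samples[0]) / (len(self.samples) - 1)
        return self.samples[-1] + slope * LOOKAHEAD

    def hotspot_imminent(self):
        """True if the extrapolated temperature crosses the threshold."""
        return self.predicted_temp() >= THRESHOLD_C

# Example: a core heating ~3 degrees per interval is flagged while its
# current temperature (81.0) is still below the threshold.
core = CorePredictor()
for t in [72.0, 75.0, 78.0, 81.0]:
    core.update(t)
print(core.predicted_temp())    # 81 + 3 * 2 = 87.0
print(core.hotspot_imminent())  # True
```

In a distributed version, one such predictor would run per core (or per thermal sensor), with flagged cores reported to a local actuation mechanism rather than a central controller.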