Abstract:
A system is described that includes a microprocessor and a thermal control subsystem. The microprocessor includes execution resources to support processing of instructions and consumes power. The microprocessor also includes at least one throttling mechanism to reduce the amount of heat generated by the microprocessor. The thermal control subsystem is configured to estimate the amount of power currently used by the microprocessor and to control the throttling mechanism based on that estimate, so that the junction temperature will not exceed a maximum allowed temperature.
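The following is a minimal C sketch of the kind of control loop this abstract describes: estimate power from an activity measure, convert it to a junction-temperature estimate, and decide whether to engage throttling. The power model, thermal constants, and every name here are illustrative assumptions, not details taken from the abstract.

/* Hypothetical sketch of a power-estimating thermal control loop. */
#include <stdbool.h>
#include <stdio.h>

#define T_AMBIENT_C       45.0   /* assumed ambient temperature */
#define THETA_JA_C_PER_W   0.40  /* assumed junction-to-ambient thermal resistance */
#define T_J_MAX_C        100.0   /* assumed maximum allowed junction temperature */

/* Estimate power from an activity count reported by the core (assumed model). */
static double estimate_power_watts(unsigned activity_events)
{
    const double idle_watts = 10.0;          /* assumed leakage/idle power */
    const double watts_per_event = 0.0005;   /* assumed energy cost per event */
    return idle_watts + watts_per_event * activity_events;
}

/* Decide whether the throttling mechanism should be engaged. */
static bool should_throttle(double estimated_watts)
{
    double t_junction = T_AMBIENT_C + estimated_watts * THETA_JA_C_PER_W;
    return t_junction >= T_J_MAX_C;
}

int main(void)
{
    unsigned samples[] = { 50000, 200000, 400000 };
    for (unsigned i = 0; i < 3; i++) {
        double watts = estimate_power_watts(samples[i]);
        printf("activity=%u power=%.1fW throttle=%s\n",
               samples[i], watts, should_throttle(watts) ? "yes" : "no");
    }
    return 0;
}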
Abstract:
A power-aware front-end unit for a processor may include a UOP cache that disables other circuitry within the front-end unit. In an embodiment, a front-end unit may disable instruction synchronization circuitry, instruction decode circuitry and, optionally, instruction fetch circuitry while instruction look-ups are underway in both a block cache and an instruction cache. If the instruction look-up indicates a miss, the disabled circuitry thereafter may be enabled.
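Below is a rough C sketch of the enable/disable sequencing this abstract describes: downstream circuitry stays off while the look-ups proceed, and is re-enabled only on a miss. The structure, the lookup model, and all names are illustrative assumptions, not the patented implementation.

/* Hypothetical sketch of power-aware front-end gating around a UOP/block cache. */
#include <stdbool.h>
#include <stdio.h>

struct front_end {
    bool fetch_enabled;
    bool sync_enabled;
    bool decode_enabled;
};

/* Assumed lookup: a hit means the conventional decode path is not needed. */
static bool uop_cache_lookup(unsigned ip) { return (ip & 1u) == 0; }

static void handle_fetch_request(struct front_end *fe, unsigned ip)
{
    /* Disable downstream circuitry while the parallel look-ups are underway. */
    fe->fetch_enabled = fe->sync_enabled = fe->decode_enabled = false;

    if (!uop_cache_lookup(ip)) {
        /* Miss: re-enable the legacy path so the instruction cache line
         * can be fetched, aligned and decoded. */
        fe->fetch_enabled = fe->sync_enabled = fe->decode_enabled = true;
    }
}

int main(void)
{
    struct front_end fe = { true, true, true };
    handle_fetch_request(&fe, 0x1000);  /* even address: modeled as a hit */
    printf("after hit:  decode_enabled=%d\n", fe.decode_enabled);
    handle_fetch_request(&fe, 0x1001);  /* odd address: modeled as a miss */
    printf("after miss: decode_enabled=%d\n", fe.decode_enabled);
    return 0;
}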
Abstract:
Distribution of processing activity across processing hardware based on power consumption and/or thermal considerations. One embodiment includes a plurality of processing units and a monitor to obtain monitored values (e.g., power consumption, temperature, or some combination thereof) from the processing units. The monitor transfers a process from one processing unit to another in response to the monitored values from the processing units.
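The C sketch below illustrates the migration decision this abstract describes: read a per-unit monitored value and move work away from a unit over threshold to the unit with the lowest value. The threshold, the sampled values, and all names are assumptions made for illustration.

/* Hypothetical sketch of monitor-driven process migration across units. */
#include <stdio.h>

#define NUM_UNITS 4
#define TEMP_LIMIT_C 85.0

/* Assumed readback of a per-unit monitored value (temperature here). */
static double read_monitor_value(int unit)
{
    static const double samples[NUM_UNITS] = { 91.0, 62.0, 70.0, 58.0 };
    return samples[unit];
}

/* Pick a destination unit: the one with the lowest monitored value. */
static int coolest_unit(void)
{
    int best = 0;
    for (int u = 1; u < NUM_UNITS; u++)
        if (read_monitor_value(u) < read_monitor_value(best))
            best = u;
    return best;
}

int main(void)
{
    for (int u = 0; u < NUM_UNITS; u++) {
        double v = read_monitor_value(u);
        if (v > TEMP_LIMIT_C)
            printf("unit %d at %.1fC: migrate its process to unit %d\n",
                   u, v, coolest_unit());
    }
    return 0;
}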
Abstract:
A system and corresponding method use a PAUSE instruction as a low-power hint in a single-threaded or multithreaded environment using a "processor slow mode." One embodiment actually lowers the frequency of the processor clock. Another embodiment virtually lowers the frequency of the processor clock by gating M clock cycles out of every N clock cycles. When all threads have issued a PAUSE instruction, the processor enters slow mode and remains there for a period of time, after which it returns to normal mode. Alternatively, an event, such as an interrupt or an exception, can cause the processor to return to normal mode from slow mode.
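Here is a small C model of the "virtual" frequency reduction this abstract describes: once all threads have issued PAUSE, only N - M of every N cycles actually clock the core. The values of M and N, the thread bookkeeping, and the percentage arithmetic are illustrative assumptions, not figures from the abstract.

/* Hypothetical model of M-of-N clock gating in processor slow mode. */
#include <stdbool.h>
#include <stdio.h>

#define NUM_THREADS 2
#define GATED_M 3     /* cycles gated out of every window (assumed) */
#define WINDOW_N 4    /* window length in cycles (assumed) */

static bool thread_paused[NUM_THREADS];

static bool all_threads_paused(void)
{
    for (int t = 0; t < NUM_THREADS; t++)
        if (!thread_paused[t])
            return false;
    return true;
}

/* In slow mode, only (N - M) of every N cycles actually clock the core. */
static bool cycle_is_clocked(unsigned long cycle, bool slow_mode)
{
    if (!slow_mode)
        return true;
    return (cycle % WINDOW_N) < (WINDOW_N - GATED_M);
}

int main(void)
{
    thread_paused[0] = thread_paused[1] = true;   /* both threads issued PAUSE */
    bool slow = all_threads_paused();

    unsigned long active = 0;
    for (unsigned long c = 0; c < 1000; c++)
        if (cycle_is_clocked(c, slow))
            active++;
    printf("slow mode: %lu of 1000 cycles clocked (~%lu%% effective frequency)\n",
           active, active / 10);
    return 0;
}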