Abstract:
A cache may store critical cache lines and non-critical cache lines, and may attempt to retain critical cache lines in the cache by, for example, favoring the critical cache lines in replacement data updates, retaining the critical cache lines with a certain probability when victim cache blocks are being selected, etc. Criticality values may be retained at various levels of the cache hierarchy. Additionally, accelerated eviction may be employed if the threads previously accessing the critical cache blocks are viewed as dead.
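The retention behavior described above can be illustrated with a minimal Python sketch of victim selection, assuming an LRU-ordered set of candidate lines; the CacheLine class, thread_alive predicate, and RETAIN_PROBABILITY value are hypothetical illustrations, not details taken from the abstract.

import random

# Hypothetical line record; the criticality value travels with the line.
class CacheLine:
    def __init__(self, tag, critical=False):
        self.tag = tag
        self.critical = critical

RETAIN_PROBABILITY = 0.75  # assumed probability of retaining a critical line

def select_victim(lru_ordered_lines, thread_alive):
    # Walk from least- to most-recently used. A critical line whose thread
    # is still considered alive is skipped with some probability; if its
    # thread is viewed as dead, eviction is accelerated by treating the
    # line as non-critical.
    for line in lru_ordered_lines:
        effectively_critical = line.critical and thread_alive(line)
        if effectively_critical and random.random() < RETAIN_PROBABILITY:
            continue
        return line
    return lru_ordered_lines[0]  # every line was retained; fall back to LRU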
Abstract:
A variable latency cache memory is disclosed. A cache subsystem includes a pipeline control circuit configured to initiate cache memory accesses for data. The cache subsystem further includes a cache memory circuit having a data array arranged into a plurality of groups, wherein different ones of the plurality of groups have different minimum access latencies due to different distances from the pipeline control circuit. The cache subsystem also includes a plurality of latency control circuits configured to ensure that the latency for a given access to the data array is bounded to a maximum value, wherein a given latency control circuit is associated with a corresponding group of the plurality of groups. The latency for a given access may thus vary from the minimum access latency of the group closest to the pipeline control circuit to the maximum latency of an access to the group furthest from the pipeline control circuit.
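As a rough behavioral model of the bounded, group-dependent latency, the following Python sketch assumes groups indexed by distance from the pipeline control circuit and a fixed per-group latency step; all numeric values are illustrative, not taken from the disclosure.

MIN_LATENCY_CYCLES = 2    # assumed latency of the group closest to the pipeline control circuit
LATENCY_STEP_CYCLES = 1   # assumed extra cycles per additional group of distance
NUM_GROUPS = 4            # assumed number of groups in the data array

# The furthest group defines the maximum latency to which accesses are
# assumed to be bounded by the latency control circuits.
MAX_LATENCY_CYCLES = MIN_LATENCY_CYCLES + (NUM_GROUPS - 1) * LATENCY_STEP_CYCLES

def access_latency(group_index):
    # Latency grows with distance from the pipeline control circuit but is
    # clamped to the maximum value enforced per group in hardware.
    latency = MIN_LATENCY_CYCLES + group_index * LATENCY_STEP_CYCLES
    return min(latency, MAX_LATENCY_CYCLES)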
Abstract:
A method is provided to operate a computer to interoperate with a portable media player. The method includes processing signals, provided from the portable media player to the computer, that are indicative of whether an accessory has been connected to the portable media player, in order to determine whether the accessory has been connected. Based on a determination that the accessory has been connected to the portable media player, physiologic data of a user that was provided to the portable media player from a wireless physiologic data gathering device is received from the portable media player into the computer via the accessory.
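The flow of the method might be sketched as follows in Python; the player interface (read_accessory_status, fetch_physiologic_data) and the signal format are hypothetical stand-ins, since the abstract does not specify them.

def interpret_signals(signals):
    # Decide from the player-provided signals whether an accessory is attached.
    return bool(signals.get("accessory_connected", False))

def receive_physiologic_data(player):
    # Signals from the portable media player indicate accessory presence.
    signals = player.read_accessory_status()
    if interpret_signals(signals):
        # Data gathered by the wireless physiologic device and held on the
        # player is transferred into the computer via the accessory path.
        return player.fetch_physiologic_data()
    return None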
Abstract:
A processor includes a mechanism that checks for and flushes only speculative loads and any respective dependent instructions that are younger than an executed wait for event (WEV) instruction, and which also match an address of a store instruction that has been determined to have been executed by a different processor prior to execution of the paired SEV instruction by the different processor. The mechanism may allow speculative loads that do not match the address of any store instruction that has been determined to have been executed by a different processor prior to execution of the paired SEV instruction by the different processor.
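A simplified Python sketch of the selective flush check follows; the record fields (age, address) and the remote_store_addrs set are assumptions for illustration, with a larger age meaning younger (later) in program order.

def loads_to_flush(speculative_loads, wev_age, remote_store_addrs):
    # Only speculative loads younger than the executed WEV instruction whose
    # address matches a store already performed by another processor before
    # its paired SEV are selected; in a real pipeline their dependent
    # instructions would be flushed as well. Non-matching speculative loads
    # are allowed to proceed.
    return [
        load for load in speculative_loads
        if load.age > wev_age and load.address in remote_store_addrs
    ]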
Abstract:
A cache may store critical cache lines and non-critical cache lines, and may attempt to retain critical cache lines in the cache by, for example, favoring the critical cache lines in replacement data updates. Multiple levels of criticality may be available for a given cache line, and cache circuitry may adjust the criticality value of a given cache line in response to a criticality event. One or more upper criticality levels may be masked when selecting a victim cache line for replacement.
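A minimal Python sketch of multi-level criticality with masked upper levels is shown below; the number of levels, the mask width, and the Line type are illustrative assumptions.

from dataclasses import dataclass

@dataclass
class Line:
    tag: int
    criticality: int = 0          # 0 = non-critical, higher = more critical

NUM_CRITICALITY_LEVELS = 4        # assumed number of criticality levels
MASKED_UPPER_LEVELS = 2           # assumed number of protected upper levels

def on_criticality_event(line, delta=1):
    # Adjust the line's criticality value in response to a criticality event,
    # clamped to the available levels.
    line.criticality = max(0, min(NUM_CRITICALITY_LEVELS - 1, line.criticality + delta))

def select_victim(lines):
    # Prefer victims outside the masked upper levels; if every line sits in a
    # masked level, fall back to the least critical line overall.
    mask_threshold = NUM_CRITICALITY_LEVELS - MASKED_UPPER_LEVELS
    candidates = [l for l in lines if l.criticality < mask_threshold]
    pool = candidates if candidates else lines
    return min(pool, key=lambda l: l.criticality)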
Abstract:
A cache may include multiple request handling pipes, each of which may further include multiple request buffers, for storing device requests from one or more processors to one or more devices. Some of the device requests may need to be sent to the devices in a particular order. For a given one of such device requests, the cache may select a request handling pipe, based on an address indicated by the device request, and select a request buffer, based on the available entries of the request buffers of the selected request handling pipe, to store the device request. The cache may further use first-level and second-level token stores to track the device requests and maintain their order when transmitting them to the devices.
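A toy Python sketch of the steering and ordering machinery follows; the pipe count, buffer count, address hash, and the ticket-plus-FIFO scheme are simplifying assumptions standing in for the first-level and second-level token stores described above.

from collections import deque

NUM_PIPES = 2                    # assumed number of request handling pipes
BUFFERS_PER_PIPE = 4             # assumed number of request buffers per pipe

def select_pipe(address):
    # Steer the request to a pipe based on the address it indicates.
    return (address >> 6) % NUM_PIPES

def select_buffer(free_entries):
    # Pick the request buffer in the chosen pipe with the most available entries.
    return max(range(BUFFERS_PER_PIPE), key=lambda b: free_entries[b])

class OrderedIssuer:
    # Each pipe keeps a FIFO of its accepted ordered requests, and a shared
    # ticket counter records the global acceptance order, so requests are
    # transmitted to the devices in the order they were accepted.
    def __init__(self):
        self.next_ticket = 0
        self.head_ticket = 0
        self.pipe_fifos = [deque() for _ in range(NUM_PIPES)]

    def accept(self, request, pipe):
        self.pipe_fifos[pipe].append((self.next_ticket, request))
        self.next_ticket += 1

    def issue_next(self):
        # Send the oldest accepted request, whichever pipe is holding it.
        for fifo in self.pipe_fifos:
            if fifo and fifo[0][0] == self.head_ticket:
                self.head_ticket += 1
                return fifo.popleft()[1]
        return None              # the oldest ordered request is not ready yet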
Abstract:
A system and method for efficiently handling data selected for eviction in a computing system. In various embodiments, a computing system includes one or more processors, a system memory, and a victim cache. The cache controller of a particular cache in a cache memory subsystem includes an allocator for determining whether to allocate data evicted from the particular cache into the victim cache. The data fetched into the particular cache includes data fetched to service miss requests, which include demand requests and prefetch requests. To determine whether to allocate, the allocator determines whether the usefulness of data fetched into the particular cache exceeds a threshold. If so, the evicted data is stored in the victim cache. If not, the evicted data bypasses the victim cache. Data determined to be accessed by a processor is deemed to be of higher usefulness.
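One plausible reading of the allocate-or-bypass decision can be sketched in Python as follows; the usefulness metric (fraction of fetched lines actually accessed) and the threshold value are assumptions, not details from the abstract.

USEFULNESS_THRESHOLD = 0.5   # illustrative threshold

def usefulness(accessed_fetches, total_fetches):
    # Stand-in metric: fraction of lines fetched into the particular cache
    # (to service demand or prefetch miss requests) that the processor
    # actually accessed.
    return accessed_fetches / max(1, total_fetches)

def handle_eviction(evicted_line, victim_cache, accessed_fetches, total_fetches):
    if usefulness(accessed_fetches, total_fetches) > USEFULNESS_THRESHOLD:
        victim_cache.allocate(evicted_line)   # keep the evicted data
    # otherwise the evicted data bypasses the victim cache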