METHOD AND APPARATUS FOR MANAGING A CACHE DIRECTORY

    Publication Number: US20220206946A1

    Publication Date: 2022-06-30

    Application Number: US17135657

    Application Date: 2020-12-28

    Abstract: Method and apparatus monitor eviction conflicts among cache directory entries in a cache directory and produce cache directory victim entry information for a memory manager. In some examples, the memory manager reduces future cache directory conflicts by changing a page level physical address assignment for a page of memory based on the produced cache directory victim entry information. In some examples, a scalable data fabric includes hardware control logic that performs the monitoring of the eviction conflicts among cache directory entries in the cache directory and produces the cache directory victim entry information.
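
    The sketch below is a minimal illustration under assumed parameters (the directory geometry kSets/kWays and the remap policy are invented for the example, not the claimed implementation): a small set-associative directory counts eviction conflicts per set and reports victim entry information, which a toy memory manager uses to reassign a conflicting page to a different physical page and therefore a different directory set.

    // Minimal sketch (not the patented implementation): a set-associative cache
    // directory that counts eviction conflicts per set and reports "victim entry"
    // information a memory manager could use to remap hot pages. kSets, kWays,
    // and remap_page() are illustrative assumptions.
    #include <cstdint>
    #include <cstdio>
    #include <unordered_map>
    #include <vector>

    constexpr int kSets = 4;   // assumed directory geometry
    constexpr int kWays = 2;

    struct VictimInfo { uint64_t page; int set; uint64_t conflicts; };

    class CacheDirectory {
    public:
        // Track a page; if its set is full, evict one entry and record a conflict.
        bool track(uint64_t page, VictimInfo* victim) {
            int set = static_cast<int>(page % kSets);
            auto& ways = sets_[set];
            for (uint64_t p : ways) if (p == page) return false;   // already present
            if ((int)ways.size() < kWays) { ways.push_back(page); return false; }
            // Conflict: evict the oldest entry and report it as the victim.
            ++conflicts_[set];
            *victim = {ways.front(), set, conflicts_[set]};
            ways.erase(ways.begin());
            ways.push_back(page);
            return true;
        }
    private:
        std::vector<uint64_t> sets_[kSets];
        uint64_t conflicts_[kSets] = {};
    };

    // A toy "memory manager": on a reported conflict, reassign the page to a new
    // physical page whose directory set differs, reducing future conflicts.
    class MemoryManager {
    public:
        uint64_t physical_page(uint64_t virt_page) {
            auto it = remap_.find(virt_page);
            return it == remap_.end() ? virt_page : it->second;
        }
        void remap_page(const VictimInfo& v) {
            remap_[v.page] = v.page + 1;  // naive policy: shift to a neighboring set
            std::printf("remapped page %llu away from hot set %d (conflicts=%llu)\n",
                        (unsigned long long)v.page, v.set,
                        (unsigned long long)v.conflicts);
        }
    private:
        std::unordered_map<uint64_t, uint64_t> remap_;
    };

    int main() {
        CacheDirectory dir;
        MemoryManager mm;
        // Pages 0, 4, 8 all map to set 0 and keep conflicting with each other.
        uint64_t trace[] = {0, 4, 8, 0, 4, 8};
        for (uint64_t vp : trace) {
            VictimInfo v;
            if (dir.track(mm.physical_page(vp), &v)) mm.remap_page(v);
        }
    }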

    METHOD AND APPARATUS FOR A DRAM CACHE TAG PREFETCHER

    Publication Number: US20220318151A1

    Publication Date: 2022-10-06

    Application Number: US17219782

    Application Date: 2021-03-31

    Abstract: Devices and methods for cache prefetching are provided. A device is provided which comprises memory and a processor. The memory comprises a DRAM cache, a cache dedicated to the processor and one or more intermediate caches between the dedicated cache and the DRAM cache. The processor is configured to issue prefetch requests to prefetch data, issue data access requests to fetch the data, and, when one or more previously issued prefetch requests are determined to be inaccurate, issue a prefetch request to prefetch a tag corresponding to the memory address of the requested data in the DRAM cache. A tag look-up is performed at the DRAM cache without performing tag look-ups at the dedicated cache or the intermediate caches. The tag is prefetched from the DRAM cache without prefetching the requested data.
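
    As a rough illustration only (the accuracy threshold, the DramCache interface, and the accuracy bookkeeping below are assumptions, not the claimed design), the sketch shows a prefetcher that tracks how often its prefetches turn out to be useful and, once recent prefetches look inaccurate, issues a tag-only prefetch directly to a simulated DRAM cache instead of prefetching data.

    // Minimal sketch: a prefetcher that tracks its own accuracy and, when recent
    // prefetches look inaccurate, issues a tag-only prefetch straight to a
    // simulated DRAM cache instead of prefetching data. The 0.5 threshold and
    // the tag derivation are illustrative assumptions.
    #include <cstdint>
    #include <cstdio>
    #include <unordered_map>
    #include <unordered_set>

    struct DramCache {
        std::unordered_map<uint64_t, uint64_t> tags;   // line address -> tag
        // Tag lookup/prefetch handled only at the DRAM cache; no L1/L2 tag checks here.
        void prefetch_tag(uint64_t line_addr) {
            tags[line_addr] = line_addr >> 6;          // assumed tag derivation
            std::printf("prefetched tag only for line 0x%llx\n",
                        (unsigned long long)line_addr);
        }
        void prefetch_data(uint64_t line_addr) {
            std::printf("prefetched data for line 0x%llx\n",
                        (unsigned long long)line_addr);
        }
    };

    class TagPrefetcher {
    public:
        explicit TagPrefetcher(DramCache& dc) : dram_(dc) {}

        // Issue a prefetch for a predicted line; fall back to tag-only prefetch
        // when past prefetches have been judged inaccurate.
        void issue_prefetch(uint64_t predicted_line) {
            bool inaccurate = accuracy() < 0.5;        // assumed threshold
            ++issued_;
            outstanding_.insert(predicted_line);
            if (inaccurate) dram_.prefetch_tag(predicted_line);   // tag only, no data
            else            dram_.prefetch_data(predicted_line);
        }

        // Demand access: mark a prior prefetch useful if it matches.
        void on_demand_access(uint64_t line) {
            if (outstanding_.erase(line)) ++useful_;
        }

    private:
        double accuracy() const { return issued_ ? double(useful_) / issued_ : 1.0; }
        DramCache& dram_;
        std::unordered_set<uint64_t> outstanding_;
        uint64_t issued_ = 0, useful_ = 0;
    };

    int main() {
        DramCache dc;
        TagPrefetcher pf(dc);
        pf.issue_prefetch(0x1000);     // no history yet -> data prefetch
        pf.on_demand_access(0x2000);   // demand miss: the prefetch was not useful
        pf.on_demand_access(0x3000);   // another unrelated access
        pf.issue_prefetch(0x4000);     // accuracy now low -> tag-only prefetch to DRAM cache
    }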

    TAG AND DATA CONFIGURATION FOR FINE-GRAINED CACHE MEMORY

    Publication Number: US20240111425A1

    Publication Date: 2024-04-04

    Application Number: US17956614

    Application Date: 2022-09-29

    CPC classification number: G06F3/0613 G06F3/0659 G06F3/0679

    Abstract: A method is provided for operating a memory having a plurality of banks accessible in parallel, each bank including a plurality of grains accessible in parallel. The method includes: based on a memory access request that specifies a memory address, identifying a set that stores data for the memory access request, wherein the set is spread across multiple grains of the plurality of grains; and performing operations to satisfy the memory access request using entries of the set stored across the multiple grains of the plurality of grains.
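
    The following sketch is a simplified model under assumed parameters (kBanks, kGrainsPerBank, kWays, and the address decoding are illustrative, not the patented layout): a set's ways are spread across several grains of a bank, and a lookup probes one entry per grain, which hardware could perform in parallel.

    // Minimal sketch: a memory organized as banks of grains, where a set's ways
    // are spread across several grains so they can be read in parallel. The
    // geometry constants and decode() mapping are assumptions for the example.
    #include <cstdint>
    #include <cstdio>
    #include <optional>
    #include <vector>

    constexpr int kBanks = 2;
    constexpr int kGrainsPerBank = 4;
    constexpr int kSetsPerGrain = 8;
    constexpr int kWays = 4;   // one way of the set lives in each of 4 grains

    struct Entry { bool valid = false; uint64_t tag = 0; uint64_t data = 0; };

    class FineGrainedMemory {
    public:
        FineGrainedMemory()
            : grains_(kBanks, std::vector<std::vector<Entry>>(
                  kGrainsPerBank, std::vector<Entry>(kSetsPerGrain))) {}

        // Decompose an address into bank, set index, and tag (assumed mapping).
        static void decode(uint64_t addr, int& bank, int& set, uint64_t& tag) {
            bank = static_cast<int>(addr % kBanks);
            set  = static_cast<int>((addr / kBanks) % kSetsPerGrain);
            tag  = addr / (kBanks * kSetsPerGrain);
        }

        // Satisfy a read by probing one entry of the set per grain; hardware
        // could probe the grains in parallel, here we simply loop over them.
        std::optional<uint64_t> read(uint64_t addr) const {
            int bank, set; uint64_t tag;
            decode(addr, bank, set, tag);
            for (int g = 0; g < kWays; ++g) {
                const Entry& e = grains_[bank][g][set];
                if (e.valid && e.tag == tag) return e.data;
            }
            return std::nullopt;   // miss
        }

        // Write allocates into the first free grain of the set (simple policy).
        void write(uint64_t addr, uint64_t value) {
            int bank, set; uint64_t tag;
            decode(addr, bank, set, tag);
            for (int g = 0; g < kWays; ++g) {
                Entry& e = grains_[bank][g][set];
                if (!e.valid || e.tag == tag) { e = Entry{true, tag, value}; return; }
            }
            grains_[bank][0][set] = Entry{true, tag, value};   // evict way 0 on conflict
        }

    private:
        // grains_[bank][grain][set]
        std::vector<std::vector<std::vector<Entry>>> grains_;
    };

    int main() {
        FineGrainedMemory mem;
        mem.write(0x40, 123);
        if (auto v = mem.read(0x40))
            std::printf("hit: %llu\n", (unsigned long long)*v);
    }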

    Using Error Correction Code (ECC) Bits for Retaining Victim Cache Lines in a Cache Block in a Cache Memory

    Publication Number: US20230022320A1

    Publication Date: 2023-01-26

    Application Number: US17384420

    Application Date: 2021-07-23

    Abstract: An electronic device includes a cache memory and a controller. The cache memory includes a set of cache blocks, each cache block having a number of locations usable for storing cache lines. The cache memory also includes a separate set of error correction code (ECC) bits for each of the locations. The controller stores a victim cache line, evicted from a first location in a cache block, in a second location in that cache block. The controller next stores victim reference information in a portion of the ECC bits for the first location, the victim reference information indicating that the victim cache line is stored in the second location.
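
    A minimal sketch of the idea, with an assumed block size and a hypothetical victim-pointer encoding (this is not the patented ECC layout): when a line is evicted from one location of a block, it is parked in a free location of the same block, and spare ECC bits of the original location record where the victim now lives, so a later lookup can still find it.

    // Minimal sketch: a cache block whose locations each carry spare ECC bits;
    // when a line is evicted from a location it is retained in another location
    // of the same block, and a "victim pointer" to that location is kept in the
    // spare ECC bits of the first location. Encoding and policies are assumed.
    #include <cstdint>
    #include <cstdio>
    #include <optional>

    constexpr int kLocations = 4;   // assumed locations per cache block

    struct Location {
        bool valid = false;
        uint64_t tag = 0;
        uint64_t data = 0;
        uint16_t ecc = 0;           // stands in for ECC bits plus spare bits
    };

    // Assumed encoding: the spare ECC field holds (victim_location + 1),
    // with zero meaning "no victim recorded".
    inline void set_victim_ptr(Location& loc, int victim_loc) { loc.ecc = uint16_t(victim_loc + 1); }
    inline int  get_victim_ptr(const Location& loc)           { return int(loc.ecc) - 1; }

    class CacheBlock {
    public:
        // Install a new line at `first`, retaining its current occupant in a free
        // location and recording that fact in the ECC bits of `first`.
        void install(int first, uint64_t tag, uint64_t data) {
            if (locs_[first].valid) {
                int second = find_free();
                if (second >= 0) {
                    locs_[second] = locs_[first];        // retain the victim line
                    set_victim_ptr(locs_[first], second);
                }
            }
            locs_[first].valid = true;
            locs_[first].tag = tag;
            locs_[first].data = data;
        }

        // Lookup that also checks whether the tag survives as a retained victim.
        std::optional<uint64_t> lookup(int first, uint64_t tag) const {
            if (locs_[first].valid && locs_[first].tag == tag) return locs_[first].data;
            int v = get_victim_ptr(locs_[first]);
            if (v >= 0 && locs_[v].valid && locs_[v].tag == tag) return locs_[v].data;
            return std::nullopt;
        }

    private:
        int find_free() const {
            for (int i = 0; i < kLocations; ++i) if (!locs_[i].valid) return i;
            return -1;
        }
        Location locs_[kLocations];
    };

    int main() {
        CacheBlock block;
        block.install(0, /*tag=*/7, /*data=*/111);
        block.install(0, /*tag=*/9, /*data=*/222);   // evicts tag 7 but retains it
        if (auto v = block.lookup(0, 7))
            std::printf("victim line still reachable, data=%llu\n", (unsigned long long)*v);
    }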

    METHOD AND APPARATUS FOR MONITORING MEMORY ACCESS TRAFFIC

    Publication Number: US20220100668A1

    Publication Date: 2022-03-31

    Application Number: US17094989

    Application Date: 2020-11-11

    Abstract: Methods and apparatus provide monitoring of memory access traffic in a data processing system by tracking, such as by data fabric hardware control logic, a number of cache line accesses to a page of memory associated with one or more memory devices, and producing spike indication data that indicates a spike in cache line accesses to a given page of memory. Pages are moved from a slower memory to a faster memory based on the spike indication data. In some implementations, the tracking is done by updating a cache directory with data representing the tracked number of cache line accesses.
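
    A minimal sketch under assumed parameters (the page size, spike threshold, and tier names are illustrative, not the claimed mechanism): a monitor counts cache-line accesses per page, raises a spike indication when a page's count crosses the threshold within a sampling window, and a tiered-memory helper migrates that page to faster memory.

    // Minimal sketch: count cache-line accesses per page, produce a spike
    // indication when a page's count crosses a threshold within a window, and
    // migrate that page from the slow tier to the fast tier. Threshold and
    // page size are assumptions for the example.
    #include <cstdint>
    #include <cstdio>
    #include <unordered_map>
    #include <unordered_set>

    constexpr uint64_t kPageShift = 12;       // 4 KiB pages (assumed)
    constexpr uint64_t kSpikeThreshold = 3;   // accesses per window (assumed)

    class TrafficMonitor {
    public:
        // Record one cache-line access; return true when this page spikes.
        bool record_access(uint64_t line_addr) {
            uint64_t page = line_addr >> kPageShift;
            return ++counts_[page] == kSpikeThreshold;   // spike indication data
        }
        void end_window() { counts_.clear(); }           // start a new sampling window
    private:
        std::unordered_map<uint64_t, uint64_t> counts_;
    };

    class TieredMemory {
    public:
        void move_to_fast(uint64_t page) {
            if (fast_pages_.insert(page).second)
                std::printf("migrated page %llu to fast memory\n",
                            (unsigned long long)page);
        }
    private:
        std::unordered_set<uint64_t> fast_pages_;
    };

    int main() {
        TrafficMonitor mon;
        TieredMemory mem;
        // A burst of cache-line accesses to the same 4 KiB page triggers a spike.
        uint64_t trace[] = {0x1000, 0x1040, 0x1080, 0x2000};
        for (uint64_t addr : trace)
            if (mon.record_access(addr))
                mem.move_to_fast(addr >> kPageShift);
        mon.end_window();
    }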
