Abstract:
Physical page information PA(a) corresponding to logical page information VA(a), which serves as a cache tag address, is retained in a logical cache memory 10. In the event of a cache miss when a shared area is accessed, the physical page information PA(a) retained in the cache memory is compared with physical page information PA(b) obtained by translating the search address through the TLB. When the comparison indicates a match, the cache entry is processed as a cache hit. The synonym problem, which arises when the same physical address is assigned to different logical addresses, is thereby solved while the number of TLB accesses is halved compared with the conventional arrangement.
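A behavioural sketch in Python of the synonym check described above; the CacheLine fields, the lookup routine and the shared_area flag are illustrative stand-ins, not the circuitry of the abstract.

    class CacheLine:
        def __init__(self, va_page, pa_page, data):
            self.va_page = va_page   # logical page information VA(a), used as the tag
            self.pa_page = pa_page   # physical page information PA(a) retained alongside
            self.data = data

    def lookup(cache, tlb, va_page, index, shared_area=False):
        line = cache.get(index)
        if line is None:
            return None                          # nothing cached at this index
        if line.va_page == va_page:
            return line.data                     # ordinary hit on the logical tag
        if shared_area:
            pa_page = tlb.get(va_page)           # translate the search address once
            if pa_page is not None and pa_page == line.pa_page:
                return line.data                 # synonym: same physical page, treat as hit
        return None                              # genuine miss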
Abstract:
A translation lookaside buffer for detecting and preventing conflicting virtual addresses from being stored therein is disclosed. Each entry in the buffer is associated with a switch which can be set and reset to enable and disable, respectively, a buffer entry. A switch associated with an existing entry will be reset if such entry conflicts with a new buffer entry.
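A minimal sketch of the per-entry switch, assuming a conflict is simply an existing valid entry with the same virtual page number; the names and the dict layout are illustrative.

    def tlb_insert(entries, new_vpn, new_ppn):
        for entry in entries:
            if entry["valid"] and entry["vpn"] == new_vpn:
                entry["valid"] = False           # reset the switch of the conflicting entry
        entries.append({"vpn": new_vpn, "ppn": new_ppn, "valid": True})

    def tlb_lookup(entries, vpn):
        for entry in entries:
            if entry["valid"] and entry["vpn"] == vpn:
                return entry["ppn"]
        return None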
Abstract:
An address translation unit for supporting multiple page modes, each page mode having a different page size. The address translation unit includes: a tag interface unit for outputting effective tag data in response to a page mode select signal; a tag memory for storing tag data and for comparing the effective tag data with previously stored tag data to generate a comparison signal in response to a word signal and a write control signal; a data interface unit for outputting effective physical data in response to the page mode select signal; a data memory for storing the effective physical data from the data interface unit and for outputting a converted physical address in response to the word signal, the write control signal and the comparison signal; a decoder interface unit for receiving a part of a linear address in response to a page mode signal and for outputting that part of the linear address as an entry index signal; and a decoding unit for decoding the part of the linear address and outputting the word signal, which selects an entry in the tag memory and the data memory, in response to the entry index signal, the write control signal and a write way signal.
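An illustrative Python sketch of how a page mode select signal can change which linear-address bits become the entry index, the effective tag, and the pass-through offset; the 4 KB/4 MB page sizes and the six-bit index are assumptions, not the unit's actual field widths.

    PAGE_MODES = {"4K": 12, "4M": 22}   # assumed page sizes: offset width in bits

    def split_linear_address(linear, mode, index_bits=6):
        offset_bits = PAGE_MODES[mode]
        offset = linear & ((1 << offset_bits) - 1)
        index  = (linear >> offset_bits) & ((1 << index_bits) - 1)   # entry index signal
        tag    = linear >> (offset_bits + index_bits)                # effective tag data
        return tag, index, offset

    def translate(tlb, linear, mode):
        tag, index, offset = split_linear_address(linear, mode)
        entry = tlb[mode][index]                 # word signal selects one entry
        if entry is not None and entry["tag"] == tag:   # comparison in the tag memory
            return (entry["ppn"] << PAGE_MODES[mode]) | offset
        return None                              # TLB miss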
Abstract:
A fast, fully associative translation lookaside buffer (TLB) with the ability to store and manage information pertaining to at least two different page sizes is disclosed. The TLB utilizes a tag array with tag lines and a data array with corresponding data lines. Within the tag array, each tag line incorporates a control cell which selectively enables or disables comparisons of tag bits to corresponding bits from an input address to the TLB. Within the data array, each data line incorporates control cells and multiplexing data cells to selectively determine whether bits in the physical address output of the TLB will be derived from the contents of the multiplexing data cells or from bits of the input address. The use of control cells in the tag array and of control cells and multiplexing data cells in the data array thereby provides the ability to store and manage information pertaining to at least two different page sizes in a single TLB.
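A sketch under an assumed 32-bit address space: a per-entry comparison mask stands in for the control cells that disable tag-bit comparisons, and the same mask stands in for the multiplexing data cells that pass input-address bits through to the output.

    MASK32 = 0xFFFFFFFF

    def tlb_lookup(entries, vaddr):
        for e in entries:
            # compare only the bits enabled by this entry's control cells (its page size)
            if (vaddr & e["cmp_mask"]) == (e["vtag"] & e["cmp_mask"]):
                # masked-out output bits come straight from the input address
                return (e["ptag"] & e["cmp_mask"]) | (vaddr & ~e["cmp_mask"] & MASK32)
        return None

    # Example entries: a 4 KB page compares bits 31..12, a 4 MB page only bits 31..22.
    entries = [
        {"vtag": 0x0040_3000, "ptag": 0x0812_A000, "cmp_mask": 0xFFFF_F000},  # 4 KB page
        {"vtag": 0xFFC0_0000, "ptag": 0x1240_0000, "cmp_mask": 0xFFC0_0000},  # 4 MB page
    ]
    print(hex(tlb_lookup(entries, 0x0040_3ABC)))   # -> 0x812aabc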
Abstract:
A cache with a translation lookaside buffer (TLB) that eliminates the need to retrieve a physical address tag from the TLB when accessing the cache. The TLB includes two content addressable memories (CAMs). For each new cache line, the tag portion of the cache stores, instead of a physical tag, a vector called a physical hit vector. Physical hit vectors are generated by a first TLB CAM. Each physical hit vector indicates all locations in the first TLB CAM containing the physical tag of the cache line. For a cache access, a second TLB CAM receives a virtual tag and generates a vector called a virtual hit vector. The virtual hit vector indicates the location in the second TLB CAM of the corresponding virtual tag. Then, instead of retrieving and comparing physical tags, the cache compares the virtual hit vector to a set of physical hit vectors without having to retrieve a physical tag. As a result, one operation is eliminated from a time-critical path, reducing the access time. For caches having variable page sizes, an additional CAM structure stores page offset bits and corresponding bit masks from the operating system. The page offset bits are then used to further qualify the comparison of virtual hit vectors and physical hit vectors.
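A simplified model of the hit-vector idea, with Python tuples of booleans standing in for the CAM match lines; all names are illustrative.

    def hit_vector(cam_tags, tag):
        # one boolean per CAM location; True where that location holds the tag
        return tuple(stored == tag for stored in cam_tags)

    def line_fill_vector(physical_cam, physical_tag):
        # stored in the cache tag array in place of the physical tag itself
        return hit_vector(physical_cam, physical_tag)

    def cache_line_hit(virtual_cam, virtual_tag, stored_physical_vector):
        virtual_vector = hit_vector(virtual_cam, virtual_tag)
        # hit if the virtual tag and the line's physical tag occupy the same TLB location
        return any(v and p for v, p in zip(virtual_vector, stored_physical_vector))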
Abstract:
A method and apparatus for translating a virtual address to a physical address. A virtual address to be translated has a virtual page offset and a virtual page number. The virtual address to be translated addresses a page of memory. The size of this page is unknown. There are L different possible page sizes where L is a positive integer greater than one. Each of the L different page sizes is selected to be a test page size and a test is performed. During the test, a pointer into a translation storage buffer is calculated. The pointer is calculated from the virtual address to be translated by assuming that the virtual address to be translated corresponds to a mapping of the test page size. The pointer points to a candidate translation table entry of the translation storage buffer. The candidate translation table entry has a candidate tag and candidate data. The candidate tag identifies a particular virtual address and the candidate data identifies a particular physical address corresponding to the particular virtual address. A virtual address target tag is extracted from the virtual address to be translated. The virtual address target tag is calculated by assuming that the virtual address to be translated corresponds to a mapping of the test page size. The target tag and the candidate tag are then compared. If the target tag matches the candidate tag, the candidate data is provided as the physical address translation corresponding to the virtual address to be translated.
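An illustrative probe loop for the method described above, assuming four page sizes (8 KB to 4 MB) and a 512-entry translation storage buffer; the field widths and the per-entry size_shift field are assumptions.

    PAGE_SHIFTS = [13, 16, 19, 22]      # L = 4 test page sizes
    NUM_ENTRIES = 512
    INDEX_BITS = 9                      # log2(NUM_ENTRIES)

    def translate(tsb, va):
        for shift in PAGE_SHIFTS:                       # select each test page size in turn
            vpn = va >> shift                           # assume a mapping of this size
            index = vpn & (NUM_ENTRIES - 1)             # pointer into the buffer
            cand = tsb[index]                           # candidate translation table entry
            if cand is None:
                continue
            target_tag = vpn >> INDEX_BITS              # target tag for this test size
            if cand["size_shift"] == shift and cand["tag"] == target_tag:
                return (cand["ppn"] << shift) | (va & ((1 << shift) - 1))
        return None                                     # no match for any page size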
Abstract:
A semiconductor integrated circuit device such as a data processing device having a set-associative translation look-aside buffer (TLB). A plurality of address arrays each have a second field for storing the value representing a page size. The values read from the second fields are used to change the range of address comparison by comparators. A plurality of data arrays each have a second field for storing a bit position address designating either an intra-page address or a page number following a page size change. The values read from the second fields of the address arrays are used as the basis for second selectors to select either an address in a predetermined location of an externally input virtual address or the address read from each of the second fields of the data arrays. The selected address is output as a physical address.
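A rough Python model of a single way of such a TLB, assuming 32-bit addresses and four encoded page sizes; the encodings and field names are placeholders, not the device's.

    PAGE_SIZE_SHIFTS = {0: 12, 1: 16, 2: 20, 3: 24}   # e.g. 4 KB, 64 KB, 1 MB, 16 MB

    def way_lookup(addr_entry, data_entry, vaddr):
        shift = PAGE_SIZE_SHIFTS[addr_entry["page_size"]]   # second field of the address array
        page_mask = (0xFFFFFFFF >> shift) << shift          # bits above the page offset
        # the comparator's range follows the stored page size
        if (vaddr & page_mask) != (addr_entry["vpn"] & page_mask):
            return None
        # the second selector takes page-number bits from the data array and the
        # remaining bits from the externally input virtual address
        return (data_entry["ppn"] & page_mask) | (vaddr & ~page_mask & 0xFFFFFFFF)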
Abstract:
A method of accessing a content addressable memory storing two bits of information representing one of an invalid state, a logic zero state, a logic one state, or a don't-care state is disclosed. The stored information is compared with a one-bit signal. A match is indicated when the one-bit signal represents a logic zero and the stored information represents the don't-care state, or when the one-bit signal represents a logic one and the stored information represents the don't-care state. An absence of a match is indicated when the one-bit signal represents a logic zero and the stored information represents the invalid state, or when the one-bit signal represents a logic one and the stored information represents the invalid state. The content addressable memory is especially adapted for use in a translation buffer providing variable page granularity. The don't-care state permits multiple virtual page numbers to match a single entry storing information for multiple physical pages. The invalid state eliminates the need for a dedicated valid bit in each entry.
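A sketch of one plausible two-bit encoding (the abstract does not give the actual encoding): 00 = invalid, 01 = logic zero, 10 = logic one, 11 = don't care.

    INVALID, ZERO, ONE, DONT_CARE = 0b00, 0b01, 0b10, 0b11

    def cell_match(stored, bit):
        if stored == INVALID:
            return False          # invalid never matches, so no separate valid bit is needed
        if stored == DONT_CARE:
            return True           # matches either a logic zero or a logic one input
        return stored == (ONE if bit else ZERO)

    def entry_match(cells, vpn_bits):
        # an entry matches when every cell matches its corresponding input bit
        return all(cell_match(c, b) for c, b in zip(cells, vpn_bits))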
Abstract:
When a computer system is upgraded, such as by adding a more advanced processor chip and/or a new operating system, a different page size may be employed. The page size is altered for data previously stored in a storage medium such as a hard disk in the computer system, without removing all of the data from the medium and rewriting it. Data is stored in the medium in blocks or sectors which have headers defining the block. Also, tables define memory objects and segments, and locate virtual memory addresses in physical memory. The headers and/or tables can be changed without rewriting all of the data in the sectors or pages in physical memory, so the page size is changed to accommodate the new system components, without excessive burden on system hardware or undue expenditure of time. In an example, in changing from a CISC processor with a 512-byte page size to a RISC system with a 4K-byte page size, the segments are changed to always be of a size of an integral multiple of 4K, and "extents" or subdivisions within a segment are changed to be multiples of 4K. Any excess space generated by these changes is zeroed. After alteration, the media (such as disks) can be accessed by either the CISC system or the new upgraded RISC system.
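A simple illustration of the size adjustment described above (function names are illustrative): extent and segment lengths are rounded up to a multiple of the new 4 KB page size and the excess space is zeroed.

    OLD_PAGE = 512           # CISC page/sector size
    NEW_PAGE = 4096          # RISC page size

    def round_up(nbytes, page=NEW_PAGE):
        return (nbytes + page - 1) // page * page

    def convert_extent(data: bytes) -> bytes:
        padded_len = round_up(len(data))
        return data + b"\x00" * (padded_len - len(data))   # zero any excess space

    # e.g. an extent of 7 old 512-byte sectors (3584 bytes) becomes one 4096-byte page
    assert len(convert_extent(b"x" * 7 * OLD_PAGE)) == NEW_PAGE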
Abstract:
A translation look-aside buffer (TLB) for translating a variable page size virtual page number to a physical page number. The TLB partitions the virtual page number into an upper portion and a lower portion. The upper portion is always compared to an upper virtual page number entry in a first content addressable memory, while only certain bits of the lower portion are selectively compared to a corresponding number of bits in a lower virtual page number entry in a second content addressable memory. The number of bits compared in the second content addressable memory is determined by the specified size of the physical page. The TLB includes a page size memory having a plurality of page size entries, wherein the number of bits compared for each of the lower virtual page entries is specified by a corresponding page size entry. Associated with each bit in the lower virtual page number entries is an enable transistor for selectively enabling the comparison of that bit in the lower virtual page number entry. The enable transistor includes a control input coupled to a corresponding bit in a corresponding page size entry; the enable transistor enables the single-bit comparison when the corresponding bit in the page size entry is set to an enable state and disables the comparison when that bit is set to a disable state.
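A behavioural sketch with assumed bit widths: the upper portion is always compared, while each bit of the lower comparison is enabled by the corresponding bit of the entry's page size field (1 = enable, 0 = disable), standing in for the enable transistors.

    LOWER_BITS = 10                      # assumed width of the lower virtual page portion

    def entry_match(entry, vpn):
        upper = vpn >> LOWER_BITS
        lower = vpn & ((1 << LOWER_BITS) - 1)
        if upper != entry["upper_vpn"]:                    # first CAM: always compared
            return False
        mask = entry["page_size_mask"]                     # per-bit enable from the page size entry
        return (lower & mask) == (entry["lower_vpn"] & mask)   # second CAM: selective bits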