Contents addressable memory

Abstract

In a contents addressable memory for a fully associative cache memory in which address bit values in respect of data to be retrieved from cache are applied to the bit lines of respective columns of cells for comparison with the bit values held by the cells, a match in any one cell is arranged to forward bias a match line device in that cell, the match line devices of a row being connected in cascade, so that if a match is obtained along a row a current path is provided along the row to a respective current sensing circuit to indicate the match.

Claims

1. A contents addressable memory comprising an array of memory cells arranged in rows and columns, wherein each of a plurality of said memory cells includes a cascadable match line device, and said match line devices in respect of a row of said plurality of memory cells are arranged to be connected in a series path to an input of a current sensing circuit.
2. A contents addressable memory in accordance with Claim 1 wherein there are provided register means to hold a plurality of data bit values, means to apply respective ones of said plurality of data bit values to the memory cells of a row, and means in each memory cell in said row responsive to a data bit value held in said memory cell and to the respective one of said data bit values from said register means selectively to bias said match line device into a substantially conducting condition.
3. A contents addressable memory comprising a plurality of memory cells arranged in rows and columns, with a respective pair of data bit lines associated with each column of memory cells, wherein each memory cell includes a match line transistor device and circuit means responsive in operation to a data bit value held by said memory cell and to a data bit value represented by signals on the respective pair of data bit lines selectively to bias said match line transistor device into a conductive condition, and the match line transistor devices of a row of memory cells are arranged to be connected in a series path to an input of a current sensing circuit.
4. A contents addressable memory in accordance with Claim 3 including a register for holding a plurality of data bit values and means to apply signals representing a respective one of said plurality of data bit values to each said respective pair of data bit lines.
5. A contents addressable memory in accordance with Claim 4 wherein the signals representing said respective ones of said plurality of data bit values are applied to the respective data bit lines in such a sense that if the plurality of data bit values held in said register match those held in respective ones of a row of memory cells, the match line transistor devices of that row of memory cells are all biased to said conductive condition so that a current flows in said series path to the input of said current sensing circuit.
6. A contents addressable memory in accordance with Claim 5 wherein said current sensing circuit is arranged to provide a voltage output signal in response to a current flow in said series path to indicate said match.
Description

The present invention relates to contents addressable memories. In particular although not exclusively the invention relates to contents addressable memories for use as fully associative cache memories.

Cache memories may be characterised by the number of cache locations into which a particular block of data may be placed. For example, if a block of data called up from main storage may be placed in any of four separate locations in a cache memory, the cache is said to be four way set associative. Caches range from one way set associative, or direct-mapped, caches, in which a block may be placed in only one location, to fully associative caches, in which a block of data may be placed in any location.

Each location in a cache memory comprises an address store location and an associated data store location. When a processor served by the cache memory issues a new address, the cache compares this address with all the addresses it holds in its address store locations, and if the new address matches one of them the data held in the associated data store location is passed to the processor. In a direct mapped cache, in which only one location can possibly hold the required data, only one comparison is required. As the level of associativity increases so does the number of comparisons required, and since these comparisons must be made in parallel to meet the timing constraints imposed by the processor, this requires the use of a contents addressable memory.
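As an illustration only (this sketch is not part of the patent disclosure), the parallel lookup described above can be modelled in a few lines of Python; the class and method names are hypothetical, and the comparisons that a contents addressable memory performs simultaneously are simply iterated here:

```python
from typing import Optional

class FullyAssociativeCache:
    """Behavioural model of a fully associative cache lookup."""

    def __init__(self, num_locations: int) -> None:
        # Each location pairs an address store entry with a data store entry.
        self.addresses: list[Optional[int]] = [None] * num_locations
        self.data: list[Optional[int]] = [None] * num_locations

    def lookup(self, address: int) -> Optional[int]:
        # In hardware every comparison is made at once; here we iterate.
        for i, stored in enumerate(self.addresses):
            if stored == address:
                return self.data[i]   # hit: data passed to the processor
        return None                   # miss: data must be called up from main storage
```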

According to one aspect of the present invention in a contents addressable memory comprising an array of memory cells arranged in rows and columns, each of a plurality of said memory cells includes a cascadable matchline device, and said matchline devices in respect of a row of said plurality of memory cells are arranged to be connected in a series path to an input of a current sensing circuit.

According to another aspect of the present invention in a contents addressable memory comprising a plurality of memory cells arranged in rows and columns, with a respective pair of data bit lines associated with each column of memory cells, each said memory cell includes a matchline transistor device and circuit means responsive in operation to a data bit value held by said memory cell and to a data bit value represented by signals on the respective pair of data bit lines selectively to bias said matchline transistor device into a conductive condition, and the matchline transistor devices of a row of memory cells are arranged to be connected in a series path to an input of a current sensing circuit.

A contents addressable memory in accordance with the present invention will now be described by way of example with reference to the accompanying drawings, of which:-

  • Figure 1 shows schematically a cache memory arrangement utilising contents addressable memory,
  • Figure 2 shows diagrammatically a known form of contents addressable memory cell,
  • Figure 3 shows diagrammatically a contents addressable memory cell in accordance with the present invention, and
  • Figure 4 shows schematically part of a contents addressable memory utilising the form of memory cell shown in Figure 3.

Referring first to Figure 1 a cache memory for a processor (not shown) comprises a random access memory (RAM) 1, in which is held data from a main storage area (not shown) which has been called up for use by the processor or which is required to be available to the processor for rapid access, together with a contents addressable memory 2 in which are held addresses, or tags, identifying and locating particular data words held in the random access memory 1. The addresses or tags may correspond to the main memory addresses of the respective data words.

The data words and their addresses may be entered in the cache memory in the sequence in which they are called up from main memory or, in known manner, in accordance with a replacement algorithm under the control of a replacement control unit 3.

When the processor issues an address for data which is to be called up, that address is entered in a register 4, whereupon it is compared with all the addresses held in the memory 2, and if a hit is obtained, the corresponding data word or words held in the data RAM 1 is or are read out by way of an output register 5. If no hit is obtained the required data has to be called up from main memory.
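Purely as a hedged sketch of the miss path (the patent leaves the choice of replacement algorithm to the replacement control unit 3 and does not specify one), a round-robin policy is assumed below; it builds on the hypothetical FullyAssociativeCache class sketched earlier:

```python
class CacheWithReplacement(FullyAssociativeCache):
    """Adds a miss path: on a miss the word is fetched from main memory and
    stored at a location chosen by an (assumed) round-robin replacement policy."""

    def __init__(self, num_locations: int, main_memory: dict) -> None:
        super().__init__(num_locations)
        self.main_memory = main_memory   # stand-in for the main storage area
        self.next_victim = 0             # round-robin pointer (assumption)

    def read(self, address: int) -> int:
        word = self.lookup(address)
        if word is not None:
            return word                  # hit: served from the cache
        word = self.main_memory[address] # miss: call the data up from main memory
        i = self.next_victim             # replacement control picks a location
        self.addresses[i], self.data[i] = address, word
        self.next_victim = (i + 1) % len(self.addresses)
        return word
```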

Referring now to Figure 2, a known form of contents addressable memory cell comprises transistors m1 to m4 connected as a pair of cross-coupled inverters, which are connected to bit lines bl and blb by way of access transistors m6 and m5 respectively. These access transistors are arranged to be controlled by potentials applied to a word line wl. The bit lines bl and blb are common to a column of cells while the word line wl is common to a row of cells. A match line ml runs across the array of cells parallel to the word line wl, this match line being precharged to the supply voltage vdd at the beginning of a comparison cycle. The bit values of an address loaded into the register 4 are applied to respective ones of the bit lines bl and blb of the array of contents addressable memory cells so that each address bit value will be compared with the values stored in the associated column of cells. The address bit values are applied in such a sense that if at any cell a match occurs transistors m7 to m10 of that cell will be biased so that neither the pair of transistors m7, m9 nor the pair of transistors m8, m10 will provide a path to ground from the match line ml. If a match occurs at all cells along a row, then the match line ml for that row will remain charged to vdd and a hit will be indicated. On the other hand, if at any cell along a row a mismatch occurs, either the transistors m7, m9 or the transistors m8, m10 will provide a path to ground for the respective match line and a miss will be indicated by the lowered potential on that match line.
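As a purely behavioural illustration (the real circuit is analogue, and the voltage value below is a placeholder), a row of the Figure 2 cells can be modelled as follows; note that every mismatching row discharges its match line and must be precharged again for the next cycle, which is where the power discussed below is spent:

```python
def precharged_row_match(stored_bits: list, applied_bits: list, vdd: float = 1.0) -> bool:
    """Model of a Figure 2 row: the match line ml is precharged to vdd, and any
    cell whose stored bit differs from the applied bit pulls it to ground
    (through m7/m9 or m8/m10)."""
    match_line = vdd                          # precharged at the start of the cycle
    for stored, applied in zip(stored_bits, applied_bits):
        if stored != applied:
            match_line = 0.0                  # mismatch: path to ground discharges ml
    return match_line > vdd / 2               # line still high indicates a hit
```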

When the comparison is done at high speed and over many rows the result is a large current transient on the power supply rail, and even if the circuit is designed and toleranced to avoid device failure, the power consumed in discharging a large number of match lines can still be significant.

Referring to Figures 3 and 4, in a contents addressable memory cell in accordance with the invention the transistors m7 to m10 of the known form of match line circuit are replaced by transistors m23 and m24, which respectively connect the bit lines bl and blb to a node n1, and by a transistor m25 connected in series in the match line between its input mi and its output mo, with its gate electrode connected to the node n1. As shown in Figure 4, the match lines of a row of cells are connected in series with one another and with a transistor m20 between ground and one input of a current sensing circuit comprising transistors m10 to m19.

At the beginning of an address comparison cycle the transistor m20 of each row of memory cells is biased into conduction so as to connect the match line input mi of the first memory cell in the row to ground. The last cell in the row has its match line output mo connected to the transistor m10 of the respective current sensing circuit. When access transistors m12 and m13 are switched on by an access signal sel, the current sensing circuit provides current paths to vdd through current mirror input devices m14 and m15, respectively from the match line and from a load path comprising transistors m21 and m22; the inverse of sel, selb, is used to bring the intermediate nodes of the current transfer circuit m10 to m13 to ground potential before the start of the comparison cycle. The current mirror output currents through transistors m16 and m17 are applied to a single-ended current mirror output circuit comprising transistors m18 and m19, which converts the difference between the match line current and the load path current into a voltage output signal on the match-out path.

Address bit values from the register 4 are applied to respective ones of the bit lines bl and blb in such a sense that if at any cell the bit value matches that held by the cell then the transistors m23 and m24 are biased such that the potential at the node n1 forward biases the transistor m25. In the form of cell shown in Figure 3 the address bit values are applied inverted to the bit lines bl and blb during a comparison cycle, compared to the sense in which corresponding bit values are held in the memory cells, to simplify the physical layout of the cells on chip.
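A logic-level sketch of a single Figure 3 cell follows; the exact wiring of m23 and m24 to the storage nodes is an assumption here, and only the net behaviour, that node n1 is driven high when the bits agree, is taken from the description:

```python
def cell_match_bias(stored_bit: int, search_bit: int) -> bool:
    """Return True if this cell's transistor m25 is biased into conduction."""
    # The search bit is applied inverted on the bit lines (assumed sense).
    bl, blb = 1 - search_bit, search_bit
    # The cell's stored state selects one of m23/m24, coupling the corresponding
    # bit line onto node n1; n1 ends up high only when stored and search bits agree.
    n1 = blb if stored_bit == 1 else bl
    return bool(n1)                    # a high n1 forward-biases m25
```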

If matches occur all along a row then a conducting path is provided along the match line. The load path through transistors m21 and m22 is sized to provide a d.c. reference current of approximately half of the current which flows in a conducting match line, so that match and no-match conditions provide current differences to the current mirror output circuit that are approximately equal but in opposite senses.
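To make the sensing arrangement concrete, here is an idealised numerical sketch; the current values are illustrative assumptions, not figures from the patent:

```python
def row_match_out(per_cell_match: list, i_match: float = 20e-6) -> bool:
    """Idealised model of one Figure 4 row and its current sensing circuit.
    i_match is an assumed full-row match current; the load path through
    m21/m22 is sized for roughly half of it."""
    i_row = i_match if all(per_cell_match) else 0.0   # series path conducts only on a full match
    i_ref = i_match / 2                               # d.c. reference from the load path
    # The current mirrors (m14 to m19) turn the current difference into a
    # voltage on the match-out path; the difference is +/- i_match/2.
    return i_row > i_ref
```

Combined with the per-cell sketch above, a full row evaluation would read `row_match_out([cell_match_bias(s, a) for s, a in zip(stored_word, address_bits)])`, where `stored_word` and `address_bits` are hypothetical bit sequences.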

Since in many applications only a single match occurs in a comparison cycle, the total current consumed by the present circuit is significantly less than in the known circuit of Figure 2. At the same time the ability of the current sensing circuit to detect currents on high capacitance match lines at high speed, with no voltage excursion necessary, makes the present circuit ideal for high performance cache memory arrangements. Since the transistors m25 are cascaded along the match line, the voltage levels provided on the bit lines bl and blb to ensure that these transistors m25 are on or off as required can be significantly less than the vdd levels applied in known circuits. The absolute minimum high level needed on the bit lines in order for the potential at n1 to turn on m25 is approximately vdd/2. Since the dynamic power dissipated in driving a bit line is proportional to the square of its voltage swing, the power consumed in driving the many high capacitance bit lines for a comparison cycle can therefore be reduced by a factor of approximately four.
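A back-of-envelope check of that factor, using the usual CV² scaling of dynamic switching energy (the capacitance and supply values below are arbitrary assumptions):

```python
def bitline_switching_energy(c_bitline: float, v_swing: float) -> float:
    # Dynamic switching energy scales as C * V^2 per charge/discharge cycle.
    return c_bitline * v_swing ** 2

vdd = 1.2          # assumed supply voltage
c_bl = 200e-15     # assumed bit line capacitance (200 fF)
ratio = bitline_switching_energy(c_bl, vdd) / bitline_switching_energy(c_bl, vdd / 2)
print(ratio)       # -> 4.0: halving the bit line swing quarters the drive energy
```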
