News Column


June 18, 2014



Researchers Submit Patent Application, "Cache Memory and Methods for Managing Data of an Application Processor Including the Cache Memory", for Approval

By a News Reporter-Staff News Editor at Electronics Newsweekly -- From Washington, D.C., VerticalNews journalists report that a patent application by the inventors KIM, Sungyeum (Goyang-si, KR); KWON, Hyeokman (Suwon-si, KR); KWON, Youngjun (Seongnam-si, KR); CHOI, Kiyoung (Seoul, KR); AHN, Junwhan (Seoul, KR), filed on November 21, 2013, was made available online on June 5, 2014.

No assignee has been named for this patent application.

News editors obtained the following quote from the background information supplied by the inventors: "The inventive concepts described herein relate to semiconductor devices, and more particularly, relate to a cache memory and/or data managing methods of an application processor including the cache memory.

"In recent years, the use of portable devices such as smart phones, smart pads, notebook computers, and so on has increased rapidly. Developments in semiconductor and communications technologies have increased the throughputs of portable devices. Such increases in throughput have led these devices to be called 'smart devices'.

"Smart devices enable a user to install applications freely and to produce and process information using the installed applications. As more and more applications and content for such smart devices are developed, improvements to operability of the smart device are desired.

"Among methods of improving operability of such devices, one method may be directed to improving the performance of a cache memory, which is used by an application processor of a smart device, so as to reduce power consumption of the application processor."

As a supplement to the background information on this patent application, VerticalNews correspondents also obtained the inventors' summary information for this patent application: "In one example embodiment of the inventive concepts, a cache memory system includes a main cache memory including a nonvolatile random access memory. The main cache memory is configured to exchange data with an external device and store the exchanged data, each exchanged data unit including less significant bit (LSB) data and more significant bit (MSB) data. The cache memory system further includes a sub-cache memory including a random access memory. The sub-cache memory is configured to store LSB data of at least a portion of data stored at the main cache memory, wherein the main cache memory and the sub-cache memory are formed of a single-level cache memory.
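The MSB/LSB split described above can be sketched minimally in code. The bit width and helper names below are illustrative assumptions, not from the application:

```python
# Hypothetical sketch of splitting a data unit into MSB data (held by the
# nonvolatile main cache) and LSB data (held by the sub-cache).
# The 8-bit unit and 4/4 split are assumptions for illustration.
MSB_BITS = 4
LSB_MASK = (1 << MSB_BITS) - 1

def split_data(value):
    """Split a data unit into (MSB data, LSB data)."""
    return value >> MSB_BITS, value & LSB_MASK

def join_data(msb, lsb):
    """Recombine MSB and LSB data into the original unit."""
    return (msb << MSB_BITS) | lsb
```

For example, `split_data(0b10110110)` yields `(0b1011, 0b0110)`, and `join_data` reverses the split losslessly.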

"In yet another example embodiment, each of the main cache memory and the sub-cache memory includes a plurality of lines, an invalid line being one of the plurality of lines that does not store data. When an invalid line, which does not store valid data, exists at the sub-cache memory and new data is received from the external device, the main cache memory is further configured to store MSB data of the received data at an MSB area of a selected invalid line of the main cache memory and the sub-cache memory is further configured to store LSB data of the received data at the invalid line of the sub-cache memory.

"In yet another example embodiment, each of the main cache memory and the sub-cache memory includes a plurality of lines, an invalid line being one of the plurality of lines that does not store data. When an invalid line, which does not store valid data, does not exist at the sub-cache memory and new data is received from the external device, the sub-cache memory is further configured to write LSB data stored at a selected line of the sub-cache memory to an LSB area of a corresponding line of the main cache memory, invalidate the written LSB data at the selected line of the sub-cache memory, and store LSB data of the received data at the selected line of the sub-cache memory. The main cache memory is further configured to store MSB data of the received data at an MSB area of a selected invalid line of the main cache memory.
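The two store embodiments above (an invalid sub-cache line exists vs. does not exist) can be sketched as a toy model; the class name, structure, and victim-selection policy below are assumptions for illustration only:

```python
# Toy model of the store policy: MSB data always goes to the main cache;
# LSB data goes to an invalid sub-cache line if one exists, otherwise a
# victim's LSB data is written back to the LSB area of its corresponding
# main-cache line before the sub-cache line is reused.
class SplitCache:
    def __init__(self, sub_lines):
        self.main = {}            # line index -> {"msb": ..., "lsb": ...}
        self.sub = {}             # line index -> LSB data (valid lines only)
        self.capacity = sub_lines

    def store(self, line, msb, lsb):
        # MSB data of the received data goes to the main cache's MSB area.
        self.main[line] = {"msb": msb, "lsb": None}
        if len(self.sub) < self.capacity:
            # An invalid (unused) sub-cache line exists: store LSB there.
            self.sub[line] = lsb
        else:
            # No invalid line: write back a victim's LSB data to the LSB
            # area of its corresponding main-cache line, invalidate it in
            # the sub-cache, then reuse the freed line for the new LSB.
            victim = next(iter(self.sub))
            self.main[victim]["lsb"] = self.sub.pop(victim)
            self.sub[line] = lsb
```

With a one-line sub-cache, storing a second data unit forces the first unit's LSB data back into the main cache, matching the write-back step described above.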

"In yet another example embodiment, if a difference exists between LSB data of an update data received from the external device and LSB data stored at the sub-cache memory, the sub-cache memory is further configured to update the LSB data of the data stored at the sub-cache memory with the LSB data of the update data.

"In yet another example embodiment, if a difference exists between LSB data of an update data received from the external device and LSB data of data stored at the main cache memory, the main cache memory is further configured to update the LSB data of the original data stored at the main cache memory with the LSB data of the update data.
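The differential-update behavior in the two embodiments above amounts to rewriting LSB data only when it differs from what is cached, which is attractive when writes to the nonvolatile main cache are costly. A minimal sketch, with illustrative names:

```python
# Sketch of the differential update: the cached LSB data is replaced only
# when the incoming update actually differs, skipping redundant writes.
def update_lsb(cached_lsb, new_lsb):
    """Return (stored value, whether a write was performed)."""
    if cached_lsb != new_lsb:
        return new_lsb, True      # difference exists: update the LSB data
    return cached_lsb, False      # identical: skip the write
```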

"In yet another example embodiment, when MSB data of selected data is stored at the main cache memory, LSB data of the selected data is stored at the sub-cache memory, and the selected data is to be read by the external device, the main cache memory is further configured to provide the MSB data stored at the main cache memory to the external device and the sub-cache memory is further configured to provide LSB data stored at the sub-cache memory to the external device.

"In yet another example embodiment, when MSB data of selected data is stored at the main cache memory, LSB data of the selected data is stored at the main cache memory, and the selected data is to be read by the external device, the main cache memory is further configured to provide MSB data and LSB data stored at the main cache memory to the external device.
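The two read embodiments above reduce to one selection step: MSB data always comes from the main cache, while LSB data comes from the sub-cache when held there, and from the main cache's LSB area otherwise. This sketch assumes a simple dictionary representation of a main-cache line:

```python
# Sketch of the read path: assemble (MSB, LSB) for the external device.
def read(main_line, sub_lsb):
    """main_line: {"msb": ..., "lsb": ...} from the main cache;
    sub_lsb: LSB data from the sub-cache, or None if not held there."""
    if sub_lsb is not None:
        return main_line["msb"], sub_lsb          # LSB from the sub-cache
    return main_line["msb"], main_line["lsb"]     # both from the main cache
```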

"In yet another example embodiment, the main cache memory is a magnetic random access memory.

"In yet another example embodiment, the sub-cache memory is a static random access memory.

"In yet another example embodiment, the sub-cache memory consumes less power for a write operation compared to a write operation carried out by the main cache memory.

"In yet another example embodiment, the sub-cache memory operates based on the main cache memory.

"In yet another example embodiment, the main cache memory includes an address buffer configured to store a line index and a tag received from the external device. The main cache memory further includes a plurality of data arrays, each data array including a plurality of lines, each line being configured to store LSB data and MSB data associated with one of the received data. The main cache memory further includes a tag array configured to store tags associated with data stored at the plurality of data arrays and a first intermediate circuit configured to access the tag array and determine whether a first hit is generated, based on the line index and the tag stored at the address buffer. The main cache memory further includes a first input/output circuit configured to access the plurality of data arrays according to the line index and the determination of the generated first hit by the first intermediate circuit.

"In yet another example embodiment, the sub-cache memory includes an LSB address buffer configured to receive the line index from the address buffer, to receive information on a location of the plurality of data arrays for which the first intermediate circuit has determined that the first hit is generated, and output an LSB line index and an LSB tag based on the input line index and the received information. The sub-cache memory further includes a plurality of LSB data arrays, each LSB data array including a plurality of sub-lines, each sub-line being configured to store LSB data; an LSB tag array configured to store LSB tags associated with LSB data stored at the plurality of LSB data arrays. The sub-cache memory further includes a second intermediate circuit configured to access the LSB tag array and determine whether a second hit is generated, based on the LSB line index and the LSB tag output from the LSB address buffer. The sub-cache memory further includes a second input/output circuit configured to access the plurality of LSB data arrays according to the LSB line index and the determination of the generated second hit by the second intermediate circuit.
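The hit determination performed by the first and second intermediate circuits can be sketched as a tag comparison at the indexed line; the dictionary-based tag array below is an assumption for illustration:

```python
# Sketch of an intermediate circuit's hit check: the tag stored at the
# indexed line is compared against the requested tag.
def check_hit(tag_array, line_index, tag):
    """Return True if the tag array holds `tag` at `line_index`."""
    return tag_array.get(line_index) == tag
```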

"In one example embodiment of the inventive concepts, a data managing method of an application processor, which includes a main cache memory and a sub-cache memory, includes fetching MSB data and LSB data. The method further includes managing the fetched MSB data using an MSB area of the main cache memory and the fetched LSB data using at least one of the sub-cache memory and an LSB area of the main cache memory, wherein the MSB data and the LSB data form a data line being a data transfer unit.

"In yet another example embodiment, the managing includes receiving the LSB data and the MSB data; and storing the received MSB data at the MSB area of the main cache memory and the received LSB data at an invalid line of the sub-cache memory when an invalid line exists at the sub-cache memory, the invalid line being a line that does not store data.

"In yet another example embodiment, when an invalid line does not exist at the sub-cache memory, the method further includes writing, to the main cache memory, at least one additional LSB data previously stored at a given location of the sub-cache memory, and storing the received LSB data at the given location of the sub-cache memory.

"In yet another example embodiment, the managing includes receiving updated data including updated LSB data and updated MSB data, and reading data corresponding to the updated LSB data and the updated MSB data from at least one of the main cache memory and the sub-cache memory. The managing further includes comparing the read data with the updated LSB data and the updated MSB data, and updating LSB data of the read data stored at the sub-cache memory when (1) the comparison result indicates that the LSB data of the read data and the updated LSB data are different from each other and (2) the LSB data of the read data is stored at the sub-cache memory. The managing further includes updating LSB data of the read data stored at the LSB area of the main cache memory when (1) the comparison result indicates that the LSB data of the read data and the updated LSB data are different from each other and (2) the LSB data of the read data is stored at the LSB area of the main cache memory. The managing further includes updating MSB data of the read data stored at the MSB area of the main cache memory when the comparison result indicates that the MSB data of the read data and the updated MSB data of the received updated data are different from each other.

"In yet another example embodiment, the managing includes receiving a data request; selecting data corresponding to the data request from the main cache memory and the sub-cache memory; and reading the selected data.

"In yet another example embodiment, the managing includes decoding a tag of the main cache memory; accessing data of the main cache memory based on the decoded tag of the main cache memory; decoding a tag of the sub-cache memory while data of the main cache memory is accessed; and accessing data of the sub-cache memory, based on the decoded tag of the sub-cache memory.

"In yet another example embodiment, the managing includes decoding a tag of the main cache memory; accessing data of the main cache memory when the tag of the main cache memory is decoded; decoding a tag of the sub-cache memory when the tag of the main cache memory is decoded; and accessing data of the sub-cache memory when the tag of the sub-cache memory is decoded.

"In one example embodiment, an application processor is configured to exchange data with an external device and store a first portion of the exchanged data in a main cache memory of the application processor, the main cache memory including a nonvolatile random access memory. The application processor is further configured to store a second portion of the exchanged data in a sub-cache memory of the application processor, the sub-cache memory including a random access memory.

"In yet another example embodiment, the application processor is configured to exchange the data by at least one of receiving the data from an external device to be stored in at least one of the main cache memory and the sub-cache memory of the application processor and providing the stored data to be read by the external device.

"In yet another example embodiment, the first portion of the exchanged data includes more significant bit (MSB) data of the exchanged data and the second portion of the exchanged data includes less significant bit (LSB) data of the exchanged data.

"In yet another example embodiment, upon receiving data from the external device, the application processor is configured to store the MSB data of the received data in the main cache memory.

"In yet another example embodiment, upon receiving data from the external device, the application processor is configured to determine whether an empty location for storing the LSB data of the received data exists within the sub-cache memory and store the LSB data of the received data in the determined empty location of the sub-cache memory.

"In yet another example embodiment, the application processor is further configured to, upon determining that no empty location for storing the LSB data of the received data exists within the sub-cache memory, write LSB data of at least one additional data already stored in a given location of the sub-cache memory into an empty location of the main cache memory corresponding to the location in which the MSB data of the at least one additional data is stored, and store the LSB data of the received data in the given location of the sub-cache memory.

"In yet another example embodiment, upon receiving updated data, the application processor is further configured to determine whether LSB data of the updated data is different from the LSB data of the data already stored in one of the main cache memory and the sub-cache memory, and to replace the LSB data of the data already stored with the LSB data of the updated data upon determining that the two are different.

BRIEF DESCRIPTION OF THE DRAWINGS

"The above and other objects and features will become apparent from the following description with reference to the following figures, wherein like reference numerals refer to like parts throughout the various figures unless otherwise specified, and wherein

"FIG. 1 is a block diagram schematically illustrating a computing system, according to an example embodiment of the inventive concepts;

"FIG. 2 is a flow chart schematically illustrating a data managing method of an application processor of FIG. 1, according to an example embodiment;

"FIGS. 3A and 3B are diagrams illustrating relations among a main memory, a main cache memory and a sub-cache memory of FIG. 1, according to an example embodiment;

"FIG. 3C is a diagram schematically illustrating a main cache memory and a sub-cache memory of FIG. 3A, according to an example embodiment;

"FIG. 4 is a flow chart of a method for storing data at cache memories of FIG. 3C, according to an example embodiment;

"FIGS. 5A to 5C are block diagrams schematically illustrating example embodiments where a storing method of FIG. 4 is executed at a cache structure of FIG. 3C;

"FIG. 6 is a flow chart schematically illustrating a method of updating data of cache memories of FIG. 3C, according to an example embodiment;

"FIGS. 7A to 7C are block diagrams schematically illustrating example embodiments where a storing method of FIG. 6 is executed at a cache structure of FIG. 3C;

"FIG. 8 is a flow chart schematically illustrating a method where a read operation is executed at cache memories of FIG. 3C, according to an example embodiment;

"FIGS. 9A to 9C are block diagrams schematically illustrating example embodiments where a read method of FIG. 8 is executed at a cache structure of FIG. 3C;

"FIGS. 10A and 10B are flow charts schematically illustrating example embodiments where data is written at a main cache memory 113 and a sub-cache memory 115 of FIGS. 1, 3A and 3B;

"FIG. 10C is a flow chart schematically illustrating an embodiment where data is read from a main cache memory 113 and a sub-cache memory 115 of FIGS. 1, 3A and 3B, according to an example embodiment;

"FIG. 11A is a graph schematically illustrating access times of cache memories according to an example embodiment of the inventive concepts;

"FIG. 11B is a graph schematically illustrating access times of cache memories according to an example embodiment of the inventive concepts;

"FIG. 12A is a diagram for describing read operations of a main cache memory and a sub-cache memory, according to an example embodiment;

"FIG. 12B is a diagram for describing read operations of a main cache memory and a sub-cache memory, according to an example embodiment;

"FIG. 12C is a diagram for describing an operation where LSB data is written back to a main cache memory from a sub-cache memory, according to an example embodiment; and

"FIG. 13 is a block diagram schematically illustrating an application processor and an external memory and an external chip communicating with the application processor, according to an example embodiment."

For additional information on this patent application, see: KIM, Sungyeum; KWON, Hyeokman; KWON, Youngjun; CHOI, Kiyoung; AHN, Junwhan. Cache Memory and Methods for Managing Data of an Application Processor Including the Cache Memory. Filed November 21, 2013 and posted June 5, 2014. Patent URL: http://appft.uspto.gov/netacgi/nph-Parser?Sect1=PTO2&Sect2=HITOFF&u=%2Fnetahtml%2FPTO%2Fsearch-adv.html&r=485&p=10&f=G&l=50&d=PG01&S1=20140529.PD.&OS=PD/20140529&RS=PD/20140529

Keywords for this news article include: Patents, Electronics, Random Access Memory.

Our reports deliver fact-based news of research and discoveries from around the world. Copyright 2014, NewsRx LLC





Source: Electronics Newsweekly

