Because the CPU computes very quickly while reading data from memory is comparatively slow, a bottleneck arises (picture a narrow neck: force a large amount of data through it and it clogs, slowing everything down). Modern computers therefore use caching: when processing data, the CPU first pulls it from the cache (which is built into the CPU and exchanges data with it far faster than memory can), and the data in the cache is in turn fetched from memory.
Every cache works on the same principle: a fast device and a slow device are bridged by a buffer placed between them.
Question 2: What does "cache" mean in a computer? A cache is a temporary staging area for data: the computer keeps its most frequently used files close at hand, like tools and materials laid out on a workbench, which is more convenient than walking to the warehouse every time. Because caches are usually built from RAM (volatile storage whose contents vanish when power is lost), finished files are still written back to the hard disk or other permanent storage. The largest cache in a computer is main memory itself; the fastest are the L1 and L2 caches on the CPU; a video card's graphics memory acts as a cache for the GPU; and hard disks carry their own 16 MB or 32 MB caches. "Cache" should never be understood as one specific thing: it is a general term for this technique!
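The workbench idea can be sketched in a few lines of Python (a toy software cache, with the "warehouse" contents invented purely for illustration): results fetched from slow storage are kept in a fast dictionary, so repeat requests never have to leave the workbench.

```python
# Toy software cache: keep results of a slow "warehouse" lookup
# in a fast dictionary (the "workbench").
warehouse = {"bolt": 10, "nut": 5, "gear": 42}  # pretend this is slow storage

cache = {}

def fetch(item):
    if item in cache:              # cache hit: answer immediately
        return cache[item]
    value = warehouse[item]        # cache miss: go to the slow warehouse
    cache[item] = value            # keep a copy on the workbench for next time
    return value

print(fetch("gear"))  # first request: a miss, fetched from the warehouse
print(fetch("gear"))  # second request: a hit, served from the cache
```

Both calls print the same value; the difference is only where it came from, which is exactly the point of a cache.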
Question 3: What is a laptop's cache, and what is it for? Many people think the cache is part of main memory
and many technical articles teach it that way
but plenty of people still don't know where the cache actually lives or what it does
In fact, the cache is part of the CPU.
Its job is to bridge the speed gap between the CPU and memory
The data and instructions the CPU accesses most often are copied from memory into the CPU cache, so the CPU rarely has to go to memory, which is slow as a snail by comparison; it simply takes what it needs from the cache, which is far faster than memory
Here are a few points worth noting:
1. Because the cache holds only a copy of a small portion of the data in memory, the CPU will sometimes fail to find what it needs there (that data simply hasn't been copied from memory into the cache yet). The CPU then has to go to memory for it, which temporarily slows the system down, but it copies that data into the cache so it won't have to go to memory for it next time.
2. Because the set of most frequently accessed data changes over time (data that was rarely used a moment ago may suddenly be in demand, while data that was hot a moment ago goes cold), the contents of the cache must be replaced regularly according to some algorithm, ensuring that what the cache holds really is the most frequently accessed data
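The passage doesn't name a particular replacement algorithm; least-recently-used (LRU) is one common choice, sketched here in Python. Once the cache is full, it evicts whichever entry has gone unused the longest:

```python
from collections import OrderedDict

class LRUCache:
    """Tiny least-recently-used cache: one common replacement policy."""

    def __init__(self, capacity):
        self.capacity = capacity
        self.data = OrderedDict()

    def get(self, key):
        if key not in self.data:
            return None                      # cache miss
        self.data.move_to_end(key)           # mark as most recently used
        return self.data[key]

    def put(self, key, value):
        if key in self.data:
            self.data.move_to_end(key)
        self.data[key] = value
        if len(self.data) > self.capacity:
            self.data.popitem(last=False)    # evict the least recently used

cache = LRUCache(2)
cache.put("a", 1)
cache.put("b", 2)
cache.get("a")          # "a" becomes the most recently used entry
cache.put("c", 3)       # cache full: "b" (least recently used) is evicted
print(cache.get("b"))   # None: it was replaced
print(cache.get("a"))   # 1: it survived because it was used recently
```

Real CPU caches use hardware approximations of policies like this, but the idea of keeping the recently used data and evicting the stale data is the same.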
3. On first-level and second-level cache
To distinguish these two concepts, we first need to understand RAM
RAM stands in contrast to ROM: RAM loses its contents when the power is cut, whereas ROM retains them after a power failure
RAM comes in two kinds:
static RAM (SRAM) and dynamic RAM (DRAM). The former is much faster than the latter, and the memory we use today is generally dynamic RAM.
Some novices say that to speed up the system the cache should simply be enlarged: the larger the cache, the more data it holds, and the faster the system becomes.
The cache is usually static RAM, which is very fast,
but static RAM has low density (storing the same data takes about six times the silicon area of dynamic RAM)
and is expensive (for the same capacity, static RAM costs about four times as much as dynamic RAM),
so simply piling on more static RAM as cache would be foolish.
But to improve a system's performance and speed, it is still worthwhile to enlarge the cache so it can hold more data.
So instead of expanding the original static RAM cache, we add some high-speed dynamic RAM as a further cache.
This high-speed dynamic RAM is faster than ordinary dynamic RAM but slower than the original static RAM cache.
We call the original static RAM cache the first-level cache, and the dynamic RAM added later the second-level cache.
The contents of the first- and second-level caches are replicas (maps) of data that is frequently accessed in memory, and they exist to reduce accesses to slow memory by high-speed CPUs.
The CPU usually looks for data or instructions in this order: first the first-level cache; if it isn't found there, the second-level cache; and only if it isn't found there either does the CPU go to memory
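That lookup order can be sketched schematically in Python (a toy model of the search order only; the addresses and the promote-on-miss behavior are invented for illustration, and real hardware works in cache lines rather than single values):

```python
# Schematic multi-level lookup: try L1, then L2, then main memory.
# A miss at a level copies the value into the faster levels above it.
l1 = {}                                  # fastest, smallest
l2 = {}                                  # slower, larger
memory = {0x10: "data", 0x20: "instr"}   # slowest, holds everything

def cpu_read(addr):
    if addr in l1:
        return l1[addr], "L1 hit"
    if addr in l2:
        l1[addr] = l2[addr]              # promote into L1 for next time
        return l1[addr], "L2 hit"
    value = memory[addr]                 # last resort: slow main memory
    l2[addr] = value
    l1[addr] = value
    return value, "memory"

print(cpu_read(0x10))  # ('data', 'memory'): first access misses both caches
print(cpu_read(0x10))  # ('data', 'L1 hit'): now served from the fastest level
```

The second read never touches memory, which is the whole benefit the text describes.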
Question 4: What do "buffer" and "cache" mean on a computer? Let me explain the buffer in plain terms: I assume you are asking about the "buffering" you see every time you play a song or movie online. In fact, whenever you watch a movie or listen to a song, the player first downloads it (or part of it) to your computer and then plays what has been downloaded. Internet Explorer, for example, downloads to a path like C:\Documents and Settings\User name (usually "Administrator")\Local Settings\Temporary Internet Files, and what has "buffered" is simply the part already downloaded to your machine. As for the cache: a cache is a temporary storage area used to raise the data transfer rate; simply put, a temporary data exchange area. One of the largest caches in the computer is main memory; the CPU also has caches (the first and second levels), which speed up data exchange between the CPU and the hard disk and memory; hard disks and graphics cards have caches too. I think you pretty much get it now!
Question 5: What is a computer hard disk cache? Cache memory is a memory chip on the hard disk controller (64 MB on this one) with very fast access speed; it is the buffer between the disk's internal storage and its external interface. Because the disk's internal transfer speed and its external interface speed differ, the cache acts as a buffer between the two. The cache's size and speed directly affect the disk's transfer rate, are an important indicator of it, and can greatly improve the disk's overall performance. When the disk accesses scattered, fragmented data, it must constantly exchange data with memory; with a large cache, those fragments can be held temporarily in the cache, reducing the load on the rest of the system and raising the transfer speed. The hard disk cache mainly plays three roles.
First, read-ahead. When the CPU commands the disk to read data, the disk's control chip directs the head to read not only the requested cluster but also the next cluster or several clusters into the cache. Because data on disk is stored fairly contiguously, the hit rate of this read-ahead is high: the moment the next clusters are actually needed, the disk does not have to read again but transfers the data straight from the cache to memory. Since the cache is far faster than the head's read/write speed, this markedly improves performance. Second, caching writes. When the disk receives a write command, it does not write the data to the platters immediately; it stores the data temporarily in the cache, signals the system that the write has completed so the system can carry on with its next task, and then writes the cached data to the platters in an idle moment when it is not otherwise reading or writing. This certainly improves write performance, but it inevitably brings a safety risk: if power is suddenly lost while data is still in the cache, that data is lost. Disk manufacturers have a solution: on power-down, the head uses the drive's remaining inertia to write the cached data to a temporary area outside track zero, and on the next startup writes it to its proper destination.
The third role is holding recently accessed data. Some data needs to be accessed again and again; the disk's internal cache keeps the more frequently read data in the cache, so repeated reads can be served directly from it. A large cache certainly lets more data be held during the disk's read/write work and raises access speed, but that does not mean the bigger the cache, the better. Cache use is also an algorithmic problem: even a very large cache, without a highly effective algorithm, will have a low hit rate on cached data and fail to exploit its capacity. Algorithm and capacity complement each other: a larger cache demands a more effective algorithm, otherwise its benefit is greatly reduced. From a technical standpoint, a high-capacity cache together with a good algorithm directly shapes a disk's performance, and larger caches are a clear trend in future hard drive development.
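The write-caching behavior described above can be mimicked in Python (a pure software analogy; real drives do this in controller firmware): writes are acknowledged into a fast buffer immediately and flushed to the slow medium later.

```python
# Software analogy of a disk's write cache: acknowledge writes into a
# fast buffer at once, then flush them to the slow medium during idle time.
disk = {}            # the slow platters
write_cache = {}     # the fast on-controller buffer

def write(sector, data):
    write_cache[sector] = data   # report "written" right away
    return "ok"                  # the caller continues without waiting

def flush():
    """Done when the drive is idle; only now is the data truly on disk."""
    disk.update(write_cache)
    write_cache.clear()

write(7, b"hello")
print(7 in disk)        # False: still only in the cache (lost on power cut)
flush()
print(7 in disk)        # True: now safely on the platters
```

The window between `write` and `flush` is exactly the power-loss risk the text mentions, which is why drives dump the cache to a reserved area when power fails.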
Question 6: What is a computer CPU's cache? This is my answer to someone else's similar question; I hope it helps:
With the same core architecture, the same cache, and the same number of cores, a higher clock frequency means faster processing. Explanation: the clock frequency is the rate of the CPU's clock signal, which generally triggers on a rising or falling edge; that is, each transition between high and low level shifts the registers and advances the computation by one step. 3.0 GHz means three billion level transitions per second, hence three billion register shifts per second, and the more shifts per second, the faster the computation. The core architecture, meanwhile, is like the route you travel: a good architecture is like an underground tunnel straight to the destination, while a backward one is like a winding, muddy dirt road. Driving at 120 on the dirt road still won't beat riding a scooter through the tunnel, so the architecture is critical too. As for the caches: the first-, second-, and third-level caches hold instructions and data of different priority. The larger the cache, the more computation can be done before it has to be refilled; the smaller it is, the more often work must pause while it is refilled. In actual work the CPU often needs to read the same block of data repeatedly, and a larger cache capacity greatly raises the hit rate of the CPU's internal reads, so it does not have to go looking in memory or on the hard disk, which improves system performance.
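The claim that a larger cache raises the hit rate can be seen in a toy Python simulation (purely illustrative; the access pattern, cache sizes, and LRU-style eviction are all assumptions, not measurements of real hardware):

```python
# Toy demonstration: with an access pattern that revisits the same
# addresses (locality), a larger cache yields a higher hit rate.
def hit_rate(accesses, capacity):
    cache, hits = [], 0
    for addr in accesses:
        if addr in cache:
            hits += 1
            cache.remove(addr)      # move to the most-recently-used end
        elif len(cache) == capacity:
            cache.pop(0)            # evict the least recently used entry
        cache.append(addr)
    return hits / len(accesses)

# Cycle repeatedly over 8 addresses: a working set with heavy reuse.
pattern = list(range(8)) * 20

print(hit_rate(pattern, 4))   # 0.0: the cycling pattern thrashes a small cache
print(hit_rate(pattern, 8))   # 0.95: the cache fits the working set
```

The small cache evicts each address just before it is needed again, so every access misses; once the cache fits the working set, only the first pass misses.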
Question 8: What part of a computer's configuration does the cache depend on, and where are caches found? Hard drives, USB flash drives, CPUs, memory, and graphics cards all have caches; which matters most depends on how you use the machine.
Question 9: What are a computer's first-level and second-level caches?
Cache
Cache size is one of the CPU's important specifications, and the cache's structure and size have a large impact on CPU speed. The cache inside the CPU runs at a very high frequency, generally the same frequency as the processor itself, and is far more efficient than system memory or the hard disk. In actual work the CPU often needs to read the same block of data repeatedly, and increasing the cache capacity significantly raises the hit rate of the CPU's internal reads, avoiding trips to memory or the hard disk and thereby improving system performance. However, because of chip area and cost constraints, caches are quite small.
L1 Cache (level-1 cache) is the CPU's first level of cache, divided into a data cache and an instruction cache. The capacity and structure of the built-in L1 cache have a considerable impact on CPU performance; however, cache memory is made of static RAM and has a complex structure, so with limited CPU die area the L1 cache cannot be made very large. Typical server CPU L1 caches are 32 KB to 256 KB.
L2 Cache (level-2 cache) is the CPU's second layer of cache, which comes in on-chip and off-chip varieties. On-chip L2 cache runs at the same speed as the core clock, while off-chip L2 cache runs at only half of it. L2 capacity also affects CPU performance, and the principle is the bigger the better: the largest home CPUs once carried 512 KB, laptop CPUs now reach 2 MB, and the L2 caches of server and workstation CPUs are much larger still, reaching 8 MB and beyond.
L3 Cache (level-3 cache) comes in two kinds: early ones were external, while today's are built in. Its practical role is to further reduce memory latency and improve processor performance on computations over large volumes of data. Reducing memory latency and boosting large-data computation both help a great deal with gaming, and in the server space adding L3 cache still brings a significant performance gain. For example, a configuration with a larger L3 cache uses physical memory more efficiently, so its slower disk I/O subsystem can service more data requests; processors with larger L3 caches also provide more efficient file-system caching behavior and shorter message and processor queue lengths.
In fact, the earliest L3 cache appeared in AMD's K6-III processors. There, limited by the manufacturing process, the L3 cache was not integrated into the chip but placed on the motherboard, where it could only run at the system bus frequency and so was not much different from main memory. L3 cache was later used in Intel's Itanium processors for the server market. Intel went on to introduce an Itanium 2 processor with 9 MB of L3 cache, and later a dual-core Itanium 2 with 24 MB.
But on the whole, L3 cache is not that decisive for processor performance. For example, a Xeon MP processor with 1 MB of L3 cache was still no match for an Opteron, which suggests that improving the front-side bus does far more than adding cache.
Question 10: What is the purpose of the cache on a computer hard disk? It can be understood as temporary storage, needed because the hard disk's read/write speed and the memory's speed are not the same.
For example, imagine a machine that produces glass beads, and you pick up each bead as it spits it out.
Say you want 1,000 of them. It can't spit them all out at once, because you can't take them that fast. But put a box (one that holds more than 1,000) at its outlet, and it can spit them all out in one go and then sit idle,
while you take them from the box in several trips.
The cache works the same way: by temporarily storing data, it spares the system from repeatedly reading and writing small pieces of data.
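The bead machine and its box map directly onto a producer/consumer buffer; here is a small Python sketch of the same idea (a software analogy, with the bead counts chosen arbitrarily):

```python
from collections import deque

# The "box" between the fast bead machine and the slow hand picking beads.
box = deque()

def machine_produce(n):
    """Fast producer: spit out n beads in one burst, then sit idle."""
    for i in range(n):
        box.append(f"bead-{i}")

def person_take(batch):
    """Slow consumer: take a handful of beads at a time from the box."""
    taken = []
    for _ in range(min(batch, len(box))):
        taken.append(box.popleft())
    return taken

machine_produce(10)          # the machine dumps everything at once
print(person_take(4))        # first handful: beads 0-3
print(person_take(4))        # second handful: beads 4-7
print(len(box))              # 2 beads still waiting in the box
```

Neither side ever waits on the other's pace, which is exactly the role the box, and a cache, plays.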