Google Interview Question
Software Engineer / DevelopersCountry: United States
Interview Type: In-Person
Not sure what "compare" means here, but if the question is to comment on their behavior, my interpretation would be:
Write-through is one of the cache/backing-store update policies (others would be write-back, with or without a write buffer). With write-through, the process updates the cache value and the backing store at the same time, so every update incurs the latency of the slower backing store, leading to high I/O wait time and possibly a CPU yield. Write-back, by contrast, expedites things by delaying the write to the slower backing store.
TLB: a fast cache designed to hold MMU translations. When the CPU tries to access a virtual address, it first checks whether the translation exists in the TLB; if not, it goes down the memory hierarchy to fetch the required page-table entry. Once found, the TLB is updated for all future references.
PS: A write-through cache is beneficial when moving large amounts of data from one place to another. If there is no cache hit on a memory write, the data can be written directly to the backing store without preloading the corresponding cache line first (no-write-allocate). Hence the cache won't get "dirtied" as it would with a write-back cache.
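The difference in backing-store traffic described above can be sketched in a toy simulation. Everything here (the class names, the dict-based backing store, the write counters) is illustrative, not from the post:

```python
# Toy model: write-through updates the backing store on every write;
# write-back marks lines dirty and flushes only on eviction.

class WriteThroughCache:
    def __init__(self, backing):
        self.lines = {}
        self.backing = backing
        self.backing_writes = 0   # counts accesses to the slower backstore

    def write(self, addr, value):
        self.lines[addr] = value      # update the cache line...
        self.backing[addr] = value    # ...and the backing store at the same time
        self.backing_writes += 1


class WriteBackCache:
    def __init__(self, backing):
        self.lines = {}
        self.dirty = set()
        self.backing = backing
        self.backing_writes = 0

    def write(self, addr, value):
        self.lines[addr] = value      # update the cache only
        self.dirty.add(addr)          # mark dirty; backstore write is deferred

    def evict(self, addr):
        if addr in self.dirty:        # dirty line must be flushed on eviction
            self.backing[addr] = self.lines[addr]
            self.backing_writes += 1
            self.dirty.discard(addr)
        self.lines.pop(addr, None)


wt = WriteThroughCache({})
for v in range(5):
    wt.write(0x10, v)
print(wt.backing_writes)   # 5: every update reaches the backstore

wb = WriteBackCache({})
for v in range(5):
    wb.write(0x10, v)
wb.evict(0x10)
print(wb.backing_writes)   # 1: only the final value is written back
```

The counters make the trade-off concrete: write-through pays backing-store latency on every update, while write-back coalesces repeated writes to the same line at the cost of tracking dirty state.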
Write-through cache: write simultaneously to the cache and the underlying memory.
Look-aside cache: read simultaneously from the cache and the underlying memory; use whatever is found first.
They share the concept of accessing more than one level of the cache hierarchy at once, and both try to improve speed:
* write-through, by avoiding write-back of full cache lines when no free lines are left
* look-aside, by also reading 'through' the cache instead of first trying the cache and then the next level (that would cost t_cache + t_mem instead of min(t_cache, t_mem))
Both do this optimization at the cost of higher bandwidth usage on the underlying memory: they access the next level of the hierarchy on every write or read.
[Don't confuse a look-aside cache with the TLB/Translation Lookaside Buffer. That's an ordinary cache for a special purpose.]
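The latency argument above can be written out as a tiny timing model. The latencies T_CACHE and T_MEM are assumed example values, not measurements:

```python
# Assumed example latencies (arbitrary units), just to illustrate the formulas.
T_CACHE = 1    # fast cache access
T_MEM = 100    # slower underlying memory access

def serial_read(hit):
    # "First try the cache, then the next level": a miss pays both latencies.
    return T_CACHE if hit else T_CACHE + T_MEM

def look_aside_read(hit):
    # Both accesses are issued at once; whichever answers first is used,
    # so a miss costs only T_MEM, never T_CACHE + T_MEM.
    return T_CACHE if hit else T_MEM

print(serial_read(hit=False))       # 101
print(look_aside_read(hit=False))   # 100
print(look_aside_read(hit=True))    # 1
```

The saving per miss is small here (T_CACHE), but the look-aside read also keeps the memory bus busy on hits, which is exactly the bandwidth cost the answer mentions.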
Write Through Cache: Write is done synchronously both to the cache and to the backing store.
- MathCai October 19, 2012
Look Aside Cache: The Translation Lookaside Buffer (TLB) is a cache that memory management hardware uses to improve virtual address translation speed.
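The TLB lookup path described in the answers (check the TLB, fall back to the page table on a miss, then cache the translation) can be sketched as follows. The page size, the dict-based page table, and the sample mappings are illustrative assumptions:

```python
# Minimal sketch of virtual-to-physical translation with a TLB in front
# of a toy page table. All mappings here are made up for illustration.

PAGE_SIZE = 4096

tlb = {}                      # virtual page number -> physical frame (fast path)
page_table = {0: 7, 1: 3}     # toy page table, consulted only on a TLB miss

def translate(vaddr):
    vpn, offset = divmod(vaddr, PAGE_SIZE)
    if vpn in tlb:                 # TLB hit: translation found in the fast cache
        frame = tlb[vpn]
    else:                          # TLB miss: go down the hierarchy...
        frame = page_table[vpn]
        tlb[vpn] = frame           # ...and update the TLB for future references
    return frame * PAGE_SIZE + offset

print(translate(4100))   # vpn 1, offset 4 -> 3*4096 + 4 = 12292
print(1 in tlb)          # True: the translation is now cached
```

A second access to the same page then resolves entirely from the TLB, which is the speedup the answers describe.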