Amazon Interview Question
Senior Software Development Engineer
Team: Kindle
Country: India
Interview Type: In-Person
I think they are not asking only about the OS caching system; they might be asking about the caching layer used behind a server load balancer, where a HashMap suffices for small data but for huge data you need a Memcached-like system in Java.
I am not sure about this, so please correct me if I am wrong.
Suppose the cache holds 5 pages:
[1,2,3,4,5]
and the least recently used page has been referenced more than 10 times; say page 1 is the least recently used, but it has been referenced 12 times.
Now, if the user references a new page 6, should page 1 still be evicted in favor of 6?
@jv
Yes, that is the premise of LRU: only recency matters, not frequency. If you want a cache system to retain values based on how often they were accessed, you can use an
LFU (Least Frequently Used)
cache design, which counts how often an item is needed. The items used least often are discarded first.
Alternatively, you can use
ARC (Adaptive Replacement Cache)
which balances between LRU and LFU to give improved results. It dynamically adjusts the sizes of the protected segment and the probationary segment so as to make optimal use of the available cache space.
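To make the LFU idea concrete, here is a minimal sketch in Java. The class name and structure are my own for illustration; eviction scans for the lowest access count, which is O(n), whereas a production LFU would keep frequency buckets for O(1) eviction.

```java
import java.util.HashMap;
import java.util.Map;

// Minimal LFU cache sketch (illustrative, not from the original post).
class LFUCache<K, V> {
    private final int capacity;
    private final Map<K, V> values = new HashMap<>();
    private final Map<K, Integer> counts = new HashMap<>();

    LFUCache(int capacity) { this.capacity = capacity; }

    V get(K key) {
        if (!values.containsKey(key)) return null;
        counts.merge(key, 1, Integer::sum);   // bump access frequency
        return values.get(key);
    }

    void put(K key, V value) {
        if (capacity <= 0) return;
        if (!values.containsKey(key) && values.size() >= capacity) {
            // Evict the key with the smallest access count.
            K victim = null;
            int min = Integer.MAX_VALUE;
            for (Map.Entry<K, Integer> e : counts.entrySet()) {
                if (e.getValue() < min) { min = e.getValue(); victim = e.getKey(); }
            }
            values.remove(victim);
            counts.remove(victim);
        }
        values.put(key, value);
        counts.merge(key, 1, Integer::sum);
    }
}
```

Note how this answers the scenario above: a frequently referenced item survives eviction even if it was not touched recently, which is exactly where LFU and LRU diverge.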
Updated my main answer as well for completeness.
Implement an LRU cache.
How to implement LRU caching scheme? What data structures should be used?
We are given the total possible page numbers that can be referred. We are also given the cache (or memory) size, i.e. the number of page frames the cache can hold at a time. The LRU caching scheme removes the least recently used frame when the cache is full and a new page is referenced that is not in the cache. Please see the Galvin book for more details (see the LRU page replacement slide here).
We use two data structures to implement an LRU Cache.
1. A Queue which is implemented using a doubly linked list. The maximum size of the queue will be equal to the total number of frames available (cache size).
The most recently used pages will be near the front end and the least recently used pages near the rear end.
2. A Hash with page number as key and address of the corresponding queue node as value.
When a page is referenced, it may already be in memory. If it is, we detach its node from the list and bring it to the front of the queue.
If the required page is not in memory, we bring it into memory. In simple words, we add a new node to the front of the queue and record the node's address in the hash. If the queue is full, i.e. all the frames are occupied, we remove the node at the rear of the queue and add the new node to the front.
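The two-data-structure scheme above can be sketched in Java as follows. Class, method, and field names are my own; `refer(page)` is a hypothetical entry point standing in for a page access, and `leastRecentlyUsed()` is added only to make the eviction order observable.

```java
import java.util.HashMap;
import java.util.Map;

// LRU cache sketch: a doubly linked list as the queue (front = most
// recently used, rear = least recently used) plus a HashMap from page
// number to its list node, as described above.
class LRUCache {
    private static class Node {
        int page;
        Node prev, next;
        Node(int page) { this.page = page; }
    }

    private final int capacity;                        // number of page frames
    private final Map<Integer, Node> index = new HashMap<>();
    private Node front, rear;

    LRUCache(int capacity) { this.capacity = capacity; }

    // Reference a page: a hit moves its node to the front; a miss loads
    // the page, evicting from the rear if all frames are full.
    void refer(int page) {
        Node node = index.get(page);
        if (node != null) {
            detach(node);                              // hit: move to front
        } else {
            if (index.size() == capacity) {            // miss on a full cache
                index.remove(rear.page);
                detach(rear);
            }
            node = new Node(page);
            index.put(page, node);
        }
        pushFront(node);
    }

    int leastRecentlyUsed() { return rear.page; }

    private void detach(Node n) {
        if (n.prev != null) n.prev.next = n.next; else front = n.next;
        if (n.next != null) n.next.prev = n.prev; else rear = n.prev;
        n.prev = n.next = null;
    }

    private void pushFront(Node n) {
        n.next = front;
        if (front != null) front.prev = n;
        front = n;
        if (rear == null) rear = n;
    }
}
```

Both `refer` on a hit and eviction on a miss are O(1): the hash finds the node, and the doubly linked list lets us detach it without traversal, which is why a singly linked list would not suffice here.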
Reference: geeksforgeeks
- Expressions April 02, 2013