Amazon Interview Question


Country: United States
Interview Type: Phone Interview




3 of 3 votes

import java.util.*;


/*
 * We use a doubly linked list together with a HashMap for the LRU cache. The HashMap stores the key/value
 * pairs, but by itself it gives no information about which entry was least recently used. The linked list
 * tracks that order: when a new value is inserted, its node is added to the head of the list; when a key is
 * already present, it is refreshed by removing its node from the list and re-attaching it at the head. The
 * least recently used entry is therefore always the node just before the tail.
 */

public class CacheStrategies {
	
	public interface CacheStrategy<K, T>{
		T get(K key);
		void put(K key,T data);
	}
	
	static class CacheStrategyLRU<K, T> implements CacheStrategy<K, T> {
		
		class Node{
			K key;
			T data;
			Node next;
			Node prev;
			
			public Node(K key, T data){
				this.key = key;
				this.data = data;
			}
		}
		
		Node head, tail;
		Map<K, Node> map;
		int maxsize;
		
		public CacheStrategyLRU(int maxsize){
			this.maxsize = maxsize;
			map = new HashMap<K, Node>();
			// sentinel head and tail nodes keep attach/remove free of null checks
			head = new Node(null, null);
			tail = new Node(null, null);
			head.next = tail;
			tail.prev = head;
		}
		
		// insert node right after the given head sentinel (most recently used position)
		private void attach(Node head, Node node){
			node.prev = head;
			node.next = head.next;
			head.next.prev = node;
			head.next = node;
		}
		
		// unlink node from the list
		private void remove(Node node){
			node.prev.next = node.next;
			node.next.prev = node.prev;
		}
		
		
		@Override
		public T get(K key) {
			Node node = map.get(key);
			if(node==null){
				return null;
			}
			
			if(map.size()==1){
				return node.data;
			}
			remove(node);
			attach(head,node);
			return node.data;
		}

		@Override
		public void put(K key, T data) {
			if(maxsize<=0){
				return;
			}
			Node node = map.get(key);
			if(node!=null){
				remove(node);
				attach(head,node);
				node.data = data;
			}else{
				if(map.size() >= maxsize){
					// evict the least recently used entry: the node just before the tail sentinel.
					// Capture it first, because tail.prev changes once the node is detached.
					Node eldest = tail.prev;
					remove(eldest);
					map.remove(eldest.key);
				}
				Node nd = new Node(key,data);
				map.put(key, nd);
				attach(head,nd);
			}
			
		}
    	
    }  
	
}
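
A quick usage sketch (the LruDemo class name is made up; it assumes the same package as CacheStrategies and the static nested CacheStrategyLRU above):

public class LruDemo {
	public static void main(String[] args) {
		// capacity 2: inserting a third key evicts the least recently used one
		CacheStrategies.CacheStrategy<Integer, String> cache =
				new CacheStrategies.CacheStrategyLRU<Integer, String>(2);

		cache.put(1, "one");
		cache.put(2, "two");
		cache.get(1);           // key 1 refreshed; key 2 is now least recently used
		cache.put(3, "three");  // evicts key 2

		System.out.println(cache.get(2)); // null (evicted)
		System.out.println(cache.get(1)); // one
		System.out.println(cache.get(3)); // three
	}
}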

- sk June 21, 2012 | Flag Reply
0 of 0 votes

When putting a node while the cache is full, the key of the eldest node has to be captured before that node is detached with remove(tail.prev); otherwise tail.prev already points to a different node by the time the hashmap entry is removed. The eviction should look like the following:

K oldKey = tail.prev.key;
remove(tail.prev);
map.remove(oldKey);

- sergeyan September 28, 2015 | Flag
0 of 0 votes

Looks like a form of priority queue is needed.
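
A rough sketch of that idea in Java (all names here are made up, not an existing library API): keep a hash map for the values and a min-heap ordered by a global access counter. Since java.util.PriorityQueue cannot re-prioritize an entry in place, each access pushes a fresh heap entry and stale ones are skipped lazily at eviction time.

import java.util.*;

class PqLruCache<K, V> {
	// one heap entry per access; only the entry with the newest stamp per key is "live"
	private static final class Entry<K> {
		final K key;
		final long stamp;
		Entry(K key, long stamp) { this.key = key; this.stamp = stamp; }
	}

	private final int capacity;
	private long clock = 0;
	private final Map<K, V> values = new HashMap<>();
	private final Map<K, Long> latest = new HashMap<>();   // newest stamp per key
	private final PriorityQueue<Entry<K>> heap =
			new PriorityQueue<>((a, b) -> Long.compare(a.stamp, b.stamp));

	PqLruCache(int capacity) { this.capacity = capacity; }

	public V get(K key) {
		V v = values.get(key);
		if (v != null) touch(key);
		return v;
	}

	public void put(K key, V value) {
		if (capacity <= 0) return;
		values.put(key, value);
		touch(key);
		while (values.size() > capacity) {
			Entry<K> oldest = heap.poll();            // smallest stamp overall
			Long seen = latest.get(oldest.key);
			if (seen != null && seen == oldest.stamp) {
				values.remove(oldest.key);            // genuinely least recently used
				latest.remove(oldest.key);
			}
			// otherwise the entry is stale (the key was touched again later): skip it
		}
	}

	private void touch(K key) {
		long stamp = clock++;
		latest.put(key, stamp);
		heap.add(new Entry<>(key, stamp));
	}
}

The downside compared to the linked-list solutions above is the O(log n) heap operations and the stale entries that linger in the heap until eviction reaches them.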

- Victor June 21, 2012 | Flag Reply
0 of 2 votes

Extend LinkedHashMap and override

protected boolean removeEldestEntry(Map.Entry<K,V> eldest) {
	return size() > CACHE_SIZE;
}

- Anonymous June 21, 2012 | Flag Reply
0 of 0 votes

Can you please elaborate a bit?

- Victor June 21, 2012 | Flag
0 of 0 votes

Every time you put an element into the LinkedHashMap, you get a callback to the method mentioned above, and its parameter is the eldest entry in the map. So the eldest entry is removed whenever the map's size grows beyond the defined CACHE_SIZE.
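
A minimal sketch of this approach (the LruMap class name is just for illustration; the accessOrder flag passed to the LinkedHashMap constructor is what makes a get() count as a use):

import java.util.LinkedHashMap;
import java.util.Map;

class LruMap<K, V> extends LinkedHashMap<K, V> {
	private final int cacheSize;

	LruMap(int cacheSize) {
		super(16, 0.75f, true); // accessOrder = true: iteration order follows access, not insertion
		this.cacheSize = cacheSize;
	}

	@Override
	protected boolean removeEldestEntry(Map.Entry<K, V> eldest) {
		// called after every put; returning true drops the eldest entry
		return size() > cacheSize;
	}
}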

- Siva June 22, 2012 | Flag
0 of 0 votes

will the eldest get updated after a get()?

- anonymous July 07, 2012 | Flag
0 of 0 votes

Implementing the LRU cache algorithm, given the following interface:

interface CacheStrategy<K, T> {
    // get the data by key from the cache.
    T get(K key);
 
    // put <key, data> pair to the cache
    void put(K key, T data);
}


Answer:


Both get and put can be done in O(1) time.

1. The data structure

The core of the algorithm is a doubly linked list indexed by a hash map. The list is implicitly sorted by the age of the data: whenever an entry is accessed it is refreshed, i.e., its node is moved to the head of the list (age == 0).

     
class Node {
        Node prev; //doubly linked list
        Node next;
        T data; //data and the key
        K key;
}
Map<K, Node> map; //node is indexed by the key to support node access in O(1).

2. The doubly linked list

In a doubly linked list, insertion and deletion take only O(1), which makes it an efficient basis for the LRU algorithm. The reason to use a doubly linked list rather than a singly linked one is that nodes in the middle of the list must be deleted.

By defining sentinel head and tail nodes, the insertion and deletion of a node are simplified.

a. Initialize the head and tail nodes.
     
//hook up the head and tail nodes
head = new Node(null, null);
tail = new Node(null, null);
head.next = tail;
tail.prev = head;

b. Remove a node from the list.
     
private void detach(Node node) {
        // detach it from the queue
        node.prev.next = node.next;
        node.next.prev = node.prev;
}

c. Attach the node to the list.
     
private void attach(Node head, Node node) {
        // add a node after the head.
        node.prev = head;
        node.next = head.next;
        node.next.prev = node;
        node.prev.next = node;
}

3. Maintaining the cache

Getting data from the cache needs to refresh that data's position, that is, move its node from the middle of the list to the head.
     
@Override
public T get(K key) {
        Node node = map.get(key);
        if (node == null)
                return null;
 
        if (map.size() == 1)
                return node.data;
 
        // refresh
        // - detach it from the queue
        detach(node);
        // - attach it to head
        attach(head, node);
        return node.data;
}

Besides a refresh action, putting data into the cache also needs to maintain the cache size and update the content.
 
    
/**
 * put <key,data> into cache.
 */
@Override
public void put(K key, T data) {
        if (maxsize <= 0)
                return;
 
        Node node = map.get(key);
        if (node != null) {
                // hit. Refresh the data
                detach(node);
                attach(head, node);
                node.data = data;
        } else {
                // miss. Create new node and adjust the size.
                node = new Node(key, data);
                map.put(key, node);
                attach(head, node);
                if (map.size() > maxsize) {
                        // overflow: evict the eldest node (tail.prev).
                        // Capture it before detaching, since tail.prev changes afterwards.
                        Node eldest = tail.prev;
                        detach(eldest);
                        map.remove(eldest.key);
                }
        }
}

- Arya June 22, 2012 | Flag Reply
0 of 0 votes

How about a circular linked list, where the item to replace is always the first element in the queue?

- DashDash July 07, 2012 | Flag Reply
0 of 0 votes

What if an existing user logs in again? Searching the list for that entry is O(n) in that case.

LinkedHashMap is the correct solution.

- vnvaditya February 24, 2013 | Flag
0 of 0 votes

class CacheService
{
    List<string> LRUKeys;
    Hashtable lookup;
    int capacity;

    internal CacheService(int _capacity)
    {
        capacity = _capacity;
        lookup = new Hashtable();
        LRUKeys = new List<string>();
    }

    internal void Put(string ikey, object ival)
    {
        if (lookup.ContainsKey(ikey))
        {
            // key already cached: refresh its value and its position
            Update(ikey, ival);
            return;
        }
        if (LRUKeys.Count == capacity)
        {
            // evict the least recently used key (front of the list), not the incoming one
            string evictKey = LRUKeys[0];
            LRUKeys.RemoveAt(0);
            lookup.Remove(evictKey);
        }
        LRUKeys.Add(ikey);
        lookup.Add(ikey, ival);
    }

    internal object Get(string ikey)
    {
        object retObj = null;
        if (lookup.ContainsKey(ikey))
        {
            retObj = lookup[ikey];
            // move the key to the back of the list (most recently used)
            LRUKeys.RemoveAt(LRUKeys.IndexOf(ikey));
            LRUKeys.Add(ikey);
        }
        return retObj;
    }

    internal void Delete(string ikey)
    {
        if (lookup.ContainsKey(ikey))
        {
            lookup.Remove(ikey);
            LRUKeys.RemoveAt(LRUKeys.IndexOf(ikey));
        }
    }

    internal void Update(string ikey, object newVal)
    {
        if (lookup.ContainsKey(ikey))
        {
            lookup[ikey] = newVal;
            // refresh position
            LRUKeys.RemoveAt(LRUKeys.IndexOf(ikey));
            LRUKeys.Add(ikey);
        }
    }
}

- Sangeeta Arali April 29, 2015 | Flag Reply
0 of 0 votes

test

- test April 29, 2015 | Flag Reply

