Amazon Interview Question
SDE-2
Country: United States
Interview Type: In-Person
Q1. Use a linked list to maintain recency order and a hash table for lookup. Store the list node as the value in the hash table, so you get from key to list node to value in O(1).
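A minimal sketch of the linked-list + hash-table cache described above, evicting the least recently used entry; the class and method names are illustrative, not from the original answer:

```python
class Node:
    """Doubly linked list node; the hash table maps key -> Node."""
    __slots__ = ("key", "value", "prev", "next")
    def __init__(self, key, value):
        self.key, self.value = key, value
        self.prev = self.next = None

class LRUCache:
    def __init__(self, capacity):
        self.capacity = capacity
        self.map = {}                       # key -> Node, O(1) lookup
        # Sentinel head/tail simplify insertion and removal.
        self.head, self.tail = Node(None, None), Node(None, None)
        self.head.next, self.tail.prev = self.tail, self.head

    def _remove(self, node):
        node.prev.next, node.next.prev = node.next, node.prev

    def _add_front(self, node):
        node.next = self.head.next
        node.prev = self.head
        self.head.next.prev = node
        self.head.next = node

    def get(self, key):
        node = self.map.get(key)
        if node is None:
            return None
        self._remove(node)                  # move to front: most recently used
        self._add_front(node)
        return node.value

    def put(self, key, value):
        if key in self.map:
            self._remove(self.map[key])
        node = Node(key, value)
        self.map[key] = node
        self._add_front(node)
        if len(self.map) > self.capacity:
            lru = self.tail.prev            # least recently used sits at the back
            self._remove(lru)
            del self.map[lru.key]
```

Every operation is O(1): the hash table finds the node, and the doubly linked list lets the node be unlinked and relinked at the front without a scan.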
Q2. Frequency is the number of times an item is accessed per time frame. The simplest approach is to use the service's uptime as the time frame, so frequency is just the total access count. It might be tempting to use a priority queue here, but it's not efficient: you can instead move an accessed item up or down in an ordered list according to its access count. There is a worst case that makes this O(cache size) when a cached item is accessed, which I would fix with a skip list or something like it. A priority queue removes that worst case, but its average case is worse, and it would probably have to be re-implemented to support changing an item's priority in place.
Q3. If the cache is occupied by items that were accessed at high frequency in the past, new items might never get into the cache. That's tricky but important. One option is to give each new item a frequency offset so it can compete with items that have been accessed often in the past; the offset would have to grow over time and would eventually overflow. An alternative is to keep two cache generations, a current one and a next one: new items go into the next generation, and items in the current generation move to the next generation when they are hit again. When freeing space, evict from the current generation; once it is empty, swap the generations and start over, just like life ;-)
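The two-generation idea can be sketched roughly as follows. New items enter the "next" generation, a repeat hit promotes an item from "current" to "next", and eviction drains "current" before the generations swap. All names are assumptions, and frequency counting is omitted for brevity:

```python
class TwoGenCache:
    def __init__(self, capacity):
        self.capacity = capacity
        self.current = {}   # older generation: evicted first
        self.next = {}      # newer generation: protected from eviction

    def get(self, key):
        if key in self.next:
            return self.next[key]
        if key in self.current:
            # A repeat hit promotes the item into the next generation.
            self.next[key] = self.current.pop(key)
            return self.next[key]
        return None

    def put(self, key, value):
        self.current.pop(key, None)           # updating a key re-inserts it
        if key in self.next:
            self.next[key] = value
            return
        if len(self.current) + len(self.next) >= self.capacity:
            if not self.current:
                # Current generation is drained: swap and start over.
                self.current, self.next = self.next, {}
            # Evict the oldest entry of the current generation (a real
            # cache would evict by recency or frequency within it).
            self.current.pop(next(iter(self.current)))
        self.next[key] = value
```

This naturally ages out items that were popular only in the past: they survive one generation swap, but without fresh hits they sit in `current` and are evicted before anything recently inserted or re-hit.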
- Chris May 11, 2017