Amazon Interview Question for Software Engineer / Developers







A hash table (or hash map) is a data structure that uses a hash function to map keys (e.g., a person's name) to associated values (e.g., their telephone number).
Hash tables are particularly efficient when the maximum number of entries can be predicted in advance, so that the bucket array can be allocated once at the optimum size and never resized.
With a good hash function, the average lookup cost is nearly constant as the load factor increases from 0 up to about 0.7. Ideally, a hash function would map each possible key to a unique slot index, but this is rarely achievable in practice (unless the set of keys is fixed, i.e., new entries are never added to the table after creation).
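As a small illustration (a sketch, not part of the original answer; the capacity and key names are made up), mapping a key to a bucket index typically comes down to hashing the key and taking it modulo the array size:

```python
# Sketch: mapping keys to bucket indices with a hash function.
# The capacity and keys below are illustrative choices.
capacity = 8  # fixed bucket array size, allocated once

def bucket_index(key):
    """Map a key to a slot in the range [0, capacity)."""
    return hash(key) % capacity

# The mapping is many-to-one, so two distinct keys may collide.
idx_a = bucket_index("alice")
idx_b = bucket_index("bob")
print(idx_a, idx_b)  # both indices fall in 0..7
```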

- blueskin.neo November 18, 2010 | Flag Reply

A hash table consists of an array accessible by key. The idea is to establish a mapping between all possible keys and positions in the array using a hash function. The hash function accepts a key and outputs its hash code, or hash value, which is an integer. A good hash function distributes keys uniformly to avoid collisions as much as possible.

A perfect hash table is one in which the hash function produces non-colliding offsets in the table for all keys provided to it. In practice, however, this is only approximated, and we can use the load factor to estimate the average lookup cost of the table. The load factor is the ratio of the number of elements in the table to the number of positions into which elements may be hashed.
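The load factor described above is just a ratio, which a one-line computation makes concrete (the numbers here are illustrative, not from the answer):

```python
# Sketch: load factor = elements stored / slots available.
# The counts below are illustrative.
num_elements = 7
num_slots = 10

load_factor = num_elements / num_slots
print(load_factor)  # 0.7 -- roughly where average probe costs start to climb
```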

Collision resolution options include chained hash tables (where the table is an array of buckets, each bucket being a linked list of items) and open-addressed hash tables (where collisions are resolved by moving to the next available open position).
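The chaining strategy can be sketched in a few lines; this is an illustrative toy (class and method names are my own), using Python lists in place of linked lists and omitting resizing:

```python
# Sketch of a chained hash table: an array of buckets, each holding a
# list of (key, value) pairs. Names are illustrative; no resizing.
class ChainedHashTable:
    def __init__(self, capacity=8):
        self.buckets = [[] for _ in range(capacity)]

    def _index(self, key):
        return hash(key) % len(self.buckets)

    def put(self, key, value):
        bucket = self.buckets[self._index(key)]
        for i, (k, _) in enumerate(bucket):
            if k == key:                 # key already present: overwrite
                bucket[i] = (key, value)
                return
        bucket.append((key, value))      # empty slot or collision: chain it

    def get(self, key):
        for k, v in self.buckets[self._index(key)]:
            if k == key:
                return v
        raise KeyError(key)

table = ChainedHashTable()
table.put("alice", "555-0100")
table.put("bob", "555-0199")
print(table.get("alice"))  # 555-0100
```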

Lookup times depend on the load factor but should in general be constant, i.e. O(1). If the hash function is not uniform or the load factor is high, the worst case can degrade to O(n), since we may have to walk through all n items in a bucket or probe past many items in an open-addressed table.

Insertion times are ideally constant as well: the time to hash the given key plus a cost that grows with the load factor. In general, insertions are O(1). Again, with a poor, non-uniform hash function, an insertion may have to step past many elements, giving a worst case of O(n).

Deletions likewise depend on the load factor plus the time to hash the key being deleted. In general, deletions are O(1), but like the operations above they can degrade to a worst case of O(n) with a poor hash function or a high load factor.
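The open-addressing alternative mentioned above can also be sketched; this toy (my own names, linear probing, no resizing or tombstone-based deletion) shows how a high load factor forces lookups to probe several slots:

```python
# Sketch of open addressing with linear probing: on a collision, scan
# forward to the next free slot. Illustrative only; no resizing, and
# deletion (which needs tombstones) is omitted.
class LinearProbingTable:
    def __init__(self, capacity=8):
        self.slots = [None] * capacity   # each slot: None or (key, value)

    def put(self, key, value):
        cap = len(self.slots)
        i = hash(key) % cap
        for _ in range(cap):
            if self.slots[i] is None or self.slots[i][0] == key:
                self.slots[i] = (key, value)
                return
            i = (i + 1) % cap            # probe the next slot
        raise RuntimeError("table full")

    def get(self, key):
        cap = len(self.slots)
        i = hash(key) % cap
        for _ in range(cap):
            if self.slots[i] is None:    # empty slot ends the probe chain
                break
            if self.slots[i][0] == key:
                return self.slots[i][1]
            i = (i + 1) % cap
        raise KeyError(key)

probed = LinearProbingTable(capacity=4)
probed.put("a", 1)
probed.put("b", 2)
probed.put("c", 3)   # load factor 0.75: lookups may probe several slots
print(probed.get("c"))  # 3
```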

- Thomas July 29, 2014 | Flag Reply

