Bloomberg LP Interview Question
Financial Software Developers

Here, the key is not tokenizing on whitespace; that part is easy. We need to decide how to check for misspelled words, i.e., which data structure to use. There are millions of English words. A hash map gives the fastest lookups, with O(1) average access time; everything else will be slower. A multimap would be the next fastest. We can also cache the most frequently used words to speed things up further.
Complexity: O(n) to tokenize the input, O(1) on average per spelling lookup, and O(n) space for the dictionary.
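A minimal sketch of the hash-based check described above. The dictionary contents and the input sentence are made-up examples; a real spell checker would load the word set from a word-list file.

    import java.util.*;

    public class SpellCheck {
        public static void main(String[] args) {
            // Hypothetical tiny dictionary; real ones hold millions of words.
            Set<String> dictionary = new HashSet<>(
                Arrays.asList("the", "quick", "brown", "fox", "jumps"));

            String text = "the quikc brown fox";
            for (String token : text.split("\\s+")) {   // O(n) tokenization
                if (!dictionary.contains(token)) {      // O(1) average lookup
                    System.out.println("Misspelled: " + token);
                }
            }
        }
    }

Running this flags only the token that is missing from the set ("quikc").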
Supposing we have a dictionary to check whether a word form is valid, we can permute the letters of the misspelled word and, for each permutation, look it up in the dictionary; if the permutation is valid, we can suggest that word.
    static StringBuffer result = new StringBuffer("miss_spelled_word");
    static int len = result.length();

    // Print every permutation of the characters in result.
    static void permute(int i) {
        if (i == len) {
            System.out.println(result);
        } else {
            for (int j = i; j < len; j++) {
                swap(i, j);      // fix character j at position i
                permute(i + 1);  // permute the remaining suffix
                swap(i, j);      // restore before trying the next choice
            }
        }
    }

    static void swap(int i, int j) {
        char temp = result.charAt(i);
        result.setCharAt(i, result.charAt(j));
        result.setCharAt(j, temp);
    }

Note that this generates n! permutations, so it is only practical for short words.
TRIES would also work well for storing the dictionary.
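A minimal trie sketch to illustrate that suggestion (my own sketch, not code from the thread): each node maps a character to a child, and a flag marks the end of a complete word. Exact lookup is O(length of the word), and prefix queries come almost for free.

    import java.util.*;

    class Trie {
        private final Map<Character, Trie> children = new HashMap<>();
        private boolean isWord;

        // Insert a word character by character, creating nodes as needed.
        void insert(String word) {
            Trie node = this;
            for (char c : word.toCharArray()) {
                node = node.children.computeIfAbsent(c, k -> new Trie());
            }
            node.isWord = true;
        }

        // Return true only if the exact word was inserted (not just a prefix).
        boolean contains(String word) {
            Trie node = this;
            for (char c : word.toCharArray()) {
                node = node.children.get(c);
                if (node == null) return false;
            }
            return node.isWord;
        }
    }

Compared with a hash map, a trie uses more pointers per node but makes prefix-based suggestions ("did you mean...?") straightforward to implement.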
- Anonymous January 17, 2010