CapitalIQ Interview Question for Consultants


Country: India
Interview Type: Phone Interview




3 of 3 votes

There are only two good solutions for this problem.

1. Use a hash table, as @zbesst and @coding.arya suggested, which is O(n) time and space.
2. Sort the array and then remove duplicates in a single pass, which is O(n log n) time and O(1) extra space (see the sketch below).

I saw someone -1'd the posts of @zbesst and @coding.arya. When people do that, they should at least have the courtesy to write the reason. I'll +1 you guys.
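A minimal Java sketch of the sorting approach (assuming the array may be reordered and the caller only reads the first `count` slots afterwards; the class and method names are just for illustration):

import java.util.Arrays;

public class SortDedup {

    // Sort, then compact the unique values to the front of the array.
    // Returns the number of unique elements; O(n log n) time, O(1) extra
    // space beyond the sort itself.
    public static int removeDuplicates(int[] a) {
        if (a.length == 0) {
            return 0;
        }
        Arrays.sort(a);
        int writeIndex = 1;
        for (int i = 1; i < a.length; i++) {
            if (a[i] != a[writeIndex - 1]) {
                a[writeIndex++] = a[i];
            }
        }
        return writeIndex;
    }

    public static void main(String[] args) {
        int[] a = {3, 1, 2, 3, 1};
        int count = removeDuplicates(a);
        for (int i = 0; i < count; i++) {
            System.out.print(a[i] + " ");   // prints: 1 2 3
        }
    }
}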

- oOZz June 19, 2013 | Flag Reply
2 of 6 votes

1) Use a HashSet, or equivalently:

2) Use a HashMap that maps a[i] -> number of occurrences, then print all the keys.

Both are O(n) time and space.
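A quick sketch of option 2 in Java (the class and variable names are just illustrative):

import java.util.HashMap;
import java.util.Map;

public class HashCountDedup {
    public static void main(String[] args) {
        int[] a = {3, 1, 2, 3, 1};

        // Map each value to how many times it occurs: O(n) time and space.
        Map<Integer, Integer> counts = new HashMap<Integer, Integer>();
        for (int x : a) {
            Integer c = counts.get(x);
            counts.put(x, c == null ? 1 : c + 1);
        }

        // The key set is exactly the set of distinct values.
        for (Map.Entry<Integer, Integer> e : counts.entrySet()) {
            System.out.println(e.getKey() + " occurs " + e.getValue() + " time(s)");
        }
    }
}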

- zbesst June 19, 2013 | Flag Reply
1 of 3 votes

1. Use a map (or an ordered set) and insert the elements one by one, skipping duplicates: O(n log n) (see the sketch below).
2. Use a hash table.
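A minimal Java sketch of the ordered-set variant of option 1, using TreeSet (a red-black tree), which simply rejects duplicates on insert:

import java.util.Set;
import java.util.TreeSet;

public class TreeSetDedup {
    public static void main(String[] args) {
        int[] a = {5, 2, 5, 1, 2};

        // Each add() is O(log n); a duplicate is simply not inserted again.
        Set<Integer> unique = new TreeSet<Integer>();
        for (int x : a) {
            unique.add(x);
        }

        // Prints the distinct values in sorted order: 1 2 5
        for (int x : unique) {
            System.out.print(x + " ");
        }
    }
}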

- coding.arya June 19, 2013 | Flag Reply
0 of 0 votes

If the range of the integers is given and is small, keep a count of every element (counting sort); otherwise sort them with radix sort in linear time and then remove duplicates from the sorted array.
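A small sketch of the counting idea, assuming the values are known to be non-negative and bounded by a small MAX_VALUE (an assumed constant, not given in the question):

public class CountingDedup {
    // Assumed upper bound on the values; this approach only works when such
    // a small, known range actually exists.
    static final int MAX_VALUE = 100;

    public static void main(String[] args) {
        int[] a = {7, 3, 7, 0, 3, 99};

        // seen[v] records whether value v has already appeared:
        // O(n + MAX_VALUE) time, O(MAX_VALUE) extra space.
        boolean[] seen = new boolean[MAX_VALUE + 1];
        for (int x : a) {
            if (!seen[x]) {
                seen[x] = true;
                System.out.print(x + " ");  // prints each value once, in input order
            }
        }
    }
}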

- rdx June 19, 2013 | Flag Reply
0 of 0 votes

import java.util.HashSet;
import java.util.Set;

public class RepeatString {
    public static void main(String[] args) {
        String str = "this is neeraj and neeraj is a good boy";
        String[] words = str.split(" ");

        // add() returns false when the word is already in the set,
        // i.e. when it is a duplicate.
        Set<String> seen = new HashSet<String>();
        for (String s : words) {
            if (!seen.add(s)) {
                System.out.println("repeated word is: " + s);
            }
        }
    }
}

- Anonymous June 19, 2013 | Flag Reply
0 of 0 votes

Create a binary search tree and, each time you find a duplicate element while inserting a new element, reject it.
Time complexity: O(n log n) on average (worse if the tree becomes unbalanced).
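A minimal hand-rolled BST sketch of this idea (unbalanced, so the O(n log n) bound assumes reasonably well-shuffled input):

public class BstDedup {
    static class Node {
        int value;
        Node left, right;
        Node(int value) { this.value = value; }
    }

    // Inserts value into the subtree rooted at node, rejecting duplicates.
    // Returns the (possibly new) subtree root.
    static Node insert(Node node, int value) {
        if (node == null) return new Node(value);
        if (value < node.value) node.left = insert(node.left, value);
        else if (value > node.value) node.right = insert(node.right, value);
        // value == node.value: duplicate, do nothing
        return node;
    }

    // In-order traversal prints the distinct values in sorted order.
    static void printInOrder(Node node) {
        if (node == null) return;
        printInOrder(node.left);
        System.out.print(node.value + " ");
        printInOrder(node.right);
    }

    public static void main(String[] args) {
        int[] a = {4, 2, 4, 7, 2};
        Node root = null;
        for (int x : a) root = insert(root, x);
        printInOrder(root);  // prints: 2 4 7
    }
}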

- Vijay Rajanna June 20, 2013 | Flag Reply
0 of 0 votes

Questions you ought to ask for clarification:

1.) Is the array sorted?
2.) Does the array order need to be maintained?


A.) If it's sorted, you can do it in O(n) with no additional data structures.

B.) If the order does not need to be maintained, then I would argue to sort the array (O(n log n) time, O(n) space for a typical merge sort) and then perform the O(n) scan for duplicates. However, both of these techniques require you to delete from the array, which is O(n) per deletion unless you compact the remaining elements in a single pass.

C.) If order doesn't matter, then the "simplest" thing to do would be to add the elements to a tree-based set that doesn't allow repeats, at O(log n) per insertion, and then traverse the tree to write the results back into the existing array (see the sketch below). Space complexity goes up to O(n), and the total time is O(n log n) for the insertions plus O(n) for the traversal.
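A sketch of option C in Java, writing the distinct values back into the front of the original array (assuming the caller then only reads the first `count` slots; names are illustrative):

import java.util.TreeSet;

public class WriteBackDedup {
    // Returns the number of distinct values, which afterwards occupy
    // a[0..count-1] in sorted order. O(n log n) time, O(n) extra space.
    static int dedupInto(int[] a) {
        TreeSet<Integer> unique = new TreeSet<Integer>();
        for (int x : a) {
            unique.add(x);  // duplicates are rejected by the set
        }
        int count = 0;
        for (int x : unique) {
            a[count++] = x; // overwrite the front of the array with distinct values
        }
        return count;
    }

    public static void main(String[] args) {
        int[] a = {9, 4, 9, 4, 1};
        int count = dedupInto(a);
        for (int i = 0; i < count; i++) {
            System.out.print(a[i] + " ");   // prints: 1 4 9
        }
    }
}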

- rotinom June 20, 2013 | Flag Reply
0 of 0 votes

public void RemoveDuplicates(int[] array)
{
    // In-place O(n^2) de-duplication: maintain a prefix array[0..newLength-1]
    // of unique values and append array[i] only if it is not already in it.
    if (array.Length == 0)
        return;

    int i, j, newLength = 1;

    for (i = 1; i < array.Length; i++)
    {
        for (j = 0; j < newLength; j++)
        {
            if (array[i] == array[j])
                break;
        }

        /* If none of the values in array[0..newLength-1] equals array[i],
           copy array[i] into the next free slot of the unique prefix. */
        if (j == newLength)
            array[newLength++] = array[i];
    }

    for (i = 0; i < newLength; i++)
    {
        Console.WriteLine(array[i]);
    }
}

- Jeanclaude June 22, 2013 | Flag Reply
0 of 0 votes

#include <stdio.h>
#include <stdlib.h>

int main(void)
{
    int n, *a, i, j, k;

    printf("enter the size of the array:\n");
    scanf("%d", &n);
    a = (int *)malloc(n * sizeof(int));

    printf("enter the elements:\n");
    for (i = 0; i < n; i++)
        scanf("%d", &a[i]);

    /* O(n^2) in-place removal: whenever a duplicate of a[i] is found at
       position j, shift the elements after it one slot to the left */
    for (i = 0; i < n - 1; i++)
    {
        for (j = i + 1; j < n; j++)
        {
            if (a[i] == a[j])
            {
                for (k = j; k < n - 1; k++)
                    a[k] = a[k + 1];
                n--;
                j--;
            }
        }
    }

    for (i = 0; i < n; i++)
        printf(" %d", a[i]);
    printf("\n");

    free(a);
    return 0;
}

- adityachava October 03, 2013 | Flag Reply
-1 of 1 vote

First, you'd have a set that holds the integers.
Second, you'd put the integers from the array into the set.
Third, the set then contains only the unique numbers.

- Chou June 19, 2013 | Flag Reply
0 of 0 votes

First, I have to remove the duplicate elements from the array itself; second, I can use one extra data structure.

- sunny smart June 19, 2013 | Flag
-1 of 1 vote

Traverse the array once. While traversing, keep track of the count of every element using a temporary array count[] of size n; when you see an element whose count is already set, that element is a duplicate:
if (count[arr[i]] == 1) then arr[i] is a duplicate.

- vgeek June 19, 2013 | Flag Reply
0 of 0 votes

If you have a very large number in the array, e.g. A[i] = 4,294,967,295 (the largest unsigned int, 2^32 - 1), then your count array has to have 4,294,967,296 entries. This is impractical. A hashtable solution would be more feasible.

- oOZz June 19, 2013 | Flag

