Bloomberg LP Interview Question for Software Analysts


Country: United States
Interview Type: Written Test




2 of 2 votes

Use two things:
1. A doubly linked list to hold the actual order of the visited URLs
2. A hash map so that we can efficiently check for duplicates and keep direct iterators into the doubly linked list

#include <iostream>
#include <list>
#include <string>
#include <unordered_map>
using namespace std;

list<string> URLs;    // doubly linked list, i.e. the visit order (oldest first)
unordered_map<string, list<string>::iterator> checkMap;     // hash table: url -> node in URLs

void visit(const string& url) {
  auto it = checkMap.find(url);

  if (it != checkMap.end())
    URLs.erase(it->second);     // remove the previous instance of this URL from the visit order

  checkMap[url] = URLs.insert(URLs.end(), url);     // insert the URL at the newest end of the visit order
}

void printVisitOrder() {
  for (auto it = URLs.rbegin(); it != URLs.rend(); ++it)   // reverse iteration: newest to oldest
    cout << *it << endl;
}

+ Insertion into the list: O(1) (always at the end)
+ Deletion from the list: O(1) (we have a direct pointer to the element to be deleted)
+ Lookups for duplicates: O(1) (because of hashtable)
+ Printing of visit order: O(k) for k unique elements

- quirk March 26, 2017 | Flag Reply
1 of 1 vote

Use a tree map sorted by insertion time.

- shirish m. March 25, 2017 | Flag Reply
1 of 1 vote

Use a LinkedHashSet.

- azambhrgn March 25, 2017 | Flag Reply
0 of 0 votes

#include <iostream>
#include <string>
#include <list>
using namespace std;

int main()
{
	list<string> URLs;

	string inputStr;
	while (cin >> inputStr) // Ctrl + Z to end.
	{
		URLs.remove(inputStr);
		URLs.push_back(inputStr);
	}

	for (auto it = URLs.crbegin(); it != URLs.crend(); ++it)
	{
		cout << *it << endl;
	}

	return 0;
}

- macrodpd March 25, 2017 | Flag Reply
0 of 0 votes

C++'s list::remove() is a linear-time operation. We can do better.

- quirk March 26, 2017 | Flag
0 of 0 votes

If the input is fixed, then we can do the following to produce the required output:

public static string[] RecentlyUsedURLs(string[] urls)
{
	if (urls == null)
	{
		throw new ArgumentException();
	}
	var hashset = new HashSet<string>();
	var outputUrls = new List<string>();
	for (int i = urls.Length - 1; i >= 0; i--)
	{
		if (String.IsNullOrWhiteSpace(urls[i]))
		{
			Console.WriteLine("the " + i + "th url is invalid");
			continue;
		}
		if (hashset.Contains(urls[i]))
		{
			continue;
		}
		outputUrls.Add(urls[i]);
		hashset.Add(urls[i]);
	}
	return outputUrls.ToArray();
}

The complexity of this algorithm is O(N) + O(N). That is just the initial cost. Let's say we receive these URLs dynamically: with each successive URL, we have to put it at the front of the array (or list), and we also have to remove the existing occurrence from the array (list).
Inserting a new URL is not difficult, but deleting is: since the data are not sorted, we have to traverse the whole array (list) to find the location of the old URL and, if found, delete it and shift the others accordingly.
All of this takes O(N) time per new URL, which is expensive if we are doing lots of insertions.

Now, to overcome this, we will use a linked list, and to make deletion faster, we will also store the location of the node where each URL is saved.

public static string[] RecentlyUsedURLs(string[] urls)
{
	if (urls == null)
	{
		throw new ArgumentException();
	}
	var hashtable = new Dictionary<string, LinkedListNode<string>>();
	var outputUrls = new LinkedList<string>();
	for (int i = 0; i < urls.Length; i++)
	{
		if (String.IsNullOrWhiteSpace(urls[i]))
		{
			Console.WriteLine("the " + i + "th url is invalid");
			continue;
		}
		if (hashtable.ContainsKey(urls[i]))
		{
			outputUrls.Remove(hashtable[urls[i]]);   // O(1): we hold the node itself
		}
		hashtable[urls[i]] = outputUrls.AddFirst(urls[i]);
	}
	return outputUrls.ToArray();
}

Now here, the initial pass still takes O(N) + O(N), but after that every new addition requires only O(1) + O(1): AddFirst returns the LinkedListNode holding the URL, and storing that node in the dictionary lets us remove it directly on a repeat visit instead of searching the list.

- sonesh March 31, 2017 | Flag Reply
0 of 0 votes

Stack<String> myStack = new Stack<>();
for (int i = 0; i < input.length; i++) {
	myStack.push(input[i]);
}
int size = myStack.size();
Set<String> check = new HashSet<String>();
String str = "";
for (int i = 0; i < size; i++) {
	str = myStack.pop();
	if (check.add(str)) {   // add() returns false for a duplicate
		System.out.println(str);
	}
}

- programmer April 28, 2017 | Flag Reply
0 of 0 votes

#include <iostream>
#include <list>
#include <string>
#include <unordered_map>

int main()
{
   std::list<std::string> el;
   std::unordered_map<std::string,int> mel;

   el.push_back("google.com");
   el.push_back("yahoo.com");
   el.push_back("wsj.com");
   el.push_back("google.com");

   for (auto i = el.rbegin(); i != el.rend(); ++i)
   {
      if (mel.count(*i) == 0) 
      {
         mel[*i] = 1;
         std::cout << *i << " ";
      }
   }

}

- luca June 18, 2017 | Flag Reply
0 of 0 votes

In Java

void leastRecentlyVisited() {
	java.util.Set<String> urls = new java.util.LinkedHashSet<>();
	urls.add("google.com");
	urls.add("yahoo.com");
	urls.add("wsj.com");
	// LinkedHashSet keeps first-insertion order, so a repeat visit must be
	// removed first in order to move it to the end:
	urls.remove("yahoo.com");
	urls.add("yahoo.com");
	System.out.println(urls);   // [google.com, wsj.com, yahoo.com]
}

If a LinkedHashSet cannot be used, then we need to implement an LRU using a queue and a Map.

- visualvm January 19, 2018 | Flag Reply
0 of 0 votes

It's essentially an LRU cache, with the special case that values are not repeated.

A simple solution is a dictionary plus a doubly linked list: if the value is already in the dictionary, find its node in the list and move it to the end.

Say the list is a <-> b <-> c and we see b again; the list becomes a <-> c <-> b.

If the element does not exist and we are out of space, remove the head element (head = head->next, dropping the old head), then store the value in a dictionary<string, Node>() so that when that string appears again we can jump to its node directly.

- Kamal Joshi August 29, 2018 | Flag Reply
-1 of 1 vote

It's simple: use a stack to store the latest URLs and get the top 3 URLs.

- Harsh bhardwaj March 26, 2017 | Flag Reply

