McGee
I agree that the question as stated is ambiguous. If N is the number of bits of the integer, then implementations using shift operators won't result in an O(log N) solution.
As many people have pointed out, in the worst case for a 32-bit int you still need 32 iterations to find the solution, i.e. O(N).
I think the people who got O(log n) thought of this problem as a "string" problem, e.g. a string of "0" and "1" characters of unspecified length.
In this case the shortest sequence of contiguous "0"s worth counting is 2.
So the easy O(n) algorithm is to scan the string and count the "0"s in a string of "0" and "1" characters.
Knowing that we want to find the maximum run of contiguous "0"s, a better algorithm is to count contiguous pairs of "0". By definition this should lead to an O(log n) algorithm. (Note: the assumption is that we can make the decision "00" -> 1, ("01", "10") -> 0 in O(1).)
A further refinement of the algorithm that leads to parallelization is to divide the counting of the "0" pairs into subsets. Of course, one has to account for the edge cases.
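The divide-and-conquer idea above can be sketched as follows (class and method names are mine, not from the original post): each half of the string reports its leading run, trailing run, and best run of "0"s, so two halves merge in O(1), and the halves can be processed in parallel.

```java
// Divide-and-conquer sketch of the "split into subsets" idea above.
// Each half reports (leading zeros, trailing zeros, best run, length);
// merging two halves is O(1), which is what makes parallelization work.
public class MaxZeroRun {
    static final class Run {
        final int prefix, suffix, best, len;
        Run(int prefix, int suffix, int best, int len) {
            this.prefix = prefix; this.suffix = suffix;
            this.best = best; this.len = len;
        }
    }

    // solve the half-open range [lo, hi)
    static Run solve(String s, int lo, int hi) {
        if (hi - lo == 1) {
            int z = s.charAt(lo) == '0' ? 1 : 0;
            return new Run(z, z, z, 1);
        }
        int mid = (lo + hi) / 2;
        Run l = solve(s, lo, mid), r = solve(s, mid, hi);
        // a run crossing the midpoint is l.suffix + r.prefix long (the edge case)
        int best = Math.max(Math.max(l.best, r.best), l.suffix + r.prefix);
        int prefix = (l.prefix == l.len) ? l.len + r.prefix : l.prefix;
        int suffix = (r.suffix == r.len) ? r.len + l.suffix : r.suffix;
        return new Run(prefix, suffix, best, l.len + r.len);
    }

    public static int maxZeroRun(String s) {
        return s.isEmpty() ? 0 : solve(s, 0, s.length()).best;
    }
}
```

Note the total work is still O(n); the O(1) merge is what lets the subsets be counted concurrently.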
The definitions of a valid word are:
1. A given word is a valid word if it is of the form h^n i^n r^n e^n where n >= 1 (e.g. hhiirree).
2. Valid words have the concatenation property, i.e. if w1 and w2 are valid words, then w1w2 is also a valid word.
Since definition 1 says "h^n i^n r^n e^n where n >=1" each character has to appear in the word and the counts of each character have to be the same.
No assumption is made about the order of the characters!
Let's use some examples:
hire: valid
ire: not valid
ihre: valid
hhiirree: valid
eeiirrhh: valid
All the function needs to do is count each occurrence of the alphabet characters.
An equal count means pass.
Here is my code. Note that I am not using a third-party Java StringUtils library - all standard Java:
public class WordValidator {
    // the only characters a valid word may contain
    static public String alphabet = "hire";

    public static boolean ValidateWord(String str) {
        if (str == null || str.isEmpty()) return false;
        char[] charset = alphabet.toCharArray();
        str = str.toLowerCase();
        // count occurrences of the first alphabet character as the reference
        int n = str.length() - str.replace(String.valueOf(charset[0]), "").length();
        if (n == 0) return false;          // every character must appear at least once (n >= 1)
        int total = n;
        for (int i = 1; i < charset.length; i++) {
            int count = str.length() - str.replace(String.valueOf(charset[i]), "").length();
            if (count != n)
                return false;              // all counts must be equal
            total += count;
        }
        // reject words containing characters outside the alphabet
        return total == str.length();
    }
}
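The same counting can also be done in a single pass with a frequency array instead of repeated String.replace calls. This is only a sketch (the class and method names are mine); like the definition implies, it rejects characters outside "hire" and requires n >= 1.

```java
// One-pass variant: O(n) time, O(1) extra space.
public class WordValidator2 {
    public static boolean isValid(String str) {
        if (str == null || str.isEmpty()) return false;
        int[] counts = new int[4];                 // counts for h, i, r, e
        for (char c : str.toLowerCase().toCharArray()) {
            int idx = "hire".indexOf(c);
            if (idx < 0) return false;             // character outside the alphabet
            counts[idx]++;
        }
        // valid iff every count equals the first and is at least 1
        for (int count : counts) {
            if (count != counts[0] || count == 0) return false;
        }
        return true;
    }
}
```

Against the examples above: "hire", "ihre", "hhiirree", and "eeiirrhh" pass; "ire" fails because 'h' never appears.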
The problem of generating unique IDs has been solved. All modern OSes have that functionality built in. (And many embedded/mobile OSes provide APIs as well.)
- McGee December 04, 2013
If asked: UUIDs may be generated from a combination of system time stamp, CPU / system unique ID, NIC MAC addresses, HBA WWNs, etc.
The problem is how to serve lots of such IDs however generated.
Questions to ask:
What's the min length of the ID?
What's the average request rate? (The requirement states the system should handle at least 1,000 requests/s.)
What's the max burst rate the system must handle?
What's the max latency (wait time) for a request?
Let's assume the following parameters:
Min length of the ID: <= 128 bits. In that case I'll choose the standard 128-bit UUID format.
Average request rate: between 1,000 and 10,000 requests/s (the requirement states at least 1,000 requests/s).
Max burst rate the system must handle: < 1,000,000 requests/s.
Max latency (wait time) for a request: 1 ms.
It's a classic producer-consumer problem.
In this case we have many consumers of UUIDs and one producer.
All common server OSes have APIs to create UUIDs. For embedded/mobile systems one may have to build the function oneself. Let's assume the OS provides an API to generate UUIDs and that the max running time of an API call is 1 ms.
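For illustration, in Java the standard library already ships such an API: java.util.UUID.randomUUID() returns a random (version 4) 128-bit UUID.

```java
import java.util.UUID;

// Minimal demo of the standard-library UUID generator.
public class UuidDemo {
    public static void main(String[] args) {
        UUID id = UUID.randomUUID();   // 128-bit, version 4 (random)
        System.out.println(id);        // canonical 36-character form
        System.out.println(id.version());
    }
}
```

On other platforms the equivalents would be e.g. libuuid's uuid_generate() on Linux or UuidCreate() on Windows.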
First, pre-allocate 2 million UUIDs into an array / stack / heap structure (depending on implementation).
2 million UUIDs ensure the system can handle the burst rate.
(With no overhead, 2 million 128-bit UUIDs would take ~32 MB of RAM. That is not a problem on today's server systems with many GB of RAM, but it may be a problem on an embedded system.)
The solution needs to ensure that its pool of UUIDs doesn't run out: the producer must replenish the pool as fast as consumers drain it.
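A minimal sketch of that pool (the class name, capacity, and timeouts here are illustrative, not part of the original answer): a bounded BlockingQueue holds the pre-allocated UUIDs, a single daemon producer thread replenishes it, and consumers poll with their latency budget.

```java
import java.util.UUID;
import java.util.concurrent.ArrayBlockingQueue;
import java.util.concurrent.BlockingQueue;
import java.util.concurrent.TimeUnit;

// Producer-consumer pool of pre-allocated UUIDs. With the document's
// parameters, capacity would be ~2 million to absorb the burst rate.
public class UuidPool {
    private final BlockingQueue<UUID> pool;

    public UuidPool(int capacity) {
        pool = new ArrayBlockingQueue<>(capacity);
        // pre-fill so the pool can serve a burst from the start
        for (int i = 0; i < capacity; i++) pool.add(UUID.randomUUID());
        // single producer: put() blocks while the pool is full and
        // replenishes as soon as consumers drain it
        Thread producer = new Thread(() -> {
            try {
                while (true) pool.put(UUID.randomUUID());
            } catch (InterruptedException e) {
                Thread.currentThread().interrupt();
            }
        });
        producer.setDaemon(true);
        producer.start();
    }

    // consumer side: wait at most the given latency budget for an ID;
    // returns null if the budget is exceeded
    public UUID next(long timeout, TimeUnit unit) throws InterruptedException {
        return pool.poll(timeout, unit);
    }
}
```

The bounded queue gives back-pressure for free: the producer stalls when the pool is full, and consumers see at most their configured wait time (e.g. the 1 ms budget above) when it is empty.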