Google Interview Question
Software Engineer / Developer
Team: YouTube
Country: United States
Interview Type: Phone Interview
Can you provide the method's signature to perform this task? In what form is the integer provided?
The following program prints the decimal digits of any number from left to right:
def max_dig(n, l):
    # Collect the powers of 10 not exceeding n into l (smallest first)
    # and return how many there are. Note: >= (not >) so that exact
    # powers of 10 such as n = 10 still get the right digit count.
    pw = 1
    cnt = 0
    while n >= pw:
        l.append(pw)
        pw *= 10
        cnt += 1
    return cnt

def prt_num(n):
    print(n)
    l = []
    max_dig(n, l)
    while l:
        x = l.pop()        # current (largest remaining) power of 10
        bits = 4           # a decimal digit 0-9 fits in 4 bits
        x <<= bits
        curr_dig = 0
        while bits > 0:    # binary long division: curr_dig = n // power
            curr_dig <<= 1
            x >>= 1
            bits -= 1
            if n >= x:
                n -= x
                curr_dig += 1
        print(curr_dig)

def main():
    prt_num(5763453)

if __name__ == '__main__':
    main()
How is the integer presented to us? It could be its own datatype that uses a large amount of memory to store huge integers.
int lastdigit = number mod 10;
number = number / 10;
Use a switch-case to determine the character value of lastdigit.
Store the character in a stack. Do this until the number is 0.
Empty the stack into a string.
/* Basic prototype implementation in C. */
#include <stdio.h>
#include <string.h>

int main(void) {
    int number = 12345678;
    int data = number;
    int index = 0;
    char ch[16];

    memset(ch, 0, sizeof(ch));
    do {                                  /* do-while also handles number == 0 */
        ch[index++] = number % 10 + '0';  /* last digit as an ASCII character */
        number /= 10;
    } while (number != 0);

    printf("ASCII version of number %d is\n", data);
    while (--index >= 0)                  /* digits were stored in reverse */
        printf("%c", ch[index]);
    printf("\n");
    return 0;
}
How will you represent 'this very very long integer'? Is it going to be a bit string of whatever size is needed to represent the number?
32 bits can represent at most a 10-digit number, so for a number with a million digits, what representation would be best?
One way could be a BCD representation: each byte representing a decimal digit from 0-9, or each group of 4 bits representing a decimal digit from 0-9 (values 10-15 unused).
In either case we need to know the size of the byte stream representing the number. If we know the number of bytes and the representation, we can compute the size of the needed string beforehand: in the first case it equals the byte count, in the second it is twice the byte count.
The conversion itself is then trivial: just run through the byte array, interpreting a whole byte or a half byte depending on the representation, and fill the character array as usual ('0' + digit).
Let me know if you think this won't work:
Suppose this data is stored in RAM on a big-endian system. Then part of this big number might be 0xbeef1234, which is laid out in RAM as 0xbe in the first byte, 0xef in the second, 0x12 in the third, and 0x34 in the fourth.
So we make a union:

union byteto4bit {
    unsigned char data;
    struct {
        /* Bit-field allocation order is implementation-defined; on most
         * little-endian compilers the first field occupies the low bits. */
        unsigned char btm4bit : 4;
        unsigned char top4bit : 4;
    } Bits;
};

Then we can access it like this ('0' + nibble gives the character, since itoa is not standard C):

size_t i = 0;
while (i < sizeof(BigInteger)) {
    union byteto4bit g;
    g.data = *(pBigInteger + i);
    putchar('0' + g.Bits.top4bit);  /* high nibble holds the first digit */
    putchar('0' + g.Bits.btm4bit);
    i++;                            /* advance to the next byte */
}
I think doing % and / is not the correct approach here. The algorithm basically works, but if we consider a number with millions of digits, the datatype should be some sort of linked list containing a single digit per bucket; that way the "big" number can grow to millions of digits. So in my opinion the correct way is to visit the linked list from back to front, put each digit into a string buffer, and return it.
- vhmj1982 February 07, 2013