Samsung Interview Question
Software Engineer / Developer
Country: India
Interview Type: In-Person
Yes, apologies, I missed the char part. I had it there when answering. Thanks for pointing it out, guys.
@eugene.yarovoi
Those are the side effects of macros; we always have to watch out for them.
Anyway, it is correct.
We should keep in mind that the sizeof() operator is evaluated at compile time, not at run time. So I don't think a solution like
/*
#define sizeof(data) \
{ data *p = 0; size = (char*)(p + 1) - (char*)p; }
*/
is a good approach. It has to be handled by the parser/compiler.
It should be char*; sorry for the mistake.
@anonymous: char* is necessary.
Consider:
int *p; // suppose p points to memory location 100
When we write something like
(p + 1) - p // (104 - 100)
the expression gives 1, not 4, because p is a pointer to int, and pointer subtraction tells you how many elements lie between (p + 1) and p.
When we typecast:
(char*)(p + 1) - (char*)p // p is converted to char*
we get (104) - (100), which equals 4, the size of int.
Hope that it is now clear to you.
Hey, you said we have to typecast first, but why not typecast to (void*) or (int*)? Why only to (char*)?
Sorry, the above solution will not work.
I hope we can use templates here
template<class unknowntype>
sizeof(unknowntype variable) // int i ; sizeof(i) ;
{
unknowntype* ptr = 0 ;
size = abs((ptr + 1)-ptr )
}
In theory, this should work. :-)
Wow, this sounds correct. But you need to cast ptr to (char *):
template<class unknowntype>
int sizeofType(unknowntype variable) // usage: int i; sizeofType(i);
{
    unknowntype* ptr = 0;
    // The difference is already positive, so abs() is not needed.
    int size = (char *)(ptr + 1) - (char *)ptr;
    return size;
}
One doubt :( On the GCC compiler, if I do sizeof('a'), it gives 4, not 1. Why?
If this is the behavior, then how will we make our code generalized? We would need to parse the argument. In my opinion, sizeof() is not as straightforward as people think. Please help.
"On the GCC compiler, if I do sizeof('a'), it gives 4, not 1. Why?"
printf("Size of %zu\n", sizeof(char));
printf("Size of %zu\n", sizeof('a'));
printf("Size of %zu\n", sizeof((char)'a'));
The first sizeof is applied to a type; it gives its size correctly (1).
In the second, sizeof is applied to an expression: in C, a character constant such as 'a' has type int, so the size of int is returned (usually 4). (In C++, 'a' has type char, and sizeof('a') is 1.)
In the third, 'a' is cast to char, so sizeof returns 1.
Working code:
#include <stdio.h>
#include <stdlib.h> /* for abs() */
#define size_of(data) data *p = 0; size = abs((char*)(p + 1) - (char*)p)
int main()
{
    int size;
    size_of(double);
    //sizeof(float);
    printf("%d\n%zu\n", size, sizeof(float));
    return 0;
}
@varun
November 07, 2011
I think it is wrong; it will always give 1. It should be:
#define sizeof(data) \
{ data *p = 0; size = (char)(p + 1) - (char)p; }