Ans. d) Compiler Error
You are using 'a' without declaring it:
x = (char *) &a;
See here: http://ideone.com/iXIzZ4
You are trying to take the address of an identifier that has not been declared or defined. Only once you define a variable does the compiler allocate memory for it, and only once it has memory allocated can you take the address of that memory.
Once you declare 'a', the output of the program will depend on the machine you run it on, specifically whether it is Big Endian or Little Endian.
Let's assume you are using a machine that stores data in Little Endian format and that an integer is 4 bytes. So when you define a variable like below
int a;
It will allocate 4 bytes to it. Since you have not assigned a value to it, this integer will contain a garbage value. Suppose memory is byte-addressable and the memory allocated to the int spans locations 2012 to 2015, where the byte at 2012 holds the least significant byte of the 4-byte integer. (Here I have assumed Little Endian format. Check this for more detail: http://williams.comp.ncat.edu/Endian.htm)
x = (char *) &a;
Here &a represents the starting address of the integer, i.e. 2012, and after you type cast it to a char pointer, x will still hold the address 2012.
a = 512
It assigns the value 512 to the integer. The bytes at 2012 to 2015 will look like
00000000 00000000 00000010 00000000
2015 2014 2013 2012
x[0] = 1
It will put 1 at location (x+0), i.e. the 2012th byte will be changed to 00000001.
x[1] = 2
It will put 2 at location (x+1), i.e. the 2013th byte will be changed to 00000010.
Now the complete memory from 2012 to 2015 will look like
00000000 00000000 00000010 00000001
2015 2014 2013 2012
Its decimal equivalent is 513.
That's the output you got. If you had been using a machine that stores binary numbers in Big Endian format, then a = 512 would have stored the bytes at 2012 to 2015 as 00000000 00000000 00000010 00000000 (most significant byte first), and after x[0] = 1 and x[1] = 2 the value of 'a' would have become (00000001 00000010 00000010 00000000)2 = (16908800)10.