Your explanation reminds me of what we learnt long ago in our OS and compiler-principles courses, though I can't remember the details :-( We know all languages are translated into machine code in the end, and both code and data are loaded into memory when a program runs. Do you mean that big-endian and little-endian machines use different chips, and therefore store data (and code) differently? There are a few things I am confused about.
1. Which order is big-endian, and which is little-endian?
Is a char with value 0x2B stored in memory as:
0010 1011, or 1101 0100, or 1011 0010?
2. Or did I misunderstand you: do you mean only integer data are affected by byte order?
3. Or is the difference caused by different compilers rather than by the hardware?
Thank you :-)